Framework for assessing key variable dependencies in loose-abrasive grinding and polishing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, J.S.; Aikens, D.M.; Brown, N.J.
1995-12-01
This memo describes a framework for identifying all key variables that determine the figuring performance of loose-abrasive lapping and polishing machines. The framework is intended as a tool for prioritizing R&D issues, assessing the completeness of process models and experimental data, and providing a mechanism to identify any assumptions in analytical models or experimental procedures. Future plans for preparing analytical models or performing experiments can refer to this framework in establishing the context of the work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
López C, Diana C.; Wozny, Günter; Flores-Tlacuahuac, Antonio
2016-03-23
The lack of informative experimental data and the complexity of first-principles battery models make the recovery of kinetic, transport, and thermodynamic parameters complicated. We present a computational framework that combines sensitivity, singular value, and Monte Carlo analysis to explore how different sources of experimental data affect parameter structural ill-conditioning and identifiability. Our study is conducted on a modified version of the Doyle-Fuller-Newman model. We demonstrate that the use of voltage discharge curves only enables the identification of a small parameter subset, regardless of the number of experiments considered. Furthermore, we show that the inclusion of a single electrolyte concentration measurement significantly aids identifiability and mitigates ill-conditioning.
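The role of singular values in diagnosing identifiability can be illustrated on a toy problem. The sketch below is a simplification assuming a two-parameter exponential discharge curve V(t) = a * exp(-b * t) in place of the Doyle-Fuller-Newman model: it builds a finite-difference sensitivity matrix and computes its condition number from the 2x2 Gram matrix. Sampling only a narrow late-time window makes the two sensitivity columns nearly collinear, mimicking the ill-conditioning reported for voltage-only data.

```python
import math

def model(t, a, b):
    # toy discharge curve: V(t) = a * exp(-b * t)
    return a * math.exp(-b * t)

def sensitivity_matrix(times, a, b, h=1e-6):
    # central finite-difference sensitivities dV/da and dV/db at each time point
    S = []
    for t in times:
        dV_da = (model(t, a + h, b) - model(t, a - h, b)) / (2 * h)
        dV_db = (model(t, a, b + h) - model(t, a, b - h)) / (2 * h)
        S.append((dV_da, dV_db))
    return S

def condition_number(S):
    # singular values of an m x 2 matrix via eigenvalues of its 2x2 Gram matrix
    g11 = sum(r[0] * r[0] for r in S)
    g12 = sum(r[0] * r[1] for r in S)
    g22 = sum(r[1] * r[1] for r in S)
    tr, det = g11 + g22, g11 * g22 - g12 * g12
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    lam_max, lam_min = (tr + disc) / 2, (tr - disc) / 2
    # a vanishing smallest singular value means a non-identifiable direction
    return math.sqrt(lam_max / lam_min) if lam_min > 0 else float("inf")

# narrow late-time window: sensitivities nearly collinear, ill-conditioned;
# sampling from t = 0 separates the two parameters
narrow = [2.0 + 0.01 * i for i in range(10)]
wide = [0.5 * i for i in range(10)]
cond_narrow = condition_number(sensitivity_matrix(narrow, 1.0, 1.0))
cond_wide = condition_number(sensitivity_matrix(wide, 1.0, 1.0))
```

A large condition number flags parameter combinations that the chosen data cannot resolve, which is the effect adding a second measurement type is meant to mitigate.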
A framework for self-experimentation in personalized health.
Karkar, Ravi; Zia, Jasmine; Vilardaga, Roger; Mishra, Sonali R; Fogarty, James; Munson, Sean A; Kientz, Julie A
2016-05-01
To describe an interdisciplinary and methodological framework for applying single case study designs to self-experimentation in personalized health. The authors examine the framework's applicability to various health conditions and present an initial case study with irritable bowel syndrome (IBS). An in-depth literature review was performed to develop the framework and to identify absolute and desired health condition requirements for the application of this framework. The authors developed mobile application prototypes, storyboards, and process flows of the framework using IBS as the case study. The authors conducted three focus groups and an online survey using a human-centered design approach for assessing the framework's feasibility. All 6 focus group participants had a positive view about our framework and volunteered to participate in future studies. Most stated they would trust the results because it was their own data being analyzed. They were most concerned about confounds, nonmeaningful measures, and erroneous assumptions on the timing of trigger effects. Survey respondents (N = 60) were more likely to be adherent to an 8- vs 12-day study length even if it meant lower confidence results. Implementation of the self-experimentation framework in a mobile application appears to be feasible for people with IBS. This framework can likely be applied to other health conditions. Considerations include the learning curve for teaching self-experimentation to non-experts and the challenges involved in operationalizing and customizing study designs. Using mobile technology to guide people through self-experimentation to investigate health questions is a feasible and promising approach to advancing personalized health. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
Kwok, T; Smith, K A
2000-09-01
The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
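A single-neuron sketch of chaotic simulated annealing with decaying self-coupling, loosely in the spirit of the Chen and Aihara model described above (the simplified input term and all constants are illustrative assumptions, not values from the paper):

```python
import math

def csa_neuron(steps=500, z0=0.08, beta=0.99, alpha=0.015, eps=0.004, I=0.65, k=0.9):
    # single chaotic neuron with decaying self-coupling: while z is large the
    # map is unstable and the output wanders; as z decays geometrically, the
    # dynamics anneal onto a stable fixed point
    y, z, xs = 0.5, z0, []
    for _ in range(steps):
        x = 1.0 / (1.0 + math.exp(-y / eps))          # steep sigmoid output
        y = k * y + alpha * (I - x) - z * (x - 0.5)   # simplified input term
        z *= beta                                     # decaying self-coupling
        xs.append(x)
    return xs, z

outputs, z_final = csa_neuron()
```

The decay of z weakens the destabilising self-feedback, which is the mechanism by which this model class trades early chaotic search for late convergence.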
JCell--a Java-based framework for inferring regulatory networks from time series data.
Spieth, C; Supper, J; Streichert, F; Speer, N; Zell, A
2006-08-15
JCell is a Java-based application for reconstructing gene regulatory networks from experimental data. The framework provides several algorithms to identify genetic and metabolic dependencies based on experimental data conjoint with mathematical models to describe and simulate regulatory systems. Owing to the modular structure, researchers can easily implement new methods. JCell is a pure Java application with additional scripting capabilities and thus widely usable, e.g. on parallel or cluster computers. The software is freely available for download at http://www-ra.informatik.uni-tuebingen.de/software/JCell.
Pagan, Darren C.; Miller, Matthew P.
2014-01-01
A forward modeling diffraction framework is introduced and employed to identify slip system activity in high-energy diffraction microscopy (HEDM) experiments. In the framework, diffraction simulations are conducted on virtual mosaic crystals with orientation gradients consistent with Nye’s model of heterogeneous single slip. Simulated diffraction peaks are then compared against experimental measurements to identify slip system activity. Simulation results compared against diffraction data measured in situ from a silicon single-crystal specimen plastically deformed under single-slip conditions indicate that slip system activity can be identified during HEDM experiments. PMID:24904242
A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2005-07-01
The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concepts of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.
A Novel Framework Based on FastICA for High Density Surface EMG Decomposition
Chen, Maoqi; Zhou, Ping
2015-01-01
This study presents a progressive FastICA peel-off (PFP) framework for high density surface electromyogram (EMG) decomposition. The novel framework is based on a shift-invariant model for describing surface EMG. The decomposition process can be viewed as progressively expanding the set of motor unit spike trains, which is primarily based on FastICA. To overcome the local convergence of FastICA, a “peel off” strategy (i.e. removal of the estimated motor unit action potential (MUAP) trains from the previous step) is used to mitigate the effects of the already identified motor units, so more motor units can be extracted. Moreover, a constrained FastICA is applied to assess the extracted spike trains and correct possible erroneous or missed spikes. These procedures work together to improve the decomposition performance. The proposed framework was validated using simulated surface EMG signals with different motor unit numbers (30, 70, 91) and signal to noise ratios (SNRs) (20, 10, 0 dB). The results demonstrated relatively large numbers of extracted motor units and high accuracies (high F1-scores). The framework was also tested with 111 trials of 64-channel electrode array experimental surface EMG signals during the first dorsal interosseous (FDI) muscle contraction at different intensities. On average 14.1 ± 5.0 motor units were identified from each trial of experimental surface EMG signals. PMID:25775496
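The "peel-off" idea is easy to isolate from the rest of the pipeline. In this toy sketch (hypothetical MUAP templates and spike times; FastICA itself is omitted), a mixture of two motor unit trains is built and the identified first unit is subtracted, so the residual contains only the remaining unit:

```python
def add_template(signal, template, spikes):
    # superimpose a motor unit action potential (MUAP) template at each spike index
    out = list(signal)
    for s in spikes:
        for i, v in enumerate(template):
            if s + i < len(out):
                out[s + i] += v
    return out

def peel_off(signal, template, spikes):
    # "peel off" an identified unit: subtract its estimated MUAP train
    return add_template(signal, [-v for v in template], spikes)

# toy mixture of two units (invented templates and firing times, not real EMG)
n = 40
muap_a, spikes_a = [1.0, 2.0, -1.0], [3, 15, 27]
muap_b, spikes_b = [0.5, -0.5], [8, 20]
mix = add_template(add_template([0.0] * n, muap_a, spikes_a), muap_b, spikes_b)

# after peeling off unit A, the residual carries only unit B's contribution
residual = peel_off(mix, muap_a, spikes_a)
```

In the full framework this subtraction is what keeps FastICA from re-converging to already-identified units on the next pass, so further units can be extracted.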
ERIC Educational Resources Information Center
Wilhelm, William J.; Czyzewski, Alan B.
2006-01-01
This study was designed to identify classroom interventions that can be used by core business course instructors (as opposed to trained business ethicists) to positively affect levels of moral reasoning in undergraduate business students. The quasi-experimental study, conducted at a Midwestern university, focused on determining if the utilization…
Carlson, Jean M.
2018-01-01
In this paper we study antibiotic-induced C. difficile infection (CDI), caused by the toxin-producing C. difficile (CD), and implement clinically-inspired simulated treatments in a computational framework that synthesizes a generalized Lotka-Volterra (gLV) model with SIR modeling techniques. The gLV model uses parameters derived from an experimental mouse model, in which the mice are administered antibiotics and subsequently dosed with CD. We numerically identify which of the experimentally measured initial conditions are vulnerable to CD colonization, then formalize the notion of CD susceptibility analytically. We simulate fecal transplantation, a clinically successful treatment for CDI, and discover that both the transplant timing and transplant donor are relevant to the efficacy of the treatment, a result which has clinical implications. We incorporate two nongeneric yet dangerous attributes of CD into the gLV model, sporulation and antibiotic-resistant mutation, and for each identify relevant SIR techniques that describe the desired attribute. Finally, we rely on the results of our framework to analyze an experimental study of fecal transplants in mice, and are able to explain observed experimental results, validate our simulated results, and suggest model-motivated experiments. PMID:29451873
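A minimal sketch of the gLV-with-dosing setup (two species, forward-Euler integration, invented parameters rather than the mouse-derived ones): an antibiotic event that depletes the protective resident opens a window in which a subsequent C. difficile dose colonizes, while the same dose fails against the intact community.

```python
def glv_step(x, r, A, dt=0.01):
    # one forward-Euler step of generalized Lotka-Volterra:
    # dx_i/dt = x_i * (r_i + sum_j A[i][j] * x_j)
    return [max(xi + dt * xi * (ri + sum(a * xj for a, xj in zip(row, x))), 0.0)
            for xi, ri, row in zip(x, r, A)]

def simulate(x0, r, A, steps, events=None):
    # events maps a step index to additive shifts per species, modelling an
    # antibiotic kill-off (negative) or a C. difficile dose (positive)
    x, events = list(x0), events or {}
    for t in range(steps):
        if t in events:
            x = [max(xi + d, 0.0) for xi, d in zip(x, events[t])]
        x = glv_step(x, r, A)
    return x

# species 0 = protective resident flora, species 1 = C. difficile (CD);
# parameters are illustrative, chosen to make the two states bistable
r = [1.0, 0.8]
A = [[-1.0, -1.5],   # resident self-limits and is suppressed by CD
     [-2.0, -1.0]]   # CD is strongly suppressed by the resident
healthy = simulate([1.0, 0.0], r, A, 2000, events={100: [0.0, 0.1]})
perturbed = simulate([1.0, 0.0], r, A, 2000,
                     events={50: [-0.999, 0.0], 100: [0.0, 0.1]})
# the intact community clears the CD dose; the antibiotic-depleted one is colonized
```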
Practical use of a framework for network science experimentation
NASA Astrophysics Data System (ADS)
Toth, Andrew; Bergamaschi, Flavio
2014-06-01
In 2006, the US Army Research Laboratory (ARL) and the UK Ministry of Defence (MoD) established a collaborative research alliance with academia and industry, called the International Technology Alliance (ITA) in Network and Information Sciences, to address fundamental issues concerning Network and Information Sciences that will enhance decision making for coalition operations, enable rapid, secure formation of ad hoc teams in coalition environments, and enhance US and UK capabilities to conduct coalition warfare. Research conducted under the ITA was extended through collaboration between ARL and IBM UK to characterize and define a software stack and tooling that has become the reference framework for network science experimentation in support of validation of theoretical research. This paper discusses the composition of the reference framework for experimentation resulting from the ARL/IBM UK collaboration and its use, by the Network Science Collaborative Technology Alliance (NS CTA), in a recent network science experiment conducted at ARL. It also discusses how the experiment was modeled using the reference framework, the integration of two new components, the Apollo Fact-Finder tool and the Medusa Crowd Sensing application, the limitations identified, and how they will be addressed in future work.
Wu, Zujian; Pang, Wei; Coghill, George M
2015-01-01
Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework makes it feasible to learn the relationships between biochemical reactants qualitatively and to make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
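The quantitative stage, tuning kinetic rates with simulated annealing, can be sketched on a one-parameter toy model (a single first-order decay rate; the synthetic target data, move size, and cooling schedule are all invented for illustration):

```python
import math
import random

def simulate_decay(k, times):
    # toy one-parameter kinetic model: concentration c(t) = exp(-k * t)
    return [math.exp(-k * t) for t in times]

def sse(k, times, target):
    # sum of squared deviations between simulated and target behaviour
    return sum((c - y) ** 2 for c, y in zip(simulate_decay(k, times), target))

def anneal_rate(times, target, k0=2.0, T0=1.0, cooling=0.995, steps=2000, seed=1):
    # simulated annealing over a single kinetic rate: worse moves are accepted
    # with probability exp(-delta / T), letting the search escape local minima
    rng = random.Random(seed)
    k = best_k = k0
    e = best_e = sse(k0, times, target)
    T = T0
    for _ in range(steps):
        cand = max(k + rng.gauss(0.0, 0.1), 1e-6)
        e_cand = sse(cand, times, target)
        if e_cand < e or rng.random() < math.exp((e - e_cand) / T):
            k, e = cand, e_cand
            if e < best_e:
                best_k, best_e = k, e
        T *= cooling
    return best_k

times = [0.1 * i for i in range(20)]
target = simulate_decay(0.7, times)   # synthetic "target system" with k = 0.7
k_fit = anneal_rate(times, target)
```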
Parent Praise to 1-3 Year-Olds Predicts Children’s Motivational Frameworks 5 Years Later
Gunderson, Elizabeth A.; Gripshover, Sarah J.; Romero, Carissa; Dweck, Carol S.; Goldin-Meadow, Susan; Levine, Susan C.
2013-01-01
In laboratory studies, praising children’s effort encourages them to adopt incremental motivational frameworks—they believe ability is malleable, attribute success to hard work, enjoy challenges, and generate strategies for improvement. In contrast, praising children’s inherent abilities encourages them to adopt fixed-ability frameworks. Does the praise parents spontaneously give children at home show the same effects? Although parents’ early praise of inherent characteristics was not associated with children’s later fixed-ability frameworks, parents’ praise of children’s effort at 14-38 months (N=53) did predict incremental frameworks at 7-8 years, suggesting that causal mechanisms identified in experimental work may be operating in home environments. PMID:23397904
Nelson, Kurt; James, Scott C.; Roberts, Jesse D.; ...
2017-06-05
A modelling framework identifies deployment locations for current-energy-capture devices that maximise power output while minimising potential environmental impacts. The framework, based on the Environmental Fluid Dynamics Code, can incorporate site-specific environmental constraints. Over a 29-day period, energy outputs from three array layouts were estimated: (1) the preliminary configuration (baseline), (2) an updated configuration that accounted for environmental constraints, and (3) an improved configuration subject to no environmental constraints. Of these layouts, array placement that did not consider environmental constraints extracted the most energy from the flow (4.38 MW-hr/day), 19% higher than output from the baseline configuration (3.69 MW-hr/day). Array placement that considered environmental constraints removed 4.27 MW-hr/day of energy (16% more than baseline). In conclusion, this analysis framework accounts for bathymetry and flow-pattern variations that typical experimental studies cannot, demonstrating that it is a valuable tool for identifying improved array layouts for field deployments.
Identifying agro-ecoregions for the Long Term Agroecosystem Research (LTAR) network
USDA-ARS?s Scientific Manuscript database
The LTAR network exists to examine questions of agricultural sustainability in working lands across the continental US. To effectively address agricultural sustainability at this scale, experimental designs must be placed into the context of a regional framework that can extrapolate site-specific re...
Arighi, Cecilia; Shamovsky, Veronica; Masci, Anna Maria; Ruttenberg, Alan; Smith, Barry; Natale, Darren A; Wu, Cathy; D'Eustachio, Peter
2015-01-01
The Protein Ontology (PRO) provides terms for and supports annotation of species-specific protein complexes in an ontology framework that relates them both to their components and to species-independent families of complexes. Comprehensive curation of experimentally known forms and annotations thereof is expected to expose discrepancies, differences, and gaps in our knowledge. We have annotated the early events of innate immune signaling mediated by Toll-Like Receptor 3 and 4 complexes in human, mouse, and chicken. The resulting ontology and annotation data set has allowed us to identify species-specific gaps in experimental data and possible functional differences between species, and to employ inferred structural and functional relationships to suggest plausible resolutions of these discrepancies and gaps.
Identifying the perceptive users for online social systems
Liu, Xiao-Lu; Guo, Qiang; Han, Jing-Ti
2017-01-01
In this paper, the perceptive user, who could identify the high-quality objects in their initial lifespan, is presented. By tracking the ratings given to the rewarded objects, we present a method to identify the user perceptibility, which is defined as the capability that a user can identify these objects at their early lifespan. Moreover, we investigate the behavior patterns of the perceptive users from three dimensions: User activity, correlation characteristics of user rating series and user reputation. The experimental results for the empirical networks indicate that high perceptibility users show significantly different behavior patterns from the others: Having larger degree, stronger correlation of rating series and higher reputation. Furthermore, in view of the hysteresis in finding the rewarded objects, we present a general framework for identifying the high perceptibility users based on user behavior patterns. The experimental results show that this work is helpful for deeply understanding the collective behavior patterns for online users. PMID:28704382
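The perceptibility measure lends itself to a compact sketch. Below, a user's perceptibility is computed as the fraction of their ratings of rewarded objects given before an object-specific early-lifespan cutoff (the data, cutoffs, and exact weighting are assumptions for illustration, not the paper's definitions):

```python
def perceptibility(user_ratings, rewarded, early_cutoff):
    # fraction of a user's ratings of rewarded (high-quality) objects that were
    # given within the object's early lifespan, i.e. before early_cutoff[obj]
    hits = [(obj, t) for obj, t in user_ratings if obj in rewarded]
    if not hits:
        return 0.0
    early = sum(1 for obj, t in hits if t < early_cutoff[obj])
    return early / len(hits)

# illustrative data: (object, rating timestamp) pairs per user
rewarded = {"a", "b"}
cutoff = {"a": 10, "b": 5}
alice = [("a", 2), ("b", 3), ("c", 50)]   # rates quality objects early
bob = [("a", 40), ("b", 30)]              # finds them only after they are popular
```

Ranking users by such a score, then inspecting activity, rating-series correlation, and reputation within the high-scoring group, mirrors the three-dimensional analysis the abstract describes.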
Moorkanikkara, Srinivas Nageswaran; Blankschtein, Daniel
2010-12-21
How does one design a surfactant mixture using a set of available surfactants such that it exhibits a desired adsorption kinetics behavior? The traditional approach used to address this design problem involves conducting trial-and-error experiments with specific surfactant mixtures. This approach is typically time-consuming and resource-intensive and becomes increasingly challenging when the number of surfactants that can be mixed increases. In this article, we propose a new theoretical framework to identify a surfactant mixture that most closely meets a desired adsorption kinetics behavior. Specifically, the new theoretical framework involves (a) formulating the surfactant mixture design problem as an optimization problem using an adsorption kinetics model and (b) solving the optimization problem using a commercial optimization package. The proposed framework aims to identify the surfactant mixture that most closely satisfies the desired adsorption kinetics behavior subject to the predictive capabilities of the chosen adsorption kinetics model. Experiments can then be conducted at the identified surfactant mixture condition to validate the predictions. We demonstrate the reliability and effectiveness of the proposed theoretical framework through a realistic case study by identifying a nonionic surfactant mixture consisting of up to four alkyl poly(ethylene oxide) surfactants (C10E4, C12E5, C12E6, and C10E8) such that it most closely exhibits a desired dynamic surface tension (DST) profile. Specifically, we use the Mulqueen-Stebe-Blankschtein (MSB) adsorption kinetics model (Mulqueen, M.; Stebe, K. J.; Blankschtein, D. Langmuir 2001, 17, 5196-5207) to formulate the optimization problem, together with the SNOPT commercial optimization solver, to identify a surfactant mixture consisting of these four surfactants that most closely exhibits the desired DST profile. Finally, we compare the experimental DST profile measured at the surfactant mixture condition identified by the new theoretical framework with the desired DST profile and find good agreement between the two profiles.
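The design loop can be caricatured in a few lines. In place of the MSB adsorption kinetics model and the SNOPT solver, the sketch below predicts a dynamic surface tension profile as a composition-weighted blend of single-surfactant profiles and searches a grid of binary mixture fractions (the linear blend and all numbers are invented simplifications):

```python
def blend_profile(fractions, component_profiles):
    # crude stand-in for an adsorption kinetics model: predicted dynamic surface
    # tension as a composition-weighted blend of single-surfactant profiles
    n = len(component_profiles[0])
    return [sum(f * p[i] for f, p in zip(fractions, component_profiles))
            for i in range(n)]

def best_mixture(target, component_profiles, step=0.05):
    # exhaustive grid search over binary mixture fractions [f, 1 - f],
    # minimising squared deviation from the desired profile
    best = (float("inf"), None)
    f = 0.0
    while f <= 1.0 + 1e-9:
        pred = blend_profile([f, 1.0 - f], component_profiles)
        err = sum((a - b) ** 2 for a, b in zip(pred, target))
        best = min(best, (err, round(f, 2)))
        f += step
    return best[1]

profiles = [[70, 60, 50, 45], [55, 45, 40, 38]]   # made-up DST curves (mN/m)
target = blend_profile([0.3, 0.7], profiles)      # desired profile, achievable at 30/70
f_best = best_mixture(target, profiles)
```

The real framework replaces both the linear blend (with a mechanistic kinetics model) and the grid (with a gradient-based solver), but the objective, minimise the deviation from a desired DST profile over mixture composition, is the same.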
NASA Technical Reports Server (NTRS)
Macelroy, Robert D.; Smernoff, David T.; Rummel, John D.
1987-01-01
Problems of food production by higher plants are addressed. Experimentation requirements and necessary equipment for designing an experimental Controlled Ecological Life Support System (CELSS) Plant Growth Module are defined. A framework is provided for the design of laboratory sized plant growth chambers. The rationale for the development of an informal collaborative effort between investigators from universities and industry and those at Ames is evaluated. Specific research problems appropriate for collaborative efforts are identified.
Choi, Insook
2018-01-01
Sonification is an open-ended design task to construct sound informing a listener of data. Understanding application context is critical for shaping design requirements for data translation into sound. Sonification requires methodology to maintain reproducibility when data sources exhibit non-linear properties of self-organization and emergent behavior. This research formalizes interactive sonification in an extensible model to support reproducibility when data exhibits emergent behavior. In the absence of sonification theory, extensibility demonstrates relevant methods across case studies. The interactive sonification framework foregrounds three factors: reproducible system implementation for generating sonification; interactive mechanisms enhancing a listener's multisensory observations; and reproducible data from models that characterize emergent behavior. Supramodal attention research suggests interactive exploration with auditory feedback can generate context for recognizing irregular patterns and transient dynamics. The sonification framework provides circular causality as a signal pathway for modeling a listener interacting with emergent behavior. The extensible sonification model adopts a data acquisition pathway to formalize functional symmetry across three subsystems: Experimental Data Source, Sound Generation, and Guided Exploration. To differentiate time criticality and dimensionality of emerging dynamics, tuning functions are applied between subsystems to maintain scale and symmetry of concurrent processes and temporal dynamics. Tuning functions accommodate sonification design strategies that yield order parameter values to render emerging patterns discoverable as well as rehearsable, to reproduce desired instances for clinical listeners. Case studies are implemented with two computational models, Chua's circuit and Swarm Chemistry social agent simulation, generating data in real-time that exhibits emergent behavior. 
Heuristic Listening is introduced as an informal model of a listener's clinical attention to data sonification through multisensory interaction in a context of structured inquiry. Three methods are introduced to assess the proposed sonification framework: Listening Scenario classification, data flow Attunement, and Sonification Design Patterns to classify sound control. Case study implementations are assessed against these methods comparing levels of abstraction between experimental data and sound generation. Outcomes demonstrate the framework performance as a reference model for representing experimental implementations, also for identifying common sonification structures having different experimental implementations, identifying common functions implemented in different subsystems, and comparing impact of affordances across multiple implementations of listening scenarios. PMID:29755311
ERIC Educational Resources Information Center
Schweizer, Karl
2008-01-01
Structural equation modeling provides the framework for investigating experimental effects on the basis of variances and covariances in repeated measurements. A special type of confirmatory factor analysis as part of this framework enables the appropriate representation of the experimental effect and the separation of experimental and…
Cho, Kwang-Hyun; Choo, Sang-Mok; Wellstead, Peter; Wolkenhauer, Olaf
2005-08-15
We propose a unified framework for the identification of functional interaction structures of biomolecular networks in a way that leads to a new experimental design procedure. In developing our approach, we have built upon previous work. Thus we begin by pointing out some of the restrictions associated with existing structure identification methods and point out how these restrictions may be eased. In particular, existing methods use specific forms of experimental algebraic equations with which to identify the functional interaction structure of a biomolecular network. In our work, we employ an extended form of these experimental algebraic equations which, while retaining their merits, also overcomes some of their disadvantages. Experimental data are required in order to estimate the coefficients of the experimental algebraic equation set associated with the structure identification task. However, experimentalists are rarely provided with guidance on which parameters to perturb and to what extent. When a model of network dynamics is required, there is also the vexed question of sample rate and sample time selection to be resolved. Supplying some answers to these questions is the main motivation of this paper. The approach is based on stationary and/or temporal data obtained from parameter perturbations, and unifies the previous approaches of Kholodenko et al. (PNAS 99 (2002) 12841-12846) and Sontag et al. (Bioinformatics 20 (2004) 1877-1886). By way of demonstration, we apply our unified approach to a network model which cannot be properly identified by existing methods. Finally, we propose an experiment design methodology, which is not limited by the amount of parameter perturbations, and illustrate its use with an in numero example.
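The core idea of perturbation-based structure identification can be illustrated with a toy sketch (a minimal illustration, not the paper's actual equations; the two-gene model and its coefficients are hypothetical): perturb each variable, observe the response of the rate functions, and read off the sign pattern of the interaction structure.

```python
# Hedged sketch: recover the sign structure of a hypothetical two-gene
# network by finite-difference perturbation of its rate functions.
def rates(x):
    x1, x2 = x
    dx1 = 1.0 - x1 - 0.5 * x2   # assumed: gene 2 represses gene 1
    dx2 = 0.8 * x1 - x2         # assumed: gene 1 activates gene 2
    return [dx1, dx2]

def jacobian(f, x, h=1e-6):
    """Estimate the Jacobian of f at x by one-sided finite differences."""
    n = len(x)
    f0 = f(x)
    J = [[0.0] * n for _ in range(n)]
    for j in range(n):
        xp = list(x)
        xp[j] += h
        fj = f(xp)
        for i in range(n):
            J[i][j] = (fj[i] - f0[i]) / h
    return J

J = jacobian(rates, [0.6, 0.5])
# The sign pattern encodes the functional interaction structure.
structure = [['-' if v < 0 else '+' for v in row] for row in J]
```

In this toy case the recovered pattern matches the assumed activation/repression structure; real applications must infer the same information from measured steady-state shifts rather than known rate functions.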
A critique of the hypothesis, and a defense of the question, as a framework for experimentation.
Glass, David J
2010-07-01
Scientists are often steered by common convention, funding agencies, and journal guidelines into a hypothesis-driven experimental framework, despite Isaac Newton's dictum that hypotheses have no place in experimental science. Some may think that Newton's cautionary note, which was in keeping with an experimental approach espoused by Francis Bacon, is inapplicable to current experimental method since, in accord with the philosopher Karl Popper, modern-day hypotheses are framed to serve as instruments of falsification, as opposed to verification. But Popper's "critical rationalist" framework too is problematic. It has been accused of being: inconsistent on philosophical grounds; unworkable for modern "large science," such as systems biology; inconsistent with the actual goals of experimental science, which are verification and not falsification; and harmful to the process of discovery as a practical matter. A criticism of the hypothesis as a framework for experimentation is offered. Presented is an alternative framework, the query/model approach, which many scientists may discover is the framework they are actually using, despite being required to give lip service to the hypothesis.
Oh, Hong-Choon; Toh, Hong-Guan; Giap Cheong, Eddy Seng
2011-11-01
Using the classical process improvement framework of Plan-Do-Study-Act (PDSA), the diagnostic radiology department of a tertiary hospital identified several patient cycle time reduction strategies. Experimentation with these strategies (which included procurement of new machines, hiring of new staff, redesign of the queue system, etc.) through pilot-scale implementation was impractical because it might incur substantial expenditure or be operationally disruptive. With this in mind, simulation modeling was used to test these strategies via performance of "what if" analyses. Using the output generated by the simulation model, the team was able to identify a cost-free cycle time reduction strategy, which subsequently led to a reduction of patient cycle time and achievement of a management-defined performance target. As healthcare professionals work continually to improve healthcare operational efficiency in response to rising healthcare costs and patient expectations, simulation modeling offers an effective scientific framework that can complement established process improvement frameworks like PDSA to realize healthcare process enhancement. © 2011 National Association for Healthcare Quality.
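The simplest form of such a "what if" comparison can be sketched with the steady-state M/M/1 queueing formula (a minimal sketch only; a real study would use discrete-event simulation, and the arrival and service rates below are hypothetical):

```python
# Hedged sketch: compare cycle-time strategies with the M/M/1 steady-state
# mean time in system, W = 1 / (mu - lambda). Rates are hypothetical
# patients/hour, not values from the study.
def mean_time_in_system(arrival_rate, service_rate):
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrivals exceed service capacity")
    return 1.0 / (service_rate - arrival_rate)

baseline = mean_time_in_system(5.0, 6.0)        # current setup -> 1.0 h
faster_machine = mean_time_in_system(5.0, 7.5)  # "what if" strategy -> 0.4 h
```

Even this back-of-envelope model shows why small capacity changes can produce disproportionate cycle-time reductions near saturation, which is what a full simulation model quantifies credibly.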
PREMIX: PRivacy-preserving EstiMation of Individual admiXture.
Chen, Feng; Dow, Michelle; Ding, Sijie; Lu, Yao; Jiang, Xiaoqian; Tang, Hua; Wang, Shuang
2016-01-01
In this paper we propose a framework: PRivacy-preserving EstiMation of Individual admiXture (PREMIX), using Intel Software Guard Extensions (SGX). SGX is a suite of software and hardware architectures that enable efficient and secure computation over confidential data. PREMIX enables multiple sites to securely collaborate on estimating individual admixture within a secure enclave inside Intel SGX. We implemented a feature selection module to identify the most discriminative Single Nucleotide Polymorphisms (SNPs) based on informativeness, and an Expectation Maximization (EM)-based maximum likelihood estimator to identify the individual admixture. Experimental results based on both simulation and 1000 Genomes data demonstrated the efficiency and accuracy of the proposed framework. PREMIX ensures a high level of security as all operations on sensitive genomic data are conducted within a secure enclave using SGX.
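An EM-based admixture estimate of the general kind mentioned above can be sketched as follows (a toy illustration, not PREMIX itself: haploid alleles, known ancestral allele frequencies, and all numbers synthetic):

```python
import random

# Hedged sketch: EM for one individual's admixture proportions q over K
# ancestral populations, given ancestral allele frequencies freqs[k][l].
def em_admixture(alleles, freqs, iters=200):
    K = len(freqs)
    q = [1.0 / K] * K                      # uniform starting point
    for _ in range(iters):
        counts = [0.0] * K
        for l, x in enumerate(alleles):
            # E-step: posterior ancestry weight for this allele observation
            w = [q[k] * (freqs[k][l] if x == 1 else 1.0 - freqs[k][l])
                 for k in range(K)]
            s = sum(w)
            for k in range(K):
                counts[k] += w[k] / s
        # M-step: proportions are the average posterior weights
        q = [c / len(alleles) for c in counts]
    return q

# Synthetic individual: 70/30 mixture of two well-separated populations.
random.seed(0)
L = 500
freqs = [[0.9] * L, [0.1] * L]
alleles = []
for l in range(L):
    k = 0 if random.random() < 0.7 else 1
    alleles.append(1 if random.random() < freqs[k][l] else 0)
q = em_admixture(alleles, freqs)
```

With well-separated frequencies the estimate lands near the true 0.7/0.3 mixture; PREMIX additionally runs such computations inside an SGX enclave so the genotypes are never exposed.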
Gene Profiling in Experimental Models of Eye Growth: Clues to Myopia Pathogenesis
Stone, Richard A.; Khurana, Tejvir S.
2010-01-01
To understand the complex regulatory pathways that underlie the development of refractive errors, expression profiling has evaluated gene expression in ocular tissues of well-characterized experimental models that alter postnatal eye growth and induce refractive errors. Derived from a variety of platforms (e.g. differential display, spotted microarrays or Affymetrix GeneChips), gene expression patterns are now being identified in species that include chicken, mouse and primate. Reconciling available results is hindered by varied experimental designs and analytical/statistical features. Continued application of these methods offers promise to provide the much-needed mechanistic framework to develop therapies to normalize refractive development in children. PMID:20363242
Sumida, Kenji; Stück, David; Mino, Lorenzo; Chai, Jeng-Da; Bloch, Eric D; Zavorotynska, Olena; Murray, Leslie J; Dincă, Mircea; Chavan, Sachin; Bordiga, Silvia; Head-Gordon, Martin; Long, Jeffrey R
2013-01-23
Microporous metal-organic frameworks are a class of materials being vigorously investigated for mobile hydrogen storage applications. For high-pressure storage at ambient temperatures, the M(3)[(M(4)Cl)(3)(BTT)(8)](2) (M-BTT; BTT(3-) = 1,3,5-benzenetristetrazolate) series of frameworks are of particular interest due to the high density of exposed metal cation sites on the pore surface. These sites give enhanced zero-coverage isosteric heats of adsorption (Q(st)) approaching the optimal value for ambient storage applications. However, the Q(st) parameter provides only a limited insight into the thermodynamics of the individual adsorption sites, the tuning of which is paramount for optimizing the storage performance. Here, we begin by performing variable-temperature infrared spectroscopy studies of Mn-, Fe-, and Cu-BTT, allowing the thermodynamics of H(2) adsorption to be probed experimentally. This is complemented by a detailed DFT study, in which molecular fragments representing the metal clusters within the extended solid are simulated to obtain a more thorough description of the structural and thermodynamic aspects of H(2) adsorption at the strongest binding sites. Then, the effect of substitutions at the metal cluster (metal ion and anion within the tetranuclear cluster) is discussed, showing that the configuration of this unit indeed plays an important role in determining the affinity of the framework toward H(2). Interestingly, the theoretical study has identified that the Zn-based analogs would be expected to facilitate enhanced adsorption profiles over the compounds synthesized experimentally, highlighting the importance of a combined experimental and theoretical approach to the design and synthesis of new frameworks for H(2) storage applications.
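The zero-coverage isosteric heat mentioned above is commonly extracted from adsorption isotherms at two temperatures via the Clausius-Clapeyron relation; a minimal sketch (the pressures and temperatures below are hypothetical, not the M-BTT data):

```python
import math

# Hedged sketch: isosteric heat of adsorption Qst from two isotherm points
# at equal H2 loading, Qst = R * ln(p2/p1) * T1*T2 / (T2 - T1).
R = 8.314  # J/(mol K)

def isosteric_heat(p1, T1, p2, T2):
    return R * math.log(p2 / p1) * T1 * T2 / (T2 - T1)

# Hypothetical example: the same loading needs 0.10 bar at 77 K
# but 0.25 bar at 87 K.
q_j_per_mol = isosteric_heat(0.10, 77.0, 0.25, 87.0)
q_kj = q_j_per_mol / 1000.0
```

A positive Qst (here a few kJ/mol) falls out whenever higher temperature requires higher pressure to reach the same loading; strong open-metal-site binding shows up as much larger zero-coverage values.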
Agarwal, Shashank; Liu, Feifan; Yu, Hong
2011-10-03
Protein-protein interaction (PPI) is an important biomedical phenomenon. Automatically detecting PPI-relevant articles and identifying methods that are used to study PPI are important text mining tasks. In this study, we have explored domain independent features to develop two open source machine learning frameworks. One performs binary classification to determine whether the given article is PPI relevant or not, named "Simple Classifier", and the other one maps the PPI relevant articles with corresponding interaction method nodes in a standardized PSI-MI (Proteomics Standards Initiative-Molecular Interactions) ontology, named "OntoNorm". We evaluated our system in the context of BioCreative challenge competition using the standardized data set. Our systems are amongst the top systems reported by the organizers, attaining 60.8% F1-score for identifying relevant documents, and 52.3% F1-score for mapping articles to interaction method ontology. Our results show that domain-independent machine learning frameworks can perform competitively well at the tasks of detecting PPI relevant articles and identifying the methods that were used to study the interaction in such articles. Simple Classifier is available at http://sourceforge.net/p/simpleclassify/home/ and OntoNorm at http://sourceforge.net/p/ontonorm/home/.
Detection and analysis of part load and full load instabilities in a real Francis turbine prototype
NASA Astrophysics Data System (ADS)
Presas, Alexandre; Valentin, David; Egusquiza, Eduard; Valero, Carme
2017-04-01
Francis turbines in many cases operate away from their best efficiency point in order to regulate their output power according to the instantaneous energy demand of the grid. Therefore, it is of paramount importance to analyse and determine the unstable operating points for these kinds of units. In the framework of the HYPERBOLE project (FP7-ENERGY-2013-1; Project number 608532) a large Francis unit was investigated numerically, experimentally in a reduced scale model and also experimentally and numerically in the real prototype. This paper shows the unstable operating points identified during the experimental tests on the real Francis unit and the analysis of the main characteristics of these instabilities. Finally, it is shown that similar phenomena have been identified in previous research at the LMH (Laboratory for Hydraulic Machines, Lausanne) with the reduced scale model.
NASA Astrophysics Data System (ADS)
Stanley, Jacob T.; Su, Weifeng; Lewandowski, H. J.
2017-12-01
We demonstrate how students' use of modeling can be examined and assessed using student notebooks collected from an upper-division electronics lab course. The use of models is a ubiquitous practice in undergraduate physics education, but the process of constructing, testing, and refining these models is much less common. We focus our attention on a lab course that has been transformed to engage students in this modeling process during lab activities. The design of the lab activities was guided by a framework that captures the different components of model-based reasoning, called the Modeling Framework for Experimental Physics. We demonstrate how this framework can be used to assess students' written work and to identify how students' model-based reasoning differed from activity to activity. Broadly speaking, we were able to identify the different steps of students' model-based reasoning and assess the completeness of their reasoning. Varying degrees of scaffolding present across the activities had an impact on how thoroughly students would engage in the full modeling process, with more scaffolded activities resulting in more thorough engagement with the process. Finally, we identified that the step in the process with which students had the most difficulty was the comparison between their interpreted data and their model prediction. Students did not use sufficiently sophisticated criteria in evaluating such comparisons, which had the effect of halting the modeling process. This may indicate that in order to engage students further in using model-based reasoning during lab activities, the instructor needs to provide further scaffolding for how students make these types of experimental comparisons. This is an important design consideration for other such courses attempting to incorporate modeling as a learning goal.
ERIC Educational Resources Information Center
Leighton, Jacqueline P.; Bustos Gómez, María Clara
2018-01-01
Formative assessments and feedback are vital to enhancing learning outcomes but require that learners feel at ease identifying their errors, and receiving feedback from a trusted source--teachers. An experimental test of a new theoretical framework was conducted to cultivate a pedagogical alliance to enhance students' (a) trust in the teacher, (b)…
Estimating differential expression from multiple indicators
Ilmjärv, Sten; Hundahl, Christian Ansgar; Reimets, Riin; Niitsoo, Margus; Kolde, Raivo; Vilo, Jaak; Vasar, Eero; Luuk, Hendrik
2014-01-01
Despite the advent of high-throughput sequencing, microarrays remain central in current biomedical research. Conventional microarray analysis pipelines apply data reduction before the estimation of differential expression, which is likely to render the estimates susceptible to noise from signal summarization and reduce statistical power. We present a probe-level framework, which capitalizes on the high number of concurrent measurements to provide more robust differential expression estimates. The framework naturally extends to various experimental designs and target categories (e.g. transcripts, genes, genomic regions) as well as small sample sizes. Benchmarking in relation to popular microarray and RNA-sequencing data-analysis pipelines indicated high and stable performance on the Microarray Quality Control dataset and in a cell-culture model of hypoxia. Experimental data exhibiting long-range epigenetic silencing of gene expression were used to demonstrate the efficacy of detecting differential expression of genomic regions, a level of analysis not embraced by conventional workflows. Finally, we designed and conducted an experiment to identify hypothermia-responsive genes in terms of monotonic time-response. As a novel insight, hypothermia-dependent up-regulation of multiple genes of two major antioxidant pathways was identified and verified by quantitative real-time PCR. PMID:24586062
Quantum state engineering using one-dimensional discrete-time quantum walks
NASA Astrophysics Data System (ADS)
Innocenti, Luca; Majury, Helena; Giordani, Taira; Spagnolo, Nicolò; Sciarrino, Fabio; Paternostro, Mauro; Ferraro, Alessandro
2017-12-01
Quantum state preparation in high-dimensional systems is an essential requirement for many quantum-technology applications. The engineering of an arbitrary quantum state is, however, typically strongly dependent on the experimental platform chosen for implementation, and a general framework is still missing. Here we show that coined quantum walks on a line, which represent a framework general enough to encompass a variety of different platforms, can be used for quantum state engineering of arbitrary superpositions of the walker's sites. We achieve this goal by identifying a set of conditions that fully characterize the reachable states in the space comprising walker and coin and providing a method to efficiently compute the corresponding set of coin parameters. We assess the feasibility of our proposal by identifying a linear optics experiment based on photonic orbital angular momentum technology.
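The coined-walk framework described above can be simulated directly; a minimal sketch of a one-dimensional Hadamard walk (a standard textbook walk, not the paper's state-engineering protocol, which varies the coin parameters step by step):

```python
import math

# Hedged sketch: 1D discrete-time quantum walk with a fixed Hadamard coin.
# State: amp[position][coin], where coin 0 shifts left and coin 1 shifts right.
def hadamard_walk(steps):
    n = 2 * steps + 1                      # reachable sites
    amp = [[0j, 0j] for _ in range(n)]
    s = 1 / math.sqrt(2)
    amp[steps] = [s + 0j, 1j * s]          # symmetric initial coin state
    for _ in range(steps):
        # Coin: Hadamard on the internal degree of freedom at every site.
        coined = [[s * (a + b), s * (a - b)] for a, b in amp]
        # Shift: conditional translation of the walker.
        new = [[0j, 0j] for _ in range(n)]
        for x in range(n):
            if x > 0:
                new[x - 1][0] += coined[x][0]
            if x < n - 1:
                new[x + 1][1] += coined[x][1]
        amp = new
    # Probability of finding the walker at each site (coin traced out).
    return [abs(a) ** 2 + abs(b) ** 2 for a, b in amp]

probs = hadamard_walk(20)
```

Since the coin is unitary and the shift is a permutation of the reachable sites, total probability is conserved at every step; state engineering amounts to choosing per-step coin operations so that the final site amplitudes match a target superposition.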
Drug Use Normalization: A Systematic and Critical Mixed-Methods Review.
Sznitman, Sharon R; Taubman, Danielle S
2016-09-01
Drug use normalization, which is a process whereby drug use becomes less stigmatized and more accepted as normative behavior, provides a conceptual framework for understanding contemporary drug issues and changes in drug use trends. Through a mixed-methods systematic review of the normalization literature, this article seeks to (a) critically examine how the normalization framework has been applied in empirical research and (b) make recommendations for future research in this area. Twenty quantitative, 26 qualitative, and 4 mixed-methods studies were identified through five electronic databases and reference lists of published studies. Studies were assessed for relevance, study characteristics, quality, and aspects of normalization examined. None of the studies applied the most rigorous research design (experiments) or examined all of the originally proposed normalization dimensions. The most commonly assessed dimension of drug use normalization was "experimentation." In addition to the original dimensions, the review identified the following new normalization dimensions in the literature: (a) breakdown of demographic boundaries and other risk factors in relation to drug use; (b) de-normalization; (c) drug use as a means to achieve normal goals; and (d) two broad forms of micro-politics associated with managing the stigma of illicit drug use: assimilative and transformational normalization. Further development in normalization theory and methodology promises to provide researchers with a novel framework for improving our understanding of drug use in contemporary society. Specifically, quasi-experimental designs that are currently being made feasible by swift changes in cannabis policy provide researchers with new and improved opportunities to examine normalization processes.
Wang, Nizhuan; Chang, Chunqi; Zeng, Weiming; Shi, Yuhu; Yan, Hongjie
2017-01-01
Independent component analysis (ICA) has been widely used in functional magnetic resonance imaging (fMRI) data analysis to evaluate functional connectivity of the brain; however, there are still limitations on ICA's ability to simultaneously handle neuroimaging datasets with diverse acquisition parameters, e.g., different repetition time, different scanner, etc. Therefore, it is difficult for the traditional ICA framework to effectively handle ever-larger neuroimaging datasets. In this research, a novel feature-map based ICA framework (FMICA) was proposed to address the aforementioned deficiencies, aimed at exploring brain functional networks (BFNs) at different scales, e.g., the first level (individual subject level), second level (intragroup level of subjects within a certain dataset) and third level (intergroup level of subjects across different datasets), based only on the feature maps extracted from the fMRI datasets. The FMICA was presented as a hierarchical framework, which effectively made ICA and constrained ICA as a whole to identify the BFNs from the feature maps. The simulated and real experimental results demonstrated that FMICA had the excellent ability to identify the intergroup BFNs and to characterize subject-specific and group-specific differences of BFNs from the independent component feature maps, which sharply reduced the size of fMRI datasets. Compared with traditional ICAs, FMICA as a more generalized framework could efficiently and simultaneously identify the variant BFNs at the subject-specific, intragroup, intragroup-specific and intergroup levels, implying that FMICA was able to handle big neuroimaging datasets in neuroscience research.
Tempest: Tools for Addressing the Needs of Next-Generation Climate Models
NASA Astrophysics Data System (ADS)
Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.
2015-12-01
Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.
Abdelgaied, A; Fisher, J; Jennings, L M
2018-02-01
A more robust pre-clinical wear simulation framework is required in order to simulate the wider and higher ranges of activities observed in different patient populations, such as younger, more active patients. Such a framework will help to understand and address the reported higher failure rates for younger and more active patients (National Joint Registry, 2016). The current study has developed and validated a comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements (TKR). The input mechanical (elastic modulus and Poisson's ratio) and wear parameters of the moderately cross-linked ultra-high molecular weight polyethylene (UHMWPE) bearing material were independently measured from experimental studies under realistic test conditions, similar to the loading conditions found in total knee replacements. The wear predictions from the computational wear simulation were validated against direct experimental wear measurements for size 3 Sigma curved total knee replacements (DePuy, UK) in an independent experimental wear simulation study under three different daily activities: walking, deep squat, and stair-ascent kinematic conditions. The measured compressive mechanical properties of the moderately cross-linked UHMWPE material were more than 20% lower than those reported in the literature under tensile test conditions. The pin-on-plate wear coefficient of moderately cross-linked UHMWPE was significantly dependent on the contact stress and the degree of cross-shear at the articulating surfaces. The computational wear predictions for the TKR from the current framework were consistent and in good agreement with the independent full TKR experimental wear simulation measurements, with a coefficient of determination of 0.94 for the framework.
In addition, the comprehensive combined experimental and computational framework was able to explain the complex experimental wear trends from the three different daily activities investigated. Therefore, such a framework can be adopted as a pre-clinical simulation approach to optimise different designs, materials, as well as patient's specific total knee replacements for a range of activities. Copyright © 2017. Published by Elsevier Ltd.
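Computational wear predictions of this general kind are often built on an Archard-type law with a coefficient that varies with contact conditions; a minimal sketch (the functional form and every constant below are hypothetical illustrations, not the paper's measured parameters):

```python
# Hedged sketch: Archard-type wear with a wear coefficient modulated by
# contact stress and cross-shear ratio. All constants are hypothetical.
def wear_coefficient(contact_stress_mpa, cross_shear):
    base = 1.0e-9  # mm^3 per (N mm), hypothetical baseline
    # Assumed trends: more cross-shear increases wear, higher contact
    # stress (for cross-linked UHMWPE) tends to reduce the coefficient.
    return base * (1.0 + 4.0 * cross_shear) / (1.0 + 0.05 * contact_stress_mpa)

def wear_volume(load_n, sliding_mm, contact_stress_mpa, cross_shear):
    # Archard: volume = k * load * sliding distance
    return wear_coefficient(contact_stress_mpa, cross_shear) * load_n * sliding_mm

# Hypothetical activity comparison (loads, distances, stresses invented):
walking = wear_volume(2600.0, 2.0e6, 20.0, 0.12)
deep_squat = wear_volume(3000.0, 5.0e5, 35.0, 0.02)
```

The point of the framework is that the coefficient is measured independently (pin-on-plate) as a function of these inputs, so the same law can then predict full-TKR wear across different daily activities.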
Constitutive formulations for the mechanical investigation of colonic tissues.
Carniel, Emanuele Luigi; Gramigna, Vera; Fontanella, Chiara Giulia; Stefanini, Cesare; Natali, Arturo N
2014-05-01
A constitutive framework is provided for the characterization of the mechanical behavior of colonic tissues, as a fundamental tool for the development of numerical models of the colonic structures. The constitutive analysis is performed by a multidisciplinary approach that requires cooperation between experimental and computational competences. The preliminary investigation pertains to the review of the tissues' histology. The complex structural configuration of the tissues and the specific distributions of fibrous elements entail the nonlinear mechanical behavior and the anisotropic response. The identification of the mechanical properties requires performing mechanical tests under different loading situations, such as different loading directions. Because of the typical functionality of colon structures, the tissue mechanics is investigated by tensile tests, which are performed on taenia coli and haustra specimens from fresh pig colons. Accounting for the histological investigation and the results from the mechanical tests, a specific hyperelastic framework is provided within the theory of fiber-reinforced composite materials. Preliminary analytical formulations are defined to identify the constitutive parameters by inverse analysis of the experimental tests. Finite element models of the specimens are developed accounting for the actual configuration of the colon structures to verify the quality of the results. The good agreement between experimental and numerical model results suggests the reliability of the constitutive formulations and parameters. Finally, the developed constitutive analysis makes it possible to identify the mechanical behavior and properties of the different colonic tissues. Copyright © 2013 Wiley Periodicals, Inc.
Suzuki, Ryo; Ito, Kohta; Lee, Taeyong; Ogihara, Naomichi
2017-12-01
Identifying the viscous properties of the plantar soft tissue is crucial not only for understanding the dynamic interaction of the foot with the ground during locomotion, but also for development of improved footwear products and therapeutic footwear interventions. In the present study, the viscous and hyperelastic material properties of the plantar soft tissue were experimentally identified using a spherical indentation test and an analytical contact model of the spherical indentation test. Force-relaxation curves of the heel pads were obtained from the indentation experiment. The curves were fit to the contact model incorporating a five-element Maxwell model to identify the viscous material parameters. The finite element method with the experimentally identified viscoelastic parameters could successfully reproduce the measured force-relaxation curves, indicating the material parameters were correctly estimated using the proposed method. Although there are some methodological limitations, the proposed framework to identify the viscous material properties may facilitate the development of subject-specific finite element modeling of the foot and other biological materials. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
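A five-element Maxwell model of the kind fitted above predicts force relaxation as an equilibrium term plus a sum of decaying exponentials (a Prony series); a minimal sketch with hypothetical parameters (the paper's fitted values are not reproduced here):

```python
import math

# Hedged sketch: force relaxation of a five-element Maxwell model
# (one equilibrium spring plus two spring-dashpot arms).
def relaxation_force(t, f_inf=2.0, arms=((3.0, 0.5), (1.5, 5.0))):
    # f_inf: equilibrium force in N; each arm: (force amplitude N, tau s).
    # All parameter values are hypothetical.
    return f_inf + sum(f * math.exp(-t / tau) for f, tau in arms)

f_instant = relaxation_force(0.0)    # peak force right after indentation
f_late = relaxation_force(60.0)      # force approaches the equilibrium value
```

Fitting means choosing f_inf and the (amplitude, tau) pairs so this curve matches the measured force-relaxation data from the spherical indentation test; the fitted taus are the viscous time constants carried into the finite element model.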
Tucker, George; Loh, Po-Ru; Berger, Bonnie
2013-10-04
Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. 
This framework is quite general, and many enhancements are likely possible. Fruitful future directions may include investigating more sophisticated schemes for converting spectral counts to probabilities and applying the framework to direct protein complex prediction methods.
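The sampling idea above can be sketched in a few lines (a toy illustration with an invented count-to-probability map; the paper's actual noise model is richer): map each bait-prey spectral count to an interaction probability, then draw an ensemble of binary outcome matrices to feed a binary-data method.

```python
import random

# Hedged sketch: saturating map from AP-MS spectral count to interaction
# probability (the half-saturation constant c50 is hypothetical), followed
# by ensemble sampling of binary experimental outcomes.
def count_to_prob(count, c50=5.0):
    return count / (count + c50)

def ensemble_frequency(count, n_samples=20000, seed=1):
    """Fraction of sampled outcomes in which the interaction is 'observed'."""
    rng = random.Random(seed)
    p = count_to_prob(count)
    hits = sum(1 for _ in range(n_samples) if rng.random() < p)
    return hits / n_samples

freq_low = ensemble_frequency(5)    # ambiguous count: appears in ~half
```

Each sampled binary outcome would be scored by the chosen existing PPI inference method, and the scores aggregated over the ensemble, so low-count interactions contribute with appropriately tempered confidence.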
Some peculiarities of interactions of weakly bound lithium nuclei at near-barrier energies
NASA Astrophysics Data System (ADS)
Kabyshev, A. M.; Kuterbekov, K. A.; Sobolev, Yu G.; Penionzhkevich, Yu E.; Kubenova, M. M.; Azhibekov, A. K.; Mukhambetzhan, A. M.; Lukyanov, S. M.; Maslov, V. A.; Kabdrakhimova, G. D.
2018-02-01
This paper presents new experimental data on the total cross sections of 9Li + 28Si reactions at low energies as well as the analysis of previously obtained data for 6,7Li. Based on a large collection of data (authors’ and literature data) we carried out a comparative analysis of the two main experimental interaction cross sections (angular distributions of the differential cross sections and total reaction cross sections) for weakly bound lithium (6-9Li, 11Li) nuclei in the framework of Kox parameterization and the macroscopic optical model. We identified specific features of these interactions and predicted the experimental trend in the total reaction cross sections for Li isotopes at energies close to the Coulomb barrier.
Willems, Sander; Fraiture, Marie-Alice; Deforce, Dieter; De Keersmaecker, Sigrid C J; De Loose, Marc; Ruttink, Tom; Herman, Philippe; Van Nieuwerburgh, Filip; Roosens, Nancy
2016-02-01
Because the number and diversity of genetically modified (GM) crops have significantly increased, their analysis based on real-time PCR (qPCR) methods is becoming increasingly complex and laborious. While several pioneers have already investigated Next Generation Sequencing (NGS) as an alternative to qPCR, its practical use has not been assessed for routine analysis. In this study a statistical framework was developed to predict the number of NGS reads needed to detect transgene sequences, to prove their integration into the host genome and to identify the specific transgene event in a sample with known composition. This framework was validated by applying it to experimental data from food matrices composed of pure GM rice, processed GM rice (noodles) or a 10% GM/non-GM rice mixture, revealing some influential factors. Finally, the feasibility of NGS for routine analysis of GM crops was investigated by applying the framework to samples commonly encountered in routine analysis of GM crops. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
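The detection part of such a framework reduces, in its simplest form, to a coverage calculation (a minimal sketch assuming uniform sampling of reads; the paper's model accounts for further influential factors):

```python
import math

# Hedged sketch: number of reads needed to see a transgene sequence at
# least once with the requested confidence, assuming each read comes from
# the transgene independently with probability transgene_fraction.
def reads_needed(transgene_fraction, confidence=0.99):
    # Require P(miss) = (1 - f)^N <= 1 - confidence, solve for N.
    return math.ceil(math.log(1.0 - confidence)
                     / math.log(1.0 - transgene_fraction))

n = reads_needed(0.001)   # 0.1% of reads originate from the transgene
```

For a 0.1% transgene fraction at 99% confidence this gives about 4.6 thousand reads; a ten-fold higher fraction cuts the requirement roughly ten-fold, which is why sample composition dominates sequencing-depth planning.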
Perotti, Luigi E; Ponnaluri, Aditya V S; Krishnamoorthi, Shankarjee; Balzani, Daniel; Ennis, Daniel B; Klug, William S
2017-11-01
Quantitative measurement of the material properties (eg, stiffness) of biological tissues is poised to become a powerful diagnostic tool. There are currently several methods in the literature for estimating material stiffness, and we extend this work by formulating a framework that leads to uniquely identified material properties. We design an approach to work with full-field displacement data; ie, we assume the displacement field due to the applied forces is known both on the boundaries and also within the interior of the body of interest, and we seek stiffness parameters that lead to balanced internal and external forces in a model. For in vivo applications, the displacement data can be acquired clinically using magnetic resonance imaging, while the forces may be computed from pressure measurements, eg, through catheterization. We outline a set of conditions under which the least-square force error objective function is convex, yielding uniquely identified material properties. An important component of our framework is a new numerical strategy to formulate polyconvex material energy laws that are linear in the material properties and provide one optimal description of the available experimental data. An outcome of our approach is the analysis of the reliability of the identified material properties, even for material laws that do not admit unique property identification. Lastly, we evaluate our approach using passive myocardium experimental data at the material point and show its application to identifying myocardial stiffness with an in silico experiment modeling the passive filling of the left ventricle. Copyright © 2017 John Wiley & Sons, Ltd.
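When the energy law is linear in the material parameters, the internal forces are linear in them too, so the least-square force error is an ordinary linear least-squares problem; a minimal two-parameter sketch (synthetic matrix and forces, not the paper's myocardium data):

```python
# Hedged sketch: with internal forces f_int = A @ theta (A built from the
# known displacement field), minimizing ||A @ theta - f_ext||^2 reduces to
# the normal equations. Closed form for two parameters:
def fit_two_parameters(A, f_ext):
    a11 = sum(r[0] * r[0] for r in A)
    a12 = sum(r[0] * r[1] for r in A)
    a22 = sum(r[1] * r[1] for r in A)
    b1 = sum(r[0] * f for r, f in zip(A, f_ext))
    b2 = sum(r[1] * f for r, f in zip(A, f_ext))
    det = a11 * a22 - a12 * a12          # nonzero iff parameters identifiable
    return [(a22 * b1 - a12 * b2) / det,
            (a11 * b2 - a12 * b1) / det]

# Synthetic check: external forces generated by theta = [2.0, 0.5]
# are recovered exactly from four force equations.
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, -1.0]]
f_ext = [r[0] * 2.0 + r[1] * 0.5 for r in A]
theta = fit_two_parameters(A, f_ext)
```

The determinant of the normal matrix is exactly where identifiability shows up: a (near-)zero determinant signals material parameters that the available displacement data cannot uniquely separate.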
Hua, Carol; Doheny, Patrick William; Ding, Bowen; Chan, Bun; Yu, Michelle; Kepert, Cameron J; D'Alessandro, Deanna M
2018-05-04
Understanding the nature of charge transfer mechanisms in 3-dimensional Metal-Organic Frameworks (MOFs) is an important goal owing to the possibility of harnessing this knowledge to design conductive frameworks. These materials have been implicated as the basis for the next generation of technological devices for applications in energy storage and conversion, including electrochromic devices, electrocatalysts, and battery materials. After nearly two decades of intense research into MOFs, the mechanisms of charge transfer remain relatively poorly understood, and new strategies to achieve charge mobility remain elusive and challenging to experimentally explore, validate and model. We now demonstrate that aromatic stacking interactions in Zn(II) frameworks containing cofacial thiazolo[5,4-d]thiazole units lead to a mixed-valence state upon electrochemical or chemical reduction. This through-space Intervalence Charge Transfer (IVCT) phenomenon represents a new mechanism for charge delocalisation in MOFs. Computational modelling of the optical data combined with application of Marcus-Hush theory to the IVCT bands for the mixed-valence framework has enabled quantification of the degree of delocalisation using both in situ and ex situ electro- and spectro-electrochemical methods. A distance dependence for the through-space electron transfer has also been identified on the basis of experimental studies and computational calculations. This work provides a new window into electron transfer phenomena in 3-dimensional coordination space, of relevance to electroactive MOFs where new mechanisms for charge transfer are highly sought after, and to understanding biological light harvesting systems where through-space mixed-valence interactions are operative.
Assembling evidence for identifying reservoirs of infection
Viana, Mafalda; Mancy, Rebecca; Biek, Roman; Cleaveland, Sarah; Cross, Paul C.; Lloyd-Smith, James O.; Haydon, Daniel T.
2014-01-01
Many pathogens persist in multihost systems, making the identification of infection reservoirs crucial for devising effective interventions. Here, we present a conceptual framework for classifying patterns of incidence and prevalence, and review recent scientific advances that allow us to study and manage reservoirs simultaneously. We argue that interventions can have a crucial role in enriching our mechanistic understanding of how reservoirs function and should be embedded as quasi-experimental studies in adaptive management frameworks. Single approaches to the study of reservoirs are unlikely to generate conclusive insights whereas the formal integration of data and methodologies, involving interventions, pathogen genetics, and contemporary surveillance techniques, promises to open up new opportunities to advance understanding of complex multihost systems. PMID:24726345
Supervised Semantic Classification for Nuclear Proliferation Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vatsavai, Raju; Cheriyadat, Anil M; Gleason, Shaun Scott
2010-01-01
Existing feature extraction and classification approaches are not suitable for monitoring proliferation activity using high-resolution multi-temporal remote sensing imagery. In this paper we present a supervised semantic labeling framework based on the Latent Dirichlet Allocation method. This framework is used to analyze over 120 images collected under different spatial and temporal settings over the globe, representing three major semantic categories: airports, nuclear power plants, and coal power plants. Initial experimental results show reasonable discrimination of these three categories, even though coal and nuclear images share many common and overlapping objects. This research also identified several research challenges associated with nuclear proliferation monitoring using high-resolution remote sensing images.
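One hedged way to read the pipeline: unsupervised LDA compresses bag-of-visual-words counts into low-dimensional topic mixtures, and a supervised classifier labels the mixtures. The sketch below uses synthetic counts and scikit-learn; the feature construction and classifier choice are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Stand-in data: each "image" is a bag-of-visual-words count vector
# (in practice the words come from clustered low-level image features).
# Three synthetic categories with distinct word-usage profiles.
n_per, vocab = 40, 50
profiles = rng.dirichlet(np.ones(vocab) * 0.3, size=3)
X = np.vstack([rng.multinomial(300, profiles[c], size=n_per)
               for c in range(3)])
y = np.repeat([0, 1, 2], n_per)  # airport / nuclear / coal labels

# Unsupervised LDA maps counts to topic mixtures...
topics = LatentDirichletAllocation(n_components=5, random_state=0)
Z = topics.fit_transform(X)

# ...and a simple supervised classifier on the topic mixtures
# produces the semantic label.
clf = LogisticRegression(max_iter=1000).fit(Z, y)
train_acc = clf.score(Z, y)
```

On well-separated synthetic profiles the topic mixtures are nearly sufficient statistics for the category, which is the intuition behind using LDA as a semantic feature extractor.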
Lin, Chun-Yu; Zhang, Lipeng; Zhao, Zhenghang; Xia, Zhenhai
2017-05-01
Covalent organic frameworks (COFs), an emerging class of framework materials linked by covalent bonds, hold potential for various applications such as efficient electrocatalysts, photovoltaics, and sensors. To rationally design COF-based electrocatalysts for oxygen reduction and evolution reactions in fuel cells and metal-air batteries, activity descriptors, derived from orbital energy and bonding structures, are identified with first-principles calculations for the COFs, which correlate COF structures with their catalytic activities. The calculations also predict that alkaline-earth metal-porphyrin COFs could catalyze the direct production of H2O2, a green oxidizer and an energy carrier. These predictions are supported by experimental data, and the design principles derived from the descriptors provide an approach for the rational design of new electrocatalysts for both clean energy conversion and green oxidizer production. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A conceptual review of decision making in social dilemmas: applying a logic of appropriateness.
Weber, J Mark; Kopelman, Shirli; Messick, David M
2004-01-01
Despite decades of experimental social dilemma research, "theoretical integration has proven elusive" (Smithson & Foddy, 1999, p. 14). To advance a theory of decision making in social dilemmas, this article provides a conceptual review of the literature that applies a "logic of appropriateness" (March, 1994) framework. The appropriateness framework suggests that people making decisions ask themselves (explicitly or implicitly), "What does a person like me do in a situation like this?" This question identifies 3 significant factors: recognition and classification of the kind of situation encountered, the identity of the individual making the decision, and the application of rules or heuristics in guiding behavioral choice. In contrast with dominant rational choice models, the proposed appropriateness framework accommodates the inherently social nature of social dilemmas and the role of rule- and heuristic-based processing. Implications for the interpretation of past findings and the direction of future research are discussed.
Questioning and Experimentation
ERIC Educational Resources Information Center
Mutanen, Arto
2014-01-01
The paper is a philosophical analysis of experimentation. The philosophical framework of the analysis is the interrogative model of inquiry developed by Hintikka. The basis of the model is explicit and well-formed logic of questions and answers. The framework allows us to formulate a flexible logic of experimentation. In particular, the formulated…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyer, M. D.; Andre, R.; Gates, D. A.
The high-performance operational goals of NSTX-U will require development of advanced feedback control algorithms, including control of βN and the safety factor profile. In this work, a novel approach to simultaneously controlling βN and the value of the safety factor on the magnetic axis, q0, through manipulation of the plasma boundary shape and total beam power, is proposed. Simulations of the proposed scheme show promising results and motivate future experimental implementation and eventual integration into a more complex current profile control scheme planned to include actuation of individual beam powers, density, and loop voltage. As part of this work, a flexible framework for closed-loop simulations within the high-fidelity code TRANSP was developed. The framework, used here to identify control-design-oriented models and to tune and test the proposed controller, exploits many of the predictive capabilities of TRANSP and provides a means for performing control calculations based on user-supplied data (controller matrices, target waveforms, etc.). The flexible framework should enable high-fidelity testing of a variety of control algorithms, thereby reducing the amount of expensive experimental time needed to implement new control algorithms on NSTX-U and other devices.
NASA Astrophysics Data System (ADS)
Boyer, M. D.; Andre, R.; Gates, D. A.; Gerhardt, S.; Goumiri, I. R.; Menard, J.
2015-05-01
The high-performance operational goals of NSTX-U will require development of advanced feedback control algorithms, including control of βN and the safety factor profile. In this work, a novel approach to simultaneously controlling βN and the value of the safety factor on the magnetic axis, q0, through manipulation of the plasma boundary shape and total beam power, is proposed. Simulations of the proposed scheme show promising results and motivate future experimental implementation and eventual integration into a more complex current profile control scheme planned to include actuation of individual beam powers, density, and loop voltage. As part of this work, a flexible framework for closed loop simulations within the high-fidelity code TRANSP was developed. The framework, used here to identify control-design-oriented models and to tune and test the proposed controller, exploits many of the predictive capabilities of TRANSP and provides a means for performing control calculations based on user-supplied data (controller matrices, target waveforms, etc). The flexible framework should enable high-fidelity testing of a variety of control algorithms, thereby reducing the amount of expensive experimental time needed to implement new control algorithms on NSTX-U and other devices.
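The kind of control calculation described, driving two outputs (βN, q0) with two actuators (boundary shape, total beam power) through an identified response model, can be sketched with a toy static plant and a decoupling integral controller. The response matrix and gains below are invented stand-ins, not TRANSP-identified models.

```python
import numpy as np

# Toy 2x2 linear response: inputs (boundary-shape parameter, total
# beam power) -> outputs (beta_N, q0). B is a made-up stand-in for a
# control-design-oriented model identified from simulations.
B = np.array([[0.8, 0.3],
              [-0.2, 0.5]])
target = np.array([2.5, 1.2])   # desired (beta_N, q0)

y = np.array([2.0, 1.5])        # current outputs
u = np.zeros(2)                 # actuator commands
Ki = 0.5                        # integral gain

for _ in range(200):
    err = target - y
    # Decoupling integral control: invert the response model so each
    # output error is corrected independently.
    u = u + Ki * np.linalg.solve(B, err)
    y = B @ u                   # static plant response (toy model)
```

With a perfect model the tracking error contracts by a factor (1 - Ki) per step; model mismatch in practice slows but does not destroy convergence for moderate gains, which is one reason integral action is a common starting point for such schemes.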
Bosch, Thomas C. G.; Adamska, Maja; Augustin, René; Domazet-Loso, Tomislav; Foret, Sylvain; Fraune, Sebastian; Funayama, Noriko; Grasis, Juris; Hamada, Mayuko; Hatta, Masayuki; Hobmayer, Bert; Kawai, Kotoe; Klimovich, Alexander; Manuel, Michael; Shinzato, Chuya; Technau, Uli; Yum, Seungshic; Miller, David J.
2014-01-01
Ecological developmental biology (eco-devo) explores the mechanistic relationships between the processes of individual development and environmental factors. Recent studies imply that some of these relationships have deep evolutionary origins, and may even predate the divergences of the simplest extant animals, including cnidarians and sponges. Development of these early diverging metazoans is often sensitive to environmental factors, and these interactions occur in the context of conserved signaling pathways and mechanisms of tissue homeostasis whose detailed molecular logic remains elusive. Efficient methods for transgenesis in cnidarians, together with the ease of experimental manipulation in cnidarians and sponges, make them ideal models for understanding causal relationships between environmental factors and developmental mechanisms. Here, we identify major questions at the interface between animal evolution and development and outline a road map for research aimed at identifying the mechanisms that link environmental factors to developmental mechanisms in early diverging metazoans. PMID:25205353
Experimentation in software engineering
NASA Technical Reports Server (NTRS)
Basili, V. R.; Selby, R. W.; Hutchens, D. H.
1986-01-01
Experimentation in software engineering supports the advancement of the field through an iterative learning process. In this paper, a framework for analyzing most of the experimental work performed in software engineering over the past several years is presented. A variety of experiments within the framework are described, and their contributions to the software engineering discipline are discussed. Some useful recommendations for the application of the experimental process in software engineering are included.
Tricco, Andrea C; Cogo, Elise; Ashoor, Huda; Perrier, Laure; McKibbon, K Ann; Grimshaw, Jeremy M; Straus, Sharon E
2013-05-14
Knowledge translation (KT; also known as research utilisation, translational medicine and implementation science) is a dynamic and iterative process that includes the synthesis, dissemination, exchange and ethically sound application of knowledge to improve health. After the implementation of KT interventions, their impact on relevant outcomes should be monitored. The objectives of this scoping review are to: (1) conduct a systematic search of the literature to identify the impact of KT interventions targeting chronic disease management on healthcare outcomes beyond 1 year, or beyond the termination of funding of the initiative, for end-users including patients, clinicians, public health officials, health services managers and policy-makers; (2) identify factors that influence sustainability of effective KT interventions; (3) identify how sustained change from KT interventions should be measured; and (4) develop a framework for assessing sustainability of KT interventions. Comprehensive searches of relevant electronic databases (eg, MEDLINE, EMBASE, Cochrane Central Register of Controlled Trials), websites of funding agencies and websites of healthcare provider organisations will be conducted to identify relevant material. We will include experimental, quasi-experimental and observational studies providing information on the sustainability of KT interventions targeting chronic disease management in adults and focusing on end-users including patients, clinicians, public health officials, health services managers and policy-makers. Two reviewers will pilot-test the screening criteria and data abstraction form. They will then screen all citations, full articles and abstract data in duplicate independently. The results of the scoping review will be synthesised descriptively and used to develop a framework to assess the sustainability of KT interventions.
Our results will help inform end-users (ie, patients, clinicians, public health officials, health services managers and policy-makers) regarding the sustainability of KT interventions. Our dissemination plan includes publications, presentations, website posting and a stakeholder meeting.
NASA Technical Reports Server (NTRS)
Chudnovsky, A.
1984-01-01
A damage parameter is introduced in addition to the conventional parameters of continuum mechanics, and a crack surrounded by an array of microdefects is considered within the continuum mechanics framework. A system consisting of the main crack and the surrounding damage is called a crack layer (CL). Crack layer propagation is an irreversible process. The general framework of the thermodynamics of irreversible processes is employed to identify the driving forces (causes) and to derive the constitutive equation of CL propagation, that is, the relationship between the rates of crack growth and damage dissemination on one side and the conjugate thermodynamic forces on the other. The proposed law of CL propagation is in good agreement with experimental data on fatigue CL propagation in various materials. The theory also yields a characterization of material toughness.
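In the linear-response regime of irreversible thermodynamics, a constitutive law of this kind is conventionally sketched in Onsager form; the coefficients below are generic placeholders illustrating the structure, not values from the report:

```latex
\begin{aligned}
\dot{\ell} &= L_{11}\, X_{\ell} + L_{12}\, X_{D}, \\
\dot{D}    &= L_{21}\, X_{\ell} + L_{22}\, X_{D}, \qquad L_{12} = L_{21},
\end{aligned}
```

where $\dot{\ell}$ and $\dot{D}$ are the rates of crack growth and damage dissemination, $X_{\ell}$ and $X_{D}$ are the conjugate thermodynamic driving forces, and the $L_{ij}$ are phenomenological Onsager coefficients coupling the two dissipative processes.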
NASA Technical Reports Server (NTRS)
Chudnovsky, A.
1987-01-01
A damage parameter is introduced in addition to the conventional parameters of continuum mechanics, and a crack surrounded by an array of microdefects is considered within the continuum mechanics framework. A system consisting of the main crack and the surrounding damage is called a crack layer (CL). Crack layer propagation is an irreversible process. The general framework of the thermodynamics of irreversible processes is employed to identify the driving forces (causes) and to derive the constitutive equation of CL propagation, that is, the relationship between the rates of crack growth and damage dissemination on one side and the conjugate thermodynamic forces on the other. The proposed law of CL propagation is in good agreement with experimental data on fatigue CL propagation in various materials. The theory also yields a characterization of material toughness.
Evolution equation for quantum coherence
Hu, Ming-Liang; Fan, Heng
2016-01-01
The estimation of the decoherence process of an open quantum system is of both theoretical significance and experimental appeal. Practically, decoherence can be easily estimated if the coherence evolution satisfies some simple relations. We introduce a framework for studying the evolution equation of coherence. Based on this framework, we prove a simple factorization relation (FR) for the l1 norm of coherence and identify the sets of quantum channels for which this FR holds. Using this FR, we further determine the condition on the transformation matrix of the quantum channel that can support permanent freezing of the l1 norm of coherence. We finally reveal the universality of this FR by showing that it holds for many other related coherence and quantum correlation measures. PMID:27382933
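The flavor of such a factorization relation can be checked numerically in the simplest case, a qubit phase-damping channel, where the l1 norm of coherence of any input state shrinks by the same state-independent factor. This is a toy check of the factorized form, not a reproduction of the paper's general proof.

```python
import numpy as np

def l1_coherence(rho):
    """l1 norm of coherence: sum of absolute off-diagonal entries."""
    return np.abs(rho).sum() - np.abs(np.diag(rho)).sum()

def dephase(rho, lam):
    """Qubit phase-damping channel: off-diagonals shrink by sqrt(1 - lam)."""
    out = rho.astype(complex).copy()
    s = np.sqrt(1.0 - lam)
    out[0, 1] *= s
    out[1, 0] *= s
    return out

rng = np.random.default_rng(2)
lam = 0.4
ratios = []
for _ in range(5):
    # random pure qubit state
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)
    rho = np.outer(psi, psi.conj())
    c0 = l1_coherence(rho)
    if c0 > 1e-6:  # skip (improbable) near-incoherent draws
        ratios.append(l1_coherence(dephase(rho, lam)) / c0)
# Every ratio equals sqrt(1 - lam): the coherence evolution factorizes
# into (channel-dependent factor) x (initial coherence).
```

Knowing the channel-dependent factor alone then suffices to predict the coherence decay of arbitrary inputs, which is exactly why such relations make decoherence easy to estimate.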
Finger Vein Recognition Based on a Personalized Best Bit Map
Yang, Gongping; Xi, Xiaoming; Yin, Yilong
2012-01-01
Finger vein patterns have recently been recognized as an effective biometric identifier. In this paper, we propose a finger vein recognition method based on a personalized best bit map (PBBM). Our method is rooted in a local binary pattern based method and then inclined to use the best bits only for matching. We first present the concept of PBBM and the generating algorithm. Then we propose the finger vein recognition framework, which consists of preprocessing, feature extraction, and matching. Finally, we design extensive experiments to evaluate the effectiveness of our proposal. Experimental results show that PBBM achieves not only better performance, but also high robustness and reliability. In addition, PBBM can be used as a general framework for binary pattern based recognition. PMID:22438735
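The best-bit idea can be sketched in a few lines: bit positions that are stable across a user's enrollment codes form a personal mask, and matching uses Hamming distance restricted to that mask. The toy binary codes below are invented for illustration and stand in for LBP codes of finger-vein images.

```python
import numpy as np

def best_bit_map(enroll_codes, stability=1.0):
    """Personalized best-bit map: keep only bit positions on which a
    user's enrollment binary codes (nearly) all agree."""
    codes = np.asarray(enroll_codes)
    agree = codes.mean(axis=0)
    return (agree >= stability) | (agree <= 1.0 - stability)

def masked_hamming(a, b, mask):
    """Hamming distance restricted to the personalized best bits."""
    return int(np.count_nonzero((a != b) & mask))

# Toy binary codes standing in for LBP codes of one user's samples.
enroll = np.array([[1, 0, 1, 1, 0, 0],
                   [1, 0, 1, 0, 0, 0],   # bit 3 is unstable
                   [1, 0, 1, 1, 0, 1]])  # bit 5 is unstable
mask = best_bit_map(enroll)              # bits 0, 1, 2, 4 retained

probe_same = np.array([1, 0, 1, 0, 0, 1])   # genuine, noisy only on unstable bits
probe_other = np.array([0, 1, 0, 1, 1, 1])  # impostor
d_same = masked_hamming(enroll[0], probe_same, mask)
d_other = masked_hamming(enroll[0], probe_other, mask)
```

Discarding the unstable bits removes intra-class noise while leaving inter-class differences intact, which is the intuition behind the reported robustness gains.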
Towards a data-driven analysis of hadronic light-by-light scattering
NASA Astrophysics Data System (ADS)
Colangelo, Gilberto; Hoferichter, Martin; Kubis, Bastian; Procura, Massimiliano; Stoffer, Peter
2014-11-01
The hadronic light-by-light contribution to the anomalous magnetic moment of the muon was recently analyzed in the framework of dispersion theory, providing a systematic formalism where all input quantities are expressed in terms of on-shell form factors and scattering amplitudes that are in principle accessible in experiment. We briefly review the main ideas behind this framework and discuss the various experimental ingredients needed for the evaluation of one- and two-pion intermediate states. In particular, we identify processes that in the absence of data for doubly-virtual pion-photon interactions can help constrain parameters in the dispersive reconstruction of the relevant input quantities, the pion transition form factor and the helicity partial waves for γ*γ* → ππ.
Barbarich-Marsteller, Nicole C.; Underwood, Mark D.; Foltin, Richard W.; Myers, Michael M.; Walsh, B. Timothy; Barrett, Jeffrey S.; Marsteller, Douglas A.
2018-01-01
Objective: Activity-based anorexia is a translational rodent model that results in severe weight loss, hyperactivity, and voluntary self-starvation. The goal of our investigation was to identify vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats. Method: Sprague-Dawley rats were maintained under conditions of restricted access to food (N = 64; or unlimited access, N = 16) until experimental exit, predefined as a target weight loss of 30–35% or meeting predefined criteria for animal health. Nonlinear mixed effects statistical modeling was used to describe wheel running behavior, time to event analysis was used to assess experimental exit, and a regressive partitioning algorithm was used to classify phenotypes. Results: Objective criteria were identified for distinguishing novel phenotypes of activity-based anorexia, including a vulnerable phenotype that conferred maximal hyperactivity, minimal food intake, and the shortest time to experimental exit, and a resistant phenotype that conferred minimal activity and the longest time to experimental exit. Discussion: The identification of objective criteria for defining vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats provides an important framework for studying the neural mechanisms that promote vulnerability to or protection against the development of self-starvation and hyperactivity during adolescence. Ultimately, future studies using these novel phenotypes may provide important translational insights into the mechanisms that promote these maladaptive behaviors characteristic of anorexia nervosa. PMID:23853140
Barbarich-Marsteller, Nicole C; Underwood, Mark D; Foltin, Richard W; Myers, Michael M; Walsh, B Timothy; Barrett, Jeffrey S; Marsteller, Douglas A
2013-11-01
Activity-based anorexia is a translational rodent model that results in severe weight loss, hyperactivity, and voluntary self-starvation. The goal of our investigation was to identify vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats. Sprague-Dawley rats were maintained under conditions of restricted access to food (N = 64; or unlimited access, N = 16) until experimental exit, predefined as a target weight loss of 30-35% or meeting predefined criteria for animal health. Nonlinear mixed effects statistical modeling was used to describe wheel running behavior, time to event analysis was used to assess experimental exit, and a regressive partitioning algorithm was used to classify phenotypes. Objective criteria were identified for distinguishing novel phenotypes of activity-based anorexia, including a vulnerable phenotype that conferred maximal hyperactivity, minimal food intake, and the shortest time to experimental exit, and a resistant phenotype that conferred minimal activity and the longest time to experimental exit. The identification of objective criteria for defining vulnerable and resistant phenotypes of activity-based anorexia in adolescent female rats provides an important framework for studying the neural mechanisms that promote vulnerability to or protection against the development of self-starvation and hyperactivity during adolescence. Ultimately, future studies using these novel phenotypes may provide important translational insights into the mechanisms that promote these maladaptive behaviors characteristic of anorexia nervosa. Copyright © 2013 Wiley Periodicals, Inc.
Greaves, Lorraine; Jategaonkar, Natasha
2006-09-01
This article assesses the effects of comprehensive tobacco control policies on diverse subpopulations of girls and women who are at increased vulnerability to tobacco use because of disadvantage. The authors report on a recent assessment of experimental literature examining tobacco taxation; smoking location restrictions in public and private spaces; and sales restrictions. A comprehensive search was undertaken to identify relevant studies and evaluation reports. Gender-based and diversity analyses were performed to identify pertinent sex differences and gender influences that would affect the application and impact of the policy. Finally, the results were contextualised within the wider literature on women's tobacco use and women's health. The authors consider not only the intended policy effects, but also explicitly examine the gendered and/or unintended consequences of these policies on other aspects of girls' and women's health and wellbeing. A framework for developing gender-sensitive tobacco programmes and policies for low-income girls and women is provided.
Greaves, Lorraine; Jategaonkar, Natasha
2006-01-01
This article assesses the effects of comprehensive tobacco control policies on diverse subpopulations of girls and women who are at increased vulnerability to tobacco use because of disadvantage. The authors report on a recent assessment of experimental literature examining tobacco taxation; smoking location restrictions in public and private spaces; and sales restrictions. A comprehensive search was undertaken to identify relevant studies and evaluation reports. Gender-based and diversity analyses were performed to identify pertinent sex differences and gender influences that would affect the application and impact of the policy. Finally, the results were contextualised within the wider literature on women's tobacco use and women's health. The authors consider not only the intended policy effects, but also explicitly examine the gendered and/or unintended consequences of these policies on other aspects of girls' and women's health and wellbeing. A framework for developing gender-sensitive tobacco programmes and policies for low-income girls and women is provided. PMID:17708012
Schindlbeck, Christopher; Pape, Christian; Reithmeier, Eduard
2018-04-16
Alignment of optical components is crucial for the assembly of optical systems to ensure their full functionality. In this paper we present a novel predictor-corrector framework for the sequential assembly of serial optical systems. Therein, we use a hybrid optical simulation model that comprises virtual and identified component positions. The hybrid model is constantly adapted throughout the assembly process with the help of nonlinear identification techniques and wavefront measurements. This enables prediction of the future wavefront at the detector plane and therefore allows for taking corrective measures accordingly during the assembly process if a user-defined tolerance on the wavefront error is violated. We present a novel notation for the so-called hybrid model and outline the work flow of the presented predictor-corrector framework. A beam expander is assembled as demonstrator for experimental verification of the framework. The optical setup consists of a laser, two bi-convex spherical lenses each mounted to a five degree-of-freedom stage to misalign and correct components, and a Shack-Hartmann sensor for wavefront measurements.
Origin and Consequences of the Relationship between Protein Mean and Variance
Vallania, Francesco Luigi Massimo; Sherman, Marc; Goodwin, Zane; Mogno, Ilaria; Cohen, Barak Alon; Mitra, Robi David
2014-01-01
Cell-to-cell variance in protein levels (noise) is a ubiquitous phenomenon that can increase fitness by generating phenotypic differences within clonal populations of cells. An important challenge is to identify the specific molecular events that control noise. This task is complicated by the strong dependence of a protein's cell-to-cell variance on its mean expression level through a power-law like relationship (σ² ∝ μ^1.69). Here, we dissect the nature of this relationship using a stochastic model parameterized with experimentally measured values. This framework naturally recapitulates the power-law like relationship (σ² ∝ μ^1.6) and accurately predicts protein variance across the yeast proteome (r² = 0.935). Using this model we identified two distinct mechanisms by which protein variance can be increased. Variables that affect promoter activation, such as nucleosome positioning, increase protein variance by changing the exponent of the power-law relationship. In contrast, variables that affect processes downstream of promoter activation, such as mRNA and protein synthesis, increase protein variance in a mean-dependent manner following the power-law. We verified our findings experimentally using an inducible gene expression system in yeast. We conclude that the power-law-like relationship between noise and protein mean is due to the kinetics of promoter activation. Our results provide a framework for understanding how molecular processes shape stochastic variation across the genome. PMID:25062021
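The reported power-law exponent is what a log-log regression of variance against mean recovers; a sketch on synthetic data (the lognormal scatter model is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic proteome: mean expression levels spanning four decades,
# with variance following sigma^2 = a * mu^beta (beta near the
# reported ~1.69) plus multiplicative scatter.
beta_true, a = 1.69, 0.5
mu = 10 ** rng.uniform(1, 5, size=500)
var = a * mu ** beta_true * np.exp(0.1 * rng.normal(size=500))

# The exponent is the slope of log(variance) against log(mean).
slope, intercept = np.polyfit(np.log(mu), np.log(var), 1)
```

Changing the exponent (promoter-activation variables) versus moving along the power law (downstream synthesis variables) are then distinguishable as a slope change versus a shift in this log-log plot.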
Ikeda-like chaos on a dynamically filtered supercontinuum light source
NASA Astrophysics Data System (ADS)
Chembo, Yanne K.; Jacquot, Maxime; Dudley, John M.; Larger, Laurent
2016-08-01
We demonstrate temporal chaos in a color-selection mechanism from the visible spectrum of a supercontinuum light source. The color-selection mechanism is governed by an acousto-optoelectronic nonlinear delayed-feedback scheme modeled by an Ikeda-like equation. Initially motivated by the design of a broad audience live demonstrator in the framework of the International Year of Light 2015, the setup also provides a different experimental tool to investigate the dynamical complexity of delayed-feedback dynamics. Deterministic hyperchaos is analyzed here from the experimental time series. A projection method identifies the delay parameter, for which the chaotic strange attractor originally evolving in an infinite-dimensional phase space can be revealed in a two-dimensional subspace.
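An Ikeda-like delayed-feedback equation of the kind mentioned can be integrated with a simple Euler scheme and a delay buffer; the parameter values below are illustrative stand-ins, not those of the acousto-optoelectronic setup.

```python
import numpy as np

def simulate_ikeda(beta, phi=0.25, tau=1.0, eps=0.05, dt=1e-3, t_end=50.0):
    """Euler integration of the Ikeda-like delay equation
        eps * dx/dt = -x(t) + beta * sin^2(x(t - tau) + phi),
    returning the trajectory after discarding a transient."""
    n_delay = int(tau / dt)
    n_steps = int(t_end / dt)
    x = np.zeros(n_steps + n_delay)
    x[:n_delay] = 0.1  # constant initial history
    for i in range(n_delay, n_steps + n_delay - 1):
        feedback = beta * np.sin(x[i - n_delay] + phi) ** 2
        x[i + 1] = x[i] + dt * (-x[i] + feedback) / eps
    return x[n_delay + n_steps // 2:]  # drop the transient half

# Low feedback gain settles to a fixed point; high gain yields
# sustained irregular oscillations with much larger spread.
quiet = simulate_ikeda(beta=0.5)
wild = simulate_ikeda(beta=4.0)
```

The gain parameter beta plays the role of the feedback strength in the optoelectronic loop: sweeping it takes the system from a steady state through oscillations to the hyperchaotic regime analyzed in the paper.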
Fusion and Sense Making of Heterogeneous Sensor Network and Other Sources
2017-03-16
...multimodal fusion framework that uses both training data and web resources for scene classification; the experimental results on the benchmark datasets ... show that the proposed text-aided scene classification framework could significantly improve classification performance. Experimental results also show ... human, whose adaptability is achieved by reliability-dependent weighting of different sensory modalities. Experimental results show that the proposed ...
NASA Astrophysics Data System (ADS)
Launay, Jean; Hivet, Gilles; Vu Duong, Ahn; Boisse, Philippe
2007-04-01
Two tests are mainly used to identify the shear behavior of fabrics: the "picture frame" test, which uses a lozenge framework made of four rigid, articulated bars, and the "bias test", which is a tensile test on a sample with an initial 45° angle between the yarns and the edges. The picture frame test is the more commonly used because the whole specimen is theoretically in a pure shear state. Nevertheless, the absence of tension in the woven reinforcement presupposes perfect alignment of the fibres and correct positioning of the clamping points with regard to the framework articulations. In addition, it is often necessary in practice to impose an initial tension, which is not quantified and whose consequences are ignored in the classical picture frame test. An experimental device that makes it possible to measure the tensions during the test has been developed. Different types of tests on different fabrics have been performed. The results presented here concern a Twintex fabric that was selected for a shear benchmark. Thanks to this device, it is shown that tensions play an important role in plane shear behaviour.
NASA Astrophysics Data System (ADS)
Wen, Di; Ding, Xiaoqing
2003-12-01
In this paper we propose a general framework for character segmentation in complex multilingual documents, an endeavor to combine the traditionally separated segmentation and recognition processes into a cooperative system. The framework contains three basic steps: Dissection, Local Optimization and Global Optimization, which fuse various properties of the segmentation hypotheses hierarchically into a composite evaluation that decides the final recognition results. Experimental results show that this framework is general enough to be applied to a variety of documents. A sample system based on this framework for recognizing Chinese, Japanese and Korean documents is described, and its experimental performance is reported.
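In much simplified form, fusing segmentation hypotheses into a final decision can be viewed as a best-path search over candidate cut points. The sketch below (with invented confidence scores, not the paper's evaluation functions) picks the segmentation that maximizes the summed per-segment recognition score by dynamic programming:

```python
def best_segmentation(scores, n):
    """Choose cut points 0 = c0 < c1 < ... = n maximizing the summed
    recognition score of the resulting segments. scores[(i, j)] is the
    confidence that span i..j is one character; missing spans are invalid."""
    best = [float('-inf')] * (n + 1)
    best[0] = 0.0
    back = [0] * (n + 1)
    for j in range(1, n + 1):
        for i in range(j):
            s = scores.get((i, j))
            if s is not None and best[i] + s > best[j]:
                best[j] = best[i] + s
                back[j] = i
    segments, j = [], n
    while j > 0:
        segments.append((back[j], j))
        j = back[j]
    return best[n], segments[::-1]

# toy lattice over a 3-unit span: one wide character vs. two narrower ones
scores = {(0, 3): 0.7, (0, 1): 0.6, (1, 3): 0.5, (1, 2): 0.2, (2, 3): 0.3}
total, segments = best_segmentation(scores, 3)
```

Here the two-segment path (0,1)+(1,3) wins with a total score of 1.1, illustrating how recognition confidences, rather than dissection alone, drive the final cut.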
Investigation of effective strategies for developing creative science thinking
NASA Astrophysics Data System (ADS)
Yang, Kuay-Keng; Lee, Ling; Hong, Zuway-R.; Lin, Huann-shyang
2016-09-01
The purpose of this study was to explore the effectiveness of creative inquiry-based science teaching on students' creative science thinking and science inquiry performance. A quasi-experimental design consisting of one experimental group (N = 20) and one comparison group (N = 24), with pretest and post-test, was conducted. The framework of the intervention focused on strategies recommended by the literature, such as promoting divergent and convergent thinking and providing an open, inquiry-based learning environment. Results revealed that the experimental group students outperformed their counterparts in the comparison group in science inquiry and convergent thinking. Additional qualitative analyses of classroom observations and case teacher interviews identified supportive teaching strategies (e.g. facilitating associative thinking, sharing impressive ideas, encouraging evidence-based conclusions, and reviewing and commenting on group presentations) for developing students' creative science thinking.
Molecular system identification for enzyme directed evolution and design
NASA Astrophysics Data System (ADS)
Guan, Xiangying; Chakrabarti, Raj
2017-09-01
The rational design of chemical catalysts requires methods for the measurement of free energy differences in the catalytic mechanism for any given catalyst Hamiltonian. The scope of experimental learning algorithms that can be applied to catalyst design would also be expanded by the availability of such methods. Methods for catalyst characterization typically either estimate apparent kinetic parameters that do not necessarily correspond to free energy differences in the catalytic mechanism or measure individual free energy differences that are not sufficient for establishing the relationship between the potential energy surface and catalytic activity. Moreover, in order to enhance the duty cycle of catalyst design, statistically efficient methods for the estimation of the complete set of free energy differences relevant to the catalytic activity based on high-throughput measurements are preferred. In this paper, we present a theoretical and algorithmic system identification framework for the optimal estimation of free energy differences in solution phase catalysts, with a focus on one- and two-substrate enzymes. This framework, which can be automated using programmable logic, prescribes a choice of feasible experimental measurements and manipulated input variables that identify the complete set of free energy differences relevant to the catalytic activity and minimize the uncertainty in these free energy estimates for each successive Hamiltonian design. The framework also employs decision-theoretic logic to determine when model reduction can be applied to improve the duty cycle of high-throughput catalyst design. Automation of the algorithm using fluidic control systems is proposed, and applications of the framework to the problem of enzyme design are discussed.
Song, Jiangning; Li, Fuyi; Takemoto, Kazuhiro; Haffari, Gholamreza; Akutsu, Tatsuya; Chou, Kuo-Chen; Webb, Geoffrey I
2018-04-14
Determining the catalytic residues in an enzyme is critical to our understanding of the relationship between protein sequence, structure, and function, and to enhancing our ability to design novel enzymes and their inhibitors. Although many enzymes have been sequenced and their primary and tertiary structures determined, experimental methods for enzyme functional characterization lag behind. Because experimental methods for identifying catalytic residues are resource- and labor-intensive, computational approaches have considerable value and are highly desirable for their ability to complement experimental studies in identifying catalytic residues and helping to bridge the sequence-structure-function gap. In this study, we describe a new computational method, PREvaIL, for predicting enzyme catalytic residues. The method leverages a comprehensive set of informative features extracted at multiple levels, including sequence, structure, and residue-contact network, in a random forest machine-learning framework. Extensive benchmarking on eight different datasets, based on 10-fold cross-validation and independent tests, as well as side-by-side performance comparisons with seven modern sequence- and structure-based methods, showed that PREvaIL achieved competitive predictive performance, with areas under the receiver operating characteristic curve and the precision-recall curve ranging from 0.896 to 0.973 and from 0.294 to 0.523, respectively. We demonstrate that the method captures useful signals arising from different levels, and that leveraging such differential but complementary features significantly improves catalytic residue prediction. We believe this new method can serve as a valuable tool both for understanding the complex sequence-structure-function relationships of proteins and for facilitating the characterization of novel enzymes lacking functional annotations.
Copyright © 2018 Elsevier Ltd. All rights reserved.
A statistical framework for biomedical literature mining.
Chung, Dongjun; Lawson, Andrew; Zheng, W Jim
2017-09-30
In systems biology, it is of great interest to identify new genes that were not previously reported to be associated with biological pathways related to various functions and diseases. Identification of these new pathway-modulating genes not only promotes understanding of pathway regulation mechanisms but also allows identification of novel targets for therapeutics. Recently, biomedical literature has been considered a valuable resource for investigating pathway-modulating genes. While the majority of currently available approaches are based on the co-occurrence of genes within an abstract, it has been reported that these approaches show only sub-optimal performance because 70% of abstracts contain information for only a single gene. To overcome this limitation, we propose a novel statistical framework based on the concept of the ontology fingerprint, which uses the gene ontology to extract information from large biomedical literature data. The proposed framework simultaneously identifies pathway-modulating genes and facilitates interpreting the functions of these new genes. We also propose a computationally efficient posterior inference procedure based on Metropolis-Hastings within Gibbs sampling for parameter updates, and the poor man's reversible jump Markov chain Monte Carlo approach for model selection. We evaluate the proposed statistical framework with simulation studies, experimental validation, and an application to studies of pathway-modulating genes in yeast. The R implementation of the proposed model is currently available at https://dongjunchung.github.io/bayesGO/. Copyright © 2017 John Wiley & Sons, Ltd.
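A minimal, self-contained sketch of the Metropolis-Hastings ingredient used within the Gibbs sampler (the target below is a standard normal stand-in, not the ontology-fingerprint posterior; all parameters are illustrative):

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=1.0, seed=42):
    """Random-walk Metropolis-Hastings: propose x' = x + N(0, step^2),
    accept with probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x, samples, accepted = x0, [], 0
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_target(prop) - log_target(x):
            x, accepted = prop, accepted + 1
        samples.append(x)
    return samples, accepted / n_steps

# log-density of a standard normal (up to a constant) as the target
samples, acc_rate = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 5000)
```

In a Metropolis-within-Gibbs scheme, a step like this replaces the exact conditional draw for any parameter whose full conditional cannot be sampled directly.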
A framework for the identification of reusable processes
NASA Astrophysics Data System (ADS)
de Vries, Marné; Gerber, Aurona; van der Merwe, Alta
2013-11-01
A significant challenge that faces IT management is that of aligning the IT infrastructure of an enterprise with its business goals and practices, also called business-IT alignment. A particular business-IT alignment approach, the foundation for execution approach, was well-accepted by practitioners due to a novel construct, called the operating model (OM). The OM supports business-IT alignment by directing the coherent and consistent design of business and IT components. Even though the OM is a popular construct, our previous research detected the need to enhance the OM, since the OM does not specify methods to identify opportunities for data sharing and process reuse in an enterprise. In this article, we address one of the identified deficiencies in the OM. We present a process reuse identification framework (PRIF) that could be used to enhance the OM in identifying process reuse opportunities in an enterprise. We applied design research to develop PRIF as an artefact, where the development process of PRIF was facilitated by means of the business-IT alignment model (BIAM). We demonstrate the use of the PRIF as well as report on the results of evaluating PRIF in terms of its usefulness and ease-of-use, using experimentation and a questionnaire.
Rare itemsets mining algorithm based on RP-Tree and spark framework
NASA Astrophysics Data System (ADS)
Liu, Sainan; Pan, Haoan
2018-05-01
To address the problem of rare itemset mining in big data, this paper proposes a rare itemset mining algorithm based on RP-Tree and the Spark framework. First, the data are arranged vertically according to transaction identifiers, so that the entire data set need not be scanned repeatedly, and the vertical dataset is divided into a frequent vertical dataset and a rare vertical dataset. Then, the RP-Tree algorithm is used to construct a frequent pattern tree containing rare items and to generate rare 1-itemsets. After that, the support of candidate itemsets is calculated by scanning the two vertical datasets; finally, rare itemsets are generated iteratively. Experiments show that the algorithm can effectively mine rare itemsets and offers a substantial advantage in execution time.
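The vertical layout mentioned above can be sketched as follows (a plain-Python simplification with toy data, not the RP-Tree/Spark implementation): each item maps to the set of transaction identifiers containing it, so the support of any itemset is just a set intersection:

```python
def to_vertical(transactions):
    """Build item -> set-of-transaction-ids (tidset) from a horizontal DB."""
    tidsets = {}
    for tid, items in enumerate(transactions):
        for item in items:
            tidsets.setdefault(item, set()).add(tid)
    return tidsets

def support(itemset, tidsets):
    """Support of an itemset = size of the intersection of its tidsets."""
    return len(set.intersection(*(tidsets[i] for i in itemset)))

transactions = [{'a', 'b'}, {'a', 'c'}, {'a', 'b', 'c'}, {'d'}]
tidsets = to_vertical(transactions)
# items whose support falls below a threshold seed the rare-itemset search
rare = {item for item, tids in tidsets.items() if len(tids) < 2}
```

This is why the vertical form avoids repeated full scans: once tidsets exist, support counting never touches the raw transactions again.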
Plant Species Identification by Bi-channel Deep Convolutional Networks
NASA Astrophysics Data System (ADS)
He, Guiqing; Xia, Zhaoqiang; Zhang, Qiqi; Zhang, Haixi; Fan, Jianping
2018-04-01
Plant species identification has attracted much attention recently, as it has potential applications in environmental protection and everyday life. Although deep learning techniques can be applied directly to plant species identification, they still need to be tailored to this specific task to obtain state-of-the-art performance. In this paper, a bi-channel deep learning framework is developed for identifying plant species. In the framework, two different sub-networks are fine-tuned from their respective pretrained models, and a stacking layer is then used to fuse the outputs of the two sub-networks. We construct a plant dataset of the Orchidaceae family for algorithm evaluation. Our experimental results demonstrate that the bi-channel deep network achieves very competitive accuracy compared to existing deep learning algorithms.
NASA Technical Reports Server (NTRS)
Coates, G. D.; Alluisi, E. A.
1975-01-01
The effects of aircraft noise on human performance are considered. Progress is reported in the following areas: (1) a review of the literature to identify the methodological and stimulus parameters involved in the study of noise effects on human performance; (2) development of a theoretical framework to provide working hypotheses as to the effects of noise on complex human performance; and (3) data collection in the first of several experimental investigations designed to test the hypotheses.
Tetraquark mixing framework for isoscalar resonances in light mesons
NASA Astrophysics Data System (ADS)
Kim, Hungchong; Kim, K. S.; Cheoun, Myung-Ki; Oka, Makoto
2018-05-01
Recently, a tetraquark mixing framework has been proposed for light mesons and applied more or less successfully to the isovector resonances a0(980), a0(1450), as well as to the isodoublet resonances K0*(800), K0*(1430). In this work, we present a more extensive view of the mixing framework and apply it to the isoscalar resonances f0(500), f0(980), f0(1370), f0(1500). Tetraquarks in this framework can have two spin configurations, containing either a spin-0 diquark or a spin-1 diquark, and each configuration forms a nonet in flavor space. The two spin configurations are found to mix strongly through the color-spin interactions. Their mixtures, which diagonalize the hyperfine masses, can generate the physical resonances constituting two nonets, which, in fact, coincide roughly with the experimental observation. We identify f0(500), f0(980) as the isoscalar members of the light nonet, and f0(1370), f0(1500) as the corresponding members of the heavy nonet. This means that the spin-configuration mixing, as it relates the corresponding members in the two nonets, can generate f0(500), f0(1370) among the members at lighter mass, and f0(980), f0(1500) at heavier mass. A complication arises because the isoscalar members of each nonet are subject to an additional flavor mixing, known as the Okubo-Zweig-Iizuka rule, so that f0(500), f0(980), and similarly f0(1370), f0(1500), are mixtures of the two isoscalar members belonging to an octet and a singlet of SU_f(3). The tetraquark mixing framework including the flavor mixing is tested for the isoscalar resonances in terms of the mass splitting and the fall-apart decay modes. The mass splitting among the isoscalar resonances is found to be qualitatively consistent with their hyperfine mass splitting, strongly driven by the spin-configuration mixing, which suggests that the tetraquark mixing framework works. The fall-apart modes from our tetraquarks also seem to be consistent with the experimental modes. We also discuss the possible existence of spin-1 tetraquarks that can be constructed from the spin-1 diquark.
Experimental evolution and the dynamics of genomic mutation rate modifiers.
Raynes, Y; Sniegowski, P D
2014-11-01
Because genes that affect mutation rates are themselves subject to mutation, mutation rates can be influenced by natural selection and other evolutionary forces. The population genetics of mutation rate modifier alleles has been a subject of theoretical interest for many decades. Here, we review experimental contributions to our understanding of mutation rate modifier dynamics. Numerous evolution experiments have shown that mutator alleles (modifiers that elevate the genomic mutation rate) can readily rise to high frequencies via genetic hitchhiking in non-recombining microbial populations. Whereas these results certainly provide an explanatory framework for observations of sporadically high mutation rates in pathogenic microbes and in cancer lineages, it is nonetheless true that most natural populations have very low mutation rates. This raises the interesting question of how mutator hitchhiking is suppressed or its phenotypic effect reversed in natural populations. Very little experimental work has addressed this question; with this in mind, we identify some promising areas for future experimental investigation.
NASA Astrophysics Data System (ADS)
Lee, Bo Mi; Loh, Kenneth J.
2017-04-01
Carbon nanotubes can be randomly deposited in polymer thin film matrices to form nanocomposite strain sensors. However, a computational framework that enables the direct design of these nanocomposite thin films is still lacking. The objective of this study is to derive an experimentally validated and two-dimensional numerical model of carbon nanotube-based thin film strain sensors. This study consisted of two parts. First, multi-walled carbon nanotube (MWCNT)-Pluronic strain sensors were fabricated using vacuum filtration, and their physical, electrical, and electromechanical properties were evaluated. Second, scanning electron microscope images of the films were used for identifying topological features of the percolated MWCNT network, where the information obtained was then utilized for developing the numerical model. Validation of the numerical model was achieved by ensuring that the area ratios (of MWCNTs relative to the polymer matrix) were equivalent for both the experimental and modeled cases. Strain sensing behavior of the percolation-based model was simulated and then compared to experimental test results.
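A core question for such percolation-based models is whether the deposited conductive network spans the electrodes at all. As a crude, hedged stand-in for the paper's image-derived MWCNT network model, the sketch below checks top-to-bottom connectivity of occupied cells on a grid (the grids are invented, and real nanotube networks are stick networks, not lattices):

```python
from collections import deque

def percolates(grid):
    """Breadth-first search for a path of occupied cells (truthy values)
    from the top row to the bottom row of a 2D grid."""
    rows, cols = len(grid), len(grid[0])
    queue = deque((0, c) for c in range(cols) if grid[0][c])
    seen = set(queue)
    while queue:
        r, c = queue.popleft()
        if r == rows - 1:
            return True  # reached the opposite electrode
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

spanning = percolates([[1, 0], [1, 0], [1, 1]])  # continuous column
broken = percolates([[1, 0], [0, 0], [1, 1]])    # gap in the middle
```

In the actual model, connectivity would be computed over nanotube-nanotube junctions extracted from the SEM images, with junction resistances attached to each contact.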
Analyzing the Discovery Potential for Light Dark Matter.
Izaguirre, Eder; Krnjaic, Gordan; Schuster, Philip; Toro, Natalia
2015-12-18
In this Letter, we determine the present status of sub-GeV thermal dark matter annihilating through standard model mixing, with special emphasis on interactions through the vector portal. Within representative simple models, we carry out a complete and precise calculation of the dark matter abundance and of all available constraints. We also introduce a concise framework for comparing different experimental approaches, and use this comparison to identify important ranges of dark matter mass and couplings to better explore in future experiments. The requirement that dark matter be a thermal relic sets a sharp sensitivity target for terrestrial experiments, and so we highlight complementary experimental approaches that can decisively reach this milestone sensitivity over the entire sub-GeV mass range.
Identification of nonlinear modes using phase-locked-loop experimental continuation and normal form
NASA Astrophysics Data System (ADS)
Denis, V.; Jossic, M.; Giraud-Audine, C.; Chomette, B.; Renault, A.; Thomas, O.
2018-06-01
In this article, we address the model identification of nonlinear vibratory systems, with a specific focus on systems with distributed nonlinearities, such as geometrically nonlinear mechanical structures. The proposed strategy relies theoretically on the concept of nonlinear modes of the underlying conservative unforced system and on the use of normal forms. Within this framework, it is shown that, in the absence of internal resonance, a valid reduced-order model for a nonlinear mode is a single Duffing oscillator. We then propose an efficient experimental strategy to measure the backbone curve of a particular nonlinear mode and use it to identify the free parameters of the reduced-order model. The experimental part relies on a phase-locked loop (PLL) and enables robust and automatic measurement of backbone curves as well as forced responses. It is shown theoretically and experimentally that the PLL is able to stabilize the unstable part of Duffing-like frequency responses, thus enabling their robust experimental measurement. Finally, the whole procedure is tested on three experimental systems: a circular plate, a Chinese gong and a piezoelectric cantilever beam. This enables validation of the procedure by comparison with available theoretical models as well as with other experimental identification methods.
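For the undamped Duffing oscillator x'' + ω0²x + γx³ = 0, first-order perturbation theory gives the well-known backbone approximation ω(a) ≈ ω0 + 3γa²/(8ω0), the curve a PLL tracker follows experimentally. A minimal numeric sketch (illustrative parameters, not those of the paper's structures):

```python
def duffing_backbone(omega0, gamma, amplitudes):
    """First-order backbone approximation for x'' + omega0^2 x + gamma x^3 = 0:
    omega(a) ~= omega0 + 3 * gamma * a**2 / (8 * omega0).
    Hardening (frequency rises with amplitude) for gamma > 0, softening for gamma < 0."""
    return [omega0 + 3.0 * gamma * a * a / (8.0 * omega0) for a in amplitudes]

# hypothetical mode: omega0 = 10 rad/s, hardening cubic coefficient 0.8
curve = duffing_backbone(omega0=10.0, gamma=0.8, amplitudes=[0.0, 1.0, 2.0])
```

Identifying the free parameters of the reduced-order model then amounts to fitting ω0 and γ so that this curve matches the measured backbone.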
Structures of cage, prism, and book isomers of water hexamer from broadband rotational spectroscopy.
Pérez, Cristóbal; Muckle, Matt T; Zaleski, Daniel P; Seifert, Nathan A; Temelso, Berhane; Shields, George C; Kisiel, Zbigniew; Pate, Brooks H
2012-05-18
Theory predicts the water hexamer to be the smallest water cluster with a three-dimensional hydrogen-bonding network as its minimum energy structure. There are several possible low-energy isomers, and calculations with different methods and basis sets assign them different relative stabilities. Previous experimental work has provided evidence for the cage, book, and cyclic isomers, but no experiment has identified multiple coexisting structures. Here, we report that broadband rotational spectroscopy in a pulsed supersonic expansion unambiguously identifies all three isomers; we determined their oxygen framework structures by means of oxygen-18-substituted water (H(2)(18)O). Relative isomer populations at different expansion conditions establish that the cage isomer is the minimum energy structure. Rotational spectra consistent with predicted heptamer and nonamer structures have also been identified.
Roztocki, Kornel; Lupa, Magdalena; Sławek, Andrzej; Makowski, Wacław; Senkovska, Irena; Kaskel, Stefan; Matoga, Dariusz
2018-03-19
A new microporous cadmium metal-organic framework was synthesized both mechanochemically and in solution by using a sulfonyl-functionalized dicarboxylate linker and an acylhydrazone colinker. The three-dimensional framework is highly stable upon heating to 300 °C as well as in aqueous solutions at elevated temperatures or under acidic conditions. The thermally activated material exhibits steep water vapor uptake at low relative pressures at 298 K and excellent recyclability up to 260 °C, as confirmed by both the quasi-equilibrated temperature-programmed desorption and adsorption (QE-TPDA) method and adsorption isotherm measurements. Reversible isotherms and hysteretic isobars recorded for the desorption-adsorption cycles indicate a maximum uptake of 0.19 g/g (at 298 K, up to p/p0 = 1) or 0.18 g/g (at 1 bar, within the 295-375 K range), respectively. The experimental isosteric heat of adsorption (48.9 kJ/mol) indicates noncoordinative interactions of water molecules with the framework. Exchange of the solvent molecules in the as-made material with water, performed in a single-crystal to single-crystal manner, allows direct comparison of both X-ray crystal structures. Single-crystal X-ray diffraction of the water-loaded framework demonstrates the orientation of water clusters in the framework cavities and reveals their strong hydrogen bonding with the sulfonyl, acyl, and carboxylate groups of the two linkers. Grand canonical Monte Carlo (GCMC) simulations of H2O adsorption corroborate the experimental findings and reveal the preferred locations of guest molecules in the framework voids at various pressures. Additionally, both experimental and GCMC simulation insights into the adsorption of CO2 (at 195 K) on the activated framework are presented.
Yin, Zhong; Zhang, Jianhua
2014-07-01
Identifying abnormal changes of mental workload (MWL) over time is crucial for preventing accidents due to cognitive overload and inattention of human operators in safety-critical human-machine systems. It is known that various neuroimaging technologies can be used to identify MWL variations. In order to classify MWL into a few discrete levels using representative MWL indicators and small-sized training samples, a novel EEG-based approach combining locally linear embedding (LLE), support vector clustering (SVC) and support vector data description (SVDD) techniques is proposed and evaluated using experimentally measured data. The MWL indicators from different cortical regions are first elicited by using the LLE technique. Then, the SVC approach is used to find the clusters of these MWL indicators and thereby to detect MWL variations. It is shown that the clusters can be interpreted as the binary-class MWL. Furthermore, a trained binary SVDD classifier is shown to be capable of detecting slight variations of those indicators. By combining the two schemes, an SVC-SVDD framework is proposed, where the clear-cut (smaller) cluster is detected by SVC first and a subsequent SVDD model is then used to divide the overlapped (larger) cluster into two classes. Finally, three MWL levels (low, normal and high) can be identified automatically. The experimental data analysis results are compared with those of several existing methods. It is demonstrated that the proposed framework achieves acceptable computational accuracy and has the advantages of both unsupervised and supervised training strategies. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Lyons, Eli; Sheridan, Paul; Tremmel, Georg; Miyano, Satoru; Sugano, Sumio
2017-10-24
High-throughput screens allow for the identification of specific biomolecules with characteristics of interest. In barcoded screens, DNA barcodes are linked to target biomolecules in a manner that allows the target molecules making up a library to be identified by sequencing the DNA barcodes using Next Generation Sequencing. To be useful in experimental settings, the DNA barcodes in a library must satisfy certain constraints related to GC content, homopolymer length, Hamming distance, and blacklisted subsequences. Here we report a novel framework to quickly generate large-scale libraries of DNA barcodes for use in high-throughput screens. We show that our framework dramatically reduces the computation time required to generate large-scale DNA barcode libraries, compared with a naïve approach to DNA barcode library generation. As a proof of concept, we demonstrate that our framework is able to generate a library of one million DNA barcodes for use in a fragment antibody phage display screening experiment. We also report generating a general-purpose one billion DNA barcode library, the largest such library yet reported in the literature. Our results demonstrate the value of our large-scale DNA barcode library generation framework for high-throughput screening applications.
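The four constraint families listed above can each be checked directly; a naive per-candidate filter looks roughly like the sketch below (illustrative thresholds, and not the paper's algorithm, which is engineered to scale far beyond this pairwise check):

```python
def gc_content(seq):
    """Fraction of G/C bases in the barcode."""
    return sum(base in 'GC' for base in seq) / len(seq)

def max_homopolymer(seq):
    """Length of the longest run of identical consecutive bases."""
    best = run = 1
    for prev, cur in zip(seq, seq[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

def hamming(a, b):
    """Number of positions at which two equal-length barcodes differ."""
    return sum(x != y for x, y in zip(a, b))

def is_valid_barcode(seq, library, min_gc=0.4, max_gc=0.6,
                     max_run=2, min_dist=3, blacklist=()):
    """Accept seq only if it satisfies all four constraint families."""
    if not min_gc <= gc_content(seq) <= max_gc:
        return False
    if max_homopolymer(seq) > max_run:
        return False
    if any(bad in seq for bad in blacklist):
        return False
    return all(hamming(seq, other) >= min_dist for other in library)
```

The Hamming check against every accepted barcode is what makes naive generation quadratic in library size, and hence the bottleneck the paper's framework targets.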
NASA Astrophysics Data System (ADS)
Sandhu, Rajinder; Kaur, Jaspreet; Thapar, Vivek
2018-02-01
Dengue, also known as break-bone fever, is a tropical disease transmitted by mosquitoes. If the similarity between dengue-infected users can be identified, it can help government health agencies manage an outbreak more effectively. To find the similarity between dengue cases, a user's personal and health information are the two fundamental requirements; identification of similar symptoms, causes, effects, predictions and treatment procedures is important. In this paper, an effective framework is proposed that finds similar patients suffering from dengue using a keyword-aware domain thesaurus and case-based reasoning. The paper focuses on the use of an ontology-dependent domain thesaurus technique to extract relevant keywords and then build cases with the help of case-based reasoning. Similar cases can be shared with users, nearby hospitals and health organizations to manage the problem more adequately. Two million case bases were generated to test the proposed similarity method. Experimental evaluation of the proposed framework showed high accuracy and a low error rate in finding similar cases of dengue, compared to the UPCC and IPCC algorithms. The framework developed in this paper targets dengue but can easily be extended to other domains.
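Keyword-based case similarity can be sketched with Jaccard overlap between extracted keyword sets (a drastic simplification of the thesaurus-plus-case-based-reasoning pipeline; the keyword sets and case identifiers below are invented):

```python
def jaccard(a, b):
    """Jaccard similarity of two keyword sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def most_similar(query, case_base, k=2):
    """Rank stored cases by keyword overlap with the query case."""
    ranked = sorted(case_base.items(),
                    key=lambda kv: jaccard(query, kv[1]), reverse=True)
    return [case_id for case_id, _ in ranked[:k]]

query = {'fever', 'rash', 'joint-pain'}
case_base = {
    'case1': {'fever', 'rash', 'headache'},
    'case2': {'cough', 'fatigue'},
    'case3': {'fever', 'joint-pain', 'rash', 'nausea'},
}
top = most_similar(query, case_base)
```

A thesaurus enters upstream of this step, mapping free-text symptom descriptions onto the canonical keywords that the sets above contain.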
NASA Astrophysics Data System (ADS)
Chen, Shuyi; Lu, Huigong; Wu, Yi-nan; Gu, Yifan; Li, Fengting; Morlay, Catherine
2016-09-01
Alumina-hercynite nano-spinel powders were prepared via one-step pyrolysis of an iron-acetylacetone-doped Al-based metal-organic framework (MOF), i.e., MIL-53(Al). The organic ferric source, iron acetylacetone, was incorporated in situ into the framework of MIL-53(Al) during the solvothermal synthesis process. Under high-temperature pyrolysis, alumina derived from the MIL-53(Al) matrix and ferric oxides originating from the decomposition of the organic ferric precursor in the framework were thermally converted into hercynite (FeAl2O4). The prepared samples were characterized using transmission electron microscopy, X-ray diffraction, N2 sorption, thermogravimetry, Raman spectroscopy and X-ray photoelectron spectroscopy. The final products were identified as composed of alumina, hercynite and trace amounts of carbon, depending on the pyrolysis temperature. The experimental results showed that the hercynite phase can be obtained and stabilized at temperatures between 900 and 1100 °C under an inert atmosphere. The final products consisted of nano-sized particles with an average individual crystal size below 100 nm and specific surface areas of 18-49 m² g⁻¹.
Integrated framework for developing search and discrimination metrics
NASA Astrophysics Data System (ADS)
Copeland, Anthony C.; Trivedi, Mohan M.
1997-06-01
This paper presents an experimental framework for evaluating target signature metrics as models of human visual search and discrimination. This framework is based on a prototype eye tracking testbed, the Integrated Testbed for Eye Movement Studies (ITEMS). ITEMS determines an observer's visual fixation point while he studies a displayed image scene, by processing video of the observer's eye. The utility of this framework is illustrated with an experiment using gray-scale images of outdoor scenes that contain randomly placed targets. Each target is a square region of a specific size containing pixel values from another image of an outdoor scene. The real-world analogy of this experiment is that of a military observer looking upon the sensed image of a static scene to find camouflaged enemy targets that are reported to be in the area. ITEMS provides the data necessary to compute various statistics for each target to describe how easily the observers located it, including the likelihood the target was fixated or identified and the time required to do so. The computed values of several target signature metrics are compared to these statistics, and a second-order metric based on a model of image texture was found to be the most highly correlated.
A compressive sensing based secure watermark detection and privacy preserving storage framework.
Qia Wang; Wenjun Zeng; Jun Tian
2014-03-01
Privacy is a critical issue when the data owners outsource data storage or processing to a third party computing service, such as the cloud. In this paper, we identify a cloud computing application scenario that requires simultaneously performing secure watermark detection and privacy preserving multimedia data storage. We then propose a compressive sensing (CS)-based framework using secure multiparty computation (MPC) protocols to address such a requirement. In our framework, the multimedia data and secret watermark pattern are presented to the cloud for secure watermark detection in a CS domain to protect the privacy. During CS transformation, the privacy of the CS matrix and the watermark pattern is protected by the MPC protocols under the semi-honest security model. We derive the expected watermark detection performance in the CS domain, given the target image, watermark pattern, and the size of the CS matrix (but without the CS matrix itself). The correctness of the derived performance has been validated by our experiments. Our theoretical analysis and experimental results show that secure watermark detection in the CS domain is feasible. Our framework can also be extended to other collaborative secure signal processing and data-mining applications in the cloud.
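The detection principle alone can be sketched as follows: random projections approximately preserve inner products, so a correlation statistic computed between Φw and y = Φx still rises when the watermark w is present in x. This toy omits the MPC protection layers entirely, and all dimensions and signals are invented:

```python
import random

def cs_matrix(m, n, seed=7):
    """Random Gaussian measurement matrix (m rows of length n)."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(n)] for _ in range(m)]

def project(phi, x):
    """Compute y = Phi @ x with plain lists."""
    return [sum(row[i] * x[i] for i in range(len(x))) for row in phi]

def detect(phi, y, watermark):
    """Correlation statistic in the CS domain: <Phi w, y>."""
    yw = project(phi, watermark)
    return sum(a * b for a, b in zip(yw, y))

n, m = 64, 32
rng = random.Random(1)
host = [rng.gauss(0.0, 1.0) for _ in range(n)]   # hypothetical image signal
wm = [1.0 if i % 2 == 0 else -1.0 for i in range(n)]  # +/-1 watermark pattern
phi = cs_matrix(m, n)
marked = [h + 2.0 * w for h, w in zip(host, wm)]  # embed with strength 2

stat_marked = detect(phi, project(phi, marked), wm)
stat_clean = detect(phi, project(phi, host), wm)
```

Because projection is linear, the watermarked statistic exceeds the clean one by 2·||Φw||², which is the margin a threshold detector exploits; the paper's contribution is performing this comparison under MPC so the cloud never sees Φ, w, or x in the clear.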
A Framework for Debugging Geoscience Projects in a High Performance Computing Environment
NASA Astrophysics Data System (ADS)
Baxter, C.; Matott, L.
2012-12-01
High performance computing (HPC) infrastructure has become ubiquitous in today's world with the emergence of commercial cloud computing and academic supercomputing centers. Teams of geoscientists, hydrologists and engineers can take advantage of this infrastructure to undertake large research projects - for example, linking one or more site-specific environmental models with soft computing algorithms, such as heuristic global search procedures, to perform parameter estimation and predictive uncertainty analysis, and/or design least-cost remediation systems. However, the size, complexity and distributed nature of these projects can make identifying failures in the associated numerical experiments using conventional ad-hoc approaches both time-consuming and ineffective. To address these problems a multi-tiered debugging framework has been developed. The framework allows for quickly isolating and remedying a number of potential experimental failures, including: failures in the HPC scheduler; bugs in the soft computing code; bugs in the modeling code; and permissions and access control errors. The utility of the framework is demonstrated via application to a series of over 200,000 numerical experiments involving a suite of 5 heuristic global search algorithms and 15 mathematical test functions serving as cheap analogues for the simulation-based optimization of pump-and-treat subsurface remediation systems.
Menon, Rajasree; Wen, Yuchen; Omenn, Gilbert S.; Kretzler, Matthias; Guan, Yuanfang
2013-01-01
Integrating large-scale functional genomic data has significantly accelerated our understanding of gene functions. However, no algorithm has been developed to differentiate functions for isoforms of the same gene using high-throughput genomic data. This is because standard supervised learning requires ‘ground-truth’ functional annotations, which are lacking at the isoform level. To address this challenge, we developed a generic framework that interrogates public RNA-seq data at the transcript level to differentiate functions for alternatively spliced isoforms. For a specific function, our algorithm identifies the ‘responsible’ isoform(s) of a gene and generates classifying models at the isoform level instead of at the gene level. Through cross-validation, we demonstrated that our algorithm is effective in assigning functions to genes, especially the ones with multiple isoforms, and robust to gene expression levels and removal of homologous gene pairs. We identified genes in the mouse whose isoforms are predicted to have disparate functionalities and experimentally validated the ‘responsible’ isoforms using data from mammary tissue. With protein structure modeling and experimental evidence, we further validated the predicted isoform functional differences for the genes Cdkn2a and Anxa6. Our generic framework is the first to predict and differentiate functions for alternatively spliced isoforms, instead of genes, using genomic data. It is extendable to any base machine learner and other species with alternatively spliced isoforms, and shifts the current gene-centered function prediction to isoform-level predictions. PMID:24244129
Frontiers in research on biodiversity and disease.
Johnson, Pieter T J; Ostfeld, Richard S; Keesing, Felicia
2015-10-01
Global losses of biodiversity have galvanised efforts to understand how changes to communities affect ecological processes, including transmission of infectious pathogens. Here, we review recent research on diversity-disease relationships and identify future priorities. Growing evidence from experimental, observational and modelling studies indicates that biodiversity changes alter infection for a range of pathogens and through diverse mechanisms. Drawing upon lessons from the community ecology of free-living organisms, we illustrate how recent advances from biodiversity research generally can provide necessary theoretical foundations, inform experimental designs, and guide future research at the interface between infectious disease risk and changing ecological communities. Dilution effects are expected when ecological communities are nested and interactions between the pathogen and the most competent host group(s) persist or increase as biodiversity declines. To move beyond polarising debates about the generality of diversity effects and develop a predictive framework, we emphasise the need to identify how the effects of diversity vary with temporal and spatial scale, to explore how realistic patterns of community assembly affect transmission, and to use experimental studies to consider mechanisms beyond simple changes in host richness, including shifts in trophic structure, functional diversity and symbiont composition. © 2015 John Wiley & Sons Ltd/CNRS.
A geometrical approach to control and controllability of nonlinear dynamical networks
Wang, Le-Zhi; Su, Ri-Qi; Huang, Zi-Gang; Wang, Xiao; Wang, Wen-Xu; Grebogi, Celso; Lai, Ying-Cheng
2016-01-01
In spite of the recent interest and advances in linear controllability of complex networks, controlling nonlinear network dynamics remains an outstanding problem. Here we develop an experimentally feasible control framework for nonlinear dynamical networks that exhibit multistability. The control objective is to apply parameter perturbation to drive the system from one attractor to another, assuming that the former is undesired and the latter is desired. To make our framework practically meaningful, we consider restricted parameter perturbation by imposing two constraints: it must be experimentally realizable and applied only temporarily. We introduce the concept of attractor network, which allows us to formulate a quantifiable controllability framework for nonlinear dynamical networks: a network is more controllable if the attractor network is more strongly connected. We test our control framework using examples from various models of experimental gene regulatory networks and demonstrate the beneficial role of noise in facilitating control. PMID:27076273
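The attractor-network notion of controllability described above can be sketched in a few lines: treat attractors as nodes, admissible parameter perturbations as directed edges, and call the system fully controllable when every attractor can be driven to every other, i.e. when the attractor network is strongly connected. A toy illustration (the network below is hypothetical, not taken from the paper):

```python
# Hypothetical attractor network: nodes are attractors; a directed edge
# (a, b) means some admissible, temporary parameter perturbation drives
# the system from attractor a to attractor b.
edges = {
    "A": ["B"],
    "B": ["C"],
    "C": ["A", "D"],
    "D": ["C"],
}

def reachable(start, edges):
    """All attractors reachable from `start` by chaining perturbations."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(edges.get(node, []))
    return seen

def strongly_connected(edges):
    """True if every attractor can be driven to every other one."""
    nodes = set(edges)
    return all(reachable(n, edges) >= nodes for n in nodes)

print(strongly_connected(edges))
```

Graded versions of the same idea (e.g. the fraction of ordered attractor pairs that are connected) give the quantifiable controllability measure the abstract alludes to: a network is more controllable when its attractor network is more strongly connected.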
NASA Astrophysics Data System (ADS)
Li, Shuanghong; Cao, Hongliang; Yang, Yupu
2018-02-01
Fault diagnosis is a key process for the reliability and safety of solid oxide fuel cell (SOFC) systems. However, it is difficult to rapidly and accurately identify faults in complicated SOFC systems, especially when simultaneous faults occur. In this research, a data-driven Multi-Label (ML) pattern identification approach is proposed to address the simultaneous fault diagnosis of SOFC systems. The framework of the simultaneous-fault diagnosis primarily includes two components: feature extraction and an ML-SVM classifier. The approach can be trained to diagnose simultaneous SOFC faults, such as fuel leakage and air leakage at different positions in the SOFC system, using simple training data sets consisting only of single-fault examples, without requiring simultaneous-fault data. Experimental results show that the proposed framework can diagnose simultaneous SOFC system faults with high accuracy while requiring only a small amount of training data and a low computational burden. In addition, Fault Inference Tree Analysis (FITA) is employed to identify the correlations among possible faults and their corresponding symptoms at the system component level.
Force Density Function Relationships in 2-D Granular Media
NASA Technical Reports Server (NTRS)
Youngquist, Robert C.; Metzger, Philip T.; Kilts, Kelly N.
2004-01-01
An integral transform relationship is developed to convert between two important probability density functions (distributions) used in the study of contact forces in granular physics. Developing this transform has now made it possible to compare and relate various theoretical approaches with one another and with the experimental data despite the fact that one may predict the Cartesian probability density and another the force magnitude probability density. Also, the transforms identify which functional forms are relevant to describe the probability density observed in nature, and so the modified Bessel function of the second kind has been identified as the relevant form for the Cartesian probability density corresponding to exponential forms in the force magnitude distribution. Furthermore, it is shown that this transform pair supplies a sufficient mathematical framework to describe the evolution of the force magnitude distribution under shearing. Apart from the choice of several coefficients, whose evolution of values must be explained in the physics, this framework successfully reproduces the features of the distribution that are taken to be an indicator of jamming and unjamming in a granular packing. Key words. Granular Physics, Probability Density Functions, Fourier Transforms
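For a 2-D isotropic force field, the transform in question takes the form p(f_x) ∝ ∫_{|f_x|}^∞ P(f)/√(f² − f_x²) df (normalisation omitted; the isotropy assumption and this exact form are our illustrative reading of the abstract, not a quote from the paper). A numerical sketch confirming the stated pairing, that an exponential force-magnitude density maps to the modified Bessel function of the second kind, K₀:

```python
import math

def cartesian_density_exponential(x, umax=40.0, n=100000):
    """Transform of an exponential magnitude density P(f) = e^{-f} into the
    (unnormalised) Cartesian-component density  ∫_x^∞ e^{-f}/sqrt(f²-x²) df,
    computed with the substitution f = sqrt(x² + u²), which removes the
    integrable endpoint singularity at f = x."""
    h = umax / n
    def g(u):
        r = math.sqrt(x * x + u * u)
        return math.exp(-r) / r
    total = 0.5 * (g(0.0) + g(umax))
    for i in range(1, n):
        total += g(i * h)
    return total * h

def bessel_k0(x, tmax=20.0, n=100000):
    """Modified Bessel K0 via the representation K0(x) = ∫_0^∞ e^{-x cosh t} dt."""
    h = tmax / n
    total = 0.5 * (math.exp(-x) + math.exp(-x * math.cosh(tmax)))
    for i in range(1, n):
        total += math.exp(-x * math.cosh(i * h))
    return total * h

# The two quantities agree (K0(1) is about 0.4210), as the abstract's
# Bessel-function identification predicts:
print(cartesian_density_exponential(1.0), bessel_k0(1.0))
```

Both quadratures are plain trapezoid rules on smooth integrands, so agreement to a few decimal places is enough to exhibit the transform pair numerically.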
EXACT2: the semantics of biomedical protocols
2014-01-01
Background The reliability and reproducibility of experimental procedures is a cornerstone of scientific practice. There is a pressing technological need for the better representation of biomedical protocols to enable other agents (human or machine) to better reproduce results. A framework that captures all information required for the replication of experimental protocols is essential to achieving reproducibility. Methods We have developed the ontology EXACT2 (EXperimental ACTions), which is designed to capture the full semantics of biomedical protocols required for their reproducibility. To construct EXACT2 we manually inspected hundreds of published and commercial biomedical protocols from several areas of biomedicine. After establishing a clear pattern for extracting the required information we utilized text-mining tools to translate the protocols into a machine-amenable format. We have verified the utility of EXACT2 through the successful processing of previously 'unseen' protocols (those not used for the construction of EXACT2). Results The paper reports on a fundamentally new version of EXACT2 that supports the semantically defined representation of biomedical protocols. The ability of EXACT2 to capture the semantics of biomedical procedures was verified through a text-mining use case, in which EXACT2 serves as a reference model for text-mining tools to identify terms pertinent to experimental actions, and their properties, in biomedical protocols expressed in natural language. An EXACT2-based framework for the translation of biomedical protocols to a machine-amenable format is proposed. Conclusions The EXACT2 ontology is sufficient to record, in a machine-processable form, the essential information about biomedical protocols. EXACT2 defines explicit semantics of experimental actions, and can be used by various computer applications.
It can serve as a reference model for the translation of biomedical protocols in natural language into a semantically-defined format. PMID:25472549
Liu, Li; Helbling, Damian E; Kohler, Hans-Peter E; Smets, Barth F
2014-11-18
Pollutants such as pesticides and their degradation products occur ubiquitously in natural aquatic environments at trace concentrations (μg L⁻¹ and lower). Microbial biodegradation processes have long been known to contribute to the attenuation of pesticides in contaminated environments. However, challenges remain in developing engineered remediation strategies for pesticide-contaminated environments because the fundamental processes that regulate growth-linked biodegradation of pesticides in natural environments remain poorly understood. In this research, we developed a model framework to describe growth-linked biodegradation of pesticides at trace concentrations. We used experimental data reported in the literature or novel simulations to explore three fundamental kinetic processes in isolation. We then combined these kinetic processes into a unified model framework. The three kinetic processes described were: the growth-linked biodegradation of micropollutants at environmentally relevant concentrations; the effect of coincidental assimilable organic carbon substrates; and the effect of coincidental microbes that compete for assimilable organic carbon substrates. We used Monod kinetic models to describe substrate utilization and microbial growth rates for specific pesticide and degrader pairs. We then extended the model to include terms for utilization of assimilable organic carbon substrates by the specific degrader and coincidental microbes, growth on assimilable organic carbon substrates by the specific degrader and coincidental microbes, and endogenous metabolism. The proposed model framework enables interpretation and description of a range of experimental observations on micropollutant biodegradation. The model provides a useful tool to identify environmental conditions with respect to the occurrence of assimilable organic carbon and coincidental microbes that may result in enhanced or reduced micropollutant biodegradation.
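The Monod kinetics at the core of such a framework couple substrate utilization to biomass growth with endogenous decay: dS/dt = −q_max·S/(K_s+S)·X and dX/dt = Y·q_max·S/(K_s+S)·X − b·X. A minimal forward-Euler sketch of a single pesticide/degrader pair (all parameter values are illustrative placeholders, not fitted values from the paper, and the co-substrate and competitor terms are omitted):

```python
# Growth-linked Monod kinetics for one pesticide/degrader pair.
# Parameter values below are illustrative, not from the paper.
q_max = 2.0    # max specific substrate utilization rate (1/d)
K_s = 5.0      # half-saturation constant (ug/L)
Y = 0.4        # yield coefficient (biomass formed per substrate used)
b = 0.05       # endogenous decay rate (1/d)

S, X = 10.0, 1.0       # initial substrate (ug/L) and biomass
dt, days = 0.001, 10.0

t = 0.0
while t < days:
    rate = q_max * S / (K_s + S) * X   # Monod utilization rate
    S += -rate * dt                    # substrate consumed
    X += (Y * rate - b * X) * dt       # growth minus endogenous decay
    t += dt

print(round(S, 4), round(X, 4))   # substrate nearly exhausted; biomass grown
```

Running this shows the qualitative behaviour the framework targets: the trace substrate is drawn down toward zero while the specific degrader population first grows on it and then slowly declines through endogenous metabolism.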
Computational discovery of metal-organic frameworks with high gas deliverable capacity
NASA Astrophysics Data System (ADS)
Bao, Yi
Metal-organic frameworks (MOFs) are a rapidly emerging class of nanoporous materials with largely tunable chemistry and diverse applications in gas storage, gas purification, catalysis, sensing and drug delivery. Efforts have been made to develop new MOFs with desirable properties both experimentally and computationally for decades. To guide experimental synthesis, we here develop a computational methodology to explore MOFs with high gas deliverable capacity. This de novo design procedure applies known chemical reactions, considers synthesizability and geometric requirements of organic linkers, and efficiently evolves a population of MOFs to optimize a desirable property. We identify 48 MOFs with higher methane deliverable capacity at a 65-5.8 bar loading-delivery condition than the MOF-5 reference in nine networks. In a more comprehensive work, we predict two sets of MOFs with high methane deliverable capacity at a 65-5.8 bar loading-delivery condition or a 35-5.8 bar loading-delivery condition. We also optimize a set of MOFs with high methane accessible internal surface area to investigate the relationship between deliverable capacities and internal surface area. This methodology can be extended to MOFs with multiple types of linkers and multiple SBUs. Flexible MOFs may allow for sophisticated heat management strategies and also provide higher gas deliverable capacity than rigid frameworks. We investigate flexible MOFs, such as MIL-53 families, and Fe(bdp) and Co(bdp) analogs, to understand the structural phase transition of frameworks and the resulting influence on heat of adsorption. Challenges of simulating a system with a flexible host structure and incoming guest molecules are discussed. Preliminary results from isotherm simulation using the hybrid MC/MD simulation scheme on MIL-53(Cr) are presented. Suggestions for proceeding to understand the free energy profile of flexible MOFs are provided.
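The deliverable-capacity objective behind such a search is simply uptake at the loading pressure minus uptake at the delivery pressure. A toy sketch using a single-site Langmuir isotherm as a stand-in for the molecular-simulation evaluation the thesis actually uses (the candidate names and isotherm parameters below are invented placeholders):

```python
def langmuir_uptake(pressure_bar, n_max, k):
    """Single-site Langmuir isotherm: n(P) = n_max * k*P / (1 + k*P)."""
    return n_max * k * pressure_bar / (1.0 + k * pressure_bar)

def deliverable_capacity(n_max, k, p_load=65.0, p_del=5.8):
    """Methane delivered per cycle: uptake at loading minus residual uptake."""
    return langmuir_uptake(p_load, n_max, k) - langmuir_uptake(p_del, n_max, k)

# Illustrative candidates (n_max in cm3(STP)/cm3, k in 1/bar); placeholders.
candidates = {"MOF-a": (260.0, 0.05), "MOF-b": (200.0, 0.30), "MOF-c": (300.0, 0.01)}
best = max(candidates, key=lambda m: deliverable_capacity(*candidates[m]))
for name, (n_max, k) in candidates.items():
    print(name, round(deliverable_capacity(n_max, k), 1))
print("best:", best)
```

Note that the strongest-binding candidate does not win: gas still adsorbed at the 5.8 bar delivery pressure is stranded in the pores, which is why deliverable capacity rather than raw uptake (or raw surface area) is the appropriate design objective.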
Del Paggio, Joseph C; Sullivan, Richard; Schrag, Deborah; Hopman, Wilma M; Azariah, Biju; Pramesh, C S; Tannock, Ian F; Booth, Christopher M
2017-07-01
The American Society of Clinical Oncology (ASCO) and the European Society for Medical Oncology (ESMO) have developed frameworks that quantify survival gains in light of toxicity and quality of life to assess the benefits of cancer therapies. We applied these frameworks to a cohort of contemporary randomised controlled trials to explore agreement between the two approaches and to assess the relation between treatment benefit and cost. We identified all randomised controlled trials of systemic therapies in non-small-cell lung cancer, breast cancer, colorectal cancer, and pancreatic cancer published between Jan 1, 2011, and Dec 31, 2015, and assessed their abstracts and methods. Trials were eligible for inclusion in our cohort if significant differences favouring the experimental group in a prespecified primary or secondary outcome were reported (secondary outcomes were assessed only if primary outcomes were not significant). We assessed trial endpoints with the ASCO and ESMO frameworks at two timepoints 3 months apart to confirm intra-rater reliability. Cohen's κ statistic was calculated to establish agreement between the two frameworks on the basis of the median ASCO score, which was used as an arbitrary threshold of benefit, and the framework-recommended ESMO threshold. Differences in monthly drug cost between the experimental and control groups of each randomised controlled trial (ie, incremental drug cost) were derived from 2016 average wholesale prices. 109 randomised controlled trials were eligible for inclusion, 42 (39%) in non-small-cell lung cancer, 36 (33%) in breast cancer, 25 (23%) in colorectal cancer, and six (6%) in pancreatic cancer. ASCO scores ranged from 2 to 77; median score was 25 (IQR 16-35). 41 (38%) trials met the benefit thresholds in the ESMO framework. Agreement between the two frameworks was fair (κ=0·326). 
Among the 100 randomised controlled trials for which drug costing data were available, ASCO benefit score and monthly incremental drug costs were negatively correlated (ρ=-0·207; p=0·039). Treatments that met ESMO benefit thresholds had a lower median incremental drug cost than did those that did not meet benefit thresholds (US$2981 [IQR 320-9059] vs $8621 [1174-13 930]; p=0·018). There is only fair agreement between these two major value frameworks, and a negative correlation between framework outputs and drug costs. Delivery of optimal cancer care in a sustainable health system will necessitate future oncologists, investigators, and policy makers to reconcile the disconnect between drug cost and clinical benefit. None. Copyright © 2017 Elsevier Ltd. All rights reserved.
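The agreement statistic reported above (κ = 0·326, conventionally "fair") is Cohen's κ, which corrects observed agreement for the agreement expected by chance: κ = (p_o − p_e)/(1 − p_e). A minimal sketch with hypothetical benefit calls from the two frameworks (the label sequences are invented for illustration):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa, (p_o - p_e) / (1 - p_e), for two label sequences."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    # Chance agreement from each rater's marginal label frequencies.
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical "meets benefit threshold" calls from the two frameworks:
asco = ["meets", "meets", "fails", "fails", "meets", "fails", "meets", "fails"]
esmo = ["meets", "fails", "fails", "fails", "meets", "meets", "meets", "fails"]
print(round(cohens_kappa(asco, esmo), 3))
```

Here 6 of 8 calls agree (p_o = 0.75) but balanced marginals make p_e = 0.5, giving κ = 0.5; raw percent agreement alone would overstate how well the frameworks concur.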
Transport Phenomena During Equiaxed Solidification of Alloys
NASA Technical Reports Server (NTRS)
Beckermann, C.; deGroh, H. C., III
1997-01-01
Recent progress in modeling of transport phenomena during dendritic alloy solidification is reviewed. Starting from the basic theorems of volume averaging, a general multiphase modeling framework is outlined. This framework allows for the incorporation of a variety of microscale phenomena in the macroscopic transport equations. For the case of diffusion dominated solidification, a simplified set of model equations is examined in detail and validated through comparisons with numerous experimental data for both columnar and equiaxed dendritic growth. This provides a critical assessment of the various model assumptions. Models that include melt flow and solid phase transport are also discussed, although their validation is still at an early stage. Several numerical results are presented that illustrate some of the profound effects of convective transport on the final compositional and structural characteristics of a solidified part. Important issues that deserve continuing attention are identified.
Kawano, Shin; Watanabe, Tsutomu; Mizuguchi, Sohei; Araki, Norie; Katayama, Toshiaki; Yamaguchi, Atsuko
2014-07-01
TogoTable (http://togotable.dbcls.jp/) is a web tool that adds user-specified annotations to a table that a user uploads. Annotations are drawn from several biological databases that use the Resource Description Framework (RDF) data model. TogoTable uses database identifiers (IDs) in the table as a query key for searching. RDF data, which form a network called Linked Open Data (LOD), can be searched from SPARQL endpoints using a SPARQL query language. Because TogoTable uses RDF, it can integrate annotations from not only the reference database to which the IDs originally belong, but also externally linked databases via the LOD network. For example, annotations in the Protein Data Bank can be retrieved using GeneID through links provided by the UniProt RDF. Because RDF has been standardized by the World Wide Web Consortium, any database with annotations based on the RDF data model can be easily incorporated into this tool. We believe that TogoTable is a valuable Web tool, particularly for experimental biologists who need to process huge amounts of data such as high-throughput experimental output. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
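The ID-keyed lookup such a tool performs maps naturally onto a SPARQL VALUES query against an RDF endpoint. A sketch of building one from a column of UniProt IDs (the predicate `up:mnemonic` and the construction here are our illustrative assumptions about how a tool like this might query, not TogoTable's actual implementation):

```python
def build_annotation_query(ids, id_prefix="http://purl.uniprot.org/uniprot/",
                           predicate="up:mnemonic"):
    """Build a SPARQL query fetching one annotation for a column of IDs.

    A real tool would map each table column to the vocabulary of the
    target RDF database; the prefix and predicate here are examples."""
    values = " ".join(f"<{id_prefix}{i}>" for i in ids)
    return (
        "PREFIX up: <http://purl.uniprot.org/core/>\n"
        "SELECT ?id ?annotation WHERE {\n"
        f"  VALUES ?id {{ {values} }}\n"
        f"  ?id {predicate} ?annotation .\n"
        "}"
    )

query = build_annotation_query(["P12345", "Q9Y6K9"])
print(query)
```

Because the IDs are bound with VALUES, one round trip to the endpoint annotates the whole table column, and federated or linked endpoints can be reached by extending the graph pattern rather than changing the client code.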
Daneshian, Mardas; Akbarsha, Mohammad A; Blaauboer, Bas; Caloni, Francesca; Cosson, Pierre; Curren, Rodger; Goldberg, Alan; Gruber, Franz; Ohl, Frauke; Pfaller, Walter; van der Valk, Jan; Vinardell, Pilar; Zurlo, Joanne; Hartung, Thomas; Leist, Marcel
2011-01-01
Development of improved communication and education strategies is important to make alternatives to the use of animals and the broad range of applications of the 3Rs concept better known and understood by different audiences. For this purpose, the Center for Alternatives to Animal Testing in Europe (CAAT-Europe) together with the Transatlantic Think Tank for Toxicology (t(4)) hosted a three-day workshop on "Teaching Alternative Methods to Animal Experimentation". A compilation of the recommendations by a group of international specialists in the field is summarized in this report. Initially, the workshop participants identified the different audience groups to be addressed and also the communication media that may be used. The main outcome of the workshop was a framework for a comprehensive educational program. The modular structure of the teaching program presented here allows adaptation to different audiences with their specific needs; different time schedules can be easily accommodated on this basis. The topics cover the 3Rs principle, basic research, toxicological applications, method development and validation, regulatory aspects, case studies and ethical aspects of 3Rs approaches. This expert consortium agreed to generating teaching materials covering all modules and providing them in an open access online repository.
A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models.
Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S; Wu, Xiaowei; Müller, Rolf
2018-01-01
Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design.
Courellis, Hristos; Mullen, Tim; Poizner, Howard; Cauwenberghs, Gert; Iversen, John R.
2017-01-01
Quantification of dynamic causal interactions among brain regions constitutes an important component of conducting research and developing applications in experimental and translational neuroscience. Furthermore, cortical networks with dynamic causal connectivity in brain-computer interface (BCI) applications offer a more comprehensive view of brain states implicated in behavior than do individual brain regions. However, models of cortical network dynamics are difficult to generalize across subjects because current electroencephalography (EEG) signal analysis techniques are limited in their ability to reliably localize sources across subjects. We propose an algorithmic and computational framework for identifying cortical networks across subjects in which dynamic causal connectivity is modeled among user-selected cortical regions of interest (ROIs). We demonstrate the strength of the proposed framework using a “reach/saccade to spatial target” cognitive task performed by 10 right-handed individuals. Modeling of causal cortical interactions was accomplished through measurement of cortical activity using EEG, application of independent component clustering to identify cortical ROIs as network nodes, estimation of cortical current density using cortically constrained low resolution electromagnetic brain tomography (cLORETA), multivariate autoregressive (MVAR) modeling of representative cortical activity signals from each ROI, and quantification of the dynamic causal interaction among the identified ROIs using the Short-time direct Directed Transfer function (SdDTF). The resulting cortical network and the computed causal dynamics among its nodes exhibited physiologically plausible behavior, consistent with past results reported in the literature. This physiological plausibility of the results strengthens the framework's applicability in reliably capturing complex brain functionality, which is required by applications, such as diagnostics and BCI. PMID:28566997
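The MVAR step in such a pipeline is, at its core, a multivariate regression of each ROI signal on the recent past of all ROI signals; directed-influence measures like the SdDTF are then derived from the fitted coefficients. A minimal numpy sketch of that core step, recovering the coefficient matrix of a simulated 2-channel VAR(1) process by least squares (real pipelines use cLORETA source signals and adaptive, higher-order MVAR models; this illustrates only the regression):

```python
import numpy as np

rng = np.random.default_rng(42)

# True VAR(1) model: x_t = A @ x_{t-1} + noise. The off-diagonal entry
# A[0, 1] = 0.3 encodes a directed influence of channel 2 on channel 1.
A_true = np.array([[0.5, 0.3],
                   [0.0, 0.4]])

T = 2000
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A_true @ x[t - 1] + 0.05 * rng.standard_normal(2)

# Least-squares MVAR fit: regress x_t on x_{t-1}.
X_past, X_now = x[:-1], x[1:]
A_hat = np.linalg.lstsq(X_past, X_now, rcond=None)[0].T

print(np.round(A_hat, 2))
```

The fitted matrix reproduces the asymmetry of the true one (influence from channel 2 to 1, none in the reverse direction), which is exactly the kind of directed structure that transfer-function measures summarize across frequencies and time.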
Inkpen, S Andrew
2016-06-01
Experimental ecologists often invoke trade-offs to describe the constraints they encounter when choosing between alternative experimental designs, such as between laboratory, field, and natural experiments. In making these claims, they tend to rely on Richard Levins' analysis of trade-offs in theoretical model-building. But does Levins' framework apply to experiments? In this paper, I focus this question on one desideratum widely invoked in the modelling literature: generality. Using the case of generality, I assess whether Levins-style treatments of modelling provide workable resources for assessing trade-offs in experimental design. I argue that, of four strategies modellers employ to increase generality, only one may be unproblematically applied to experimental design. Furthermore, modelling desiderata do not have obvious correlates in experimental design, and when we define these desiderata in a way that seems consistent with ecologists' usage, the trade-off framework falls apart. I conclude that a Levins-inspired framework for modelling does not provide the content for a similar approach to experimental practice; this does not, however, mean that it cannot provide the form. Copyright © 2016 Elsevier Ltd. All rights reserved.
Optimizing Experimental Design for Comparing Models of Brain Function
Daunizeau, Jean; Preuschoff, Kerstin; Friston, Karl; Stephan, Klaas
2011-01-01
This article presents the first attempt to formalize the optimization of experimental design with the aim of comparing models of brain function based on neuroimaging data. We demonstrate our approach in the context of Dynamic Causal Modelling (DCM), which relates experimental manipulations to observed network dynamics (via hidden neuronal states) and provides an inference framework for selecting among candidate models. Here, we show how to optimize the sensitivity of model selection by choosing among experimental designs according to their respective model selection accuracy. Using Bayesian decision theory, we (i) derive the Laplace-Chernoff risk for model selection, (ii) disclose its relationship with classical design optimality criteria and (iii) assess its sensitivity to basic modelling assumptions. We then evaluate the approach when identifying brain networks using DCM. Monte-Carlo simulations and empirical analyses of fMRI data from a simple bimanual motor task in humans serve to demonstrate the relationship between network identification and the optimal experimental design. For example, we show that deciding whether there is a feedback connection requires shorter epoch durations, relative to asking whether there is experimentally induced change in a connection that is known to be present. Finally, we discuss limitations and potential extensions of this work. PMID:22125485
LDPC-based iterative joint source-channel decoding for JPEG2000.
Pu, Lingling; Wu, Zhenyu; Bilgin, Ali; Marcellin, Michael W; Vasic, Bane
2007-02-01
A framework is proposed for iterative joint source-channel decoding of JPEG2000 codestreams. At the encoder, JPEG2000 is used to perform source coding with certain error-resilience (ER) modes, and LDPC codes are used to perform channel coding. During decoding, the source decoder uses the ER modes to identify corrupt sections of the codestream and provides this information to the channel decoder. Decoding is carried out jointly in an iterative fashion. Experimental results indicate that the proposed method requires fewer iterations and improves overall system performance.
Model-based reasoning in the physics laboratory: Framework and initial results
NASA Astrophysics Data System (ADS)
Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.
2015-12-01
[This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.
Fostering synergy between cell biology and systems biology.
Eddy, James A; Funk, Cory C; Price, Nathan D
2015-08-01
In the shared pursuit of elucidating detailed mechanisms of cell function, systems biology presents a natural complement to ongoing efforts in cell biology. Systems biology aims to characterize biological systems through integrated and quantitative modeling of cellular information. The process of model building and analysis provides value through synthesizing and cataloging information about cells and molecules, predicting mechanisms and identifying generalizable themes, generating hypotheses and guiding experimental design, and highlighting knowledge gaps and refining understanding. In turn, incorporating domain expertise and experimental data is crucial for building towards whole cell models. An iterative cycle of interaction between cell and systems biologists advances the goals of both fields and establishes a framework for mechanistic understanding of the genome-to-phenome relationship. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Guo, Feng-Kun; Hanhart, Christoph; Meißner, Ulf-G.; Wang, Qian; Zhao, Qiang; Zou, Bing-Song
2018-01-01
A large number of experimental discoveries, especially in the heavy quarkonium sector, that did not meet the expectations of the until-then very successful quark model led to a renaissance of hadron spectroscopy. Among various explanations of the internal structure of these excitations, hadronic molecules, being analogs of light nuclei, play a unique role since for those, predictions can be made with controlled uncertainty. Experimental evidence of various candidates of hadronic molecules and methods of identifying such structures are reviewed. Nonrelativistic effective field theories are the suitable framework for studying hadronic molecules and are discussed in both the continuum and finite volumes. Also pertinent lattice QCD results are presented. Further, the production mechanisms and decays of hadronic molecules are discussed and comments are given on the reliability of certain assertions often made in the literature.
A Framework for Human Performance Criteria for Advanced Reactor Operational Concepts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques V Hugo; David I Gertman; Jeffrey C Joe
2014-08-01
This report supports the determination of new Operational Concept models needed in support of the operational design of new reactors. The objective of this research is to establish the technical bases for human performance and human performance criteria frameworks, models, and guidance for operational concepts for advanced reactor designs. The report includes a discussion of operating principles for advanced reactors, the human performance issues and requirements for human performance based upon work domain analysis and current regulatory requirements, and a description of general human performance criteria. The major findings and key observations to date are that there is some operating experience that informs operational concepts for baseline designs for SFRs and HTGRs, with the Experimental Breeder Reactor-II (EBR-II) as a best-case predecessor design. This report summarizes the theoretical and operational foundations for the development of a framework and model for human performance criteria that will influence the development of future Operational Concepts. The report also highlights issues associated with advanced reactor design and clarifies and codifies the identified aspects of technology and operating scenarios.
GeoSegmenter: A statistically learned Chinese word segmenter for the geoscience domain
NASA Astrophysics Data System (ADS)
Huang, Lan; Du, Youfu; Chen, Gongyang
2015-03-01
Unlike English, the Chinese language has no space between words. Segmenting texts into words, known as the Chinese word segmentation (CWS) problem, thus becomes a fundamental issue for processing Chinese documents and the first step in many text mining applications, including information retrieval, machine translation and knowledge acquisition. However, for the geoscience subject domain, the CWS problem remains unsolved. Although generic segmenters can be applied to process geoscience documents, they lack the domain-specific knowledge and consequently their segmentation accuracy drops dramatically. This motivated us to develop a segmenter specifically for the geoscience subject domain: the GeoSegmenter. We first proposed a generic two-step framework for domain specific CWS. Following this framework, we built GeoSegmenter using conditional random fields, a principled statistical framework for sequence learning. Specifically, GeoSegmenter first identifies general terms by using a generic baseline segmenter. Then it recognises geoscience terms by learning and applying a model that can transform the initial segmentation into the goal segmentation. Empirical experimental results on geoscience documents and benchmark datasets showed that GeoSegmenter could effectively recognise both geoscience terms and general terms.
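GeoSegmenter's second step is learned with conditional random fields, which is beyond a short snippet; the hypothetical sketch below only mimics the two-step shape of the framework, with a greedy maximum-matching baseline followed by a dictionary-driven domain correction pass. Both dictionaries and the example sentence are invented for illustration.

```python
GENERIC_VOCAB = {"研究"}            # toy generic dictionary ("research")
GEO_TERMS = {"花岗岩", "地层"}      # toy geoscience terms ("granite", "stratum")

def max_match(text, vocab, max_len=4):
    """Step 1: greedy forward maximum matching against a generic dictionary;
    unknown characters fall back to single-character tokens."""
    tokens, i = [], 0
    while i < len(text):
        for l in range(min(max_len, len(text) - i), 0, -1):
            if text[i:i + l] in vocab or l == 1:
                tokens.append(text[i:i + l])
                i += l
                break
    return tokens

def merge_domain(tokens, domain, max_span=3):
    """Step 2: re-merge adjacent tokens that form a known domain term.
    (GeoSegmenter instead *learns* this correction with conditional
    random fields; a dictionary lookup stands in for the model here.)"""
    out, i = [], 0
    while i < len(tokens):
        for span in range(min(max_span, len(tokens) - i), 1, -1):
            if "".join(tokens[i:i + span]) in domain:
                out.append("".join(tokens[i:i + span]))
                i += span
                break
        else:
            out.append(tokens[i])
            i += 1
    return out

# "granite" + "stratum" + "research": the baseline over-splits the two
# geoscience terms, and the domain pass recovers them.
segmented = merge_domain(max_match("花岗岩地层研究", GENERIC_VOCAB), GEO_TERMS)
```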
A knowledge-based system for prototypical reasoning
NASA Astrophysics Data System (ADS)
Lieto, Antonio; Minieri, Andrea; Piana, Alberto; Radicioni, Daniele P.
2015-04-01
In this work we present a knowledge-based system equipped with a hybrid, cognitively inspired architecture for the representation of conceptual information. The proposed system aims at extending the classical representational and reasoning capabilities of the ontology-based frameworks towards the realm of the prototype theory. It is based on a hybrid knowledge base, composed of a classical symbolic component (grounded on a formal ontology) with a typicality based one (grounded on the conceptual spaces framework). The resulting system attempts to reconcile the heterogeneous approach to the concepts in Cognitive Science with the dual process theories of reasoning and rationality. The system has been experimentally assessed in a conceptual categorisation task where common sense linguistic descriptions were given in input, and the corresponding target concepts had to be identified. The results show that the proposed solution substantially extends the representational and reasoning 'conceptual' capabilities of standard ontology-based systems.
Forensic characterization of camcorded movies: digital cinema vs. celluloid film prints
NASA Astrophysics Data System (ADS)
Rolland-Nevière, Xavier; Chupeau, Bertrand; Doërr, Gwenaël; Blondé, Laurent
2012-03-01
Digital camcording in the premises of cinema theaters is the main source of pirate copies of newly released movies. To trace such recordings, watermarking systems are exploited in order for each projection to be unique and thus identifiable. The forensic analysis to recover these marks is different for digital and legacy cinemas. To avoid running both detectors, a reliable oracle discriminating between cams originating from analog or digital projections is required. This article details a classification framework relying on three complementary features: the spatial uniformity of the screen illumination, the vertical (in)stability of the projected image, and the luminance artifacts due to the interplay between the display and acquisition devices. The system has been tuned with cams captured in a controlled environment and benchmarked against a medium-sized dataset (61 samples) composed of real-life pirate cams. Reported experimental results demonstrate that such a framework yields over 80% classification accuracy.
Surface Termination of the Metal-Organic Framework HKUST-1: A Theoretical Investigation.
Amirjalayer, Saeed; Tafipolsky, Maxim; Schmid, Rochus
2014-09-18
The surface morphology and termination of metal-organic frameworks (MOFs) are of critical importance in many applications, but the surface properties of these soft materials are conceptually different from those of other materials like metal or oxide surfaces. Up to now, experimental investigations have been scarce and theoretical simulations have focused on the bulk properties. The possible surface structure of the archetypal MOF HKUST-1 is investigated by a first-principles derived force field in combination with DFT calculations of model systems. The computed surface energies correctly predict the [111] surface to be most stable and allow us to obtain an unprecedented atomistic picture of the surface termination. Entropic factors are identified to determine the preferred surface termination and to be the driving force for the MOF growth. On the basis of this, reported strategies like employing "modulators" during the synthesis to tailor the crystal morphology are discussed.
Interoperability between phenotype and anatomy ontologies.
Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich
2010-12-15
Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. http://bioonto.de/pmwiki.php/Main/PheneOntology.
Set membership experimental design for biological systems.
Marvel, Skylar W; Williams, Cranos M
2012-03-21
Experimental design approaches for biological systems are needed to help conserve the limited resources that are allocated for performing experiments. The assumptions used when assigning probability density functions to characterize uncertainty in biological systems are unwarranted when only a small number of measurements can be obtained. In these situations, the uncertainty in biological systems is more appropriately characterized in a bounded-error context. Additionally, effort must be made to improve the connection between modelers and experimentalists by relating design metrics to biologically relevant information. Bounded-error experimental design approaches that can assess the impact of additional measurements on model uncertainty are needed to identify the most appropriate balance between the collection of data and the availability of resources. In this work we develop a bounded-error experimental design framework for nonlinear continuous-time systems when few data measurements are available. This approach leverages many of the recent advances in bounded-error parameter and state estimation methods that use interval analysis to generate parameter sets and state bounds consistent with uncertain data measurements. We devise a novel approach using set-based uncertainty propagation to estimate measurement ranges at candidate time points. We then use these estimated measurements at the candidate time points to evaluate which candidate measurements furthest reduce model uncertainty. A method for quickly combining multiple candidate time points is presented and allows for determining the effect of adding multiple measurements. Biologically relevant metrics are developed and used to predict when new data measurements should be acquired, which system components should be measured and how many additional measurements should be obtained. The practicability of our approach is illustrated with a case study. 
This study shows that our approach is able to 1) identify candidate measurement time points that maximize information corresponding to biologically relevant metrics and 2) determine the number at which additional measurements begin to provide insignificant information. This framework can be used to balance the availability of resources with the addition of one or more measurement time points to improve the predictability of resulting models.
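As a minimal illustration of the bounded-error idea, and not the authors' interval-analysis machinery, the sketch below inverts a one-parameter exponential-decay model over a bounded measurement interval at each candidate time point and keeps the candidate that most shrinks the prior parameter box. The model, error bound, and candidate times are all invented for illustration.

```python
import numpy as np

X0, K_TRUE = 1.0, 1.0          # known initial state; "unknown" decay rate
K_BOX = (0.5, 1.5)             # prior parameter box for k
ERR = 0.02                     # bounded (set-membership) measurement error

def k_interval(t, y_lo, y_hi):
    """Invert y = X0*exp(-k*t) over a measurement interval [y_lo, y_hi]."""
    return (-np.log(y_hi / X0) / t, -np.log(y_lo / X0) / t)

def refined_width(t):
    """Width of the parameter box after intersecting the prior box with the
    k-interval implied by a bounded measurement at candidate time t."""
    y = X0 * np.exp(-K_TRUE * t)                   # simulated noiseless output
    lo, hi = k_interval(t, max(y - ERR, 1e-9), y + ERR)
    lo, hi = max(lo, K_BOX[0]), min(hi, K_BOX[1])  # intersect with prior box
    return hi - lo

candidates = [0.1, 0.5, 1.0, 2.0, 4.0]
widths = {t: refined_width(t) for t in candidates}
best_t = min(widths, key=widths.get)   # most informative measurement time
```

For this decay model the most informative candidate lands near t = 1/k, which matches the intuition that measurements taken where the output is most sensitive to the parameter shrink the consistent parameter set the most.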
Set membership experimental design for biological systems
2012-01-01
Background Experimental design approaches for biological systems are needed to help conserve the limited resources that are allocated for performing experiments. The assumptions used when assigning probability density functions to characterize uncertainty in biological systems are unwarranted when only a small number of measurements can be obtained. In these situations, the uncertainty in biological systems is more appropriately characterized in a bounded-error context. Additionally, effort must be made to improve the connection between modelers and experimentalists by relating design metrics to biologically relevant information. Bounded-error experimental design approaches that can assess the impact of additional measurements on model uncertainty are needed to identify the most appropriate balance between the collection of data and the availability of resources. Results In this work we develop a bounded-error experimental design framework for nonlinear continuous-time systems when few data measurements are available. This approach leverages many of the recent advances in bounded-error parameter and state estimation methods that use interval analysis to generate parameter sets and state bounds consistent with uncertain data measurements. We devise a novel approach using set-based uncertainty propagation to estimate measurement ranges at candidate time points. We then use these estimated measurements at the candidate time points to evaluate which candidate measurements furthest reduce model uncertainty. A method for quickly combining multiple candidate time points is presented and allows for determining the effect of adding multiple measurements. Biologically relevant metrics are developed and used to predict when new data measurements should be acquired, which system components should be measured and how many additional measurements should be obtained. Conclusions The practicability of our approach is illustrated with a case study. 
This study shows that our approach is able to 1) identify candidate measurement time points that maximize information corresponding to biologically relevant metrics and 2) determine the number at which additional measurements begin to provide insignificant information. This framework can be used to balance the availability of resources with the addition of one or more measurement time points to improve the predictability of resulting models. PMID:22436240
Perski, Olga; Blandford, Ann; West, Robert; Michie, Susan
2017-06-01
"Engagement" with digital behaviour change interventions (DBCIs) is considered important for their effectiveness. Evaluating engagement is therefore a priority; however, a shared understanding of how to usefully conceptualise engagement is lacking. This review aimed to synthesise literature on engagement to identify key conceptualisations and to develop an integrative conceptual framework involving potential direct and indirect influences on engagement and relationships between engagement and intervention effectiveness. Four electronic databases (Ovid MEDLINE, PsycINFO, ISI Web of Knowledge, ScienceDirect) were searched in November 2015. We identified 117 articles that met the inclusion criteria: studies employing experimental or non-experimental designs with adult participants explicitly or implicitly referring to engagement with DBCIs, digital games or technology. Data were synthesised using principles from critical interpretive synthesis. Engagement with DBCIs is conceptualised in terms of both experiential and behavioural aspects. A conceptual framework is proposed in which engagement with a DBCI is influenced by the DBCI itself (content and delivery), the context (the setting in which the DBCI is used and the population using it) and the behaviour that the DBCI is targeting. The context and "mechanisms of action" may moderate the influence of the DBCI on engagement. Engagement, in turn, moderates the influence of the DBCI on those mechanisms of action. In the research literature, engagement with DBCIs has been conceptualised in terms of both experience and behaviour and sits within a complex system involving the DBCI, the context of use, the mechanisms of action of the DBCI and the target behaviour.
Blanquart, François; Bataillon, Thomas
2016-01-01
The fitness landscape defines the relationship between genotypes and fitness in a given environment and underlies fundamental quantities such as the distribution of selection coefficients and the magnitude and type of epistasis. A better understanding of variation in landscape structure across species and environments is thus necessary to understand and predict how populations will adapt. An increasing number of experiments investigate the properties of fitness landscapes by identifying mutations, constructing genotypes with combinations of these mutations, and measuring the fitness of these genotypes. Yet these empirical landscapes represent a very small sample of the vast space of all possible genotypes, and this sample is often biased by the protocol used to identify mutations. Here we develop a rigorous statistical framework based on Approximate Bayesian Computation to address these concerns and use this flexible framework to fit a broad class of phenotypic fitness models (including Fisher’s model) to 26 empirical landscapes representing nine diverse biological systems. Despite uncertainty owing to the small size of most published empirical landscapes, the inferred landscapes have similar structure in similar biological systems. Surprisingly, goodness-of-fit tests reveal that this class of phenotypic models, which has been successful so far in interpreting experimental data, is plausible in only three of nine biological systems. More precisely, although Fisher’s model was able to explain several statistical properties of the landscapes—including the mean and SD of selection and epistasis coefficients—it was often unable to explain the full structure of fitness landscapes. PMID:27052568
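The paper's inference machinery is far richer, but the core Approximate Bayesian Computation rejection loop can be sketched: simulate summary statistics of selection coefficients under a one-dimensional Fisher-type model with Gaussian fitness, and keep prior draws of the mutation-size parameter whose summaries land close to the "observed" ones. All numbers below are illustrative, not taken from the paper's 26 landscapes.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_summaries(sigma, n_mut=200):
    """Summaries (mean, SD) of selection coefficients under a 1-D
    Fisher-type model: wild type at phenotype z = 1, Gaussian fitness
    w(z) = exp(-z^2/2), mutational effects drawn from N(0, sigma^2)."""
    z0 = 1.0
    w = lambda z: np.exp(-0.5 * z ** 2)
    s = w(z0 + rng.normal(0.0, sigma, n_mut)) / w(z0) - 1.0
    return np.array([s.mean(), s.std()])

obs = simulate_summaries(0.3)     # stand-in for an empirical landscape

# ABC rejection: draw sigma from a uniform prior and keep draws whose
# simulated summaries fall within a tolerance of the observed summaries.
accepted = []
for _ in range(3000):
    sigma = rng.uniform(0.05, 1.0)
    if np.linalg.norm(simulate_summaries(sigma) - obs) < 0.05:
        accepted.append(sigma)

posterior_mean = float(np.mean(accepted))   # should concentrate near 0.3
```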
A Framework to Improve Energy Efficient Behaviour at Home through Activity and Context Monitoring
García, Óscar; Alonso, Ricardo S.; Corchado, Juan M.
2017-01-01
Real-time Localization Systems have been postulated as one of the most appropriate technologies for the development of applications that provide customized services. These systems provide us with the ability to locate and trace users and, among other features, they help identify behavioural patterns and habits. Moreover, the implementation of policies that will foster energy saving in homes is a complex task that involves the use of this type of systems. Although there are multiple proposals in this area, the implementation of frameworks that combine technologies and use Social Computing to influence user behaviour has not yet achieved any significant savings in terms of energy. In this work, the CAFCLA framework (Context-Aware Framework for Collaborative Learning Applications) is used to develop a recommendation system for home users. The proposed system integrates a Real-Time Localization System and Wireless Sensor Networks, making it possible to develop applications that work under the umbrella of Social Computing. The implementation of an experimental use case aided efficient energy use, achieving savings of 17%. Moreover, the conducted case study pointed to the possibility of attaining good energy consumption habits in the long term. This can be done thanks to the system’s real time and historical localization, tracking and contextual data, based on which customized recommendations are generated. PMID:28758987
Effects of Cognitive Load on Driving Performance: The Cognitive Control Hypothesis.
Engström, Johan; Markkula, Gustav; Victor, Trent; Merat, Natasha
2017-08-01
The objective of this paper was to outline an explanatory framework for understanding effects of cognitive load on driving performance and to review the existing experimental literature in the light of this framework. Although there is general consensus that taking the eyes off the forward roadway significantly impairs most aspects of driving, the effects of primarily cognitively loading tasks on driving performance are not well understood. Based on existing models of driver attention, an explanatory framework was outlined. This framework can be summarized in terms of the cognitive control hypothesis: Cognitive load selectively impairs driving subtasks that rely on cognitive control but leaves automatic performance unaffected. An extensive literature review was conducted wherein existing results were reinterpreted based on the proposed framework. It was demonstrated that the general pattern of experimental results reported in the literature aligns well with the cognitive control hypothesis and that several apparent discrepancies between studies can be reconciled based on the proposed framework. More specifically, performance on nonpracticed or inherently variable tasks, relying on cognitive control, is consistently impaired by cognitive load, whereas the performance on automatized (well-practiced and consistently mapped) tasks is unaffected and sometimes even improved. Effects of cognitive load on driving are strongly selective and task dependent. The present results have important implications for the generalization of results obtained from experimental studies to real-world driving. The proposed framework can also serve to guide future research on the potential causal role of cognitive load in real-world crashes.
Model Selection in Systems Biology Depends on Experimental Design
Silk, Daniel; Kirk, Paul D. W.; Barnes, Chris P.; Toni, Tina; Stumpf, Michael P. H.
2014-01-01
Experimental design attempts to maximise the information available for modelling tasks. An optimal experiment allows the inferred models or parameters to be chosen with the highest expected degree of confidence. If the true system is faithfully reproduced by one of the models, the merit of this approach is clear - we simply wish to identify it and the true parameters with the most certainty. However, in the more realistic situation where all models are incorrect or incomplete, the interpretation of model selection outcomes and the role of experimental design needs to be examined more carefully. Using a novel experimental design and model selection framework for stochastic state-space models, we perform high-throughput in-silico analyses on families of gene regulatory cascade models, to show that the selected model can depend on the experiment performed. We observe that experimental design thus makes confidence a criterion for model choice, but that this does not necessarily correlate with a model's predictive power or correctness. Finally, in the special case of linear ordinary differential equation (ODE) models, we explore how wrong a model has to be before it influences the conclusions of a model selection analysis. PMID:24922483
Model selection in systems biology depends on experimental design.
Silk, Daniel; Kirk, Paul D W; Barnes, Chris P; Toni, Tina; Stumpf, Michael P H
2014-06-01
Experimental design attempts to maximise the information available for modelling tasks. An optimal experiment allows the inferred models or parameters to be chosen with the highest expected degree of confidence. If the true system is faithfully reproduced by one of the models, the merit of this approach is clear - we simply wish to identify it and the true parameters with the most certainty. However, in the more realistic situation where all models are incorrect or incomplete, the interpretation of model selection outcomes and the role of experimental design needs to be examined more carefully. Using a novel experimental design and model selection framework for stochastic state-space models, we perform high-throughput in-silico analyses on families of gene regulatory cascade models, to show that the selected model can depend on the experiment performed. We observe that experimental design thus makes confidence a criterion for model choice, but that this does not necessarily correlate with a model's predictive power or correctness. Finally, in the special case of linear ordinary differential equation (ODE) models, we explore how wrong a model has to be before it influences the conclusions of a model selection analysis.
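A toy version of the paper's central observation, that the selected model can depend on the experiment performed, can be reproduced with two polynomial candidates fit to noisy sin(x) data (an invented stand-in system, not the paper's gene-regulatory cascades): a design probing a narrow input range favours the simpler model under BIC, while a wide design favours the more flexible one.

```python
import numpy as np

rng = np.random.default_rng(1)

def bic(y, yhat, k):
    """Bayesian information criterion for a least-squares fit with k params."""
    n = len(y)
    rss = float(np.sum((y - yhat) ** 2))
    return n * np.log(rss / n) + k * np.log(n)

def select_model(x):
    """Fit linear and cubic polynomials to noisy sin(x) sampled at the
    design points x, and pick the model with the lower BIC."""
    y = np.sin(x) + rng.normal(0.0, 0.1, x.size)
    scores = {}
    for name, deg in [("linear", 1), ("cubic", 3)]:
        coeffs = np.polyfit(x, y, deg)
        scores[name] = bic(y, np.polyval(coeffs, x), deg + 1)
    return min(scores, key=scores.get)

narrow = np.linspace(0.0, 0.5, 100)   # design probing a small input range
wide = np.linspace(0.0, 3.0, 100)     # design probing a wide input range
choice_narrow = select_model(narrow)  # sin(x) is near-linear here
choice_wide = select_model(wide)      # curvature now visible in the data
```

Neither selected model is "correct" (the true system is not polynomial), which echoes the paper's caution that design-driven confidence in a model need not track its predictive power.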
Proposed framework for thermomechanical life modeling of metal matrix composites
NASA Technical Reports Server (NTRS)
Halford, Gary R.; Lerch, Bradley A.; Saltsman, James F.
1993-01-01
The framework of a mechanics of materials model is proposed for thermomechanical fatigue (TMF) life prediction of unidirectional, continuous-fiber metal matrix composites (MMC's). Axially loaded MMC test samples are analyzed as structural components whose fatigue lives are governed by local stress-strain conditions resulting from combined interactions of the matrix, interfacial layer, and fiber constituents. The metallic matrix is identified as the vehicle for tracking fatigue crack initiation and propagation. The proposed framework has three major elements. First, TMF flow and failure characteristics of in situ matrix material are approximated from tests of unreinforced matrix material, and matrix TMF life prediction equations are numerically calibrated. The macrocrack initiation fatigue life of the matrix material is divided into microcrack initiation and microcrack propagation phases. Second, the influencing factors created by the presence of fibers and interfaces are analyzed, characterized, and documented in equation form. Some of the influences act on the microcrack initiation portion of the matrix fatigue life, others on the microcrack propagation life, while some affect both. Influencing factors include coefficient of thermal expansion mismatch strains, residual (mean) stresses, multiaxial stress states, off-axis fibers, internal stress concentrations, multiple initiation sites, nonuniform fiber spacing, fiber debonding, interfacial layers and cracking, fractured fibers, fiber deflections of crack fronts, fiber bridging of matrix cracks, and internal oxidation along internal interfaces. Equations exist for some, but not all, of the currently identified influencing factors. The third element is the inclusion of overriding influences such as maximum tensile strain limits of brittle fibers that could cause local fractures and ensuing catastrophic failure of surrounding matrix material. 
Some experimental data exist for assessing the plausibility of the proposed framework.
Barreiro, Andrea K; Gautam, Shree Hari; Shew, Woodrow L; Ly, Cheng
2017-10-01
Determining how synaptic coupling within and between regions is modulated during sensory processing is an important topic in neuroscience. Electrophysiological recordings provide detailed information about neural spiking but have traditionally been confined to a particular region or layer of cortex. Here we develop new theoretical methods to study interactions between and within two brain regions, based on experimental measurements of spiking activity simultaneously recorded from the two regions. By systematically comparing experimentally-obtained spiking statistics to (efficiently computed) model spike rate statistics, we identify regions in model parameter space that are consistent with the experimental data. We apply our new technique to dual micro-electrode array in vivo recordings from two distinct regions: olfactory bulb (OB) and anterior piriform cortex (PC). Our analysis predicts that: i) inhibition within the afferent region (OB) has to be weaker than the inhibition within PC, ii) excitation from PC to OB is generally stronger than excitation from OB to PC, iii) excitation from PC to OB and inhibition within PC have to both be relatively strong compared to presynaptic inputs from OB. These predictions are validated in a spiking neural network model of the OB-PC pathway that satisfies the many constraints from our experimental data. We find when the derived relationships are violated, the spiking statistics no longer satisfy the constraints from the data. In principle this modeling framework can be adapted to other systems and be used to investigate relationships between other neural attributes besides network connection strengths. Thus, this work can serve as a guide to further investigations into the relationships of various neural attributes within and across different regions during sensory processing.
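The sketch below imitates the general recipe only, comparing model-predicted statistics against experimental bounds to carve out a consistent region of parameter space, using an invented two-population linear rate model in place of the authors' spiking-statistics computation; all coupling values and "experimental" bounds are illustrative.

```python
from itertools import product

# Toy two-region rate model standing in for the OB-PC circuit: OB receives
# external drive S and excitation a from PC; PC receives excitation c from OB.
# Within-region inhibition (b for OB, d for PC) is held fixed here.
S, b, d = 1.0, 0.2, 1.0

def steady_rates(a, c):
    """Steady state of  r_ob = (S + a*r_pc)/(1+b),  r_pc = c*r_ob/(1+d)."""
    denom = (1 + b) - a * c / (1 + d)
    if denom <= 0:                      # runaway excitation: no stable state
        return None
    r_ob = S / denom
    return r_ob, c * r_ob / (1 + d)

# "Experimental" mean-rate constraints (illustrative numbers, not real data).
TARGET = {"ob": (1.8, 2.2), "pc": (0.8, 1.2)}

grid = [round(0.1 * i, 1) for i in range(21)]     # couplings in [0, 2]
consistent = []
for a, c in product(grid, grid):
    rates = steady_rates(a, c)
    if rates and TARGET["ob"][0] <= rates[0] <= TARGET["ob"][1] \
             and TARGET["pc"][0] <= rates[1] <= TARGET["pc"][1]:
        consistent.append((a, c))
```

The accepted region (rather than a single best-fit point) is the output, which is what lets this style of analysis state relationships between coupling strengths that any data-consistent model must satisfy.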
Burton, Brett M; Aras, Kedar K; Good, Wilson W; Tate, Jess D; Zenger, Brian; MacLeod, Rob S
2018-05-21
The biophysical basis for electrocardiographic evaluation of myocardial ischemia stems from the notion that ischemic tissues develop, with relative uniformity, along the endocardial aspects of the heart. These injured regions of subendocardial tissue give rise to intramural currents that lead to ST segment deflections within electrocardiogram (ECG) recordings. The concept of subendocardial ischemic regions is often used in clinical practice, providing a simple and intuitive description of ischemic injury; however, such a model grossly oversimplifies the presentation of ischemic disease, inadvertently leading to errors in ECG-based diagnoses. Furthermore, recent experimental studies have brought into question the subendocardial ischemia paradigm, suggesting instead a more distributed pattern of tissue injury. These findings come from experiments and so have both the impact and the limitations of measurements from living organisms. Computer models have often been employed to overcome the constraints of experimental approaches and have a robust history in cardiac simulation. To this end, we have developed a computational simulation framework aimed at elucidating the effects of ischemia on measurable cardiac potentials. To validate our framework, we simulated, visualized, and analyzed 226 experimentally derived acute myocardial ischemic events. Simulation outcomes agreed both qualitatively (feature comparison) and quantitatively (correlation, average error, and significance) with experimentally obtained epicardial measurements, particularly under conditions of elevated ischemic stress. Our simulation framework introduces a novel approach to incorporating subject-specific, geometric models and experimental results that are highly resolved in space and time into computational models. 
We propose this framework as a means to advance the understanding of the underlying mechanisms of ischemic disease while simultaneously putting in place the computational infrastructure necessary to study and improve ischemia models aimed at reducing diagnostic errors in the clinic.
Computational approaches to protein inference in shotgun proteomics
2012-01-01
Shotgun proteomics has recently emerged as a powerful approach to characterizing proteomes in biological samples. Its overall objective is to identify the form and quantity of each protein in a high-throughput manner by coupling liquid chromatography with tandem mass spectrometry. As a consequence of its high-throughput nature, shotgun proteomics faces challenges with respect to the analysis and interpretation of experimental data. Among such challenges, the identification of proteins present in a sample has been recognized as an important computational task. This task generally consists of (1) assigning experimental tandem mass spectra to peptides derived from a protein database, and (2) mapping assigned peptides to proteins and quantifying the confidence of identified proteins. Protein identification is fundamentally a statistical inference problem with a number of methods proposed to address its challenges. In this review we categorize current approaches into rule-based, combinatorial optimization and probabilistic inference techniques, and present them using integer programming and Bayesian inference frameworks. We also discuss the main challenges of protein identification and propose potential solutions with the goal of spurring innovative research in this area. PMID:23176300
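As a hypothetical illustration of the combinatorial-optimization end of this spectrum, the common "parsimony" formulation maps assigned peptides to a minimal explaining set of proteins, which a greedy set-cover heuristic approximates (protein and peptide names here are invented):

```python
# Greedy set cover as a sketch of parsimonious protein inference:
# choose proteins until every assigned peptide is explained.
def parsimonious_proteins(protein_to_peptides):
    """Return a small protein set covering all observed peptides."""
    uncovered = set().union(*protein_to_peptides.values())
    chosen = []
    while uncovered:
        # pick the protein explaining the most still-unexplained peptides
        best = max(protein_to_peptides,
                   key=lambda p: len(protein_to_peptides[p] & uncovered))
        chosen.append(best)
        uncovered -= protein_to_peptides[best]
    return chosen

mapping = {
    "P1": {"pepA", "pepB"},   # P1 explains both of P2's peptides and more
    "P2": {"pepB"},
    "P3": {"pepC"},
}
print(parsimonious_proteins(mapping))
```

Probabilistic methods replace this hard minimality criterion with posterior probabilities over protein presence, but the underlying peptide-to-protein mapping structure is the same.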
Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene
2016-04-01
Many widely-used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences of development programs. This seems surprising as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those that are considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which unintended consequences can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Srinivasan, Srikant; Broderick, Scott R; Zhang, Ruifeng; Mishra, Amrita; Sinnott, Susan B; Saxena, Surendra K; LeBeau, James M; Rajan, Krishna
2015-12-18
A data-driven methodology is developed for tracking the collective influence of the multiple attributes of alloying elements on both thermodynamic and mechanical properties of metal alloys. Cobalt-based superalloys are used as a template to demonstrate the approach. By mapping the high dimensional nature of the systematics of elemental data embedded in the periodic table into the form of a network graph, one can guide targeted first principles calculations that identify the influence of specific elements on phase stability, crystal structure and elastic properties. This provides a fundamentally new means to rapidly identify new stable alloy chemistries with enhanced high temperature properties. The resulting visualization scheme exhibits the grouping and proximity of elements based on their impact on the properties of intermetallic alloys. Unlike the periodic table, however, the distance between neighboring elements uncovers relationships in a complex high dimensional information space that would not have been easily seen otherwise. The predictions of the methodology are found to be consistent with reported experimental and theoretical studies. The informatics based methodology presented in this study can be generalized to a framework for data analysis and knowledge discovery that can be applied to many material systems and recreated for different design objectives.
An experimental study of graph connectivity for unsupervised word sense disambiguation.
Navigli, Roberto; Lapata, Mirella
2010-04-01
Word sense disambiguation (WSD), the task of identifying the intended meanings (senses) of words in context, has been a long-standing research objective for natural language processing. In this paper, we are concerned with graph-based algorithms for large-scale WSD. Under this framework, finding the right sense for a given word amounts to identifying the most "important" node among the set of graph nodes representing its senses. We introduce a graph-based WSD algorithm which has few parameters and does not require sense-annotated data for training. Using this algorithm, we investigate several measures of graph connectivity with the aim of identifying those best suited for WSD. We also examine how the chosen lexicon and its connectivity influence WSD performance. We report results on standard data sets and show that our graph-based approach performs comparably to the state of the art.
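A minimal sketch of the graph-based idea, using plain degree as the connectivity measure (the paper evaluates several such measures; the sense inventory and edges below are invented):

```python
from collections import defaultdict

# Rank candidate senses of an ambiguous word by a simple connectivity
# measure (degree) in a sense graph. Senses and edges are illustrative.
edges = [("bank#1", "money#1"), ("bank#1", "loan#1"), ("bank#2", "river#1")]
degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

candidates = ["bank#1", "bank#2"]          # senses of the target word "bank"
best = max(candidates, key=lambda s: degree[s])
print(best)                                # the better-connected sense wins
```

No sense-annotated training data is needed: the decision rests entirely on the structure of the lexicon-derived graph.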
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.
2015-09-01
A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.
Condron, Robin; Farrokh, Choreh; Jordan, Kieran; McClure, Peter; Ross, Tom; Cerf, Olivier
2015-01-02
Studies on the heat resistance of dairy pathogens are a vital part of assessing the safety of dairy products. However, harmonized methodology for the study of heat resistance of food pathogens is lacking, even though there is a need for such harmonized experimental design protocols and for harmonized validation procedures for heat treatment studies. Such an approach is of particular importance to allow international agreement on appropriate risk management of emerging potential hazards for human and animal health. This paper is working toward establishment of a harmonized protocol for the study of the heat resistance of pathogens, identifying critical issues for establishment of internationally agreed protocols, including a harmonized framework for reporting and interpretation of heat inactivation studies of potentially pathogenic microorganisms. Copyright © 2014 Elsevier B.V. All rights reserved.
Competition between monomeric and dimeric crystals in schematic models for globular proteins.
Fusco, Diana; Charbonneau, Patrick
2014-07-17
Advances in experimental techniques and in theoretical models have improved our understanding of protein crystallization. However, they have also left open questions regarding the protein phase behavior and self-assembly kinetics, such as why (nearly) identical crystallization conditions can sometimes result in the formation of different crystal forms. Here, we develop a patchy particle model with competing sets of patches that provides a microscopic explanation of this phenomenon. We identify different regimes in which one or two crystal forms can coexist with a low-density fluid. Using analytical approximations, we extend our findings to different crystal phases, providing a general framework for treating protein crystallization when multiple crystal forms compete. Our results also suggest different experimental routes for targeting a specific crystal form, and for reducing the dynamical competition between the two forms, thus facilitating protein crystal assembly.
NASA Astrophysics Data System (ADS)
Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao
2006-12-01
We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
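The eigenvector step at the heart of this framework can be sketched as follows. Subsequences whose statistical models are mutually similar form the "background"; rows of the affinity matrix weakly connected to the rest are flagged as outliers. The distance values here are synthetic stand-ins for distances between the estimated subsequence models:

```python
import numpy as np

# Inlier/outlier segmentation sketch: dominant eigenvector of an affinity
# matrix; components with weak participation mark outlier subsequences.
dist = np.array([
    [0.0, 0.1, 0.1, 0.9],
    [0.1, 0.0, 0.1, 0.9],
    [0.1, 0.1, 0.0, 0.9],
    [0.9, 0.9, 0.9, 0.0],
])  # subsequence 3 is far from the other three (synthetic distances)
affinity = np.exp(-dist)
w, v = np.linalg.eigh(affinity)       # eigenvalues in ascending order
dominant = np.abs(v[:, -1])           # eigenvector of the largest eigenvalue
outlier = int(np.argmin(dominant))    # weakest participation in background
print(outlier)
```

In the full framework, the departure of each outlier from the background process additionally yields the confidence measure used for ranking.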
ERIC Educational Resources Information Center
Bodily, Robert; Nyland, Rob; Wiley, David
2017-01-01
The RISE (Resource Inspection, Selection, and Enhancement) Framework is a framework supporting the continuous improvement of open educational resources (OER). The framework is an automated process that identifies learning resources that should be evaluated and either eliminated or improved. This is particularly useful in OER contexts where the…
Israel, J A; May, B
2010-03-01
The utility of genetic measures for kinship reconstruction in polysomic species is not well evaluated. We developed a framework to test hypotheses about estimating breeding population size indirectly from collections of outmigrating green sturgeon juveniles. We evaluated a polysomic dataset, in allelic frequency and phenotypic formats, from green sturgeon to describe the relationship among known progeny from experimental families. The distributions of relatedness values for kin classes were used for reconstructing green sturgeon pedigrees from juveniles of unknown relationship. We compared three rarefaction functions that described the relationship between the number of kin groups and number of samples in a pedigree to estimate the annual abundance of spawners contributing to the threatened green sturgeon Southern Distinct Population Segment in the upper Sacramento River. Results suggested the estimated abundance of breeding green sturgeon remained roughly constant in the upper Sacramento River over a 5-year period, ranging from 10 to 28 individuals depending on the year and rarefaction method. These results demonstrate that an empirical understanding of the distribution of relatedness values among individuals benefits the assessment of pedigree reconstruction methods and the identification of misclassification rates. Monitoring of rare species using these indirect methods is feasible and can provide insight into breeding and ontogenetic behaviour. While this framework was developed for specific application to studying fish populations in a riverscape, the framework could be advanced to improve genetic estimation of breeding population size and to identify important breeding habitats of rare species when combined with finer-scaled sampling of offspring.
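The rarefaction logic can be illustrated with a toy simulation (this is not the authors' rarefaction functions; all numbers are synthetic): as more juveniles are sampled, the count of distinct kin groups observed rises and saturates near the true number of contributing families.

```python
import random

# Toy rarefaction illustration: distinct families observed vs. sample size.
random.seed(1)
true_families = 20
sample = [random.randrange(true_families) for _ in range(200)]  # family labels

def kin_groups_seen(n):
    """Distinct families among the first n sampled juveniles."""
    return len(set(sample[:n]))

for n in (10, 50, 200):
    print(n, kin_groups_seen(n))
```

Fitting a saturating curve to such counts, and reading off its asymptote, is what turns kin-group counts in a juvenile collection into an indirect estimate of breeder abundance.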
Hopping and the Stokes–Einstein relation breakdown in simple glass formers
Charbonneau, Patrick; Jin, Yuliang; Parisi, Giorgio; Zamponi, Francesco
2014-01-01
One of the most actively debated issues in the study of the glass transition is whether a mean-field description is a reasonable starting point for understanding experimental glass formers. Although the mean-field theory of the glass transition—like that of other statistical systems—is exact when the spatial dimension d→∞, the evolution of systems properties with d may not be smooth. Finite-dimensional effects could dramatically change what happens in physical dimensions, d=2,3. For standard phase transitions finite-dimensional effects are typically captured by renormalization group methods, but for glasses the corrections are much more subtle and only partially understood. Here, we investigate hopping between localized cages formed by neighboring particles in a model that allows us to cleanly isolate that effect. By bringing together results from replica theory, cavity reconstruction, void percolation, and molecular dynamics, we obtain insights into how hopping induces a breakdown of the Stokes–Einstein relation and modifies the mean-field scenario in experimental systems. Although hopping is found to supersede the dynamical glass transition, it nonetheless leaves a sizable part of the critical regime untouched. By providing a constructive framework for identifying and quantifying the role of hopping, we thus take an important step toward describing dynamic facilitation in the framework of the mean-field theory of glasses. PMID:25288722
Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather
2017-11-28
There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice. The aim is to inform and improve capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined; included papers focus on capacity building, learning plans, or professional development plans in combination with tools, resources, processes, procedures, steps, models, frameworks, or guidelines, described in a public health or healthcare setting, or in non-government, government, or community organizations as they relate to healthcare, and explicitly or implicitly mention a theory, model and/or framework that grounds the type of capacity building approach developed. Quality assessments were performed on all included articles. Data analysis included a process for synthesizing, analyzing and presenting descriptive summaries, categorizing theoretical foundations according to which theory, model and/or framework was used and whether or not the theory, model or framework was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were identified as the most frequently cited. This review identifies specific theories, models and frameworks to support capacity building interventions relevant to public health organizations.
It provides public health practitioners with a menu of potentially usable theories, models and frameworks to support capacity building efforts. The findings also support the need for the use of theories, models or frameworks to be intentional, explicitly identified, referenced and for it to be clearly outlined how they were applied to the capacity building intervention.
Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts,
1980-06-01
A theoretical framework for an experimental program is described. The theory of one-dimensional wave propagation is used to show how data from instrumented long rods and targets may be fitted together to give a...the theoretical framework. In the final section the results to date are discussed.
Assessing the impact of modeling limits on intelligent systems
NASA Technical Reports Server (NTRS)
Rouse, William B.; Hammer, John M.
1990-01-01
This work addresses the validation of the knowledge bases underlying intelligent systems. A general conceptual framework is provided for considering the roles in intelligent systems of models of physical, behavioral, and operational phenomena. A methodology is described for identifying limits in particular intelligent systems, and the use of the methodology is illustrated via an experimental evaluation of the pilot-vehicle interface within the Pilot's Associate. The requirements and functionality are outlined for a computer-based knowledge engineering environment which would embody the approach advocated and illustrated in earlier discussions. Issues considered include the specific benefits of this functionality, the potential breadth of applicability, and technical feasibility.
Probing the fusion of neutron-rich nuclei with re-accelerated radioactive beams
NASA Astrophysics Data System (ADS)
Vadas, J.; Singh, Varinderjit; Wiggins, B. B.; Huston, J.; Hudan, S.; deSouza, R. T.; Lin, Z.; Horowitz, C. J.; Chbihi, A.; Ackermann, D.; Famiano, M.; Brown, K. W.
2018-03-01
We report the first measurement of the fusion excitation functions for ³⁹,⁴⁷K + ²⁸Si at near-barrier energies. Evaporation residues resulting from the fusion process were identified by direct measurement of their energy and time of flight with high geometric efficiency. At the lowest incident energy, the cross section measured for the neutron-rich ⁴⁷K-induced reaction is ≈6 times larger than that of the β-stable system. This experimental approach, both in measurement and in analysis, demonstrates how to efficiently measure fusion with low-intensity re-accelerated radioactive beams, establishing the framework for future studies.
Burger, Brian T.; Imam, Saheed; Scarborough, Matthew J.; ...
2017-06-06
Rhodobacter sphaeroides is one of the best-studied alphaproteobacteria from biochemical, genetic, and genomic perspectives. To gain a better systems-level understanding of this organism, we generated a large transposon mutant library and used transposon sequencing (Tn-seq) to identify genes that are essential under several growth conditions. Using newly developed Tn-seq analysis software (TSAS), we identified 493 genes as essential for aerobic growth on a rich medium. We then used the mutant library to identify conditionally essential genes under two laboratory growth conditions, identifying 85 additional genes required for aerobic growth in a minimal medium and 31 additional genes required for photosynthetic growth. In all instances, our analyses confirmed essentiality for many known genes and identified genes not previously considered to be essential. We used the resulting Tn-seq data to refine and improve a genome-scale metabolic network model (GEM) for R. sphaeroides. Together, we demonstrate how genetic, genomic, and computational approaches can be combined to obtain a systems-level understanding of the genetic framework underlying metabolic diversity in bacterial species.
Identifying direct miRNA-mRNA causal regulatory relationships in heterogeneous data.
Zhang, Junpeng; Le, Thuc Duy; Liu, Lin; Liu, Bing; He, Jianfeng; Goodall, Gregory J; Li, Jiuyong
2014-12-01
Discovering the regulatory relationships between microRNAs (miRNAs) and mRNAs is an important problem that interests many biologists and medical researchers. A number of computational methods have been proposed to infer miRNA-mRNA regulatory relationships, and are mostly based on the statistical associations between miRNAs and mRNAs discovered in observational data. The miRNA-mRNA regulatory relationships identified by these methods can be both direct and indirect regulations. However, differentiating direct regulatory relationships from indirect ones is important for biologists in experimental designs. In this paper, we present a causal discovery based framework (called DirectTarget) to infer direct miRNA-mRNA causal regulatory relationships in heterogeneous data, including expression profiles of miRNAs and mRNAs, and miRNA target information. DirectTarget is applied to the Epithelial to Mesenchymal Transition (EMT) datasets. The validation by experimentally confirmed target databases suggests that the proposed method can effectively identify direct miRNA-mRNA regulatory relationships. To explore the upstream regulators of miRNA regulation, we further identify the causal feedforward patterns (CFFPs) of TF-miRNA-mRNA to provide insights into the miRNA regulation in EMT. DirectTarget has the potential to be applied to other datasets to elucidate the direct miRNA-mRNA causal regulatory relationships and to explore the regulatory patterns. Copyright © 2014 Elsevier Inc. All rights reserved.
Parameterization models for pesticide exposure via crop consumption.
Fantke, Peter; Wieland, Peter; Juraske, Ronnie; Shaddick, Gavin; Itoiz, Eva Sevigné; Friedrich, Rainer; Jolliet, Olivier
2012-12-04
An approach for estimating human exposure to pesticides via consumption of six important food crops is presented that can be used to extend multimedia models applied in health risk and life cycle impact assessment. We first assessed the variation of model output (pesticide residues per kg applied) as a function of model input variables (substance, crop, and environmental properties) including their possible correlations using matrix algebra. We identified five key parameters responsible for between 80% and 93% of the variation in pesticide residues, namely time between substance application and crop harvest, degradation half-lives in crops and on crop surfaces, overall residence times in soil, and substance molecular weight. Partition coefficients also play an important role for fruit trees and tomato (Kow), potato (Koc), and lettuce (Kaw, Kow). Focusing on these parameters, we develop crop-specific models by parameterizing a complex fate and exposure assessment framework. The parametric models thereby reflect the framework's physical and chemical mechanisms and predict pesticide residues in harvest using linear combinations of crop, crop surface, and soil compartments. Parametric model results correspond well with results from the complex framework for 1540 substance-crop combinations, with total deviations between a factor of 4 (potato) and a factor of 66 (lettuce). Predicted residues also correspond well with experimental data previously used to evaluate the complex framework. Pesticide mass in harvest can finally be combined with reduction factors accounting for food processing to estimate human exposure from crop consumption. All parametric models can be easily implemented into existing assessment frameworks.
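A minimal sketch of this kind of parametric residue model: residue in harvest as a linear combination of crop-interior and crop-surface compartments, each decaying first-order with its own half-life. The compartment weights and parameter values below are illustrative, not fitted values from the paper:

```python
import math

# Residue per kg applied as a weighted sum of first-order decay terms.
# Weights (w_crop, w_surface) and half-lives are hypothetical.
def residue_per_kg_applied(t_harvest_days, half_life_crop, half_life_surface,
                           w_crop=0.7, w_surface=0.3):
    decay = lambda t, hl: math.exp(-math.log(2.0) * t / hl)
    return (w_crop * decay(t_harvest_days, half_life_crop)
            + w_surface * decay(t_harvest_days, half_life_surface))

# longer pre-harvest intervals leave less residue
print(round(residue_per_kg_applied(14, half_life_crop=10, half_life_surface=3), 4))
```

This structure makes clear why time-to-harvest and the two degradation half-lives dominate the output variance: they sit in the exponents.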
A Surrogate Approach to the Experimental Optimization of Multielement Airfoils
NASA Technical Reports Server (NTRS)
Otto, John C.; Landman, Drew; Patera, Anthony T.
1996-01-01
The incorporation of experimental test data into the optimization process is accomplished through the use of Bayesian-validated surrogates. In the surrogate approach, a surrogate for the experiment (e.g., a response surface) serves in the optimization process. The validation step of the framework provides a qualitative assessment of the surrogate quality, and bounds the surrogate-for-experiment error on designs "near" surrogate-predicted optimal designs. The utility of the framework is demonstrated through its application to the experimental selection of the trailing-edge flap position to achieve a design lift coefficient for a three-element airfoil.
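The surrogate idea can be sketched as follows: fit a response surface to a handful of experimental observations (here, flap position versus lift coefficient) and optimize the surrogate instead of the experiment. The data are synthetic, and the Bayesian validation step that bounds the surrogate-for-experiment error is omitted:

```python
import numpy as np

# Quadratic response-surface surrogate fitted to synthetic wind-tunnel data.
x = np.array([0.0, 5.0, 10.0, 15.0, 20.0])   # flap deflection, deg
cl = np.array([1.0, 1.6, 1.9, 1.8, 1.4])     # measured lift coefficient
a, b, c = np.polyfit(x, cl, 2)               # surrogate: cl ~ a*x**2 + b*x + c
x_opt = -b / (2.0 * a)                       # stationary point (maximum, since a < 0)
print(round(float(x_opt), 2))                # surrogate-predicted optimal flap setting
```

The validation step in the actual framework then certifies how far the surrogate's predicted optimum can stray from the true experimental optimum.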
Todorova, Tanya K; Rozanska, Xavier; Gervais, Christel; Legrand, Alexandre; Ho, Linh N; Berruyer, Pierrick; Lesage, Anne; Emsley, Lyndon; Farrusseng, David; Canivet, Jérôme; Mellot-Draznieks, Caroline
2016-11-07
We use density functional theory, newly parameterized molecular dynamics simulations, and last-generation ¹⁵N dynamic nuclear polarization surface-enhanced solid-state NMR spectroscopy (DNP SENS) to understand graft-host interactions and effects imposed by the metal-organic framework (MOF) host on peptide conformations in a peptide-functionalized MOF. Focusing on two grafts typified by MIL-68-proline (-Pro) and MIL-68-glycine-proline (-Gly-Pro), we identified the most likely peptide conformations adopted in the functionalized hybrid frameworks. We found that hydrogen bond interactions between the graft and the surface hydroxyl groups of the MOF are essential in determining the peptides' conformation(s). DNP SENS methodology shows unprecedented signal enhancements when applied to these peptide-functionalized MOFs. The calculated chemical shifts of selected MIL-68-NH-Pro and MIL-68-NH-Gly-Pro conformations are in a good agreement with the experimentally obtained ¹⁵N NMR signals. The study shows that the conformations of peptides when grafted in a MOF host are unlikely to be freely distributed, and conformational selection is directed by strong host-guest interactions. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Organizational Use of a Framework for Innovation Adoption
2011-09-01
A framework for identifying gaps in current processes is drawn from the eight practices identified in Denning and Dunham's The Innovator's Way: Essential Practices for Successful Innovation (2010). Methods applied within the eight-practice framework include the Marine Corps Planning Process (MCPP).
Davis, Kevin C; Blitstein, Jonathan L; Evans, W Douglas; Kamyab, Kian
2010-07-21
Prior research supports the notion that parents have the ability to influence their children's decisions regarding sexual behavior. Yet parent-based approaches to curbing teen pregnancy and STDs have been relatively unexplored. The Parents Speak Up National Campaign (PSUNC) is a multimedia campaign that attempts to fill this void by targeting parents of teens to encourage parent-child communication about waiting to have sex. The campaign follows a theoretical framework that identifies cognitions that are targeted in campaign messages and theorized to influence parent-child communication. While a previous experimental study showed PSUNC messages to be effective in increasing parent-child communication, it did not address how these effects manifest through the PSUNC theoretical framework. The current study examines the PSUNC theoretical framework by 1) estimating the impact of PSUNC on specific cognitions identified in the theoretical framework and 2) examining whether those cognitions are indeed associated with parent-child communication. Our study consists of a randomized efficacy trial of PSUNC messages under controlled conditions. A sample of 1,969 parents was randomly assigned to treatment (PSUNC exposure) and control (no exposure) conditions. Parents were surveyed at baseline, 4 weeks, 6 months, 12 months, and 18 months post-baseline. Linear regression procedures were used in our analyses. Outcome variables included self-efficacy to communicate with child, long-term outcome expectations that communication would be successful, and norms on appropriate age for sexual initiation. We first estimated multivariable models to test whether these cognitive variables predict parent-child communication longitudinally. Longitudinal change in each cognitive variable was then estimated as a function of treatment condition, controlling for baseline individual characteristics.
Norms related to appropriate age for sexual initiation and outcome expectations that communication would be successful were predictive of parent-child communication among both mothers and fathers. Treatment condition mothers exhibited larger changes than control mothers in both of these cognitive variables. Fathers exhibited no exposure effects. Results suggest that within a controlled setting, the "wait until older norm" and long-term outcome expectations were appropriate cognitions to target and the PSUNC media materials were successful in impacting them, particularly among mothers. This study highlights the importance of theoretical frameworks for parent-focused campaigns that identify appropriate behavioral precursors that are both predictive of a campaign's distal behavioral outcome and sensitive to campaign messages.
Yee, Susan H; Bradley, Patricia; Fisher, William S; Perreault, Sally D; Quackenboss, James; Johnson, Eric D; Bousquin, Justin; Murphy, Patricia A
2012-12-01
The U.S. Environmental Protection Agency has recently realigned its research enterprise around the concept of sustainability. Scientists from across multiple disciplines have a role to play in contributing the information, methods, and tools needed to more fully understand the long-term impacts of decisions on the social and economic sustainability of communities. Success will depend on a shift in thinking to integrate, organize, and prioritize research within a systems context. We used the Driving forces-Pressures-State-Impact-Response (DPSIR) framework as a basis for integrating social, cultural, and economic aspects of environmental and human health into a single framework. To make the framework broadly applicable to sustainability research planning, we provide a hierarchical system of DPSIR keywords and guidelines for use as a communication tool. The applicability of the integrated framework was first tested on a public health issue (asthma disparities) for purposes of discussion. We then applied the framework at a science planning meeting to identify opportunities for sustainable and healthy communities research. We conclude that an integrated systems framework has many potential roles in science planning, including identifying key issues, visualizing interactions within the system, identifying research gaps, organizing information, developing computational models, and identifying indicators.
The theoretical tools of experimental gravitation
NASA Technical Reports Server (NTRS)
Will, C. M.
1972-01-01
Theoretical frameworks for testing relativistic gravity are presented in terms of a system for analyzing theories of gravity invented as alternatives to Einstein. The parametrized post-Newtonian (PPN) formalism, based on the Dicke framework and the Eotvos-Dicke-Braginsky experiment, is discussed in detail. The metric theories of gravity, and their post-Newtonian limits are reviewed, and PPN equations of motion are derived. These equations are used to analyze specific effects and experimental tests in the solar system.
Soler, Miguel A; de Marco, Ario; Fortuna, Sara
2016-10-10
Nanobodies (VHHs) have proved to be valuable substitutes for conventional antibodies in molecular recognition. Their small size is a precious advantage for rational mutagenesis based on modelling. Here we address the problem of predicting how Camelidae nanobody sequences tolerate mutations by developing a simulation protocol based on all-atom molecular dynamics and whole-molecule docking. The method was tested on two sets of nanobodies characterized experimentally for their biophysical features. One set contained point mutations introduced to humanize a wild-type sequence; in the second, the CDRs were swapped between single-domain frameworks with Camelidae and human hallmarks. The method yielded accurate scoring approaches for predicting experimental yields and made it possible to identify the structural modifications induced by mutations. This work provides a promising tool for the in silico development of single-domain antibodies and opens the opportunity to customize single functional domains of larger macromolecules.
Akimbekov, Zamirbek; Katsenis, Athanassios D; Nagabhushana, G P; Ayoub, Ghada; Arhangelskis, Mihails; Morris, Andrew J; Friščić, Tomislav; Navrotsky, Alexandra
2017-06-14
We provide the first combined experimental and theoretical evaluation of how differences in ligand structure and framework topology affect the relative stabilities of isocompositional (i.e., true polymorph) metal-organic frameworks (MOFs). We used solution calorimetry and periodic DFT calculations to analyze the thermodynamics of two families of topologically distinct polymorphs of zinc zeolitic imidazolate frameworks (ZIFs) based on 2-methyl- and 2-ethylimidazolate linkers, demonstrating a correlation between measured thermodynamic stability and density, and a pronounced effect of the ligand substituent on their stability. The results show that mechanochemical syntheses and transformations of ZIFs are consistent with Ostwald's rule of stages and proceed toward thermodynamically increasingly stable, more dense phases.
Beckwith, Sue; Dickinson, Angela; Kendall, Sally
2008-12-01
This paper draws on the work of Paley and Duncan et al. in order to extend and engender debate regarding the use of Concept Analysis frameworks. Despite the apparent plethora of Concept Analysis frameworks used in nursing studies, we found that over half of those used were derived from the work of one author. This paper explores the suitability and use of these frameworks and is set at a time when the number of published concept analysis papers is increasing. For the purpose of this study, thirteen commonly used frameworks, identified from nursing journals from 1993 to 2005, were explored to reveal their origins, ontological and philosophical stance, and any common elements. The frameworks were critiqued and links made between their antecedents. It was noted whether the articles contained discussion of any possible tensions between the ontological perspective of the framework used, the process of analysis, praxis, and possible nursing theory developments. It was found that the thirteen identified frameworks are mainly based on hermeneutic propositions regarding understandings and are interpretive procedures founded on self-reflective modes of discovery. Six frameworks rely on or include the use of casuistry. Seven of the frameworks identified are predicated on, or adapt, the work of Wilson, a schoolmaster writing for his pupils. Wilson's framework has a simplistic eleven-step, binary, and reductionist structure. Other frameworks identified include Morse et al.'s framework, which this article suggests employs a contestable theory of concept maturity. Based on the findings revealed through our exploration of the use of concept analysis frameworks in the nursing literature, concerns were raised regarding unjustified adaptations and alterations and the uncritical use of the frameworks. There is little evidence that these frameworks provide the necessary depth, rigor, or replicability to enable the development of the nursing theory they underpin.
Karkar, Ravi; Schroeder, Jessica; Epstein, Daniel A; Pina, Laura R; Scofield, Jeffrey; Fogarty, James; Kientz, Julie A; Munson, Sean A; Vilardaga, Roger; Zia, Jasmine
2017-05-02
Diagnostic self-tracking, the recording of personal information to diagnose or manage a health condition, is a common practice, especially for people with chronic conditions. Unfortunately, many who attempt diagnostic self-tracking have trouble accomplishing their goals. People often lack knowledge and skills needed to design and conduct scientifically rigorous experiments, and current tools provide little support. To address these shortcomings and explore opportunities for diagnostic self-tracking, we designed, developed, and evaluated a mobile app that applies a self-experimentation framework to support patients suffering from irritable bowel syndrome (IBS) in identifying their personal food triggers. TummyTrials aids a person in designing, executing, and analyzing self-experiments to evaluate whether a specific food triggers their symptoms. We examined the feasibility of this approach in a field study with 15 IBS patients, finding that participants could use the tool to reliably undergo a self-experiment. However, we also discovered an underlying tension between scientific validity and the lived experience of self-experimentation. We discuss challenges of applying clinical research methods in everyday life, motivating a need for the design of self-experimentation systems to balance rigor with the uncertainties of everyday life.
2009-08-05
Socio-cultural data acquisition, extraction, and management. The idea of a theoretical framework is briefly discussed, along with hypothesis development, experimental design, ethical research, statistical power, and human laboratory studies. Subject terms: human behavior, theoretical framework, hypothesis development, experimental design, ethical research, statistical power.
New phenomena in non-equilibrium quantum physics
NASA Astrophysics Data System (ADS)
Kitagawa, Takuya
From its beginnings in the early 20th century, quantum theory has become progressively more important, especially through its contributions to the development of technologies. Quantum mechanics is crucial for current technology such as semiconductors, and also holds promise for future technologies such as superconductors and quantum computing. Despite the success of quantum theory, its applications have been mostly limited to equilibrium or static systems, owing to (1) a lack of experimental controllability of non-equilibrium quantum systems and (2) a lack of theoretical frameworks for understanding non-equilibrium dynamics. Consequently, few interesting phenomena had been discovered in non-equilibrium quantum systems from either a theoretical or an experimental point of view, and non-equilibrium quantum physics attracted little attention. The situation has recently changed with the rapid development of experimental techniques in condensed matter as well as cold atom systems, which now enable better control of non-equilibrium quantum systems. Motivated by this experimental progress, we constructed theoretical frameworks to study three different non-equilibrium regimes: transient dynamics, steady states, and periodic driving. These frameworks provide new perspectives on dynamical quantum processes and help to discover new phenomena in these systems. In this thesis, we describe these frameworks through explicit examples and demonstrate their versatility. Some of these theoretical proposals have been realized in experiments, confirming the applicability of the theories to realistic experimental situations. These studies have led not only to an improved fundamental understanding of non-equilibrium processes in quantum systems, but also to entirely different avenues for developing quantum technologies.
An algorithm to predict the connectome of neural microcircuits
Reimann, Michael W.; King, James G.; Muller, Eilif B.; Ramaswamy, Srikanth; Markram, Henry
2015-01-01
Experimentally mapping synaptic connections, in terms of the numbers and locations of their synapses and estimating connection probabilities, is still not a tractable task, even for small volumes of tissue. In fact, the six layers of the neocortex contain thousands of unique types of synaptic connections between the many different types of neurons, of which only a handful have been characterized experimentally. Here we present a theoretical framework and a data-driven algorithmic strategy to digitally reconstruct the complete synaptic connectivity between the different types of neurons in a small well-defined volume of tissue—the micro-scale connectome of a neural microcircuit. By enforcing a set of established principles of synaptic connectivity, and leveraging interdependencies between fundamental properties of neural microcircuits to constrain the reconstructed connectivity, the algorithm yields three parameters per connection type that predict the anatomy of all types of biologically viable synaptic connections. The predictions reproduce a spectrum of experimental data on synaptic connectivity not used by the algorithm. We conclude that an algorithmic approach to the connectome can serve as a tool to accelerate experimental mapping, indicating the minimal dataset required to make useful predictions, identifying the datasets required to improve their accuracy, testing the feasibility of experimental measurements, and making it possible to test hypotheses of synaptic connectivity. PMID:26500529
Challenges to inferring causality from viral information dispersion in dynamic social networks
NASA Astrophysics Data System (ADS)
Ternovski, John
2014-06-01
Understanding the mechanism behind large-scale information dispersion through complex networks has important implications for a variety of industries ranging from cyber-security to public health. With the unprecedented availability of public data from online social networks (OSNs) and the low-cost nature of most OSN outreach, randomized controlled experiments, the "gold standard" of causal inference methodologies, have been used with increasing regularity to study viral information dispersion. And while these studies have dramatically furthered our understanding of how information disseminates through social networks by isolating causal mechanisms, there are still major methodological concerns that need to be addressed in future research. This paper delineates why modern OSNs are markedly different from traditional sociological social networks and why these differences present unique challenges to experimentalists and data scientists. The dynamic nature of OSNs is particularly troublesome for researchers implementing experimental designs, so this paper identifies major sources of bias arising from network mutability and suggests strategies to circumvent and adjust for these biases. This paper also discusses the practical considerations of data quality and collection, which may adversely impact the efficiency of the estimator. The major experimental methodologies used in the current literature on virality are assessed at length, and their strengths and limits identified. Other, as-yet-unsolved threats to the efficiency and unbiasedness of causal estimators--such as missing data--are also discussed. This paper integrates methodologies and lessons learned from a variety of fields under an experimental and data science framework in order to systematically consolidate and identify current methodological limitations of randomized controlled experiments conducted in OSNs.
Developing a theoretical framework for complex community-based interventions.
Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana
2014-01-01
Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms by which the independent variables lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.
2014-01-01
Background: Striking a balance between model complexity and parameter identifiability, while still producing biologically feasible simulations, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. Results: The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input–output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data.
Our analysis revealed that model parameters could be constrained to a standard deviation of, on average, 15% of the mean values across the resulting parameter sets. Conclusions: Our results indicate that the presented approach is effective for comparing model alternatives and reducing models to the minimum complexity replicating measured data. We therefore believe that this approach has significant potential for reparameterising existing frameworks, for identifying redundant components of large biophysical models, and for increasing their predictive capacity. PMID:24886522
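The "zooming into biologically feasible regions" step described above can be illustrated with a toy rejection loop (a minimal sketch under stated assumptions: the function names, the deterministic toy model, and the simple tolerance criterion are illustrative, not the paper's metamodelling pipeline):

```python
import random

def feasible_parameter_sets(simulate, data, sample_prior, tol, n_draws=1000, seed=1):
    """Keep only parameter draws whose simulated output stays within `tol`
    of every measured data point -- a toy stand-in for zooming into
    biologically feasible regions of parameter space."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_draws):
        theta = sample_prior(rng)            # draw a candidate parameter set
        sim = simulate(theta)                # run the (deterministic) model
        # retain the draw only if the worst-case mismatch is within tolerance
        if max(abs(s - d) for s, d in zip(sim, data)) <= tol:
            kept.append(theta)
    return kept
```

Repeating this on a narrowed prior built around the retained draws gives the iterative "zoom" behaviour the abstract describes.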
ERIC Educational Resources Information Center
Gray, Ron
2014-01-01
Inquiry experiences in secondary science classrooms are heavily weighted toward experimentation. We know, however, that many fields of science (e.g., evolutionary biology, cosmology, and paleontology), while they may utilize experiments, are not justified by experimental methodologies. With the focus on experimentation in schools, these fields of…
Presence within a mixed reality environment.
van Schaik, Paul; Turnbull, Triece; van Wersch, Anna; Drummond, Sarah
2004-10-01
Mixed reality environments represent a new approach to creating technology-mediated experiences. However, there is a lack of empirical research investigating users' actual experience. The aim of the current exploratory, non-experimental study was to establish levels of and identify factors associated with presence, within the framework of Schubert et al.'s model of presence. Using questionnaire and interview methods, the experience of the final performance of the Desert Rain mixed reality environment was investigated. Levels of general and spatial presence were relatively high, but levels of involvement and realness were not. Overall, intrinsic motivation, confidence and intention to re-visit Desert Rain were high. However, age was negatively associated with both spatial presence and confidence to play. Furthermore, various problems in navigating the environment were identified. Results are discussed in terms of Schubert's model and other theoretical perspectives. Implications for system design are presented.
Mapping genomic features to functional traits through microbial whole genome sequences.
Zhang, Wei; Zeng, Erliang; Liu, Dan; Jones, Stuart E; Emrich, Scott
2014-01-01
Recently, the utility of trait-based approaches for microbial communities has been recognized. The increasing availability of whole-genome sequences provides the opportunity to explore the genetic foundations of a variety of functional traits. We proposed a machine learning framework to quantitatively link genomic features with functional traits. Genes from bacterial genomes associated with different functional traits were grouped into Clusters of Orthologous Groups (COGs) and used as features. The TF-IDF technique from the text-mining domain was then applied to transform the data to reflect the abundance and importance of each COG. After TF-IDF processing, COGs were ranked using feature selection methods to identify their relevance to the functional trait of interest. Extensive experimental results demonstrated that functional-trait-related genes can be detected using our method. Further, the method has the potential to provide novel biological insights.
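The COG-count-to-TF-IDF transform described above can be sketched in a few lines (a minimal illustration; the function name and the dict-of-counts representation are assumptions, not the authors' code):

```python
import math

def tf_idf(cog_counts):
    """Weight genome-by-COG count vectors with TF-IDF.

    cog_counts: one dict per genome mapping COG id -> raw gene count.
    TF captures a COG's abundance within a genome; IDF down-weights
    COGs shared by every genome, which carry no discriminative signal.
    """
    n = len(cog_counts)
    df = {}                                  # document frequency per COG
    for genome in cog_counts:
        for cog in genome:
            df[cog] = df.get(cog, 0) + 1
    weighted = []
    for genome in cog_counts:
        total = sum(genome.values()) or 1
        weighted.append({
            cog: (count / total) * math.log(n / df[cog])   # tf * idf
            for cog, count in genome.items()
        })
    return weighted
```

A COG present in all genomes gets weight zero (idf = log 1), while a COG unique to one genome keeps a positive weight, which is exactly the property the ranking step exploits.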
Börjesson, Karl; Ćoso, Dušan; Gray, Victor; Grossman, Jeffrey C; Guan, Jingqi; Harris, Charles B; Hertkorn, Norbert; Hou, Zongrui; Kanai, Yosuke; Lee, Donghwa; Lomont, Justin P; Majumdar, Arun; Meier, Steven K; Moth-Poulsen, Kasper; Myrabo, Randy L; Nguyen, Son C; Segalman, Rachel A; Srinivasan, Varadharajan; Tolman, Willam B; Vinokurov, Nikolai; Vollhardt, K Peter C; Weidman, Timothy W
2014-11-17
A study of the scope and limitations of varying the ligand framework around the dinuclear core of FvRu2 in its function as a molecular solar thermal energy storage framework is presented. It includes DFT calculations probing the effect of substituents, other metals, and CO exchange for other ligands on ΔH_storage. Experimentally, the system is shown to be robust inasmuch as it tolerates a number of variations, except for the identity of the metal and certain substitution patterns. Failures include 1,1',3,3'-tetra-tert-butyl (4), 1,2,2',3'-tetraphenyl (9), diiron (28), diosmium (24), mixed iron-ruthenium (27), dimolybdenum (29), and ditungsten (30) derivatives. An extensive screen of potential catalysts for the thermal reversal identified AgNO3-SiO2 as a good candidate, although catalyst decomposition remains a challenge. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Arshad, Sannia; Rho, Seungmin
2014-01-01
We have presented a classification framework that combines multiple heterogeneous classifiers in the presence of class label noise. An extension of m-Mediods-based modeling is presented that generates models of the various classes while identifying and filtering noisy training data. This noise-free data is then used to learn models for other classifiers, such as GMM and SVM. A weight learning method is then introduced to learn per-class weights for the different classifiers to construct an ensemble. For this purpose, we applied a genetic algorithm to search for an optimal weight vector on which the classifier ensemble is expected to give the best accuracy. The proposed approach is evaluated on a variety of real-life datasets and compared with existing standard ensemble techniques such as AdaBoost, Bagging, and Random Subspace Methods. Experimental results show the superiority of the proposed ensemble method over its competitors, especially in the presence of class label noise and imbalanced classes. PMID:25295302
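The per-class weight-learning step lends itself to a compact sketch. Below, a random search stands in for the genetic algorithm described above (same objective, simpler optimiser); the function names and the dict-of-scores representation are illustrative, not the authors' code:

```python
import random

def weighted_vote(class_scores, weights):
    """Combine per-classifier, per-class scores using per-class weights."""
    classes = class_scores[0].keys()
    totals = {c: sum(w[c] * s[c] for w, s in zip(weights, class_scores))
              for c in classes}
    return max(totals, key=totals.get)

def learn_weights(classifier_outputs, labels, n_iter=200, seed=0):
    """Search for per-class classifier weights maximising ensemble accuracy.

    classifier_outputs: one list of per-sample score dicts per classifier.
    A random search stands in for the paper's genetic algorithm.
    """
    rng = random.Random(seed)
    classes = sorted({c for out in classifier_outputs for scores in out for c in scores})
    n_clf = len(classifier_outputs)

    def accuracy(weights):
        correct = sum(
            weighted_vote([classifier_outputs[k][i] for k in range(n_clf)], weights) == y
            for i, y in enumerate(labels)
        )
        return correct / len(labels)

    best_w, best_acc = None, -1.0
    for _ in range(n_iter):
        # candidate: one weight per (classifier, class) pair
        w = [{c: rng.random() for c in classes} for _ in range(n_clf)]
        acc = accuracy(w)
        if acc > best_acc:
            best_w, best_acc = w, acc
    return best_w, best_acc
```

Because weights are learned per class, the search can suppress a classifier only for the classes where it is unreliable, which is the point of the ensemble construction.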
Aligning vocabulary for interoperability of ISR assets using authoritative sources
NASA Astrophysics Data System (ADS)
Hookway, Steve; Patten, Terry; Gorman, Joe
2017-05-01
The growing arsenal of network-centric sensor platforms shows great potential to enhance situational awareness capabilities. Non-traditional sensors collect a diverse range of data that can provide a more accurate and comprehensive common operational picture when combined with conventional intelligence, surveillance, and reconnaissance (ISR) products. One of the integration challenges is mediating differences in terminology that different data providers use to describe the data they have extracted. A data consumer should be able to reference information using the vocabulary that they are familiar with and rely on the framework to handle the mediation; for example, it should be up to the framework to identify that two different terms are synonyms for the same concept. In this paper we present an approach for automatically performing this alignment using authoritative sources such as Wikipedia (a stand-in for the Intellipedia wiki), and present experimental results that demonstrate that this approach is able to align a large number of concepts between different terminologies.
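The mediation step described above reduces, at its core, to checking whether two vocabularies' terms fall into the same synonym set. A minimal sketch (the synonym sets stand in for data mined from an authoritative source such as Wikipedia redirects; the function name and inputs are illustrative):

```python
def align_terms(synonym_sets, term_a, term_b):
    """Decide whether two terms from different vocabularies name the same
    concept, judged by co-membership in a shared synonym set."""
    norm = lambda term: term.strip().lower()   # case/whitespace normalisation
    a, b = norm(term_a), norm(term_b)
    for synonyms in synonym_sets:
        normalized = {norm(t) for t in synonyms}
        if a in normalized and b in normalized:
            return True
    return False
```

In a real system the synonym sets would be harvested automatically from the authoritative source, so the data consumer never has to know the producer's terminology.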
Lu, Qiongshi; Hu, Yiming; Sun, Jiehuan; Cheng, Yuwei; Cheung, Kei-Hoi; Zhao, Hongyu
2015-05-27
Identifying functional regions in the human genome is a major goal in human genetics. Great efforts have been made to functionally annotate the human genome either through computational predictions, such as genomic conservation, or high-throughput experiments, such as the ENCODE project. These efforts have resulted in a rich collection of functional annotation data of diverse types that need to be jointly analyzed for integrated interpretation and annotation. Here we present GenoCanyon, a whole-genome annotation method that performs unsupervised statistical learning using 22 computational and experimental annotations thereby inferring the functional potential of each position in the human genome. With GenoCanyon, we are able to predict many of the known functional regions. The ability of predicting functional regions as well as its generalizable statistical framework makes GenoCanyon a unique and powerful tool for whole-genome annotation. The GenoCanyon web server is available at http://genocanyon.med.yale.edu.
An extended genotyping framework for Salmonella enterica serovar Typhi, the cause of human typhoid
Wong, Vanessa K.; Baker, Stephen; Connor, Thomas R.; Pickard, Derek; Page, Andrew J.; Dave, Jayshree; Murphy, Niamh; Holliman, Richard; Sefton, Armine; Millar, Michael; Dyson, Zoe A.; Dougan, Gordon; Holt, Kathryn E.; Parkhill, Julian; Feasey, Nicholas A.; Kingsley, Robert A.; Thomson, Nicholas R.; Keane, Jacqueline A.; Weill, François- Xavier; Le Hello, Simon; Hawkey, Jane; Edwards, David J.; Harris, Simon R.; Cain, Amy K.; Hadfield, James; Hart, Peter J.; Thieu, Nga Tran Vu; Klemm, Elizabeth J.; Breiman, Robert F.; Watson, Conall H.; Edmunds, W. John; Kariuki, Samuel; Gordon, Melita A.; Heyderman, Robert S.; Okoro, Chinyere; Jacobs, Jan; Lunguya, Octavie; Msefula, Chisomo; Chabalgoity, Jose A.; Kama, Mike; Jenkins, Kylie; Dutta, Shanta; Marks, Florian; Campos, Josefina; Thompson, Corinne; Obaro, Stephen; MacLennan, Calman A.; Dolecek, Christiane; Keddy, Karen H.; Smith, Anthony M.; Parry, Christopher M.; Karkey, Abhilasha; Dongol, Sabina; Basnyat, Buddha; Arjyal, Amit; Mulholland, E. Kim; Campbell, James I.; Dufour, Muriel; Bandaranayake, Don; Toleafoa, Take N.; Singh, Shalini Pravin; Hatta, Mochammad; Newton, Paul N.; Dance, David; Davong, Viengmon; Onsare, Robert S.; Isaia, Lupeoletalalelei; Thwaites, Guy; Wijedoru, Lalith; Crump, John A.; De Pinna, Elizabeth; Nair, Satheesh; Nilles, Eric J.; Thanh, Duy Pham; Turner, Paul; Soeng, Sona; Valcanis, Mary; Powling, Joan; Dimovski, Karolina; Hogg, Geoff; Farrar, Jeremy; Mather, Alison E.; Amos, Ben
2016-01-01
The population of Salmonella enterica serovar Typhi (S. Typhi), the causative agent of typhoid fever, exhibits limited DNA sequence variation, which complicates efforts to rationally discriminate individual isolates. Here we utilize data from whole-genome sequences (WGS) of nearly 2,000 isolates sourced from over 60 countries to generate a robust genotyping scheme that is phylogenetically informative and compatible with a range of assays. These data show that, with the exception of the rapidly disseminating H58 subclade (now designated genotype 4.3.1), the global S. Typhi population is highly structured and includes dozens of subclades that display geographical restriction. The genotyping approach presented here can be used to interrogate local S. Typhi populations and help identify recent introductions of S. Typhi into new or previously endemic locations, providing information on their likely geographical source. This approach can be used to classify clinical isolates and provides a universal framework for further experimental investigations. PMID:27703135
A Dynamic Bayesian Observer Model Reveals Origins of Bias in Visual Path Integration.
Lakshminarasimhan, Kaushik J; Petsalis, Marina; Park, Hyeshin; DeAngelis, Gregory C; Pitkow, Xaq; Angelaki, Dora E
2018-06-20
Path integration is a strategy by which animals track their position by integrating their self-motion velocity. To identify the computational origins of bias in visual path integration, we asked human subjects to navigate in a virtual environment using optic flow and found that they generally traveled beyond the goal location. Such a behavior could stem from leaky integration of unbiased self-motion velocity estimates or from a prior expectation favoring slower speeds that causes velocity underestimation. Testing both alternatives using a probabilistic framework that maximizes expected reward, we found that subjects' biases were better explained by a slow-speed prior than imperfect integration. When subjects integrate paths over long periods, this framework intriguingly predicts a distance-dependent bias reversal due to buildup of uncertainty, which we also confirmed experimentally. These results suggest that visual path integration in noisy environments is limited largely by biases in processing optic flow rather than by leaky integration. Copyright © 2018 Elsevier Inc. All rights reserved.
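The slow-speed-prior account can be made concrete with the standard Gaussian shrinkage calculation (parameter values here are illustrative, not fitted; the paper's observer model is richer):

```python
def speed_estimate(observed, sigma_obs, sigma_prior):
    """Posterior-mean speed for a Gaussian likelihood N(observed, sigma_obs^2)
    combined with a zero-mean Gaussian prior over speed.

    The shrinkage factor k < 1 pulls the estimate below the true speed, so
    integrated distance is undershot and the subject keeps moving past the
    goal before believing it has been reached.
    """
    k = sigma_prior ** 2 / (sigma_prior ** 2 + sigma_obs ** 2)
    return k * observed
```

For example, with sigma_obs = 2 and sigma_prior = 4, a true speed of 10 is estimated as 8; noisier optic flow (larger sigma_obs) shrinks the estimate further toward zero, producing larger overshoot.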
Targeted Single-Site MOF Node Modification: Trivalent Metal Loading via Atomic Layer Deposition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, In Soo; Borycz, Joshua; Platero-Prats, Ana E.
Postsynthetic functionalization of metal-organic frameworks (MOFs) enables the controlled, high-density incorporation of new atoms on a crystallographically precise framework. Leveraging the broad palette of known atomic layer deposition (ALD) chemistries, ALD in MOFs (AIM) is one such targeted approach to construct diverse, highly functional, few-atom clusters. We here demonstrate the saturating reaction of trimethylindium (InMe3) with the node hydroxyls and ligated water of NU-1000, which takes place without significant loss of MOF crystallinity or internal surface area. We computationally identify the elementary steps by which trimethylated trivalent metal compounds (ALD precursors) react with this Zr-based MOF node to generate a uniform and well-characterized new surface layer on the node itself, and we predict a final structure that is fully consistent with experimental X-ray pair distribution function (PDF) analysis. We further demonstrate tunable metal loading through controlled number density of the reactive handles (-OH and -OH2) achieved through node dehydration at elevated temperatures.
Anomaly Detection of Electromyographic Signals.
Ijaz, Ahsan; Choi, Jongeun
2018-04-01
In this paper, we provide a robust framework to detect anomalous electromyographic (EMG) signals and identify contamination types. As a first step for feature selection, an optimally selected Lawton wavelet transform is applied. Robust principal component analysis (rPCA) is then performed on these wavelet coefficients to obtain features in a lower dimension. The rPCA-based features are used for constructing a self-organizing map (SOM). Finally, hierarchical clustering is applied on the SOM, separating anomalous signals residing in the smaller clusters and breaking them into logical units for contamination identification. The proposed methodology is tested using synthetic and real-world EMG signals. The synthetic EMG signals are generated using a heteroscedastic process mimicking desired experimental setups. A sub-part of these synthetic signals is introduced with anomalies. These results are followed with real EMG signals introduced with synthetic anomalies. Finally, a heterogeneous real-world data set is used with known quality issues under an unsupervised setting. The framework provides recall of 90% (±3.3) and precision of 99% (±0.4).
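The detection idea can be sketched in heavily simplified form: score each signal window by its reconstruction error after projection onto the dominant subspace. Ordinary PCA is used below as a stand-in for the paper's rPCA + SOM + hierarchical-clustering pipeline, and all names are illustrative:

```python
import numpy as np

def anomaly_scores(X, n_components=1):
    """Score each row of X (one feature vector per signal window) by its
    reconstruction error after projecting onto the top principal
    components; contaminated windows off the dominant subspace score high."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T           # top principal directions
    recon = Xc @ V @ V.T              # project onto subspace and reconstruct
    return np.linalg.norm(Xc - recon, axis=1)
```

Thresholding these scores (or, as in the paper, clustering the low-dimensional features) separates the anomalous minority from the clean majority.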
Towards adaptive, streaming analysis of x-ray tomography data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Mathew; Kleese van Dam, Kerstin; Marshall, Matthew J.
2015-03-04
Temporal and spatial resolution of chemical imaging methodologies such as x-ray tomography are rapidly increasing, leading to more complex experimental procedures and fast growing data volumes. Automated analysis pipelines and big data analytics are becoming essential to effectively evaluate the results of such experiments. Offering those data techniques in an adaptive, streaming environment can further substantially improve the scientific discovery process, by enabling experimental control and steering based on the evaluation of emerging phenomena as they are observed by the experiment. Pacific Northwest National Laboratory (PNNL)’s Chemical Imaging Initiative (CII - http://imaging.pnnl.gov/ ) has worked since 2011 towards developing a framework that allows users to rapidly compose and customize high throughput experimental analysis pipelines for multiple instrument types. The framework, named the ‘Rapid Experimental Analysis’ (REXAN) Framework [1], is based on the idea of reusable component libraries and utilizes the PNNL-developed collaborative data management and analysis environment ‘Velo’ to provide a user-friendly analysis and data management environment for experimental facilities. This article will discuss the capabilities established for x-ray tomography and lessons learned, and provide an overview of our more recent work in the Analysis in Motion Initiative (AIM - http://aim.pnnl.gov/ ) at PNNL to provide REXAN capabilities in a streaming environment.
A Social-Attributional Analysis of Alcohol Response
Fairbairn, Catharine E.; Sayette, Michael A.
2014-01-01
Conventional wisdom and survey data indicate that alcohol is a social lubricant and is consumed for its social effects. In contrast, the experimental literature examining alcohol’s effects within a social context reveals that alcohol does not consistently enhance social-emotional experience. We identify a methodological factor that might explain inconsistent alcohol-administration findings, distinguishing between studies featuring unscripted interactions among naïve participants (k = 18) and those featuring scripted social interactions with individuals identified as study confederates (k = 18). While 89% of naïve-participant studies find positive effects of alcohol on mood (d = 0.5), only 11% of confederate studies find evidence of significant alcohol-related mood enhancement (d = −0.01). The naïve-participant versus confederate distinction remains robust after controlling for various moderators including stress manipulations, gender, group size, anxiety outcome measure, and within-group consistency of beverage assignment. Based on the findings of our review, we propose a multidimensional, social-attributional framework for understanding alcohol-related reward. Borrowing organizing principles from attribution theory, the social-attributional approach predicts that alcohol will enhance mood when negative outcomes are perceived to be unstable and/or self-relevant. Our framework proposes that alcohol’s effects within a social context are largely explained by its tendency to free individuals from preoccupation with social rejection, allowing them to access social rewards. The social-attributional approach represents a novel framework for integrating distinct, well-validated concepts derived from several theories of alcohol’s effects. It further presents promising lines of inquiry for future research examining the role of social factors in alcohol reward and addiction susceptibility. PMID:25180806
Thompson, Chad M.; Haws, Laurie C.; Harris, Mark A.; Gatto, Nicole M.; Proctor, Deborah M.
2011-01-01
Mode of action (MOA) analysis provides a systematic description of key events leading to adverse health effects in animal bioassays for the purpose of informing human health risk assessment. Uncertainties and data gaps identified in the MOA analysis may also be used to guide future research to improve understanding of the MOAs underlying a specific toxic response and foster development of toxicokinetic and toxicodynamic models. An MOA analysis, consistent with approaches outlined in the MOA Framework as described in the Guidelines for Carcinogen Risk Assessment, was conducted to evaluate small intestinal tumors observed in mice chronically exposed to relatively high concentrations of hexavalent chromium (Cr(VI)) in drinking water. Based on review of the literature, key events in the MOA are hypothesized to include saturation of the reductive capacity of the upper gastrointestinal tract, absorption of Cr(VI) into the intestinal epithelium, oxidative stress and inflammation, cell proliferation, direct and/or indirect DNA modification, and mutagenesis. Although available data generally support the plausibility of these key events, several unresolved questions and data gaps were identified, highlighting the need for obtaining critical toxicokinetic and toxicodynamic data in the target tissue and in the low-dose range. Experimental assays that can address these data gaps are discussed along with strategies for comparisons between responsive and nonresponsive tissues and species. This analysis provides a practical application of MOA Framework guidance and is instructive for the design of studies to improve upon the information available for quantitative risk assessment. PMID:20947717
Caring for people with AIDS: nurses' attitudes and feelings.
Breault, A J; Polifroni, E C
1992-01-01
A qualitative, non-experimental study was conducted to identify the feelings and attitudes that nurses associate with caring for people with AIDS. Data collection and analysis were guided by the phenomenological method. Cognitive dissonance theory served as the theoretical framework to view the experience of caring for someone with AIDS. Data analysis of audiotaped, semi-structured interviews resulted in the identification of six mutually inclusive as well as exclusive themes which represent the attitudes and feelings of nurses: fear, anger, sympathy, self-enhancement, fatigue and helplessness. Particularly evident were differences in the way respondents perceived and treated AIDS patients who are intravenous drug users and those who are homosexuals.
Probing the fusion of neutron-rich nuclei with re-accelerated radioactive beams
Vadas, J.; Singh, Varinderjit; Wiggins, B. B.; ...
2018-03-27
Here, we report the first measurement of the fusion excitation functions for 39,47K + 28Si at near-barrier energies. Evaporation residues resulting from the fusion process were identified by direct measurement of their energy and time-of-flight with high geometric efficiency. At the lowest incident energy, the cross section measured for the neutron-rich 47K-induced reaction is ≈6 times larger than that of the β-stable system. This experimental approach, both in measurement and in analysis, demonstrates how to efficiently measure fusion with low-intensity re-accelerated radioactive beams, establishing the framework for future studies.
Structure and properties of microporous titanosilicate determined by first-principles calculations
NASA Astrophysics Data System (ADS)
Ching, W. Y.; Xu, Yong-Nian; Gu, Zong-Quan
1996-12-01
The structure of ETS-10, a member of the family of synthetic microporous titanosilicates, was recently determined by an ingenious combination of experimental and simulational techniques. However, the locations of the alkali atoms in the framework remained elusive and its electronic structure was totally unknown. Based on first-principles local density calculations, the possible locations of the alkali atoms are identified and the electronic structure and bonding are fully elucidated. ETS-10 is a semiconductor with a direct band gap of 2.33 eV. The Na atoms are likely to be located inside the seven-member ring pore adjacent to the one-dimensional Ti-O-Ti-O- chain.
Harmonizing the MSSM with the Galactic Center excess
NASA Astrophysics Data System (ADS)
Butter, Anja; Murgia, Simona; Plehn, Tilman; Tait, Tim M. P.
2017-08-01
The minimal supersymmetric setup offers a comprehensive framework to interpret the Fermi-LAT Galactic Center excess. Taking into account experimental, theoretical, and astrophysical uncertainties we can identify valid parameter regions linked to different annihilation channels. They extend to dark matter masses above 250 GeV. There exists a very mild tension between the observed relic density and the annihilation rate in the center of our Galaxy for specific channels. The strongest additional constraints come from the new generation of direct detection experiments, ruling out much of the light and intermediate dark matter mass regime and giving preference to heavier dark matter annihilating into a pair of top quarks.
The CRISP theory of hippocampal function in episodic memory
Cheng, Sen
2013-01-01
Over the past four decades, a “standard framework” has emerged to explain the neural mechanisms of episodic memory storage. This framework has been instrumental in driving hippocampal research forward and now dominates the design and interpretation of experimental and theoretical studies. It postulates that cortical inputs drive plasticity in the recurrent cornu ammonis 3 (CA3) synapses to rapidly imprint memories as attractor states in CA3. Here we review a range of experimental studies and argue that the evidence against the standard framework is mounting, notwithstanding the considerable evidence in its support. We propose CRISP as an alternative theory to the standard framework. CRISP is based on Context Reset by dentate gyrus (DG), Intrinsic Sequences in CA3, and Pattern completion in cornu ammonis 1 (CA1). Compared to previous models, CRISP uses a radically different mechanism for storing episodic memories in the hippocampus. Neural sequences are intrinsic to CA3, and inputs are mapped onto these intrinsic sequences through synaptic plasticity in the feedforward projections of the hippocampus. Hence, CRISP does not require plasticity in the recurrent CA3 synapses during the storage process. As in other theories, DG and CA1 play supporting roles; however, their functions in CRISP have distinct implications. For instance, CA1 performs pattern completion in the absence of CA3, and DG contributes to episodic memory retrieval, increasing the speed, precision, and robustness of retrieval. We propose the conceptual theory, discuss its implications for experimental results, and suggest testable predictions. It appears that CRISP not only accounts for those experimental results that are consistent with the standard framework, but also for results that are at odds with the standard framework. We therefore suggest that CRISP is a viable, and perhaps superior, theory for the hippocampal function in episodic memory. PMID:23653597
Density profiles in the Scrape-Off Layer interpreted through filament dynamics
NASA Astrophysics Data System (ADS)
Militello, Fulvio
2017-10-01
We developed a new theoretical framework to clarify the relation between radial Scrape-Off Layer density profiles and the fluctuations that generate them. The framework provides an interpretation of the experimental features of the profiles and of the turbulence statistics on the basis of simple properties of the filaments, such as their radial motion and their draining towards the divertor. L-mode and inter-ELM filaments are described as a Poisson process in which each event is independent and modelled with a wave function of amplitude and width statistically distributed according to experimental observations and evolving according to fluid equations. We will rigorously show that radially accelerating filaments, less efficient parallel exhaust and also a statistical distribution of their radial velocity can contribute to induce flatter profiles in the far SOL and therefore enhance plasma-wall interactions. A quite general result of our analysis is the resiliency of this non-exponential nature of the profiles and the increase of the relative fluctuation amplitude towards the wall, as experimentally observed. According to the framework, profile broadening at high fueling rates can be caused by interactions with neutrals (e.g. charge exchange) in the divertor or by a significant radial acceleration of the filaments. The framework assumptions were tested with 3D numerical simulations of seeded SOL filaments based on a two fluid model. In particular, filaments interact through the electrostatic field they generate only when they are in close proximity (separation comparable to their width in the drift plane), thus justifying our independence hypothesis. In addition, we will discuss how isolated filament motion responds to variations in the plasma conditions, and specifically divertor conditions. Finally, using the theoretical framework we will reproduce and interpret experimental results obtained on JET, MAST and HL-2A.
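The key statistical claim above, that a distribution of filament radial velocities flattens the far-SOL density profile, can be illustrated with a toy superposition. Each filament is reduced to an exponential radial footprint n(x) ~ exp(-x/(v·τ)), where v is its radial speed and τ a parallel drain time; all numbers below are illustrative assumptions, not values from the paper.

```python
import math, random
random.seed(1)

# Each filament drains along the field with time constant tau while
# moving radially at speed v, leaving an exponential footprint
# n(x) ~ exp(-x / (v * tau)).  (A toy version of the paper's framework.)
tau = 1.0
xs = [0.5 * i for i in range(11)]          # radial positions

def profile(velocities):
    return [sum(math.exp(-x / (v * tau)) for v in velocities) / len(velocities)
            for x in xs]

fixed = [1.0] * 2000                                      # one shared speed
spread = [random.uniform(0.2, 1.8) for _ in range(2000)]  # distributed speeds

p_fixed, p_spread = profile(fixed), profile(spread)

# For a single speed the profile is purely exponential (constant decay per
# step); for distributed speeds the local e-folding length grows with x,
# i.e. the far-SOL profile is non-exponential and flattened.
ratio_near = p_spread[1] / p_spread[0]
ratio_far = p_spread[10] / p_spread[9]
print(round(ratio_near, 3), round(ratio_far, 3))
```

The flattening arises because fast filaments dominate the mixture at large radius, mirroring the abstract's point that a statistical spread of radial velocities alone can broaden profiles and enhance plasma-wall interaction.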
A judgment and decision-making model for plant behavior.
Karban, Richard; Orrock, John L
2018-06-12
Recently, plant biologists have documented that plants, like animals, engage in many activities that can be considered as behaviors, although plant biologists currently lack a conceptual framework to understand these processes. Borrowing the well-established framework developed by psychologists, we propose that plant behaviors can be constructively modeled by identifying four distinct components: 1) a cue or stimulus that provides information, 2) a judgment whereby the plant perceives and processes this informative cue, 3) a decision whereby the plant chooses among several options based on their relative costs and benefits, and 4) action. Judgment for plants can be determined empirically by monitoring signaling associated with electrical, calcium, or hormonal fluxes. Decision-making can be evaluated empirically by monitoring gene expression or differential allocation of resources. We provide examples of the utility of this judgment and decision-making framework by considering cases in which plants either successfully or unsuccessfully induced resistance against attacking herbivores. Separating judgment from decision-making suggests new analytical paradigms (i.e., Bayesian methods for judgment and economic utility models for decision-making). Following this framework, we propose an experimental approach to plant behavior that explicitly manipulates the stimuli provided to plants, uses plants that vary in sensory abilities, and examines how environmental context affects plant responses. The concepts and approaches that follow from the judgment and decision-making framework can shape how we study and understand plant-herbivore interactions, biological invasions, plant responses to climate change, and the susceptibility of plants to evolutionary traps. This article is protected by copyright. All rights reserved.
An Experimental Framework for Generating Evolvable Chemical Systems in the Laboratory
NASA Astrophysics Data System (ADS)
Baum, David A.; Vetsigian, Kalin
2017-12-01
Most experimental work on the origin of life has focused on either characterizing the chemical synthesis of particular biochemicals and their precursors or on designing simple chemical systems that manifest life-like properties such as self-propagation or adaptive evolution. Here we propose a new class of experiments, analogous to artificial ecosystem selection, where we select for spontaneously forming self-propagating chemical assemblages in the lab and then seek evidence of a response to that selection as a key indicator that life-like chemical systems have arisen. Since surfaces and surface metabolism likely played an important role in the origin of life, a key experimental challenge is to find conditions that foster nucleation and spread of chemical consortia on surfaces. We propose high-throughput screening of a diverse set of conditions in order to identify combinations of "food," energy sources, and mineral surfaces that foster the emergence of surface-associated chemical consortia that are capable of adaptive evolution. Identification of such systems would greatly advance our understanding of the emergence of self-propagating entities and the onset of adaptive evolution during the origin of life.
A Novel Multi-Class Ensemble Model for Classifying Imbalanced Biomedical Datasets
NASA Astrophysics Data System (ADS)
Bikku, Thulasi; Sambasiva Rao, N., Dr; Rao, Akepogu Ananda, Dr
2017-08-01
This paper mainly focuses on developing a Hadoop-based framework for feature selection and classification models to classify high-dimensional data in heterogeneous biomedical databases. Extensive research has been performed in the fields of machine learning, big data, and data mining for identifying patterns. The main challenge is extracting useful features generated from diverse biological systems. The proposed model can be used for predicting diseases in various applications and identifying the features relevant to particular diseases. With the exponential growth of biomedical repositories such as PubMed and Medline, an accurate predictive model is essential for knowledge discovery in a Hadoop environment. Extracting key features from unstructured documents often leads to uncertain results due to outliers and missing values. In this paper, we propose a two-phase map-reduce framework with a text preprocessor and a classification model. In the first phase, a mapper-based preprocessing method was designed to eliminate irrelevant features, missing values, and outliers from the biomedical data. In the second phase, a Map-Reduce based multi-class ensemble decision tree model was designed and applied to the preprocessed mapper data to improve the true positive rate and computational time. The experimental results on the complex biomedical datasets show that the performance of our proposed Hadoop based multi-class ensemble model significantly outperforms state-of-the-art baselines.
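The two-phase structure described above can be sketched in miniature: a mapper-style cleaning pass that drops records with missing values or outliers, followed by a majority-vote ensemble. The threshold "stumps" below are a deliberately tiny stand-in for the paper's Map-Reduce decision-tree ensemble, and all records and thresholds are invented for illustration.

```python
from collections import Counter

# Phase 1: mapper-style preprocessing -- drop records with missing
# values or out-of-range ("outlier") features.
raw = [
    {"len": 120,  "label": "disease"},
    {"len": None, "label": "disease"},      # missing value: dropped
    {"len": 9000, "label": "healthy"},      # outlier: dropped
    {"len": 80,   "label": "healthy"},
    {"len": 130,  "label": "disease"},
]
clean = [r for r in raw if r["len"] is not None and 0 < r["len"] < 1000]

# Phase 2: a toy ensemble of threshold "stumps" whose majority vote
# stands in for the paper's multi-class decision-tree ensemble.
stumps = [lambda r, t=t: "disease" if r["len"] > t else "healthy"
          for t in (90, 100, 110)]

def predict(record):
    votes = Counter(s(record) for s in stumps)
    return votes.most_common(1)[0][0]

preds = [predict(r) for r in clean]
acc = sum(p == r["label"] for p, r in zip(preds, clean)) / len(clean)
print(preds, acc)
```

Cleaning before classification matters here for the same reason it does at scale: outliers and missing values would otherwise skew both the learned thresholds and the reported accuracy.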
An historical perspective on the pioneering experiments of John Saunders.
Tickle, Cheryll
2017-09-15
John Saunders was a highly skilled embryologist who pioneered the study of limb development. His studies on chick embryos provided the fundamental framework for understanding how vertebrate limbs develop. This framework inspired generations of scientists and formed the bridge from experimental embryology to molecular mechanisms. Saunders investigated how feathers become organized into tracts in the skin of the chick wing and also identified regions of programmed cell death. He discovered that a region of thickened ectoderm that rims the chick wing bud - the apical ectodermal ridge - is required for outgrowth and the laying down of structures along the proximo-distal axis (long axis) of the wing, identified the zone of polarizing activity (ZPA; polarizing region) that controls development across the anteroposterior axis ("thumb to little finger" axis) and contributed to uncovering the importance of the ectoderm in development of structures along the dorso-ventral axis ("back of hand to palm" axis). This review looks in depth at some of his original papers and traces how he made the crucial findings about how limbs develop, considering these findings both in the context of contemporary knowledge at the time and also in terms of their immediate impact on the field. Copyright © 2017 Elsevier Inc. All rights reserved.
On-line Bayesian model updating for structural health monitoring
NASA Astrophysics Data System (ADS)
Rocchetta, Roberto; Broggi, Matteo; Huchet, Quentin; Patelli, Edoardo
2018-03-01
Fatigue-induced cracking is a dangerous failure mechanism which affects mechanical components subject to alternating load cycles. System health monitoring should be adopted to identify cracks that can jeopardise the structure. Real-time damage detection may fail in the identification of cracks due to different sources of uncertainty which have been poorly assessed or even fully neglected. In this paper, a novel efficient and robust procedure is used for the detection of crack locations and lengths in mechanical components. A Bayesian model updating framework is employed, which allows accounting for relevant sources of uncertainty. The idea underpinning the approach is to identify the most probable crack consistent with the experimental measurements. To tackle the computational cost of the Bayesian approach, an emulator is adopted to replace the computationally costly finite element model. To improve the overall robustness of the procedure, different numerical likelihoods, measurement noises, and imprecision in the values of model parameters are analysed and their effects quantified. The accuracy of the stochastic updating and the efficiency of the numerical procedure are discussed. An experimental aluminium frame and a numerical model of a typical car suspension arm are used to demonstrate the applicability of the approach.
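The central idea, identifying the most probable crack consistent with a measurement while replacing the expensive FE model with an emulator, can be sketched as a one-parameter grid Bayesian update. The linear frequency-vs-crack-length surrogate and all numbers below are hypothetical stand-ins, not the paper's emulator or data.

```python
import math

# Hypothetical surrogate ("emulator") replacing the FE model: natural
# frequency drops as crack length a grows.  Numbers are illustrative only.
def emulator_freq(a):            # a in mm, frequency in Hz
    return 100.0 - 2.0 * a

measured, sigma = 94.2, 0.5      # noisy measurement and its noise level

# Grid Bayesian update: uniform prior over candidate crack lengths,
# Gaussian measurement likelihood.
grid = [i * 0.05 for i in range(0, 101)]          # 0 .. 5 mm
like = [math.exp(-0.5 * ((measured - emulator_freq(a)) / sigma) ** 2)
        for a in grid]
z = sum(like)
post = [l / z for l in like]

# Most probable crack length consistent with the measurement:
a_map = grid[max(range(len(grid)), key=post.__getitem__)]
print(round(a_map, 2))
```

Because the emulator is cheap, the full posterior (not just the maximum) is available at negligible cost, which is what lets the paper quantify the effect of different likelihoods and noise levels.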
NASA Astrophysics Data System (ADS)
Medlyn, B.; Jiang, M.; Zaehle, S.
2017-12-01
There is now ample experimental evidence that the response of terrestrial vegetation to rising atmospheric CO2 concentration is modified by soil nutrient availability. How to represent nutrient cycling processes is thus a key consideration for vegetation models. We have previously used model intercomparison to demonstrate that models incorporating different assumptions predict very different responses at Free-Air CO2 Enrichment experiments. Careful examination of model outputs has provided some insight into the reasons for the different model outcomes, but it is difficult to attribute outcomes to specific assumptions. Here we investigate the impact of individual assumptions in a generic plant carbon-nutrient cycling model. The G'DAY (Generic Decomposition And Yield) model is modified to incorporate alternative hypotheses for nutrient cycling. We analyse the impact of these assumptions in the model using a simple analytical approach known as "two-timing". This analysis identifies the quasi-equilibrium behaviour of the model at the time scales of the component pools. The analysis provides a useful mathematical framework for probing model behaviour and identifying the most critical assumptions for experimental study.
Nonparametric estimates of drift and diffusion profiles via Fokker-Planck algebra.
Lund, Steven P; Hubbard, Joseph B; Halter, Michael
2014-11-06
Diffusion processes superimposed upon deterministic motion play a key role in understanding and controlling the transport of matter, energy, momentum, and even information in physics, chemistry, material science, biology, and communications technology. Given functions defining these random and deterministic components, the Fokker-Planck (FP) equation is often used to model these diffusive systems. Many methods exist for estimating the drift and diffusion profiles from one or more identifiable diffusive trajectories; however, when many identical entities diffuse simultaneously, it may not be possible to identify individual trajectories. Here we present a method capable of simultaneously providing nonparametric estimates for both drift and diffusion profiles from evolving density profiles, requiring only the validity of Langevin/FP dynamics. This algebraic FP manipulation provides a flexible and robust framework for estimating stationary drift and diffusion coefficient profiles, is not based on fluctuation theory or solved diffusion equations, and may facilitate predictions for many experimental systems. We illustrate this approach on experimental data obtained from a model lipid bilayer system exhibiting free diffusion and electric field induced drift. The wide range over which this approach provides accurate estimates for drift and diffusion profiles is demonstrated through simulation.
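A constant-coefficient special case of the idea above can be checked with moments: for constant drift v and diffusion D, an initial Gaussian density stays Gaussian with mean(t) = v·t and var(t) = var0 + 2·D·t, so v and D are recoverable from the evolving density alone. This moment-based sketch is far simpler than the authors' algebraic FP method, which handles fully nonparametric spatial profiles; the parameter values are illustrative.

```python
import math

# Synthetic evolving density for known drift and diffusion.
v_true, D_true, var0 = 0.8, 0.3, 0.1
xs = [i * 0.05 - 10.0 for i in range(401)]
dx = 0.05

def density(t):
    var = var0 + 2.0 * D_true * t
    return [math.exp(-(x - v_true * t) ** 2 / (2 * var)) /
            math.sqrt(2 * math.pi * var) for x in xs]

def moments(p):
    m0 = sum(p) * dx
    mean = sum(x * pi for x, pi in zip(xs, p)) * dx / m0
    var = sum((x - mean) ** 2 * pi for x, pi in zip(xs, p)) * dx / m0
    return mean, var

# Estimate drift from the slope of the mean and diffusion from half the
# slope of the variance, using densities at two times.
(m1, s1), (m2, s2) = moments(density(1.0)), moments(density(2.0))
v_est = (m2 - m1) / 1.0
D_est = (s2 - s1) / (2.0 * 1.0)
print(round(v_est, 3), round(D_est, 3))
```

No individual trajectories are needed, which is the same observation that motivates the density-profile approach when many identical particles diffuse simultaneously.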
Implications of a framework for student reasoning in an interview
NASA Astrophysics Data System (ADS)
Gray, Kara E.; Hrepic, Zdeslav; Itza-Ortiz, Salomon F.; Allbaugh, Alicia R.; Engelhardt, Paula V.; Rebello, N. Sanjay; Zollman, Dean A.
2004-09-01
We discuss the implications of a framework to characterize student reasoning in an interview and its underpinnings in cognitive psychology. Our framework, described in a previous paper in these Proceedings, enables a researcher to identify various cognitive elements used by a student during an interview. Our thesis is that this framework can help identify reasoning paths used by the students. We discuss how this framework can be applied to both a coarse and fine grained analysis of reasoning and how it can be used to infer a student's implicit reasoning processes.
ERIC Educational Resources Information Center
Ifenthaler, Dirk; Gosper, Maree
2014-01-01
This paper introduces the MAPLET framework that was developed to map and link teaching aims, learning processes, learner expertise and technologies. An experimental study with 65 participants is reported to test the effectiveness of the framework as a guide to the design of lessons embedded within larger units of study. The findings indicate the…
Li, Simon Y W; Magrabi, Farah; Coiera, Enrico
2012-01-01
To understand the complex effects of interruption in healthcare. As interruptions have been well studied in other domains, the authors undertook a systematic review of experimental studies in psychology and human-computer interaction to identify the task types and variables influencing interruption effects. 63 studies were identified from 812 articles retrieved by systematic searches. On the basis of interruption profiles for generic tasks, it was found that clinical tasks can be distinguished into three broad types: procedural, problem-solving, and decision-making. Twelve experimental variables that influence interruption effects were identified. Of these, six are the most important, based on the number of studies and because of their centrality to interruption effects, including working memory load, interruption position, similarity, modality, handling strategies, and practice effect. The variables are explained by three main theoretical frameworks: the activation-based goal memory model, prospective memory, and multiple resource theory. This review provides a useful starting point for a more comprehensive examination of interruptions potentially leading to an improved understanding about the impact of this phenomenon on patient safety and task efficiency. The authors provide some recommendations to counter interruption effects. The effects of interruption are the outcome of a complex set of variables and should not be considered as uniformly predictable or bad. The task types, variables, and theories should help us better to identify which clinical tasks and contexts are most susceptible and assist in the design of information systems and processes that are resilient to interruption.
A Framework for Identifying and Classifying Undergraduate Student Proof Errors
ERIC Educational Resources Information Center
Strickland, S.; Rand, B.
2016-01-01
This paper describes a framework for identifying, classifying, and coding student proofs, modified from existing proof-grading rubrics. The framework includes 20 common errors, as well as categories for interpreting the severity of the error. The coding scheme is intended for use in a classroom context, for providing effective student feedback. In…
An Integrated Finite Element-based Simulation Framework: From Hole Piercing to Hole Expansion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Xiaohua; Sun, Xin; Golovashchenko, Segey F.
An integrated finite element-based modeling framework is developed to predict the hole expansion ratio (HER) of AA6111-T4 sheet by considering the piercing-induced damages around the hole edge. Using damage models and parameters calibrated from previously reported tensile stretchability studies, the predicted HER correlates well with experimentally measured HER values for different hole piercing clearances. The hole piercing model shows burrs are not generated on the sheared surface for clearances less than 20%, which corresponds well with the experimental data on pierced holes cross-sections. Finite-element-calculated HER also is not especially sensitive to piercing clearances less than this value. However, as clearances increase to 30% and further to 40%, the HER values are predicted to be considerably smaller, also consistent with experimental measurements. Upon validation, the integrated modeling framework is used to examine the effects of different hole piercing and hole expansion conditions on the critical HERs for AA6111-T4.
Crystal plasticity modeling of irradiation growth in Zircaloy-2
NASA Astrophysics Data System (ADS)
Patra, Anirban; Tomé, Carlos N.; Golubov, Stanislav I.
2017-08-01
A physically based reaction-diffusion model is implemented in the visco-plastic self-consistent (VPSC) crystal plasticity framework to simulate irradiation growth in hcp Zr and its alloys. The reaction-diffusion model accounts for the defects produced by the cascade of displaced atoms, their diffusion to lattice sinks, and the contribution to crystallographic strain at the level of single crystals. The VPSC framework accounts for intergranular interactions and irradiation creep, and calculates the strain in the polycrystalline ensemble. A novel scheme is proposed to model the simultaneous evolution of both the number density and the radius of irradiation-induced dislocation loops directly from experimental data of dislocation density evolution during irradiation. This framework is used to predict the irradiation growth behaviour of cold-worked Zircaloy-2, with trends compared to available experimental data. The role of internal stresses in inducing irradiation creep is discussed. Effects of grain size, texture, and external stress on the coupled irradiation growth and creep behaviour are also studied and compared with available experimental data.
Liu, Bin; Long, Ren; Chou, Kuo-Chen
2016-08-15
Regulatory DNA elements are associated with DNase I hypersensitive sites (DHSs). Accordingly, identification of DHSs will provide useful insights for in-depth investigation into the function of noncoding genomic regions. In this study, using an ensemble learning strategy, we proposed a new predictor called iDHS-EL for identifying the location of DHSs in the human genome. It was formed by fusing three individual Random Forest (RF) classifiers into an ensemble predictor. The three RF operators were respectively based on the three special modes of the general pseudo nucleotide composition (PseKNC): (i) kmer, (ii) reverse complement kmer and (iii) pseudo dinucleotide composition. It has been demonstrated that the new predictor remarkably outperforms the relevant state-of-the-art methods in both accuracy and stability. For the convenience of most experimental scientists, a web server for iDHS-EL is established at http://bioinformatics.hitsz.edu.cn/iDHS-EL, which is the first web-server predictor ever established for identifying DHSs, and by which users can easily get their desired results without the need to go through the mathematical details. We anticipate that iDHS-EL will become a very useful high throughput tool for genome analysis. bliu@gordonlifescience.org or bliu@insun.hit.edu.cn Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
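Two of the three PseKNC modes named above, kmer and reverse-complement kmer composition, are straightforward to sketch. The implementation below is a minimal illustration of the feature extraction only (not the RF ensemble or the pseudo dinucleotide mode), and the example sequence is invented.

```python
from itertools import product

COMP = str.maketrans("ACGT", "TGCA")

def kmer_counts(seq, k=2):
    """Normalized k-mer composition (the plain kmer PseKNC mode)."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = {km: 0 for km in kmers}
    for i in range(len(seq) - k + 1):
        counts[seq[i:i + k]] += 1
    total = max(len(seq) - k + 1, 1)
    return {km: c / total for km, c in counts.items()}

def revcomp_kmer_counts(seq, k=2):
    """Reverse-complement k-mer mode: a k-mer and its reverse complement
    are pooled into one feature, shrinking the feature space."""
    counts = kmer_counts(seq, k)
    merged = {}
    for km, c in counts.items():
        rc = km.translate(COMP)[::-1]
        key = min(km, rc)
        merged[key] = merged.get(key, 0) + c
    return merged

f = kmer_counts("ACGTACGT")
g = revcomp_kmer_counts("ACGTACGT")
print(len(f), len(g))   # 16 dimer features collapse to 10 strand-symmetric ones
```

Pooling reverse complements makes the features strand-symmetric, which is why the three modes give the three RF classifiers genuinely different views of the same sequence.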
Complex optimization for big computational and experimental neutron datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bao, Feng; Oak Ridge National Lab.; Archibald, Richard
2016-11-07
Here, we present a framework that uses high performance computing to determine accurate solutions to the inverse optimization problem of fitting big experimental data against computational models. We demonstrate how image processing, mathematical regularization, and hierarchical modeling can be used to solve complex optimization problems on big data. We also demonstrate how both model and data information can be used to further increase the solution accuracy of optimization by providing confidence regions for the processing and regularization algorithms. Finally, we use the framework in conjunction with the software package SIMPHONIES to analyze results from neutron scattering experiments on silicon single crystals, and refine first-principles calculations to better describe the experimental data.
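The regularized inverse-optimization idea in this record can be illustrated by the simplest case, Tikhonov-regularized least squares minimised by plain gradient descent. The matrix, data, and solver below are generic stand-ins, not the SIMPHONIES pipeline:

```python
def tikhonov(A, b, lam=0.1, lr=0.01, steps=5000):
    """Minimise ||Ax - b||^2 + lam*||x||^2 by plain gradient descent."""
    n, m = len(A), len(A[0])
    x = [0.0] * m
    for _ in range(steps):
        # residual r = Ax - b
        r = [sum(A[i][j] * x[j] for j in range(m)) - b[i] for i in range(n)]
        # gradient 2*A^T r + 2*lam*x
        g = [2 * sum(A[i][j] * r[i] for i in range(n)) + 2 * lam * x[j]
             for j in range(m)]
        x = [x[j] - lr * g[j] for j in range(m)]
    return x

# With A = I, the regularised solution shrinks each entry of b by 1/(1 + lam).
print(tikhonov([[1.0, 0.0], [0.0, 1.0]], [1.0, 2.0], lam=0.1))
```

The regularization weight lam trades data fidelity against solution norm; choosing it is exactly where confidence regions on the processing algorithms become useful.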
Narrative review of frameworks for translating research evidence into policy and practice.
Milat, Andrew J; Li, Ben
2017-02-15
A significant challenge in research translation is that interested parties interpret and apply the associated terms and conceptual frameworks in different ways. The purpose of this review was to: a) examine different research translation frameworks; b) examine the similarities and differences between the frameworks; and c) identify key strengths and weaknesses of the models when they are applied in practice. The review involved a keyword search of PubMed. The search string was (translational research OR knowledge translation OR evidence to practice) AND (framework OR model OR theory) AND (public health OR health promotion OR medicine). Included studies were published in English between January 1990 and December 2014, and described frameworks, models or theories associated with research translation. The final review included 98 papers, and 41 different frameworks and models were identified. The most frequently applied knowledge translation framework in the literature was RE-AIM, followed by the knowledge translation continuum or 'T' models, the Knowledge to Action framework, the PARiHS framework, evidence based public health models, and the stages of research and evaluation model. The models identified in this review stem from different fields, including implementation science, basic and medical sciences, health services research and public health, and propose different but related pathways to closing the research-practice gap.
Experimental Learning Enhancing Improvisation Skills
ERIC Educational Resources Information Center
Pereira Christopoulos, Tania; Wilner, Adriana; Trindade Bestetti, Maria Luisa
2016-01-01
Purpose: This study aims to present improvisation training and experimentation as an alternative method to deal with unexpected events in which structured processes do not seem to work. Design/Methodology/Approach: Based on the literature of sensemaking and improvisation, the study designs a framework and process model of experimental learning…
DOT National Transportation Integrated Search
2008-03-01
This document, the AMS Experimental Plan, lays out the scope of analysis that will be conducted through the application of the AMS methodology to the Test Corridor. The specific objectives of the Experimental Plan are: create an AMS framework that id...
A unified framework for evaluating the risk of re-identification of text de-identification tools.
Scaiano, Martin; Middleton, Grant; Arbuckle, Luk; Kolhatkar, Varada; Peyton, Liam; Dowling, Moira; Gipson, Debbie S; El Emam, Khaled
2016-10-01
It has become regular practice to de-identify unstructured medical text for use in research using automatic methods, the goal of which is to remove patient identifying information to minimize re-identification risk. The metrics commonly used to determine if these systems are performing well do not accurately reflect the risk of a patient being re-identified. We therefore developed a framework for measuring the risk of re-identification associated with textual data releases. We apply the proposed evaluation framework to a data set from the University of Michigan Medical School. Our risk assessment results are then compared with those that would be obtained using a typical contemporary micro-average evaluation of recall in order to illustrate the difference between the proposed evaluation framework and the current baseline method. We demonstrate how this framework compares against common measures of the re-identification risk associated with an automated text de-identification process. For the probability of re-identification using our evaluation framework, we obtained a mean value for direct identifiers of 0.0074 and a mean value for quasi-identifiers of 0.0022. The 95% confidence intervals for these estimates were below the relevant thresholds. The threshold for direct identifier risk was based on previously used approaches in the literature. The threshold for quasi-identifiers was determined based on the context of the data release, following commonly used de-identification criteria for structured data. Our framework attempts to correct for poorly distributed evaluation corpora, accounts for the data release context, and avoids the often optimistic assumptions that are made using the more traditional evaluation approach. It therefore provides a more realistic estimate of the true probability of re-identification. This framework should be used as a basis for computing re-identification risk in order to more realistically evaluate future text de-identification tools.
Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
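The paper's central point, that micro-averaged recall can look reassuring while per-record risk is not, can be sketched numerically. The toy corpus and the simplified risk rule below (a record counts as exposed if any identifier leaks) are illustrative assumptions, not the authors' full framework:

```python
def micro_recall(records):
    """records: list of (identifiers_total, identifiers_missed) pairs."""
    found = sum(t - m for t, m in records)
    total = sum(t for t, m in records)
    return found / total

def mean_record_risk(records, match_prob=1.0):
    """Mean probability of re-identification, counting a record as
    exposed whenever at least one identifier leaked through."""
    return sum(match_prob for t, m in records if m > 0) / len(records)

corpus = [(5, 0)] * 9 + [(5, 1)]   # one record leaks one identifier
print(micro_recall(corpus), mean_record_risk(corpus))
```

Here recall is 0.98, yet one in ten records is exposed, which is the kind of divergence the proposed framework is designed to surface.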
Disaster Metrics: A Comprehensive Framework for Disaster Evaluation Typologies.
Wong, Diana F; Spencer, Caroline; Boyd, Lee; Burkle, Frederick M; Archer, Frank
2017-10-01
Introduction: The frequency of disasters is increasing around the world, with more people being at risk. There is a moral imperative to improve the way in which disaster evaluations are undertaken and reported, with the aim of reducing preventable mortality and morbidity in future events. Disasters are complex events, and undertaking disaster evaluations is a specialized area of study at an international level. Hypothesis/Problem: While some frameworks have been developed to support consistent disaster research and evaluation, they lack validation, consistent terminology, and standards for reporting across the different phases of a disaster. There is yet to be an agreed, comprehensive framework to structure disaster evaluation typologies. The aim of this paper is to outline an evolving comprehensive framework for disaster evaluation typologies. It is anticipated that this new framework will facilitate an agreement on identifying, structuring, and relating the various evaluations found in the disaster setting, with a view to better understanding the process, outcomes, and impacts of the effectiveness and efficiency of interventions. Research was undertaken in two phases: (1) a scoping literature review (peer-reviewed and "grey literature") was undertaken to identify current evaluation frameworks and typologies used in the disaster setting; and (2) a structure was developed that included the range of typologies identified in Phase One and suggests possible relationships in the disaster setting. No core, unifying framework to structure disaster evaluation and research was identified in the literature. The authors propose a "Comprehensive Framework for Disaster Evaluation Typologies" that identifies, structures, and suggests relationships for the various typologies detected.
The proposed Comprehensive Framework for Disaster Evaluation Typologies outlines the different typologies of disaster evaluations that were identified in this study and brings them together into a single framework. This unique, unifying framework has relevance at an international level and is expected to benefit the disaster, humanitarian, and development sectors. The next step is to undertake a validation process that will include international leaders with experience in evaluation, in general, and disasters specifically. This work promotes an environment for constructive dialogue on evaluations in the disaster setting to strengthen the evidence base for interventions across the disaster spectrum. It remains a work in progress. Wong DF, Spencer C, Boyd L, Burkle FM Jr., Archer F. Disaster metrics: a comprehensive framework for disaster evaluation typologies. Prehosp Disaster Med. 2017;32(5):501-514.
NASA Astrophysics Data System (ADS)
Militello, F.; Farley, T.; Mukhi, K.; Walkden, N.; Omotani, J. T.
2018-05-01
A statistical framework was introduced in Militello and Omotani [Nucl. Fusion 56, 104004 (2016)] to correlate the dynamics and statistics of L-mode and inter-ELM plasma filaments with the radial profiles of thermodynamic quantities they generate in the Scrape Off Layer. This paper extends the framework to cases in which the filaments are emitted from the separatrix at different toroidal positions and with a finite toroidal velocity. It is found that the toroidal velocity does not affect the profiles, while the toroidal distribution of filament emission renormalises the waiting time between two events. Experimental data collected by visual camera imaging are used to evaluate the statistics of the fluctuations, to inform the choice of the probability distribution functions used in the application of the framework. It is found that the toroidal separation of the filaments is exponentially distributed, thus suggesting the lack of a toroidal modal structure. Finally, using these measurements, the framework is applied to an experimental case and good agreement is found.
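The reported exponential distribution of toroidal filament separations can be checked with a one-line maximum-likelihood fit, since the MLE of an exponential rate is the reciprocal sample mean. The synthetic samples below merely stand in for camera-derived separations:

```python
import random

def exp_rate_mle(samples):
    """MLE of an exponential rate parameter is the reciprocal mean."""
    return len(samples) / sum(samples)

rng = random.Random(1)
true_rate = 2.0
seps = [rng.expovariate(true_rate) for _ in range(100_000)]
est = exp_rate_mle(seps)
print(est)
```

In practice one would also compare the empirical distribution against the fitted exponential (e.g., via a quantile plot) before concluding that no toroidal modal structure is present.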
Generic experimental design for product strategy evaluation : crumb rubber modified materials.
DOT National Transportation Integrated Search
2005-02-01
This report presents the framework for a generic process to evaluate new products and/or strategies for possible use within Caltrans. The framework is the result of a collaborative effort among Caltrans, the University of California Partnered Pavemen...
Yildizoglu, Tugce; Weislogel, Jan-Marek; Mohammad, Farhan; Chan, Edwin S-Y; Assam, Pryseley N; Claridge-Chang, Adam
2015-12-01
Genetic studies in Drosophila reveal that olfactory memory relies on a brain structure called the mushroom body. The mainstream view is that each of the three lobes of the mushroom body plays a specialized role in short-term aversive olfactory memory, but a number of studies have reached divergent conclusions based on their varying experimental findings. Like many fields, neurogenetics uses null hypothesis significance testing for data analysis. Critics of significance testing claim that this method promotes discrepancies by using arbitrary thresholds (α) to apply reject/accept dichotomies to continuous data, which is not reflective of the biological reality of quantitative phenotypes. We explored using estimation statistics, an alternative data analysis framework, to examine published fly short-term memory data. Systematic review was used to identify behavioral experiments examining the physiological basis of olfactory memory, and meta-analytic approaches were applied to assess the role of lobular specialization. Multivariate meta-regression models revealed that short-term memory lobular specialization is not supported by the data; they identified the cellular extent of a transgenic driver as the major predictor of its effect on short-term memory. These findings demonstrate that effect sizes, meta-analysis, meta-regression, hierarchical models and estimation methods in general can be successfully harnessed to identify knowledge gaps, synthesize divergent results, accommodate heterogeneous experimental design and quantify genetic mechanisms.
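Estimation statistics, as used above, centres on effect sizes with confidence intervals rather than p-value dichotomies. A minimal sketch of a standardized mean difference (Cohen's d with a large-sample 95% CI) follows; the data are made up and the formulae are the standard textbook ones, not code from the study:

```python
import math

def cohens_d(a, b):
    """Standardized mean difference with pooled SD, plus an approximate
    95% confidence interval from the large-sample normal formula."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    d = (ma - mb) / sp
    se = math.sqrt((na + nb) / (na * nb) + d * d / (2 * (na + nb)))
    return d, (d - 1.96 * se, d + 1.96 * se)

print(cohens_d([1, 2, 3, 4, 5], [2, 3, 4, 5, 6]))
```

Reporting the interval, rather than only whether it excludes zero, is the shift in emphasis the abstract advocates.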
Extended Theories of Gravitation. Observation Protocols and Experimental Tests
NASA Astrophysics Data System (ADS)
Fatibene, Lorenzo; Ferraris, Marco; Francaviglia, Mauro; Magnano, Guido
2013-09-01
Within the framework of extended theories of gravitation, we shall discuss physical equivalences among different formalisms and classical tests. As suggested by the Ehlers-Pirani-Schild framework, conformal invariance will be preserved and its effect on observational protocols discussed. Accordingly, we shall review standard tests, showing how Palatini f(R)-theories naturally pass solar system tests. Observation protocols will be discussed in this wider framework.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-20
... recommended changing Indiana's experimental late Canada goose season status to operational. Service Response: We concur with the Mississippi Flyway Council's recommendation to make Indiana's experimental late Canada goose season in the Terre Haute region operational. In 2007, Indiana initiated an experimental...
The translation research in a dental setting (TRiaDS) programme protocol
2010-01-01
Background: It is well documented that the translation of knowledge into clinical practice is a slow and haphazard process. This is no less true for dental healthcare than other types of healthcare. One common policy strategy to help promote knowledge translation is the production of clinical guidance, but it has been demonstrated that the simple publication of guidance is unlikely to optimise practice. Additional knowledge translation interventions have been shown to be effective, but effectiveness varies and much of this variation is unexplained. The need for researchers to move beyond single studies to develop a generalisable, theory-based knowledge translation framework has been identified. For dentistry in Scotland, the production of clinical guidance is the responsibility of the Scottish Dental Clinical Effectiveness Programme (SDCEP). TRiaDS (Translation Research in a Dental Setting) is a multidisciplinary research collaboration, embedded within the SDCEP guidance development process, which aims to establish a practical evaluative framework for the translation of guidance and to conduct and evaluate a programme of integrated, multi-disciplinary research to enhance the science of knowledge translation. Methods: Set in General Dental Practice, the TRiaDS programmatic evaluation employs a standardised process using optimal methods and theory. For each SDCEP guidance document a diagnostic analysis is undertaken alongside the guidance development process. Information is gathered about current dental care activities. Key recommendations and their required behaviours are identified and prioritised. Stakeholder questionnaires and interviews are used to identify and elicit salient beliefs regarding potential barriers and enablers towards the key recommendations and behaviours. Where possible, routinely collected data are used to measure compliance with the guidance and to inform decisions about whether a knowledge translation intervention is required.
Interventions are theory-based and informed by evidence gathered during the diagnostic phase and by prior published evidence. They are evaluated using a range of experimental and quasi-experimental study designs, and data collection continues beyond the end of the intervention to investigate the sustainability of an intervention effect. Discussion: The TRiaDS programmatic approach is a significant step forward towards the development of a practical, generalisable framework for knowledge translation research. The multidisciplinary composition of the TRiaDS team enables consideration of the individual, organisational and system determinants of professional behaviour change. In addition, the embedding of TRiaDS within a national programme of guidance development offers a unique opportunity to inform and influence the guidance development process, and enables TRiaDS to inform dental services practitioners, policy makers and patients on how best to translate national recommendations into routine clinical activities. PMID:20646275
Callon, Wynne; Beach, Mary Catherine; Links, Anne R; Wasserman, Carly; Boss, Emily F
2018-03-11
We aimed to develop a comprehensive, descriptive framework to measure shared decision making (SDM) in clinical encounters. We combined a top-down (theoretical) approach with a bottom-up approach based on audio-recorded dialogue to identify all communication processes related to decision making. We coded 55 pediatric otolaryngology visits using the framework and report interrater reliability. We identified 14 clinician behaviors and 5 patient behaviors that have not been previously described, and developed a new SDM framework that is descriptive (what does happen) rather than normative (what should happen). Through the bottom-up approach we identified three broad domains not present in other SDM frameworks: socioemotional support, understandability of clinician dialogue, and recommendation-giving. We also specify the ways in which decision-making roles are assumed implicitly rather than discussed explicitly. Interrater reliability was >75% for 92% of the coded behaviors. This SDM framework allows for a more expansive understanding and analysis of how decision making takes place in clinical encounters, including new domains and behaviors not present in existing measures. We hope that this new framework will bring attention to a broader conception of SDM and allow researchers to further explore the new domains and behaviors identified. Copyright © 2018. Published by Elsevier B.V.
Conceptual frameworks of individual work performance: a systematic review.
Koopmans, Linda; Bernaards, Claire M; Hildebrandt, Vincent H; Schaufeli, Wilmar B; de Vet Henrica, C W; van der Beek, Allard J
2011-08-01
Individual work performance is differently conceptualized and operationalized in different disciplines. The aim of the current review was twofold: (1) identifying conceptual frameworks of individual work performance and (2) integrating these to reach a heuristic conceptual framework. A systematic review was conducted in medical, psychological, and management databases. Studies were selected independently by two researchers and included when they presented a conceptual framework of individual work performance. A total of 17 generic frameworks (applying across occupations) and 18 job-specific frameworks (applying to specific occupations) were identified. Dimensions frequently used to describe individual work performance were task performance, contextual performance, counterproductive work behavior, and adaptive performance. On the basis of the literature, a heuristic conceptual framework of individual work performance was proposed. This framework can serve as a theoretical basis for future research and practice.
NASA Astrophysics Data System (ADS)
Jackson, I.; Kennett, B. L.; Faul, U. H.
2009-12-01
In parallel with cooperative developments in seismology during the past 25 years, there have been phenomenal advances in mineral/rock physics, making laboratory-based interpretation of seismological models increasingly useful. However, the assimilation of diverse experimental data into a physically sound framework for seismological application is not without its challenges, as demonstrated by two examples. In the first example, that of equation-of-state and elasticity data, an appropriate, thermodynamically consistent framework involves a finite-strain expansion of the Helmholtz free energy incorporating the Debye approximation to the lattice vibrational energy, as advocated by Stixrude and Lithgow-Bertelloni. Within this context, pressure, specific heat and entropy, thermal expansion, elastic constants and their adiabatic and isothermal pressure derivatives are all calculable without further approximation in an internally consistent manner. The opportunities and challenges of assimilating a wide range of sometimes marginally incompatible experimental data into a single model of this type will be demonstrated with reference to MgO, unquestionably the most thoroughly studied mantle mineral. A neighbourhood-algorithm inversion has identified a broadly satisfactory model, but uncertainties in key parameters, associated particularly with pressure calibration, remain sufficiently large as to preclude definitive conclusions concerning lower-mantle chemical composition and departures from adiabaticity. The second example is the much less complete dataset concerning seismic-wave dispersion and attenuation emerging from low-frequency forced-oscillation experiments. Significant progress has been made during the past decade towards an understanding of high-temperature, micro-strain viscoelastic relaxation in upper-mantle materials, especially as regards the roles of oscillation period, temperature, grain size and melt fraction.
However, the influence of other potentially important variables such as dislocation density and the concentration of structurally bound water remain as targets for ongoing research. The state-of-the-art will be illustrated by highlighting the challenge in reconciling the substantial and growing amount of experimental data concerning grain-size sensitive relaxation in fine-grained olivine with micro-mechanical models of grain-boundary sliding.
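As a concrete illustration of the finite-strain equation-of-state framework discussed above, the widely used third-order Birch-Murnaghan form (a standard textbook expression, not one quoted from this abstract) gives the cold-curve pressure as a function of compression:

```latex
P(V) = \frac{3K_0}{2}
       \left[\left(\frac{V_0}{V}\right)^{7/3} - \left(\frac{V_0}{V}\right)^{5/3}\right]
       \left\{1 + \frac{3}{4}\left(K_0' - 4\right)
       \left[\left(\frac{V_0}{V}\right)^{2/3} - 1\right]\right\}
```

Here K_0 is the zero-pressure bulk modulus, K_0' its pressure derivative, and V_0 the zero-pressure volume; in models of the Stixrude and Lithgow-Bertelloni type a Debye thermal contribution is added on top of this compression curve.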
A multiscale active structural model of the arterial wall accounting for smooth muscle dynamics.
Coccarelli, Alberto; Edwards, David Hughes; Aggarwal, Ankush; Nithiarasu, Perumal; Parthimos, Dimitris
2018-02-01
Arterial wall dynamics arise from the synergy of passive mechano-elastic properties of the vascular tissue and the active contractile behaviour of smooth muscle cells (SMCs) that form the media layer of vessels. We have developed a computational framework that incorporates both these components to account for vascular responses to mechanical and pharmacological stimuli. To validate the proposed framework and demonstrate its potential for testing hypotheses on the pathogenesis of vascular disease, we have employed a number of pharmacological probes that modulate the arterial wall contractile machinery by selectively inhibiting a range of intracellular signalling pathways. Experimental probes used on ring segments from the rabbit central ear artery are: phenylephrine, a selective α1-adrenergic receptor agonist that induces vasoconstriction; cyclopiazonic acid (CPA), a specific inhibitor of sarcoplasmic/endoplasmic reticulum Ca2+-ATPase; and ryanodine, a diterpenoid that modulates Ca2+ release from the sarcoplasmic reticulum. These interventions were able to delineate the role of membrane versus intracellular signalling, previously identified as main factors in smooth muscle contraction and the generation of vessel tone. Each SMC was modelled by a system of nonlinear differential equations that account for intracellular ionic signalling, and in particular Ca2+ dynamics. Cytosolic Ca2+ concentrations formed the catalytic input to a cross-bridge kinetics model. Contractile output from these cellular components forms the input to the finite-element model of the arterial rings under isometric conditions that reproduces the experimental conditions. The model does not account for the role of the endothelium, as nitric oxide production was suppressed by the action of L-NAME, and also due to the absence of shear stress on the arterial ring, as the experimental set-up did not involve flow.
Simulations generated by the integrated model closely matched experimental observations qualitatively, as well as quantitatively within a range of physiological parametric values. The model also illustrated how increased intercellular coupling led to smooth muscle coordination and the genesis of vascular tone. © 2018 The Authors.
A Learning Framework for Development.
ERIC Educational Resources Information Center
Chi, Michelene T. H.; Rees, Ernest T.
1983-01-01
Responding to recent advances in developmental and cognitive science research on knowledge acquisition, this report presents a theoretical framework for analyzing cognitive development as a process of learning. The first section summarizes three developmental characteristics recognized in both the Piagetian and the quantitative experimental tradition:…
Leontidis, Georgios
2017-11-01
The human retina is a diverse and important tissue, widely studied in the context of various retinal and other diseases. Diabetic retinopathy (DR), a leading cause of blindness, is one of them. This work proposes a novel and complete framework for the accurate and robust extraction and analysis of a series of retinal vascular geometric features. It focuses on studying the registered bifurcations in successive years of progression from diabetes (no DR) to DR, in order to identify the vascular alterations. Retinal fundus images are utilised, and multiple experimental designs are employed. The framework includes various steps, such as image registration and segmentation, extraction of features, statistical analysis and classification models. Linear mixed models are utilised for making the statistical inferences, alongside elastic-net logistic regression, the Boruta algorithm, and regularised random forests for the feature selection and classification phases, in order to evaluate the discriminative potential of the investigated features and also build classification models. A number of geometric features, such as the central retinal artery and vein equivalents, are found to differ significantly across the experiments and also to have good discriminative potential. The classification systems yield promising results, with area under the curve values ranging from 0.821 to 0.968 across the four different investigated combinations. Copyright © 2017 Elsevier Ltd. All rights reserved.
Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng
2015-01-09
The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Carefully designed small-scale experiments are not only very expensive but also inefficient for identifying numerous interactomes, despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data as an interactome weight matrix, from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins, to carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach significantly outperforms several sophisticated topology-based approaches.
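The neighbourhood-similarity step described above can be sketched with a plain cosine coefficient on sparse feature vectors; the paper's rescaled cosine coefficient and interactome weight matrix are not reproduced here, so treat the functions below as a generic collaborative-filtering sketch:

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse feature vectors (dicts)."""
    dot = sum(u[k] * v[k] for k in u.keys() & v.keys())
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def predict(target, neighbours, item):
    """Similarity-weighted score for an unobserved interaction of
    `target`, averaged over its neighbourhood."""
    num = sum(cosine(target, n) * n.get(item, 0.0) for n in neighbours)
    den = sum(abs(cosine(target, n)) for n in neighbours)
    return num / den if den else 0.0
```

In the interactome setting, `target` and each neighbour would be a protein's row of the weight matrix, and `item` a candidate interaction partner.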
MROrchestrator: A Fine-Grained Resource Orchestration Framework for MapReduce Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Bikash; Prabhakar, Ramya; Kandemir, Mahmut
2012-01-01
Efficient resource management in data centers and clouds running large distributed data processing frameworks like MapReduce is crucial for enhancing the performance of hosted applications and boosting resource utilization. However, existing resource scheduling schemes in Hadoop MapReduce allocate resources at the granularity of fixed-size, static portions of nodes, called slots. In this work, we show that MapReduce jobs have widely varying demands for multiple resources, making static, fixed-size slot-level resource allocation a poor choice from both the performance and resource utilization standpoints. Furthermore, lack of coordination in the management of multiple resources across nodes prevents dynamic slot reconfiguration and leads to resource contention. Motivated by this, we propose MROrchestrator, a MapReduce resource Orchestrator framework, which can dynamically identify resource bottlenecks and resolve them through fine-grained, coordinated, and on-demand resource allocations. We have implemented MROrchestrator on two 24-node native and virtualized Hadoop clusters. Experimental results with a suite of representative MapReduce benchmarks demonstrate up to 38% reduction in job completion times, and up to 25% increase in resource utilization. We further show how popular resource managers like NGM and Mesos, when augmented with MROrchestrator, can boost their performance.
Caccavale, Justin; Fiumara, David; Stapf, Michael; Sweitzer, Liedeke; Anderson, Hannah J; Gorky, Jonathan; Dhurjati, Prasad; Galileo, Deni S
2017-12-11
Glioblastoma multiforme (GBM) is a devastating brain cancer for which there is no known cure. Its malignancy is due to rapid cell division along with the high motility and invasiveness of cells into the brain tissue. Simple 2-dimensional laboratory assays (e.g., a scratch assay) are commonly used to measure the effects of various experimental perturbations, such as treatment with chemical inhibitors. Several mathematical models have been developed to aid the understanding of the motile behavior and proliferation of GBM cells. However, many are mathematically complicated, look at multiple interdependent phenomena, and/or use modeling software not freely available to the research community. These attributes make the adoption of models and simulations of even simple 2-dimensional cell behavior an uncommon practice by cancer cell biologists. Herein, we developed an accurate, yet simple, rule-based modeling framework to describe the in vitro behavior of GBM cells that are stimulated by the L1CAM protein, using freely available NetLogo software. In our model, L1CAM is released by cells to act through two cell surface receptors and a point of signaling convergence to increase cell motility and proliferation. A simple graphical interface is provided so that changes can be made easily to several parameters controlling cell behavior, and the behavior of the cells is viewed both pictorially and with dedicated graphs. We fully describe the hierarchical rule-based modeling framework, show simulation results under several settings, describe the accuracy compared to experimental data, and discuss the potential usefulness for predicting future experimental outcomes and for use as a teaching tool for cell biology students. It is concluded that this simple modeling framework and its simulations accurately reflect much of the GBM cell motility behavior observed experimentally in vitro in the laboratory.
Our framework can be modified easily to suit the needs of investigators interested in other similar intrinsic or extrinsic stimuli that influence cancer or other cell behavior. This modeling framework of a commonly used experimental motility assay (scratch assay) should be useful to both researchers of cell motility and students in a cell biology teaching laboratory.
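The NetLogo model itself is not reproduced here, but the rule-based flavor can be sketched in a few lines: lattice-bound agents that hop and divide with tunable probabilities. All parameter values below are hypothetical stand-ins for the L1CAM-dependent motility and proliferation rates.

```python
import random

def step(cells, motility, divide_p, rng):
    """One tick: each cell may hop to an adjacent lattice site and may divide."""
    new = set()
    for (x, y) in cells:
        if rng.random() < motility:  # motility rule (hypothetical rate)
            dx, dy = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
            x, y = x + dx, y + dy
        new.add((x, y))
        if rng.random() < divide_p:  # proliferation rule (hypothetical rate)
            new.add((x + 1, y))      # naive daughter-cell placement
    return new

rng = random.Random(0)
cells = {(0, 0), (5, 5)}
for _ in range(10):
    cells = step(cells, motility=0.8, divide_p=0.1, rng=rng)
```

Raising `motility` or `divide_p` mimics L1CAM stimulation; an inhibitor treatment would be modeled by lowering them.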
A framework for selecting indicators of bioenergy sustainability
Dale, Virginia H.; Efroymson, Rebecca Ann; Kline, Keith L.; ...
2015-05-11
A framework for selecting and evaluating indicators of bioenergy sustainability is presented. This framework is designed to facilitate decision-making about which indicators are useful for assessing sustainability of bioenergy systems and supporting their deployment. Efforts to develop sustainability indicators in the United States and Europe are reviewed. The first steps of the framework for indicator selection are defining the sustainability goals and other goals for a bioenergy project or program, gaining an understanding of the context, and identifying the values of stakeholders. From the goals, context, and stakeholders, the objectives for analysis and criteria for indicator selection can be developed. The user of the framework identifies and ranks indicators, applies them in an assessment, and then evaluates their effectiveness, while identifying gaps that prevent goals from being met, assessing lessons learned, and moving toward best practices. The framework approach emphasizes that the selection of appropriate criteria and indicators is driven by the specific purpose of an analysis. Realistic goals and measures of bioenergy sustainability can be developed systematically with the help of the framework presented here.
Modelling proteins' hidden conformations to predict antibiotic resistance
NASA Astrophysics Data System (ADS)
Hart, Kathryn M.; Ho, Chris M. W.; Dutta, Supratik; Gross, Michael L.; Bowman, Gregory R.
2016-10-01
TEM β-lactamase confers bacteria with resistance to many antibiotics and rapidly evolves activity against new drugs. However, functional changes are not easily explained by differences in crystal structures. We employ Markov state models to identify hidden conformations and explore their role in determining TEM's specificity. We integrate these models with existing drug-design tools to create a new technique, called Boltzmann docking, which better predicts TEM specificity by accounting for conformational heterogeneity. Using our MSMs, we identify hidden states whose populations correlate with activity against cefotaxime. To experimentally detect our predicted hidden states, we use rapid mass spectrometric footprinting and confirm our models' prediction that increased cefotaxime activity correlates with reduced Ω-loop flexibility. Finally, we design novel variants to stabilize the hidden cefotaximase states, and find their populations predict activity against cefotaxime in vitro and in vivo. Therefore, we expect this framework to have numerous applications in drug and protein design.
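The core idea of Boltzmann docking, weighting per-conformation docking scores by the equilibrium populations of the MSM states, can be sketched as follows. The state energies and docking scores are hypothetical, and kT is taken at room temperature in kcal/mol.

```python
import math

def boltzmann_weights(free_energies, kT=0.593):
    """State populations from free energies (kcal/mol; kT ~ 0.593 at 298 K)."""
    ws = [math.exp(-g / kT) for g in free_energies]
    z = sum(ws)
    return [w / z for w in ws]

def ensemble_score(state_energies, docking_scores, kT=0.593):
    """Population-weighted average docking score over (hidden) states."""
    pops = boltzmann_weights(state_energies, kT)
    return sum(p * s for p, s in zip(pops, docking_scores))

# Hypothetical two-state example: the higher-energy 'hidden' state binds better
score = ensemble_score([0.0, 1.0], [-5.0, -9.0])
```

A variant that stabilizes the hidden state (lowers its free energy) pulls the ensemble score toward the better docking score, which is how population shifts translate into predicted activity.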
Williams, A Mark; Ericsson, K Anders
2005-06-01
The number of researchers studying perceptual-cognitive expertise in sport is increasing. The intention in this paper is to review the currently accepted framework for studying expert performance and to consider implications for undertaking research work in the area of perceptual-cognitive expertise in sport. The expert performance approach offers a descriptive and inductive basis for the systematic study of expert performance. The nature of expert performance is initially captured in the laboratory using representative tasks that identify reliably superior performance. Process-tracing measures are employed to determine the mechanisms that mediate expert performance on the task. Finally, the specific types of activities that lead to the acquisition and development of these mediating mechanisms are identified. General principles and mechanisms may be discovered and then validated by more traditional experimental designs. The relevance of this approach to the study of perceptual-cognitive expertise in sport is discussed and suggestions for future work are highlighted.
Patiño Cano, Laura P; Quintana Manfredi, Rodrigo; Pérez, Miriam; García, Mónica; Blustein, Guillermo; Cordeiro, Ralf; Pérez, Carlos D; Schejter, Laura; Palermo, Jorge A
2018-01-01
Three azulenoid sesquiterpenes (1 - 3) were isolated from the Antarctic gorgonian Acanthogorgia laxa collected by bottom trawls at -343 m. Besides linderazulene (1) and the known ketolactone 2, a new brominated C16 linderazulene derivative (3) was also identified. This compound has an extra carbon atom at C(7) of the linderazulene framework. The antifouling activity of compounds 1 and 2 was assayed in the laboratory with Artemia salina larvae, and also in field tests, by incorporation in soluble-matrix experimental antifouling paints. The results obtained after a 45-day field trial of the paints showed that compounds 1 and 2 displayed good antifouling potencies against a wide array of organisms. Compound 3, a benzylic bromide, was unstable and for this reason was not submitted to bioassays. Two known cembranolides, pukalide and epoxypukalide, were also identified as minor components of the extract. © 2018 Wiley-VHCA AG, Zurich, Switzerland.
Lens and dendrite formation during colloidal solidification
NASA Astrophysics Data System (ADS)
Worster, Grae; You, Jiaxue
2017-11-01
Colloidal particles in suspension are forced into a variety of morphologies when the suspending fluid medium is frozen: soil is compacted between ice lenses during frost heave; ice templating is a recent and growing technology to produce bio-inspired, micro-porous materials; cells and tissue can be damaged during cryosurgery; and metal-matrix composites with tailored microstructure can be fabricated by controlled casting. Various instabilities that affect the microscopic morphology are controlled by fluid flow through the compacted layer of particles that accumulates ahead of the solidification front. By analysing the flow in connection with equilibrium phase relationships, we develop a theoretical framework that identifies two different mechanisms for ice-lens formation, with and without a frozen fringe, identifies the external parameters that differentiate between them and the possibility of dendritic formations, and unifies a range of apparently disparate conclusions drawn from previous experimental studies. This work was supported by the China Scholarship Council and the British Council.
An Experimental Framework for Executing Applications in Dynamic Grid Environments
NASA Technical Reports Server (NTRS)
Huedo, Eduardo; Montero, Ruben S.; Llorente, Ignacio M.; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
The Grid opens up opportunities for resource-starved scientists and engineers to harness highly distributed computing resources. A number of Grid middleware projects are currently available to support the simultaneous exploitation of heterogeneous resources distributed in different administrative domains. However, efficient job submission and management remain far from accessible to ordinary scientists and engineers due to the dynamic and complex nature of the Grid. This report describes a new Globus framework that allows an easier and more efficient execution of jobs in a 'submit and forget' fashion. Adaptation to dynamic Grid conditions is achieved by supporting automatic application migration following performance degradation, 'better' resource discovery, requirement change, owner decision or remote resource failure. The report also includes experimental results of the behavior of our framework on the TRGP testbed.
Discriminative prediction of mammalian enhancers from DNA sequence
Lee, Dongwon; Karchin, Rachel; Beer, Michael A.
2011-01-01
Accurately predicting regulatory sequences and enhancers in entire genomes is an important but difficult problem, especially in large vertebrate genomes. With the advent of ChIP-seq technology, experimental detection of genome-wide EP300/CREBBP bound regions provides a powerful platform to develop predictive tools for regulatory sequences and to study their sequence properties. Here, we develop a support vector machine (SVM) framework which can accurately identify EP300-bound enhancers using only genomic sequence and an unbiased set of general sequence features. Moreover, we find that the predictive sequence features identified by the SVM classifier reveal biologically relevant sequence elements enriched in the enhancers, but we also identify other features that are significantly depleted in enhancers. The predictive sequence features are evolutionarily conserved and spatially clustered, providing further support of their functional significance. Although our SVM is trained on experimental data, we also predict novel enhancers and show that these putative enhancers are significantly enriched in both ChIP-seq signal and DNase I hypersensitivity signal in the mouse brain and are located near relevant genes. Finally, we present results of comparisons between other EP300/CREBBP data sets using our SVM and uncover sequence elements enriched and/or depleted in the different classes of enhancers. Many of these sequence features play a role in specifying tissue-specific or developmental-stage-specific enhancer activity, but our results indicate that some features operate in a general or tissue-independent manner. In addition to providing a high confidence list of enhancer targets for subsequent experimental investigation, these results contribute to our understanding of the general sequence structure of vertebrate enhancers. PMID:21875935
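A common way to build the "general sequence features" such an SVM consumes is a k-mer count vector. The sketch below shows only that feature-extraction step (the classifier itself, e.g. a linear SVM, would be trained separately on vectors like these); the example sequence is hypothetical.

```python
from itertools import product

def kmer_features(seq, k=3):
    """Occurrence counts of every DNA k-mer: a 4**k-dimensional feature vector."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    vec = [0] * len(kmers)
    for i in range(len(seq) - k + 1):
        km = seq[i:i + k]
        if km in index:  # skip windows containing ambiguous bases like N
            vec[index[km]] += 1
    return vec

v = kmer_features("ACGTACGT", k=3)  # 6 sliding windows over a toy sequence
```

With k = 3 the vector has 64 dimensions; the SVM weight learned for each dimension then indicates whether that 3-mer is enriched or depleted in enhancers.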
Comparability of outcome frameworks in medical education: Implications for framework development.
Hautz, Stefanie C; Hautz, Wolf E; Feufel, Markus A; Spies, Claudia D
2015-01-01
Given the increasing mobility of medical students and practitioners, there is a growing need for harmonization of medical education and qualifications. Although several initiatives have sought to compare national outcome frameworks, this task has proven a challenge. Drawing on an analysis of existing outcome frameworks, we identify factors that hinder comparability and suggest ways of facilitating comparability during framework development and revisions. We searched MedLine, EmBase and the Internet for outcome frameworks in medical education published by national or governmental organizations. We analyzed these frameworks for differences and similarities that influence comparability. Of 1816 search results, 13 outcome frameworks met our inclusion criteria. These frameworks differ in five core features: history and origins, formal structure, medical education system, target audience and key terms. Many frameworks reference other frameworks without acknowledging these differences. Importantly, the level of detail of the outcomes specified differs both within and between frameworks. The differences identified explain some of the challenges involved in comparing outcome frameworks and medical qualifications. We propose a two-level model distinguishing between "core" competencies and culture-specific "secondary" competencies. This approach could strike a balance between local specifics and cross-national comparability of outcome frameworks and medical education.
Orpana, H.; Vachon, J.; Dykxhoorn, J.; McRae, L.; Jayaraman, G.
2016-01-01
Introduction: The Mental Health Strategy for Canada identified a need to enhance the collection of data on mental health in Canada. While surveillance systems on mental illness have been established, a data gap for monitoring positive mental health and its determinants was identified. The goal of this project was to develop a Positive Mental Health Surveillance Indicator Framework, to provide a picture of the state of positive mental health and its determinants in Canada. Data from this surveillance framework will be used to inform programs and policies to improve the mental health of Canadians. Methods: A literature review and environmental scan were conducted to provide the theoretical base for the framework, and to identify potential positive mental health outcomes and risk and protective factors. The Public Health Agency of Canada's definition of positive mental health was adopted as the conceptual basis for the outcomes of this framework. After identifying a comprehensive list of risk and protective factors, mental health experts, other governmental partners and non-governmental stakeholders were consulted to prioritize these indicators. Subsequently, these groups were consulted to identify the most promising measurement approaches for each indicator. Results: A conceptual framework for surveillance of positive mental health and its determinants has been developed, containing 5 outcome indicators and 25 determinant indicators organized within 4 domains at the individual, family, community and societal level. This indicator framework addresses a data gap identified in Canada's strategy for mental health and will be used to inform programs and policies to improve the mental health status of Canadians throughout the life course. PMID:26789022
From mechanisms to function: an integrated framework of animal innovation
Tebbich, Sabine; Griffin, Andrea S.; Peschl, Markus F.; Sterelny, Kim
2016-01-01
Animal innovations range from the discovery of novel food types to the invention of completely novel behaviours. Innovations can give access to new opportunities, and thus enable innovating agents to invade and create novel niches. This in turn can pave the way for morphological adaptation and adaptive radiation. The mechanisms that make innovations possible are probably as diverse as the innovations themselves. So too are their evolutionary consequences. Perhaps because of this diversity, we lack a unifying framework that links mechanism to function. We propose a framework for animal innovation that describes the interactions between mechanism, fitness benefit and evolutionary significance, and which suggests an expanded range of experimental approaches. In doing so, we split innovation into factors (components and phases) that can be manipulated systematically, and which can be investigated both experimentally and with correlational studies. We apply this framework to a selection of cases, showing how it helps us ask more precise questions and design more revealing experiments. PMID:26926285
Robust Bayesian Experimental Design for Conceptual Model Discrimination
NASA Astrophysics Data System (ADS)
Pham, H. V.; Tsai, F. T. C.
2015-12-01
A robust Bayesian optimal experimental design under uncertainty is presented to provide firm information for model discrimination using the smallest number of pumping wells and observation wells. Firm information is the maximum information about a system that can be guaranteed from an experimental design. The design is based on the Box-Hill expected entropy decrease (EED) before and after the experiment and on the Bayesian model averaging (BMA) framework. A max-min program is introduced to choose the robust design that maximizes the minimal Box-Hill EED, subject to the constraint that the highest expected posterior model probability satisfies a desired probability threshold. The EED is calculated by Gauss-Hermite quadrature. The BMA method is used to predict future observations and to quantify the future observation uncertainty arising from conceptual and parametric uncertainties when calculating the EED. A Monte Carlo approach is adopted to quantify the uncertainty in the posterior model probabilities. The optimal experimental design is tested on a synthetic 5-layer anisotropic confined aquifer. Nine conceptual groundwater models are constructed to reflect uncertain geological architecture and boundary conditions. High-performance computing is used to enumerate all possible design solutions in order to identify the most plausible groundwater model. Results highlight the impacts of scedasticity in future observation data, as well as of the uncertainty sources, on potential pumping and observation locations.
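The Gauss-Hermite step can be sketched as follows: the expectation of a function of a Gaussian-distributed observation is approximated from tabulated nodes and weights. The function and moments below are illustrative, not the Box-Hill EED integrand itself.

```python
import numpy as np

def gauss_expectation(f, mu, sigma, n=20):
    """E[f(X)] for X ~ N(mu, sigma^2) via n-point Gauss-Hermite quadrature."""
    x, w = np.polynomial.hermite.hermgauss(n)  # nodes/weights for weight exp(-x^2)
    return float(np.sum(w * f(mu + np.sqrt(2.0) * sigma * x)) / np.sqrt(np.pi))

# Sanity check: the second moment of a standard normal is 1
m2 = gauss_expectation(lambda x: x ** 2, 0.0, 1.0)
```

The change of variables x → mu + √2·σ·x absorbs the quadrature's exp(-x²) weight into the normal density, so n points integrate polynomials of degree up to 2n-1 exactly.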
Boström, Jan; Elger, Christian E.; Mormann, Florian
2016-01-01
Recording extracellularly from neurons in the brains of animals in vivo is among the most established experimental techniques in neuroscience, and has recently become feasible in humans. Many interesting scientific questions can be addressed only when extracellular recordings last several hours, and when individual neurons are tracked throughout the entire recording. Such questions regard, for example, neuronal mechanisms of learning and memory consolidation, and the generation of epileptic seizures. Several difficulties have so far limited the use of extracellular multi-hour recordings in neuroscience: Datasets become huge, and data are necessarily noisy in clinical recording environments. No methods for spike sorting of such recordings have been available. Spike sorting refers to the process of identifying the contributions of several neurons to the signal recorded in one electrode. To overcome these difficulties, we developed Combinato: a complete data-analysis framework for spike sorting in noisy recordings lasting twelve hours or more. Our framework includes software for artifact rejection, automatic spike sorting, manual optimization, and efficient visualization of results. Our completely automatic framework excels at two tasks: It outperforms existing methods when tested on simulated and real data, and it enables researchers to analyze multi-hour recordings. We evaluated our methods on both short and multi-hour simulated datasets. To evaluate the performance of our methods in an actual neuroscientific experiment, we used data from neurosurgical patients, recorded in order to identify visually responsive neurons in the medial temporal lobe. These neurons responded to the semantic content, rather than to visual features, of a given stimulus. To test our methods with multi-hour recordings, we made use of neurons in the human medial temporal lobe that respond selectively to the same stimulus in the evening and next morning. PMID:27930664
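Combinato's actual detection and clustering pipeline is more elaborate, but the first step of any spike sorter, thresholding with a refractory dead time so one waveform is not counted twice, can be sketched as follows (threshold and dead-time values are hypothetical).

```python
def detect_spikes(trace, threshold, dead=30):
    """Indices where the signal crosses threshold, enforcing a refractory gap
    of `dead` samples so one spike is not counted twice."""
    spikes, last = [], -dead
    for i, v in enumerate(trace):
        if v > threshold and i - last >= dead:
            spikes.append(i)
            last = i
    return spikes

trace = [0.0] * 100
trace[10] = trace[11] = 5.0   # one spike spanning two supra-threshold samples
trace[70] = 4.0               # a second, later spike
idx = detect_spikes(trace, threshold=3.0)
```

The detected indices would then be used to cut waveform snippets, whose shapes are clustered to assign each spike to a putative neuron.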
Dimitrova, N; Nagaraj, A B; Razi, A; Singh, S; Kamalakaran, S; Banerjee, N; Joseph, P; Mankovich, A; Mittal, P; DiFeo, A; Varadan, V
2017-04-27
Characterizing the complex interplay of cellular processes in cancer would enable the discovery of key mechanisms underlying its development and progression. Published approaches to decipher driver mechanisms do not explicitly model tissue-specific changes in pathway networks and the regulatory disruptions related to genomic aberrations in cancers. We therefore developed InFlo, a novel systems biology approach for characterizing complex biological processes using a unique multidimensional framework integrating transcriptomic, genomic and/or epigenomic profiles for any given cancer sample. We show that InFlo robustly characterizes tissue-specific differences in activities of signalling networks on a genome scale using unique probabilistic models of molecular interactions on a per-sample basis. Using large-scale multi-omics cancer datasets, we show that InFlo exhibits higher sensitivity and specificity in detecting pathway networks associated with specific disease states when compared to published pathway network modelling approaches. Furthermore, InFlo's ability to infer the activity of unmeasured signalling network components was also validated using orthogonal gene expression signatures. We then evaluated multi-omics profiles of primary high-grade serous ovarian cancer tumours (N=357) to delineate mechanisms underlying resistance to frontline platinum-based chemotherapy. InFlo was the only algorithm to identify hyperactivation of the cAMP-CREB1 axis as a key mechanism associated with resistance to platinum-based therapy, a finding that we subsequently experimentally validated. We confirmed that inhibition of CREB1 phosphorylation potently sensitized resistant cells to platinum therapy and was effective in killing ovarian cancer stem cells that contribute to both platinum-resistance and tumour recurrence. 
Thus, we propose InFlo as a scalable, widely applicable and robust integrative network-modelling framework for the discovery of evidence-based biomarkers and therapeutic targets.
Lupi, Silvia; Verzola, Adriano; Carandina, Gianni; Salani, Manuela; Antonioli, Paola; Gregorio, Pasquale
2011-05-17
In today's dynamic health-care system, organizations such as hospitals are required to improve their performance for multiple stakeholders and deliver integrated care, which means working effectively, being innovative and organizing efficiently. Achieved goals and levels of quality can be successfully measured by a multidimensional approach like the Balanced Scorecard (BSC). The aim of the study was to verify the opportunity to introduce the BSC framework to measure performance in St. Anna University Hospital of Ferrara, applying it to the Clinical Laboratory Operative Unit in order to compare performance results and achievement of assigned targets over time. In the first experience with the BSC we distinguished four perspectives, according to Kaplan and Norton, identified Key Performance Areas and Key Performance Indicators, set standards and weights for each objective, collected data for all indicators, and recognized cause-and-effect relationships in a strategic map. One year later we proceeded with the next data collection and analysed whether the framework retained its aptitude for measuring Operative Unit performance. In addition, we verified the ability to underline links between strategic actions belonging to different perspectives in producing outcome changes. The BSC was found to be effective for underlining existing problems and identifying opportunities for improvement. The BSC also revealed each perspective's specific contribution to overall performance enhancement. Comparison of results over time was possible depending on the selection of feasible and appropriate key performance indicators, which was occasionally limited by data-collection problems. This first use of the BSC to compare performance at the Operative Unit level over time suggested that this framework can be successfully adopted for measuring results and revealing effective health factors, allowing health-care quality improvements.
Effect of different aging methods on the mechanical behavior of multi-layered ceramic structures.
Borba, Márcia; de Araújo, Maico D; Fukushima, Karen A; Yoshimura, Humberto N; Griggs, Jason A; Della Bona, Álvaro; Cesar, Paulo F
2016-12-01
To evaluate the effect of two aging methods (mechanical cycling and autoclave) on the mechanical behavior of veneer and framework ceramic specimens with different configurations (monolithic, two and three layers). Three ceramics used as frameworks for fixed dental prostheses (YZ-Vita In-Ceram YZ; IZ-Vita In-Ceram Zirconia; AL-Vita In-Ceram AL) and two veneering porcelains (VM7 and VM9) were studied. Bar-shaped specimens were produced in three different designs: monolithic, two layers (porcelain-framework) and three layers (porcelain-framework-porcelain). Specimens were tested for three-point flexural strength at 1 MPa/s in 37°C artificial saliva. Three experimental conditions were evaluated (n=10): control; mechanical cycling (2 Hz, 37°C artificial saliva); and autoclave aging (134°C, 2 bar, 5 h). Bi-layered specimens were tested in both configurations: with the porcelain or the framework ceramic under tension. Fracture surfaces were analyzed using stereomicroscopy and scanning electron microscopy. Results were statistically analyzed using Kruskal-Wallis and Student-Newman-Keuls tests. Only for the AL group did mechanical cycling and autoclave aging significantly decrease the flexural strength values in comparison to the control (p<0.01). The YZ, IZ, VM7 and VM9 monolithic groups showed no strength degradation. For multi-layered specimens, when the porcelain layer was tested in tension (bi- and tri-layers), the aging methods evaluated also had no effect on strength (p≥0.05). Total and partial failure modes were identified. Mechanical cycling and autoclave aging protocols had no effect on the flexural strength values and failure behavior of YZ and IZ ceramic structures. Yet, AL monolithic structures showed a significant decrease in flexural strength with either of the aging methods. Copyright © 2016. Published by Elsevier Ltd.
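The three-point flexural strength reported here follows the standard beam formula σ = 3FL/(2bd²); a quick check with hypothetical bar dimensions (the specimen sizes are not given in this abstract):

```python
def flexural_strength(load_n, span_mm, width_mm, thickness_mm):
    """Three-point bend strength, sigma = 3FL / (2 b d^2); N and mm give MPa."""
    return 3.0 * load_n * span_mm / (2.0 * width_mm * thickness_mm ** 2)

# Hypothetical bar: 100 N failure load, 20 mm span, 4 mm wide, 2 mm thick
sigma = flexural_strength(100.0, 20.0, 4.0, 2.0)  # 187.5 MPa
```

Because strength scales with the inverse square of thickness, small dimensional errors in thin veneering layers dominate the uncertainty of the computed stress.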
Structure and properties of microporous titanosilicate determined by first-principles calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ching, W.Y.; Xu, Y.; Gu, Z.
1996-12-01
The structure of ETS-10, a member of the family of synthetic microporous titanosilicates, was recently determined by an ingenious combination of experimental and simulational techniques. However, the locations of the alkali atoms in the framework remain elusive and its electronic structure is totally unknown. Based on first-principles local density calculations, the possible locations of the alkali atoms are identified and the electronic structure and bonding are fully elucidated. ETS-10 is a semiconductor with a direct band gap of 2.33 eV. The Na atoms are likely to be located inside the seven-member ring pore adjacent to the one-dimensional -Ti-O-Ti-O- chain. © 1996 The American Physical Society.
Iafolla, V; Lefevre, C; Fiorenza, E; Santoli, F; Nozzoli, S; Magnafico, C; Lucente, M; Lucchesi, D; Peron, R; Shapiro, I I; Glashow, S; Lorenzini, E C
2014-01-01
A cryogenic differential accelerometer has been developed to test the weak equivalence principle to a few parts in 10^15 within the framework of the general relativity accuracy test in an Einstein elevator experiment. The prototype sensor was designed to identify, address, and solve the major issues associated with various aspects of the experiment. This paper illustrates the measurements conducted on this prototype sensor to attain a high quality factor (Q ∼ 10^5) at low frequencies (<20 Hz). Such a value is necessary for reducing the Brownian noise to match the target acceleration noise of 10^-14 g/√Hz, hence providing the desired experimental accuracy.
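The Brownian-noise requirement quoted above follows from the standard thermal-noise expression for a damped harmonic oscillator. A minimal sketch; the proof mass, resonance frequency and operating temperature below are illustrative assumptions, not the sensor's actual specifications:

```python
import math

def thermal_accel_noise(m_kg, f0_hz, Q, T_k):
    """One-sided Brownian acceleration noise amplitude spectral density
    of a damped harmonic oscillator, in m/s^2/sqrt(Hz):
    sqrt(4*kB*T*omega0 / (m*Q))."""
    kB = 1.380649e-23  # Boltzmann constant, J/K
    omega0 = 2 * math.pi * f0_hz
    return math.sqrt(4 * kB * T_k * omega0 / (m_kg * Q))

# Illustrative (hypothetical) parameters: 1 kg proof mass, 10 Hz resonance,
# Q = 1e5, 2 K cryogenic operation; result converted from m/s^2 to g.
noise_g = thermal_accel_noise(1.0, 10.0, 1e5, 2.0) / 9.81
```

With these assumed numbers the formula lands in the 10^-14 g/√Hz regime, illustrating why a high Q is essential to the noise budget.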
The Role of Second Phase Hard Particles on Hole Stretchability of two AA6xxx Alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Xiaohua; Sun, Xin; Golovashchenko, Sergey F.
The hole stretchability of two aluminum alloys (AA6111 and AA6022) is studied using a two-stage integrated finite element framework in which the edge geometry and edge damage from the hole-piercing process are carried into the subsequent hole expansion process. Experimentally, AA6022 has been found to have higher hole expansion ratios than AA6111. This observation is captured well by the finite element simulations. The main cause of the difference is identified as the volume fraction of randomly distributed second-phase hard particles, which play a critical role in determining the fracture strains of the materials.
Structural and electronic properties of M-MOF-74 (M = Mg, Co or Mn)
NASA Astrophysics Data System (ADS)
de Oliveira, Aline; de Lima, Guilherme Ferreira; De Abreu, Heitor Avelino
2018-01-01
The metal-organic frameworks M-MOF-74 (M = Mg, Co or Mn) were investigated through density functional theory calculations. Structural parameters and band gap energies were determined in agreement with experimental data, with errors under 2%. The Electron Localization Function and the Quantum Theory of Atoms in Molecules were applied to analyze the electronic density topology of the three solids. These methodologies indicated that the bonds between the metallic cations and the oxygen atoms are predominantly ionic, while the other bonds are predominantly covalent. Furthermore, non-conventional hydrogen bonds were identified in Mg-MOF-74 and Co-MOF-74 but not in Mn-MOF-74.
A continuum dislocation dynamics framework for plasticity of polycrystalline materials
NASA Astrophysics Data System (ADS)
Askari, Hesam Aldin
The objective of this research is to investigate the mechanical response of polycrystals in different settings to identify the mechanisms that give rise to the specific responses observed in the deformation process. In particular, the large deformation of magnesium alloys and the yield properties of copper at small scales are investigated. We develop a continuum dislocation dynamics framework based on dislocation mechanisms and interaction laws and implement this formulation in a viscoplastic self-consistent scheme to obtain the mechanical response of a polycrystalline system. The versatility of this method allows various applications: the study of problems involving large deformation, of microstructure and its evolution, of superplasticity, of size effects in polycrystals and of stochastic plasticity. The findings from the numerical solution are compared to experimental results to validate the simulations. We apply this framework to study the deformation mechanisms in magnesium alloys at moderate to fast strain rates and from room temperature to 450 °C. Experiments over the same range of strain rates and temperatures were carried out to obtain the mechanical and material properties and to compare with the numerical results. The numerical approach for magnesium is divided into four main steps: 1) room temperature unidirectional loading; 2) high temperature deformation without grain boundary sliding; 3) high temperature deformation with the grain boundary sliding mechanism; and 4) room temperature cyclic loading. We demonstrate the capability of our modeling approach in predicting mechanical properties and texture evolution and discuss the improvement obtained by using the continuum dislocation dynamics method. The framework was also applied to nano-sized copper polycrystals to study the yield properties at small scales and address the observed yield scatter.
By combining our developed method with a Monte Carlo simulation approach, the stochastic plasticity at small length scales was studied and the sources of uncertainty in the polycrystalline structure are discussed. Our results suggest that the stochastic response arises mainly from a) stochastic plasticity due to the dislocation substructure inside crystals and b) the microstructure of the polycrystalline material. The extent of the uncertainty is correlated with the "effective cell length" in the sampling procedure, whether using simulation or experimental approaches.
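The link between sampled microstructure and yield scatter can be illustrated with a toy Monte Carlo calculation. The Hall-Petch coefficients and grain-size distribution below are hypothetical placeholders, not values from this work; the sketch only shows qualitatively how scatter shrinks as the sampled volume contains more grains:

```python
import random
import statistics

def hall_petch_yield(d_um, sigma0=25.0, k=110.0):
    """Hall-Petch relation: sigma_y = sigma0 + k / sqrt(d), with d in
    micrometres and stress in MPa. sigma0 and k are illustrative values,
    not parameters fitted to copper data."""
    return sigma0 + k / (d_um ** 0.5)

def sample_yield(n_grains, mean_d=0.05, rel_sd=0.3):
    """Average yield stress over a random grain-size population
    (clipped-normal scatter); fewer grains -> larger sample-to-sample
    variation, one qualitative source of stochastic response."""
    ds = [max(1e-3, random.gauss(mean_d, rel_sd * mean_d)) for _ in range(n_grains)]
    return statistics.mean(hall_petch_yield(d) for d in ds)

random.seed(0)
# Scatter (std dev over 200 Monte Carlo trials) of the aggregate yield
# shrinks as more grains fall inside the sampled cell.
scatter_small = statistics.stdev(sample_yield(10) for _ in range(200))
scatter_large = statistics.stdev(sample_yield(1000) for _ in range(200))
```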
NASA Astrophysics Data System (ADS)
Kopsaftopoulos, Fotis; Nardari, Raphael; Li, Yu-Hung; Chang, Fu-Kuo
2018-01-01
In this work, a novel data-based stochastic "global" identification framework is introduced for aerospace structures operating under varying flight states and uncertainty. In this context, the term "global" refers to the identification of a model that is capable of representing the structure under any admissible flight state based on data recorded from a sample of these states. The proposed framework is based on stochastic time-series models for representing the structural dynamics and aeroelastic response under multiple flight states, with each state characterized by several variables, such as the airspeed, angle of attack, altitude and temperature, forming a flight state vector. The method's cornerstone lies in the new class of Vector-dependent Functionally Pooled (VFP) models, which allow the explicit analytical inclusion of the flight state vector into the model parameters and, hence, system dynamics. This is achieved via the use of functional data pooling techniques for optimally treating, as a single entity, the data records corresponding to the various flight states. In this proof-of-concept study the flight state vector is defined by two variables, namely the airspeed and angle of attack of the vehicle. The experimental evaluation and assessment is based on a prototype bio-inspired self-sensing composite wing that is subjected to a series of wind tunnel experiments under multiple flight states. Distributed micro-sensors in the form of stretchable sensor networks are embedded in the composite layup of the wing in order to provide the sensing capabilities. Experimental data collected from piezoelectric sensors are employed for the identification of a stochastic global VFP model via appropriate parameter estimation and model structure selection methods. The estimated VFP model parameters constitute two-dimensional functions of the flight state vector defined by the airspeed and angle of attack.
The identified model is able to successfully represent the wing's aeroelastic response under the admissible flight states via a minimum number of estimated parameters compared to standard identification approaches. The obtained results demonstrate the high accuracy and effectiveness of the proposed global identification framework, thus constituting a first step towards the next generation of "fly-by-feel" aerospace vehicles with state awareness capabilities.
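The functional-pooling idea behind VFP models, model coefficients expanded on basis functions of the flight state vector and estimated from data pooled across states, can be sketched with a toy AR(2) example. This is an illustrative least-squares formulation under invented dynamics, not the authors' estimator or their model structure selection procedure:

```python
import numpy as np

def fit_vfp_ar2(records, basis):
    """Pooled least-squares fit of an AR(2) model whose coefficients are
    functions of a flight-state vector k, expanded on `basis` functions:
        y[t] = a1(k)*y[t-1] + a2(k)*y[t-2] + e[t],  a_i(k) = sum_j c_ij*g_j(k).
    `records` is a list of (k, y) pairs, one signal per flight state."""
    rows, targets = [], []
    for k, y in records:
        g = np.array([b(k) for b in basis])        # basis evaluated at this state
        for t in range(2, len(y)):
            # regressor row: [y[t-1]*g_1..g_m, y[t-2]*g_1..g_m]
            rows.append(np.concatenate([y[t - 1] * g, y[t - 2] * g]))
            targets.append(y[t])
    coeffs, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    m = len(basis)
    return coeffs[:m], coeffs[m:]                  # expansions of a1(k), a2(k)

# Toy data: a1 varies linearly with airspeed V, a2 is constant; k = (V, aoa).
basis = [lambda k: 1.0, lambda k: k[0]]
rng = np.random.default_rng(0)
records = []
for V in (10.0, 20.0, 30.0):
    a1_true = 0.5 + 0.01 * V
    y = [0.1, 0.2]
    for _ in range(2000):
        y.append(a1_true * y[-1] - 0.3 * y[-2] + 0.01 * rng.standard_normal())
    records.append(((V, 5.0), np.array(y)))
c1, c2 = fit_vfp_ar2(records, basis)   # recovers (0.5, 0.01) and (-0.3, ~0)
```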
Framework for Identifying Cybersecurity Risks in Manufacturing
Hutchins, Margot J.; Bhinge, Raunak; Micali, Maxwell K.; ...
2015-10-21
Increasing connectivity, use of digital computation, and off-site data storage provide potential for dramatic improvements in manufacturing productivity, quality, and cost. However, there are also risks associated with the increased volume and pervasiveness of data that are generated and potentially accessible to competitors or adversaries. Enterprises have experienced cyber attacks that exfiltrate confidential and/or proprietary data, alter information to cause an unexpected or unwanted effect, and destroy capital assets. Manufacturers need tools to incorporate these risks into their existing risk management processes. This article establishes a framework that considers the data flows within a manufacturing enterprise and throughout its supply chain. The framework provides several mechanisms for identifying generic and manufacturing-specific vulnerabilities and is illustrated with details pertinent to an automotive manufacturer. Finally, in addition to providing manufacturers with insights into their potential data risks, this framework addresses an outcome identified by the NIST Cybersecurity Framework.
Development and validation of a child health workforce competence framework.
Smith, Lynda; Hawkins, Jean; McCrum, Anita
2011-05-01
Providing high quality, effective services is fundamental to the delivery of key health outcomes for children and young people. This requires a competent workforce. This paper reports on the development of a validated competence framework tool for the children and young people's health workforce. The framework brings together policy, strategic agendas and existing workforce competences. The framework will contribute to the improvement of children's physical and mental wellbeing by identifying competences required to provide proactive services that respond to children and young people with acute, continuing and complex needs. It details five core competences for the workforce, the functions that underpin them and levels of competence required to deliver a particular service. The framework will be of value to commissioners to inform contracting, to providers to ensure services are delivered by a workforce with relevant competences to meet identified needs, and to the workforce to assess existing capabilities and identify gaps in competence.
Studies on Experimental Ontology and Knowledge Service Development in Bio-Environmental Engineering
NASA Astrophysics Data System (ADS)
Zhang, Yunliang
2018-01-01
The existing domain-related ontologies and information service patterns are analyzed, and the main problems faced by experimental-scheme knowledge services are clarified. An ontology framework model for knowledge services in bio-environmental engineering is proposed, covering experimental materials, experimental conditions and experimental instruments; this ontology is to be combined with existing knowledge organization systems to organize scientific and technological literature, data and experimental schemes. Together with similarity and priority calculations, it can support research in the related domains.
Model-Based Reasoning in Upper-division Lab Courses
NASA Astrophysics Data System (ADS)
Lewandowski, Heather
2015-05-01
Modeling, which includes developing, testing, and refining models, is a central activity in physics. Well-known examples from AMO physics include everything from the Bohr model of the hydrogen atom to the Bose-Hubbard model of interacting bosons in a lattice. Modeling, while typically considered a theoretical activity, is most fully represented in the laboratory where measurements of real phenomena intersect with theoretical models, leading to refinement of models and experimental apparatus. However, experimental physicists use models in complex ways and the process is often not made explicit in physics laboratory courses. We have developed a framework to describe the modeling process in physics laboratory activities. The framework attempts to abstract and simplify the complex modeling process undertaken by expert experimentalists. The framework can be applied to understand typical processes such as the modeling of the measurement tools, modeling ``black boxes,'' and signal processing. We demonstrate that the framework captures several important features of model-based reasoning in a way that can reveal common student difficulties in the lab and guide the development of curricula that emphasize modeling in the laboratory. We also use the framework to examine troubleshooting in the lab and guide students to effective methods and strategies.
MicroRNAs and complex diseases: from experimental results to computational models.
Chen, Xing; Xie, Di; Zhao, Qi; You, Zhu-Hong
2017-10-17
MicroRNAs (miRNAs) have been discovered at a rapid pace in plants, green algae, viruses and animals. As one of the most important components in the cell, miRNAs play an increasingly important role in various essential biological processes. In recent decades, many experimental methods and computational models have been designed and implemented to identify novel miRNA-disease associations. In this review, the functions of miRNAs, miRNA-target interactions, miRNA-disease associations and some important publicly available miRNA-related databases are discussed in detail. In particular, given that an increasing number of miRNA-disease associations have been experimentally confirmed, we selected five important miRNA-related human diseases and five crucial disease-related miRNAs and provided corresponding introductions. Identifying disease-related miRNAs has become an important goal of biomedical research, which will accelerate the understanding of disease pathogenesis at the molecular level and the design of molecular tools for disease diagnosis, treatment and prevention. Computational models have become an important means for novel miRNA-disease association identification: they can select the most promising miRNA-disease pairs for experimental validation and significantly reduce the time and cost of biological experiments. Here, we reviewed 20 state-of-the-art computational models for predicting miRNA-disease associations from different perspectives. Finally, we summarized four important factors underlying the difficulty of predicting potential disease-related miRNAs, the framework for constructing powerful computational models to predict potential miRNA-disease associations, including five feasible and important research schemas, and future directions for further development of computational models. © The Author 2017. Published by Oxford University Press. All rights reserved.
For Permissions, please email: journals.permissions@oup.com.
2014-09-18
Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification. Dissertation, Jeffrey D. Morris. Presented to the Faculty, Department of Systems
Data-driven Modeling of Metal-oxide Sensors with Dynamic Bayesian Networks
NASA Astrophysics Data System (ADS)
Gosangi, Rakesh; Gutierrez-Osuna, Ricardo
2011-09-01
We present a data-driven probabilistic framework to model the transient response of MOX sensors modulated with a sequence of voltage steps. Analytical models of MOX sensors are usually built on the physico-chemical properties of the sensing materials. Although these models provide insight into sensor behavior, they require a thorough understanding of the underlying operating principles. Here we propose a data-driven approach to characterize the dynamical relationship between sensor inputs and outputs. Namely, we use dynamic Bayesian networks (DBNs), probabilistic models that represent temporal relations between a set of random variables. We identify a set of control variables that influence the sensor responses, create a graphical representation that captures the causal relations between these variables, and finally train the model with experimental data. We validated the approach on experimental data in terms of predictive accuracy and classification performance. Our results show that DBNs can accurately predict the dynamic response of MOX sensors, as well as capture the discriminatory information present in the sensor transients.
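The data-driven idea, learning input-conditioned temporal dynamics instead of physico-chemical equations, can be illustrated with a deliberately minimal stand-in: a first-order model whose transition parameters depend on a discrete voltage level. A full DBN would add hidden states and probabilistic inference; all numbers below (rate constant, targets) are synthetic, not real MOX data:

```python
from collections import defaultdict

def train(transitions):
    """transitions: list of (voltage_level, y_prev, y_next) triples.
    Fit y_next ~ a*y_prev + b separately per discrete voltage level via
    closed-form ordinary least squares. A first-order, discrete-input
    stand-in for the paper's dynamic Bayesian network."""
    buckets = defaultdict(list)
    for v, yp, yn in transitions:
        buckets[v].append((yp, yn))
    params = {}
    for v, pairs in buckets.items():
        n = len(pairs)
        sx = sum(p for p, _ in pairs); sy = sum(q for _, q in pairs)
        sxx = sum(p * p for p, _ in pairs); sxy = sum(p * q for p, q in pairs)
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        params[v] = (a, (sy - a * sx) / n)
    return params

def rollout(params, y0, voltage_seq):
    """Predict the transient response for a sequence of voltage steps."""
    ys = [y0]
    for v in voltage_seq:
        a, b = params[v]
        ys.append(a * ys[-1] + b)
    return ys

# Synthetic check: a first-order sensor relaxing toward a voltage-dependent
# steady state with rate 0.8 (hypothetical numbers).
data = []
for v, target in ((0, 1.0), (1, 3.0)):
    y = 0.0
    for _ in range(30):
        y_next = 0.8 * y + 0.2 * target
        data.append((v, y, y_next))
        y = y_next
params = train(data)
pred = rollout(params, 2.0, [0] * 60)   # transient after stepping back to level 0
```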
Want, Stephen C
2009-09-01
Experimental exposure to idealized media portrayals of women is thought to induce social comparisons in female viewers and thereby to be generally detrimental to female viewers' satisfaction with their own appearance. Through meta-analysis, the present paper examines the impact of moderators of this effect, some identified and updated from a prior meta-analysis and some that have hitherto received little attention. Participants' pre-existing appearance concerns and the processing instructions participants were given when exposed to media portrayals were found to significantly moderate effect sizes. With regard to processing instructions, a novel and counter-intuitive pattern was revealed; effect sizes were smallest when participants were instructed to focus on the appearance of women in media portrayals, and largest when participants processed the portrayals on a distracting, non-appearance dimension. These results are interpreted through a framework that suggests that social comparisons are automatic processes, the effects of which can be modified through conscious processing.
MetaboLights: An Open-Access Database Repository for Metabolomics Data.
Kale, Namrata S; Haug, Kenneth; Conesa, Pablo; Jayseelan, Kalaivani; Moreno, Pablo; Rocca-Serra, Philippe; Nainala, Venkata Chandrasekhar; Spicer, Rachel A; Williams, Mark; Li, Xuefei; Salek, Reza M; Griffin, Julian L; Steinbeck, Christoph
2016-03-24
MetaboLights is the first general purpose, open-access database repository for cross-platform and cross-species metabolomics research at the European Bioinformatics Institute (EMBL-EBI). Based upon the open-source ISA framework, MetaboLights provides Metabolomics Standard Initiative (MSI) compliant metadata and raw experimental data associated with metabolomics experiments. Users can upload their study datasets into the MetaboLights Repository. These studies are then automatically assigned a stable and unique identifier (e.g., MTBLS1) that can be used for publication reference. The MetaboLights Reference Layer associates metabolites with metabolomics studies in the archive and is extensively annotated with data fields such as structural and chemical information, NMR and MS spectra, target species, metabolic pathways, and reactions. The database is manually curated with no specific release schedules. MetaboLights is also recommended by journals for metabolomics data deposition. This unit provides a guide to using MetaboLights, downloading experimental data, and depositing metabolomics datasets using user-friendly submission tools. Copyright © 2016 John Wiley & Sons, Inc.
Harcombe, William R.; Riehl, William J.; Dukovski, Ilija; Granger, Brian R.; Betts, Alex; Lang, Alex H.; Bonilla, Gracia; Kar, Amrita; Leiby, Nicholas; Mehta, Pankaj; Marx, Christopher J.; Segrè, Daniel
2014-01-01
Summary: The inter-species exchange of metabolites plays a key role in the spatio-temporal dynamics of microbial communities. This raises the question of whether ecosystem-level behavior of structured communities can be predicted using genome-scale models of metabolism for multiple organisms. We developed a modeling framework that integrates dynamic flux balance analysis with diffusion on a lattice, and applied it to engineered consortia. First, we predicted, and experimentally confirmed, the species ratio to which a 2-species mutualistic consortium converges, and the equilibrium composition of a newly engineered 3-member community. We next identified a specific spatial arrangement of colonies, which gives rise to what we term the "eclipse dilemma": does a competitor placed between a colony and its cross-feeding partner benefit or hurt growth of the original colony? Our experimentally validated finding, that the net outcome is beneficial, highlights the complex nature of metabolic interactions in microbial communities, while at the same time demonstrating their predictability. PMID:24794435
Alkhatib, Omar J; Abdou, Alaa
2018-04-01
The construction industry is usually characterized as a fragmented system of multiple organizational entities in which members from different technical backgrounds and moral values join together to develop a particular business or project. The greatest challenge to successful practice in the construction process is the development of an outstanding reputation, which is built on identifying and applying an ethical framework. This framework should reflect a common ethical ground for the myriad people involved in the process, enabling them to survive and compete ethically in today's turbulent construction market. This study establishes a framework for ethical judgment of behavior and actions conducted in the construction process. The framework was primarily developed from the essential attributes of business management identified in the literature review and subsequently incorporates additional attributes identified to prevent breaches in the construction industry, along with common ethical values related to professional engineering. The proposed judgment framework is based primarily on the ethical dimension of professional responsibility. The Ethical Judgment Framework consists of descriptive approaches involving technical, professional, administrative, and miscellaneous terms. The framework provides the basis for judging actions as either ethical or unethical. Furthermore, the framework can be implemented as a form of preventive ethics, which would help avoid ethical dilemmas and moral allegations. The framework can be considered a decision-making model to guide actions and improve the ethical reasoning process, helping individuals think through possible implications and consequences of ethical dilemmas in the construction industry.
Review article: A systematic review of emergency department incident classification frameworks.
Murray, Matthew; McCarthy, Sally
2018-06-01
As in any part of the hospital system, safety incidents can occur in the ED. These incidents arguably have a distinct character, as the ED involves unscheduled flows of urgent patients who require disparate services. To aid understanding of safety issues and support risk management of the ED, a comparison of published ED-specific incident classification frameworks was performed. A review of emergency medicine, health management and general medical publications, using Ovid SP to interrogate Medline (1976-2016), was undertaken to identify any type of taxonomy or classification-like framework for ED-related incidents. These frameworks were then analysed and compared. The review identified 17 publications containing an incident classification framework. Comparison of the factors and themes making up the classification constituent elements revealed some commonality, but no overall consistency, nor evolution towards an ideal framework. Inconsistency arises from differences in the evidential basis and design methodology of classifications, with design itself being an inherently subjective process. It was not possible to identify an 'ideal' incident classification framework for ED risk management, and there is significant variation in the selection of categories used by frameworks. The variation in classification could risk an unbalanced emphasis in findings through application of a particular framework. Design of an ED-specific, ideal incident classification framework should be informed by a much wider range of theories of how organisations and systems work, in addition to clinical and human factors. © 2017 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
A Data Protection Framework for Learning Analytics
ERIC Educational Resources Information Center
Cormack, Andrew
2016-01-01
Most studies on the use of digital student data adopt an ethical framework derived from human-subject research, based on the informed consent of the experimental subject. However, consent gives universities little guidance on using learning analytics as a routine part of educational provision: which purposes are legitimate and which analyses…
The future for electrocoagulation as a localised water treatment technology.
Holt, Peter K; Barton, Geoffrey W; Mitchell, Cynthia A
2005-04-01
Electrocoagulation is an electrochemical method of treating polluted water whereby sacrificial anodes corrode to release active coagulant precursors (usually aluminium or iron cations) into solution. Accompanying electrolytic reactions evolve gas (usually as hydrogen bubbles) at the cathode. Electrocoagulation has a long history as a water treatment technology, having been employed to remove a wide range of pollutants. However, electrocoagulation has never become accepted as a 'mainstream' water treatment technology. The lack of a systematic approach to electrocoagulation reactor design/operation and the issue of electrode reliability (particularly passivation of the electrodes over time) have limited its implementation. However, recent technical improvements combined with a growing need for small-scale decentralised water treatment facilities have led to a re-evaluation of electrocoagulation. Starting with a review of electrocoagulation reactor design/operation, this article identifies a conceptual framework for electrocoagulation that focuses on the interactions between electrochemistry, coagulation and flotation. In addition, detailed experimental data are provided from a batch reactor system removing suspended solids, together with a mathematical analysis based on the 'white water' model for the dissolved air flotation process. Current density is identified as the key operational parameter influencing which pollutant removal mechanism dominates. The conclusion is drawn that electrocoagulation has a future as a decentralised water treatment technology. A conceptual framework is presented for future research directed towards a more mechanistic understanding of the process.
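Coagulant release at the sacrificial anode is governed by Faraday's law, which gives a quick sizing estimate for reactor design. A short sketch; the operating current and time are arbitrary example values:

```python
def coagulant_mass_g(current_a, time_s, molar_mass_g=26.98, z=3):
    """Faraday's law: mass of metal dissolved from a sacrificial anode,
    m = I*t*M / (z*F). Defaults are for aluminium (Al -> Al3+ + 3e-);
    use molar_mass_g=55.85, z=2 for ferrous iron."""
    F = 96485.0  # Faraday constant, C/mol
    return current_a * time_s * molar_mass_g / (z * F)

# Example: 2 A for 10 minutes releases about 0.11 g of aluminium coagulant.
dose = coagulant_mass_g(2.0, 600.0)
```

In practice the actual dose can exceed this estimate when chemical (non-electrolytic) corrosion also contributes, which is one reason current density alone does not fully fix the dosing.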
ScaffoldSeq: Software for characterization of directed evolution populations.
Woldring, Daniel R; Holec, Patrick V; Hackel, Benjamin J
2016-07-01
ScaffoldSeq is software designed for the numerous applications, including directed evolution analysis, in which a user generates a population of DNA sequences encoding for partially diverse proteins with related functions and would like to characterize the single-site and pairwise amino acid frequencies across the population. A common scenario for enzyme maturation, antibody screening, and alternative scaffold engineering involves naïve and evolved populations that contain diversified regions, varying in both sequence and length, within a conserved framework. Analyzing the diversified regions of such populations is facilitated by high-throughput sequencing platforms; however, length variability within these regions (e.g., antibody CDRs) encumbers the alignment process. To overcome this challenge, the ScaffoldSeq algorithm takes advantage of conserved framework sequences to quickly identify diverse regions. Beyond this, unintended biases in sequence frequency are generated throughout the experimental workflow required to evolve and isolate clones of interest prior to DNA sequencing. ScaffoldSeq software uniquely handles this issue by providing tools to quantify and remove background sequences, cluster similar protein families, and dampen the impact of dominant clones. The software produces graphical and tabular summaries for each region of interest, allowing users to evaluate diversity in a site-specific manner as well as identify epistatic pairwise interactions. The code and detailed information are freely available at http://research.cems.umn.edu/hackel. Proteins 2016; 84:869-874. © 2016 Wiley Periodicals, Inc.
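The flank-anchored extraction strategy described above can be sketched in a few lines: locate the conserved framework motifs, slice out the diversified region between them, and tally site-wise residue frequencies. The motifs and sequences below are invented toy data, and the sketch omits ScaffoldSeq's clustering and bias-dampening steps:

```python
from collections import Counter

def extract_variable_region(seq, left_flank, right_flank):
    """Pull out the diversified region between two conserved framework
    motifs; returns None if either flank is missing. Anchoring on flanks
    sidesteps global alignment of length-variable loops (simplified
    sketch, not ScaffoldSeq's actual algorithm)."""
    i = seq.find(left_flank)
    if i < 0:
        return None
    j = seq.find(right_flank, i + len(left_flank))
    if j < 0:
        return None
    return seq[i + len(left_flank):j]

def sitewise_frequencies(regions):
    """Per-position amino-acid counts for regions of equal length."""
    counters = [Counter() for _ in range(len(regions[0]))]
    for r in regions:
        for pos, aa in enumerate(r):
            counters[pos][aa] += 1
    return counters

# Toy population with an invented conserved framework MAEVK...QRGTL.
pop = ["MAEVKLWYQRGTL", "MAEVKGWYQRGTL", "MAEVKLFYQRGTL"]
regions = [extract_variable_region(s, "MAEVK", "QRGTL") for s in pop]
# regions == ["LWY", "GWY", "LFY"]
```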
Development of a framework to identify research gaps from systematic reviews.
Robinson, Karen A; Saldanha, Ian J; McKoy, Naomi A
2011-12-01
Our objective was to develop a framework to identify research gaps from systematic reviews. We reviewed the practices of (1) evidence-based practice centers (EPCs), and (2) other organizations that conduct evidence syntheses. We developed and pilot tested a framework for identifying research gaps. Four (33%) EPCs and three (8%) other organizations reported using an explicit framework to determine research gaps. Variations of the PICO (population, intervention, comparison, outcomes) framework were most common. We developed a framework incorporating both the characterization of the gap using PICOS elements (also including setting) and the identification of the reason(s) why the gap exists as (1) insufficient or imprecise information, (2) biased information, (3) inconsistency or unknown consistency, and (4) not the right information. We mapped each of these reasons to concepts from three common evidence-grading systems. Our framework determines from systematic reviews where the current evidence falls short and why or how the evidence falls short. This explicit identification of research gaps will allow systematic reviews to maximally inform the types of questions that need to be addressed and the types of studies needed to address the research gaps. Copyright © 2011 Elsevier Inc. All rights reserved.
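The framework's two components, a PICOS characterization of the gap plus one or more reason codes, map naturally onto a small data structure. A sketch; the field names follow the framework as described above, while the example gap content is hypothetical:

```python
from dataclasses import dataclass, field
from enum import Enum

class GapReason(Enum):
    """The four reasons the framework gives for why a gap exists."""
    INSUFFICIENT_OR_IMPRECISE = 1
    BIASED_INFORMATION = 2
    INCONSISTENT_OR_UNKNOWN_CONSISTENCY = 3
    NOT_RIGHT_INFORMATION = 4

@dataclass
class ResearchGap:
    """A research gap characterized by PICOS elements (PICO plus
    setting) and the reason(s) it exists."""
    population: str
    intervention: str
    comparison: str
    outcomes: str
    setting: str
    reasons: list = field(default_factory=list)

# Hypothetical example, not a gap reported in the study.
gap = ResearchGap(
    population="adults with type 2 diabetes",
    intervention="telehealth coaching",
    comparison="usual care",
    outcomes="HbA1c at 12 months",
    setting="primary care",
    reasons=[GapReason.INSUFFICIENT_OR_IMPRECISE],
)
```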
2014-01-01
Background: The ability of science to produce experimental data has outpaced the ability to effectively visualize and integrate the data into a conceptual framework that can further higher-order understanding. Multidimensional and shape-based observational data of regenerative biology presents a particularly daunting challenge in this regard. Large amounts of data are available in regenerative biology, but little progress has been made in understanding how organisms such as planaria robustly achieve and maintain body form. An example of this kind of data can be found in a new repository (PlanformDB) that encodes descriptions of planaria experiments and morphological outcomes using a graph formalism. Results: We are developing a model discovery framework that uses a cell-based modeling platform combined with evolutionary search to automatically search for and identify plausible mechanisms for the biological behavior described in PlanformDB. To automate the evolutionary search we developed a way to compare the output of the modeling platform to the morphological descriptions stored in PlanformDB. We used a flexible connected component algorithm to create a graph representation of the virtual worm from the robust, cell-based simulation data. These graphs can then be validated and compared with target data from PlanformDB using the well-known graph edit distance calculation, which provides a quantitative metric of similarity between graphs. The graph edit distance calculation was integrated into a fitness function that was able to guide automated searches for unbiased models of planarian regeneration. We present a cell-based model of planaria that can regenerate anatomical regions following bisection of the organism, and show that the automated model discovery framework is capable of searching for and finding models of planarian regeneration that match experimental data stored in PlanformDB.
Conclusion The work presented here, including our algorithm for converting cell-based models into graphs for comparison with data stored in an external data repository, has made feasible the automated development, training, and validation of computational models using morphology-based data. This work is part of an ongoing project to automate the search process, which will greatly expand our ability to identify, consider, and test biological mechanisms in the field of regenerative biology. PMID:24917489
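The graph-comparison step described above can be sketched with a crude set-difference proxy for graph edit distance. This toy stand-in (not the authors' implementation, which uses full graph edit distance) counts node and edge insertions/deletions and folds the result into a fitness score; all region names are hypothetical:

```python
def graph_distance(nodes_a, edges_a, nodes_b, edges_b):
    """Crude edit-distance proxy: count node and edge insertions/deletions
    needed to turn graph A into graph B (ignores substitutions)."""
    norm = lambda e: tuple(sorted(e))           # treat edges as undirected
    ea, eb = {norm(e) for e in edges_a}, {norm(e) for e in edges_b}
    return len(set(nodes_a) ^ set(nodes_b)) + len(ea ^ eb)

def fitness(d):
    """Higher fitness for smaller distance; 1.0 means an exact match
    (a common choice of mapping, not necessarily the paper's)."""
    return 1.0 / (1.0 + d)

# Hypothetical target morphology: head--trunk--tail; simulation lost the tail.
target = ({"head", "trunk", "tail"}, [("head", "trunk"), ("trunk", "tail")])
simulated = ({"head", "trunk"}, [("head", "trunk")])
d = graph_distance(target[0], target[1], simulated[0], simulated[1])
print(d, fitness(d))  # missing "tail" node and one edge -> distance 2
```

A distance of 0 (fitness 1.0) would indicate the simulated worm's region graph matches the PlanformDB target exactly.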
Spectra of conditionalization and typicality in the multiverse
NASA Astrophysics Data System (ADS)
Azhar, Feraz
2016-02-01
An approach to testing theories describing a multiverse, that has gained interest of late, involves comparing theory-generated probability distributions over observables with their experimentally measured values. It is likely that such distributions, were we indeed able to calculate them unambiguously, will assign low probabilities to any such experimental measurements. An alternative to thereby rejecting these theories, is to conditionalize the distributions involved by restricting attention to domains of the multiverse in which we might arise. In order to elicit a crisp prediction, however, one needs to make a further assumption about how typical we are of the chosen domains. In this paper, we investigate interactions between the spectra of available assumptions regarding both conditionalization and typicality, and draw out the effects of these interactions in a concrete setting; namely, on predictions of the total number of species that contribute significantly to dark matter. In particular, for each conditionalization scheme studied, we analyze how correlations between densities of different dark matter species affect the prediction, and explicate the effects of assumptions regarding typicality. We find that the effects of correlations can depend on the conditionalization scheme, and that in each case atypicality can significantly change the prediction. In doing so, we demonstrate the existence of overlaps in the predictions of different "frameworks" consisting of conjunctions of theory, conditionalization scheme and typicality assumption. This conclusion highlights the acute challenges involved in using such tests to identify a preferred framework that aims to describe our observational situation in a multiverse.
NASA Astrophysics Data System (ADS)
Toher, Cormac; Oses, Corey; Plata, Jose J.; Hicks, David; Rose, Frisco; Levy, Ohad; de Jong, Maarten; Asta, Mark; Fornari, Marco; Buongiorno Nardelli, Marco; Curtarolo, Stefano
2017-06-01
Thorough characterization of the thermomechanical properties of materials requires difficult and time-consuming experiments. This severely limits the availability of data and is one of the main obstacles for the development of effective accelerated materials design strategies. The rapid screening of new potential materials requires highly integrated, sophisticated, and robust computational approaches. We tackled the challenge by developing an automated, integrated workflow with robust error-correction within the AFLOW framework which combines the newly developed "Automatic Elasticity Library" with the previously implemented GIBBS method. The first extracts the mechanical properties from automatic self-consistent stress-strain calculations, while the latter employs those mechanical properties to evaluate the thermodynamics within the Debye model. This new thermoelastic workflow is benchmarked against a set of 74 experimentally characterized systems to pinpoint a robust computational methodology for the evaluation of bulk and shear moduli, Poisson ratios, Debye temperatures, Grüneisen parameters, and thermal conductivities of a wide variety of materials. The effect of different choices of equations of state and exchange-correlation functionals is examined and the optimum combination of properties for the Leibfried-Schlömann prediction of thermal conductivity is identified, leading to better agreement with experimental results than the GIBBS-only approach. The framework has been applied to the AFLOW.org data repositories to compute the thermoelastic properties of over 3500 unique materials. The results are now available online through an expanded version of the REST-API described in the Appendix.
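The Debye-model thermodynamics used in the GIBBS step can be illustrated with the textbook Debye heat capacity integral (a standard formula, not AFLOW code): C_V/(3Nk_B) = 3(T/θ_D)³ ∫₀^{θ_D/T} x⁴eˣ/(eˣ-1)² dx, evaluated here by midpoint-rule quadrature:

```python
import math

def debye_cv_ratio(t_over_theta, n=2000):
    """C_V / (3 N k_B) in the Debye model, via numerical integration
    of the standard Debye integral (midpoint rule)."""
    if t_over_theta <= 0:
        return 0.0
    xmax = 1.0 / t_over_theta
    h = xmax / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h   # midpoint avoids x = 0, where the integrand -> x^2
        ex = math.exp(x)
        total += x**4 * ex / (ex - 1.0)**2 * h
    return 3.0 * t_over_theta**3 * total

# High-temperature (Dulong-Petit) limit: the ratio approaches 1.
print(debye_cv_ratio(50.0))
# Low temperature: the ratio falls off as (T / theta_D)^3.
print(debye_cv_ratio(0.1))
```

Given a Debye temperature from the elastic moduli, this one integral yields the vibrational heat capacity at any temperature, which is the kind of quantity the GIBBS method evaluates in bulk.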
Computational Design of Functionalized Metal–Organic Framework Nodes for Catalysis
2017-01-01
Recent progress in the synthesis and characterization of metal–organic frameworks (MOFs) has opened the door to an increasing number of possible catalytic applications. The great versatility of MOFs creates a large chemical space, whose thorough experimental examination becomes practically impossible. Therefore, computational modeling is a key tool to support, rationalize, and guide experimental efforts. In this outlook we survey the main methodologies employed to model MOFs for catalysis, and we review selected recent studies on the functionalization of their nodes. We pay special attention to catalytic applications involving natural gas conversion. PMID:29392172
Inferential Framework for Autonomous Cryogenic Loading Operations
NASA Technical Reports Server (NTRS)
Luchinsky, Dmitry G.; Khasin, Michael; Timucin, Dogan; Sass, Jared; Perotti, Jose; Brown, Barbara
2017-01-01
We address the problem of autonomous management of cryogenic loading operations on the ground and in space. As a step towards a solution of this problem, we develop a probabilistic framework for inferring the correlation parameters of two-fluid cryogenic flow. The simulation of two-phase cryogenic flow is performed using a nearly-implicit scheme. A concise set of cryogenic correlations is introduced. The proposed approach is applied to an analysis of the cryogenic flow in the experimental Propellant Loading System built at NASA KSC. An efficient simultaneous optimization of a large number of model parameters is demonstrated, and good agreement with the experimental data is obtained.
A framework for testing and comparing binaural models.
Dietz, Mathias; Lestang, Jean-Hugues; Majdak, Piotr; Stern, Richard M; Marquardt, Torsten; Ewert, Stephan D; Hartmann, William M; Goodman, Dan F M
2018-03-01
Auditory research has a rich history of combining experimental evidence with computational simulations of auditory processing in order to deepen our theoretical understanding of how sound is processed in the ears and in the brain. Despite significant progress in the amount of detail and breadth covered by auditory models, for many components of the auditory pathway there are still different model approaches that are often not equivalent but rather in conflict with each other. Similarly, some experimental studies yield conflicting results, which has led to controversies. These can best be resolved by a systematic comparison of multiple experimental data sets and model approaches. Binaural processing is a prominent example of how the development of quantitative theories can advance our understanding of the phenomena, but there remain several unresolved questions for which competing model approaches exist. This article discusses a number of current unresolved or disputed issues in binaural modelling, as well as some of the significant challenges in comparing binaural models with each other and with the experimental data. We introduce an auditory model framework, which we believe can become a useful infrastructure for resolving some of the current controversies. It operates models over the same paradigms that are used experimentally. The core of the proposed framework is an interface that connects three components irrespective of their underlying programming language: the experiment software, an auditory pathway model, and task-dependent decision stages called artificial observers that provide the same output format as the test subject. Copyright © 2017 Elsevier B.V. All rights reserved.
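The three-component interface described above (experiment software, pathway model, artificial observer) can be sketched in a few lines. Everything here is a hypothetical minimal example of the interface idea, not the framework's actual API; the model is reduced to a single interaural time difference (ITD) cue:

```python
class AuditoryModel:
    """Stand-in for an auditory pathway model: maps a binaural stimulus to an
    internal representation. Here: just the interaural time difference."""
    def process(self, stimulus):
        # Positive ITD: the left-ear signal lags, i.e. the sound leads on the right.
        return {"itd": stimulus["left_delay"] - stimulus["right_delay"]}

class ArtificialObserver:
    """Task-dependent decision stage producing the same response format a
    human test subject would (here: 'left'/'right' lateralization)."""
    def decide(self, representation):
        return "right" if representation["itd"] > 0 else "left"

def run_trial(stimulus, model, observer):
    """Experiment-software side: present a stimulus, collect a response.
    Swapping in a different model or observer requires no other changes."""
    return observer.decide(model.process(stimulus))

# A 0.5 ms delay at the left ear lateralizes the sound to the right.
resp = run_trial({"left_delay": 0.0005, "right_delay": 0.0},
                 AuditoryModel(), ArtificialObserver())
print(resp)
```

Because model and observer only meet through the representation dictionary and the response string, either side can be replaced independently, which is the point of such an interface.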
NASA Astrophysics Data System (ADS)
Matsubara, Masahiko; Bellotti, Enrico
2017-05-01
Various forms of carbon-based complexes in GaN are studied with first-principles calculations employing Heyd-Scuseria-Ernzerhof hybrid functionals within the framework of density functional theory. We consider carbon complexes made of the combinations of single impurities, i.e., CN-CGa, CI-CN , and CI-CGa , where CN, CGa , and CI denote C substituting nitrogen, C substituting gallium, and interstitial C, respectively, and of neighboring gallium/nitrogen vacancies ( VGa / VN ), i.e., CN-VGa and CGa-VN . Formation energies are computed for all these configurations with different charge states after full geometry optimizations. From our calculated formation energies, thermodynamic transition levels are evaluated, which are related to the thermal activation energies observed in experimental techniques such as deep level transient spectroscopy. Furthermore, the lattice relaxation energies (Franck-Condon shift) are computed to obtain optical activation energies, which are observed in experimental techniques such as deep level optical spectroscopy. We compare our calculated values of activation energies with the energies of experimentally observed C-related trap levels and identify the physical origins of these traps, which were previously unknown.
Inference of neuronal network spike dynamics and topology from calcium imaging data
Lütcke, Henry; Gerhard, Felipe; Zenke, Friedemann; Gerstner, Wulfram; Helmchen, Fritjof
2013-01-01
Two-photon calcium imaging enables functional analysis of neuronal circuits by inferring action potential (AP) occurrence (“spike trains”) from cellular fluorescence signals. It remains unclear how experimental parameters such as signal-to-noise ratio (SNR) and acquisition rate affect spike inference and whether additional information about network structure can be extracted. Here we present a simulation framework for quantitatively assessing how well spike dynamics and network topology can be inferred from noisy calcium imaging data. For simulated AP-evoked calcium transients in neocortical pyramidal cells, we analyzed the quality of spike inference as a function of SNR and data acquisition rate using a recently introduced peeling algorithm. Given experimentally attainable values of SNR and acquisition rate, neural spike trains could be reconstructed accurately and with up to millisecond precision. We then applied statistical neuronal network models to explore how remaining uncertainties in spike inference affect estimates of network connectivity and topological features of network organization. We define the experimental conditions suitable for inferring whether the network has a scale-free structure and determine how well hub neurons can be identified. Our findings provide a benchmark for future calcium imaging studies that aim to reliably infer neuronal network properties. PMID:24399936
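The simulation-and-inference loop described above can be caricatured in a few lines: generate AP-evoked transients at a chosen SNR, then recover spike times from the noisy trace. This is a deliberately crude toy (a threshold on the first difference, not the peeling algorithm the authors use), and all parameter values are illustrative:

```python
import math
import random

random.seed(0)

def simulate_trace(spike_times, n, snr, tau=10):
    """Toy AP-evoked calcium trace: unit-amplitude exponential transient per
    spike, plus Gaussian noise with standard deviation 1/snr."""
    trace = [0.0] * n
    for t in range(n):
        for s in spike_times:
            if t >= s:
                trace[t] += math.exp(-(t - s) / tau)
        trace[t] += random.gauss(0.0, 1.0 / snr)
    return trace

def infer_spikes(trace, threshold=0.5):
    """Crude inference: report a spike wherever the trace jumps by more
    than `threshold` in one sample."""
    return [t for t in range(1, len(trace))
            if trace[t] - trace[t - 1] > threshold]

true_spikes = [20, 60]
trace = simulate_trace(true_spikes, 100, snr=10.0)
print(infer_spikes(trace))
```

Rerunning with lower `snr` (larger noise) is the simplest way to see inference quality degrade, which is the dependence the paper quantifies systematically.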
Makarava, Natallia; Menz, Stephan; Theves, Matthias; Huisinga, Wilhelm; Beta, Carsten; Holschneider, Matthias
2014-10-01
Amoebae explore their environment in a random way, unless external cues such as nutrients bias their motion. Even in the absence of cues, however, experimental cell tracks show some degree of persistence. In this paper, we analyzed individual cell tracks in the framework of a linear mixed effects model, where each track is modeled by a fractional Brownian motion, i.e., a Gaussian process exhibiting a long-term correlation structure superposed on a linear trend. The degree of persistence was quantified by the Hurst exponent of fractional Brownian motion. Our analysis of experimental cell tracks of the amoeba Dictyostelium discoideum showed a persistent movement for the majority of tracks. Employing a sliding window approach, we estimated the variations of the Hurst exponent over time, which allowed us to identify points in time where the correlation structure was distorted ("outliers"). Coarse graining of track data via down-sampling allowed us to identify the dependence of persistence on the spatial scale. While one would expect the (mode of the) Hurst exponent to be constant on different temporal scales due to the self-similarity property of fractional Brownian motion, we observed a trend towards stronger persistence for the down-sampled cell tracks, indicating stronger persistence on larger time scales.
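A simple way to estimate the Hurst exponent from a single track uses the scaling of increment variance, Var[X(t+τ) − X(t)] ∝ τ^(2H) for fractional Brownian motion. The sketch below is one common estimator, not necessarily the one used in the paper (which fits a linear mixed effects model); H = 0.5 corresponds to ordinary, non-persistent Brownian motion:

```python
import math
import random

random.seed(1)

def hurst_exponent(track, lags=(1, 2, 4, 8, 16)):
    """Estimate H from the scaling of increment variance with lag:
    the log-log slope of Var[X(t+lag) - X(t)] vs lag equals 2H."""
    xs, ys = [], []
    for lag in lags:
        diffs = [track[i + lag] - track[i] for i in range(len(track) - lag)]
        mean = sum(diffs) / len(diffs)
        var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
        xs.append(math.log(lag))
        ys.append(math.log(var))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope / 2.0

# Ordinary Brownian motion (uncorrelated increments) should give H near 0.5;
# persistent amoeba tracks would give H > 0.5.
walk = [0.0]
for _ in range(5000):
    walk.append(walk[-1] + random.gauss(0.0, 1.0))
print(round(hurst_exponent(walk), 2))
```

Running this estimator in a sliding window over a track is the natural way to reproduce the paper's time-resolved persistence analysis.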
The income and health effects of tribal casino gaming on American Indians.
Wolfe, Barbara; Jakubowski, Jessica; Haveman, Robert; Courey, Marissa
2012-05-01
The legalization of American Indian casino gaming in the late 1980s allows examination of the relationship between income and health in a quasi-experimental way. Revenue from gaming accrues to individual tribes and has been used both to supplement tribe members' income and to finance tribal infrastructure. We assembled annual data from 1988-2003 on tribal gaming, health care access (from the Area Resource File), and individual health and socioeconomic characteristics data (from the Behavioral Risk Factors Surveillance System). We use this information within a structural, difference-in-differences framework to study the effect of casino gaming on tribal members' income, health status, access to health care, and health-related behaviors. Our difference-in-differences framework relies on before-after comparisons among American Indians whose tribe has at some time operated a casino and with-without comparisons between American Indians whose tribe has and those whose tribe has not initiated gaming. Our results provide identified estimates of the positive effect of gaming on American Indian income and on several indicators of American Indian health, health-related behaviors, and access to health care.
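The core of the identification strategy above is the canonical 2x2 difference-in-differences contrast, which is short enough to write out. The numbers below are invented for illustration only; the paper's actual estimation is a richer structural model:

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Canonical difference-in-differences estimate: the before-after change
    among the treated minus the change among the untreated over the same
    period, which nets out common time trends."""
    mean = lambda xs: sum(xs) / len(xs)
    return ((mean(treat_post) - mean(treat_pre))
            - (mean(ctrl_post) - mean(ctrl_pre)))

# Hypothetical incomes (thousands): gaming tribes vs. non-gaming tribes.
gaming_pre, gaming_post = [20, 22, 21], [26, 28, 27]
other_pre, other_post = [20, 21, 22], [22, 23, 24]
print(diff_in_diff(gaming_pre, gaming_post, other_pre, other_post))  # 4.0
```

Here the gaming group gained 6 while the comparison group gained 2, so 4 is attributed to casino operation under the parallel-trends assumption.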
Multi-Domain Transfer Learning for Early Diagnosis of Alzheimer's Disease.
Cheng, Bo; Liu, Mingxia; Shen, Dinggang; Li, Zuoyong; Zhang, Daoqiang
2017-04-01
Recently, transfer learning has been successfully applied in early diagnosis of Alzheimer's Disease (AD) based on multi-domain data. However, most of existing methods only use data from a single auxiliary domain, and thus cannot utilize the intrinsic useful correlation information from multiple domains. Accordingly, in this paper, we consider the joint learning of tasks in multi-auxiliary domains and the target domain, and propose a novel Multi-Domain Transfer Learning (MDTL) framework for early diagnosis of AD. Specifically, the proposed MDTL framework consists of two key components: 1) a multi-domain transfer feature selection (MDTFS) model that selects the most informative feature subset from multi-domain data, and 2) a multi-domain transfer classification (MDTC) model that can identify disease status for early AD detection. We evaluate our method on 807 subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database using baseline magnetic resonance imaging (MRI) data. The experimental results show that the proposed MDTL method can effectively utilize multi-auxiliary domain data for improving the learning performance in the target domain, compared with several state-of-the-art methods.
Multilevel microvibration test for performance predictions of a space optical load platform
NASA Astrophysics Data System (ADS)
Li, Shiqi; Zhang, Heng; Liu, Shiping; Wang, Yue
2018-05-01
This paper presents a framework for the multilevel microvibration analysis and test of a space optical load platform. The test framework is conducted on three levels: instrument, subsystem, and system level. Disturbance source experimental investigations are performed to evaluate the vibration amplitude and study the vibration mechanism. Transfer characteristics of the space camera are validated by a subsystem test, which allows the calculation of transfer functions from various disturbance sources to optical performance outputs. In order to identify the influence of the source on the spacecraft performance, a system level microvibration measurement test has been performed on the ground. From the time domain analysis and spectrum analysis of the multilevel microvibration tests, we concluded that the disturbance source has a significant effect at its installation position. After being transmitted through mechanical links, the residual vibration reduces to a background noise level. In addition, the angular microvibration of the platform jitter is mainly concentrated in rotation about the y-axis. This work is applied to a real practical application involving a high resolution satellite camera system.
Targeted Single-Site MOF Node Modification: Trivalent Metal Loading via Atomic Layer Deposition
Kim, In Soo; Borycz, Joshua; Platero-Prats, Ana E.; ...
2015-07-02
Postsynthetic functionalization of metal organic frameworks (MOFs) enables the controlled, high-density incorporation of new atoms on a crystallographically precise framework. Leveraging the broad palette of known atomic layer deposition (ALD) chemistries, ALD in MOFs (AIM) is one such targeted approach to construct diverse, highly functional, few-atom clusters. In this paper, we demonstrate the saturating reaction of trimethylindium (InMe 3) with the node hydroxyls and ligated water of NU-1000, which takes place without significant loss of MOF crystallinity or internal surface area. We computationally identify the elementary steps by which trimethylated trivalent metal compounds (ALD precursors) react with this Zr-based MOF node to generate a uniform and well characterized new surface layer on the node itself, and we predict a final structure that is fully consistent with experimental X-ray pair distribution function (PDF) analysis. Finally, we further demonstrate tunable metal loading through controlled number density of the reactive handles (–OH and –OH 2) achieved through node dehydration at elevated temperatures.
Patterning in time and space: HoxB cluster gene expression in the developing chick embryo.
Gouveia, Analuce; Marcelino, Hugo M; Gonçalves, Lisa; Palmeirim, Isabel; Andrade, Raquel P
2015-01-01
The developing embryo is a paradigmatic model to study molecular mechanisms of time control in Biology. Hox genes are key players in the specification of tissue identity during embryo development and their expression is under strict temporal regulation. However, the molecular mechanisms underlying timely Hox activation in the early embryo remain unknown. This is hindered by the lack of a rigorous temporal framework of sequential Hox expression within a single cluster. Herein, a thorough characterization of HoxB cluster gene expression was performed over time and space in the early chick embryo. Clear temporal collinearity of HoxB cluster gene expression activation was observed. Spatial collinearity of HoxB expression was evidenced in different stages of development and in multiple tissues. Using embryo explant cultures we showed that HoxB2 is cyclically expressed in the rostral presomitic mesoderm with the same periodicity as somite formation, suggesting a link between timely tissue specification and somite formation. We foresee that the molecular framework herein provided will facilitate experimental approaches aimed at identifying the regulatory mechanisms underlying Hox expression in Time and Space.
Osypiuk, Kamila; Thompson, Evan; Wayne, Peter M.
2018-01-01
Dynamic and static body postures are a defining characteristic of mind-body practices such as Tai Chi and Qigong (TCQ). A growing body of evidence supports the hypothesis that TCQ may be beneficial for psychological health, including management and prevention of depression and anxiety. Although a variety of causal factors have been identified as potential mediators of such health benefits, physical posture, despite its visible prominence, has been largely overlooked. We hypothesize that body posture while standing and/or moving may be a key therapeutic element mediating the influence of TCQ on psychological health. In the present paper, we summarize existing experimental and observational evidence that suggests a bi-directional relationship between body posture and mental states. Drawing from embodied cognitive science, we provide a theoretical framework for further investigation into this interrelationship. We discuss the challenges involved in such an investigation and propose suggestions for future studies. Despite theoretical and practical challenges, we propose that the role of posture in mind-body exercises such as TCQ should be considered in future research. PMID:29765313
Joint Multi-Leaf Segmentation, Alignment, and Tracking for Fluorescence Plant Videos.
Yin, Xi; Liu, Xiaoming; Chen, Jin; Kramer, David M
2018-06-01
This paper proposes a novel framework for fluorescence plant video processing. The plant research community is interested in the leaf-level photosynthetic analysis within a plant. A prerequisite for such analysis is to segment all leaves, estimate their structures, and track them over time. We identify this as a joint multi-leaf segmentation, alignment, and tracking problem. First, leaf segmentation and alignment are applied on the last frame of a plant video to find a number of well-aligned leaf candidates. Second, leaf tracking is applied on the remaining frames with leaf candidate transformation from the previous frame. We form two optimization problems with shared terms in their objective functions for leaf alignment and tracking respectively. A quantitative evaluation framework is formulated to evaluate the performance of our algorithm with four metrics. Two models are learned to predict the alignment accuracy and detect tracking failure respectively in order to provide guidance for subsequent plant biology analysis. The limitation of our algorithm is also studied. Experimental results show the effectiveness, efficiency, and robustness of the proposed method.
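Quantitative evaluation of leaf segmentation commonly rests on an overlap score such as intersection-over-union. The sketch below shows that standard metric on flattened binary masks; it is a generic illustration, not necessarily one of the four metrics the paper defines:

```python
def iou(mask_a, mask_b):
    """Intersection-over-union of two binary masks (flattened to lists):
    1.0 means identical segmentations, 0.0 means no overlap."""
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    union = sum(1 for a, b in zip(mask_a, mask_b) if a or b)
    return inter / union if union else 1.0

# Hypothetical 6-pixel masks for one leaf: predicted vs. ground truth.
predicted = [1, 1, 1, 0, 0, 0]
ground_truth = [0, 1, 1, 1, 0, 0]
print(iou(predicted, ground_truth))  # 2 overlapping pixels / 4 in union = 0.5
```

Per-leaf scores like this, averaged over frames, give the kind of number needed to compare alignment accuracy and to flag tracking failures.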
Post-processing interstitialcy diffusion from molecular dynamics simulations
NASA Astrophysics Data System (ADS)
Bhardwaj, U.; Bukkuru, S.; Warrier, M.
2016-01-01
An algorithm to rigorously trace the interstitialcy diffusion trajectory in crystals is developed. The algorithm incorporates unsupervised learning and graph optimization, which obviate the need to input extra domain-specific information depending on the crystal or the temperature of the simulation. The algorithm is implemented in a flexible framework as a post-processor to molecular dynamics (MD) simulations. We describe in detail the reduction of interstitialcy diffusion into the known computational problems of unsupervised clustering and graph optimization. We also discuss the steps, computational efficiency, and key components of the algorithm. Using the algorithm, thermal interstitialcy diffusion from low to near-melting-point temperatures is studied. We encapsulate the algorithms in a modular framework with functionality to calculate diffusion coefficients, migration energies, and other trajectory properties. The study validates the algorithm by establishing the conformity of output parameters with experimental values and provides detailed insights into the interstitialcy diffusion mechanism. The algorithm, together with supporting visualizations and analysis, provides a new approach to quantifying diffusion jumps, jump lengths, and times between jumps, and to identifying interstitials among lattice atoms.
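One of the trajectory properties mentioned above, the diffusion coefficient, follows from the Einstein relation D = MSD(t)/(2·dim·t). The sketch below computes a time-origin-averaged mean squared displacement from a toy trajectory; the deterministic example track is purely to exercise the arithmetic (a real MD trajectory would be stochastic), and this is not the framework's code:

```python
def msd(track, lag):
    """Mean squared displacement at a given lag, averaged over time origins."""
    disps = [sum((a - b) ** 2 for a, b in zip(track[i + lag], track[i]))
             for i in range(len(track) - lag)]
    return sum(disps) / len(disps)

def diffusion_coefficient(track, lag, dt, dim=3):
    """Einstein relation: D = MSD(lag) / (2 * dim * lag * dt)."""
    return msd(track, lag) / (2 * dim * lag * dt)

# Toy 3D trajectory: one unit jump along x per step of dt = 1.
track = [(float(i), 0.0, 0.0) for i in range(10)]
print(diffusion_coefficient(track, lag=2, dt=1.0))  # 4 / 12 = 0.333...
```

In practice one fits MSD over a range of lags in the linear (diffusive) regime rather than using a single lag, and averages over many interstitial trajectories.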
Functional materials discovery using energy-structure-function maps
NASA Astrophysics Data System (ADS)
Pulido, Angeles; Chen, Linjiang; Kaczorowski, Tomasz; Holden, Daniel; Little, Marc A.; Chong, Samantha Y.; Slater, Benjamin J.; McMahon, David P.; Bonillo, Baltasar; Stackhouse, Chloe J.; Stephenson, Andrew; Kane, Christopher M.; Clowes, Rob; Hasell, Tom; Cooper, Andrew I.; Day, Graeme M.
2017-03-01
Molecular crystals cannot be designed in the same manner as macroscopic objects, because they do not assemble according to simple, intuitive rules. Their structures result from the balance of many weak interactions, rather than from the strong and predictable bonding patterns found in metal-organic frameworks and covalent organic frameworks. Hence, design strategies that assume a topology or other structural blueprint will often fail. Here we combine computational crystal structure prediction and property prediction to build energy-structure-function maps that describe the possible structures and properties that are available to a candidate molecule. Using these maps, we identify a highly porous solid, which has the lowest density reported for a molecular crystal so far. Both the structure of the crystal and its physical properties, such as methane storage capacity and guest-molecule selectivity, are predicted using the molecular structure as the only input. More generally, energy-structure-function maps could be used to guide the experimental discovery of materials with any target function that can be calculated from predicted crystal structures, such as electronic structure or mechanical properties.
Detection of epistatic effects with logic regression and a classical linear regression model.
Malina, Magdalena; Ickstadt, Katja; Schwender, Holger; Posch, Martin; Bogdan, Małgorzata
2014-02-01
To locate multiple interacting quantitative trait loci (QTL) influencing a trait of interest within experimental populations, methods such as Cockerham's model are usually applied. Within this framework, interactions are understood as the part of the joint effect of several genes that cannot be explained as the sum of their additive effects. However, if a change in the phenotype (such as disease) is caused by Boolean combinations of genotypes of several QTL, the Cockerham approach is often not capable of identifying them properly. To detect such interactions more efficiently, we propose a logic regression framework. Even though the logic regression approach requires a larger number of models to be considered (and thus a more stringent multiple testing correction), the efficient representation of higher-order logic interactions in logic regression models leads to a significant increase in power to detect such interactions compared with Cockerham's approach. The increase in power is demonstrated analytically for a simple two-way interaction model and illustrated in more complex settings with a simulation study and real data analysis.
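As a minimal, hypothetical illustration of the point above (not the authors' implementation), the sketch below compares an additive model with a logic-regression-style model on a phenotype generated by a Boolean OR of two binary markers; the additive fit leaves residual error that the single logic feature removes entirely:

```python
def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                for k in range(c, n + 1):
                    M[r][k] -= f * M[c][k]
    return [M[i][n] / M[i][i] for i in range(n)]

def ols_rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit via normal equations."""
    n, p = len(X), len(X[0])
    XtX = [[sum(X[i][u] * X[i][v] for i in range(n)) for v in range(p)] for u in range(p)]
    Xty = [sum(X[i][u] * y[i] for i in range(n)) for u in range(p)]
    beta = solve(XtX, Xty)
    return sum((y[i] - sum(beta[j] * X[i][j] for j in range(p))) ** 2 for i in range(n))

# phenotype caused by a Boolean OR of two binary markers (one subject per genotype)
geno = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [float(x1 or x2) for x1, x2 in geno]
rss_additive = ols_rss([[1.0, x1, x2] for x1, x2 in geno], y)        # additive main effects
rss_logic = ols_rss([[1.0, float(x1 or x2)] for x1, x2 in geno], y)  # one logic feature
```

Here the additive model cannot reproduce the OR truth table (its residual sum of squares stays positive), while the logic feature fits it exactly.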
Post-processing interstitialcy diffusion from molecular dynamics simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhardwaj, U., E-mail: haptork@gmail.com; Bukkuru, S.; Warrier, M.
2016-01-15
An algorithm to rigorously trace the interstitialcy diffusion trajectory in crystals is developed. The algorithm incorporates unsupervised learning and graph optimization, which obviate the need to input extra domain-specific information such as the crystal type or the temperature of the simulation. The algorithm is implemented in a flexible framework as a post-processor to molecular dynamics (MD) simulations. We describe in detail the reduction of interstitialcy diffusion to the known computational problems of unsupervised clustering and graph optimization. We also discuss the steps, computational efficiency and key components of the algorithm. Using the algorithm, thermal interstitialcy diffusion from low to near-melting-point temperatures is studied. We encapsulate the algorithms in a modular framework with functionality to calculate diffusion coefficients, migration energies and other trajectory properties. The study validates the algorithm by establishing the conformity of output parameters with experimental values and provides detailed insights into the interstitialcy diffusion mechanism. The algorithm, with the help of supporting visualizations and analysis, gives a new approach to quantifying diffusion jumps, jump lengths and times between jumps, and to identifying interstitials among lattice atoms.
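The idea of identifying interstitials from lattice atoms without extra domain input can be sketched as a clustering problem: assign each atom to its nearest perfect-lattice site, and flag sites claimed by two atoms as interstitialcy (dumbbell) sites and unclaimed sites as vacancies. This toy brute-force version on a simple-cubic lattice with hypothetical coordinates is a simplification, not the paper's algorithm:

```python
import itertools

def classify_defects(lattice_sites, atoms):
    """Assign each atom to its nearest lattice site (brute force);
    sites holding two atoms flag an interstitialcy, empty sites a vacancy."""
    occupancy = {i: [] for i in range(len(lattice_sites))}
    for j, a in enumerate(atoms):
        i = min(range(len(lattice_sites)),
                key=lambda s: sum((a[d] - lattice_sites[s][d]) ** 2 for d in range(3)))
        occupancy[i].append(j)
    interstitial_sites = [i for i, occ in occupancy.items() if len(occ) > 1]
    vacancies = [i for i, occ in occupancy.items() if not occ]
    return interstitial_sites, vacancies

# toy 3x3x3 simple-cubic lattice with one extra atom forming a dumbbell at site 0
sites = [(float(x), float(y), float(z))
         for x, y, z in itertools.product(range(3), repeat=3)]
atoms = [s for s in sites]
atoms[0] = (0.2, 0.0, 0.0)          # displaced host atom
atoms.append((-0.2, 0.0, 0.0))      # extra atom sharing the same site
inter, vac = classify_defects(sites, atoms)
```

A production post-processor would use a spatial index instead of the O(N·M) brute-force search, but the site-occupancy bookkeeping is the same.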
Self-contained image mapping of placental vasculature in 3D ultrasound-guided fetoscopy.
Yang, Liangjing; Wang, Junchen; Ando, Takehiro; Kubota, Akihiro; Yamashita, Hiromasa; Sakuma, Ichiro; Chiba, Toshio; Kobayashi, Etsuko
2016-09-01
Surgical navigation technology directed at fetoscopic procedures is relatively underdeveloped compared with other forms of endoscopy. The narrow fetoscopic field of view and the vast vascular network on the placenta make examination and photocoagulation treatment of twin-to-twin transfusion syndrome challenging. Though ultrasonography is used for intraoperative guidance, its navigational ability is not fully exploited. This work aims to integrate 3D ultrasound imaging and endoscopic vision seamlessly for placental vasculature mapping through a self-contained framework without external navigational devices. This is achieved through the development, integration, and experimental evaluation of novel navigational modules. Firstly, a framework design that addresses the current limitations based on identified gaps is conceptualized. Secondly, navigational modules including (1) ultrasound-based localization, (2) image alignment, and (3) vision-based tracking to update the scene texture map are integrated. This updated texture map is projected onto an ultrasound-constructed 3D model for photorealistic texturing of the 3D scene, creating a panoramic view from the moving fetoscope. In addition, a collaborative scheme for the integration of the modular workflow system is proposed to schedule updates in a systematic fashion. Finally, experiments are carried out to evaluate each modular variation and the integrated collaborative scheme of the framework. The modules and the collaborative scheme are evaluated through a series of phantom experiments with controlled trajectories for repeatability. The collaborative framework demonstrated the best accuracy (5.2% RMS error) compared with all three single-module variations during the experiment. Validation on an ex vivo monkey placenta shows visual continuity of the freehand fetoscopic panorama.
The proposed developed collaborative framework and the evaluation study of the framework variations provide analytical insights for effective integration of ultrasonography and endoscopy. This contributes to the development of navigation techniques in fetoscopic procedures and can potentially be extended to other applications in intraoperative imaging.
DOT National Transportation Integrated Search
1993-12-01
This report presents a comprehensive modeling framework for user responses to Advanced Traveler Information Systems (ATIS) services and identifies the data needs for the validation of such a framework. The authors present overviews of the framework b...
An Ecosystem Evaluation Framework for Global Seamount Conservation and Management
Taranto, Gerald H.; Kvile, Kristina Ø.; Pitcher, Tony J.; Morato, Telmo
2012-01-01
In the last twenty years, several global targets for the protection of marine biodiversity have been adopted but have not been met. The Convention on Biological Diversity (CBD) aims at preserving 10% of all marine biomes by 2020. To achieve this goal, ecologically or biologically significant areas (EBSA) have to be identified in all biogeographic regions. However, the methodologies for identifying the most suitable areas have yet to be agreed upon. Here, we propose a framework for applying the CBD criteria to locate potential ecologically or biologically significant seamount areas based on the best information currently available. The framework combines the likelihood of a seamount constituting an EBSA with its level of human impact and can be used at global, regional and local scales. This methodology allows the classification of individual seamounts into four major portfolio conservation categories, which can help optimize management efforts toward the protection of the most suitable areas. The framework was tested against 1000 dummy seamounts and satisfactorily assigned seamounts to the proper EBSA and threat categories. Additionally, the framework was applied to eight case study seamounts that fell into three out of four portfolio categories: areas highly likely to be identified as EBSA with a high degree of threat; areas highly likely to be EBSA with a low degree of threat; and areas with a low likelihood of being EBSA with a high degree of threat. This framework will allow managers to identify seamount EBSAs and to prioritize their policies in terms of protecting undisturbed areas, disturbed areas for recovery of habitats and species, or both, based on their management objectives. It also identifies seamount EBSAs and threats considering different ecological groups in both pelagic and benthic communities. Therefore, this framework may represent an important tool for mitigating seamount biodiversity loss and achieving the 2020 CBD goals. PMID:22905190
Multinational Experiment 7. Outcome 3 - Cyber Domain. Objective 3.3: Concept Framework Version 3.0
2012-10-03
experimentation in order to give some parameters for Decision Makers’ actions. A.5 DIFFERENT LEGAL FRAMEWORKS The juridical framework to which we refer, in...material effects (e.g. psychological impact), economic et al, or, especially in the military field, it may affect Operational Security (OPSEC). 7...not expected at all to be run as a mechanistic tool that produces univocal outputs on the base of juridically qualified inputs, making unnecessary
Frameworks to assess health systems governance: a systematic review.
Pyone, Thidar; Smith, Helen; van den Broek, Nynke
2017-06-01
Governance of the health system is a relatively new concept and there are gaps in understanding what health system governance is and how it could be assessed. We conducted a systematic review of the literature to describe the concept of governance and the theories underpinning it as applied to health systems, and to identify which frameworks are available and have been applied to assess health systems governance. Frameworks were reviewed to understand how the principles of governance might be operationalized at different levels of a health system. Electronic databases and web portals of international institutions concerned with governance were searched for publications in English for the period January 1994 to February 2016. Sixteen frameworks developed to assess governance in the health system were identified and are described. Of these, six frameworks were developed based on theories from new institutional economics; three are primarily informed by political science and public management disciplines; three arise from the development literature; and four use multidisciplinary approaches. Only five of the identified frameworks have been applied. These used principal-agent theory, the theory of common pool resources, North's institutional analysis and cybernetics theory. Governance is a practice, dependent on arrangements set at the political or national level, but it needs to be operationalized by individuals at lower levels in the health system; multi-level frameworks acknowledge this. Three frameworks were used to assess governance at all levels of the health system. Health system governance is complex and difficult to assess; the concept of governance originates from different disciplines and is multidimensional. There is a need to validate and apply existing frameworks and share lessons learnt regarding which frameworks work well in which settings.
A comprehensive assessment of governance could enable policy makers to prioritize solutions for problems identified as well as replicate and scale-up examples of good practice. © The Author 2017. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
A novel framework of tissue membrane systems for image fusion.
Zhang, Zulin; Yi, Xinzhong; Peng, Hong
2014-01-01
This paper proposes a tissue membrane system-based framework to deal with the optimal image fusion problem. A spatial domain fusion algorithm is given, and a tissue membrane system of multiple cells is used as its computing framework. Based on the multicellular structure and inherent communication mechanism of the tissue membrane system, an improved velocity-position model is developed. The performance of the fusion framework is studied with comparison of several traditional fusion methods as well as genetic algorithm (GA)-based and differential evolution (DE)-based spatial domain fusion methods. Experimental results show that the proposed fusion framework is superior or comparable to the other methods and can be efficiently used for image fusion.
Halse, Meghan E; Procacci, Barbara; Henshaw, Sarah-Louise; Perutz, Robin N; Duckett, Simon B
2017-05-01
We recently reported a pump-probe method that uses a single laser pulse to introduce parahydrogen (p-H2) into a metal dihydride complex and then follows the time-evolution of the p-H2-derived nuclear spin states by NMR. We present here a theoretical framework to describe the oscillatory behaviour of the resultant hyperpolarised NMR signals using a product operator formalism. We consider the cases where the p-H2-derived protons form part of an AX, AXY, AXYZ or AA'XX' spin system in the product molecule. We use this framework to predict the patterns for 2D pump-probe NMR spectra, where the indirect dimension represents the evolution during the pump-probe delay and the positions of the cross-peaks depend on the difference in chemical shift of the p-H2-derived protons and the difference in their couplings to other nuclei. The evolution of the NMR signals of the p-H2-derived protons, as well as the transfer of hyperpolarisation to other NMR-active nuclei in the product, is described. The theoretical framework is tested experimentally for a set of ruthenium dihydride complexes representing the different spin systems. Theoretical predictions and experimental results agree to within experimental error for all features of the hyperpolarised 1H and 31P pump-probe NMR spectra. Thus we establish the laser pump, NMR probe approach as a robust way to directly observe and quantitatively analyse the coherent evolution of p-H2-derived spin order over micro-to-millisecond timescales. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
An Integrated Knowledge Framework to Characterize and Scaffold Size and Scale Cognition (FS2C)
NASA Astrophysics Data System (ADS)
Magana, Alejandra J.; Brophy, Sean P.; Bryan, Lynn A.
2012-09-01
Size and scale cognition is a critical ability associated with reasoning about concepts across the disciplines of science, technology, engineering, and mathematics. As such, researchers and educators have identified the need for young learners and their educators to become scale-literate. Informed by the developmental psychology literature and recent findings in nanoscale science and engineering education, we propose an integrated knowledge framework for characterizing and scaffolding size and scale cognition, called the FS2C framework. Five ad hoc assessment tasks were designed, informed by the FS2C framework, with the goal of identifying participants' understandings of size and scale. Findings revealed participants' difficulties in discerning different sizes of microscale and nanoscale objects and a low level of sophistication in identifying scale worlds. Results also showed that the larger the difference between the sizes of two objects, the more difficult it was for participants to identify how many times bigger or smaller one object was than the other. Similarly, participants had difficulty estimating the approximate sizes of sub-macroscopic objects as well as the sizes of very large objects. Accurately locating objects on a logarithmic scale was also challenging for participants.
USDA-ARS?s Scientific Manuscript database
Plant productivity and other ecosystem processes vary widely in their responses to experimental increases in atmospheric carbon dioxide (CO2) concentration. We adapt a conceptual framework first suggested by Chapin et al. (1996) to define conditions that sustain ecosystems to address the question o...
Teaching Experimental Methods: A Framework for Hands-On Modules
ERIC Educational Resources Information Center
Doherty, David
2011-01-01
Experiments provide a simple and engaging framework for familiarizing students with the process of quantitative social research. In this article, I illustrate how experiments can be used in the classroom environment by describing a module that was implemented in four high school classrooms. The module familiarized students with how the scientific…
ERIC Educational Resources Information Center
Koumi, Jack
2013-01-01
This paper argues that pedagogic efficacy of multimedia packages (interactive multimedia presentations) cannot be achieved by experimental research in the absence of a detailed pedagogical screenwriting framework. Following a summary of relevant literature, such a framework is offered, consisting of micro-level design guidelines. The guidelines…
Desveaux, Laura; Gagliardi, Anna R
2018-06-04
Post-market surveillance of medical devices is reliant on physician reporting of adverse medical device events (AMDEs). Few studies have examined factors that influence whether and how physicians report AMDEs, an essential step in the development of behaviour change interventions. This study was a secondary analysis comparing application of the Theoretical Domains Framework (TDF) and the Tailored Implementation for Chronic Diseases (TICD) framework to identify potential behaviour change interventions that correspond to determinants of AMDE reporting. A previous study involving qualitative interviews with Canadian physicians that implant medical devices identified themes reflecting AMDE reporting determinants. In this secondary analysis, themes that emerged from the primary analysis were independently mapped to the TDF and TICD. Determinants and corresponding intervention options arising from both frameworks (and both mappers) were compared. Both theoretical frameworks were useful for identifying interventions corresponding to behavioural determinants of AMDE reporting. Information or education strategies that provide evidence about AMDEs, and audit and feedback of AMDE data were identified as interventions to target the theme of physician beliefs; improving information systems, and reminder cues, prompts and awards were identified as interventions to address determinants arising from the organization or systems themes; and modifying financial/non-financial incentives and sharing data on outcomes associated with AMDEs were identified as interventions to target device market themes. 
Numerous operational challenges were encountered in the application of both frameworks including a lack of clarity about how directly relevant to themes the domains/determinants should be, how many domains/determinants to select, if and how to resolve discrepancies across multiple mappers, and how to choose interventions from among the large number associated with selected domains/determinants. Given discrepancies in mapping themes to determinants/domains and the resulting interventions offered by the two frameworks, uncertainty remains about how to choose interventions that best match behavioural determinants in a given context. Further research is needed to provide more nuanced guidance on the application of TDF and TICD for a broader audience, which is likely to increase the utility and uptake of these frameworks in practice.
Digital evaluation of sitting posture comfort in human-vehicle system under Industry 4.0 framework
NASA Astrophysics Data System (ADS)
Tao, Qing; Kang, Jinsheng; Sun, Wenlei; Li, Zhaobo; Huo, Xiao
2016-09-01
Most previous studies on the vibration ride comfort of the human-vehicle system focused on only one or two aspects of the investigation. A hybrid approach is described which integrates investigation methods in both real and virtual environments. The real experimental environment includes the WBV (whole-body vibration) test, questionnaires for human subjective sensation and motion capture. The virtual experimental environment includes theoretical calculation on a simplified 5-DOF human body vibration model, vibration simulation and analysis within the ADAMS/Vibration™ module, and digital human biomechanics and occupational health analysis in Jack software. While the real experimental environment provides realistic and accurate test results, it also serves as the core of, and validation for, the virtual experimental environment. The virtual experimental environment takes full advantage of currently available vibration simulation and digital human modelling software, and makes it possible to evaluate the sitting posture comfort of a human-vehicle system with various human anthropometric parameters. How this digital evaluation system for car seat comfort design fits into the Industry 4.0 framework is also proposed.
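As a hedged illustration of the kind of calculation behind such seat-vibration models, the sketch below evaluates the classical single-DOF base-excitation transmissibility curve. This is a drastic reduction of the 5-DOF model mentioned above, and the resonance frequency and damping ratio are hypothetical placeholders, not values from the study:

```python
import math

def transmissibility(f, fn, zeta):
    """Displacement transmissibility of a 1-DOF base-excited seat model:
    |T| = sqrt((1 + (2*zeta*r)**2) / ((1 - r**2)**2 + (2*zeta*r)**2)), r = f/fn."""
    r = f / fn
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2
    return math.sqrt(num / den)

fn, zeta = 4.5, 0.3  # hypothetical seat-occupant resonance (Hz) and damping ratio
freqs = [0.1 * i for i in range(1, 201)]  # sweep 0.1 to 20 Hz
peak = max(transmissibility(f, fn, zeta) for f in freqs)
# base motion near resonance is amplified; well above resonance it is isolated
```

In a full 5-DOF model the same frequency-response idea applies per mode, with mass, stiffness and damping matrices in place of the scalar parameters.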
Conceptual framework for development of comprehensive e-health evaluation tool.
Khoja, Shariq; Durrani, Hammad; Scott, Richard E; Sajwani, Afroz; Piryani, Usha
2013-01-01
The main objective of this study was to develop an e-health evaluation tool based on a conceptual framework including relevant theories for evaluating use of technology in health programs. This article presents the development of an evaluation framework for e-health programs. The study was divided into three stages: Stage 1 involved a detailed literature search of different theories and concepts on evaluation of e-health, Stage 2 plotted e-health theories to identify relevant themes, and Stage 3 developed a matrix of evaluation themes and stages of e-health programs. The framework identifies and defines different stages of e-health programs and then applies evaluation theories to each of these stages for development of the evaluation tool. This framework builds on existing theories of health and technology evaluation and presents a conceptual framework for developing an e-health evaluation tool to examine and measure different factors that play a definite role in the success of e-health programs. The framework on the horizontal axis divides e-health into different stages of program implementation, while the vertical axis identifies different themes and areas of consideration for e-health evaluation. The framework helps understand various aspects of e-health programs and their impact that require evaluation at different stages of the life cycle. The study led to the development of a new and comprehensive e-health evaluation tool, named the Khoja-Durrani-Scott Framework for e-Health Evaluation.
Mitchell, Brett G; Gardner, Anne
2014-03-01
To present a discussion on theoretical frameworks in infection prevention and control. Infection prevention and control programmes have been in place for several years in response to the incidence of healthcare-associated infections and their associated morbidity and mortality. Theoretical frameworks play an important role in formalizing the understanding of infection prevention activities. Discussion paper. A literature search using electronic databases was conducted for published articles in English addressing theoretical frameworks in infection prevention and control between 1980 and 2012. Nineteen papers that included a reference to frameworks were identified in the review. A narrative analysis of these papers was completed. Two models were identified and neither included the role of surveillance. To reduce the risk of acquiring a healthcare-associated infection, a multifaceted approach to infection prevention is required. One key component of this approach is surveillance. The review identified two infection prevention and control frameworks, yet these are rarely applied in infection prevention and control programmes. Only one framework considered the multifaceted approach required for infection prevention. It did not, however, incorporate the role of surveillance. We present a framework that incorporates the role of surveillance into a biopsychosocial approach to infection prevention and control. Infection prevention and control programmes and associated research are led primarily by nurses. There is a need for an explicit infection prevention and control framework incorporating the important role that surveillance has in infection prevention activities. This study presents one framework for further critique and discussion. © 2013 John Wiley & Sons Ltd.
Suratanee, Apichat; Plaimas, Kitiporn
2017-01-01
The associations between proteins and diseases are crucial information for investigating pathological mechanisms. However, the number of known and reliable protein-disease associations is quite small. In this study, an analysis framework to infer associations between proteins and diseases was developed based on a large data set of a human protein-protein interaction network, integrating an effective network search, namely, the reverse k-nearest neighbor (RkNN) search. The RkNN search was used to identify the impact of a protein on other proteins. Then, associations between proteins and diseases were inferred statistically. The method using the RkNN search yielded a much higher precision than a random selection, a standard nearest neighbor search, or applying the method to a random protein-protein interaction network. All protein-disease pair candidates were verified by a literature search. Supporting evidence for 596 pairs was identified. In addition, cluster analysis of these candidates revealed 10 promising groups of diseases to be further investigated experimentally. This method can be used to identify novel associations to better understand complex relationships between proteins and diseases.
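The reverse k-nearest neighbor idea above can be sketched with a brute-force implementation: a point's RkNN set contains every point that counts it among its own k nearest neighbours, so a hub-like node has a large RkNN set (a measure of its influence). The coordinates below are toy values, not protein-network data:

```python
def knn(points, q, k):
    """Indices of the k nearest neighbours of points[q] (brute force, squared distance)."""
    d = sorted((sum((a - b) ** 2 for a, b in zip(points[q], points[j])), j)
               for j in range(len(points)) if j != q)
    return {j for _, j in d[:k]}

def reverse_knn(points, q, k):
    """Reverse k-NN of points[q]: all points whose own k-NN set contains q."""
    return {j for j in range(len(points)) if j != q and q in knn(points, j, k)}

# toy hub-and-spoke geometry: the hub (index 0) is nearest neighbour of every spoke,
# while the distant outlier (index 5) is not "influenced" by the hub
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0), (5.0, 5.0)]
influence = reverse_knn(pts, 0, 1)
```

Note the asymmetry that makes RkNN informative: a point can be the nearest neighbour of many others without those others being among its own nearest neighbours.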
A framework for evaluating mixture analysis algorithms
NASA Astrophysics Data System (ADS)
Dasaratha, Sridhar; Vignesh, T. S.; Shanmukh, Sarat; Yarra, Malathi; Botonjic-Sehic, Edita; Grassi, James; Boudries, Hacene; Freeman, Ivan; Lee, Young K.; Sutherland, Scott
2010-04-01
In recent years, several sensing devices capable of identifying unknown chemical and biological substances have been commercialized. The success of these devices in analyzing real world samples is dependent on the ability of the on-board identification algorithm to de-convolve spectra of substances that are mixtures. To develop effective de-convolution algorithms, it is critical to characterize the relationship between the spectral features of a substance and its probability of detection within a mixture, as these features may be similar to or overlap with other substances in the mixture and in the library. While it has been recognized that these aspects pose challenges to mixture analysis, a systematic effort to quantify spectral characteristics and their impact, is generally lacking. In this paper, we propose metrics that can be used to quantify these spectral features. Some of these metrics, such as a modification of variance inflation factor, are derived from classical statistical measures used in regression diagnostics. We demonstrate that these metrics can be correlated to the accuracy of the substance's identification in a mixture. We also develop a framework for characterizing mixture analysis algorithms, using these metrics. Experimental results are then provided to show the application of this framework to the evaluation of various algorithms, including one that has been developed for a commercial device. The illustration is based on synthetic mixtures that are created from pure component Raman spectra measured on a portable device.
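The variance-inflation-factor metric mentioned above can be illustrated on synthetic spectra. In the two-component case, VIF = 1/(1 - r^2) for the correlation r between two library spectra, so strongly overlapping bands inflate the VIF. This is a sketch of the classical statistic, not the paper's modified version, and the band positions and widths are hypothetical:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length spectra."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def vif_pair(x, y):
    """Two-component variance inflation factor 1/(1 - r^2): high VIF means the
    two library spectra are nearly collinear and hard to separate in a mixture."""
    r = pearson_r(x, y)
    return 1.0 / (1.0 - r * r)

# synthetic single-band spectra on an arbitrary wavenumber grid
grid = [i * 0.5 for i in range(200)]
band = lambda c, w: [math.exp(-((g - c) / w) ** 2) for g in grid]
a, b_close, b_far = band(40.0, 5.0), band(41.0, 5.0), band(80.0, 5.0)
vif_overlapping = vif_pair(a, b_close)  # large: nearly coincident bands
vif_distinct = vif_pair(a, b_far)       # near 1: easily separable bands
```

With more than two library components, each VIF comes from regressing one spectrum on all the others, but the interpretation (collinearity inflates identification uncertainty) is the same.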
Selvarasu, Suresh; Kim, Do Yun; Karimi, Iftekhar A; Lee, Dong-Yup
2010-10-01
We present an integrated framework for characterizing fed-batch cultures of mouse hybridoma cells producing monoclonal antibody (mAb). This framework systematically combines data preprocessing, elemental balancing and statistical analysis techniques. Initially, specific rates of cell growth, glucose/amino acid consumption and mAb/metabolite production were calculated via curve fitting using logistic equations, with subsequent elemental balancing of the preprocessed data indicating the presence of experimental measurement errors. Multivariate statistical analysis was then employed to understand the physiological characteristics of the cellular system. The results from principal component analysis (PCA) revealed three major clusters of amino acids with similar trends in their consumption profiles: (i) arginine, threonine and serine; (ii) glycine, tyrosine, phenylalanine, methionine, histidine and asparagine; and (iii) lysine, valine and isoleucine. Further analysis using partial least squares (PLS) regression identified key amino acids that were positively or negatively correlated with cell growth, mAb production and the generation of lactate and ammonia. Based on these results, the optimal concentrations of key amino acids in the feed medium can be inferred, potentially leading to an increase in cell viability and productivity, as well as a decrease in toxic waste production. The study demonstrated how the current methodological framework using multivariate statistical analysis techniques can serve as a potential tool for deriving rational medium design strategies. Copyright © 2010 Elsevier B.V. All rights reserved.
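The specific-rate step can be sketched from the logistic fit: differentiating X(t) = K/(1 + exp(-r(t - t0))) gives the specific growth rate mu(t) = (1/X)dX/dt = r(1 - X/K). The parameter values below are hypothetical placeholders, not the study's fitted values:

```python
import math

def logistic(t, K, r, t0):
    """Logistic viable-cell-density curve X(t) = K / (1 + exp(-r*(t - t0)))."""
    return K / (1.0 + math.exp(-r * (t - t0)))

def specific_growth_rate(t, K, r, t0):
    """mu(t) = (1/X) dX/dt = r * (1 - X(t)/K), from differentiating the logistic."""
    return r * (1.0 - logistic(t, K, r, t0) / K)

# hypothetical fit: carrying capacity (cells/mL), rate constant (1/day), midpoint (day)
K, r, t0 = 5.0e6, 0.8, 3.0
mu_early = specific_growth_rate(0.0, K, r, t0)  # near-exponential phase: mu close to r
mu_late = specific_growth_rate(8.0, K, r, t0)   # approaching stationary phase: mu near 0
```

The same pattern (fit a logistic, then differentiate analytically) yields smooth specific consumption and production rates from noisy concentration profiles.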
Knowledge-transfer learning for prediction of matrix metalloprotease substrate-cleavage sites.
Wang, Yanan; Song, Jiangning; Marquez-Lago, Tatiana T; Leier, André; Li, Chen; Lithgow, Trevor; Webb, Geoffrey I; Shen, Hong-Bin
2017-07-18
Matrix Metalloproteases (MMPs) are an important family of proteases that play crucial roles in key cellular and disease processes. Therefore, MMPs constitute important targets for drug design, development and delivery. Advanced proteomic technologies have identified type-specific target substrates; however, the complete repertoire of MMP substrates remains uncharacterized. Indeed, computational prediction of substrate-cleavage sites associated with MMPs is a challenging problem. This holds especially true when considering MMPs with few experimentally verified cleavage sites, such as for MMP-2, -3, -7, and -8. To fill this gap, we propose a new knowledge-transfer computational framework which effectively utilizes the hidden shared knowledge from some MMP types to enhance predictions of other, distinct target substrate-cleavage sites. Our computational framework uses support vector machines combined with transfer machine learning and feature selection. To demonstrate the value of the model, we extracted a variety of substrate sequence-derived features and compared the performance of our method using both 5-fold cross-validation and independent tests. The results show that our transfer-learning-based method provides a robust performance, which is at least comparable to traditional feature-selection methods for prediction of MMP-2, -3, -7, -8, -9 and -12 substrate-cleavage sites on independent tests. The results also demonstrate that our proposed computational framework provides a useful alternative for the characterization of sequence-level determinants of MMP-substrate specificity.
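As one concrete instance of the "substrate sequence-derived features" such a predictor consumes, the amino acid composition of a candidate cleavage-site window can be encoded as a fixed-length vector; the window handling below is an illustrative assumption, not the paper's exact feature set:

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"   # the 20 standard residues

def composition_features(window):
    """Encode a sequence window around a putative MMP cleavage site as
    its normalized amino acid composition (a 20-dimensional vector).
    Vectors like this can be concatenated with other encodings and fed
    to an SVM, in either the source or the target MMP task."""
    counts = np.array([window.count(a) for a in AMINO_ACIDS], dtype=float)
    return counts / max(len(window), 1)
```

In a transfer setting, windows labelled for a well-characterized MMP (the source) and for a sparsely labelled target MMP share this feature space, which is what allows knowledge to transfer between them.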
NASA Astrophysics Data System (ADS)
Mikuła, Andrzej; Król, Magdalena; Mozgawa, Włodzimierz; Koleżyński, Andrzej
2018-04-01
Vibrational spectroscopy can be considered one of the most important methods used for structural characterization of various porous aluminosilicate materials, including zeolites. On the other hand, vibrational spectra of zeolites are still difficult to interpret, particularly in the pseudolattice region, where bands related to ring oscillations can be observed. Using a combination of theoretical and computational approaches, a detailed analysis of these regions of the spectra is possible; such analysis should, however, be carried out employing models with different levels of complexity at the same level of theory. In this work, an attempt was made to identify ring oscillations in the vibrational spectra of selected zeolite structures. A series of ab initio calculations focused on S4R, S6R, and, as a novelty, 5-1 isolated clusters, as well as periodic siliceous frameworks built from those building units (ferrierite (FER), mordenite (MOR) and heulandite (HEU) type), has been carried out. Due to the hierarchical structure of zeolite frameworks, it can be expected that the total envelope of a zeolite spectrum should be, to good accuracy, a sum of the spectra of the structural elements that build the framework. Based on the results of the HF calculations, normal vibrations have been visualized and a detailed analysis of the pseudolattice range of the resulting theoretical spectra has been carried out. The obtained results have been applied to the interpretation of experimental spectra of selected zeolites.
An action potential-driven model of soleus muscle activation dynamics for locomotor-like movements
NASA Astrophysics Data System (ADS)
Kim, Hojeong; Sandercock, Thomas G.; Heckman, C. J.
2015-08-01
Objective. The goal of this study was to develop a physiologically plausible, computationally robust model of muscle activation dynamics (A(t)) under physiologically relevant excitation and movement. Approach. The interaction of excitation and movement on A(t) was investigated by comparing force production between a cat soleus muscle and its Hill-type model. To capture A(t) under varying excitation and movement, a modular modeling framework was proposed comprising three compartments: (1) spikes-to-[Ca2+], (2) [Ca2+]-to-A, and (3) A-to-force transformation. The individual signal transformations were modeled based on physiological factors so that the parameter values could be determined separately for individual modules directly from experimental data. Main results. A strong dependency of A(t) on excitation frequency and muscle length was found during both isometric and dynamically moving contractions. The identified dependencies of A(t) under static and dynamic conditions could be incorporated in the modular modeling framework by modulating the model parameters as a function of the movement input. The new modeling approach was also applicable to cat soleus muscles producing waveforms independent of those used to set the model parameters. Significance. This study provides a modeling framework for spike-driven muscle responses during movement that is suitable not only for gaining insights into the molecular mechanisms underlying muscle behaviors but also for large-scale simulations.
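A minimal numerical sketch of the three-module cascade follows. All parameter values, the unit calcium release per spike, and the Hill-type [Ca2+]-to-A map are illustrative assumptions, not the fitted cat soleus model:

```python
import numpy as np

def activation_cascade(spike_times, t_end=1.0, dt=1e-4,
                       tau_ca=0.05, k_half=0.4, hill=2.0, f_max=1.0):
    """Three-module cascade:
    spikes -> [Ca2+]  (impulse release plus first-order decay),
    [Ca2+] -> A       (Hill-type saturation),
    A -> force        (static gain). All parameters hypothetical."""
    t = np.arange(0.0, t_end, dt)
    ca = np.zeros_like(t)
    spike_idx = {int(ts / dt) for ts in spike_times}
    for i in range(1, len(t)):
        ca[i] = ca[i - 1] + dt * (-ca[i - 1] / tau_ca)  # Euler decay step
        if i in spike_idx:
            ca[i] += 1.0                                # unit Ca release per spike
    a = ca ** hill / (k_half ** hill + ca ** hill)      # saturating activation
    force = f_max * a
    return t, ca, a, force
```

Because each stage is a separate module, the calcium decay constant, the Hill parameters, and the force gain can each be fit to separate experiments, which is the point of the modular design.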
Cho, Jae Yong; Cheong, Jae-Ho; Kim, Hoguen; Li, Min; Downey, Thomas J.; Dyer, Matthew D.; Sun, Yongming; Sun, Jingtao; Beasley, Ellen M.; Chung, Hyun Cheol; Noh, Sung Hoon; Weinstein, John N.; Liu, Chang-Gong; Powis, Garth
2013-01-01
Gastric cancer is the most common cancer in Asia and most developing countries. Despite the use of multimodality therapeutics, it remains the second leading cause of cancer death in the world. To identify the molecular underpinnings of gastric cancer in the Asian population, we applied an RNA-sequencing approach to gastric tumor and noncancerous specimens, generating 680 million informative short reads to quantitatively characterize the entire transcriptome of gastric cancer (including mRNAs and microRNAs). A multi-layer analysis was then developed to identify multiple types of transcriptional aberrations associated with different stages of gastric cancer, including differentially expressed mRNAs, recurrent somatic mutations and key differentially expressed microRNAs. Through this approach, we identified the central metabolic regulator AMPK-α as a potential functional target in Asian gastric cancer. Further, we experimentally demonstrated the translational relevance of this gene as a potential therapeutic target for early-stage gastric cancer in Asian patients. Together, our findings not only provide a valuable information resource for identifying and elucidating the molecular mechanisms of Asian gastric cancer, but also represent a general integrative framework to develop more effective therapeutic targets. PMID:22434430
Bennett, Thomas D; Todorova, Tanya K; Baxter, Emma F; Reid, David G; Gervais, Christel; Bueken, Bart; Van de Voorde, B; De Vos, Dirk; Keen, David A; Mellot-Draznieks, Caroline
2016-01-21
The mechanism and products of the structural collapse of the metal–organic frameworks (MOFs) UiO-66, MIL-140B and MIL-140C upon ball-milling are investigated through solid-state 13C NMR and pair distribution function (PDF) studies, finding amorphization to proceed by the breaking of a fraction of the metal–ligand bonds in each case. The amorphous products contain inorganic–organic bonding motifs reminiscent of the crystalline phases. Whilst the inorganic Zr6O4(OH)4 clusters of UiO-66 remain intact upon structural collapse, the Zr–O backbone of the MIL-140 frameworks undergoes substantial distortion. Density functional theory calculations have been performed to investigate defective models of MIL-140B and show, through comparison of calculated and experimental 13C NMR spectra, that amorphization and defects in the materials are linked.
Morrissey, Bethny; Blyth, Karen; Carter, Phil; Chelala, Claude; Jones, Louise; Holen, Ingunn; Speirs, Valerie
2017-01-01
While significant medical breakthroughs have been achieved through using animal models, our experience shows that often there is surplus material remaining that is frequently never revisited but could be put to good use by other scientists. Recognising that most scientists are willing to share this material on a collaborative basis, it makes economic, ethical, and academic sense to explore the option to utilise this precious resource before generating new/additional animal models and associated samples. To bring together those requiring animal tissue and those holding this type of archival material, we have devised a framework called Sharing Experimental Animal Resources, Coordinating Holdings (SEARCH) with the aim of making remaining material derived from animal studies in biomedical research more visible and accessible to the scientific community. We encourage journals, funding bodies, and scientists to unite in promoting a new way of approaching animal research by adopting the SEARCH framework.
ELPSA as a Lesson Design Framework
ERIC Educational Resources Information Center
Lowrie, Tom; Patahuddin, Sitti Maesuri
2015-01-01
This paper offers a framework for a mathematics lesson design that is consistent with the way we learn about, and discover, most things in life. In addition, the framework provides a structure for identifying how mathematical concepts and understanding are acquired and developed. This framework is called ELPSA and represents five learning…
eXframe: reusable framework for storage, analysis and visualization of genomics experiments
2011-01-01
Background Genome-wide experiments are routinely conducted to measure gene expression, DNA-protein interactions and epigenetic status. Structured metadata for these experiments is imperative for a complete understanding of experimental conditions, to enable consistent data processing and to allow retrieval, comparison, and integration of experimental results. Even though several repositories have been developed for genomics data, only a few provide annotation of samples and assays using controlled vocabularies. Moreover, many of them are tailored for a single type of technology or measurement and do not support the integration of multiple data types. Results We have developed eXframe - a reusable web-based framework for genomics experiments that provides 1) the ability to publish structured data compliant with accepted standards, 2) support for multiple data types including microarrays and next generation sequencing, and 3) query, analysis and visualization integration tools (enabled by consistent processing of the raw data and annotation of samples), and is available as open-source software. We present two case studies where this software is currently being used to build repositories of genomics experiments - one contains data from hematopoietic stem cells and another from Parkinson's disease patients. Conclusion The web-based framework eXframe offers structured annotation of experiments as well as uniform processing and storage of molecular data from microarray and next generation sequencing platforms. The framework allows users to query and integrate information across species, technologies, measurement types and experimental conditions. Our framework is reusable and freely modifiable - other groups or institutions can deploy their own custom web-based repositories based on this software. It is interoperable with the most important data formats in this domain. We hope that other groups will not only use eXframe, but also contribute their own useful modifications.
PMID:22103807
In-silico experiments of zebrafish behaviour: modeling swimming in three dimensions
NASA Astrophysics Data System (ADS)
Mwaffo, Violet; Butail, Sachit; Porfiri, Maurizio
2017-01-01
Zebrafish is fast becoming a species of choice in biomedical research for the investigation of functional and dysfunctional processes coupled with their genetic and pharmacological modulation. As with mammals, experimentation with zebrafish constitutes a complicated ethical issue that calls for the exploration of alternative testing methods to reduce the number of subjects, refine experimental designs, and replace live animals. Inspired by the demonstrated advantages of computational studies in other life science domains, we establish an authentic data-driven modelling framework to simulate zebrafish swimming in three dimensions. The model encapsulates burst-and-coast swimming style, speed modulation, and wall interaction, laying the foundations for in-silico experiments of zebrafish behaviour. Through computational studies, we demonstrate the ability of the model to replicate common ethological observables such as speed and spatial preference, and anticipate experimental observations on the correlation between tank dimensions on zebrafish behaviour. Reaching to other experimental paradigms, our framework is expected to contribute to a reduction in animal use and suffering.
How to design a single-cell RNA-sequencing experiment: pitfalls, challenges and perspectives.
Dal Molin, Alessandra; Di Camillo, Barbara
2018-01-31
The sequencing of the transcriptome of single cells, or single-cell RNA-sequencing, has now become the dominant technology for the identification of novel cell types in heterogeneous cell populations or for the study of stochastic gene expression. In recent years, various experimental methods and computational tools for analysing single-cell RNA-sequencing data have been proposed. However, most of them are tailored to different experimental designs or biological questions, and in many cases, their performance has not been benchmarked yet, thus increasing the difficulty for a researcher to choose the optimal single-cell transcriptome sequencing (scRNA-seq) experiment and analysis workflow. In this review, we aim to provide an overview of the current available experimental and computational methods developed to handle single-cell RNA-sequencing data and, based on their peculiarities, we suggest possible analysis frameworks depending on specific experimental designs. Together, we propose an evaluation of challenges and open questions and future perspectives in the field. In particular, we go through the different steps of scRNA-seq experimental protocols such as cell isolation, messenger RNA capture, reverse transcription, amplification and use of quantitative standards such as spike-ins and Unique Molecular Identifiers (UMIs). We then analyse the current methodological challenges related to preprocessing, alignment, quantification, normalization, batch effect correction and methods to control for confounding effects. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
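As a small illustration of the normalization step in such a workflow, a common counts-per-million log transform over a UMI count matrix looks like the following. This is one standard choice among many, and the function name and matrix layout are assumptions, not a method prescribed by the review:

```python
import numpy as np

def cpm_log_normalize(counts):
    """Library-size normalization for an scRNA-seq count matrix.

    counts: (n_genes, n_cells) matrix of UMI counts. Each cell's counts
    are scaled to counts-per-million and then log1p-transformed, so that
    downstream analysis is less dominated by sequencing depth.
    """
    lib_sizes = counts.sum(axis=0, keepdims=True).astype(float)
    return np.log1p(counts / lib_sizes * 1e6)
```

Batch-effect correction and control of confounding effects, discussed in the review, come after a depth-normalization step like this.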
Rapid development of Proteomic applications with the AIBench framework.
López-Fernández, Hugo; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Méndez Reboredo, José R; Santos, Hugo M; Carreira, Ricardo J; Capelo-Martínez, José L; Fdez-Riverola, Florentino
2011-09-15
In this paper we present two case studies of Proteomics application development using the AIBench framework, a Java desktop application framework mainly focused on scientific software development. The applications presented in this work are Decision Peptide-Driven, for rapid and accurate protein quantification, and Bacterial Identification, for tuberculosis biomarker search and diagnosis. Both tools work with mass spectrometry data, specifically MALDI-TOF spectra, minimizing the time required to process and analyze the experimental data. Copyright 2011 The Author(s). Published by Journal of Integrative Bioinformatics.
Experimental Non-Violation of the Bell Inequality
NASA Astrophysics Data System (ADS)
Palmer, Tim
2018-05-01
A finite non-classical framework for physical theory is described which challenges the conclusion that the Bell Inequality has been shown to have been violated experimentally, even approximately. This framework postulates the universe as a deterministic locally causal system evolving on a measure-zero fractal-like geometry $I_U$ in cosmological state space. Consistent with the assumed primacy of $I_U$, and $p$-adic number theory, a non-Euclidean (and hence non-classical) metric $g_p$ is defined on cosmological state space, where $p$ is a large but finite Pythagorean prime. Using number-theoretic properties of spherical triangles, the inequalities violated experimentally are shown to be $g_p$-distant from the CHSH inequality, whose violation would rule out local realism. This result fails in the singular limit $p=\infty$, at which $g_p$ is Euclidean. Broader implications are discussed.
A Framework for Identifying Selective Chemical Applications for IPM in Dryland Agriculture
Umina, Paul A.; Jenkins, Sommer; McColl, Stuart; Arthur, Aston; Hoffmann, Ary A.
2015-01-01
Shifts to Integrated Pest Management (IPM) in agriculture are assisted by the identification of chemical applications that provide effective control of pests relative to broad-spectrum pesticides but have fewer negative effects on natural enemy (beneficial) groups that assist in pest control. Here, we outline a framework for identifying such applications and apply this framework to field trials involving the crop establishment phase of Australian dryland cropping systems. Several chemicals, which are not presently available to farmers in Australia, were identified as providing moderate levels of pest control and seedling protection, with the potential to be less harmful to beneficial groups including predatory mites, predatory beetles and ants. This framework highlights the challenges involved in chemically controlling pests while maintaining non-target populations when pest species are present at damaging levels. PMID:26694469
Zhang, Chengwei; Li, Xiaohong; Li, Shuxin; Feng, Zhiyong
2017-09-20
Biological environments are uncertain and their dynamics resemble those of multiagent environments; research results from the multiagent systems area can therefore provide valuable insights into biology and are of great significance for its study. Learning in a multiagent environment is highly dynamic, since the environment is no longer stationary and each agent's behavior changes adaptively in response to other coexisting learners, and vice versa. The dynamics become even more unpredictable when we move from fixed-agent interaction environments to a multiagent social learning framework. An analytical understanding of the underlying dynamics is important and challenging. In this work, we present a social learning framework with homogeneous learners (e.g., Policy Hill Climbing (PHC) learners) and model the behavior of players in this framework as a hybrid dynamical system. By analyzing the dynamical system, we obtain conditions for convergence or non-convergence. We experimentally verify the predictive power of our model using a number of representative games, and the experimental results confirm the theoretical analysis. Under the multiagent social learning framework, we modeled the behavior of agents in a biological environment and theoretically analyzed the dynamics of the model. We present sufficient conditions for convergence or non-convergence, prove them theoretically, and show that they can be used to predict the convergence of the system.
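A minimal single-state sketch of the PHC learner named above: each learner keeps Q-values updated by standard Q-learning and nudges its mixed policy toward the greedy action by a step delta. The game, learning rates, and tie-breaking below are illustrative assumptions:

```python
import random

class PHCLearner:
    """Policy Hill-Climbing for a single-state matrix game (simplified):
    Q-learning plus a policy that steps toward the greedy action."""

    def __init__(self, n_actions, alpha=0.1, delta=0.01):
        self.q = [0.0] * n_actions
        self.pi = [1.0 / n_actions] * n_actions
        self.alpha, self.delta = alpha, delta

    def act(self):
        # Sample an action from the current mixed policy.
        r, c = random.random(), 0.0
        for a, p in enumerate(self.pi):
            c += p
            if r < c:
                return a
        return len(self.pi) - 1

    def update(self, action, reward):
        # Q-learning update for the taken action.
        self.q[action] += self.alpha * (reward - self.q[action])
        # Move the policy a step of size delta toward the greedy action.
        greedy = max(range(len(self.q)), key=self.q.__getitem__)
        for a in range(len(self.pi)):
            step = self.delta if a == greedy else -self.delta / (len(self.pi) - 1)
            self.pi[a] = min(1.0, max(0.0, self.pi[a] + step))
        total = sum(self.pi)
        self.pi = [p / total for p in self.pi]
```

Running several such learners against one another on a matrix game produces the coupled policy/Q-value dynamics that the hybrid dynamical-systems analysis studies.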
Sensorimotor Incongruence in People with Musculoskeletal Pain: A Systematic Review.
Don, Sanneke; Voogt, Lennard; Meeus, Mira; De Kooning, Margot; Nijs, Jo
2017-01-01
Musculoskeletal pain has major public health implications, but the theoretical framework remains unclear. It is hypothesized that sensorimotor incongruence (SMI) might be a cause of long-lasting pain sensations in people with chronic musculoskeletal pain. Research data about experimental SMI triggering pain have been equivocal, making the relation between SMI and pain elusive. The aim of this study was to systematically review the studies on experimental SMI in people with musculoskeletal pain and healthy individuals. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. A systematic literature search was conducted using several databases until January 2015. To identify relevant articles, keywords regarding musculoskeletal pain or healthy subjects and the sensory or the motor system were combined. Study characteristics were extracted. Risk of bias was assessed using the Dutch Institute for Healthcare Improvement (CBO) checklist for randomized controlled trials, and level of evidence was judged. Eight cross-over studies met the inclusion criteria. The methodological quality of the studies varied, and populations were heterogeneous. In populations with musculoskeletal pain, outcomes of sensory disturbances and pain were higher during all experimental conditions compared to baseline conditions. In healthy subjects, pain reports during experimental SMI were very low or did not occur at all. Based on the current evidence and despite some methodological issues, there is no evidence that experimental SMI triggers pain in healthy individuals or in people with chronic musculoskeletal pain. However, people with chronic musculoskeletal pain report more sensory disturbances and pain during the experimental conditions, indicating that visual manipulation influences pain outcomes in this population. © 2016 World Institute of Pain.
Neale, Dave; Clackson, Kaili; Georgieva, Stanimira; Dedetas, Hatice; Scarpate, Melissa; Wass, Sam; Leong, Victoria
2018-01-01
Play during early life is a ubiquitous activity, and an individual’s propensity for play is positively related to cognitive development and emotional well-being. Play behavior (which may be solitary or shared with a social partner) is diverse and multi-faceted. A challenge for current research is to converge on a common definition and measurement system for play – whether examined at a behavioral, cognitive or neurological level. Combining these different approaches in a multimodal analysis could yield significant advances in understanding the neurocognitive mechanisms of play, and provide the basis for developing biologically grounded play models. However, there is currently no integrated framework for conducting a multimodal analysis of play that spans brain, cognition and behavior. The proposed coding framework uses grounded and observable behaviors along three dimensions (sensorimotor, cognitive and socio-emotional), to compute inferences about playful behavior in a social context, and related social interactional states. Here, we illustrate the sensitivity and utility of the proposed coding framework using two contrasting dyadic corpora (N = 5) of mother-infant object-oriented interactions during experimental conditions that were either non-conducive (Condition 1) or conducive (Condition 2) to the emergence of playful behavior. We find that the framework accurately identifies the modal form of social interaction as being either non-playful (Condition 1) or playful (Condition 2), and further provides useful insights about differences in the quality of social interaction and temporal synchronicity within the dyad. It is intended that this fine-grained coding of play behavior will be easily assimilated with, and inform, future analysis of neural data that is also collected during adult–infant play. 
In conclusion, here, we present a novel framework for analyzing the continuous time-evolution of adult–infant play patterns, underpinned by biologically informed state coding along sensorimotor, cognitive and socio-emotional dimensions. We expect that the proposed framework will have wide utility amongst researchers wishing to employ an integrated, multimodal approach to the study of play, and lead toward a greater understanding of the neuroscientific basis of play. It may also yield insights into a new biologically grounded taxonomy of play interactions. PMID:29618994
Vortex lattices and defect-mediated viscosity reduction in active liquids
NASA Astrophysics Data System (ADS)
Slomka, Jonasz; Dunkel, Jorn
2016-11-01
Generic pattern-formation and viscosity-reduction mechanisms in active fluids are investigated using a generalized Navier-Stokes model that captures the experimentally observed bulk vortex dynamics in microbial suspensions. We present exact analytical solutions including stress-free vortex lattices and introduce a computational framework that allows the efficient treatment of previously intractable higher-order shear boundary conditions. Large-scale parameter scans identify the conditions for spontaneous flow symmetry breaking, defect-mediated low-viscosity phases and negative-viscosity states amenable to energy harvesting in confined suspensions. The theory uses only generic assumptions about the symmetries and long-wavelength structure of active stress tensors, suggesting that inviscid phases may be achievable in a broad class of non-equilibrium fluids by tuning confinement geometry and pattern scale selection.
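The class of model referred to can be written down schematically. The precise stress closure used in the paper is not reproduced here, so the coefficients $\Gamma_0, \Gamma_2, \Gamma_4$ and their signs below follow a standard higher-order-gradient form and should be read as an assumption:

```latex
% Incompressible generalized Navier-Stokes model with a higher-order
% active stress; coefficients are illustrative, not the paper's values.
\nabla \cdot \mathbf{v} = 0, \qquad
\partial_t \mathbf{v} + (\mathbf{v}\cdot\nabla)\mathbf{v}
  = -\nabla p + \nabla \cdot \boldsymbol{\sigma},
\qquad
\boldsymbol{\sigma}
  = \left(\Gamma_0 - \Gamma_2 \nabla^2 + \Gamma_4 \nabla^4\right)
    \left[\nabla \mathbf{v} + (\nabla \mathbf{v})^{\top}\right].
```

With $\Gamma_2 > 0$ and $\Gamma_4 > 0$, the stress selects a band of linearly unstable wavelengths, which is how pattern scale selection and the vortex-lattice states enter the analysis.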
Geometry-dependent viscosity reduction in sheared active fluids
NASA Astrophysics Data System (ADS)
Słomka, Jonasz; Dunkel, Jörn
2017-04-01
We investigate flow pattern formation and viscosity reduction mechanisms in active fluids by studying a generalized Navier-Stokes model that captures the experimentally observed bulk vortex dynamics in microbial suspensions. We present exact analytical solutions including stress-free vortex lattices and introduce a computational framework that allows the efficient treatment of higher-order shear boundary conditions. Large-scale parameter scans identify the conditions for spontaneous flow symmetry breaking, geometry-dependent viscosity reduction, and negative-viscosity states amenable to energy harvesting in confined suspensions. The theory uses only generic assumptions about the symmetries and long-wavelength structure of active stress tensors, suggesting that inviscid phases may be achievable in a broad class of nonequilibrium fluids by tuning confinement geometry and pattern scale selection.
Visual Pattern Analysis in Histopathology Images Using Bag of Features
NASA Astrophysics Data System (ADS)
Cruz-Roa, Angel; Caicedo, Juan C.; González, Fabio A.
This paper presents a framework to analyse visual patterns in a collection of medical images in a two-stage procedure. First, a set of representative visual patterns from the image collection is obtained by constructing a visual-word dictionary under a bag-of-features approach. Second, an analysis of the relationships between visual patterns and semantic concepts in the image collection is performed. The most important visual patterns for each semantic concept are identified using correlation analysis. A matrix visualization of the structure and organization of the image collection is generated using cluster analysis. The experimental evaluation was conducted on a histopathology image collection, and the results showed clear relationships between visual patterns and semantic concepts which, in addition, are easy to interpret and understand.
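The first stage can be sketched with plain k-means standing in for whatever quantizer the authors used; the descriptor source, the dictionary size `k`, and the clustering choice are illustrative assumptions:

```python
import numpy as np

def build_dictionary(descriptors, k=8, iters=20, seed=0):
    """Bag-of-features stage 1: cluster local image descriptors into a
    k-word visual dictionary with plain k-means (Lloyd's algorithm)."""
    rng = np.random.default_rng(seed)
    centers = descriptors[rng.choice(len(descriptors), k, replace=False)]
    for _ in range(iters):
        # Assign each descriptor to its nearest center, then recenter.
        d = ((descriptors[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            pts = descriptors[labels == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return centers

def bof_histogram(descriptors, centers):
    """Represent one image as a normalized histogram of its descriptors'
    nearest visual words; this is the input to the second stage."""
    d = ((descriptors[:, None, :] - centers[None]) ** 2).sum(-1)
    h = np.bincount(d.argmin(1), minlength=len(centers)).astype(float)
    return h / h.sum()
```

The second stage then correlates these per-image histograms with semantic concept labels across the collection, e.g. via correlation analysis.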
Linking the History of Radiation Biology to the Hallmarks of Cancer
Boss, Mary-Keara; Bristow, Robert; Dewhirst, Mark W.
2014-01-01
Hanahan and Weinberg recently updated their conceptual framework of the “Hallmarks of Cancer”. The original article, published in 2000, is among the most highly cited reviews in the field of oncology. The goal of this review is to highlight important discoveries in radiation biology that pertain to the Hallmarks. We identified early studies that exemplified how ionizing radiation affects the hallmarks or how radiation was used experimentally to advance the understanding of key hallmarks. A literature search was performed to obtain relevant primary research, and topics were assigned to a particular hallmark to allow an organized, chronological account of the radiobiological advancements. The hallmarks are reviewed in an order that flows from cellular to microenvironmental effects. PMID:24811865
Deciphering Neural Codes of Memory during Sleep
Chen, Zhe; Wilson, Matthew A.
2017-01-01
Memories of experiences are stored in the cerebral cortex. Sleep is critical for consolidating hippocampal memory of wake experiences into the neocortex. Understanding representations of neural codes of hippocampal-neocortical networks during sleep would reveal important circuit mechanisms on memory consolidation, and provide novel insights into memory and dreams. Although sleep-associated ensemble spike activity has been investigated, identifying the content of memory in sleep remains challenging. Here, we revisit important experimental findings on sleep-associated memory (i.e., neural activity patterns in sleep that reflect memory processing) and review computational approaches for analyzing sleep-associated neural codes (SANC). We focus on two analysis paradigms for sleep-associated memory, and propose a new unsupervised learning framework (“memory first, meaning later”) for unbiased assessment of SANC. PMID:28390699
NASA Astrophysics Data System (ADS)
Jaffke, Patrick; Möller, Peter; Stetcu, Ionel; Talou, Patrick; Schmitt, Christelle
2018-03-01
We implement fission fragment yields, calculated using Brownian shape motion on a macroscopic-microscopic potential energy surface in six dimensions, into the Hauser-Feshbach statistical decay code CGMF. This combination allows us to test the impact of utilizing theoretically calculated fission fragment yields on the subsequent prompt neutron and γ-ray emission. We draw connections between the fragment yields and the total kinetic energy TKE of the fission fragments and demonstrate that the use of calculated yields can introduce a difference in the 〈TKE〉 and, thus, the prompt neutron multiplicity.
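placeholder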
Detecting Service Chains and Feature Interactions in Sensor-Driven Home Network Services
Inada, Takuya; Igaki, Hiroshi; Ikegami, Kosuke; Matsumoto, Shinsuke; Nakamura, Masahide; Kusumoto, Shinji
2012-01-01
Sensor-driven services often cause chain reactions, since one service may generate an environmental impact that automatically triggers another service. We first propose a framework that can formalize and detect such service chains based on ECA (event, condition, action) rules. Although the service chain can be a major source of feature interactions, not all service chains lead to harmful interactions. Therefore, we then propose a method that identifies feature interactions within the service chains. Specifically, we characterize the degree of deviation of every service chain by evaluating the gap between expected and actual service states. An experimental evaluation demonstrates that the proposed method successfully detects 11 service chains and 6 feature interactions within 7 practical sensor-driven services. PMID:23012499
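The effect-to-event chaining described above can be sketched directly. The rules below are hypothetical ECA entries (the condition strings are simplified stand-ins for real predicates), illustrating how one service's environmental effect triggers the next service's event:

```python
# Hypothetical ECA rules: each service reacts to an event and, via its action,
# changes the environment, which may in turn trigger another service.
rules = {
    "AirConditioner": {"event": "temperature>28", "effect": "temperature<=26"},
    "WindowOpener":   {"event": "temperature<=26", "effect": "humidity+10"},
    "Dehumidifier":   {"event": "humidity+10",     "effect": "humidity-20"},
}

def service_chains(rules, start):
    """Follow effect -> event links to enumerate a chain reaction of services."""
    chain, current = [start], rules[start]["effect"]
    while True:
        nxt = next((s for s, r in rules.items()
                    if r["event"] == current and s not in chain), None)
        if nxt is None:
            return chain
        chain.append(nxt)
        current = rules[nxt]["effect"]

print(service_chains(rules, "AirConditioner"))
# ['AirConditioner', 'WindowOpener', 'Dehumidifier']
```

The paper's second step, which this sketch omits, scores each detected chain by the gap between expected and actual service states to decide whether the chain is a harmful feature interaction.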
Sensor-Based Human Activity Recognition in a Multi-user Scenario
NASA Astrophysics Data System (ADS)
Wang, Liang; Gu, Tao; Tao, Xianping; Lu, Jian
Existing work on sensor-based activity recognition focuses mainly on single-user activities. However, in real life, activities are often performed by multiple users and involve interactions between them. In this paper, we propose Coupled Hidden Markov Models (CHMMs) to recognize multi-user activities from sensor readings in a smart home environment. We develop a multimodal sensing platform and present a theoretical framework to recognize both single-user and multi-user activities. We conducted our trace collection in a smart home and evaluated our framework through experimental studies. Our experimental results show that we achieve an average accuracy of 85.46% with CHMMs.
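As background for the model family used above, here is a minimal single-chain HMM forward algorithm (a CHMM couples the transition structure of several such per-user chains, which this sketch does not attempt). All matrices are illustrative toy numbers, not fitted to any smart-home data:

```python
import numpy as np

# Toy two-state HMM (e.g. "cooking" vs "watching TV") with two sensor symbols.
A = np.array([[0.9, 0.1],    # state transition probabilities
              [0.2, 0.8]])
B = np.array([[0.8, 0.2],    # P(sensor reading | activity state)
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])    # initial state distribution

def forward(obs):
    """Likelihood of a sensor-reading sequence under the HMM (forward algorithm)."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

print(forward([0, 0, 1]))  # 0.086
```

Activity recognition then amounts to picking the activity model (or decoded state path) that maximizes this likelihood for the observed sensor trace.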
Niche construction game cancer cells play
NASA Astrophysics Data System (ADS)
Bergman, Aviv; Gligorijevic, Bojana
2015-10-01
The niche construction concept was originally defined in evolutionary biology as the continuous interplay between natural selection via environmental conditions and the modification of those conditions by the organism itself. Processes unfolding during cancer metastasis include the construction of niches, which cancer cells use for more efficient survival, transport into new environments, and preparation of remote sites for their arrival. Many elegant experiments have recently illustrated, for example, premetastatic niche construction, but practically no mathematical modeling has applied the niche construction framework. To create models useful for understanding the role of niche construction in cancer progression, we argue that (a) genetic, (b) phenotypic, and (c) ecological levels must all be included. While the model proposed here is phenomenological in its current form, it can be converted into a predictive outcome model via experimental measurement of the model parameters. Here we give an overview of an experimentally formulated problem in cancer metastasis and propose how the niche construction framework can be utilized and broadened to model it. Other life science disciplines, such as host-parasite coevolution, may also benefit from adapting the niche construction framework, satisfying a growing need for theoretical treatment of data collected by experimental biology.
Distributed memory parallel Markov random fields using graph partitioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heinemann, C.; Perciano, T.; Ushizima, D.
Markov random field (MRF)-based algorithms have attracted a large amount of interest in image analysis due to their ability to exploit contextual information about data. Image data generated by experimental facilities, though, continue to grow larger and more complex, making them more difficult to analyze in a reasonable amount of time. Applying image processing algorithms to large datasets requires alternative approaches to circumvent performance problems. Aiming to provide scientists with a new tool to recover valuable information from such datasets, we developed a general-purpose distributed-memory parallel MRF-based image analysis framework (MPI-PMRF). MPI-PMRF overcomes performance and memory limitations by distributing data and computations across processors. The proposed approach was successfully tested with synthetic and experimental datasets. Additionally, the performance of the MPI-PMRF framework is analyzed through a detailed scalability study. We show that a performance increase is obtained while maintaining a segmentation accuracy higher than 98%. The contributions of this paper are: (a) development of a distributed-memory MRF framework; (b) measurement of the performance increase of the proposed approach; (c) verification of segmentation accuracy in both synthetic and experimental, real-world datasets.
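MPI-PMRF's contribution is distributing the MRF optimization across processors; the underlying segmentation idea can be illustrated serially. Below is a hypothetical two-label iterated-conditional-modes (ICM) sketch on a noisy synthetic image — the Ising-style energy, class means, and smoothing weight are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy binary image: the MRF prior encourages neighboring pixels to agree.
truth = np.zeros((32, 32), int)
truth[8:24, 8:24] = 1
noisy = truth + rng.normal(0, 0.4, truth.shape)

def icm_segment(img, beta=1.5, iters=5):
    """Iterated conditional modes for a two-label Ising-like MRF (serial sketch;
    a distributed version would partition this grid across processors)."""
    labels = (img > 0.5).astype(int)   # initialize by thresholding
    means = np.array([0.0, 1.0])       # assumed class intensities
    for _ in range(iters):
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                costs = []
                for lab in (0, 1):
                    data = (img[i, j] - means[lab]) ** 2
                    nbrs = [labels[x, y] for x, y in
                            ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                            if 0 <= x < img.shape[0] and 0 <= y < img.shape[1]]
                    smooth = sum(lab != n for n in nbrs)
                    costs.append(data + beta * smooth)
                labels[i, j] = int(np.argmin(costs))
    return labels

seg = icm_segment(noisy)
print((seg == truth).mean())  # segmentation accuracy on this toy image
```

The data term pulls each pixel toward the class matching its intensity, while the `beta`-weighted smoothness term removes isolated noise flips — the contextual information the abstract refers to.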
Identification of adsorption sites in Cu-BTC by experimentation and molecular simulation.
García-Pérez, Elena; Gascón, Jorge; Morales-Flórez, Víctor; Castillo, Juan Manuel; Kapteijn, Freek; Calero, Sofía
2009-02-03
The adsorption of several quadrupolar and nonpolar gases on the Metal Organic Framework Cu-BTC has been studied by combining experimental measurements and Monte Carlo simulations. Four main adsorption sites for this structure have been identified: site I close to the copper atoms, site I' in the bigger cavities, site II located in the small octahedral cages, and site III at the windows of the four open faces of the octahedral cage. Our simulations identify the octahedral cages (sites II and III) and the big cages (site I') as the preferred positions for adsorption, while site I, near the copper atoms, remains empty over the entire range of pressures analyzed due to its reduced accessibility. The occupation of the different sites for ethane and propane in Cu-BTC proceeds similarly as for methane, and shows small differences for O2 and N2 that can be attributed to the quadrupole moment of these molecules. Site II is filled predominantly for methane (the nonpolar molecule), whereas for N2, the occupation of II and I' can be considered almost equivalent. The molecular sitting for O2 shows an intermediate behavior between those observed for methane and for N2. The differences between simulated and experimental data at elevated temperatures for propane are tentatively attributed to a reversible change in the lattice parameters of Cu-BTC by dehydration and by temperature, blocking the accessibility to site III and reducing that to site I'. Adsorption parameters of the investigated molecules have been determined from the simulations.
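At low loading, the site preferences described above follow from Boltzmann weighting of the site binding energies, with inaccessible sites excluded. The sketch below uses invented energies purely for illustration (they are not the paper's fitted Cu-BTC values); it shows why a strongly binding but blocked site I stays empty:

```python
import math

# Hypothetical binding energies (kJ/mol; more negative = stronger binding) for
# the four Cu-BTC site types. Values are illustrative only.
sites = {"I": -20.0, "I'": -16.0, "II": -18.0, "III": -14.0}
accessible = {"I": False, "I'": True, "II": True, "III": True}  # site I blocked

def low_loading_occupancy(sites, accessible, T=298.0):
    """Boltzmann-weighted site occupancy in the low-pressure (Henry) regime."""
    R = 8.314e-3  # gas constant, kJ/(mol K)
    w = {s: (math.exp(-E / (R * T)) if accessible[s] else 0.0)
         for s, E in sites.items()}
    Z = sum(w.values())
    return {s: v / Z for s, v in w.items()}

occ = low_loading_occupancy(sites, accessible)
print(occ)  # site II dominates among accessible sites; site I stays empty
```

A full treatment (as in the paper) requires grand-canonical Monte Carlo with explicit guest-host interactions, but the accessibility-weighted Boltzmann picture captures the qualitative site ordering.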
JSEM: A Framework for Identifying and Evaluating Indicators.
ERIC Educational Resources Information Center
Hyman, Jeffrey B.; Leibowitz, Scott G.
2001-01-01
Presents an approach to identifying and evaluating combinations of indicators when the mathematical relationships between the indicators and an endpoint may not be quantified, a limitation common to many ecological assessments. Uses the framework of Structural Equation Modeling (SEM), which combines path analysis with a measurement model, to…
NASA Astrophysics Data System (ADS)
Rovinelli, Andrea; Guilhem, Yoann; Proudhon, Henry; Lebensohn, Ricardo A.; Ludwig, Wolfgang; Sangid, Michael D.
2017-06-01
Microstructurally small cracks exhibit large variability in their fatigue crack growth rate. It is accepted that the inherent variability in microstructural features is related to the uncertainty in the growth rate. However, due to (i) the lack of cycle-by-cycle experimental data, (ii) the complexity of the short crack growth phenomenon, and (iii) the incomplete physics of constitutive relationships, only empirical damage metrics have been postulated to describe the short crack driving force metric (SCDFM) at the mesoscale level. The identification of the SCDFM of polycrystalline engineering alloys is a critical need in order to achieve more reliable fatigue life prediction and improve material design. In this work, the first steps in the development of a general probabilistic framework are presented, which uses experimental results as input, retrieves missing experimental data through crystal plasticity (CP) simulations, and extracts correlations utilizing machine learning and Bayesian networks (BNs). More precisely, experimental results representing cycle-by-cycle data of a short crack growing through a beta-metastable titanium alloy, VST-55531, have been acquired via phase and diffraction contrast tomography. These results serve as input for FFT-based CP simulations, which provide the micromechanical fields influenced by the presence of the crack, complementing the information available from the experiment. In order to assess the correlation between postulated SCDFMs and experimental observations, the data is mined and analyzed utilizing BNs. Results show the ability of the framework to autonomously capture relevant correlations and the equivalence in the prediction capability of different postulated SCDFMs for the high cycle fatigue regime.
NASA Astrophysics Data System (ADS)
Sellami, Takwa; Jelassi, Sana; Darcherif, Abdel Moumen; Berriri, Hanen; Mimouni, Med Faouzi
2018-04-01
With the advancement of wind turbines towards complex structures, the requirement for trustworthy structural models has become more apparent. Hence, the vibration characteristics of the wind turbine components, like the blades and the tower, have to be extracted under vibration constraints. Although extracting the modal properties of blades is a simple task, calculating precise modal data for the whole wind turbine coupled to its tower/foundation is still a perplexing task. In this framework, this paper focuses on the structural modeling approach for modern commercial micro-turbines. Thus, the structural model of a complex commercial wind turbine, the Rutland 504, is established based on both experimental and numerical methods. A three-dimensional (3-D) numerical model of the structure was set up based on the finite volume method (FVM) using the academic finite element analysis software ANSYS. To validate the created model, experimental vibration tests were carried out using the vibration test system of the TREVISE platform at ECAM-EPMI. The tests were based on the experimental modal analysis (EMA) technique, one of the most efficient techniques for identifying structural parameters. Indeed, the poles and residues of the frequency response functions (FRF) between input and output spectra were calculated to extract the mode shapes and the natural frequencies of the structure. Based on the obtained modal parameters, the numerical model was updated.
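The core EMA step above — locating a natural frequency from an FRF peak — can be illustrated with a single-degree-of-freedom system. The mass, stiffness, and damping values below are arbitrary toy numbers, not the Rutland 504's parameters:

```python
import numpy as np

# Single-degree-of-freedom sketch: the FRF peak locates a natural frequency.
m, k, c = 2.0, 8000.0, 4.0                  # mass (kg), stiffness (N/m), damping (N s/m)
fn_theory = np.sqrt(k / m) / (2 * np.pi)    # undamped natural frequency (Hz)

f = np.linspace(1.0, 30.0, 2000)            # frequency sweep
w = 2 * np.pi * f
H = 1.0 / (k - m * w**2 + 1j * c * w)       # receptance FRF X/F

fn_measured = f[np.argmax(np.abs(H))]       # pick the resonance peak
print(fn_theory, fn_measured)               # the two agree for light damping
```

Real EMA fits poles and residues to multi-channel FRFs (handling closely spaced modes and noise), but peak-picking on a lightly damped FRF conveys the principle.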
A review of event processing frameworks used in HEP
Sexton-Kennedy, E.
2015-12-23
Today there are many different experimental event processing frameworks in use by running, or about to be running, experiments. This talk will discuss the different components of these frameworks. In the past there have been attempts at shared framework projects, for example the collaborations on the BaBar framework (between BaBar, CDF, and CLEO), on the Gaudi framework (between LHCb and ATLAS), on AliROOT/FairROOT (between Alice and GSI/Fair), and in some ways on art (Fermilab-based experiments) and CMS' framework. However, for reasons that will be discussed, these collaborations did not result in common frameworks shared among the intended experiments. Importantly, though, two of the resulting projects have succeeded in providing frameworks that are shared among many customer experiments: Fermilab's art framework and GSI/Fair's FairROOT. Interestingly, several projects are considering remerging their frameworks after many years apart. I'll report on an investigation and analysis of these realities. In addition, with the advent of the need for multi-threaded frameworks and the scarcity of available manpower, it is important to collaborate in the future; it is equally important to understand why previous attempts at multi-experiment frameworks did or did not work.
ERIC Educational Resources Information Center
Page, Lindsay C.
2012-01-01
Experimental evaluations are increasingly common in the U.S. educational policy-research context. Often, in investigations of multifaceted interventions, researchers and policymakers alike are interested in not only "whether" a given intervention impacted an outcome but also "why". What "features" of the intervention…
Alternative Approaches to Introductory Economics.
ERIC Educational Resources Information Center
Bonello, Frank J.; And Others
This document examines the educational output of three alternative approaches to introductory macroeconomics at the University of Notre Dame. The framework for evaluation consists of the cognitive and affective tradeoffs entailed by using a new experimental course as opposed to two more traditional courses. The experimental course is a freshman…
NASA Astrophysics Data System (ADS)
Maiti, Santanu K.
2014-07-01
The cosine-squared dependence of electronic conductance in a biphenyl molecule, obtained experimentally by Venkataraman et al. [1], is verified theoretically within a tight-binding framework. Using the Green's function formalism we numerically calculate the two-terminal conductance as a function of the relative twist angle between the molecular rings and find that the results are in good agreement with the experimental observation.
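The cos²θ law can be reproduced with a drastically reduced model: one site per phenyl ring, inter-ring hopping t₀·cosθ, and wide-band leads. This two-site NEGF sketch is an assumption-laden caricature of the paper's full molecular tight-binding calculation, but it shows how transmission tracks cos²θ when the hopping is weak compared to the lead coupling:

```python
import numpy as np

def transmission(theta_deg, t0=0.1, gamma=1.0, E=0.0):
    """Two-site NEGF toy model of biphenyl: inter-ring hopping t0*cos(theta),
    wide-band lead self-energies -i*gamma/2 on each site."""
    t = t0 * np.cos(np.radians(theta_deg))
    H = np.array([[0.0, t], [t, 0.0]])
    Sigma = np.diag([-0.5j * gamma, -0.5j * gamma])   # lead self-energies
    G = np.linalg.inv(E * np.eye(2) - H - Sigma)      # retarded Green's function
    return gamma * gamma * abs(G[0, 1]) ** 2          # T = Gamma_L Gamma_R |G_12|^2

ratios = [transmission(th) / transmission(0.0) for th in (0, 30, 60)]
print(ratios)  # approaches [1, cos^2(30°), cos^2(60°)] = [1, 0.75, 0.25] as t0/gamma -> 0
```

The low-bias conductance is proportional to the transmission at the Fermi energy, so the ratio T(θ)/T(0) directly gives the measured conductance ratio.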
Paolo Benettin; Scott W. Bailey; John L. Campbell; Mark B. Green; Andrea Rinaldo; Gene E. Likens; Kevin J. McGuire; Gianluca Botter
2015-01-01
We combine experimental and modeling results from a headwater catchment at the Hubbard Brook Experimental Forest (HBEF), New Hampshire, USA, to explore the link between stream solute dynamics and water age. A theoretical framework based on water age dynamics, which represents a general basis for characterizing solute transport at the catchment scale, is here applied to...
US Army Research Laboratory Visualization Framework Architecture Document
2018-01-11
Visualization of network science experimentation results is generally achieved using stovepipe... This report documents the ARL Visualization Framework system design and specific details of its implementation. Subject terms: visualization.
ERIC Educational Resources Information Center
Blikstein, Paulo; Fuhrmann, Tamar; Salehi, Shima
2016-01-01
In this paper, we investigate an approach to supporting students' learning in science through a combination of physical experimentation and virtual modeling. We present a study that utilizes a scientific inquiry framework, which we call "bifocal modeling," to link student-designed experiments and computer models in real time. In this…
The operant-respondent distinction: Future directions
Pear, Joseph J.; Eldridge, Gloria D.
1984-01-01
The operant-respondent distinction has provided a major organizing framework for the data generated through the experimental analysis of behavior. Problems have been encountered, however, in using it as an explanatory concept for such phenomena as avoidance and conditioned suppression. Data now exist that do not fit neatly into the framework. Moreover, the discovery of autoshaping has highlighted difficulties in isolating the two types of behavior and conditioning. Despite these problems, the operant-respondent framework remains the most successful paradigm currently available for organizing behavioral data. Research and theoretical efforts should therefore probably be directed to modifying the framework to account for disparate data. PMID:16812402
Sun, Yongmei; Li, Xing; Wu, Di; Pan, Qi; Ji, Yuefeng; Ren, Hong; Ding, Keyue
2016-01-01
RNA editing is one of the post- or co-transcriptional processes that can lead to amino acid substitutions in protein sequences, alternative pre-mRNA splicing, and changes in gene expression levels. Although several methods have been suggested to identify RNA editing sites, challenges remain in distinguishing true RNA editing sites from their genomic counterparts and from technical artifacts. In addition, a software framework to identify and visualize potential RNA editing sites has been lacking. Here, we present 'RED' (RNA Editing sites Detector), a software tool that identifies RNA editing sites by integrating multiple rule-based and statistical filters. The potential RNA editing sites can be visualized at the genome and site levels through a graphical user interface (GUI). To improve performance, we used the MySQL database management system (DBMS) for high-throughput data storage and querying. We demonstrated the validity and utility of RED by identifying the presence and absence of experimentally validated C→U RNA-editing sites, in comparison with REDItools, a command-line tool for high-throughput investigation of RNA editing. In an analysis of a sample dataset with 28 experimentally validated C→U RNA editing sites, RED had a sensitivity of 0.64 and a specificity of 0.5. In comparison, REDItools had better sensitivity (0.75) but similar specificity (0.5). RED is an easy-to-use, platform-independent Java-based software package, and can be applied to RNA-seq data with or without DNA sequencing data. The package is freely available under the GPLv3 license at http://github.com/REDetector/RED or https://sourceforge.net/projects/redetector. PMID:26930599
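The rule-based filtering idea behind RED can be sketched as follows. The field names and thresholds here are hypothetical illustrations (not RED's actual schema or defaults): keep only C→T mismatches (C→U in RNA) with adequate coverage and editing fraction that are not known genomic SNPs.

```python
# Hypothetical candidate sites from an RNA-seq pileup; RED-style rule filters.
candidates = [
    {"pos": 101, "ref": "C", "alt": "T", "depth": 40, "alt_frac": 0.30, "known_snp": False},
    {"pos": 202, "ref": "C", "alt": "T", "depth": 5,  "alt_frac": 0.40, "known_snp": False},  # low coverage
    {"pos": 303, "ref": "A", "alt": "G", "depth": 50, "alt_frac": 0.50, "known_snp": False},  # wrong editing type
    {"pos": 404, "ref": "C", "alt": "T", "depth": 60, "alt_frac": 0.45, "known_snp": True},   # genomic variant
]

def c_to_u_filter(sites, min_depth=10, min_frac=0.1):
    """Keep C->T mismatches with adequate coverage that are not known SNPs."""
    return [s["pos"] for s in sites
            if s["ref"] == "C" and s["alt"] == "T"
            and s["depth"] >= min_depth
            and s["alt_frac"] >= min_frac
            and not s["known_snp"]]

print(c_to_u_filter(candidates))  # [101]
```

RED layers statistical filters (and, when available, matched DNA sequencing) on top of such rules to further suppress genomic variants and artifacts.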
BEACON: A Summary Framework to Overcome Potential Reimbursement Hurdles.
Dunlop, William C N; Mullins, C Daniel; Pirk, Olaf; Goeree, Ron; Postma, Maarten J; Enstone, Ashley; Heron, Louise
2016-10-01
To provide a framework for addressing payers' criteria during the development of pharmaceuticals. A conceptual framework was presented to an international health economic expert panel for discussion. A structured literature search (from 2010 to May 2015) using the following databases: Medline® and Medline® In-Process (Ovid), Embase (Ovid), EconLit (EBSCOhost) and the National Health Service Economic Evaluation Database (NHS EED), together with a 'grey literature' search, was conducted to identify existing criteria from the payer perspective. The criteria assessed by existing frameworks and guidelines were collated; the most commonly reported criteria were considered for inclusion in the framework. A mnemonic was conceived as a memory aide to summarise these criteria. Overall, 41 publications were identified as potentially relevant to the objective. Following further screening, 26 were excluded upon full-text review on the basis of no framework presented (n = 13), redundancy (n = 11) or abstract only (n = 2). Frameworks that captured criteria developed for or utilised by the pharmaceutical industry (n = 5) and reimbursement guidance (n = 10) were reviewed. The most commonly identified criteria - unmet need/patient burden, safety, efficacy, quality-of-life outcomes, environment, evidence quality, budget impact and comparator - were incorporated into the summary framework. For ease of communication, the following mnemonic was developed: BEACON (Burden/target population, Environment, Affordability/value, Comparator, Outcomes, Number of studies/quality of evidence). The BEACON framework aims to capture the 'essence' of payer requirements by addressing the most commonly described criteria requested by payers regarding the introduction of a new pharmaceutical.
AQUATIC STRESSORS: FRAMEWORK AND IMPLEMENTATION PLAN FOR EFFECTS RESEARCH
This document describes the framework and research implementation plans for ecological effects research on aquatic stressors within the National Health and Environmental Effects Laboratory. The context for the research identified within the framework is the common management goal...
Martyniuk, Christopher J
2018-04-01
Environmental science has benefited a great deal from omics-based technologies. High-throughput toxicology has defined adverse outcome pathways (AOPs), prioritized chemicals of concern, and identified novel actions of environmental chemicals. While many of these approaches are conducted under rigorous laboratory conditions, a significant challenge has been the interpretation of omics data in "real-world" exposure scenarios. Lack of clarity in the interpretation of these data limits their use in environmental monitoring programs. In recent years, one overarching objective shared by many researchers has been to address fundamental questions concerning experimental design and the robustness of data collected under the broad umbrella of environmental genomics. These questions include: (1) the likelihood that molecular profiles return to a predefined baseline level following remediation efforts, (2) how reference site selection in an urban environment influences the interpretation of omics data, and (3) which species is most appropriate to monitor in the environment from an omics point of view. In addition, inter-genomics studies have been conducted to assess transcriptome reproducibility in toxicology studies. One lesson learned from inter-genomics studies is that there are core molecular networks that can be identified by multiple laboratories using the same platform. This supports the idea that "omics networks" defined a priori may be a viable approach moving forward for evaluating environmental impacts over time. Both spatial and temporal variability in ecosystem structure is expected to influence molecular responses to environmental stressors, and it is important to recognize how these variables, as well as individual factors (i.e., sex, age, maturation), may confound the interpretation of network responses to chemicals.
This mini-review synthesizes the progress made towards adopting these tools into environmental monitoring and identifies future challenges to be addressed, as we move into the next era of high throughput sequencing. A conceptual framework for validating and incorporating molecular networks into environmental monitoring programs is proposed. As AOPs become more defined and their potential in environmental monitoring assessments becomes more recognized, the AOP framework may prove to be the conduit between omics and penultimate ecological responses for environmental risk assessments. Copyright © 2018 Elsevier B.V. All rights reserved.
Matrix Dominated Failure of Fiber-Reinforced Composite Laminates Under Static and Dynamic Loading
NASA Astrophysics Data System (ADS)
Schaefer, Joseph Daniel
Hierarchical material systems provide the unique opportunity to connect material knowledge to solving specific design challenges. Representing the fastest-growing class of hierarchical materials in use, fiber-reinforced polymer composites (FRPCs) offer superior strength- and stiffness-to-weight ratios, damage tolerance, and decreasing production costs compared to metals and alloys. However, the implementation of FRPCs has historically been hampered by inadequate knowledge of material failure behavior, owing to incomplete verification of recent computational constitutive models and improper (or non-existent) experimental validation, which has severely slowed development. As noted by the recent Materials Genome Initiative and the Worldwide Failure Exercise, current state-of-the-art qualification programs endure a 20-year gap between material conceptualization and implementation due to the lack of effective partnership between computational coding (simulation) and experimental characterization. Qualification processes are primarily experiment-driven; the anisotropic nature of composites predisposes matrix-dominant properties to be sensitive to strain rate, which necessitates extensive testing. To decrease the qualification time, a framework that practically combines theoretical prediction of material failure with limited experimental validation is required. In this work, the Northwestern Failure Theory (NU Theory) for composite lamina is presented as the theoretical basis from which the failure of unidirectional and multidirectional composite laminates is investigated. From an initial experimental characterization of basic lamina properties, the NU Theory is employed to predict the matrix-dependent failure of composites under any state of biaxial stress at strain rates from quasi-static to 1000 s⁻¹.
It was found that the number of experiments required to characterize the strain-rate-dependent failure of a new composite material was reduced by an order of magnitude, and the resulting strain-rate dependence was applicable to a large class of materials. The presented framework provides engineers with the capability to quickly identify fiber and matrix combinations for a given application and determine the failure behavior over the range of practical loading cases. The failure-mode-based NU Theory may be especially useful when partnered with computational approaches (which often employ micromechanics to determine constituent and constitutive response) to provide accurate validation of the matrix-dominated failure modes experienced by laminates during progressive failure.
Contextual Advantage for State Discrimination
NASA Astrophysics Data System (ADS)
Schmid, David; Spekkens, Robert W.
2018-02-01
Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum-error state discrimination. Namely, we identify quantitative limits on the success probability for minimum-error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios and demonstrate a tight connection between our minimum-error state discrimination scenario and a Bell scenario.
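The quantum side of the state-discrimination scenario above is the standard Helstrom (trace-norm) bound on the minimum-error success probability; the paper's noncontextuality inequalities bound what any noncontextual model can achieve below this. The sketch below computes only the quantum optimum for two pure qubit states (the state choice is an arbitrary example):

```python
import numpy as np

def helstrom_success(psi0, psi1, p0=0.5):
    """Optimal quantum success probability for minimum-error discrimination of
    two pure states: 1/2 (1 + || p0*rho0 - p1*rho1 ||_1), the Helstrom bound."""
    rho0 = np.outer(psi0, psi0.conj())
    rho1 = np.outer(psi1, psi1.conj())
    gamma = p0 * rho0 - (1 - p0) * rho1
    eig = np.linalg.eigvalsh(gamma)          # Hermitian eigenvalues
    return 0.5 * (1.0 + np.abs(eig).sum())   # trace norm = sum of |eigenvalues|

psi0 = np.array([1.0, 0.0])
theta = np.pi / 4
psi1 = np.array([np.cos(theta), np.sin(theta)])  # overlap <psi0|psi1> = cos(theta)
print(helstrom_success(psi0, psi1))  # = 1/2 (1 + sin(theta)) for equal priors
```

For equal priors and pure states with overlap c, this reduces to ½(1 + √(1 − c²)); a noncontextual model's success probability is strictly smaller, and that gap is the operational quantum advantage the paper identifies.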
Mifsud, Borbala; Martincorena, Inigo; Darbo, Elodie; Sugar, Robert; Schoenfelder, Stefan; Fraser, Peter; Luscombe, Nicholas M
2017-01-01
Hi-C is one of the main methods for investigating spatial co-localisation of DNA in the nucleus. However, the raw sequencing data obtained from Hi-C experiments suffer from large biases and spurious contacts, making it difficult to identify true interactions. Existing methods use complex models to account for biases and do not provide a significance threshold for detecting interactions. Here we introduce a simple binomial probabilistic model that resolves complex biases and distinguishes between true and false interactions. The model corrects biases of known and unknown origin and yields a p-value for each interaction, providing a reliable threshold based on significance. We demonstrate this experimentally by testing the method against a random ligation dataset. Our method outperforms previous methods and provides a statistical framework for further data analysis, such as comparisons of Hi-C interactions between different conditions. GOTHiC is available as a BioConductor package (http://www.bioconductor.org/packages/release/bioc/html/GOTHiC.html).
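The core of the approach described above is a per-pair binomial test: given the total read count and an expected random-ligation probability for a fragment pair, the p-value is the binomial upper tail at the observed count. A minimal sketch follows; the bias estimate used here (the product of the two fragments' relative coverages) and the function names are simplifying assumptions of this sketch, not GOTHiC's exact normalisation.

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def interaction_pvalue(observed, total_reads, cov_a, cov_b):
    """p-value that fragments a and b interact more often than chance.

    cov_a, cov_b: each fragment's relative coverage (fraction of all reads),
    used here as a stand-in bias estimate -- an assumption of this sketch.
    """
    p_random = cov_a * cov_b  # chance a random ligation joins a and b
    return binom_sf(observed, total_reads, p_random)
```

A pair whose observed count far exceeds `total_reads * cov_a * cov_b` then gets a small p-value and survives a significance threshold.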
Deciphering microbial interactions and detecting keystone species with co-occurrence networks.
Berry, David; Widder, Stefanie
2014-01-01
Co-occurrence networks produced from microbial survey sequencing data are frequently used to identify interactions between community members. While this approach has potential to reveal ecological processes, it has been insufficiently validated due to the technical limitations inherent in studying complex microbial ecosystems. Here, we simulate multi-species microbial communities with known interaction patterns using generalized Lotka-Volterra dynamics. We then construct co-occurrence networks and evaluate how well networks reveal the underlying interactions and how experimental and ecological parameters can affect network inference and interpretation. We find that co-occurrence networks can recapitulate interaction networks under certain conditions, but that they lose interpretability when the effects of habitat filtering become significant. We demonstrate that networks suffer from local hot spots of spurious correlation in the neighborhood of hub species that engage in many interactions. We also identify topological features associated with keystone species in co-occurrence networks. This study provides a substantiated framework to guide environmental microbiologists in the construction and interpretation of co-occurrence networks from microbial survey datasets.
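The simulation-then-inference pipeline described above can be sketched compactly: integrate generalized Lotka-Volterra dynamics for communities with known interactions, then correlate species abundances across samples and keep strongly correlated pairs as co-occurrence edges. Everything below (forward-Euler integration, the 0.6 Pearson threshold, the function names) is an illustrative assumption, not the authors' actual pipeline.

```python
def glv_equilibrium(r, A, x0, dt=0.01, steps=5000):
    """Euler-integrate generalized Lotka-Volterra dynamics toward steady state:
    dx_i/dt = x_i * (r_i + sum_j A[i][j] * x_j)
    """
    x = list(x0)
    n = len(x)
    for _ in range(steps):
        xs = list(x)  # synchronous update from a snapshot
        for i in range(n):
            growth = r[i] + sum(A[i][j] * xs[j] for j in range(n))
            x[i] = max(0.0, xs[i] + dt * xs[i] * growth)
    return x

def pearson(u, v):
    """Pearson correlation of two equal-length sequences (0.0 if degenerate)."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    return cov / (su * sv) if su > 0 and sv > 0 else 0.0

def cooccurrence_edges(samples, threshold=0.6):
    """Build co-occurrence edges from per-sample abundances (rows = samples)."""
    n_species = len(samples[0])
    cols = [[row[i] for row in samples] for i in range(n_species)]
    edges = []
    for i in range(n_species):
        for j in range(i + 1, n_species):
            rho = pearson(cols[i], cols[j])
            if abs(rho) >= threshold:
                edges.append((i, j, rho))
    return edges
```

Comparing the recovered edge list against the interaction matrix `A` that generated the data is the kind of evaluation the study performs.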
Rafehi, Haloom; Kaspi, Antony; Ziemann, Mark; Okabe, Jun; Karagiannis, Tom C; El-Osta, Assam
2017-01-01
Given the skyrocketing costs to develop new drugs, repositioning of approved drugs, such as histone deacetylase (HDAC) inhibitors, may be a promising strategy to develop novel therapies. However, a gap exists in the understanding and advancement of these agents to meaningful translation for which new indications may emerge. To address this, we performed systems-level analyses of 33 independent HDAC inhibitor microarray studies. Based on network analysis, we identified enrichment for pathways implicated in metabolic syndrome and diabetes (insulin receptor signaling, lipid metabolism, immunity and trafficking). Integration with ENCODE ChIP-seq datasets identified suppression of EP300 target genes implicated in diabetes. Experimental validation indicates reversal of diabetes-associated EP300 target genes in primary vascular endothelial cells derived from a diabetic individual following inhibition of HDACs (by SAHA), EP300, or EP300 knockdown. Our computational systems biology approach provides an adaptable framework for the prediction of novel therapeutics for existing disease.
Evaluating E-Training for Public Library Staff: A Quasi-Experimental Investigation
ERIC Educational Resources Information Center
Dalston, Teresa
2009-01-01
A comparative evaluation framework of instructional interventions for implementation of online training for public library staff would enable a better understanding of how to improve the effectiveness, efficiency and efficacy of training in certain training environments. This dissertation describes a quasi-experimental study of a two-week,…
Henderson, Rebecca J; Johnson, Andrew; Moodie, Sheila
2014-12-01
Parent-to-parent support for parents with children who are deaf or hard of hearing (D/HH) is identified as an important component of Early Hearing Detection and Intervention (EHDI) programs for children with hearing loss. The specific aim of this review was to identify the constructs and components of parent-to-parent support for parents of children who are D/HH. An extensive scoping literature review identified 39 peer-reviewed articles published from 2000 to 2014. Studies were selected and reviewed based on standardized procedures. Data were identified, extracted, and organized into libraries of thematic and descriptive content. A conceptual framework of parent-to-parent support for parents of children who are D/HH was developed and presented in a comprehensive, bidirectional informational graphic. The constructs and components of the conceptual framework are (a) well-being: parent, family, and child; (b) knowledge: advocacy, system navigation, and education; and (c) empowerment: confidence and competence. The findings from this scoping review led to the development of a structured conceptual framework of parent-to-parent support for parents of children who are D/HH. The conceptual framework provides an important opportunity to explore and clearly define the vital contribution of parents in EHDI programs.
Schiller, Claire; Winters, Meghan; Hanson, Heather M; Ashe, Maureen C
2013-05-02
Stakeholders, as originally defined in theory, are groups or individuals who can affect or are affected by an issue. Stakeholders are an important source of information in health research, providing critical perspectives and new insights on the complex determinants of health. The intersection of built and social environments with older adult mobility is an area of research that is fundamentally interdisciplinary and would benefit from a better understanding of stakeholder perspectives. Although a rich body of literature surrounds stakeholder theory, a systematic process for identifying health stakeholders in practice does not exist. This paper presents a framework of stakeholders related to older adult mobility and the built environment, and further outlines a process for systematically identifying stakeholders that can be applied in other health contexts, with a particular emphasis on concept mapping research. Informed by gaps in the relevant literature, we developed a framework for identifying and categorizing health stakeholders. The framework was created through a novel iterative process of stakeholder identification and categorization. The development entailed a literature search to identify stakeholder categories, representation of identified stakeholders in a visual chart, and correspondence with expert informants to obtain practice-based insight. The three-step, iterative creation process progressed from identifying stakeholder categories to identifying specific stakeholder groups and soliciting feedback from expert informants. The result was a stakeholder framework comprising seven categories with detailed sub-groups. The main categories of stakeholders were: (1) the public, (2) policy makers and governments, (3) the research community, (4) practitioners and professionals, (5) health and social service providers, (6) civil society organizations, and (7) private business.
Stakeholders related to older adult mobility and the built environment span many disciplines and realms of practice. Researchers studying this issue may use the detailed stakeholder framework process we present to identify participants for future projects. Health researchers pursuing stakeholder-based projects in other contexts are encouraged to incorporate this process of stakeholder identification and categorization to ensure systematic consideration of relevant perspectives in their work.
2013-01-01
PMID:23639179
Dynamics of person-to-person interactions from distributed RFID sensor networks.
Cattuto, Ciro; Van den Broeck, Wouter; Barrat, Alain; Colizza, Vittoria; Pinton, Jean-François; Vespignani, Alessandro
2010-07-15
Digital networks, mobile devices, and the possibility of mining the ever-increasing amount of digital traces that we leave behind in our daily activities are changing the way we can approach the study of human and social interactions. Large-scale datasets, however, are mostly available for collective and statistical behaviors, at coarse granularities, while high-resolution data on person-to-person interactions are generally limited to relatively small groups of individuals. Here we present a scalable experimental framework for gathering real-time data resolving face-to-face social interactions with tunable spatial and temporal granularities. We use active Radio Frequency Identification (RFID) devices that assess mutual proximity in a distributed fashion by exchanging low-power radio packets. We analyze the dynamics of person-to-person interaction networks obtained in three high-resolution experiments carried out at different orders of magnitude in community size. The data sets exhibit common statistical properties and lack of a characteristic time scale from 20 seconds to several hours. The association between the number of connections and their duration shows an interesting super-linear behavior, which indicates the possibility of defining super-connectors both in the number and intensity of connections. Taking advantage of scalability and resolution, this experimental framework allows the monitoring of social interactions, uncovering similarities in the way individuals interact in different contexts, and identifying patterns of super-connector behavior in the community. These results could impact our understanding of all phenomena driven by face-to-face interactions, such as the spreading of transmissible infectious diseases and information.
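A basic step in analyses like this is collapsing the stream of timestamped proximity packets into contact events with durations. The sketch below merges consecutive packets for a pair when they arrive at most `gap` seconds apart; the 20-second default echoes the shortest time scale mentioned above, but the merging rule and the function name are assumptions of this sketch, not the authors' published processing.

```python
def contacts_from_packets(packets, gap=20):
    """Collapse timestamped proximity packets (t, a, b) into contact events.

    Consecutive packets for the same (unordered) pair are merged into one
    contact if they are at most `gap` seconds apart (a tunable assumption).
    Returns {(a, b): [(start, end), ...]} with pairs sorted lexicographically.
    """
    events = {}
    last = {}
    for t, a, b in sorted(packets):
        pair = (min(a, b), max(a, b))
        if pair in last and t - last[pair] <= gap:
            start, _ = events[pair][-1]
            events[pair][-1] = (start, t)  # extend the open contact
        else:
            events.setdefault(pair, []).append((t, t))  # open a new contact
        last[pair] = t
    return events
```

Histogramming the resulting `end - start` durations is what reveals the broad, scale-free distribution the abstract describes.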
Molecular simulations of self-assembly processes in metal-organic frameworks: Model dependence
NASA Astrophysics Data System (ADS)
Biswal, Debasmita; Kusalik, Peter G.
2017-07-01
Molecular simulation is a powerful tool for investigating microscopic behavior in various chemical systems, where the use of suitable models is critical to successfully reproduce the structural and dynamic properties of the real systems of interest. In this context, molecular dynamics simulation studies of self-assembly processes in metal-organic frameworks (MOFs), a well-known class of porous materials with interesting chemical and physical properties, are relatively challenging, where a reasonably accurate representation of metal-ligand interactions is anticipated to play an important role. In the current study, we both investigate the performance of some existing models and introduce and test new models to help explore the self-assembly in an archetypal Zn-carboxylate MOF system. To this end, the behavior of six different Zn-ion models, three solvent models, and two ligand models was examined and validated against key experimental structural parameters. To explore longer time scale ordering events during MOF self-assembly via explicit solvent simulations, it is necessary to identify a suitable combination of simplified model components representing metal ions, organic ligands, and solvent molecules. It was observed that an extended cationic dummy atom (ECDA) Zn-ion model combined with an all-atom carboxylate ligand model and a simple dipolar solvent model can reproduce characteristic experimental structures for the archetypal MOF system. The successful use of these models in extensive sets of molecular simulations, which provide key insights into the self-assembly mechanism of this archetypal MOF system occurring during the early stages of this process, has been very recently reported.
Mutation Detection with Next-Generation Resequencing through a Mediator Genome
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wurtzel, Omri; Dori-Bachash, Mally; Pietrokovski, Shmuel
2010-12-31
The affordability of next generation sequencing (NGS) is transforming the field of mutation analysis in bacteria. The genetic basis for phenotype alteration can be identified directly by sequencing the entire genome of the mutant and comparing it to the wild-type (WT) genome, thus identifying acquired mutations. A major limitation for this approach is the need for an a priori sequenced reference genome for the WT organism, as the short reads of most current NGS approaches usually prohibit de novo genome assembly. To overcome this limitation we propose a general framework that utilizes the genomes of related organisms as mediators for comparing WT and mutant bacteria. Under this framework, both mutant and WT genomes are sequenced with NGS, and the short sequencing reads are mapped to the mediator genome. Variations between the mutant and the mediator that recur in the WT are ignored, thus pinpointing the differences between the mutant and the WT. To validate this approach we sequenced the genome of Bdellovibrio bacteriovorus 109J, an obligatory bacterial predator, and its prey-independent mutant, and compared both to the mediator species Bdellovibrio bacteriovorus HD100. Although the mutant and the mediator sequences differed in more than 28,000 nucleotide positions, our approach enabled pinpointing the single causative mutation. Experimental validation in 53 additional mutants further established the implicated gene. Our approach extends the applicability of NGS-based mutant analyses beyond the domain of available reference genomes.
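The filtering logic of the mediator-genome framework reduces to a set difference over variant calls: variants the mutant shows against the mediator, minus those the WT also shows against the mediator, leave the mutant-specific candidates. A minimal sketch, with a hypothetical function name and variant calls simplified to (position, alternate base) pairs:

```python
def causative_candidates(mutant_vs_mediator, wt_vs_mediator):
    """Candidate causative mutations via a mediator genome.

    Each argument is a set of variant calls (position, alt_base) obtained by
    mapping short reads to the mediator genome. Calls shared with the WT
    reflect WT/mediator divergence and are discarded; what remains separates
    the mutant from its own wild type.
    """
    return mutant_vs_mediator - wt_vs_mediator
```

In the Bdellovibrio case, this discard step removes the >28,000 WT/mediator differences and leaves the single causative change.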
Analysis and characterization of graphene-on-substrate devices
NASA Astrophysics Data System (ADS)
Berdebes, Dionisis
The purpose of this MS thesis is the analysis and characterization of graphene-on-substrate structures prepared at the Birck Nanotechnology Center, Purdue University, and the IBM Watson Research Center, N.Y., and characterized under low-field transport conditions. First, a literature survey of both theoretical and experimental work on graphene transport phenomena is conducted, and the open issues are reported. Next, the theory of low-field transport in graphene is reviewed within a Landauer framework. Experimental results for back-gated graphene-on-substrate devices, prepared by the Appenzeller group, are then presented, followed by the extraction of an energy- and temperature-dependent backscattering mean free path as the main characterization parameter. A key conclusion is the critical role of contacts in two-probe measurements. In this framework, a non-self-consistent non-equilibrium Green's function (NEGF) method is employed for the calculation of the odd and even metal-graphene ballistic interfacial resistance. Good agreement with the relevant experimental work is observed.
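The Landauer framework referenced above expresses two-terminal conductance as G = (2q²/h) ∫ M(E) T(E) (−∂f/∂E) dE, with M the number of modes and T the transmission. The numeric sketch below evaluates this textbook formula on a uniform energy grid; it is a generic illustration, not the thesis's extraction procedure, and the function and constant names are this sketch's own.

```python
from math import exp

Q = 1.602176634e-19   # elementary charge, C
H = 6.62607015e-34    # Planck constant, J*s
KT = 0.0259           # thermal energy at room temperature, eV

def landauer_conductance(modes, transmission, energies_eV, mu_eV=0.0):
    """Two-terminal conductance G = (2 q^2 / h) * sum_E M(E) T(E) (-df/dE) dE.

    modes, transmission: callables M(E), T(E); energies_eV: uniform grid (eV).
    """
    dE = energies_eV[1] - energies_eV[0]
    g = 0.0
    for E in energies_eV:
        x = (E - mu_eV) / KT
        # -df/dE for the Fermi function, in 1/eV
        dfdE = exp(x) / (KT * (1.0 + exp(x)) ** 2)
        g += modes(E) * transmission(E) * dfdE * dE
    return (2 * Q ** 2 / H) * g
```

With one fully transmitting mode (M = T = 1) the integral of −∂f/∂E is unity, so G reduces to the conductance quantum 2q²/h ≈ 77.5 μS.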
PsyGlass: Capitalizing on Google Glass for naturalistic data collection.
Paxton, Alexandra; Rodriguez, Kevin; Dale, Rick
2015-09-01
As commercial technology moves further into wearable technologies, cognitive and psychological scientists can capitalize on these devices to facilitate naturalistic research designs while still maintaining strong experimental control. One such wearable technology is Google Glass (Google, Inc.: www.google.com/glass), which can present wearers with audio and visual stimuli while tracking a host of multimodal data. In this article, we introduce PsyGlass, a framework for incorporating Google Glass into experimental work that is freely available for download and community improvement over time (www.github.com/a-paxton/PsyGlass). As a proof of concept, we use this framework to investigate dual-task pressures on naturalistic interaction. The preliminary study demonstrates how designs from classic experimental psychology may be integrated in naturalistic interactive designs with emerging technologies. We close with a series of recommendations for using PsyGlass and a discussion of how wearable technology more broadly may contribute to new or adapted naturalistic research designs.
Swept shock/boundary-layer interactions: Scaling laws, flowfield structure, and experimental methods
NASA Technical Reports Server (NTRS)
Settles, Gary S.
1993-01-01
A general review is given of several decades of research on the scaling laws and flowfield structures of swept shock wave/turbulent boundary layer interactions. Attention is further restricted to the experimental study and physical understanding of the steady-state aspects of these flows. The interaction produced by a sharp, upright fin mounted on a flat plate is taken as an archetype. An overall framework of quasiconical symmetry describing such interactions is first developed. Boundary-layer separation, the interaction footprint, Mach number scaling, and Reynolds number scaling are then considered, followed by a discussion of the quasiconical similarity of interactions produced by geometrically-dissimilar shock generators. The detailed structure of these interaction flowfields is next reviewed, and is illustrated by both qualitative visualizations and quantitative flow images in the quasiconical framework. Finally, the experimental techniques used to investigate such flows are reviewed, with emphasis on modern non-intrusive optical flow diagnostics.
Kawamoto, Kensaku; Hongsermeier, Tonya; Wright, Adam; Lewis, Janet; Bell, Douglas S; Middleton, Blackford
2013-01-01
To identify key principles for establishing a national clinical decision support (CDS) knowledge sharing framework. As part of an initiative by the US Office of the National Coordinator for Health IT (ONC) to establish a framework for national CDS knowledge sharing, key stakeholders were identified. Stakeholders' viewpoints were obtained through surveys and in-depth interviews, and findings and relevant insights were summarized. Based on these insights, key principles were formulated for establishing a national CDS knowledge sharing framework. Nineteen key stakeholders were recruited, including six executives from electronic health record system vendors, seven executives from knowledge content producers, three executives from healthcare provider organizations, and three additional experts in clinical informatics. Based on these stakeholders' insights, five key principles were identified for effectively sharing CDS knowledge nationally. These principles are (1) prioritize and support the creation and maintenance of a national CDS knowledge sharing framework; (2) facilitate the development of high-value content and tooling, preferably in an open-source manner; (3) accelerate the development or licensing of required, pragmatic standards; (4) acknowledge and address medicolegal liability concerns; and (5) establish a self-sustaining business model. Based on the principles identified, a roadmap for national CDS knowledge sharing was developed through the ONC's Advancing CDS initiative. The study findings may serve as a useful guide for ongoing activities by the ONC and others to establish a national framework for sharing CDS knowledge and improving clinical care.
ERIC Educational Resources Information Center
Li, Jie; Alagaraja, Meera
2007-01-01
The authors suggest a conceptual framework for developing corporate universities (CUs) in the Chinese organizational context. They reviewed the literature on existing conceptual frameworks and chose the CU wheel as proposed by Prince and Stewart. Four core processes identified in the CU wheel were realigned and readjusted in developing their framework of Corporate University in…
A framework for identifying carbon hotspots and forest management drivers
Nilesh Timilsina; Francisco J. Escobedo; Wendell P. Cropper; Amr Abd-Elrahman; Thomas Brandeis; Sonia Delphin; Samuel Lambert
2013-01-01
Spatial analyses of ecosystem system services that are directly relevant to both forest management decision making and conservation in the subtropics are rare. Also, frameworks that identify and map carbon stocks and corresponding forest management drivers using available regional, national, and international-level forest inventory datasets could provide insights into...
How Does Teacher Knowledge in Statistics Impact on Teacher Listening?
ERIC Educational Resources Information Center
Burgess, Tim
2012-01-01
For teaching statistics investigations at primary school level, teacher knowledge has been identified using a framework developed from a classroom based study. Through development of the framework, three types of teacher listening problems were identified, each of which had potential impact on the students' learning. The three types of problems…
Teaching Strategies in Online Discussion Board: A Framework in Higher Education
ERIC Educational Resources Information Center
Chou, Pao-Nan
2012-01-01
In order to promote meaningful learning in the online discussion board, this study attempted to identify key factors that affect the online discussion by a literature review and propose an innovative instructional framework to deal with those factors. Through the literature review, five factors were identified. An innovative instructional…
A Manual to Identify Sources of Fluvial Sediment
Sedimentation is one of the main causes of stream/river aquatic-life-use impairments in R3. Currently, states lack standard guidance on appropriate tools available to quantify sediment sources and develop sediment budgets in TMDL development. Methods for distinguishing sediment types for TMDL development will focus stream restoration and soil conservation efforts in strategic locations in a watershed and may better target appropriate BMPs to achieve sediment load reductions. Properly identifying sediment sources in a TMDL will also help focus NPDES permitting, stream restoration activities, and other TMDL implementation efforts. This project will focus on developing a framework, to be published as a guidance document, that outlines steps and approaches to identify the significant sources of fine-grained sediment in 303(d)-listed watersheds. In this framework, the sediment-fingerprinting and sediment-budget approaches will be emphasized.
Devine, Susan G; Muller, Reinhold; Carter, Anthony
2008-12-01
An exploratory descriptive study was undertaken to identify staff perceptions of the types and sources of occupational health and safety hazards at a remote fly-in-fly-out minerals extraction and processing plant in northwest Queensland. Ongoing focus groups with all sectors of the operation were conducted concurrently with quantitative research studies from 2001 to 2005. Action research processes were used with management and staff to develop responses to identified issues. Staff identified and generated solutions to the core themes of: health and safety policies and procedures; chemical exposures; hydration and fatigue. The Framework for Health Promotion Action was applied to ensure a comprehensive and holistic response to identified issues. Participatory processes using an action research framework enabled a deep understanding of staff perceptions of occupational health and safety hazards in this setting. The Framework for Health Promotion provided a relevant and useful tool to engage with staff and develop solutions to perceived occupational health and safety issues in the workplace.
Computations of Combustion-Powered Actuation for Dynamic Stall Suppression
NASA Technical Reports Server (NTRS)
Jee, Solkeun; Bowles, Patrick O.; Matalanis, Claude G.; Min, Byung-Young; Wake, Brian E.; Crittenden, Tom; Glezer, Ari
2016-01-01
A computational framework for the simulation of dynamic stall suppression with combustion-powered actuation (COMPACT) is validated against wind tunnel experimental results on a VR-12 airfoil. COMPACT slots are located at 10% chord from the leading edge of the airfoil and directed tangentially along the suction-side surface. Helicopter rotor-relevant flow conditions are used in the study. A computationally efficient two-dimensional approach, based on unsteady Reynolds-averaged Navier-Stokes (RANS), is compared in detail against the baseline and the modified airfoil with COMPACT, using aerodynamic forces, pressure profiles, and flow-field data. The two-dimensional RANS approach predicts baseline static and dynamic stall very well. Most of the differences between the computational and experimental results are within two standard deviations of the experimental data. The current framework demonstrates an ability to predict COMPACT efficacy across the experimental dataset. Enhanced aerodynamic lift on the downstroke of the pitching cycle due to COMPACT is well predicted, and the cycle-averaged lift enhancement computed is within 3% of the test data. Differences with experimental data are discussed with a focus on three-dimensional features not included in the simulations and the limited computational model for COMPACT.
Thomas, Emily; Murphy, Mary; Pitt, Rebecca; Rivers, Angela; Leavens, David A
2008-11-01
Povinelli, Bierschwale, and Cech (1999) reported that when tested on a visual attention task, the behavior of juvenile chimpanzees did not support a high-level understanding of visual attention. This study replicates their research using adult humans and aims to investigate the validity of their experimental design. Participants were trained to respond to pointing cues given by an experimenter, and then tested on their ability to locate hidden objects from visual cues. Povinelli et al.'s assertion that the generalization of pointing to gaze is indicative of a high-level framework was not supported by our findings: Training improved performance only on initial probe trials when the experimenter's gaze was not directed at the baited cup. Furthermore, participants performed above chance on such trials, the same result exhibited by chimpanzees and used as evidence by Povinelli et al. to support a low-level framework. These findings, together with the high performance of participants in an incongruent condition, in which the experimenter pointed to or gazed at an unbaited container, challenge the validity of their experimental design. (PsycINFO Database Record (c) 2008 APA, all rights reserved).
ERIC Educational Resources Information Center
Jang, Jeong-yoon; Hand, Brian
2017-01-01
This study investigated the value of using a scaffolded critique framework to promote two different types of writing--argumentative writing and explanatory writing--with different purposes within an argument-based inquiry approach known as the Science Writing Heuristic (SWH) approach. A quasi-experimental design with sixth and seventh grade…
ERIC Educational Resources Information Center
Goldstein, Howard; Lackey, Kimberly C.; Schneider, Naomi J. B.
2014-01-01
This review presents a novel framework for evaluating evidence based on a set of parallel criteria that can be applied to both group and single-subject experimental design (SSED) studies. The authors illustrate use of this evaluation system in a systematic review of 67 articles investigating social skills interventions for preschoolers with autism…
ERIC Educational Resources Information Center
Moon, Shannon
2017-01-01
In the absence of tools for intelligent tutoring systems for soaring flight simulation training, this study evaluated a framework foundation to measure pilot performance, affect, and physiological response to training in real-time. Volunteers were asked to perform a series of flight tasks selected from Federal Aviation Administration Practical…
Jacobs, Molly M.; Malloy, Timothy F.; Tickner, Joel A.; Edwards, Sally
2015-01-01
Background Given increasing pressures for hazardous chemical replacement, there is growing interest in alternatives assessment to avoid substituting a toxic chemical with another of equal or greater concern. Alternatives assessment is a process for identifying, comparing, and selecting safer alternatives to chemicals of concern (including those used in materials, processes, or technologies) on the basis of their hazards, performance, and economic viability. Objectives The purposes of this substantive review of alternatives assessment frameworks are to identify consistencies and differences in methods and to outline needs for research and collaboration to advance science policy practice. Methods This review compares methods used in six core components of these frameworks: hazard assessment, exposure characterization, life-cycle impacts, technical feasibility evaluation, economic feasibility assessment, and decision making. Alternatives assessment frameworks published from 1990 to 2014 were included. Results Twenty frameworks were reviewed. The frameworks were consistent in terms of general process steps, but some differences were identified in the end points addressed. Methodological gaps were identified in the exposure characterization, life-cycle assessment, and decision–analysis components. Methods for addressing data gaps remain an issue. Discussion Greater consistency in methods and evaluation metrics is needed but with sufficient flexibility to allow the process to be adapted to different decision contexts. Conclusion Although alternatives assessment is becoming an important science policy field, there is a need for increased cross-disciplinary collaboration to refine methodologies in support of the informed substitution and design of safer chemicals, materials, and products. Case studies can provide concrete lessons to improve alternatives assessment. Citation Jacobs MM, Malloy TF, Tickner JA, Edwards S. 2016. 
Alternatives assessment frameworks: research needs for the informed substitution of hazardous chemicals. Environ Health Perspect 124:265–280; http://dx.doi.org/10.1289/ehp.1409581 PMID:26339778
Wilmoth, Jared L.; Doak, Peter W.; Timm, Andrea; Halsted, Michelle; Anderson, John D.; Ginovart, Marta; Prats, Clara; Portell, Xavier; Retterer, Scott T.; Fuentes-Cabrera, Miguel
2018-01-01
The factors leading to changes in the organization of microbial assemblages at fine spatial scales are not well characterized or understood. However, they are expected to guide the succession of community development and function toward specific outcomes that could impact human health and the environment. In this study, we put forward a combined experimental and agent-based modeling framework and use it to interpret unique spatial organization patterns of H1-Type VI secretion system (T6SS) mutants of P. aeruginosa under spatial confinement. We find that key parameters, such as T6SS-mediated cell contact and lysis, spatial localization, relative species abundance, cell density and local concentrations of growth substrates and metabolites are influenced by spatial confinement. The model, written in the accessible programming language NetLogo, can be adapted to a variety of biological systems of interest and used to simulate experiments across a broad parameter space. It was implemented and run in a high-throughput mode by deploying it across multiple CPUs, with each simulation representing an individual well within a high-throughput microwell array experimental platform. The microfluidics and agent-based modeling framework we present in this paper provides an effective means by which to connect experimental studies in microbiology to model development. The work demonstrates progress in coupling experimental results to simulation while also highlighting potential sources of discrepancies between real-world experiments and idealized models. PMID:29467721
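The contact-dependent killing that the agent-based model captures (T6SS-mediated lysis under spatial confinement) can be sketched in a few lines: agents of two strains on a grid, where contact between an attacker and a sensitive neighbour lyses the latter. This is a toy illustration in Python, not the authors' NetLogo implementation; the grid representation, neighbourhood, and kill probability are arbitrary assumptions.

```python
import random

def step(grid, p_kill=0.5, rng=random):
    """One round of T6SS-style contact killing on a toroidal grid.

    grid[y][x] is 'A' (attacker), 'S' (sensitive), or None (empty).
    A sensitive cell with an attacker in its von Neumann neighbourhood
    lyses with probability p_kill.
    """
    n = len(grid)
    doomed = []
    for y in range(n):
        for x in range(n):
            if grid[y][x] != 'S':
                continue
            neighbours = [grid[(y + dy) % n][(x + dx) % n]
                          for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))]
            if 'A' in neighbours and rng.random() < p_kill:
                doomed.append((y, x))
    for y, x in doomed:
        grid[y][x] = None  # lysis leaves the site empty
    return grid
```

Iterating `step` from different initial seedings gives the kind of spatial organization patterns that can then be compared against microwell-array experiments.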
Modeling Criminal Activity in Urban Landscapes
NASA Astrophysics Data System (ADS)
Brantingham, Patricia; Glässer, Uwe; Jackson, Piper; Vajihollahi, Mona
Computational and mathematical methods arguably have an enormous potential for serving practical needs in crime analysis and prevention by offering novel tools for crime investigations and experimental platforms for evidence-based policy making. We present a comprehensive formal framework and tool support for mathematical and computational modeling of criminal behavior to facilitate systematic experimental studies of a wide range of criminal activities in urban environments. The focus is on spatial and temporal aspects of different forms of crime, including opportunistic and serial violent crimes. However, the proposed framework provides a basis to push beyond conventional empirical research and engage the use of computational thinking and social simulations in the analysis of terrorism and counter-terrorism.
Predicting Drug-Target Interactions With Multi-Information Fusion.
Peng, Lihong; Liao, Bo; Zhu, Wen; Li, Zejun; Li, Keqin
2017-03-01
Identifying potential associations between drugs and targets is a critical prerequisite for modern drug discovery and repurposing. However, predicting these associations is difficult because of the limitations of existing computational methods. Most models only consider chemical structures and protein sequences, and other models are oversimplified. Moreover, datasets used for analysis contain only true-positive interactions, and experimentally validated negative samples are unavailable. To overcome these limitations, we developed a semi-supervised learning framework, NormMulInf, based on collaborative filtering theory and using both labeled and unlabeled interaction information. The proposed method initially determines similarity measures, such as similarities among samples and local correlations among the labels of the samples, by integrating biological information. The similarity information is then integrated into a robust principal component analysis model, which is solved using augmented Lagrange multipliers. Experimental results on four classes of drug-target interaction networks suggest that the proposed approach can accurately classify and predict drug-target interactions. Some of the predicted interactions are reported in public databases. The proposed method can also predict possible targets for new drugs and can be used to determine whether atropine may interact with alpha1B- and beta1-adrenergic receptors. Furthermore, the developed technique identifies potential drugs for new targets and can be used to assess whether olanzapine and propiomazine may target 5HT2B. Finally, the proposed method can potentially address limitations on studies of multitarget drugs and multidrug targets.
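The abstract's core computational step is a robust principal component analysis (RPCA) model solved with augmented Lagrange multipliers. A minimal sketch of that underlying decomposition (low-rank plus sparse, via the standard inexact ALM iteration) is shown below; the initialisation and parameter choices are common textbook defaults, not those of NormMulInf, which additionally folds in the similarity information described above.

```python
import numpy as np

def _shrink(X, tau):
    # soft-thresholding: proximal operator of the l1 norm
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca_ialm(M, lam=None, tol=1e-7, max_iter=1000):
    """Decompose M into low-rank L plus sparse S by inexact ALM."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    norm_M = np.linalg.norm(M, 'fro')
    mu = 1.25 / np.linalg.norm(M, 2)  # common initialisation
    mu_bar = mu * 1e7
    S = np.zeros_like(M)
    Y = M / max(np.linalg.norm(M, 2), np.abs(M).max() / lam)
    for _ in range(max_iter):
        # singular-value thresholding for the low-rank part
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # soft-thresholding for the sparse part
        S = _shrink(M - L + Y / mu, lam / mu)
        resid = M - L - S
        Y = Y + mu * resid
        mu = min(mu * 1.5, mu_bar)
        if np.linalg.norm(resid, 'fro') / norm_M < tol:
            break
    return L, S
```

On a matrix built as a low-rank term plus a small fraction of gross corruptions, the iteration recovers both components to high accuracy.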
Gresham, David; Boer, Viktor M; Caudy, Amy; Ziv, Naomi; Brandt, Nathan J; Storey, John D; Botstein, David
2011-01-01
An essential property of all cells is the ability to exit from active cell division and persist in a quiescent state. For single-celled microbes this primarily occurs in response to nutrient deprivation. We studied the genetic requirements for survival of Saccharomyces cerevisiae when starved for either of two nutrients: phosphate or leucine. We measured the survival of nearly all nonessential haploid null yeast mutants in mixed populations using a quantitative sequencing method that estimates the abundance of each mutant on the basis of frequency of unique molecular barcodes. Starvation for phosphate results in a population half-life of 337 hr whereas starvation for leucine results in a half-life of 27.7 hr. To measure survival of individual mutants in each population we developed a statistical framework that accounts for the multiple sources of experimental variation. From the identities of the genes in which mutations strongly affect survival, we identify genetic evidence for several cellular processes affecting survival during nutrient starvation, including autophagy, chromatin remodeling, mRNA processing, and cytoskeleton function. In addition, we found evidence that mitochondrial and peroxisome function is required for survival. Our experimental and analytical methods represent an efficient and quantitative approach to characterizing genetic functions and networks with unprecedented resolution and identified genotype-by-environment interactions that have important implications for interpretation of studies of aging and quiescence in yeast.
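The population half-lives quoted (337 hr under phosphate starvation versus 27.7 hr under leucine starvation) follow from fitting exponential decay to abundance-over-time measurements. A minimal sketch, assuming simple first-order decay; the data passed in below would come from the barcode-counting assay:

```python
import numpy as np

def half_life_hours(times_hr, abundances):
    """Estimate population half-life from a log-linear fit of abundance vs time.

    Assumes first-order (exponential) decay, N(t) = N0 * exp(-k * t),
    so log N is linear in t with slope -k, and the half-life is ln(2) / k.
    """
    t = np.asarray(times_hr, dtype=float)
    log_n = np.log(np.asarray(abundances, dtype=float))
    slope, _intercept = np.polyfit(t, log_n, 1)
    return np.log(2) / -slope
```

Applying the same fit per mutant, relative to the population-wide decay, is what singles out mutations that strongly affect survival.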
Nonlinear machine learning in soft materials engineering and design
NASA Astrophysics Data System (ADS)
Ferguson, Andrew
The inherently many-body nature of molecular folding and colloidal self-assembly makes it challenging to identify the underlying collective mechanisms and pathways governing system behavior, and has hindered rational design of soft materials with desired structure and function. Fundamentally, there exists a predictive gulf between the architecture and chemistry of individual molecules or colloids and the collective many-body thermodynamics and kinetics. Integrating machine learning techniques with statistical thermodynamics provides a means to bridge this divide and identify emergent folding pathways and self-assembly mechanisms from computer simulations or experimental particle tracking data. We will survey a few of our applications of this framework that illustrate the value of nonlinear machine learning in understanding and engineering soft materials: the non-equilibrium self-assembly of Janus colloids into pinwheels, clusters, and archipelagos; engineering reconfigurable "digital colloids" as a novel high-density information storage substrate; probing hierarchically self-assembling conjugated asphaltenes in crude oil; and determining macromolecular folding funnels from measurements of single experimental observables. We close with an outlook on the future of machine learning in soft materials engineering, and share some personal perspectives on working at this disciplinary intersection. We acknowledge support for this work from a National Science Foundation CAREER Award (Grant No. DMR-1350008) and the Donors of the American Chemical Society Petroleum Research Fund (ACS PRF #54240-DNI6).
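One widely used nonlinear manifold-learning technique in this setting is the diffusion map, which extracts slow collective coordinates from simulation snapshots or particle-tracking data. The sketch below is a generic dense implementation, not the abstract's specific pipeline; the kernel bandwidth `eps` is a user-chosen assumption.

```python
import numpy as np

def diffusion_map(X, eps, n_components=2):
    """Embed samples X (n_samples x n_features) into diffusion coordinates."""
    # pairwise squared distances and Gaussian kernel
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / eps)
    # row-normalise into a Markov transition matrix
    P = K / K.sum(axis=1, keepdims=True)
    # leading eigenvectors; the first (constant, eigenvalue 1) is discarded
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)[1:n_components + 1]
    return vecs.real[:, order] * vals.real[order]
```

The returned coordinates order configurations along the dominant collective modes, which is how folding funnels and assembly pathways are resolved from high-dimensional data.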
A deep learning framework for modeling structural features of RNA-binding protein targets
Zhang, Sai; Zhou, Jingtian; Hu, Hailin; Gong, Haipeng; Chen, Ligong; Cheng, Chao; Zeng, Jianyang
2016-01-01
RNA-binding proteins (RBPs) play important roles in the post-transcriptional control of RNAs. Identifying RBP binding sites and characterizing RBP binding preferences are key steps toward understanding the basic mechanisms of post-transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of the RBP targets by integrating their available structural features in all three dimensions is still a challenging task. In this paper, we develop a general and flexible deep learning framework for modeling structural binding preferences and predicting binding sites of RBPs, which takes (predicted) RNA tertiary structural information into account for the first time. Our framework constructs a unified representation that characterizes the structural specificities of RBP targets in all three dimensions, which can be further used to predict novel candidate binding sites and discover potential binding motifs. Through testing on real CLIP-seq datasets, we have demonstrated that our deep learning framework can automatically extract effective hidden structural features from the encoded raw sequence and structural profiles, and predict accurate RBP binding sites. In addition, we have conducted the first study to show that integrating additional RNA tertiary structural features can improve model performance in predicting RBP binding sites, especially for the polypyrimidine tract-binding protein (PTB), which also provides new evidence to support the view that RBPs may have specific tertiary structural binding preferences. In particular, the tests on internal ribosome entry site (IRES) segments yield satisfactory results with experimental support from the literature and further demonstrate the necessity of incorporating RNA tertiary structural information into the prediction model.
The source code of our approach can be found at https://github.com/thucombio/deepnet-rbp. PMID:26467480
The Role of Omics in the Application of Adverse Outcome Pathways for Chemical Risk Assessment
Brockmeier, Erica K.; Hodges, Geoff; Hutchinson, Thomas H.; Butler, Emma; Hecker, Markus; Tollefsen, Knut Erik; Garcia-Reyero, Natalia; Kille, Peter; Becker, Dörthe; Chipman, Kevin; Colbourne, John; Collette, Timothy W.; Cossins, Andrew; Cronin, Mark; Graystock, Peter; Gutsell, Steve; Knapen, Dries; Katsiadaki, Ioanna; Lange, Anke; Marshall, Stuart; Owen, Stewart F.; Perkins, Edward J.; Plaistow, Stewart; Schroeder, Anthony; Taylor, Daisy; Viant, Mark; Ankley, Gerald; Falciani, Francesco
2017-01-01
In conjunction with the second International Environmental Omics Symposium (iEOS) conference, held at the University of Liverpool (United Kingdom) in September 2014, a workshop was held to bring together experts in toxicology and regulatory science from academia, government and industry. The purpose of the workshop was to review the specific roles that high-content omics datasets (e.g., transcriptomics, metabolomics, lipidomics, and proteomics) can hold within the adverse outcome pathway (AOP) framework for supporting ecological and human health risk assessments. In light of the growing number of examples of the application of omics data in the context of ecological risk assessment, we considered how omics datasets might continue to support the AOP framework. In particular, the role of omics in identifying potential AOP molecular initiating events and providing supportive evidence of key events at different levels of biological organization and across taxonomic groups was discussed. Areas with potential for short- and medium-term breakthroughs were also discussed, such as providing mechanistic evidence to support chemical read-across, providing weight of evidence information for mode of action assignment, understanding biological networks, and developing robust extrapolations of species sensitivity. Key challenges that need to be addressed were considered, including the need for a cohesive approach towards experimental design, the lack of a mutually agreed framework to quantitatively link genes and pathways to key events, and the need for better interpretation of chemically induced changes at the molecular level. This article was developed to provide an overview of the ecological risk assessment process and a perspective on how high-content molecular-level datasets can support the future of assessment procedures through the AOP framework. PMID:28525648
HTA and its legal issues: a framework for identifying legal issues in health technology assessment.
Widrig, Daniel; Tag, Brigitte
2014-12-01
Legal analysis can highlight important issues that are relevant when deciding whether a medical technology should be implemented or reimbursed. Literature and studies show that even though the law is an acknowledged part of health technology assessment (HTA), legal issues are rarely considered in practice. One reason for this may be the lack of knowledge about the diversity of legal issues that are relevant for HTA. Therefore, this contribution aims primarily to identify and then explain the relevant legal issues in HTA. This study offers a framework for identifying the legal issues in HTAs in different jurisdictions and provides a basis for further research. After extensive literature search, the authors review Swiss health law to identify legal issues that are relevant to HTA. The authors then categorize these legal issues using a framework with an inside and outside perspective. Finally, they explain a selection of these legal issues with several examples. This study reveals numerous legal issues that are relevant for HTA and underlines the necessity of incorporating legal analysis in HTAs. The suggested perspectival framework in this study provides a basis to structure the legal analysis. The identified legal issues are relevant in other countries and the perspectival framework is transferable to other jurisdictions. The article underlines the importance of in-depth discussion about the role of law in HTA. It provides a structured overview of the legal issues in HTA and suggests a development of more concrete instruments toward a standardized legal technology assessment.
Effect of Autonomy Support on Self-Determined Motivation in Elementary Physical Education
Chang, Yu-Kai; Chen, Senlin; Tu, Kun-Wei; Chi, Li-Kang
2016-01-01
Using a quasi-experimental design, this study examined the effect of autonomy support on self-determined motivation in elementary school physical education (PE) students. One hundred and twenty-six participants were assigned to either the autonomy support group (n = 61) or the control group (n = 65) for a six-week intervention period. Perceived teacher autonomy, perceived autonomy in PE, and self-determined motivation in PE were pre- and post-tested using validated questionnaires. Significant increases in perceived teacher autonomy and perceived autonomy in PE were observed in the autonomy support group, but not in the control group. Intrinsic motivation was higher in the autonomy support group than in the control group. From an experimental perspective, these findings suggest that the autonomy support was successfully manipulated in the PE classes, which in turn increased the students’ perceived autonomy and intrinsic motivation. Key points The SDT is a relevant theoretical framework for elementary school physical education. Using a quasi-experimental research design, this study is one of the earliest studies showing that elementary school PE teachers can manipulate the instructional context using the SDT to increase students’ perceived autonomy and intrinsic motivation. Increasing students’ perceived autonomy may not lead to significant changes in other SDT constructs (i.e., amotivation, external regulation, introjected regulation, and identified regulation). PMID:27803624
Prediction of enhancer-promoter interactions via natural language processing.
Zeng, Wanwen; Wu, Mengmeng; Jiang, Rui
2018-05-09
Precise identification of three-dimensional genome organization, especially enhancer-promoter interactions (EPIs), is important to deciphering gene regulation, cell differentiation and disease mechanisms. Currently, it is a challenging task to distinguish true interactions from other nearby non-interacting ones since the power of traditional experimental methods is limited due to low resolution or low throughput. We propose a novel computational framework, EP2vec, to assay three-dimensional genomic interactions. We first extract sequence embedding features, defined as fixed-length vector representations learned from variable-length sequences using an unsupervised deep learning method from natural language processing. Then, we train a classifier to predict EPIs using the learned representations in a supervised way. Experimental results demonstrate that EP2vec obtains F1 scores ranging from 0.841 to 0.933 on different datasets, which outperforms existing methods. We prove the robustness of sequence embedding features by carrying out sensitivity analysis. In addition, we identify motifs that represent cell line-specific information through analysis of the learned sequence embedding features by adopting an attention mechanism. Finally, we show that even better performance, with F1 scores of 0.889 to 0.940, can be achieved by combining sequence embedding features and experimental features. EP2vec sheds light on feature extraction for DNA sequences of arbitrary lengths and provides a powerful approach for EPI identification.
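EP2vec's sequence-embedding step treats a DNA sequence as a "sentence" of overlapping k-mer "words", which an unsupervised embedding model (such as paragraph vectors) then maps to a fixed-length vector. A minimal sketch of the tokenisation step; the choices of k and stride below are illustrative, not the paper's settings:

```python
def kmer_sentence(seq, k=6, stride=1):
    """Split a DNA sequence into overlapping k-mer 'words' for NLP-style embedding."""
    return [seq[i:i + k] for i in range(0, len(seq) - k + 1, stride)]
```

For example, `kmer_sentence("ACGTAC", k=3)` yields `['ACG', 'CGT', 'GTA', 'TAC']`; lists like this play the role of sentences in the downstream embedding model.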
Particle dynamics and deposition in true-scale pulmonary acinar models
Fishler, Rami; Hofemeier, Philipp; Etzion, Yael; Dubowski, Yael; Sznitman, Josué
2015-01-01
Particle transport phenomena in the deep alveolated airways of the lungs (i.e. pulmonary acinus) govern deposition outcomes following inhalation of hazardous or pharmaceutical aerosols. Yet, there is still a dearth of experimental tools for resolving acinar particle dynamics and validating numerical simulations. Here, we present a true-scale experimental model of acinar structures consisting of bifurcating alveolated ducts that capture breathing-like wall motion and ensuing respiratory acinar flows. We study experimentally captured trajectories of inhaled polydispersed smoke particles (0.2 to 1 μm in diameter), demonstrating how intrinsic particle motion, i.e. gravity and diffusion, is crucial in determining dispersion and deposition of aerosols through a streamline crossing mechanism, a phenomenon paramount during flow reversal and locally within alveolar cavities. A simple conceptual framework is constructed for predicting the fate of inhaled particles near an alveolus by identifying capture and escape zones and considering how streamline crossing may shift particles between them. In addition, we examine the effect of particle size on detailed deposition patterns of monodispersed microspheres between 0.1–2 μm. Our experiments underline local modifications in the deposition patterns due to gravity for particles ≥0.5 μm compared to smaller particles, and show good agreement with corresponding numerical simulations. PMID:26358580
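The competition between gravity and diffusion that the study highlights for sub-micron particles can be gauged from two textbook quantities: the Stokes settling velocity and the Stokes-Einstein diffusion coefficient. A sketch under simplifying assumptions (slip correction neglected, unit-density spheres in air near body temperature; the parameter values are illustrative, not the paper's):

```python
import math

MU_AIR = 1.9e-5     # dynamic viscosity of air near 37 C, Pa*s (approximate)
K_B = 1.380649e-23  # Boltzmann constant, J/K

def settling_velocity(d, rho_p=1000.0, g=9.81, mu=MU_AIR):
    """Stokes settling velocity (m/s) of a sphere of diameter d (m)."""
    return rho_p * g * d ** 2 / (18.0 * mu)

def diffusion_coefficient(d, T=310.0, mu=MU_AIR):
    """Stokes-Einstein diffusion coefficient (m^2/s); slip correction neglected."""
    return K_B * T / (3.0 * math.pi * mu * d)
```

Because settling scales as d² while diffusion scales as 1/d, sedimentation dominates for the larger (≥0.5 μm) particles and diffusion for the smaller ones, consistent with the size-dependent deposition patterns reported.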
Piwowar, Valentina; Thiel, Felicitas
2014-10-01
Response shift (RS) can threaten the internal validity of pre-post designs. As RS may indicate a redefinition of the target construct, its occurrence in training evaluation is rather likely. The most common approach to dealing with RS is to implement a retrospective pretest (then-test) instead of the traditional pre-test. In health psychology, an adapted measurement invariance approach (MIad) was developed as an alternative technique to study RS. Results produced by identifying RS with the two approaches were rarely studied simultaneously or within an experimental framework. Our aim was to study RS in two different treatment conditions and compare the results produced by both techniques in identifying various types of RS; we further studied validity aspects of the then-test. We evaluated RS by applying the then-test procedure (TP) and the measurement invariance approach (MIad) within an experimental design: participants attended either a short-term or a long-term classroom management training program. Participants were 146 student teachers in the first year of their master's study. Pre- (before training), post-, and then-test self-ratings (after training) of classroom management knowledge were administered. Results indicated that the two approaches do not yield the same results. The MIad identified more, and also group-specific, RS, whereas the TP found less RS and only little evidence for group-specific RS. Further research is needed to study the usability and validity of the respective approaches. In particular, the usability of the then-test seems to be challenged. © The Author(s) 2014.
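The then-test logic can be made concrete: a conventional change score (post minus pre) confounds true change with response shift, whereas post minus then compares two ratings made on the same post-training frame of reference, and then minus pre estimates the response shift itself. A minimal sketch of this decomposition (the scores in the usage example are illustrative, not the study's data):

```python
def then_test(pre, post, then):
    """Decompose observed change into response shift and adjusted change.

    pre  : rating before training (original frame of reference)
    post : rating after training
    then : retrospective re-rating of the initial state, made after
           training on the post-training frame of reference
    """
    return {
        'observed_change': post - pre,   # conventional score, possibly biased
        'response_shift': then - pre,    # recalibration of the rating scale
        'adjusted_change': post - then,  # change on a common frame of reference
    }
```

For example, `then_test(4.0, 5.0, 3.5)` gives an observed change of 1.0 but an adjusted change of 1.5, the gap being a response shift of -0.5.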
A Decision Support Framework For Science-Based, Multi-Stakeholder Deliberation: A Coral Reef Example
We present a decision support framework for science-based assessment and multi-stakeholder deliberation. The framework consists of two parts: a DPSIR (Drivers-Pressures-States-Impacts-Responses) analysis to identify the important causal relationships among anthropogenic environ...
Supervisory Styles: A Contingency Framework
ERIC Educational Resources Information Center
Boehe, Dirk Michael
2016-01-01
While the contingent nature of doctoral supervision has been acknowledged, the literature on supervisory styles has yet to deliver a theory-based contingency framework. A contingency framework can assist supervisors and research students in identifying appropriate supervisory styles under varying circumstances. The conceptual study reported here…
M(II)-dipyridylamide-based coordination frameworks (M=Mn, Co, Ni): Structural transformation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tzeng, Biing-Chiau; Selvam, TamilSelvi; Tsai, Miao-Hsin
2016-11-15
A series of 1-D double-zigzag (([M(papx)₂(H₂O)₂](ClO₄)₂)ₙ; M=Mn, x=s (1), x=o (3); M=Co, x=s (4), x=o (5); M=Ni, x=s (6), x=o (7)) and 2-D polyrotaxane ([Mn(paps)₂(ClO₄)₂]ₙ (2)) frameworks were synthesized by reactions of M(ClO₄)₂ (M=Mn, Co, and Ni) with papx (paps, N,N′-bis(pyridylcarbonyl)-4,4′-diaminodiphenylthioether; papo, N,N′-bis(pyridylcarbonyl)-4,4′-diaminodiphenyl ether); the products were isolated and structurally characterized by X-ray diffraction. Powder X-ray diffraction (PXRD) experiments showed that heating transforms the double-zigzag frameworks into the respective polyrotaxane ones. Moreover, grinding solid samples of the respective polyrotaxanes in the presence of moisture converts them fully back to the original double-zigzag frameworks. This study extends previous work on Zn(II), Cd(II), and Cu(II) frameworks to Mn(II), Co(II), and Ni(II); notably, the structural transformation is demonstrated experimentally by both powder and single-crystal X-ray diffraction. Graphical abstract: 1-D double-zigzag and 2-D polyrotaxane M(II)-papx (x=s, o; M=Mn, Co, Ni) frameworks can be interconverted by heating and by grinding in the presence of moisture, as proven experimentally by powder and single-crystal X-ray diffraction studies.
Predictive framework for estimating exposure of birds to pharmaceuticals.
Bean, Thomas G; Arnold, Kathryn E; Lane, Julie M; Bergström, Ed; Thomas-Oates, Jane; Rattner, Barnett A; Boxall, Alistair B A
2017-09-01
We present and evaluate a framework for estimating concentrations of pharmaceuticals over time in wildlife feeding at wastewater treatment plants (WWTPs). The framework is composed of a series of predictive steps involving the estimation of pharmaceutical concentration in wastewater, accumulation into wildlife food items, and uptake by wildlife with subsequent distribution into, and elimination from, tissues. Because many pharmacokinetic parameters for wildlife are unavailable for the majority of drugs in use, a read-across approach was employed using either rodent or human data on absorption, distribution, metabolism, and excretion. Comparison of the different steps in the framework against experimental data for the scenario where birds are feeding on a WWTP contaminated with fluoxetine showed that estimated concentrations in wastewater treatment works were lower than measured concentrations; concentrations in food could be reasonably estimated if experimental bioaccumulation data are available; and read-across from rodent data worked better than human to bird read-across. The framework provides adequate predictions of plasma concentrations and of elimination behavior in birds but yields poor predictions of distribution in tissues. The approach holds promise, but it is important that we improve our understanding of the physiological similarities and differences between wild birds and domesticated laboratory mammals used in pharmaceutical efficacy/safety trials, so that the wealth of data available can be applied more effectively in ecological risk assessments. Environ Toxicol Chem 2017;36:2335-2344. © 2017 SETAC.
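The uptake-distribution-elimination step of such a framework is often caricatured as a one-compartment model with first-order absorption and elimination, superposing repeated doses (here, repeated contaminated meals). The sketch below uses the standard Bateman equation; the rate constants, dose, and dosing interval are purely illustrative placeholders, not values from the study.

```python
import math

def plasma_concentration(t_hr, dose_mg, interval_hr, ka, ke, vd_l):
    """Plasma concentration (mg/L) after repeated oral doses.

    One-compartment model with first-order absorption (ka, 1/hr) and
    elimination (ke, 1/hr), distribution volume vd_l (L), superposing
    all doses administered before time t_hr. Assumes ka != ke.
    """
    f = dose_mg * ka / (vd_l * (ka - ke))  # Bateman prefactor
    c = 0.0
    n_doses = int(t_hr // interval_hr) + 1
    for i in range(n_doses):
        tau = t_hr - i * interval_hr  # time since dose i
        c += f * (math.exp(-ke * tau) - math.exp(-ka * tau))
    return c
```

Run over successive days, the model reproduces the accumulation toward steady state that plasma measurements in exposed birds would be compared against.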
NASA Astrophysics Data System (ADS)
Bergner, F.; Pareige, C.; Hernández-Mayoral, M.; Malerba, L.; Heintze, C.
2014-05-01
An attempt is made to quantify the contributions of different types of defect-solute clusters to the total irradiation-induced yield stress increase in neutron-irradiated (300 °C, 0.6 dpa), industrial-purity Fe-Cr model alloys (target Cr contents of 2.5, 5, 9 and 12 at.% Cr). Former work based on the application of transmission electron microscopy, atom probe tomography, and small-angle neutron scattering revealed the formation of dislocation loops, NiSiPCr-enriched clusters and α′-phase particles, which act as obstacles to dislocation glide. The values of the dimensionless obstacle strength are estimated in the framework of a three-feature dispersed-barrier hardening model. Special attention is paid to the effect of measuring errors, experimental details and model details on the estimates. The three families of obstacles and the hardening model are well capable of reproducing the observed yield stress increase as a function of Cr content, suggesting that the nanostructural features identified experimentally are the main, if not the only, causes of irradiation hardening in these model alloys.
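The dispersed-barrier hardening estimate underlying such models is commonly written as Δσ_y = α·M·μ·b·√(N·d) for each obstacle family. The sketch below is illustrative only: the default values (Taylor factor M ≈ 3.06, shear modulus μ ≈ 83 GPa, Burgers vector b ≈ 0.248 nm for bcc Fe) and the root-sum-square superposition rule are assumptions, not necessarily the choices made in the paper.

```python
import math

def dbh_increment(alpha, number_density, diameter,
                  taylor=3.06, mu=83e9, burgers=0.248e-9):
    # Dispersed-barrier hardening for one obstacle family [Pa]:
    # delta_sigma_y = alpha * M * mu * b * sqrt(N * d)
    # alpha: dimensionless obstacle strength, N: number density (1/m^3),
    # d: mean obstacle diameter (m).
    return alpha * taylor * mu * burgers * math.sqrt(number_density * diameter)

def superpose_rss(increments):
    # Root-sum-square superposition of several obstacle families
    # (one common mixing rule; linear superposition is another choice).
    return math.sqrt(sum(s * s for s in increments))
```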
Martin, Richard L; Simon, Cory M; Smit, Berend; Haranczyk, Maciej
2014-04-02
Porous polymer networks (PPNs) are a class of advanced porous materials that combine the advantages of cheap and stable polymers with the high surface areas and tunable chemistry of metal-organic frameworks. They are of particular interest for gas separation or storage applications, for instance, as methane adsorbents for a vehicular natural gas tank or other portable applications. PPNs are self-assembled from distinct building units; here, we utilize commercially available chemical fragments and two experimentally known synthetic routes to design in silico a large database of synthetically realistic PPN materials. All structures from our database of 18,000 materials have been relaxed with semiempirical electronic structure methods and characterized with Grand-canonical Monte Carlo simulations for methane uptake and deliverable (working) capacity. A number of novel structure-property relationships that govern methane storage performance were identified. The relationships are translated into experimental guidelines to realize the ideal PPN structure. We found that cooperative methane-methane attractions were present in all of the best-performing materials, highlighting the importance of guest interaction in the design of optimal materials for methane storage.
Bayesian network prior: network analysis of biological data using external knowledge
Isci, Senol; Dogan, Haluk; Ozturk, Cengizhan; Otu, Hasan H.
2014-01-01
Motivation: Reverse engineering GI networks from experimental data is a challenging task due to the complex nature of the networks and the noise inherent in the data. One way to overcome these hurdles would be incorporating the vast amounts of external biological knowledge when building interaction networks. We propose a framework where GI networks are learned from experimental data using Bayesian networks (BNs) and the incorporation of external knowledge is also done via a BN that we call Bayesian Network Prior (BNP). BNP depicts the relation between various evidence types that contribute to the event ‘gene interaction’ and is used to calculate the probability of a candidate graph (G) in the structure learning process. Results: Our simulation results on synthetic, simulated and real biological data show that the proposed approach can identify the underlying interaction network with high accuracy even when the prior information is distorted and outperforms existing methods. Availability: Accompanying BNP software package is freely available for academic use at http://bioe.bilgi.edu.tr/BNP. Contact: hasan.otu@bilgi.edu.tr Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:24215027
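A BNP-style prior scores a candidate graph G from external evidence. The toy sketch below combines per-edge evidence scores with a logistic model and treats edges independently; this is a deliberate simplification of the paper's Bayesian-network treatment of evidence types, and the evidence names and weights are hypothetical.

```python
import math

def edge_prior(evidence, weights, bias=0.0):
    # Logistic combination of external evidence scores for one candidate edge
    # (e.g., co-expression, literature co-citation); weights are hypothetical.
    z = bias + sum(weights[name] * score for name, score in evidence.items())
    return 1.0 / (1.0 + math.exp(-z))

def graph_log_prior(present, absent, prior):
    # Log-prior of a candidate graph under edge independence:
    # edges in G contribute log p(e); candidate edges absent from G
    # contribute log(1 - p(e)).
    lp = sum(math.log(prior[e]) for e in present)
    lp += sum(math.log(1.0 - prior[e]) for e in absent)
    return lp
```

This log-prior term would then be added to the data likelihood when scoring candidate structures during BN structure learning.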
Wang, Zhuo; Danziger, Samuel A; Heavner, Benjamin D; Ma, Shuyi; Smith, Jennifer J; Li, Song; Herricks, Thurston; Simeonidis, Evangelos; Baliga, Nitin S; Aitchison, John D; Price, Nathan D
2017-05-01
Gene regulatory and metabolic network models have been used successfully in many organisms, but inherent differences between them make networks difficult to integrate. Probabilistic Regulation Of Metabolism (PROM) provides a partial solution, but it does not incorporate network inference and underperforms in eukaryotes. We present an Integrated Deduced REgulation And Metabolism (IDREAM) method that combines statistically inferred Environment and Gene Regulatory Influence Network (EGRIN) models with the PROM framework to create enhanced metabolic-regulatory network models. We used IDREAM to predict phenotypes and genetic interactions between transcription factors and genes encoding metabolic activities in the eukaryote Saccharomyces cerevisiae. IDREAM models contain many fewer interactions than PROM and yet produce significantly more accurate growth predictions. IDREAM consistently outperformed PROM using any of three popular yeast metabolic models and across three experimental growth conditions. Importantly, IDREAM's enhanced accuracy makes it possible to identify subtle synthetic growth defects. With experimental validation, these novel genetic interactions involving the pyruvate dehydrogenase complex suggested a new role for fatty acid-responsive factor Oaf1 in regulating acetyl-CoA production in glucose grown cells.
A model of recovering the parameters of fast nonlocal heat transport in magnetic fusion plasmas
NASA Astrophysics Data System (ADS)
Kukushkin, A. B.; Kulichenko, A. A.; Sdvizhenskii, P. A.; Sokolov, A. V.; Voloshinov, V. V.
2017-12-01
A model is elaborated for interpreting the initial stage of the fast nonlocal transport events, which exhibit immediate response, in the diffusion time scale, of the spatial profile of electron temperature to its local perturbation, while the net heat flux is directed opposite to ordinary diffusion (i.e. along the temperature gradient). We solve the inverse problem of recovering the kernel of the integral equation, which describes nonlocal (superdiffusive) transport of energy due to emission and absorption of electromagnetic (EM) waves with long free path and strong reflection from the vacuum vessel’s wall. To allow for experimental errors, we use a method based on regularized approximation of the available experimental data, treating the recovery as an ill-posed problem and using parametric models. The model is applied to interpreting the data from the stellarator LHD and the tokamak TFTR. The EM wave transport is considered here in the single-group approximation; however, the limitations of the physics model enable us to identify the spectral range of the EM waves which might be responsible for the observed phenomenon.
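After discretization, kernel recovery of this sort reduces to an ill-posed linear system, and a standard stabilization device is Tikhonov-damped least squares. The sketch below is a generic illustration of that device, not the authors' parametric-model method.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    # Minimize ||A x - b||^2 + lam * ||x||^2 via the normal equations
    # (A^T A + lam I) x = A^T b; lam > 0 tames noise amplification
    # in ill-conditioned (ill-posed) systems.
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
```

As lam → 0 the solution approaches the ordinary least-squares answer; increasing lam trades fidelity to the noisy data for a smaller-norm, more stable solution.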
Kirk, Maggie; Tonkin, Emma; Skirton, Heather
2014-02-01
To report a review of a genetics education framework using a consensus approach to agree on a contemporary and comprehensive revised framework. Advances in genomic health care have been significant since the first genetics education framework for nurses was developed in 2003. These, coupled with developments in policy and international efforts to promote nursing competence in genetics, indicated that review was timely. A structured, iterative, primarily qualitative approach, based on a nominal group technique. A meeting convened in 2010 involved stakeholders in UK nursing education, practice and management, including patient representatives (n = 30). A consensus approach was used to solicit participants' views on the individual/family needs identified from real-life stories of people affected by genetic conditions and the nurses' knowledge, skills and attitudes needed to meet those needs. Five groups considered the stories in iterative rounds, reviewing comments from previous groups. Omissions and deficiencies were identified by mapping resulting themes to the original framework. Anonymous voting captured views. Educators at a second meeting developed learning outcomes for the final framework. Deficiencies in relation to Advocacy, Information management and Ongoing care were identified. All competencies of the original framework were revised, adding an eighth competency to make explicit the need for ongoing care of the individual/family. Modifications to the framework reflect individual/family needs and are relevant to the nursing role. The approach promoted engagement in a complex issue and provides a framework to guide nurse education in genetics/genomics; however, nursing leadership is crucial to successful implementation. © 2013 The Authors. Journal of Advanced Nursing published by John Wiley & Sons Ltd.
Critical Watersheds: Climate Change, Tipping Points, and Energy-Water Impacts
NASA Astrophysics Data System (ADS)
Middleton, R. S.; Brown, M.; Coon, E.; Linn, R.; McDowell, N. G.; Painter, S. L.; Xu, C.
2014-12-01
Climate change, extreme climate events, and climate-induced disturbances will have a substantial and detrimental impact on terrestrial ecosystems. How ecosystems respond to these impacts will, in turn, have a significant effect on the quantity, quality, and timing of water supply for energy security, agriculture, industry, and municipal use. As a community, we lack sufficient quantitative and mechanistic understanding of the complex interplay between climate extremes (e.g., drought, floods), ecosystem dynamics (e.g., vegetation succession), and disruptive events (e.g., wildfire) to assess ecosystem vulnerabilities and to design mitigation strategies that minimize or prevent catastrophic ecosystem impacts. Through a combination of experimental and observational science and modeling, we are developing a unique multi-physics ecohydrologic framework for understanding and quantifying feedbacks between novel climate and extremes, surface and subsurface hydrology, ecosystem dynamics, and disruptive events in critical watersheds. The simulation capability integrates and advances coupled surface-subsurface hydrology from the Advanced Terrestrial Simulator (ATS), dynamic vegetation succession from the Ecosystem Demography (ED) model, and QUICFIRE, a novel wildfire behavior model developed from the FIRETEC platform. These advances are expected to make extensive contributions to the literature and to earth system modeling. The framework is designed to predict, quantify, and mitigate the impacts of climate change on vulnerable watersheds, with a focus on the US Mountain West and the energy-water nexus. This emerging capability is used to identify tipping points in watershed ecosystems, quantify impacts on downstream users, and formally evaluate mitigation efforts including forest treatments (e.g., thinning, prescribed burns) and watershed treatments (e.g., slope stabilization).
The framework is being trained, validated, and demonstrated using field observations and remote data collections in the Valles Caldera National Preserve, including pre- and post-wildfire and infestation observations. Ultimately, the framework will be applied to the upper Colorado River basin. Here, we present an overview of the framework development strategy and latest field and modeling results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, M. Hope; Truex, Mike; Freshley, Mark
Complex sites are defined as those with difficult subsurface access, deep and/or thick zones of contamination, large areal extent, subsurface heterogeneities that limit the effectiveness of remediation, or where long-term remedies are needed to address contamination (e.g., because of long-term sources or large extent). The Test Area North at the Idaho National Laboratory, developed for nuclear fuel operations and heavy metal manufacturing, is used as a case study. Liquid wastes and sludge from experimental facilities were disposed in an injection well, which contaminated the subsurface aquifer located deep within fractured basalt. The wastes included organic, inorganic, and low-level radioactive constituents, with the focus of this case study on trichloroethylene. The site is used as an example of a systems-based framework that provides a structured approach to regulatory processes established for remediation under existing regulations. The framework is intended to facilitate remedy decisions and implementation at complex sites where restoration may be uncertain, require long timeframes, or involve use of adaptive management approaches. The framework facilitates site, regulator, and stakeholder interactions during the remedial planning and implementation process by using a conceptual model description as a technical foundation for decisions, identifying endpoints, which are interim remediation targets or intermediate decision points on the path to an ultimate end, and maintaining protectiveness during the remediation process. At the Test Area North, using a structured approach to implementing concepts in the endpoint framework, a three-component remedy is largely functioning as intended and is projected to meet remedial action objectives by 2095 as required. The remedy approach is being adjusted as new data become available.
The framework provides a structured process for evaluating and adjusting the remediation approach, allowing site owners, regulators, and stakeholders to manage contamination at complex sites where adaptive remedies are needed.
A novel framework for virtual prototyping of rehabilitation exoskeletons.
Agarwal, Priyanshu; Kuo, Pei-Hsin; Neptune, Richard R; Deshpande, Ashish D
2013-06-01
Human-worn rehabilitation exoskeletons have the potential to make therapeutic exercises increasingly accessible to disabled individuals while reducing the cost and labor involved in rehabilitation therapy. In this work, we propose a novel human-model-in-the-loop framework for virtual prototyping (design, control and experimentation) of rehabilitation exoskeletons by merging computational musculoskeletal analysis with simulation-based design techniques. The framework allows the design and control algorithm of an exoskeleton to be iteratively optimized in simulation. We introduce biomechanical, morphological, and controller measures to quantify the performance of the device for optimization study. Furthermore, the framework allows one to carry out virtual experiments for testing specific "what-if" scenarios to quantify device performance and recovery progress. To illustrate the application of the framework, we present a case study wherein the design and analysis of an index-finger exoskeleton is carried out using the proposed framework.
Hierarchical control and performance evaluation of multi-vehicle autonomous systems
NASA Astrophysics Data System (ADS)
Balakirsky, Stephen; Scrapper, Chris; Messina, Elena
2005-05-01
This paper will describe how the Mobility Open Architecture Tools and Simulation (MOAST) framework can facilitate performance evaluations of RCS compliant multi-vehicle autonomous systems. This framework provides an environment that allows for simulated and real architectural components to function seamlessly together. By providing repeatable environmental conditions, this framework allows for the development of individual components as well as component performance metrics. MOAST is composed of high-fidelity and low-fidelity simulation systems, a detailed model of real-world terrain, actual hardware components, a central knowledge repository, and architectural glue to tie all of the components together. This paper will describe the framework's components in detail and provide an example that illustrates how the framework can be utilized to develop and evaluate a single architectural component through the use of repeatable trials and experimentation that includes both virtual and real components functioning together.
Sinden, Kathryn; MacDermid, Joy C
2014-03-01
Employers are tasked with developing injury management and return-to-work (RTW) programs in response to occupational health and safety policies. Physical demands analyses (PDAs) are the cornerstone of injury management and RTW development. Synthesizing and contextualizing policy knowledge for use in occupational program development, including PDAs, is challenging due to multiple stakeholder involvement. Few studies have used a knowledge translation theoretical framework to facilitate policy-based interventions in occupational contexts. The primary aim of this case study was to identify how constructs of the knowledge-to-action (KTA) framework were reflected in employer stakeholder-researcher collaborations during development of a firefighter PDA. Four stakeholder meetings were conducted with employee participants who had experience using PDAs in their occupational role. Directed content analysis informed analyses of meeting minutes, stakeholder views and personal reflections recorded throughout the case. Existing knowledge sources including local data, stakeholder experiences, policies and priorities were synthesized and tailored to develop a PDA in response to the barriers and facilitators identified by the firefighters. The flexibility of the KTA framework and synthesis of multiple knowledge sources were identified strengths. The KTA Action cycle was useful in directing the overall process but insufficient for directing the specific aspects of PDA development. Integration of specific PDA guidelines into the process provided explicit direction on best practices in tailoring the PDA and knowledge synthesis. Although the themes of the KTA framework were confirmed in our analysis, order modification of the KTA components was required. Despite a complex context with divergent perspectives successful implementation of a draft PDA was achieved. 
The KTA framework facilitated knowledge synthesis and PDA development but specific standards and modifications to the KTA framework were needed to enhance process structure. Flexibility for modification and integration of PDA practice guidelines were identified as assets of the KTA framework during its application.
Public health program capacity for sustainability: a new framework.
Schell, Sarah F; Luke, Douglas A; Schooley, Michael W; Elliott, Michael B; Herbers, Stephanie H; Mueller, Nancy B; Bunger, Alicia C
2013-02-01
Public health programs can only deliver benefits if they are able to sustain activities over time. There is a broad literature on program sustainability in public health, but it is fragmented and there is a lack of consensus on core constructs. The purpose of this paper is to present a new conceptual framework for program sustainability in public health. This developmental study uses a comprehensive literature review, input from an expert panel, and the results of concept-mapping to identify the core domains of a conceptual framework for public health program capacity for sustainability. The concept-mapping process included three types of participants (scientists, funders, and practitioners) from several public health areas (e.g., tobacco control, heart disease and stroke, physical activity and nutrition, and injury prevention). The literature review identified 85 relevant studies focusing on program sustainability in public health. Most of the papers described empirical studies of prevention-oriented programs aimed at the community level. The concept-mapping process identified nine core domains that affect a program's capacity for sustainability: Political Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, Public Health Impacts, and Strategic Planning. Concept-mapping participants further identified 93 items across these domains that have strong face validity-89% of the individual items composing the framework had specific support in the sustainability literature. The sustainability framework presented here suggests that a number of selected factors may be related to a program's ability to sustain its activities and benefits over time. These factors have been discussed in the literature, but this framework synthesizes and combines the factors and suggests how they may be interrelated with one another. 
The framework presents domains for public health decision makers to consider when developing and implementing prevention and intervention programs. The sustainability framework will be useful for public health decision makers, program managers, program evaluators, and dissemination and implementation researchers.
Clark, Benjamin J; Harvey, Ryan E
2016-09-01
The anterior and lateral thalamus has long been considered to play an important role in spatial and mnemonic cognitive functions; however, it remains unclear whether each region makes a unique contribution to spatial information processing. We begin by reviewing evidence from anatomical studies and electrophysiological recordings which suggest that at least one of the functions of the anterior thalamus is to guide spatial orientation in relation to a global or distal spatial framework, while the lateral thalamus serves to guide behavior in relation to a local or proximal framework. We conclude by reviewing experimental work using targeted manipulations (lesion or neuronal silencing) of thalamic nuclei during spatial behavior and single-unit recordings from neuronal representations of space. Our summary of this literature suggests that although the evidence strongly supports a working model of spatial information processing involving the anterior thalamus, research regarding the role of the lateral thalamus is limited and requires further attention. We therefore identify a number of major gaps in this research and suggest avenues of future study that could potentially solidify our understanding of the relative roles of anterior and lateral thalamic regions in spatial representation and memory. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
James, C. M.; Gildfind, D. E.; Lewis, S. W.; Morgan, R. G.; Zander, F.
2018-03-01
Expansion tubes are an important type of test facility for the study of planetary entry flow-fields, being the only type of impulse facility capable of simulating the aerothermodynamics of superorbital planetary entry conditions from 10 to 20 km/s. However, the complex flow processes involved in expansion tube operation make it difficult to fully characterise flow conditions, with two-dimensional full facility computational fluid dynamics simulations often requiring tens or hundreds of thousands of computational hours to complete. In an attempt to simplify this problem and provide a rapid flow condition prediction tool, this paper presents a validated and comprehensive analytical framework for the simulation of an expansion tube facility. It identifies central flow processes and models them from state to state through the facility using established compressible and isentropic flow relations, and equilibrium and frozen chemistry. How the model simulates each section of an expansion tube is discussed, as well as how the model can be used to simulate situations where flow conditions diverge from ideal theory. The model is then validated against experimental data from the X2 expansion tube at the University of Queensland.
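The "established compressible and isentropic flow relations" that such a state-to-state model chains together include, for example, the isentropic stagnation ratios and the normal-shock pressure jump for a calorically perfect gas. A minimal sketch follows; the γ = 1.4 default is an assumption appropriate for air, not necessarily the facility's test gases.

```python
def stagnation_ratios(mach, gamma=1.4):
    # Isentropic relations for a calorically perfect gas:
    # T0/T = 1 + (gamma - 1)/2 * M^2,  p0/p = (T0/T)**(gamma/(gamma-1)).
    t_ratio = 1.0 + 0.5 * (gamma - 1.0) * mach ** 2
    return t_ratio, t_ratio ** (gamma / (gamma - 1.0))

def normal_shock_pressure_ratio(m1, gamma=1.4):
    # Rankine-Hugoniot static pressure ratio across a normal shock:
    # p2/p1 = 1 + 2*gamma/(gamma + 1) * (M1^2 - 1).
    return 1.0 + 2.0 * gamma / (gamma + 1.0) * (m1 ** 2 - 1.0)
```

Chaining relations like these from state to state through the facility is far cheaper than full CFD, which is the motivation for the analytical framework.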
Integrated Japanese Dependency Analysis Using a Dialog Context
NASA Astrophysics Data System (ADS)
Ikegaya, Yuki; Noguchi, Yasuhiro; Kogure, Satoru; Itoh, Toshihiko; Konishi, Tatsuhiro; Kondo, Makoto; Asoh, Hideki; Takagi, Akira; Itoh, Yukihiro
This paper describes how to perform syntactic parsing and semantic analysis in a dialog system. The paper especially deals with how to disambiguate potentially ambiguous sentences using the contextual information. Although syntactic parsing and semantic analysis are often studied independently of each other, correct parsing of a sentence often requires the semantic information on the input and/or the contextual information prior to the input. Accordingly, we merge syntactic parsing with semantic analysis, which enables syntactic parsing taking advantage of the semantic content of an input and its context. One of the biggest problems of semantic analysis is how to interpret dependency structures. We employ a framework for semantic representations that circumvents the problem. Within the framework, the meaning of any predicate is converted into a semantic representation which only permits a single type of predicate: an identifying predicate "aru". The semantic representations are expressed as sets of "attribute-value" pairs, and those semantic representations are stored in the context information. Our system disambiguates syntactic/semantic ambiguities of inputs referring to the attribute-value pairs in the context information. We have experimentally confirmed the effectiveness of our approach; specifically, the experiment confirmed high accuracy of parsing and correctness of generated semantic representations.
Douglas, Heather E; Raban, Magdalena Z; Walter, Scott R; Westbrook, Johanna I
2017-03-01
Multi-tasking is an important skill for clinical work which has received limited research attention. Its impacts on clinical work are poorly understood. In contrast, there is substantial multi-tasking research in cognitive psychology, driver distraction, and human-computer interaction. This review synthesises evidence of the extent and impacts of multi-tasking on efficiency and task performance from health and non-healthcare literature, to compare and contrast approaches, identify implications for clinical work, and to develop an evidence-informed framework for guiding the measurement of multi-tasking in future healthcare studies. The results showed healthcare studies using direct observation have focused on descriptive studies to quantify concurrent multi-tasking and its frequency in different contexts, with limited study of impact. In comparison, non-healthcare studies have applied predominantly experimental and simulation designs, focusing on interleaved and concurrent multi-tasking, and testing theories of the mechanisms by which multi-tasking impacts task efficiency and performance. We propose a framework to guide the measurement of multi-tasking in clinical settings that draws together lessons from these siloed research efforts. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
A Novel Mittag-Leffler Kernel Based Hybrid Fault Diagnosis Method for Wheeled Robot Driving System.
Yuan, Xianfeng; Song, Mumin; Zhou, Fengyu; Chen, Zhumin; Li, Yan
2015-01-01
Wheeled robots have been successfully applied in many areas, such as industrial handling vehicles and wheeled service robots. To improve the safety and reliability of wheeled robots, this paper presents a novel hybrid fault diagnosis framework based on Mittag-Leffler kernel (ML-kernel) support vector machine (SVM) and Dempster-Shafer (D-S) fusion. Using sensor data sampled under different running conditions, the proposed approach initially establishes multiple principal component analysis (PCA) models for fault feature extraction. The fault feature vectors are then applied to train the probabilistic SVM (PSVM) classifiers that arrive at a preliminary fault diagnosis. To improve the accuracy of preliminary results, a novel ML-kernel based PSVM classifier is proposed in this paper, and the positive definiteness of the ML-kernel is proved as well. The basic probability assignments (BPAs) are defined based on the preliminary fault diagnosis results and their confidence values. Eventually, the final fault diagnosis result is achieved by the fusion of the BPAs. Experimental results show that the proposed framework not only is capable of detecting and identifying the faults in the robot driving system, but also has better performance in stability and diagnosis accuracy compared with the traditional methods.
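The final fusion step applies Dempster's rule of combination to the BPAs produced by the classifiers. The sketch below implements the standard rule for BPAs expressed over frozenset focal elements; the fault labels are hypothetical, and the PCA and ML-kernel PSVM stages are omitted.

```python
def dempster_combine(m1, m2):
    # Dempster's rule: m(A) = (1/(1-K)) * sum over B & C == A of m1(B)*m2(C),
    # where K is the total mass assigned to empty intersections (conflict).
    combined, conflict = {}, 0.0
    for b, v1 in m1.items():
        for c, v2 in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {a: v / (1.0 - conflict) for a, v in combined.items()}
```

For example, two sources that each put most of their mass on the same fault hypothesis reinforce each other, concentrating the combined mass on that hypothesis.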
Correlations between Community Structure and Link Formation in Complex Networks
Liu, Zhen; He, Jia-Lin; Kapoor, Komal; Srivastava, Jaideep
2013-01-01
Background Links in complex networks commonly represent specific ties between pairs of nodes, such as protein-protein interactions in biological networks or friendships in social networks. However, understanding the mechanism of link formation in complex networks is a long-standing challenge for network analysis and data mining. Methodology/Principal Findings Links in complex networks have a tendency to cluster locally and form so-called communities. This widespread phenomenon reflects some underlying mechanism of link formation. To study the correlations between community structure and link formation, we present a general computational framework including a theory for network partitioning and link probability estimation. Our approach enables us to accurately identify missing links in partially observed networks in an efficient way. The links having high connection likelihoods in the communities reveal that links are formed preferentially to create cliques and accordingly promote the clustering level of the communities. The experimental results verify that such a mechanism can be well captured by our approach. Conclusions/Significance Our findings provide a new insight into understanding how links are created in the communities. The computational framework opens a wide range of possibilities to develop new approaches and applications, such as community detection and missing link prediction. PMID:24039818
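The core idea, missing links are most likely inside densely connected communities, can be sketched with a deliberately simplified estimator: score each unobserved within-community pair by that community's internal edge density. This is a stand-in for the paper's full partitioning and link-probability theory, and the partition is taken as given rather than inferred.

```python
from collections import defaultdict

def missing_link_scores(edges, partition):
    # Score unobserved within-community node pairs by the community's
    # internal edge density -- a simplified stand-in for the paper's
    # full link-probability estimator. `partition` maps node -> community id.
    members = defaultdict(list)
    for node, community in partition.items():
        members[community].append(node)
    observed = {frozenset(e) for e in edges}
    scores = {}
    for nodes in members.values():
        pairs = [frozenset((a, b))
                 for i, a in enumerate(nodes) for b in nodes[i + 1:]]
        if not pairs:
            continue
        density = sum(1 for p in pairs if p in observed) / len(pairs)
        for p in pairs:
            if p not in observed:
                scores[p] = density  # candidate missing-link score
    return scores
```

Ranking candidate pairs by these scores yields a missing-link prediction list; denser communities naturally push their unobserved pairs toward the top.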
2018-01-01
In this work, mid-infrared (mid-IR), far-IR, and Raman spectra are presented for the distinct (meta)stable phases of the flexible metal–organic framework MIL-53(Al). Static density functional theory (DFT) simulations are performed, allowing for the identification of all IR-active modes, which is unprecedented in the low-frequency region. A unique vibrational fingerprint is revealed, resulting from aluminum-oxide backbone stretching modes, which can be used to clearly distinguish the IR spectra of the closed- and large-pore phases. Furthermore, molecular dynamics simulations based on a DFT description of the potential energy surface enable determination of the theoretical Raman spectrum of the closed- and large-pore phases for the first time. An excellent correspondence between theory and experiment is observed. Both the low-frequency IR and Raman spectra show major differences in vibrational modes between the closed- and large-pore phases, indicating changes in lattice dynamics between the two structures. In addition, several collective modes related to the breathing mechanism in MIL-53(Al) are identified. In particular, we rationalize the importance of the trampoline-like motion of the linker for the phase transition. PMID:29449906
A software framework for developing measurement applications under variable requirements.
Arpaia, Pasquale; Buzio, Marco; Fiscarelli, Lucio; Inglese, Vitaliano
2012-11-01
A framework for easily developing software for measurement and test applications under highly and rapidly varying requirements is proposed. The framework allows software quality, in terms of flexibility, usability, and maintainability, to be maximized. Furthermore, the development effort is reduced and focused, by relieving the test engineer of development details. The framework can be configured to satisfy a large set of measurement applications in a generic field for an industrial test division, a test laboratory, or a research center. As an experimental case study, the design, implementation, and assessment of the framework in a magnet-testing measurement scenario at the European Organization for Nuclear Research are reported.
Innovation in neurosurgery: less than IDEAL? A systematic review.
Muskens, I S; Diederen, S J H; Senders, J T; Zamanipoor Najafabadi, A H; van Furth, W R; May, A M; Smith, T R; Bredenoord, A L; Broekman, M L D
2017-10-01
Surgical innovation is different from the introduction of novel pharmaceuticals. To help address this, in 2009 the IDEAL Collaboration (Idea, Development, Exploration, Assessment, Long-term follow-up) introduced the five-stage framework for surgical innovation. To evaluate the framework's feasibility for the introduction of novel neurosurgical procedures, two innovative surgical procedures were examined: the endoscopic endonasal approach for skull base meningiomas (EEMS) and the Woven EndoBridge (WEB) device for endovascular treatment of intracranial aneurysms. The published literature on EEMS and WEB devices was systematically reviewed. Identified studies were classified according to IDEAL framework stage. Next, studies were evaluated for possible categorization according to the IDEAL framework. Five hundred seventy-six papers describing EEMS were identified, of which 26 papers were included. No prospective studies were identified, and no studies reported on ethical approval or patient informed consent for the innovative procedure. Therefore, no clinical studies could be categorized according to the IDEAL framework. For WEB devices, 6229 articles were screened, of which 21 were included. In contrast to EEMS, two studies were categorized as stage 2a and two as stage 2b. The results of this systematic review demonstrate that both EEMS and WEB devices were not introduced according to the IDEAL framework (which, in the case of EEMS, was developed later). Elements of the framework such as informed consent, ethical approval, and rigorous outcomes reporting are important and could serve to improve the quality of neurosurgical research. Alternative study designs and the use of big data could be useful modifications of the IDEAL framework for innovation in neurosurgery.
NASA Astrophysics Data System (ADS)
Ao, Ping
2011-03-01
There has been tremendous progress in cancer research. However, it appears that the current dominant cancer research framework of regarding cancer as a disease of the genome leads to an impasse. Naturally, questions have been asked as to whether it is possible to develop alternative frameworks that can connect both to mutations and other genetic/genomic effects and to environmental factors. Furthermore, such a framework should be quantitative and make experimentally testable predictions. In this talk, I will present a positive answer to this call. I will explain our construction of an endogenous network theory based on molecular-cellular agencies as dynamical variables. Such a cancer theory explicitly demonstrates a profound connection to many fundamental concepts in physics, such as stochastic non-equilibrium processes, ``energy'' landscape, metastability, etc. It suggests that beneath cancer's daunting complexity may lie a simplicity that gives grounds for hope. The rationales behind this theory, its predictions, and its initial experimental verifications will be presented. Supported by USA NIH and China NSF.
Non-Markovian quantum processes: Complete framework and efficient characterization
NASA Astrophysics Data System (ADS)
Pollock, Felix A.; Rodríguez-Rosario, César; Frauenheim, Thomas; Paternostro, Mauro; Modi, Kavan
2018-01-01
Currently, there is no systematic way to describe a quantum process with memory solely in terms of experimentally accessible quantities. However, recent technological advances mean we have control over systems at scales where memory effects are non-negligible. The lack of such an operational description has hindered advances in understanding physical, chemical, and biological processes, where often unjustified theoretical assumptions are made to render a dynamical description tractable. This has led to theories plagued with unphysical results and no consensus on what a quantum Markov (memoryless) process is. Here, we develop a universal framework to characterize arbitrary non-Markovian quantum processes. We show how a multitime non-Markovian process can be reconstructed experimentally, and that it has a natural representation as a many-body quantum state, where temporal correlations are mapped to spatial ones. Moreover, this state is expected to have an efficient matrix-product-operator form in many cases. Our framework constitutes a systematic tool for the effective description of memory-bearing open-system evolutions.
Rimoldi, Martino; Bernales, Varinia; Borycz, Joshua; ...
2017-01-05
NU-1000, a zirconium-based metal-organic framework featuring mesoporous channels, has been post-synthetically metalated via atomic layer deposition in MOF (AIM) employing dimethylaluminum isopropoxide ([AlMe2iOPr]2, DMAI), a milder precursor than the widely used trimethylaluminum (AlMe3, TMA). The aluminum-modified NU-1000 (Al-NU-1000) has been characterized with a comprehensive suite of techniques that points to the formation of aluminum oxide clusters well dispersed through the framework and stabilized by confinement within small pores intrinsic to the NU-1000 structure. Experimental evidence allows for identification of spectroscopic similarities between Al-NU-1000 and γ-Al2O3. Density functional theory modeling provides structures and simulated spectra, the relevance of which can be assessed via comparison to experimental IR and EXAFS data. As a result, the catalytic performance of Al-NU-1000 has been benchmarked against γ-Al2O3, with promising results in terms of selectivity.
Bose-Einstein correlation within the framework of hadronic mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burande, Chandrakant S.
The Bose-Einstein correlation is the phenomenon in which protons and antiprotons colliding at extremely high energies coalesce into a fireball of finite dimension, annihilate, and produce a large number of mesons that remain correlated at distances very large compared with the size of the fireball. It was believed that Einstein’s special relativity and relativistic quantum mechanics are valid frameworks for representing this phenomenon. However, these frameworks are incomplete and require arbitrary parameters (chaoticity) to fit the experimental data, parameters that are prohibited by the basic axioms of relativistic quantum mechanics, such as that for the vacuum expectation values. Moreover, correlated mesons cannot be treated as a finite set of isolated point-like particles, because the correlation is a non-local event due to overlapping of wavepackets. Therefore, the Bose-Einstein correlation is incompatible with the axiom of expectation values of quantum mechanics. By contrast, the relativistic hadronic mechanics constructed by Santilli allows an exact representation of the experimental data of the Bose-Einstein correlation and restores the validity of the Lorentz and Poincare symmetries under nonlocal and non-Hamiltonian internal effects. Further, F. Cardone and R. Mignani observed that the Bose-Einstein two-point correlation function derived by Santilli matches the experimental data at high energy very well.
Rasmussen, Peter M.; Smith, Amy F.; Sakadžić, Sava; Boas, David A.; Pries, Axel R.; Secomb, Timothy W.; Østergaard, Leif
2017-01-01
Objective In vivo imaging of the microcirculation and network-oriented modeling have emerged as powerful means of studying microvascular function and understanding its physiological significance. Network-oriented modeling may provide the means of summarizing vast amounts of data produced by high-throughput imaging techniques in terms of key, physiological indices. To estimate such indices with sufficient certainty, however, network-oriented analysis must be robust to the inevitable presence of uncertainty due to measurement errors as well as model errors. Methods We propose the Bayesian probabilistic data analysis framework as a means of integrating experimental measurements and network model simulations into a combined and statistically coherent analysis. The framework naturally handles noisy measurements and provides posterior distributions of model parameters as well as physiological indices associated with uncertainty. Results We applied the analysis framework to experimental data from three rat mesentery networks and one mouse brain cortex network. We inferred distributions for more than five hundred unknown pressure and hematocrit boundary conditions. Model predictions were consistent with previous analyses, and remained robust when measurements were omitted from model calibration. Conclusion Our Bayesian probabilistic approach may be suitable for optimizing data acquisition and for analyzing and reporting large datasets acquired as part of microvascular imaging studies. PMID:27987383
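The Bayesian workflow described above can be illustrated with a deliberately minimal sketch, assuming a trivial scalar forward model in place of the network simulation and hypothetical measurement values; a Metropolis sampler draws from the posterior of one unknown boundary parameter:

```python
# Hedged toy sketch of Bayesian calibration (not the paper's network model):
# infer an unknown boundary pressure theta from noisy measurements of
# theta / 2, with a weak Gaussian prior, via random-walk Metropolis.
import math
import random

def log_posterior(theta, data, sigma=2.0, prior_sd=50.0):
    forward = theta / 2.0  # stand-in for the network forward simulation
    loglik = sum(-(d - forward) ** 2 / (2 * sigma ** 2) for d in data)
    logprior = -(theta - 60.0) ** 2 / (2 * prior_sd ** 2)
    return loglik + logprior

def metropolis(data, n=20000, step=2.0, seed=1):
    random.seed(seed)
    theta, lp = 60.0, log_posterior(60.0, data)
    samples = []
    for _ in range(n):
        prop = theta + random.gauss(0, step)
        lp_prop = log_posterior(prop, data)
        if math.log(random.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop  # accept the proposal
        samples.append(theta)
    return samples

data = [24.1, 25.3, 24.7]  # hypothetical noisy measurements of theta / 2
samples = metropolis(data)
mean = sum(samples[5000:]) / len(samples[5000:])
```

The retained samples (after burn-in) approximate the posterior, so the mean lands near twice the data mean (about 49-50) rather than at the prior mean of 60, with the spread quantifying the uncertainty the paper's framework propagates to physiological indices.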
Tile prediction schemes for wide area motion imagery maps in GIS
NASA Astrophysics Data System (ADS)
Michael, Chris J.; Lin, Bruce Y.
2017-11-01
Wide-area surveillance, traffic monitoring, and emergency management are just several of many applications benefiting from the incorporation of Wide-Area Motion Imagery (WAMI) maps into geographic information systems. Though the use of motion imagery as a GIS base map via the Web Map Service (WMS) standard is not a new concept, effectively streaming imagery is particularly challenging due to its large scale and the multidimensionally interactive nature of clients that use WMS. Ineffective streaming from a server to one or more clients can unnecessarily overwhelm network bandwidth and cause frustratingly large amounts of latency in visualization to the user. Seamlessly streaming WAMI through GIS requires good prediction to accurately guess the tiles of the video that will be traversed in the near future. In this study, we present an experimental framework for such prediction schemes by presenting a stochastic interaction model that represents a human user's interaction with a GIS video map. We then propose several algorithms by which the tiles of the stream may be predicted. Results collected both within the experimental framework and using human analyst trajectories show that, though each algorithm thrives under certain constraints, the novel Markovian algorithm yields the best results overall. Furthermore, we make the argument that the proposed experimental framework is sufficient for the study of these prediction schemes.
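A first-order Markov predictor of the kind evaluated above can be sketched as follows (the tile coordinates and prefetch policy here are illustrative assumptions, not the paper's algorithm): transitions between visited tiles are counted, and the most frequent successors of the current tile are prefetched.

```python
# Hedged sketch of a first-order Markov tile predictor: learn transition
# counts from a past tile trajectory, then prefetch the most probable
# next tiles for the client's current position.
from collections import Counter, defaultdict

class MarkovTilePredictor:
    def __init__(self):
        self.transitions = defaultdict(Counter)

    def observe(self, trajectory):
        """Record tile-to-tile transitions from a visit sequence."""
        for cur, nxt in zip(trajectory, trajectory[1:]):
            self.transitions[cur][nxt] += 1

    def predict(self, current, k=2):
        """Return up to k most likely next tiles after `current`."""
        return [t for t, _ in self.transitions[current].most_common(k)]

pred = MarkovTilePredictor()
pred.observe([(0, 0), (0, 1), (0, 2), (0, 1), (0, 0), (0, 1), (0, 2)])
nxt = pred.predict((0, 1))
```

The server would speculatively stream the tiles in `nxt` before the client requests them; a real deployment would also need to bound prefetch bandwidth and handle previously unseen tiles.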
Investigation of Effective Strategies for Developing Creative Science Thinking
ERIC Educational Resources Information Center
Yang, Kuay-Keng; Lee, Ling; Hong, Zuway-R; Lin, Huann-shyang
2016-01-01
The purpose of this study was to explore the effectiveness of the creative inquiry-based science teaching on students' creative science thinking and science inquiry performance. A quasi-experimental design consisting one experimental group (N = 20) and one comparison group (N = 24) with pretest and post-test was conducted. The framework of the…
The Use of Techniques of Sensory Evaluation as a Framework for Teaching Experimental Methods.
ERIC Educational Resources Information Center
Bennett, R.; Hamilton, M.
1981-01-01
Describes sensory assessment techniques and conditions for their satisfactory performance, including how they can provide open-ended exercises and advantages as relatively inexpensive and simple methods of teaching experimentation. Experiments described focus on diffusion of salt into potatoes after being cooked in boiled salted water. (Author/JN)
Unpacking the Hidden Efficacies of Learning in Productive Failure
ERIC Educational Resources Information Center
Hung, David; Chen, Victor; Lim, Seo Hong
2009-01-01
This paper describes a framework for learning where learners undergo experimentations with the phenomena at hand according to progressive and staged goals. Bowling is used as a case study in this paper. The premise for experimentations is that learners can experience hidden efficacies, including the formation of "bad habits." A distinction is made…
Understanding Leadership: An Experimental-Experiential Model
ERIC Educational Resources Information Center
Hole, George T.
2014-01-01
Books about leadership are dangerous to readers who fantasize about being leaders or apply leadership ideas as if they were proven formulas. As an antidote, I offer an experimental framework in which any leadership-management model can be tested to gain experiential understanding of the model. As a result one can gain reality-based insights about…
NASA Astrophysics Data System (ADS)
Shao, Hongbing
Software testing of scientific software systems often suffers from the test oracle problem, i.e., the lack of test oracles. The Amsterdam discrete dipole approximation code (ADDA) is a scientific software system that can be used to simulate light scattering by scatterers of various types. Testing of ADDA suffers from the test oracle problem. In this thesis work, I established a testing framework for scientific software systems and evaluated this framework using ADDA as a case study. To test ADDA, I first used the CMMIE code as a pseudo-oracle to test ADDA in simulating light scattering by a homogeneous sphere scatterer. Comparable results were obtained between ADDA and the CMMIE code. This validated ADDA for use with homogeneous sphere scatterers. Then I used an experimental result obtained for light scattering by a homogeneous sphere to further validate the use of ADDA with sphere scatterers. ADDA produced a light scattering simulation comparable to the experimentally measured result. This further validated the use of ADDA for simulating light scattering by sphere scatterers. Then I used metamorphic testing to generate test cases covering scatterers of various geometries, orientations, and homogeneity or non-homogeneity. ADDA was tested under each of these test cases and all tests passed. The use of statistical analysis together with metamorphic testing is discussed as a future direction. In short, using ADDA as a case study, I established a testing framework, including the use of pseudo-oracles, experimental results, and metamorphic testing techniques, to test scientific software systems that suffer from test oracle problems. Each of these techniques is necessary and contributes to the testing of the software under test.
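The metamorphic-testing idea can be illustrated with a minimal sketch; `simulate` below is a stand-in for a scattering code such as ADDA, not its real interface, and rotational invariance for spheres is one plausible metamorphic relation:

```python
# Hedged sketch of metamorphic testing: when no exact oracle exists,
# check a relation between outputs instead of an output value. Here the
# relation is that a sphere's cross-section cannot depend on orientation.
import math

def simulate(radius, orientation_deg):
    # Toy stand-in for a scattering simulation; for a sphere the result
    # is orientation-independent by symmetry.
    return math.pi * radius ** 2

def test_rotation_invariance_for_spheres():
    base = simulate(1.5, 0.0)             # source test case
    for angle in (30.0, 90.0, 180.0):
        follow_up = simulate(1.5, angle)  # metamorphic follow-up cases
        assert math.isclose(base, follow_up, rel_tol=1e-9)

test_rotation_invariance_for_spheres()
```

The pattern generalizes: each geometry or symmetry of the scatterer suggests a relation (rotation, scaling, mirror symmetry) whose violation flags a defect even though no reference output is known.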
Clark, Ryan L; McGinley, Laura L; Purdy, Hugh M; Korosh, Travis C; Reed, Jennifer L; Root, Thatcher W; Pfleger, Brian F
2018-03-27
Cyanobacteria are photosynthetic microorganisms whose metabolism can be modified through genetic engineering for production of a wide variety of molecules directly from CO2, light, and nutrients. Diverse molecules have been produced in small quantities by engineered cyanobacteria to demonstrate the feasibility of photosynthetic biorefineries. Consequently, there is interest in engineering these microorganisms to increase titer and productivity to meet industrial metrics. Unfortunately, differing experimental conditions and cultivation techniques confound comparisons of strains and metabolic engineering strategies. In this work, we discuss the factors governing photoautotrophic growth and demonstrate nutritionally replete conditions in which a model cyanobacterium can be grown to stationary phase with light as the sole limiting substrate. We introduce a mathematical framework for understanding the dynamics of growth and product secretion in light-limited cyanobacterial cultures. Using this framework, we demonstrate how cyanobacterial growth in differing experimental systems can be easily scaled by the volumetric photon delivery rate, using the model organisms Synechococcus sp. strain PCC7002 and Synechococcus elongatus strain UTEX2973. We use this framework to predict scaled-up growth and product secretion in 1-L photobioreactors of two strains of Synechococcus PCC7002 engineered for production of L-lactate or L-lysine. The analytical framework developed in this work serves as a guide for future metabolic engineering studies of cyanobacteria to allow better comparison of experiments performed in different experimental systems and to further investigate the dynamics of growth and product secretion. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
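The light-limited scaling idea can be sketched numerically under simplifying assumptions (full light absorption and a constant biomass yield on photons; the symbols and parameter values are illustrative, not the paper's model):

```python
# Hedged sketch: when light is the sole limiting substrate and the culture
# absorbs essentially all delivered photons, biomass accumulates linearly
# at a rate set by the volumetric photon delivery rate times a yield.
# Units and values below are hypothetical.

def light_limited_growth(x0, photon_delivery, yield_per_photon,
                         hours, dt=0.1):
    """Euler-integrate dX/dt = yield * photon_delivery."""
    x, t = x0, 0.0
    while t < hours:
        x += yield_per_photon * photon_delivery * dt
        t += dt
    return x

# x0 in g/L; photon_delivery in mol photons / (L h);
# yield_per_photon in g biomass / mol photons (all hypothetical):
final = light_limited_growth(x0=0.1, photon_delivery=0.01,
                             yield_per_photon=1.0, hours=48.0)
```

Because the rate depends only on the volumetric photon delivery rate, two reactors of different geometry delivering the same photons per liter per hour predict the same trajectory, which is the basis for the cross-system scaling the abstract describes.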
2013-01-01
Background Optimization procedures to identify gene knockouts for targeted biochemical overproduction have been widely used in modern metabolic engineering. The flux balance analysis (FBA) framework has provided conceptual simplifications for genome-scale dynamic analysis at steady states. Based on FBA, many current optimization methods for targeted bio-productions have been developed under the maximum cell growth assumption. The optimization problem to derive gene knockout strategies has recently been formulated as a bi-level programming problem in OptKnock for maximum targeted bio-productions with maximum growth rates. However, it has been shown that knockout mutants in fact reach steady states with the minimization of metabolic adjustment (MOMA) from the corresponding wild-type strains instead of having maximal growth rates after genetic or metabolic intervention. In this work, we propose a new bi-level computational framework--MOMAKnock--which can derive robust knockout strategies under the MOMA flux distribution approximation. Methods In this new bi-level optimization framework, we aim to maximize the production of targeted chemicals by identifying candidate knockout genes or reactions under phenotypic constraints approximated by the MOMA assumption. Hence, the targeted chemical production is the primary objective of MOMAKnock, while the MOMA assumption is formulated as the inner problem of constraining the knockout metabolic flux to be as close as possible to the steady-state phenotypes of wild-type strains. As this new inner problem becomes a quadratic programming problem, a novel adaptive piecewise linearization algorithm is developed in this paper to obtain the exact optimal solution to this new bi-level integer quadratic programming problem for MOMAKnock. Results Our new MOMAKnock model and the adaptive piecewise linearization solution algorithm are tested with a small E. coli core metabolic network and a large-scale iAF1260 E. coli metabolic network. 
The derived knockout strategies are compared with those from OptKnock. Our preliminary experimental results show that MOMAKnock can provide improved targeted productions with more robust knockout strategies. PMID:23368729
Data governance in predictive toxicology: A review.
Fu, Xin; Wojak, Anna; Neagu, Daniel; Ridley, Mick; Travis, Kim
2011-07-13
Due to recent advances in data storage and sharing for further data processing in predictive toxicology, there is an increasing need for flexible data representations, secure and consistent data curation and automated data quality checking. Toxicity prediction involves multidisciplinary data. There are hundreds of collections of chemical, biological and toxicological data that are widely dispersed, mostly in the open literature, professional research bodies and commercial companies. In order to better manage and make full use of such a large amount of toxicity data, there is a trend to develop functionalities aiming towards data governance in predictive toxicology to formalise a set of processes to guarantee high data quality and better data management. In this paper, data quality is meant mainly in a data storage sense (e.g. accuracy, completeness and integrity) and not in a toxicological sense (e.g. the quality of experimental results). This paper reviews seven widely used predictive toxicology data sources and applications, with a particular focus on their data governance aspects, including: data accuracy, data completeness, data integrity, metadata and its management, data availability and data authorisation. This review reveals the current problems (e.g. lack of systematic and standard measures of data quality) and desirable needs (e.g. better management and further use of captured metadata and the development of flexible multi-level user access authorisation schemas) in the development of predictive toxicology data sources. The analytical results will help to address a significant gap in toxicology data quality assessment and lead to the development of novel frameworks for predictive toxicology data and model governance. While the discussed public data sources are well developed, there nevertheless remain some gaps in the development of a data governance framework to support predictive toxicology. 
In this paper, data governance is identified as the new challenge in predictive toxicology, and good use of it may provide a promising framework for developing high-quality and easily accessible toxicity data repositories. This paper also identifies important research directions that require further investigation in this area.
Data governance in predictive toxicology: A review
2011-01-01
Background Due to recent advances in data storage and sharing for further data processing in predictive toxicology, there is an increasing need for flexible data representations, secure and consistent data curation and automated data quality checking. Toxicity prediction involves multidisciplinary data. There are hundreds of collections of chemical, biological and toxicological data that are widely dispersed, mostly in the open literature, professional research bodies and commercial companies. In order to better manage and make full use of such a large amount of toxicity data, there is a trend to develop functionalities aiming towards data governance in predictive toxicology to formalise a set of processes to guarantee high data quality and better data management. In this paper, data quality is meant mainly in a data storage sense (e.g. accuracy, completeness and integrity) and not in a toxicological sense (e.g. the quality of experimental results). Results This paper reviews seven widely used predictive toxicology data sources and applications, with a particular focus on their data governance aspects, including: data accuracy, data completeness, data integrity, metadata and its management, data availability and data authorisation. This review reveals the current problems (e.g. lack of systematic and standard measures of data quality) and desirable needs (e.g. better management and further use of captured metadata and the development of flexible multi-level user access authorisation schemas) in the development of predictive toxicology data sources. The analytical results will help to address a significant gap in toxicology data quality assessment and lead to the development of novel frameworks for predictive toxicology data and model governance. Conclusions While the discussed public data sources are well developed, there nevertheless remain some gaps in the development of a data governance framework to support predictive toxicology. 
In this paper, data governance is identified as the new challenge in predictive toxicology, and good use of it may provide a promising framework for developing high-quality and easily accessible toxicity data repositories. This paper also identifies important research directions that require further investigation in this area. PMID:21752279
Ren, Shaogang; Zeng, Bo; Qian, Xiaoning
2013-01-01
Optimization procedures to identify gene knockouts for targeted biochemical overproduction have been widely used in modern metabolic engineering. The flux balance analysis (FBA) framework has provided conceptual simplifications for genome-scale dynamic analysis at steady states. Based on FBA, many current optimization methods for targeted bio-productions have been developed under the maximum cell growth assumption. The optimization problem to derive gene knockout strategies has recently been formulated as a bi-level programming problem in OptKnock for maximum targeted bio-productions with maximum growth rates. However, it has been shown that knockout mutants in fact reach steady states with the minimization of metabolic adjustment (MOMA) from the corresponding wild-type strains instead of having maximal growth rates after genetic or metabolic intervention. In this work, we propose a new bi-level computational framework--MOMAKnock--which can derive robust knockout strategies under the MOMA flux distribution approximation. In this new bi-level optimization framework, we aim to maximize the production of targeted chemicals by identifying candidate knockout genes or reactions under phenotypic constraints approximated by the MOMA assumption. Hence, the targeted chemical production is the primary objective of MOMAKnock, while the MOMA assumption is formulated as the inner problem of constraining the knockout metabolic flux to be as close as possible to the steady-state phenotypes of wild-type strains. As this new inner problem becomes a quadratic programming problem, a novel adaptive piecewise linearization algorithm is developed in this paper to obtain the exact optimal solution to this new bi-level integer quadratic programming problem for MOMAKnock. Our new MOMAKnock model and the adaptive piecewise linearization solution algorithm are tested with a small E. coli core metabolic network and a large-scale iAF1260 E. coli metabolic network. 
The derived knockout strategies are compared with those from OptKnock. Our preliminary experimental results show that MOMAKnock can provide improved targeted productions with more robust knockout strategies.
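The MOMA inner problem described above is a quadratic program; for an unbounded toy network it reduces to a KKT linear system, as in this sketch (a minimal illustration, not the MOMAKnock solver, which handles the full bi-level integer problem and flux bounds):

```python
# Hedged sketch of the MOMA inner problem: minimize ||v - v_wt||^2
# subject to S v = 0 (steady state) and knocked-out fluxes pinned to
# zero, solved via the equality-constrained KKT linear system.
import numpy as np

def moma(S, v_wt, knockouts):
    n = len(v_wt)
    # Equality constraints: stoichiometric balance plus v_j = 0 per knockout.
    rows = [S[i] for i in range(S.shape[0])]
    for j in knockouts:
        e = np.zeros(n)
        e[j] = 1.0
        rows.append(e)
    A = np.array(rows)
    m = A.shape[0]
    # KKT system: [[I, A^T], [A, 0]] [v; lam] = [v_wt; 0]
    kkt = np.block([[np.eye(n), A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([v_wt, np.zeros(m)])
    sol = np.linalg.lstsq(kkt, rhs, rcond=None)[0]
    return sol[:n]

# Toy network: one metabolite, three reactions (uptake, outlet 1, outlet 2).
S = np.array([[1.0, -1.0, -1.0]])
v_wt = np.array([10.0, 7.0, 3.0])
v_ko = moma(S, v_wt, knockouts=[1])  # knock out outlet 1
```

With outlet 1 removed, the flux closest to the wild-type distribution reroutes through the remaining outlet while rebalancing uptake, giving v = (6.5, 0, 6.5) here; MOMAKnock wraps many such inner solves inside the outer knockout search.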
Mavandadi, Sam; Feng, Steve; Yu, Frank; Dimitrov, Stoyan; Nielsen-Saines, Karin; Prescott, William R; Ozcan, Aydogan
2012-01-01
We propose a methodology for digitally fusing diagnostic decisions made by multiple medical experts in order to improve the accuracy of diagnosis. Toward this goal, we report an experimental study involving nine experts, where each one was given more than 8,000 digital microscopic images of individual human red blood cells and asked to identify malaria-infected cells. The results of this experiment reveal that even highly trained medical experts are not always self-consistent in their diagnostic decisions and that there exists a fair level of disagreement among experts, even for binary decisions (i.e., infected vs. uninfected). To tackle this general medical diagnosis problem, we propose a probabilistic algorithm to fuse the decisions made by trained medical experts to robustly achieve higher levels of accuracy when compared to individual experts making such decisions. By modelling the decisions of experts as a three-component mixture model and solving for the underlying parameters using the Expectation Maximisation algorithm, we demonstrate the efficacy of our approach, which significantly improves the overall diagnostic accuracy for malaria-infected cells. Additionally, we present a mathematical framework for performing 'slide-level' diagnosis by using individual 'cell-level' diagnosis data, shedding more light on the statistical rules that should govern the routine practice in the examination of, e.g., thin blood smear samples. This framework could be generalized for various other tele-pathology needs, and can be used by trained experts within an efficient tele-medicine platform.
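The decision-fusion idea can be sketched with a simplified two-class EM (a Dawid-Skene-style model rather than the paper's three-component mixture; the label matrix and initial values are hypothetical):

```python
# Hedged sketch of EM-based fusion of binary expert labels: alternately
# estimate each cell's probability of infection (E-step) and each
# expert's accuracy (M-step), with symmetric accuracies for brevity.

def em_fuse(labels, iters=50):
    """labels[i][j] = 0/1 decision of expert j on item i."""
    n, k = len(labels), len(labels[0])
    acc = [0.8] * k  # initial guess for each expert's accuracy
    prior = 0.5
    for _ in range(iters):
        # E-step: posterior probability that each item is positive.
        post = []
        for row in labels:
            p1, p0 = prior, 1 - prior
            for j, y in enumerate(row):
                p1 *= acc[j] if y == 1 else 1 - acc[j]
                p0 *= acc[j] if y == 0 else 1 - acc[j]
            post.append(p1 / (p1 + p0))
        # M-step: re-estimate the prior and expert accuracies.
        prior = sum(post) / n
        for j in range(k):
            agree = sum(p if row[j] == 1 else 1 - p
                        for row, p in zip(labels, post))
            acc[j] = agree / n
    return post, acc

# Four cells, three experts (hypothetical decisions):
labels = [[1, 1, 1], [1, 1, 0], [0, 0, 0], [0, 1, 0]]
post, acc = em_fuse(labels)
```

The fused posteriors weight experts by their inferred reliability, so a majority vote from less reliable experts can be overruled; here expert 1, who agrees most with the consensus, ends up with the highest estimated accuracy.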
Mikuła, Andrzej; Król, Magdalena; Mozgawa, Włodzimierz; Koleżyński, Andrzej
2018-04-15
Vibrational spectroscopy can be considered one of the most important methods used for structural characterization of various porous aluminosilicate materials, including zeolites. On the other hand, vibrational spectra of zeolites are still difficult to interpret, particularly in the pseudolattice region, where bands related to ring oscillations can be observed. Using a combination of theoretical and computational approaches, a detailed analysis of these regions of the spectra is possible; such an analysis should, however, be carried out employing models with different levels of complexity and simultaneously the same level of theory. In this work, an attempt was made to identify ring oscillations in the vibrational spectra of selected zeolite structures. A series of ab initio calculations focused on S4R, S6R, and, as a novelty, 5-1 isolated clusters, as well as periodic siliceous frameworks built from those building units (ferrierite (FER), mordenite (MOR) and heulandite (HEU) type), has been carried out. Due to the hierarchical structure of zeolite frameworks, it can be expected that the total envelope of the zeolite spectrum should be, to a good accuracy, the sum of the spectra of the structural elements that build each zeolite framework. Based on the results of HF calculations, normal vibrations have been visualized and a detailed analysis of the pseudolattice range of the resulting theoretical spectra has been carried out. The obtained results have been applied to the interpretation of experimental spectra of selected zeolites. Copyright © 2018 Elsevier B.V. All rights reserved.
Pedestrian detection from thermal images: A sparse representation based approach
NASA Astrophysics Data System (ADS)
Qi, Bin; John, Vijay; Liu, Zheng; Mita, Seiichi
2016-05-01
Pedestrian detection, a key technology in computer vision, plays a paramount role in the applications of advanced driver assistance systems (ADASs) and autonomous vehicles. The objective of pedestrian detection is to identify and locate people in a dynamic environment so that accidents can be avoided. With significant variations introduced by illumination, occlusion, articulated pose, and complex background, pedestrian detection is a challenging task for visual perception. Different from visible images, thermal images are captured and presented as intensity maps based on objects' emissivity, and thus have an enhanced spectral range that makes human beings perceptible against the cool background. In this study, a sparse representation based approach is proposed for pedestrian detection from thermal images. We first adopted the histogram of sparse code to represent image features and then detected pedestrians with the extracted features in a unimodal and a multimodal framework, respectively. In the unimodal framework, two types of dictionaries, i.e. joint dictionary and individual dictionary, are built by learning from prepared training samples. In the multimodal framework, a weighted fusion scheme is proposed to further highlight the contributions from features with higher separability. To validate the proposed approach, experiments were conducted to compare with three widely used features: Haar wavelets (HWs), histogram of oriented gradients (HOG), and histogram of phase congruency (HPC), as well as two classification methods, i.e. AdaBoost and support vector machine (SVM). Experimental results on a publicly available data set demonstrate the superiority of the proposed approach.
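A histogram-of-sparse-codes feature can be sketched under strong simplifications (1-sparse coding against a random dictionary; the paper's learned dictionaries and encoder are more sophisticated):

```python
# Hedged sketch: each image patch is assigned its best-matching dictionary
# atom (a 1-sparse code), and the normalized atom-usage histogram serves
# as the image's feature vector for a downstream classifier.
import numpy as np

def sparse_code_histogram(patches, dictionary):
    """patches: (n, d) rows; dictionary: (k, d) unit-norm atoms."""
    responses = patches @ dictionary.T           # correlation with each atom
    assignments = np.abs(responses).argmax(1)    # 1-sparse code per patch
    hist = np.bincount(assignments, minlength=dictionary.shape[0])
    return hist / hist.sum()                     # normalized feature vector

# Random stand-ins for a learned dictionary and extracted thermal patches:
rng = np.random.default_rng(0)
dictionary = rng.normal(size=(8, 16))
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)
patches = rng.normal(size=(100, 16))
feat = sparse_code_histogram(patches, dictionary)
```

The resulting fixed-length vector could then be fed to an SVM or AdaBoost classifier, analogous to how HOG histograms are used; richer encoders would allow several nonzero coefficients per patch.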
Intuitive experimentation in the physical world.
Bramley, Neil R; Gerstenberg, Tobias; Tenenbaum, Joshua B; Gureckis, Todd M
2018-06-06
Many aspects of our physical environment are hidden. For example, it is hard to estimate how heavy an object is from visual observation alone. In this paper we examine how people actively "experiment" within the physical world to discover such latent properties. In the first part of the paper, we develop a novel framework for the quantitative analysis of the information produced by physical interactions. We then describe two experiments that present participants with moving objects in "microworlds" that operate according to continuous spatiotemporal dynamics similar to everyday physics (i.e., forces of gravity, friction, etc.). Participants were asked to interact with objects in the microworlds in order to identify their masses, or the forces of attraction/repulsion that governed their movement. Using our modeling framework, we find that learners who freely interacted with the physical system selectively produced evidence that revealed the physical property consistent with their inquiry goal. As a result, their inferences were more accurate than those of passive observers and, in some contexts, of yoked participants who watched video replays of an active learner's interactions. We group active learners' actions into a range of micro-experiment strategies and discuss how these might be learned or generalized from past experience. The technical contribution of this work is the development of a novel analytic framework and methodology for the study of interactive learning about the physical world. Its empirical contribution is the demonstration of sophisticated, goal-directed human active learning in a naturalistic context. Copyright © 2018 Elsevier Inc. All rights reserved.
Adapting evidence-based interventions using a common theory, practices, and principles.
Rotheram-Borus, Mary Jane; Swendeman, Dallas; Becker, Kimberly D
2014-01-01
Hundreds of validated evidence-based intervention programs (EBIP) aim to improve families' well-being; however, most are not broadly adopted. As an alternative diffusion strategy, we created wellness centers to reach families' everyday lives with a prevention framework. At two wellness centers, one in a middle-class neighborhood and one in a low-income neighborhood, popular local activity leaders (instructors of martial arts, yoga, sports, music, dancing, Zumba), and motivated parents were trained to be Family Mentors. Trainings focused on a framework that taught synthesized, foundational prevention science theory, practice elements, and principles, applied to specific content areas (parenting, social skills, and obesity). Family Mentors were then allowed to adapt scripts and activities based on their cultural experiences but were closely monitored and supervised over time. The framework was implemented in a range of activities (summer camps, coaching) aimed at improving social, emotional, and behavioral outcomes. Successes and challenges are discussed for (a) engaging parents and communities; (b) identifying and training Family Mentors to promote children and families' well-being; and (c) gathering data for supervision, outcome evaluation, and continuous quality improvement. To broadly diffuse prevention to families, far more experimentation is needed with alternative and engaging implementation strategies that are enhanced with knowledge harvested from researchers' past 30 years of experience creating EBIP. One strategy is to train local parents and popular activity leaders in applying robust prevention science theory, common practice elements, and principles of EBIP. More systematic evaluation of such innovations is needed.
Review of "Stuck Schools: A Framework for Identifying Schools Where Students Need Change--Now"
ERIC Educational Resources Information Center
Lee, Jaekyung
2010-01-01
The Education Trust research report "Stuck Schools" suggests a framework for identifying chronically low-performing schools in need of turnaround. The study uses Maryland and Indiana to show that some low-performing schools make progress while others remain stagnant. The report has four serious problems of reliability and validity,…
A Theoretical Framework to Guide the Re-Engineering of Technology Education
ERIC Educational Resources Information Center
Kelley, Todd; Kellam, Nadia
2009-01-01
Before leaders in technology education are able to identify a theoretical framework upon which a curriculum is to stand, they must first grapple with two opposing views of the purpose of technology education--education for all learners or career/technical education. Dakers (2006) identifies two opposing philosophies that can serve as a framework…
Identifying Core Mobile Learning Faculty Competencies Based Integrated Approach: A Delphi Study
ERIC Educational Resources Information Center
Elbarbary, Rafik Said
2015-01-01
This study is based on the integrated approach as a concept framework to identify, categorize, and rank a key component of mobile learning core competencies for Egyptian faculty members in higher education. The field investigation framework used four rounds Delphi technique to determine the importance rate of each component of core competencies…
Plant trait detection with multi-scale spectrometry
NASA Astrophysics Data System (ADS)
Gamon, J. A.; Wang, R.
2017-12-01
Proximal and remote sensing using imaging spectrometry offers new opportunities for detecting plant traits, with benefits for phenotyping, productivity estimation, stress detection, and biodiversity studies. Using proximal and airborne spectrometry, we evaluated variation in plant optical properties at various spatial and spectral scales with the goal of identifying optimal scales for distinguishing plant traits related to photosynthetic function. Using directed approaches based on physiological vegetation indices, and statistical approaches based on spectral information content, we explored alternate ways of distinguishing plant traits with imaging spectrometry. With both leaf traits and canopy structure contributing to the signals, results exhibit a strong scale dependence. Our results demonstrate the benefits of multi-scale experimental approaches within a clear conceptual framework when applying remote sensing methods to plant trait detection for phenotyping, productivity, and biodiversity studies.
Yaacoubi, Slah; McKeon, Peter; Ke, Weina; Declercq, Nico F.; Dahmene, Fethi
2017-01-01
This paper presents an overview and description of the approach to be used to investigate the behavior and the defect sensitivity of various ultrasonic guided wave (UGW) modes propagating specifically in composite cylindrical vessels in the framework of the safety of hydrogen energy transportation such as hydrogen-powered aircrafts. These structures which consist of thick and multi-layer composites are envisioned for housing hydrogen gas at high pressures. Due to safety concerns associated with a weakened structure, structural health monitoring techniques are needed. A procedure for optimizing damage detection in these structural types is presented. It is shown that a finite element method can help identify useful experimental parameters including frequency range, excitation type, and receiver placement. PMID:28925961
SPECTRa-T: machine-based data extraction and semantic searching of chemistry e-theses.
Downing, Jim; Harvey, Matt J; Morgan, Peter B; Murray-Rust, Peter; Rzepa, Henry S; Stewart, Diana C; Tonge, Alan P; Townsend, Joe A
2010-02-22
The SPECTRa-T project has developed text-mining tools to extract named chemical entities (NCEs), such as chemical names and terms, and chemical objects (COs), e.g., experimental spectral assignments and physical chemistry properties, from electronic theses (e-theses). Although NCEs were readily identified within the two major document formats studied, only the use of structured documents enabled identification of chemical objects and their association with the relevant chemical entity (e.g., systematic chemical name). A corpus of theses was analyzed and it is shown that a high degree of semantic information can be extracted from structured documents. This integrated information has been deposited in a persistent Resource Description Framework (RDF) triple-store that allows users to conduct semantic searches. The strength and weaknesses of several document formats are reviewed.
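The kind of semantic search an RDF triple-store enables can be illustrated with a toy in-memory store. All subjects, predicates, and values below are invented for illustration and do not reflect the SPECTRa-T schema:

```python
# Minimal in-memory triple store; each fact is a (subject, predicate,
# object) triple, as in RDF. Identifiers here are purely illustrative.
triples = {
    ("thesis:42", "hasCompound", "compound:7"),
    ("compound:7", "hasSystematicName", "2-phenylethanol"),
    ("compound:7", "hasNMRShift", "3.85 ppm"),
}

def query(s=None, p=None, o=None):
    """Return triples matching a (subject, predicate, object) pattern;
    None acts as a wildcard, like a SPARQL basic graph pattern."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# all facts recorded about compound:7
facts = sorted(query(s="compound:7"))
```

A real triple-store adds persistence, indexing, and SPARQL, but the pattern-matching query model is the same: this is what lets a chemical object stay associated with its chemical entity across documents.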
Vibrations of single-crystal gold nanorods and nanowires
NASA Astrophysics Data System (ADS)
Saviot, L.
2018-04-01
The vibrations of gold nanowires and nanorods are investigated numerically in the framework of continuum elasticity using the Rayleigh-Ritz variational method. Special attention is paid to identifying the vibrations relevant in Raman scattering experiments. A comprehensive description of the vibrations of nanorods is proposed by determining their symmetry, comparing them with standing waves in the corresponding nanowires, and estimating their Raman intensity. The role of experimentally relevant parameters such as the anisotropic cubic lattice structure, the presence of faceted lateral surfaces, and the shape of the ends of the nanorods is evaluated. Elastic anisotropy is shown to play a significant role, in contrast to the presence of facets. Localized vibrations are found for nanorods with flat ends. Their evolution as the shape of the ends is changed to half-spheres is discussed.
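The Rayleigh-Ritz method behind such computations can be illustrated on a textbook 1-D analogue: estimating the fundamental longitudinal frequency of a fixed-free elastic rod from polynomial trial functions. This is a minimal sketch of the variational method only, not the paper's 3-D anisotropic nanorod computation:

```python
import numpy as np

# Rayleigh-Ritz estimate for a fixed-free rod of unit length, stiffness,
# and density, with polynomial trial functions u_i(x) = x^i (which
# satisfy the essential boundary condition u(0) = 0).
n = 4
i = np.arange(1, n + 1)
# stiffness and mass matrices from analytic integrals of powers of x:
# K_ij = int_0^1 u_i' u_j' dx = i*j/(i+j-1),  M_ij = int_0^1 u_i u_j dx
K = np.outer(i, i) / (i[:, None] + i[None, :] - 1)
M = 1.0 / (i[:, None] + i[None, :] + 1)
# generalized eigenproblem K a = omega^2 M a
omega2 = np.linalg.eigvals(np.linalg.solve(M, K)).real
omega1 = np.sqrt(omega2.min())   # converges to the exact value pi/2 from above
```

The variational estimate bounds the true eigenfrequency from above and converges rapidly as basis functions are added; the cited work applies the same machinery with 3-D displacement bases and anisotropic elastic constants.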
Gibberellin control of stamen development: a fertile field.
Plackett, Andrew R G; Thomas, Stephen G; Wilson, Zoe A; Hedden, Peter
2011-10-01
Stamen development is governed by a conserved genetic pathway, within which the role of hormones has been the subject of considerable recent research. Our understanding of the involvement of gibberellin (GA) signalling in this developmental process is further advanced than for the other phytohormones, and here we review recent experimental results in rice (Oryza sativa) and Arabidopsis (Arabidopsis thaliana) that have provided insight into the timing and mechanisms of GA regulation of stamen development, identifying the tapetum and developing pollen as major targets. GA signalling governs both tapetum secretory functions and entry into programmed cell death via the GAMYB class of transcription factor, the targets of which integrate with the established genetic framework for the regulation of tapetum function at multiple hierarchical levels. Copyright © 2011 Elsevier Ltd. All rights reserved.
Hopf-link topological nodal-loop semimetals
NASA Astrophysics Data System (ADS)
Zhou, Yao; Xiong, Feng; Wan, Xiangang; An, Jin
2018-04-01
We construct a generic two-band model which can describe topological semimetals with multiple closed nodal loops. All the existing multi-nodal-loop semimetals, including the nodal-net, nodal-chain, and Hopf-link states, can be examined within the same framework. Based on a two-nodal-loop model, the corresponding drumhead surface states for these topologically different bulk states are studied and compared with each other. The connection of our model with Hopf insulators is also discussed. Furthermore, to experimentally identify these topologically different semimetal states, and especially to distinguish the Hopf-link state from unlinked ones, we also investigate their Landau levels. It is found that the Hopf-link state can be characterized by the existence of a quadruply degenerate zero-energy Landau band, regardless of the direction of the magnetic field.
NASA Technical Reports Server (NTRS)
Manganaris, Stefanos; Fisher, Doug; Kulkarni, Deepak
1993-01-01
In this paper we address the problem of detecting and diagnosing faults in physical systems, for which neither prior expertise for the task nor suitable system models are available. We propose an architecture that integrates the on-line acquisition and exploitation of monitoring and diagnostic knowledge. The focus of the paper is on the component of the architecture that discovers classes of behaviors with similar characteristics by observing a system in operation. We investigate a characterization of behaviors based on best fitting approximation models. An experimental prototype has been implemented to test it. We present preliminary results in diagnosing faults of the Reaction Control System of the Space Shuttle. The merits and limitations of the approach are identified and directions for future work are set.
Describing the impact of health research: a Research Impact Framework
Kuruvilla, Shyama; Mays, Nicholas; Pleasant, Andrew; Walt, Gill
2006-01-01
Background Researchers are increasingly required to describe the impact of their work, e.g. in grant proposals, project reports, press releases and research assessment exercises. Specialised impact assessment studies can be difficult to replicate and may require resources and skills not available to individual researchers. Researchers are often hard-pressed to identify and describe research impacts and ad hoc accounts do not facilitate comparison across time or projects. Methods The Research Impact Framework was developed by identifying potential areas of health research impact from the research impact assessment literature and based on research assessment criteria, for example, as set out by the UK Research Assessment Exercise panels. A prototype of the framework was used to guide an analysis of the impact of selected research projects at the London School of Hygiene and Tropical Medicine. Additional areas of impact were identified in the process and researchers also provided feedback on which descriptive categories they thought were useful and valid vis-à-vis the nature and impact of their work. Results We identified four broad areas of impact: I. Research-related impacts; II. Policy impacts; III. Service impacts: health and intersectoral and IV. Societal impacts. Within each of these areas, further descriptive categories were identified. For example, the nature of research impact on policy can be described using the following categorisation, put forward by Weiss: Instrumental use where research findings drive policy-making; Mobilisation of support where research provides support for policy proposals; Conceptual use where research influences the concepts and language of policy deliberations and Redefining/wider influence where research leads to rethinking and changing established practices and beliefs. 
Conclusion Researchers, while initially sceptical, found that the Research Impact Framework provided prompts and descriptive categories that helped them systematically identify a range of specific and verifiable impacts related to their work (compared to ad hoc approaches they had previously used). The framework could also help researchers think through implementation strategies and identify unintended or harmful effects. The standardised structure of the framework facilitates comparison of research impacts across projects and time, which is useful from analytical, management and assessment perspectives. PMID:17049092
Release of genetically engineered insects: a framework to identify potential ecological effects
David, Aaron S; Kaser, Joe M; Morey, Amy C; Roth, Alexander M; Andow, David A
2013-01-01
Genetically engineered (GE) insects have the potential to radically change pest management worldwide. With recent approvals of GE insect releases, there is a need for a synthesized framework to evaluate their potential ecological and evolutionary effects. The effects may occur in two phases: a transitory phase when the focal population changes in density, and a steady state phase when it reaches a new, constant density. We review potential effects of a rapid change in insect density related to population outbreaks, biological control, invasive species, and other GE organisms to identify a comprehensive list of potential ecological and evolutionary effects of GE insect releases. We apply this framework to the Anopheles gambiae mosquito – a malaria vector being engineered to suppress the wild mosquito population – to identify effects that may occur during the transitory and steady state phases after release. Our methodology reveals many potential effects in each phase, perhaps most notably those dealing with immunity in the transitory phase, and with pathogen and vector evolution in the steady state phase. Importantly, this framework identifies knowledge gaps in mosquito ecology. Identifying effects in the transitory and steady state phases allows more rigorous identification of the potential ecological effects of GE insect release. PMID:24198955
Dar-Nimrod, Ilan; Heine, Steven J
2011-09-01
In the target article (Dar-Nimrod & Heine, 2011), we provided a social-cognitive framework which identified genetic essentialist biases and their implications. In their commentaries, Haslam (2011) and Turkheimer (2011) indicated their general agreement with this framework but highlighted some important points for consideration. Haslam suggested that neuroessentialism is a comparable kind of essentialist bias and identified similarities with the genetic essentialism framework. In response, we acknowledge similarities but also identify qualitative and quantitative differences between genetic essentialism and other kinds of essentialist biases. Turkheimer challenged us to extend our discussion to address the question of how should people respond to genetic etiological information, critiqued the use of heritability coefficients, and identified a new construct (1 - rMZ), which may be termed a free-will coefficient. In response, we emphasize the need to transform interactionist explanations from being empty platitudes to becoming the default conceptual framework; we wholeheartedly accept his critical view of heritability coefficient estimates (but acknowledge a more limited utility for them); and we are intrigued by his conceptual interest in identifying free-will coefficients yet warn against falling into pitfalls similar to those that were stumbled into in the past. (PsycINFO Database Record (c) 2011 APA, all rights reserved).
The Foundations Framework for Developing and Reporting New Models of Care for Multimorbidity
Stokes, Jonathan; Man, Mei-See; Guthrie, Bruce; Mercer, Stewart W.; Salisbury, Chris; Bower, Peter
2017-01-01
PURPOSE Multimorbidity challenges health systems globally. New models of care are urgently needed to better manage patients with multimorbidity; however, there is no agreed framework for designing and reporting models of care for multimorbidity and their evaluation. METHODS Based on findings from a literature search to identify models of care for multimorbidity, we developed a framework to describe these models. We illustrate the application of the framework by identifying the focus and gaps in current models of care, and by describing the evolution of models over time. RESULTS Our framework describes each model in terms of its theoretical basis and target population (the foundations of the model) and of the elements of care implemented to deliver the model. We categorized elements of care into 3 types: (1) clinical focus, (2) organization of care, and (3) support for model delivery. Application of the framework identified a limited use of theory in model design and a strong focus on some patient groups (elderly, high users) more than others (younger patients, deprived populations). We found changes in elements with time, with a decrease in models implementing home care and an increase in models offering extended appointments. CONCLUSIONS By encouraging greater clarity about the underpinning theory and target population, and by categorizing the wide range of potentially important elements of an intervention to improve care for patients with multimorbidity, the framework may be useful in designing and reporting models of care and help advance the currently limited evidence base. PMID:29133498
First, Eric L; Gounaris, Chrysanthos E; Floudas, Christodoulos A
2013-05-07
With the growing number of zeolites and metal-organic frameworks (MOFs) available, computational methods are needed to screen databases of structures to identify those most suitable for applications of interest. We have developed novel methods based on mathematical optimization to predict the shape selectivity of zeolites and MOFs in three dimensions by considering the energy costs of transport through possible pathways. Our approach is applied to databases of over 1800 microporous materials including zeolites, MOFs, zeolitic imidazolate frameworks, and hypothetical MOFs. New materials are identified for applications in gas separations (CO2/N2, CO2/CH4, and CO2/H2), air separation (O2/N2), and chemicals (propane/propylene, ethane/ethylene, styrene/ethylbenzene, and xylenes).
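The notion of scoring transport through a porous framework by the energy cost of the best available pathway can be sketched with a least-cost path search on a toy energy grid. The authors' actual method is a 3-D mathematical-optimization formulation, so the 2-D grid, site energies, and Dijkstra search below are illustrative simplifications:

```python
import heapq

def min_energy_path(grid, start, goal):
    """Least-cost path on a 2-D grid of site energies (4-neighbour moves);
    the path cost is the sum of energies of the sites visited."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: grid[start[0]][start[1]]}
    heap = [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return float("inf")

# a toy channel: a low-energy corridor through a high-energy lattice
grid = [[1, 9, 9],
        [1, 1, 9],
        [9, 1, 1]]
cost = min_energy_path(grid, (0, 0), (2, 2))  # 5, via the corridor of 1s
```

Ranking materials by such pathway costs is one way to compare how easily one molecule traverses a framework relative to another, which is the intuition behind shape-selectivity screening.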
Symptom outcomes important to women with anal incontinence: a conceptual framework.
Sung, Vivian W; Rogers, Rebecca G; Bann, Carla M; Arya, Lily; Barber, Matthew D; Lowder, Jerry; Lukacz, Emily S; Markland, Alayne; Siddiqui, Nazema; Wilmot, Amanda; Meikle, Susan F
2014-05-01
To develop a framework that describes the most important symptom outcomes for anal incontinence treatment from the patient perspective. A conceptual framework was developed by the Pelvic Floor Disorders Network based on four semistructured focus groups and confirmed in two sets of 10 cognitive interviews including women with anal incontinence. We explored: 1) patient-preferred terminology for describing anal incontinence symptoms; 2) patient definitions of treatment "success"; 3) the importance of symptoms and outcomes in the framework; and 4) conceptual gaps (defined as outcomes not previously identified as important). Sessions were conducted according to grounded theory, then transcribed, coded, and qualitatively and quantitatively analyzed to identify relevant themes. Content and face validity of the framework were further assessed using cognitive interviews. Thirty-four women participated in focus groups and 20 in cognitive interviews. Overall, 29 (54%) were aged 60 years or older, 42 (78%) were white, and 10 (19%) had a high school degree or less. Two overarching outcome themes were identified: "primary bowel leakage symptoms" and "ancillary bowel symptoms." Subdomains important in primary bowel leakage symptoms included leakage characteristics (symptom frequency, amount of leakage, symptom bother) and conditions when bowel leakage occurs (predictability, awareness, urgency). Subdomains important under ancillary bowel symptoms included emptying disorders (constipation, obstructed defecation, and wiping issues) and discomfort (pain, burning). New outcomes identified included predictability, awareness, wiping issues, and discomfort. Women with anal incontinence desire a wide range of symptom outcomes after treatment. These are captured in our conceptual framework, which can aid clinicians and researchers in assessing anal incontinence. LEVEL OF EVIDENCE: II.
Hongsermeier, Tonya; Wright, Adam; Lewis, Janet; Bell, Douglas S; Middleton, Blackford
2013-01-01
Objective To identify key principles for establishing a national clinical decision support (CDS) knowledge sharing framework. Materials and methods As part of an initiative by the US Office of the National Coordinator for Health IT (ONC) to establish a framework for national CDS knowledge sharing, key stakeholders were identified. Stakeholders' viewpoints were obtained through surveys and in-depth interviews, and findings and relevant insights were summarized. Based on these insights, key principles were formulated for establishing a national CDS knowledge sharing framework. Results Nineteen key stakeholders were recruited, including six executives from electronic health record system vendors, seven executives from knowledge content producers, three executives from healthcare provider organizations, and three additional experts in clinical informatics. Based on these stakeholders' insights, five key principles were identified for effectively sharing CDS knowledge nationally. These principles are (1) prioritize and support the creation and maintenance of a national CDS knowledge sharing framework; (2) facilitate the development of high-value content and tooling, preferably in an open-source manner; (3) accelerate the development or licensing of required, pragmatic standards; (4) acknowledge and address medicolegal liability concerns; and (5) establish a self-sustaining business model. Discussion Based on the principles identified, a roadmap for national CDS knowledge sharing was developed through the ONC's Advancing CDS initiative. Conclusion The study findings may serve as a useful guide for ongoing activities by the ONC and others to establish a national framework for sharing CDS knowledge and improving clinical care. PMID:22865671
Gao, Wenliang; Jing, Yan; Yang, Jia; Zhou, Zhengyang; Yang, Dingfeng; Sun, Junliang; Lin, Jianhua; Cong, Rihong; Yang, Tao
2014-03-03
An open-framework gallium borate with intrinsic photocatalytic activity for water splitting has been discovered. Small inorganic molecules, H3BO3 and H3B3O6, are confined inside the structural channels by multiple hydrogen bonds. This is the first example to experimentally demonstrate the structural templating effect of boric acid in flux synthesis.
A competency framework for librarians involved in systematic reviews.
Townsend, Whitney A; Anderson, Patricia F; Ginier, Emily C; MacEachern, Mark P; Saylor, Kate M; Shipman, Barbara L; Smith, Judith E
2017-07-01
The project identified a set of core competencies for librarians who are involved in systematic reviews. A team of seven informationists with broad systematic review experience examined existing systematic review standards, conducted a literature search, and used their own expertise to identify core competencies and skills that are necessary to undertake various roles in systematic review projects. The team identified a total of six competencies for librarian involvement in systematic reviews: "Systematic review foundations," "Process management and communication," "Research methodology," "Comprehensive searching," "Data management," and "Reporting." Within each competency are the associated skills and knowledge pieces (indicators). Competence can be measured using an adaptation of Miller's Pyramid for Clinical Assessment, either through self-assessment or identification of formal assessment instruments. The Systematic Review Competencies Framework provides a standards-based, flexible way for librarians and organizations to identify areas of competence and areas in need of development to build capacity for systematic review integration. The framework can be used to identify or develop appropriate assessment tools and to target skill development opportunities.
Evaluating health-promoting schools in Hong Kong: development of a framework.
Lee, Albert; Cheng, Frances F K; St Leger, Lawry
2005-06-01
Health-promoting schools (HPS)/healthy schools have existed internationally for about 15 years. Yet there are few comprehensive evaluation frameworks available which enable the outcomes of HPS initiatives to be assessed. This paper identifies an evaluation framework developed in Hong Kong. The framework uses a range of approaches to explore what schools actually do in their health promotion and health education initiatives. The framework, which is based on the WHO (Western Pacific Regional Office) Guidelines for HPS, is described in detail. The appropriate instruments for data collection are described and their origins identified. The evaluation plan and protocol, which underpinned the very comprehensive evaluation in Hong Kong, are explained. Finally, a case is argued for evaluation of HPS to be more in line with the educational dynamics of schools and the research literature on effective schooling, rather than focusing primarily on health-related measures.
Leightley, Daniel; Yap, Moi Hoon
2018-03-02
The aim of this study was to compare the performance between young adults ( n = 15), healthy old people ( n = 10), and masters athletes ( n = 15) using a depth sensor and automated digital assessment framework. Participants were asked to complete a clinically validated assessment of the sit-to-stand technique (five repetitions), which was recorded using a depth sensor. A feature encoding and evaluation framework to assess balance, core, and limb performance using time- and speed-related measurements was applied to markerless motion capture data. The associations between the measurements and participant groups were examined and used to evaluate the assessment framework suitability. The proposed framework could identify phases of sit-to-stand, stability, transition style, and performance between participant groups with a high degree of accuracy. In summary, we found that a depth sensor coupled with the proposed framework could identify performance subtleties between groups.
Virtual shelves in a digital library: a framework for access to networked information sources.
Patrick, T B; Springer, G K; Mitchell, J A; Sievert, M E
1995-01-01
Develop a framework for collections-based access to networked information sources that addresses the problem of location-dependent access to information sources. This framework uses the metaphor of a virtual shelf: a general-purpose server dedicated to a particular information subject class, whose identifier identifies that subject class. Location-independent call numbers, based on standard vocabulary codes, are assigned to information sources. The call numbers are first mapped to the location-independent identifiers of virtual shelves. When access to an information resource is required, a location directory provides a second mapping of these location-independent server identifiers to actual network locations. The framework has been implemented in two different systems, one based on the Open Software Foundation/Distributed Computing Environment and the other on the World Wide Web. This framework applies traditional methods of library classification and cataloging in new ways. It is compatible with the two traditional styles of information seeking, searching and browsing, and traditional methods may be combined with new paradigms of information searching that can take advantage of the special properties of digital information. Cooperation between the library and information science community and the informatics community can provide a means for the continuing application of the knowledge and techniques of library science to the new problems of networked information sources.
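The two-step resolution described above, call number to virtual-shelf identifier, then shelf identifier to current network location, can be sketched as two dictionary lookups. All call numbers, shelf identifiers, and hosts below are invented for illustration:

```python
# Mapping 1: location-independent call numbers -> virtual-shelf identifiers
# (the shelf identifier names the subject class, not a machine).
call_number_to_shelf = {
    "QS 504 C431": "shelf:anatomy",
    "WB 141 T355": "shelf:diagnosis",
}

# Mapping 2: a location directory from shelf identifiers to the network
# locations currently serving those shelves.
location_directory = {
    "shelf:anatomy": "http://host-a.example.org/anatomy/",
    "shelf:diagnosis": "http://host-b.example.org/diagnosis/",
}

def resolve(call_number):
    shelf = call_number_to_shelf[call_number]  # step 1: subject class
    return location_directory[shelf]           # step 2: current location

url = resolve("QS 504 C431")
```

The point of the indirection is that moving a shelf to a new host only requires updating the location directory; the call numbers assigned to information sources never change, which is the location independence the framework aims at.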
Yang, Shan; Al-Hashimi, Hashim M.
2016-01-01
A growing number of studies employ time-averaged experimental data to determine dynamic ensembles of biomolecules. While it is well known that different ensembles can satisfy experimental data to within error, the extent and nature of these degeneracies, and their impact on the accuracy of the ensemble determination remains poorly understood. Here, we use simulations and a recently introduced metric for assessing ensemble similarity to explore degeneracies in determining ensembles using NMR residual dipolar couplings (RDCs) with specific application to A-form helices in RNA. Various target ensembles were constructed representing different domain-domain orientational distributions that are confined to a topologically restricted (<10%) conformational space. Five independent sets of ensemble averaged RDCs were then computed for each target ensemble and a ‘sample and select’ scheme used to identify degenerate ensembles that satisfy RDCs to within experimental uncertainty. We find that ensembles with different ensemble sizes and that can differ significantly from the target ensemble (by as much as ΣΩ ~ 0.4 where ΣΩ varies between 0 and 1 for maximum and minimum ensemble similarity, respectively) can satisfy the ensemble averaged RDCs. These deviations increase with the number of unique conformers and breadth of the target distribution, and result in significant uncertainty in determining conformational entropy (as large as 5 kcal/mol at T = 298 K). Nevertheless, the RDC-degenerate ensembles are biased towards populated regions of the target ensemble, and capture other essential features of the distribution, including the shape. Our results identify ensemble size as a major source of uncertainty in determining ensembles and suggest that NMR interactions such as RDCs and spin relaxation, on their own, do not carry the necessary information needed to determine conformational entropy at a useful level of precision. 
The framework introduced here provides a general approach for exploring degeneracies in ensemble determination for different types of experimental data. PMID:26131693
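The ‘sample and select’ idea can be illustrated in miniature. The sketch below uses purely synthetic per-conformer "RDC" values (random numbers, not structure-based predictions) and an assumed uniform experimental uncertainty; it shows only the selection logic: draw candidate ensembles of varying size and keep those whose averages reproduce the target's averaged data within uncertainty.

```python
import random

# Toy 'sample and select': find ensembles degenerate with a target
# ensemble under averaged, noisy observables.  All values are synthetic.
random.seed(1)
N_RDC = 10   # number of simulated RDC measurements
POOL = 50    # size of the conformer pool
SIGMA = 1.0  # assumed experimental uncertainty (arbitrary units)

# Synthetic per-conformer "RDCs" (rows: conformers, columns: measurements)
pool = [[random.gauss(0.0, 5.0) for _ in range(N_RDC)] for _ in range(POOL)]

def ensemble_avg(members):
    """Ensemble-averaged observables for a set of conformer indices."""
    return [sum(pool[m][j] for m in members) / len(members)
            for j in range(N_RDC)]

# A hidden target ensemble and its averaged data
target = random.sample(range(POOL), 4)
target_avg = ensemble_avg(target)

def fits(avg):
    """Accept a candidate if every averaged value is within uncertainty."""
    return all(abs(a - t) <= SIGMA for a, t in zip(avg, target_avg))

# Sample candidate ensembles of sizes 2-6 and keep the degenerate ones
degenerate = set()
for _ in range(5000):
    size = random.randint(2, 6)
    cand = tuple(sorted(random.sample(range(POOL), size)))
    if fits(ensemble_avg(cand)):
        degenerate.add(cand)

print(len(degenerate), "degenerate candidate ensembles found")
```

Any accepted candidate, regardless of its size or overlap with the target, is indistinguishable from the target given only the averaged data, which is the degeneracy the paper quantifies.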
NASA Astrophysics Data System (ADS)
Mendoza, Sergio; Rothenberger, Michael; Hake, Alison; Fathy, Hosam
2016-03-01
This article presents a framework for optimizing the thermal cycle used to estimate a battery cell's entropy coefficient at 20% state of charge (SOC). Our goal is to maximize Fisher identifiability: a measure of the accuracy with which a parameter can be estimated. Existing protocols in the literature for estimating entropy coefficients demand excessive laboratory time. Identifiability optimization makes it possible to achieve comparable accuracy levels in a fraction of the time. This article demonstrates this result for a set of lithium iron phosphate (LFP) cells. We conduct a 24-h experiment to obtain benchmark measurements of their entropy coefficients. We then optimize a thermal cycle to maximize parameter identifiability for these cells, with the optimization conducted over the coefficients of a Fourier discretization of the thermal cycle. Finally, we compare the parameters estimated using (i) the benchmark test, (ii) the optimized protocol, and (iii) a 15-h test from the literature (by Forgez et al.). The results are encouraging for two reasons. First, they confirm the simulation-based prediction that the optimized experiment can produce accurate parameter estimates in 2 h, compared to 15-24 h. Second, the optimized experiment also estimates a thermal time constant representing the effects of thermal capacitance and convection heat transfer.
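The link between excitation shape and estimation accuracy can be sketched with a toy linear model. Assuming, for illustration only, that voltage depends on temperature through the entropy coefficient k as V_i = V0 + k*(T_i - T_ref) + noise, the Fisher information for k is FI = sum_i (T_i - T_ref)^2 / sigma^2, and the Cramer-Rao bound gives var(k_hat) >= 1/FI. All numbers below are invented, not the paper's:

```python
import math

# Toy comparison of Fisher information for the entropy coefficient k
# under two thermal profiles.  Larger temperature excursions concentrate
# identifiability into fewer samples.
T_REF = 25.0   # reference temperature, deg C (assumed)
SIGMA = 1e-3   # voltage measurement noise, V (assumed)

def fisher_info(profile):
    """Fisher information for k under V_i = V0 + k*(T_i - T_REF) + noise."""
    return sum((T - T_REF) ** 2 for T in profile) / SIGMA ** 2

# Slow-ramp profile: small excursions spread over many samples (24 points)
ramp = [T_REF + 0.5 * t / 24.0 for t in range(24)]

# Large-swing profile: a single Fourier mode, few samples (8 points)
swing = [T_REF + 10.0 * math.sin(2 * math.pi * t / 8.0) for t in range(8)]

# Cramer-Rao: var(k_hat) >= 1/FI, so larger FI means a tighter estimate
print(f"FI(ramp)  = {fisher_info(ramp):.3g}")
print(f"FI(swing) = {fisher_info(swing):.3g}")
```

This is the intuition behind optimizing the Fourier coefficients of the cycle: shaping the excursions buys accuracy per unit of laboratory time, subject to whatever thermal limits the real cell imposes.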
Remote wave measurements using autonomous mobile robotic systems
NASA Astrophysics Data System (ADS)
Kurkin, Andrey; Zeziulin, Denis; Makarov, Vladimir; Belyakov, Vladimir; Tyugin, Dmitry; Pelinovsky, Efim
2016-04-01
The project covers the development of a technology for monitoring and forecasting the state of the coastal-zone environment using radar equipment carried by autonomous mobile robotic systems (AMRS). The target areas of application are the eastern and northern coasts of Russia, where continuous collection of information on topographic changes of the coastal zone, and hydrodynamic measurements in environments inaccessible to humans, are needed. The intensity of wave reflections received by the surveillance radar is directly related to wave height. Mathematical models and algorithms have been developed for processing the experimental data (signal selection, spectral analysis, wavelet analysis), for recalculating landwash from offshore wave heights, and for determining threshold values of offshore wave heights. A software complex for operating the experimental AMRS prototype has been developed, comprising the following modules: data loading, reporting, georeferencing, data analysis, monitoring, hardware control, and a graphical user interface. Further work will involve testing the manufactured experimental prototype along selected coastline routes on Sakhalin Island. Field tests will reveal shortcomings of the development and identify ways to optimize the structure and operating algorithms of the AMRS, as well as of the measuring equipment. The presented results were obtained at Nizhny Novgorod State Technical University n.a. R. Alekseev in the framework of the Federal Target Program «Research and development on priority directions of scientific-technological complex of Russia for 2014 - 2020 years» (agreement № 14.574.21.0089; unique identifier RFMEFI57414X0089).
NASA Astrophysics Data System (ADS)
Fruchart, Michel; Vitelli, Vincenzo
2018-03-01
A theoretical framework for the design of so-called perturbative metamaterials, based on weakly interacting unit cells, has led to the experimental demonstration of a quadrupole topological insulator.
Wikis and Collaborative Writing Applications in Health Care: A Scoping Review
Grajales III, Francisco J; Faber, Marjan J; Kuziemsky, Craig E; Gagnon, Susie; Bilodeau, Andrea; Rioux, Simon; Nelen, Willianne LDM; Gagnon, Marie-Pierre; Turgeon, Alexis F; Aubin, Karine; Gold, Irving; Poitras, Julien; Eysenbach, Gunther; Kremer, Jan AM; Légaré, France
2013-01-01
Background: Collaborative writing applications (eg, wikis and Google Documents) hold the potential to improve the use of evidence in both public health and health care. The rapid rise in their use has created the need for a systematic synthesis of the evidence of their impact as knowledge translation (KT) tools in the health care sector and for an inventory of the factors that affect their use. Objective: Through the Levac six-stage methodology, a scoping review was undertaken to explore the depth and breadth of evidence about the effective, safe, and ethical use of wikis and collaborative writing applications (CWAs) in health care. Methods: Multiple strategies were used to locate studies. Seven scientific databases and 6 grey literature sources were queried for articles on wikis and CWAs published between 2001 and September 16, 2011. In total, 4436 citations and 1921 grey literature items were screened. Two reviewers independently reviewed citations, selected eligible studies, and extracted data using a standardized form. We included any paper presenting qualitative or quantitative empirical evidence concerning health care and CWAs. We defined a CWA as any technology that enables the joint and simultaneous editing of a webpage or an online document by many end users. We performed qualitative content analysis to identify the factors that affect the use of CWAs using the Gagnon framework and their effects on health care using the Donabedian framework. Results: Of the 111 studies included, 4 were experimental, 5 quasi-experimental, 5 observational, 52 case studies, 23 surveys about wiki use, and 22 descriptive studies about the quality of information in wikis. We classified them by theme: patterns of use of CWAs (n=26), quality of information in existing CWAs (n=25), and CWAs as KT tools (n=73). A high prevalence of CWA use (ie, more than 50%) is reported in 58% (7/12) of surveys conducted with health care professionals and students.
However, we found only one longitudinal study showing that CWA use is increasing in health care. Moreover, contribution rates remain low and the quality of information contained in different CWAs needs improvement. We identified 48 barriers and 91 facilitators in 4 major themes (factors related to the CWA, users' knowledge and attitude towards CWAs, human environment, and organizational environment). We also found 57 positive and 23 negative effects that we classified into processes and outcomes. Conclusions: Although we found some experimental and quasi-experimental studies of the effectiveness and safety of CWAs as educational and KT interventions, the vast majority of included studies were observational case studies about CWAs being used by health professionals and patients. More primary research is needed to find ways to address the different barriers to their use and to make these applications more useful for different stakeholders. PMID:24103318
Ethical issues associated with medical tourism in Africa.
Mogaka, John J O; Mupara, Lucia; Tsoka-Gwegweni, Joyce M
2017-01-01
Global disparities in medical technologies, laws, economic inequities, and social-cultural differences drive medical tourism (MT), the practice of travelling to consume healthcare that is too delayed, unavailable, unaffordable, or legally proscribed at home. Africa is simultaneously a source and a destination for MT. MT, however, presents a new and challenging health ethics frontier, being largely unregulated and characterized by policy contradictions, minority discrimination, and conflicts of interest among role-players. This article assesses the level of knowledge of MT and its associated ethical issues in Africa; it also identifies critical research gaps on the subject in the region. An exploratory design guided by Arksey and O'Malley's (2005) framework was used. Key search terms and previously determined exclusion/inclusion criteria were used to identify relevant literature sources. Fifty-seven articles met the inclusion criteria. Distributive justice, healthcare resource allocation, experimental treatments, and organ transplantation were the most common ethical issues of medical tourism in Africa. The dearth of robust engagement with MT and healthcare ethics identified through this review calls for more rigorous research on this subject. Although much of the medical tourism industry is driven by global legal disparities grounded in ethical considerations, little attention has been given to this subject.
Targeting couple and parent-child coercion to improve health behaviors.
Smith Slep, Amy M; Heyman, Richard E; Mitnick, Danielle M; Lorber, Michael F; Beauchaine, Theodore P
2018-02-01
This phase of the NIH Science of Behavior Change program emphasizes an "experimental medicine approach to behavior change" that seeks to identify targets related to stress reactivity, self-regulation, and social processes for maximal effects on multiple health outcomes. Within this framework, our project focuses on interpersonal processes associated with health: coercive couple and parent-child conflict. Diabetes and poor oral health portend pain, distress, expense, loss of productivity, and even mortality. They share overlapping medical regimens, are driven by overlapping proximal health behaviors, and affect a wide developmental span, from early childhood to late adulthood. Coercive couple and parent-child conflict constitute potent and destructive influences on a wide range of adult and child health outcomes. Such interaction patterns give rise to disturbed environmental stress reactivity (e.g., disrupted sympathetic and parasympathetic nervous systems) and a wide range of adverse health outcomes in children and adults, including dental caries, obesity, and diabetes-related metabolic markers. In this work, we seek to identify, develop, and validate assays assessing coercion; identify, develop, and test brief interventions to reduce coercion; and test whether changes in coercion trigger changes in health behaviors. Copyright © 2017 Elsevier Ltd. All rights reserved.
Multivariate inference of pathway activity in host immunity and response to therapeutics
Goel, Gautam; Conway, Kara L.; Jaeger, Martin; Netea, Mihai G.; Xavier, Ramnik J.
2014-01-01
Developing a quantitative view of how biological pathways are regulated in response to environmental factors is central to understanding disease phenotypes. We present a computational framework, named Multivariate Inference of Pathway Activity (MIPA), which quantifies the degree of activity induced in a biological pathway by computing five distinct measures from transcriptomic profiles of its member genes. Statistical significance of inferred activity is examined using multiple independent self-contained tests followed by a competitive analysis. The method incorporates a new algorithm to identify a subset of genes that may regulate the extent of activity induced in a pathway. We present an in-depth evaluation of the specificity, robustness, and reproducibility of our method. We benchmarked MIPA's false positive rate at less than 1%. Using transcriptomic profiles representing distinct physiological and disease states, we illustrate the applicability of our method in (i) identifying gene–gene interactions in autophagy-dependent response to Salmonella infection, (ii) uncovering gene–environment interactions in host response to bacterial and viral pathogens, and (iii) identifying driver genes and processes that contribute to wound healing and response to anti-TNFα therapy. We provide relevant experimental validation that corroborates the accuracy and advantages of our method. PMID:25147207
Kirk, Maggie; Tonkin, Emma; Skirton, Heather
2014-01-01
KIRK M., TONKIN E. & SKIRTON H. (2014) An iterative consensus-building approach to revising a genetics/genomics competency framework for nurse education in the UK. Journal of Advanced Nursing 70(2), 405–420. doi: 10.1111/jan.12207 Aim: To report a review of a genetics education framework using a consensus approach to agree on a contemporary and comprehensive revised framework. Background: Advances in genomic health care have been significant since the first genetics education framework for nurses was developed in 2003. These, coupled with developments in policy and international efforts to promote nursing competence in genetics, indicated that review was timely. Design: A structured, iterative, primarily qualitative approach, based on a nominal group technique. Method: A meeting convened in 2010 involved stakeholders in UK nursing education, practice and management, including patient representatives (n = 30). A consensus approach was used to solicit participants' views on the individual/family needs identified from real-life stories of people affected by genetic conditions and the nurses' knowledge, skills and attitudes needed to meet those needs. Five groups considered the stories in iterative rounds, reviewing comments from previous groups. Omissions and deficiencies were identified by mapping resulting themes to the original framework. Anonymous voting captured views. Educators at a second meeting developed learning outcomes for the final framework. Findings: Deficiencies in relation to Advocacy, Information management and Ongoing care were identified. All competencies of the original framework were revised, adding an eighth competency to make explicit the need for ongoing care of the individual/family. Conclusion: Modifications to the framework reflect individual/family needs and are relevant to the nursing role.
The approach promoted engagement in a complex issue and provides a framework to guide nurse education in genetics/genomics; however, nursing leadership is crucial to successful implementation. PMID:23879662
Ethical frameworks for surrogates’ end-of-life planning experiences: A qualitative systematic review
Kim, Hyejin; Deatrick, Janet A; Ulrich, Connie M
2016-01-01
Despite the growing body of knowledge about surrogate decision making, we know very little about the use of ethical frameworks, including ethical theories, principles, and concepts, to understand surrogates' day-to-day experiences in end-of-life care planning for incapacitated adults. This systematic review of 30 qualitative research papers was conducted to identify the types of ethical frameworks used to address surrogates' experiences in end-of-life care planning for incapacitated adults, as well as the most common themes or patterns found in surrogate decision-making research. Seven papers explicitly identified ethical theories, principles, or concepts for their studies, such as autonomy, substituted judgment, and best interests. Themes identified about surrogate decision making included: responsibilities and goals, factors affecting surrogates' decision making, and outcomes for surrogates. An overarching theme of "wanting to do the right thing" for incapacitated adults and/or themselves was prominent. Understanding the complexity of surrogates' experiences of end-of-life care planning is beyond the scope of conventional ethical frameworks. Ethical frameworks that address individuality and contextual variations related to decision making may more appropriately guide surrogate decision-making research that explores surrogates' end-of-life care planning experiences. PMID:27005954
A simple theoretical framework for understanding heterogeneous differentiation of CD4+ T cells
2012-01-01
Background: CD4+ T cells have several subsets of functional phenotypes, which play critical yet diverse roles in the immune system. Pathogen-driven differentiation of these subsets of cells is often heterogeneous in terms of the induced phenotypic diversity. In vitro recapitulation of heterogeneous differentiation under homogeneous experimental conditions indicates some highly regulated mechanisms by which multiple phenotypes of CD4+ T cells can be generated from a single population of naïve CD4+ T cells. Therefore, conceptual understanding of induced heterogeneous differentiation will shed light on the mechanisms controlling the response of populations of CD4+ T cells under physiological conditions. Results: We present a simple theoretical framework to show how heterogeneous differentiation in a two-master-regulator paradigm can be governed by a signaling network motif common to all subsets of CD4+ T cells. With this motif, a population of naïve CD4+ T cells can integrate the signals from their environment to generate a functionally diverse population with robust commitment of individual cells. Notably, two positive feedback loops in this network motif govern three bistable switches, which in turn give rise to three types of heterogeneous differentiated states, depending upon particular combinations of input signals. We provide three prototype models illustrating how to use this framework to explain experimental observations and make specific testable predictions. Conclusions: The process in which several types of T helper cells are generated simultaneously to mount complex immune responses upon pathogenic challenges can be highly regulated, and a simple signaling network motif can be responsible for generating all possible types of heterogeneous populations with respect to a pair of master regulators controlling CD4+ T cell differentiation.
The framework provides a mathematical basis for understanding the decision-making mechanisms of CD4+ T cells, and it can be helpful for interpreting experimental results. Mathematical models based on the framework make specific testable predictions that may improve our understanding of this differentiation system. PMID:22697466
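As a toy illustration of the kind of bistable switch such motifs produce, the sketch below integrates a minimal double-negative (mutual-inhibition) feedback loop between two master regulators, one common route to bistability; the equations and parameters are generic illustrations, not the paper's prototype models. Depending on the initial bias, identical dynamics settle into an X-high or a Y-high state, which is the ingredient that lets one population split into heterogeneous phenotypes:

```python
# Minimal mutual-inhibition switch: X represses Y and Y represses X via
# Hill functions; Euler integration to steady state.  Parameters are
# arbitrary and chosen only to make the bistability visible.

def hill_inhib(z, K=0.5, n=4):
    """Repressive Hill function: high z shuts production down."""
    return K ** n / (K ** n + z ** n)

def simulate(x0, y0, steps=20000, dt=0.01, k=1.0, d=1.0):
    """Integrate dx/dt = k*f(y) - d*x, dy/dt = k*f(x) - d*y."""
    x, y = x0, y0
    for _ in range(steps):
        dx = k * hill_inhib(y) - d * x
        dy = k * hill_inhib(x) - d * y
        x, y = x + dt * dx, y + dt * dy
    return x, y

# Same dynamics, different initial bias -> two distinct stable states
print("X-biased start:", simulate(1.0, 0.0))  # settles X-high, Y-low
print("Y-biased start:", simulate(0.0, 1.0))  # settles Y-high, X-low
```

A population of cells whose initial conditions or signal exposure straddle the separatrix between the two basins will split into both states, the population-level heterogeneity the framework formalizes with its three coupled switches.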
U.S. History Framework for the 2010 National Assessment of Educational Progress
ERIC Educational Resources Information Center
National Assessment Governing Board, 2009
2009-01-01
This framework identifies the main ideas, major events, key individuals, and unifying themes of American history as a basis for preparing the 2010 assessment. The framework recognizes that U.S. history includes powerful ideas, common and diverse traditions, economic developments, technological and scientific innovations, philosophical debates,…
A Contingency Framework for Listening to the Dying
ERIC Educational Resources Information Center
Vora, Erika; Vora, Ariana
2008-01-01
Listening to the dying poses special challenges. This paper proposes a contingency framework for describing and assessing various circumstances when listening to the dying. It identifies current approaches to listening, applies the contingency framework toward effectively listening to the dying, and proposes a new type of listening called…
ERIC Educational Resources Information Center
Marston, Jennie
2014-01-01
This article by Jennie Marston provides a framework to assist in selecting picture books appropriate for presenting mathematical content. Jennie demonstrates the framework by applying it to three specific picture books, together with examples of accompanying activities.
A database de-identification framework to enable direct queries on medical data for secondary use.
Erdal, B S; Liu, J; Ding, J; Chen, J; Marsh, C B; Kamal, J; Clymer, B D
2012-01-01
To qualify the use of patient clinical records as non-human-subjects research, electronic medical record data must be de-identified so that there is minimal risk of protected health information exposure. This study demonstrates a robust framework for structured data de-identification that can be applied to any relational data source that needs to be de-identified. Using a real-world clinical data warehouse, a pilot implementation covering limited subject areas was used to demonstrate and evaluate this new de-identification process. Query results and performance were compared between the source and target systems to validate data accuracy and usability. The combination of hashing, pseudonyms, and a session-dependent randomizer provides a rigorous de-identification framework that guards against 1) source identifier exposure; 2) an internal data analyst manually linking to source identifiers; and 3) identifier cross-linking among different researchers or across multiple query sessions by the same researcher. In addition, a query rejection option refuses queries that would return fewer than preset numbers of subjects and total records, to prevent users from accidentally identifying subjects through low volumes of data. This framework does not prevent subject re-identification based on prior knowledge and sequence of events. It also does not address de-identification of medical free text, although text de-identification using natural language processing can be added due to the framework's modular design. We demonstrated a framework that yields HIPAA-compliant databases that can be directly queried by researchers. The technique can be augmented to facilitate inter-institutional research data sharing through existing middleware such as caGrid.
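The three ingredients named above (stable salted hashing, a session-dependent randomizer for pseudonyms, and low-volume query rejection) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the salt handling, hash truncation, and rejection threshold are all assumptions made for the example.

```python
import hashlib
import secrets

# Illustrative sketch of three de-identification ideas: salted hashing
# for stable keys, per-session pseudonyms that cannot be linked across
# sessions, and rejection of low-volume query results.

SITE_SALT = b"site-secret-salt"  # fixed secret; guards raw identifiers

def deidentify(mrn: str) -> str:
    """Stable de-identified key: the same MRN always maps to the same hash."""
    return hashlib.sha256(SITE_SALT + mrn.encode()).hexdigest()[:16]

class QuerySession:
    MIN_SUBJECTS = 5  # preset rejection threshold (assumed value)

    def __init__(self):
        # Fresh randomizer per session: pseudonyms from different sessions
        # (or different researchers) cannot be cross-linked.
        self._session_key = secrets.token_bytes(16)

    def pseudonym(self, mrn: str) -> str:
        return hashlib.sha256(self._session_key + mrn.encode()).hexdigest()[:16]

    def run_query(self, rows):
        """rows: list of (mrn, payload).  Reject low-volume results."""
        subjects = {mrn for mrn, _ in rows}
        if len(subjects) < self.MIN_SUBJECTS:
            raise ValueError("query rejected: too few subjects")
        return [(self.pseudonym(mrn), payload) for mrn, payload in rows]

s1, s2 = QuerySession(), QuerySession()
rows = [(f"mrn{i}", "observation") for i in range(6)]
out1, out2 = s1.run_query(rows), s2.run_query(rows)
print("pseudonyms differ across sessions:", out1[0][0] != out2[0][0])
```

As the abstract notes, such a scheme still cannot prevent re-identification from prior knowledge or a distinctive sequence of events; it only removes direct identifier linkage.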
On the Epistemological Crisis in Genomics
Dougherty, Edward R
2008-01-01
There is an epistemological crisis in genomics. At issue is what constitutes scientific knowledge in genomic science, or systems biology in general. Does this crisis require a new perspective on knowledge heretofore absent from science, or is it merely a matter of interpreting new scientific developments in an existing epistemological framework? This paper discusses the manner in which the experimental method, as developed and understood over recent centuries, leads naturally to a scientific epistemology grounded in an experimental-mathematical duality. It places genomics into this epistemological framework and examines the current situation in genomics. Meaning and the constitution of scientific knowledge are key concerns for genomics, and the nature of the epistemological crisis in genomics depends on how these are understood. PMID:19440447
Nazione, Samantha; Pace, Kristin
2015-01-01
Medical malpractice lawsuits are a growing problem in the United States, and there is much controversy regarding how best to address this problem. The medical error disclosure framework suggests that apologizing, expressing empathy, engaging in corrective action, and offering compensation after a medical error may improve the provider-patient relationship and ultimately help reduce the number of medical malpractice lawsuits patients bring against medical providers. This study provides an experimental examination of the medical error disclosure framework and its effects on the amount of money requested in a lawsuit and on negative intentions, attitudes, and anger toward the provider after a medical error. Results suggest empathy may play a large role in producing positive outcomes after a medical error.