DOE Office of Scientific and Technical Information (OSTI.GOV)
Mattsson, Ann E.
Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia’s capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data is hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudo-potential computational codes. Using the tools discussed in SAND2012-7389 we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for Molybdenum.
NASA Astrophysics Data System (ADS)
Farroni, Flavio; Lamberti, Raffaele; Mancinelli, Nicolò; Timpone, Francesco
2018-03-01
Tyres play a key role in ground vehicle dynamics because they are responsible for traction, braking and cornering. A proper tyre-road interaction model is essential for a useful and reliable vehicle dynamics model. In the last two decades Pacejka's Magic Formula (MF) has become a standard in the simulation field. This paper presents a tool, called TRIP-ID (Tyre Road Interaction Parameters IDentification), developed to characterize and identify MF micro-parameters with a high degree of accuracy and reliability from experimental data derived from telemetry or from test rigs. The tool interactively guides the user through the identification process on the basis of strong diagnostic considerations about the experimental data, made evident by the tool itself. A motorsport application of the tool is shown as a case study.
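For context, the basic Magic Formula curve that such identification tools fit is compact enough to sketch. Below is a minimal Python illustration; the coefficient values are purely hypothetical and this is not TRIP-ID code.

```python
import numpy as np

def magic_formula(slip, B, C, D, E):
    """Basic Pacejka Magic Formula: force as a function of slip.

    B: stiffness factor, C: shape factor, D: peak value, E: curvature factor.
    These are the kinds of parameters an identification tool estimates
    from telemetry or test-rig data.
    """
    return D * np.sin(C * np.arctan(B * slip - E * (B * slip - np.arctan(B * slip))))

# Illustrative longitudinal-force curve (coefficients are hypothetical)
kappa = np.linspace(-0.3, 0.3, 61)   # longitudinal slip ratio
Fx = magic_formula(kappa, B=10.0, C=1.9, D=4500.0, E=0.97)
print(Fx.max())                      # peak force of the illustrative curve
```

In a fitting tool, B, C, D and E would be the unknowns estimated from measured slip-force data.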
Framework for assessing key variable dependencies in loose-abrasive grinding and polishing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, J.S.; Aikens, D.M.; Brown, N.J.
1995-12-01
This memo describes a framework for identifying all key variables that determine the figuring performance of loose-abrasive lapping and polishing machines. This framework is intended as a tool for prioritizing R&D issues, assessing the completeness of process models and experimental data, and for providing a mechanism to identify any assumptions in analytical models or experimental procedures. Future plans for preparing analytical models or performing experiments can refer to this framework in establishing the context of the work.
Vehicle Lightweighting: Challenges and Opportunities with Aluminum
NASA Astrophysics Data System (ADS)
Sachdev, Anil K.; Mishra, Raja K.; Mahato, Anirban; Alpas, Ahmet
Rising energy costs, consumer preferences and regulations drive requirements for fuel economy, performance, comfort, safety and cost of future automobiles. These conflicting situations offer challenges for vehicle lightweighting, for which aluminum applications are key. This paper describes product design needs and materials and process development opportunities driven by theoretical, experimental and modeling tools in the area of sheet and castings. Computational tools and novel experimental techniques used in their development are described. The paper concludes with challenges that lie ahead for pervasive use of aluminum and the necessary fundamental R&D that is still needed.
Krause, Lyndon; Farrow, Damian; Reid, Machar; Buszard, Tim; Pinder, Ross
2018-06-01
Representative Learning Design (RLD) is a framework for assessing the degree to which experimental or practice tasks simulate key aspects of specific performance environments (i.e. competition). The key premise is that when practice replicates the performance environment, skills are more likely to transfer. In applied situations, however, there is currently no simple or quick method for coaches to assess the key concepts of RLD (e.g. during on-court tasks). The aim of this study was to develop a tool for coaches to efficiently assess practice task design in tennis. A consensus-based tool was developed using a 4-round Delphi process with 10 academic and 13 tennis-coaching experts. Expert consensus was reached for the inclusion of seven items, each consisting of two sub-questions related to (i) the task goal and (ii) the relevance of the task to competition performance. The Representative Practice Assessment Tool (RPAT) is proposed for use in assessing and enhancing practice task designs in tennis to increase the functional coupling between information and movement, and to maximise the potential for skill transfer to competition contexts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, S.S.; Zhu, S.; Cai, Y.
Motion-dependent magnetic forces are the key elements in the study of magnetically levitated vehicle (maglev) system dynamics. In the past, most maglev-system designs were based on a quasisteady-motion theory of magnetic forces. This report presents an experimental and analytical study that will enhance our understanding of the role of unsteady-motion-dependent magnetic forces and demonstrate an experimental technique that can be used to measure those unsteady magnetic forces directly. The experimental technique provides a useful tool to measure motion-dependent magnetic forces for the prediction and control of maglev systems.
Sensitivity analysis of navy aviation readiness based sparing model
2017-09-01
variability. (See Figure 4: research design flowchart.) Figure 4 lays out the four steps of the methodology, starting in the upper left-hand...as a function of changes in key inputs. We develop NAVARM Experimental Designs (NED), a computational tool created by applying a state-of-the-art...experimental design to the NAVARM model. Statistical analysis of the resulting data identifies the most influential cost factors. Those are, in order of
Computer Simulations: A Tool to Predict Experimental Parameters with Cold Atoms
2013-04-01
...specifically designed to work with cold atom systems and atom chips, and is already able to compute their key properties. We simulate our experimental...also allows one to choose different physics and define the interdependencies between them. It is not specifically designed for cold atom systems or
Using Galaxy to Perform Large-Scale Interactive Data Analyses
Hillman-Jackson, Jennifer; Clements, Dave; Blankenberg, Daniel; Taylor, James; Nekrutenko, Anton
2014-01-01
Innovations in biomedical research technologies continue to provide experimental biologists with novel and increasingly large genomic and high-throughput data resources to be analyzed. As creating and obtaining data has become easier, the key decision faced by many researchers is a practical one: where and how should an analysis be performed? Datasets are large and analysis tool set-up and use is riddled with complexities outside of the scope of core research activities. The authors believe that Galaxy provides a powerful solution that simplifies data acquisition and analysis in an intuitive Web application, granting all researchers access to key informatics tools previously only available to computational specialists working in Unix-based environments. We will demonstrate through a series of biomedically relevant protocols how Galaxy specifically brings together (1) data retrieval from public and private sources, for example, UCSC's Eukaryote and Microbial Genome Browsers, (2) custom tools (wrapped Unix functions, format standardization/conversions, interval operations), and (3) third-party analysis tools. PMID:22700312
The Microarray Revolution: Perspectives from Educators
ERIC Educational Resources Information Center
Brewster, Jay L.; Beason, K. Beth; Eckdahl, Todd T.; Evans, Irene M.
2004-01-01
In recent years, microarray analysis has become a key experimental tool, enabling the analysis of genome-wide patterns of gene expression. This review approaches the microarray revolution with a focus upon four topics: 1) the early development of this technology and its application to cancer diagnostics; 2) a primer of microarray research,…
Methods of photoelectrode characterization with high spatial and temporal resolution
Esposito, Daniel V.; Baxter, Jason B.; John, Jimmy; ...
2015-06-19
Here, materials and photoelectrode architectures that are highly efficient, extremely stable, and made from low cost materials are required for commercially viable photoelectrochemical (PEC) water-splitting technology. A key challenge is the heterogeneous nature of real-world materials, which often possess spatial variation in their crystal structure, morphology, and/or composition at the nano-, micro-, or macro-scale. Different structures and compositions can have vastly different properties and can therefore strongly influence the overall performance of the photoelectrode through complex structure–property relationships. A complete understanding of photoelectrode materials would also involve elucidation of processes such as carrier collection and electrochemical charge transfer that occur at very fast time scales. We present herein an overview of a broad suite of experimental and computational tools that can be used to define the structure–property relationships of photoelectrode materials at small dimensions and on fast time scales. A major focus is on in situ scanning-probe measurement (SPM) techniques that possess the ability to measure differences in optical, electronic, catalytic, and physical properties with nano- or micro-scale spatial resolution. In situ ultrafast spectroscopic techniques, used to probe carrier dynamics involved with processes such as carrier generation, recombination, and interfacial charge transport, are also discussed. Complementing all of these experimental techniques are computational atomistic modeling tools, which can be invaluable for interpreting experimental results, aiding in materials discovery, and interrogating PEC processes at length and time scales not currently accessible by experiment. In addition to reviewing the basic capabilities of these experimental and computational techniques, we highlight key opportunities and limitations of applying these tools for the development of PEC materials.
Wang, Qin; Wang, Xiang-Bin
2014-01-01
We present a model for the simulation of measurement-device-independent quantum key distribution (MDI-QKD) with phase-randomized general sources. It can be used to predict experimental observations of an MDI-QKD with linear channel loss, simulating corresponding values for the gains, the error rates in different bases, and also the final key rates. Our model is applicable to MDI-QKD with an arbitrary probabilistic mixture of different photon states or using any coding scheme. Therefore, it is useful in characterizing and evaluating the performance of the MDI-QKD protocol, making it a valuable tool in studying quantum key distribution. PMID:24728000
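For reference, a hedged sketch of the asymptotic key-rate bound that such simulated gains and error rates are typically fed into (the Lo-Curty-Qi form; symbol and subscript conventions vary across papers):

```latex
R \;\ge\; P_{11}\, Y_{11}^{Z}\left[1 - H\!\left(e_{11}^{X}\right)\right]
      \;-\; Q_{\mu\nu}^{Z}\, f\, H\!\left(E_{\mu\nu}^{Z}\right),
\qquad
H(x) = -x\log_2 x - (1-x)\log_2(1-x),
```

where P_11, Y_11^Z and e_11^X are the probability, yield and phase-error rate of single-photon pair contributions, Q_munu^Z and E_munu^Z are the total gain and error rate of the signal pulses, and f is the error-correction inefficiency.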
Insights into early lithic technologies from ethnography
Hayden, Brian
2015-01-01
Oldowan lithic assemblages are often portrayed as a product of the need to obtain sharp flakes for cutting into animal carcases. However, ethnographic and experimental research indicates that the optimal way to produce flakes for such butchering purposes is via bipolar reduction of small cryptocrystalline pebbles rather than from larger crystalline cores resembling choppers. Ethnographic observations of stone tool-using hunter-gatherers in environments comparable with early hominins indicate that most stone tools (particularly chopper forms and flake tools) were used for making simple shaft tools including spears, digging sticks and throwing sticks. These tools bear strong resemblances to Oldowan stone tools. Bipolar reduction for butchering probably preceded chopper-like core reduction and provides a key link between primate nut-cracking technologies and the emergence of more sophisticated lithic technologies leading to the Oldowan. PMID:26483534
Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advance Molecular Imaging Tools.
Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe
2018-01-01
Processing and interpretation of biological images may provide invaluable insights on complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and the different instrumentations focused on molecular image processing and analysis. Quantitative data analysis through various open source software packages and algorithmic protocols will provide a novel approach for modeling the experimental research program. Besides this, we also highlight predictable future trends in methods for automatically analyzing biological data. Such tools will be very useful for understanding the detailed biological and mathematical relationships underlying in-silico systems biology models.
LabKey Server: an open source platform for scientific data integration, analysis and collaboration.
Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark
2011-03-09
Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) Submitting specimen requests across collaborating organizations (ii) Graphically defining new experimental data types, metadata and wizards for data collection (iii) Transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database (iv) Securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays (v) Interacting dynamically with external data sources (vi) Tracking study participants and cohorts over time (vii) Developing custom interfaces using client libraries (viii) Authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.
LabKey Server: An open source platform for scientific data integration, analysis and collaboration
2011-01-01
Background Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. Results To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) Submitting specimen requests across collaborating organizations (ii) Graphically defining new experimental data types, metadata and wizards for data collection (iii) Transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database (iv) Securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays (v) Interacting dynamically with external data sources (vi) Tracking study participants and cohorts over time (vii) Developing custom interfaces using client libraries (viii) Authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0. PMID:21385461
Tempest: Tools for Addressing the Needs of Next-Generation Climate Models
NASA Astrophysics Data System (ADS)
Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.
2015-12-01
Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.
Insights into early lithic technologies from ethnography.
Hayden, Brian
2015-11-19
Oldowan lithic assemblages are often portrayed as a product of the need to obtain sharp flakes for cutting into animal carcases. However, ethnographic and experimental research indicates that the optimal way to produce flakes for such butchering purposes is via bipolar reduction of small cryptocrystalline pebbles rather than from larger crystalline cores resembling choppers. Ethnographic observations of stone tool-using hunter-gatherers in environments comparable with early hominins indicate that most stone tools (particularly chopper forms and flake tools) were used for making simple shaft tools including spears, digging sticks and throwing sticks. These tools bear strong resemblances to Oldowan stone tools. Bipolar reduction for butchering probably preceded chopper-like core reduction and provides a key link between primate nut-cracking technologies and the emergence of more sophisticated lithic technologies leading to the Oldowan. © 2015 The Author(s).
Show and tell: disclosure and data sharing in experimental pathology.
Schofield, Paul N; Ward, Jerrold M; Sundberg, John P
2016-06-01
Reproducibility of data from experimental investigations using animal models is increasingly under scrutiny because of the potentially negative impact of poor reproducibility on the translation of basic research. Histopathology is a key tool in biomedical research, in particular for the phenotyping of animal models to provide insights into the pathobiology of diseases. Failure to disclose and share crucial histopathological experimental details compromises the validity of the review process and reliability of the conclusions. We discuss factors that affect the interpretation and validation of histopathology data in publications and the importance of making these data accessible to promote replicability in research. © 2016. Published by The Company of Biologists Ltd.
Biomaterial science meets computational biology.
Hutmacher, Dietmar W; Little, J Paige; Pettet, Graeme J; Loessner, Daniela
2015-05-01
There is a pressing need for a predictive tool capable of revealing a holistic understanding of fundamental elements in the normal and pathological cell physiology of organoids in order to decipher the mechanoresponse of cells. Therefore, the integration of a systems bioengineering approach into a validated mathematical model is necessary to develop a new simulation tool. This tool can only be innovative by combining biomaterials science with computational biology. Systems-level and multi-scale experimental data are incorporated into a single framework, thus representing both single cells and collective cell behaviour. Such a computational platform needs to be validated in order to discover key mechano-biological factors associated with cell-cell and cell-niche interactions.
An experimental investigation on orthogonal cutting of hybrid CFRP/Ti stacks
NASA Astrophysics Data System (ADS)
Xu, Jinyang; El Mansori, Mohamed
2016-10-01
Hybrid CFRP/Ti stacks have been widely used in the modern aerospace industry owing to their superior mechanical/physical properties and excellent structural functions. Several applications require mechanical machining of these hybrid composite stacks in order to achieve dimensional accuracy and assembly performance. However, machining of such a composite-to-metal alliance is usually an extremely challenging task in the manufacturing sectors due to the disparate natures of each stacked constituent and their respective poor machinability. Special issues may arise from the high force/heat generation, severe subsurface damage and rapid tool wear. To study the fundamental mechanisms controlling bi-material machining, this paper presents an experimental study on orthogonal cutting of a hybrid CFRP/Ti stack using polycrystalline diamond (PCD) tipped tools. The cutting parameters for hybrid CFRP/Ti machining were adopted through a compromise selection due to the disparate machinability behaviors of the CFRP laminate and the Ti alloy. The key cutting responses in terms of cutting force generation, machined surface quality and tool wear mechanism are addressed. The experimental results highlight the five stages involved in CFRP/Ti cutting and the predominant crater wear and edge fracture failure governing the PCD cutting process.
Experimenter's laboratory for visualized interactive science
NASA Technical Reports Server (NTRS)
Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.
1992-01-01
The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.
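To illustrate the self-describing NetCDF data model mentioned above, here is a minimal sketch using the current netCDF4 Python binding (a later interface than the original Unidata C library; the file and variable names are illustrative):

```python
from netCDF4 import Dataset   # Python binding to Unidata's NetCDF library
import numpy as np

# Create a self-describing file: dimensions, variables and attributes
# travel together with the data.
with Dataset("demo.nc", "w") as nc:
    nc.createDimension("time", None)                  # unlimited dimension
    time = nc.createVariable("time", "f8", ("time",))
    temp = nc.createVariable("temperature", "f4", ("time",))
    temp.units = "K"                                  # metadata readable by any NetCDF tool
    time[:] = np.arange(5)
    temp[:] = 273.15 + np.random.rand(5)

# Any NetCDF-aware application can now read the data and its description.
with Dataset("demo.nc") as nc:
    print(nc.variables["temperature"].units, nc.variables["temperature"][:])
```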
Experimenter's laboratory for visualized interactive science
NASA Technical Reports Server (NTRS)
Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.
1993-01-01
The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.
1992-07-01
methodologies; software performance analysis; software testing; and concurrent languages. Finally, efforts in algorithms, which are primarily designed to upgrade...These codes provide a powerful research tool for testing new concepts and designs prior to experimental implementation. DoE's laser program has also...development, and specially designed production facilities. World leadership in both non-fluorinated and fluorinated materials resides in the U.S. but Japan
Self-assembly kinetics of microscale components: A parametric evaluation
NASA Astrophysics Data System (ADS)
Carballo, Jose M.
The goal of the present work is to develop and evaluate a parametric model of a basic microscale Self-Assembly (SA) interaction that provides scaling predictions of process rates as a function of key process variables. At the microscale, assembly by "grasp and release" is generally challenging. Recent research efforts have proposed adapting nanoscale self-assembly (SA) processes to the microscale. SA offers the potential for reduced equipment cost and increased throughput by harnessing attractive forces (most commonly, capillary) to spontaneously assemble components. However, there are challenges to implementing microscale SA as a commercial process. The existing lack of design tools prevents simple process optimization. Previous efforts have characterized specific aspects of the SA process. However, the existing microscale SA models do not characterize the inter-component interactions. All existing models have simplified the outcome of SA interactions as an experimentally-derived value specific to a particular configuration, instead of evaluating the outcome as a function of component-level parameters (such as speed, geometry, bonding energy and direction). The present study parameterizes the outcome of interactions, and evaluates the effect of key parameters. The present work closes the gap between existing microscale SA models to add a key piece towards a complete design tool for general microscale SA process modeling. First, this work proposes a simple model for defining the probability of assembly of basic SA interactions. A basic SA interaction is defined as the event where a single part arrives on an assembly site. The model describes the probability of assembly as a function of kinetic energy, binding energy, orientation and incidence angle for the component and the assembly site. Secondly, an experimental SA system was designed and implemented to create individual SA interactions while controlling process parameters independently. SA experiments measured the outcome of SA interactions, while studying the independent effects of each parameter. As a first step towards a complete scaling model, experiments were performed to evaluate the effects of part geometry and part travel direction under low kinetic energy conditions. Experimental results show minimal dependence of assembly yield on the incidence angle of the parts, and significant effects induced by changes in part geometry. The results from this work indicate that SA could be modeled as an energy-based process due to the small path-dependence effects. Assembly probability is linearly related to the orientation probability. The proportionality constant is based on the area fraction of the sites with an amplification factor. This amplification factor accounts for the ability of capillary forces to align parts with only very small areas of contact when they have a low kinetic energy. Results provide unprecedented insight about SA interactions. The present study is a key step towards completing a basic model of a general SA process. Moreover, the outcome from this work can complement existing SA process models, in order to create a complete design tool for microscale SA systems. In addition to SA experiments, Monte Carlo simulations of experimental part-site interactions were conducted. This study confirmed that a major contributor to experimental variation is the stochastic nature of experimental SA interactions and the limited sample size of the experiments.
Furthermore, the simulations serve as a tool for defining an optimum sampling strategy to minimize the uncertainty in future SA experiments.
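A minimal sketch of the linear relation described above, with illustrative parameter names and values (the actual amplification factor and area fractions are experimentally determined in the dissertation):

```python
import numpy as np

def assembly_probability(p_orient, site_area_fraction, amplification):
    # Linear model: assembly probability proportional to orientation probability,
    # with proportionality constant = site area fraction x amplification factor
    # (the amplification captures capillary self-alignment at low kinetic energy).
    return min(amplification * site_area_fraction * p_orient, 1.0)

# Monte Carlo view of run-to-run scatter for a limited sample size, mirroring
# the stochastic-variation point made in the abstract.
rng = np.random.default_rng(0)
p = assembly_probability(p_orient=0.6, site_area_fraction=0.25, amplification=3.0)
outcomes = rng.random((10_000, 50)) < p   # 10,000 simulated experiments of 50 interactions
print(p, outcomes.mean(axis=1).std())     # spread of measured yield across experiments
```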
Initial sequencing and comparative analysis of the mouse genome
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waterston, Robert H.; Lindblad-Toh, Kerstin; Birney, Ewan
2002-12-15
The sequence of the mouse genome is a key informational tool for understanding the contents of the human genome and a key experimental tool for biomedical research. Here, we report the results of an international collaboration to produce a high-quality draft sequence of the mouse genome. We also present an initial comparative analysis of the mouse and human genomes, describing some of the insights that can be gleaned from the two sequences. We discuss topics including the analysis of the evolutionary forces shaping the size, structure and sequence of the genomes; the conservation of large-scale synteny across most of the genomes; the much lower extent of sequence orthology covering less than half of the genomes; the proportions of the genomes under selection; the number of protein-coding genes; the expansion of gene families related to reproduction and immunity; the evolution of proteins; and the identification of intraspecies polymorphism.
Cserpán, Dorottya; Meszéna, Domokos; Wittner, Lucia; Tóth, Kinga; Ulbert, István; Somogyvári, Zoltán
2017-01-01
Revealing the current source distribution along the neuronal membrane is a key step on the way to understanding neural computations; however, the experimental and theoretical tools to achieve sufficient spatiotemporal resolution for the estimation remain to be established. Here, we address this problem using extracellularly recorded potentials with arbitrarily distributed electrodes for a neuron of known morphology. We use simulations of models with varying complexity to validate the proposed method and to give recommendations for experimental applications. The method is applied to in vitro data from rat hippocampus. PMID:29148974
Computational Design of Functionalized Metal–Organic Framework Nodes for Catalysis
2017-01-01
Recent progress in the synthesis and characterization of metal–organic frameworks (MOFs) has opened the door to an increasing number of possible catalytic applications. The great versatility of MOFs creates a large chemical space, whose thorough experimental examination becomes practically impossible. Therefore, computational modeling is a key tool to support, rationalize, and guide experimental efforts. In this outlook we survey the main methodologies employed to model MOFs for catalysis, and we review selected recent studies on the functionalization of their nodes. We pay special attention to catalytic applications involving natural gas conversion. PMID:29392172
Transfection of Capsaspora owczarzaki, a close unicellular relative of animals.
Parra-Acero, Helena; Ros-Rocher, Núria; Perez-Posada, Alberto; Kożyczkowska, Aleksandra; Sánchez-Pons, Núria; Nakata, Azusa; Suga, Hiroshi; Najle, Sebastián R; Ruiz-Trillo, Iñaki
2018-05-11
How animals emerged from their unicellular ancestor remains a major evolutionary question. New genome data from the closest unicellular relatives of animals have provided important insights into animal origins. We know that the unicellular ancestor of animals had an unexpectedly complex genetic repertoire, including many genes key to animal development and multicellularity. Thus, assessing the function of these genes among unicellular relatives of animals is key to understanding how they were co-opted at the onset of Metazoa. However, those analyses have been hampered by the lack of genetic tools. Progress has been made in choanoflagellates and teretosporeans, two of the three lineages closely related to animals, while in filastereans no tools are yet available for functional analysis. Importantly, filastereans possess a striking repertoire of genes involved in transcriptional regulation and other developmental processes. Here, we describe a reliable transfection method for the filasterean Capsaspora owczarzaki. We also provide a set of constructs for visualizing subcellular structures in live cells. These tools convert Capsaspora into a unique experimentally tractable organism to address the origin and evolution of animal multicellularity. © 2018. Published by The Company of Biologists Ltd.
A Thermal Management Systems Model for the NASA GTX RBCC Concept
NASA Technical Reports Server (NTRS)
Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)
2002-01-01
The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium and Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.
The Balloon Experimental Twin Telescope for Infrared Interferometry
NASA Technical Reports Server (NTRS)
Rinehart, Stephen A.
2008-01-01
Astronomical studies at infrared wavelengths have dramatically improved our understanding of the universe, and observations with Spitzer, the upcoming Herschel mission, and SOFIA will continue to provide exciting new discoveries. The relatively low angular resolution of these missions, however, is insufficient to resolve the physical scales on which mid- to far-infrared emission arises, resulting in source and structure ambiguities that limit our ability to answer key science questions. Interferometry enables high angular resolution at these wavelengths, a powerful tool for scientific discovery. We will build the Balloon Experimental Twin Telescope for Infrared Interferometry (BETTII), an eight-meter baseline Michelson stellar interferometer to fly on a high-altitude balloon. BETTII's spectral-spatial capability, provided by an instrument using double-Fourier techniques, will address key questions about the nature of disks in young star clusters and active galactic nuclei and the envelopes of evolved stars. BETTII will also lay the technological groundwork for future space interferometers.
Two-photon calcium imaging in mice navigating a virtual reality environment.
Leinweber, Marcus; Zmarz, Pawel; Buchmann, Peter; Argast, Paul; Hübener, Mark; Bonhoeffer, Tobias; Keller, Georg B
2014-02-20
In recent years, two-photon imaging has become an invaluable tool in neuroscience, as it allows for chronic measurement of the activity of genetically identified cells during behavior (1-6). Here we describe methods to perform two-photon imaging in mouse cortex while the animal navigates a virtual reality environment. We focus on the aspects of the experimental procedures that are key to imaging in a behaving animal in a brightly lit virtual environment. The key problems that arise in this experimental setup that we address here are: minimizing brain motion-related artifacts, minimizing light leak from the virtual reality projection system, and minimizing laser-induced tissue damage. We also provide sample software to control the virtual reality environment and to do pupil tracking. With these procedures and resources it should be possible to convert a conventional two-photon microscope for use in behaving mice.
Mycobacterium tuberculosis Metabolism
Warner, Digby F.
2015-01-01
Metabolism underpins the physiology and pathogenesis of Mycobacterium tuberculosis. However, although experimental mycobacteriology has provided key insights into the metabolic pathways that are essential for survival and pathogenesis, determining the metabolic status of bacilli during different stages of infection and in different cellular compartments remains challenging. Recent advances—in particular, the development of systems biology tools such as metabolomics—have enabled key insights into the biochemical state of M. tuberculosis in experimental models of infection. In addition, their use to elucidate mechanisms of action of new and existing antituberculosis drugs is critical for the development of improved interventions to counter tuberculosis. This review provides a broad summary of mycobacterial metabolism, highlighting the adaptation of M. tuberculosis as specialist human pathogen, and discusses recent insights into the strategies used by the host and infecting bacillus to influence the outcomes of the host–pathogen interaction through modulation of metabolic functions. PMID:25502746
Fully device-independent conference key agreement
NASA Astrophysics Data System (ADS)
Ribeiro, Jérémy; Murta, Gláucia; Wehner, Stephanie
2018-02-01
We present a security analysis of conference key agreement (CKA) in the most adversarial model of device independence (DI). Our protocol can be implemented by any experimental setup that is capable of performing Bell tests [specifically, the Mermin-Ardehali-Belinskii-Klyshko (MABK) inequality], and security can in principle be obtained for any violation of the MABK inequality that detects genuine multipartite entanglement among the N parties involved in the protocol. As our main tool, we derive a direct physical connection between the N-partite MABK inequality and the Clauser-Horne-Shimony-Holt (CHSH) inequality, showing that certain violations of the MABK inequality correspond to a violation of the CHSH inequality between one of the parties and the other N-1. We compare the asymptotic key rate for device-independent conference key agreement (DICKA) to the case where the parties use N-1 device-independent quantum key distribution protocols in order to generate a common key. We show that for some regimes of noise the DICKA protocol leads to better rates.
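For orientation, a sketch of the bipartite inequality that the multipartite analysis is reduced to; normalization conventions vary across the literature:

```latex
S_{\mathrm{CHSH}} \;=\; \langle A_0 B_0\rangle + \langle A_0 B_1\rangle
                  + \langle A_1 B_0\rangle - \langle A_1 B_1\rangle \;\le\; 2
\quad\text{(classical bound; quantum mechanics allows up to } 2\sqrt{2}\text{)}.
```

The N-partite MABK expressions generalize this construction, with the gap between the classical and quantum bounds growing with N; the paper's key step is mapping certain MABK violations onto an effective bipartite CHSH violation between one party and the remaining N-1.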
A new approach to the rational discovery of polymeric biomaterials
Kohn, Joachim; Welsh, William J.; Knight, Doyle
2007-01-01
This paper attempts to illustrate both the need for new approaches to biomaterials discovery as well as the significant promise inherent in the use of combinatorial and computational design strategies. The key observation of this Leading Opinion Paper is that the biomaterials community has been slow to embrace advanced biomaterials discovery tools such as combinatorial methods, high throughput experimentation, and computational modeling in spite of the significant promise shown by these discovery tools in materials science, medicinal chemistry and the pharmaceutical industry. It seems that the complexity of living cells and their interactions with biomaterials has been a conceptual as well as a practical barrier to the use of advanced discovery tools in biomaterials science. However, with the continued increase in computer power, the goal of predicting the biological response of cells in contact with biomaterials surfaces is within reach. Once combinatorial synthesis, high throughput experimentation, and computational modeling are integrated into the biomaterials discovery process, a significant acceleration is possible in the pace of development of improved medical implants, tissue regeneration scaffolds, and gene/drug delivery systems. PMID:17644176
A Probabilistic Approach to Model Update
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Reaves, Mercedes C.; Voracek, David F.
2001-01-01
Finite element models are often developed for load validation, structural certification, response predictions, and to study alternate design concepts. On rare occasions, models developed with a nominal set of parameters agree with experimental data without the need to update parameter values. Today, model updating is generally heuristic and often performed by a skilled analyst with in-depth understanding of the model assumptions. Parameter uncertainties play a key role in understanding the model update problem and therefore probabilistic analysis tools, developed for reliability and risk analysis, may be used to incorporate uncertainty in the analysis. In this work, probability analysis (PA) tools are used to aid the parameter update task using experimental data and some basic knowledge of potential error sources. Discussed here is the first application of PA tools to update parameters of a finite element model for a composite wing structure. Static deflection data at six locations are used to update five parameters. It is shown that while prediction of individual response values may not be matched identically, the system response is significantly improved with moderate changes in parameter values.
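As a hedged illustration of this kind of probabilistic update (not the paper's PA code), here is a linear-Gaussian sketch with six deflection measurements and five parameters; the sensitivity matrix, noise levels and residuals are invented for the example:

```python
import numpy as np

# Linearized model near the nominal FE solution: deflection change ~ S @ theta,
# where S would come from finite-difference sensitivities of the FE model.
S = np.array([[1.0, 0.4, 0.0, 0.2, 0.1],
              [0.3, 1.1, 0.2, 0.0, 0.0],
              [0.0, 0.2, 0.9, 0.3, 0.0],
              [0.1, 0.0, 0.4, 1.2, 0.2],
              [0.0, 0.1, 0.0, 0.3, 0.8],
              [0.2, 0.0, 0.1, 0.0, 1.0]])   # 6 sensors x 5 parameters (illustrative)
prior_cov = np.eye(5) * 0.25                 # prior parameter uncertainty
noise_cov = np.eye(6) * 0.01                 # measurement noise
residual = np.array([0.12, -0.05, 0.08, 0.02, -0.03, 0.06])  # measured - nominal

# Standard Gaussian (Kalman-style) posterior for the parameter corrections
K = prior_cov @ S.T @ np.linalg.inv(S @ prior_cov @ S.T + noise_cov)
theta_update = K @ residual
posterior_cov = (np.eye(5) - K @ S) @ prior_cov
print(theta_update)                          # moderate parameter changes, as in the paper
```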
Improving material removal determinacy based on the compensation of tool influence function
NASA Astrophysics Data System (ADS)
Zhong, Bo; Chen, Xian-hua; Deng, Wen-hui; Zhao, Shi-jie; Zheng, Nan
2018-03-01
In the process of computer-controlled optical surfacing (CCOS), the key to correcting the surface error of optical components is to ensure consistency between the simulated tool influence function and the actual tool influence function (TIF). The existing removal model usually adopts the fixed-point TIF to remove the material with the planned path and velocity, and it considers the polishing process to be linear and time-invariant. However, in the actual polishing process, the TIF is a function of the feed speed. In this paper, the relationship between the actual TIF and the feed speed (i.e. the compensation relationship between static removal and dynamic removal) is determined experimentally. Then, the existing removal model is modified based on the compensation relationship, to improve the conformity between simulated and actual processing. Finally, surface-error correction tests are carried out. The results show that the fit between the simulated surface and the experimental surface is better than 88%, and the surface correction accuracy can be better than λ/10 (λ = 632.8 nm).
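A minimal sketch of the underlying CCOS removal model (removal = TIF convolved with dwell time) with a feed-speed-dependent TIF correction; the Gaussian TIF and the compensation law below are illustrative stand-ins, not the paper's experimentally fitted relation:

```python
import numpy as np
from scipy.signal import fftconvolve

def gaussian_tif(radius_px=15, peak_rate=1.0):
    # Illustrative rotationally symmetric TIF (removal rate per unit dwell time)
    y, x = np.mgrid[-radius_px:radius_px + 1, -radius_px:radius_px + 1]
    return peak_rate * np.exp(-(x**2 + y**2) / (2 * (radius_px / 2.5) ** 2))

def removal_map(dwell, feed_speed, tif):
    # Hypothetical compensation: effective TIF shrinks as feed speed rises,
    # standing in for the measured static-vs-dynamic removal relationship.
    effective_tif = tif / (1.0 + 0.05 * feed_speed)
    return fftconvolve(dwell, effective_tif, mode="same")   # removal = TIF (*) dwell

dwell = np.zeros((128, 128))
dwell[64, 20:108] = 0.5                                     # one raster line, s/pixel
print(removal_map(dwell, feed_speed=10.0, tif=gaussian_tif()).max())
```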
Direction and Integration of Experimental Ground Test Capabilities and Computational Methods
NASA Technical Reports Server (NTRS)
Dunn, Steven C.
2016-01-01
This paper groups and summarizes the salient points and findings from two AIAA conference panels targeted at defining the direction, with associated key issues and recommendations, for the integration of experimental ground testing and computational methods. Each panel session utilized rapporteurs to capture comments from both the panel members and the audience. Additionally, a virtual panel of several experts were consulted between the two sessions and their comments were also captured. The information is organized into three time-based groupings, as well as by subject area. These panel sessions were designed to provide guidance to both researchers/developers and experimental/computational service providers in defining the future of ground testing, which will be inextricably integrated with the advancement of computational tools.
Designing novel cellulase systems through agent-based modeling and global sensitivity analysis.
Apte, Advait A; Senger, Ryan S; Fong, Stephen S
2014-01-01
Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement.
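As a sketch of the GSA step, here is a first-order variance-based (Sobol-style) sensitivity estimate on a toy efficiency function standing in for the ABM output; the function and parameter ranges are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def efficiency(half_life, exo_activity, complexed_fraction):
    # Toy stand-in for the ABM's hydrolysis-efficiency output, not the authors' model
    return (1 - np.exp(-half_life / 24.0)) * exo_activity * (0.5 + 0.5 * complexed_fraction)

n = 20000
X = rng.uniform([6.0, 0.1, 0.0], [72.0, 1.0, 1.0], size=(n, 3))   # parameter samples
Y = efficiency(*X.T)
for name, col in zip(["half-life", "exo activity", "complexed fraction"], X.T):
    # Crude first-order index: variance of the conditional mean over quantile bins of X_i
    edges = np.quantile(col, np.linspace(0, 1, 21))
    idx = np.digitize(col, edges[1:-1])
    cond_means = np.array([Y[idx == b].mean() for b in range(20)])
    print(name, round(cond_means.var() / Y.var(), 3))
```

Parameters with larger indices dominate output variance and are the natural targets for experimental improvement, which is the role GSA plays in the study.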
GenoBase: comprehensive resource database of Escherichia coli K-12
Otsuka, Yuta; Muto, Ai; Takeuchi, Rikiya; Okada, Chihiro; Ishikawa, Motokazu; Nakamura, Koichiro; Yamamoto, Natsuko; Dose, Hitomi; Nakahigashi, Kenji; Tanishima, Shigeki; Suharnan, Sivasundaram; Nomura, Wataru; Nakayashiki, Toru; Aref, Walid G.; Bochner, Barry R.; Conway, Tyrrell; Gribskov, Michael; Kihara, Daisuke; Rudd, Kenneth E.; Tohsato, Yukako; Wanner, Barry L.; Mori, Hirotada
2015-01-01
Comprehensive experimental resources, such as ORFeome clone libraries and deletion mutant collections, are fundamental tools for elucidation of gene function. Data sets from omics analysis using these resources provide key information for functional analysis, modeling and simulation in both individual and systematic approaches. With the long-term goal of complete understanding of a cell, we have over the past decade created a variety of clone and mutant sets for functional genomics studies of Escherichia coli K-12. We have made these experimental resources freely available to the academic community worldwide. Accordingly, these resources have now been used in numerous investigations of a multitude of cell processes. Quality control is extremely important for evaluating results generated by these resources. Because the annotation we originally used for the construction has been revised since 2005, we have updated these genomic resources accordingly. Here, we describe GenoBase (http://ecoli.naist.jp/GB/), which contains key information about comprehensive experimental resources of E. coli K-12, their quality control and several omics data sets generated using these resources. PMID:25399415
Designing novel cellulase systems through agent-based modeling and global sensitivity analysis
Apte, Advait A; Senger, Ryan S; Fong, Stephen S
2014-01-01
Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement. PMID:24830736
High-throughput strategies for the discovery and engineering of enzymes for biocatalysis.
Jacques, Philippe; Béchet, Max; Bigan, Muriel; Caly, Delphine; Chataigné, Gabrielle; Coutte, François; Flahaut, Christophe; Heuson, Egon; Leclère, Valérie; Lecouturier, Didier; Phalip, Vincent; Ravallec, Rozenn; Dhulster, Pascal; Froidevaux, Rénato
2017-02-01
Innovations in novel enzyme discoveries impact upon a wide range of industries for which biocatalysis and biotransformations represent a great challenge, i.e., food industry, polymers and chemical industry. Key tools and technologies, such as bioinformatics tools to guide mutant library design, molecular biology tools to create mutants library, microfluidics/microplates, parallel miniscale bioreactors and mass spectrometry technologies to create high-throughput screening methods and experimental design tools for screening and optimization, allow to evolve the discovery, development and implementation of enzymes and whole cells in (bio)processes. These technological innovations are also accompanied by the development and implementation of clean and sustainable integrated processes to meet the growing needs of chemical, pharmaceutical, environmental and biorefinery industries. This review gives an overview of the benefits of high-throughput screening approach from the discovery and engineering of biocatalysts to cell culture for optimizing their production in integrated processes and their extraction/purification.
Atomic force microscopy on chromosomes, chromatin and DNA: a review.
Kalle, Wouter; Strappe, Padraig
2012-12-01
The purpose of this review is to discuss the achievements and progress that have been made in the use of atomic force microscopy in DNA-related research in the last 25 years. For this review, DNA-related research is split up into chromosomal-, chromatin- and DNA-focused research to achieve a logical flow from large to smaller structures. The focus of this review is not only on the AFM as an imaging tool but also on the AFM as a measuring tool using force spectroscopy, as therein lies its greatest advantage and future. The amazing technological and experimental progress that has been made during the last 25 years is too extensive to fully cover in this review, but some key developments and experiments have been described to give an overview of the evolution of AFM use from 'imaging tool' to 'measurement tool' on chromosomes, chromatin and DNA. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef
Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling *feasible* in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale *sequential* data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.
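For context, a minimal random-walk Metropolis sampler, the baseline whose cost the accelerated and surrogate-based methods in this project aim to reduce; the decay-rate inference problem and all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def log_posterior(theta, data):
    # Toy inverse problem: infer the rate of an exponential decay from noisy data.
    t, y = data
    pred = np.exp(-theta * t)
    return -0.5 * np.sum((y - pred) ** 2) / 0.01 - 0.5 * theta**2  # likelihood + Gaussian prior

t = np.linspace(0, 2, 20)
data = (t, np.exp(-1.3 * t) + 0.1 * rng.standard_normal(20))

theta, lp = 1.0, log_posterior(1.0, data)
samples = []
for _ in range(5000):
    prop = theta + 0.1 * rng.standard_normal()   # random-walk proposal
    lp_prop = log_posterior(prop, data)
    if np.log(rng.random()) < lp_prop - lp:      # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)
print(np.mean(samples[1000:]), np.std(samples[1000:]))  # posterior mean/sd after burn-in
```

Each iteration requires a forward-model evaluation, which is exactly why surrogate models and adaptive approximations matter for expensive physical simulations.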
Do chimpanzees use weight to select hammer tools?
Schrauf, Cornelia; Call, Josep; Fuwa, Koki; Hirata, Satoshi
2012-01-01
The extent to which tool-using animals take into account relevant task parameters is poorly understood. Nut cracking is one of the most complex forms of tool use, the choice of an adequate hammer being a critical aspect in success. Several properties make a hammer suitable for nut cracking, with weight being a key factor in determining the impact of a strike; in general, the greater the weight the fewer strikes required. This study experimentally investigated whether chimpanzees are able to encode the relevance of weight as a property of hammers to crack open nuts. By presenting chimpanzees with three hammers that differed solely in weight, we assessed their ability to relate the weight of the different tools with their effectiveness and thus select the most effective one(s). Our results show that chimpanzees use weight alone in selecting tools to crack open nuts and that experience clearly affects the subjects' attentiveness to the tool properties that are relevant for the task at hand. Chimpanzees can encode the requirements that a nut-cracking tool should meet (in terms of weight) to be effective.
NASA Astrophysics Data System (ADS)
He, Qiuwei; Lv, Xingming; Wang, Xin; Qu, Xingtian; Zhao, Ji
2017-01-01
Blades are key components of energy and power equipment such as turbines and aircraft engines, and research on processes and equipment for blade finishing is both important and challenging. To precisely control the tool system of the developed hybrid grinding and polishing machine tool for blade finishing, the tool system with a changeable wheel for belt polishing is analyzed in this paper. First, the belt length and the wrap angle of each wheel are analyzed for different swing angles of the tension wheel during the wheel-changing process. A reasonable belt length is calculated using MATLAB, and the relationships between the wrap angle of each wheel and the cylinder expansion amount of the contact wheel are obtained. Then, the control system for the changeable-wheel tool structure is developed. Finally, the surface roughness of the finished blade is verified by experiments. Theoretical analysis and experimental results show that a reasonable belt length and wheel wrap angles can be obtained by the proposed method, that the changeable-wheel tool system can be controlled precisely, and that the surface roughness of the blade after grinding meets the design requirements.
Computational Analysis of Material Flow During Friction Stir Welding of AA5059 Aluminum Alloys
NASA Astrophysics Data System (ADS)
Grujicic, M.; Arakere, G.; Pandurangan, B.; Ochterbeck, J. M.; Yen, C.-F.; Cheeseman, B. A.; Reynolds, A. P.; Sutton, M. A.
2012-09-01
Workpiece material flow and stirring/mixing during the friction stir welding (FSW) process are investigated computationally. Within the numerical model of the FSW process, the FSW tool is treated as a Lagrangian component while the workpiece material is treated as an Eulerian component. The employed coupled Eulerian/Lagrangian computational analysis of the welding process was of a two-way thermo-mechanical character (i.e., frictional-sliding/plastic-work dissipation is taken to act as a heat source in the thermal-energy balance equation) while temperature is allowed to affect mechanical aspects of the model through temperature-dependent material properties. The workpiece material (AA5059, a solid-solution strengthened and strain-hardened aluminum alloy) is represented using a modified version of the classical Johnson-Cook model (within which the strain-hardening term is augmented to take into account the effect of dynamic recrystallization) while the FSW tool material (AISI H13 tool steel) is modeled as an isotropic linear-elastic material. Within the analysis, the effects of some of the key FSW process parameters are investigated (e.g., weld pitch, tool tilt-angle, and tool pin-size). The results pertaining to the material flow during FSW are compared with their experimental counterparts. It is found that, for the most part, experimentally observed material-flow characteristics are reproduced within the current FSW-process model.
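For reference, the classical Johnson-Cook flow stress that the workpiece model builds on has the standard multiplicative form below (the paper's augmentation of the strain-hardening term for dynamic recrystallization is its own contribution and is not reproduced here):

\sigma_y = \left(A + B\,\varepsilon_p^{\,n}\right)
           \left(1 + C\,\ln\frac{\dot{\varepsilon}_p}{\dot{\varepsilon}_0}\right)
           \left(1 - \left(\frac{T - T_{\mathrm{room}}}{T_{\mathrm{melt}} - T_{\mathrm{room}}}\right)^{m}\right)

where A, B, n, C and m are material parameters, \varepsilon_p is the equivalent plastic strain, \dot{\varepsilon}_0 is a reference strain rate, and the final factor supplies the thermal softening between room and melting temperature.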
1991-12-01
"foreign keys", which are keys inherited from connected entities; the keys would already be defined in the connected entity's domain primitive definition ... defined for the root-node relationship because all attributes are foreign keys and they are already defined in the connected entities' domain primitive ... can exchange data with other tools, including other tools in the tool vendor's tool set. The important attributes are
Manole, Claudiu Constantin; Pîrvu, C; Maury, F; Demetrescu, I
2016-06-01
In a Surface Plasmon Resonance (SPR) experiment, two key parameters are classically recorded: the time and the angle of SPR reflectivity. This paper brings into focus a third key parameter: the SPR reflectivity itself, which is shown to be related to changes in surface roughness. Practical investigations of (i) gold anodizing and (ii) polypyrrole film growth in the presence of oxalic acid are detailed under potentiostatic conditions. These experimental results reveal the potential of the SPR technique for investigating real-time changes not only on the gold surface but also in the gold film itself, which extends the versatility of the technique, in particular as a sensitive in-situ diagnostic tool.
Towards a cyberinfrastructure for the biological sciences: progress, visions and challenges.
Stein, Lincoln D
2008-09-01
Biology is an information-driven science. Large-scale data sets from genomics, physiology, population genetics and imaging are driving research at a dizzying rate. Simultaneously, interdisciplinary collaborations among experimental biologists, theorists, statisticians and computer scientists have become the key to making effective use of these data sets. However, too many biologists have trouble accessing and using these electronic data sets and tools effectively. A 'cyberinfrastructure' is a combination of databases, network protocols and computational services that brings people, information and computational tools together to perform science in this information-driven world. This article reviews the components of a biological cyberinfrastructure, discusses current and pending implementations, and notes the many challenges that lie ahead.
Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Mirsky, Vladimir M.
New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements in sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. The design of new sensing materials is an important cornerstone of the effort to develop new sensors. Often, sensing materials are too complex for their performance to be predicted quantitatively at the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate the new data required to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.
Enabling the use of hereditary information from pedigree tools in medical knowledge-based systems.
Gay, Pablo; López, Beatriz; Plà, Albert; Saperas, Jordi; Pous, Carles
2013-08-01
The use of family information is key to dealing with hereditary illnesses. This kind of information usually comes in the form of pedigree files, which contain structured information, as trees or graphs, explaining the family relationships. Knowledge-based systems should incorporate the information gathered by pedigree tools to support medical decision making. In this paper, we propose a method to achieve this goal, which consists of the definition of new indicators, together with methods and rules to compute them from family trees. The method is illustrated with several case studies. We provide information about its implementation and integration in a case-based reasoning tool. The method has been experimentally tested with breast cancer diagnosis data. The results show the feasibility of our methodology. Copyright © 2013 Elsevier Inc. All rights reserved.
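The abstract does not specify the new indicators, so the snippet below is only a plausible sketch of the general idea: one candidate indicator, the number of affected first-degree relatives, computed from a minimal pedigree encoded as a dictionary (all names and fields are hypothetical).

# Hypothetical pedigree: each person lists parents and disease status.
pedigree = {
    "proband": {"parents": ["mother", "father"], "affected": False},
    "mother":  {"parents": [], "affected": True},
    "father":  {"parents": [], "affected": False},
    "sister":  {"parents": ["mother", "father"], "affected": True},
}

def affected_first_degree(person, ped):
    # First-degree relatives: parents plus siblings (who share a parent).
    parents = set(ped[person]["parents"])
    relatives = set(parents)
    for name, record in ped.items():
        if name != person and parents & set(record["parents"]):
            relatives.add(name)
    return sum(ped[r]["affected"] for r in relatives)

print(affected_first_degree("proband", pedigree))  # -> 2 (mother, sister)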
NASA Technical Reports Server (NTRS)
Sebok, Angelia; Wickens, Christopher; Sargent, Robert
2015-01-01
One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. Concerns with modeling operator performance are that models need to be realistic, and that they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of the existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.
Manual praxis in stone tool manufacture: implications for language evolution.
Ruck, Lana
2014-12-01
Alternative functions of the left-hemisphere dominant Broca's region have induced hypotheses regarding the evolutionary parallels between manual praxis and language in humans. Many recent studies on Broca's area reveal several assumptions about the cognitive mechanisms that underlie both functions, including: (1) an accurate, finely controlled body schema, (2) increasing syntactical abilities, particularly for goal-oriented actions, and (3) bilaterality and fronto-parietal connectivity. Although these characteristics are supported by experimental paradigms, many researchers have failed to acknowledge a major line of evidence for the evolutionary development of these traits: stone tools. The neuroscience of stone tool manufacture is a viable proxy for understanding evolutionary aspects of manual praxis and language, and may provide key information for evaluating competing hypotheses on the co-evolution of these cognitive domains in our species. Copyright © 2014 Elsevier Inc. All rights reserved.
Pucci, Fabrizio; Bourgeas, Raphaël; Rooman, Marianne
2016-03-18
The accurate prediction of the impact of an amino acid substitution on the thermal stability of a protein is a central issue in protein science, and is of key relevance for the rational optimization of various bioprocesses that use enzymes in unusual conditions. Here we present one of the first computational tools to predict the change in melting temperature ΔTm upon point mutations, given the protein structure and, when available, the melting temperature Tm of the wild-type protein. The key ingredients of our model structure are standard and temperature-dependent statistical potentials, which are combined with the help of an artificial neural network. The model structure was chosen on the basis of a detailed thermodynamic analysis of the system. The parameters of the model were identified on a set of more than 1,600 mutations with experimentally measured ΔTm. The performance of our method was tested using a strict 5-fold cross-validation procedure, and was found to be significantly superior to that of competing methods. We obtained a root mean square deviation between predicted and experimental ΔTm values of 4.2 °C that reduces to 2.9 °C when ten percent outliers are removed. A webserver-based tool is freely available for non-commercial use at soft.dezyme.com.
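As a sketch of the strict 5-fold cross-validation used to score such a predictor, the snippet below reports a cross-validated RMSE. The features, synthetic ΔTm values and the small regressor are placeholders, not the paper's temperature-dependent statistical potentials or its network.

import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.standard_normal((1600, 8))            # placeholder mutation features
dTm = X @ rng.standard_normal(8) + 0.5 * rng.standard_normal(1600)

errors = []
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                         random_state=0).fit(X[train], dTm[train])
    errors.append(model.predict(X[test]) - dTm[test])

rmse = np.sqrt(np.mean(np.concatenate(errors) ** 2))
print(f"cross-validated RMSE: {rmse:.2f}")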
NASA Technical Reports Server (NTRS)
Rinehart, Stephen
2009-01-01
Astronomical studies at infrared wavelengths have dramatically improved our understanding of the universe, and observations with Spitzer, the upcoming Herschel mission, and SOFIA will continue to provide exciting new discoveries. The relatively low angular resolution of these missions, however, is insufficient to resolve the physical scale on which mid-to far-infrared emission arises, resulting in source and structure ambiguities that limit our ability to answer key science questions. Interferometry enables high angular resolution at these wavelengths - a powerful tool for scientific discovery. We will build the Balloon Experimental Twin Telescope for Infrared Interferometry (BETTII), an eight-meter baseline Michelson stellar interferometer to fly on a high-altitude balloon. BETTII's spectral-spatial capability, provided by an instrument using double-Fourier techniques, will address key questions about the nature of disks in young star clusters and active galactic nuclei and the envelopes of evolved stars. BETTII will also lay the technological groundwork for future space interferometers and for suborbital programs optimized for studying extrasolar planets.
CCTop: An Intuitive, Flexible and Reliable CRISPR/Cas9 Target Prediction Tool
del Sol Keyer, Maria; Wittbrodt, Joachim; Mateo, Juan L.
2015-01-01
Engineering of the CRISPR/Cas9 system has opened a plethora of new opportunities for site-directed mutagenesis and targeted genome modification. Fundamental to this is a stretch of twenty nucleotides at the 5’ end of a guide RNA that provides specificity to the bound Cas9 endonuclease. Since a sequence of twenty nucleotides can occur multiple times in a given genome and some mismatches seem to be accepted by the CRISPR/Cas9 complex, an efficient and reliable in silico selection and evaluation of the targeting site is a key prerequisite for experimental success. Here we present the CRISPR/Cas9 target online predictor (CCTop, http://crispr.cos.uni-heidelberg.de) to overcome limitations of already available tools. CCTop provides an intuitive user interface with reasonable default parameters that can easily be tuned by the user. From a given query sequence, CCTop identifies and ranks all candidate sgRNA target sites according to their off-target quality and displays full documentation. CCTop was experimentally validated for gene inactivation, non-homologous end-joining as well as homology-directed repair. Thus, CCTop provides the bench biologist with a tool for the rapid and efficient identification of high-quality target sites. PMID:25909470
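A toy version of the underlying search: scan a sequence for 20-nt protospacers followed by an NGG PAM, then count near-matches elsewhere as potential off-targets. CCTop's real scoring weights mismatch positions and uses a genome index; this sketch only counts Hamming-distance hits (and the count includes the on-target site itself).

import re

def candidate_sites(seq):
    # Yield (position, 20-nt protospacer) for every NGG PAM in seq.
    for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", seq):
        yield m.start(), m.group(1)

def offtarget_count(target, seq, max_mismatch=3):
    # Count windows of seq within max_mismatch of target (naive scan).
    hits = 0
    for i in range(len(seq) - len(target) + 1):
        window = seq[i:i + len(target)]
        if sum(a != b for a, b in zip(target, window)) <= max_mismatch:
            hits += 1
    return hits

genome = "ATGCAGG" * 300  # placeholder sequence
for pos, spacer in list(candidate_sites(genome))[:3]:
    print(pos, spacer, offtarget_count(spacer, genome))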
Schultz, Simon R; Copeland, Caroline S; Foust, Amanda J; Quicke, Peter; Schuck, Renaud
2017-01-01
Recent years have seen substantial developments in technology for imaging neural circuits, raising the prospect of large scale imaging studies of neural populations involved in information processing, with the potential to lead to step changes in our understanding of brain function and dysfunction. In this article we will review some key recent advances: improved fluorophores for single cell resolution functional neuroimaging using a two photon microscope; improved approaches to the problem of scanning active circuits; and the prospect of scanless microscopes which overcome some of the bandwidth limitations of current imaging techniques. These advances in technology for experimental neuroscience have in themselves led to technical challenges, such as the need for the development of novel signal processing and data analysis tools in order to make the most of the new experimental tools. We review recent work in some active topics, such as region of interest segmentation algorithms capable of demixing overlapping signals, and new highly accurate algorithms for calcium transient detection. These advances motivate the development of new data analysis tools capable of dealing with spatial or spatiotemporal patterns of neural activity, that scale well with pattern size.
Kawano, Shin; Watanabe, Tsutomu; Mizuguchi, Sohei; Araki, Norie; Katayama, Toshiaki; Yamaguchi, Atsuko
2014-07-01
TogoTable (http://togotable.dbcls.jp/) is a web tool that adds user-specified annotations to a table that a user uploads. Annotations are drawn from several biological databases that use the Resource Description Framework (RDF) data model. TogoTable uses database identifiers (IDs) in the table as a query key for searching. RDF data, which form a network called Linked Open Data (LOD), can be searched from SPARQL endpoints using a SPARQL query language. Because TogoTable uses RDF, it can integrate annotations from not only the reference database to which the IDs originally belong, but also externally linked databases via the LOD network. For example, annotations in the Protein Data Bank can be retrieved using GeneID through links provided by the UniProt RDF. Because RDF has been standardized by the World Wide Web Consortium, any database with annotations based on the RDF data model can be easily incorporated into this tool. We believe that TogoTable is a valuable Web tool, particularly for experimental biologists who need to process huge amounts of data such as high-throughput experimental output. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
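As an illustration of the kind of LOD lookup TogoTable performs, the sketch below queries the UniProt SPARQL endpoint using the SPARQLWrapper library. The endpoint URL is real, but the query is a simplified assumption about the UniProt RDF schema (the up:Protein class and up:mnemonic property), not what TogoTable itself issues.

from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://sparql.uniprot.org/sparql")
sparql.setQuery("""
PREFIX up: <http://purl.uniprot.org/core/>
SELECT ?protein ?mnemonic WHERE {
  ?protein a up:Protein ;
           up:mnemonic ?mnemonic .
} LIMIT 5
""")
sparql.setReturnFormat(JSON)

# Each result row maps variable names to {'type': ..., 'value': ...}.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["protein"]["value"], row["mnemonic"]["value"])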
Predicting cancerlectins by the optimal g-gap dipeptides
NASA Astrophysics Data System (ADS)
Lin, Hao; Liu, Wei-Xin; He, Jiao; Liu, Xin-Hui; Ding, Hui; Chen, Wei
2015-12-01
Cancerlectins play a key role in the process of tumor cell differentiation. Thus, fully understanding the function of cancerlectins is important because it sheds light on future directions for cancer therapy. However, traditional wet-lab experimental methods are costly and time-consuming, so it is highly desirable to develop an effective and efficient computational tool to identify cancerlectins. In this study, we developed a sequence-based method to discriminate between cancerlectins and non-cancerlectins. The analysis of variance (ANOVA) was used to choose the optimal feature set derived from the g-gap dipeptide composition. The jackknife cross-validated results showed that the proposed method achieved an accuracy of 75.19%, which is superior to other published methods. For the convenience of other researchers, an online web server, CaLecPred, was established and can be freely accessed at http://lin.uestc.edu.cn/server/CalecPred. We believe that CaLecPred is a powerful tool to study cancerlectins and to guide the related experimental validations.
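The g-gap dipeptide composition behind the feature set is simple to compute: for a gap g, count every ordered residue pair separated by g intervening positions and normalize by the number of such pairs. A minimal sketch (the example sequence is arbitrary):

from itertools import product

AA = "ACDEFGHIKLMNPQRSTVWY"

def g_gap_dipeptide_composition(seq, g):
    # Frequency of each of the 400 pairs (a, b) with b located
    # g + 1 positions after a in the sequence.
    pairs = [(seq[i], seq[i + g + 1]) for i in range(len(seq) - g - 1)]
    return {a + b: pairs.count((a, b)) / len(pairs) for a, b in product(AA, AA)}

comp = g_gap_dipeptide_composition("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", g=2)
print(comp["KQ"], round(sum(comp.values()), 6))  # one feature; frequencies sum to 1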
A survey of urban climate change experiments in 100 cities
Castán Broto, Vanesa; Bulkeley, Harriet
2013-01-01
Cities are key sites where climate change is being addressed. Previous research has largely overlooked the multiplicity of climate change responses emerging outside formal contexts of decision-making and led by actors other than municipal governments. Moreover, existing research has largely focused on case studies of climate change mitigation in developed economies. The objective of this paper is to uncover the heterogeneous mix of actors, settings, governance arrangements and technologies involved in the governance of climate change in cities in different parts of the world. The paper focuses on urban climate change governance as a process of experimentation. Climate change experiments are presented here as interventions to try out new ideas and methods in the context of future uncertainties. They serve to understand how interventions work in practice, in new contexts where they are thought of as innovative. To study experimentation, the paper presents evidence from the analysis of a database of 627 urban climate change experiments in a sample of 100 global cities. The analysis suggests that, since 2005, experimentation is a feature of urban responses to climate change across different world regions and multiple sectors. Although experimentation does not appear to be related to particular kinds of urban economic and social conditions, some of its core features are visible. For example, experimentation tends to focus on energy. Also, both social and technical forms of experimentation are visible, but technical experimentation is more common in urban infrastructure systems. While municipal governments have a critical role in climate change experimentation, they often act alongside other actors and in a variety of forms of partnership. These findings point to experimentation as a key tool to open up new political spaces for governing climate change in the city. PMID:23805029
GenoBase: comprehensive resource database of Escherichia coli K-12.
Otsuka, Yuta; Muto, Ai; Takeuchi, Rikiya; Okada, Chihiro; Ishikawa, Motokazu; Nakamura, Koichiro; Yamamoto, Natsuko; Dose, Hitomi; Nakahigashi, Kenji; Tanishima, Shigeki; Suharnan, Sivasundaram; Nomura, Wataru; Nakayashiki, Toru; Aref, Walid G; Bochner, Barry R; Conway, Tyrrell; Gribskov, Michael; Kihara, Daisuke; Rudd, Kenneth E; Tohsato, Yukako; Wanner, Barry L; Mori, Hirotada
2015-01-01
Comprehensive experimental resources, such as ORFeome clone libraries and deletion mutant collections, are fundamental tools for elucidation of gene function. Data sets from omics analysis using these resources provide key information for functional analysis, modeling and simulation in both individual and systematic approaches. With the long-term goal of complete understanding of a cell, we have over the past decade created a variety of clone and mutant sets for functional genomics studies of Escherichia coli K-12. We have made these experimental resources freely available to the academic community worldwide. Accordingly, these resources have now been used in numerous investigations of a multitude of cell processes. Quality control is extremely important for evaluating results generated by these resources. Because the genome annotation we originally used for their construction has changed since 2005, we have updated these genomic resources accordingly. Here, we describe GenoBase (http://ecoli.naist.jp/GB/), which contains key information about comprehensive experimental resources of E. coli K-12, their quality control and several omics data sets generated using these resources. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
Virtual laboratories: new opportunities for collaborative water science
NASA Astrophysics Data System (ADS)
Ceola, Serena; Arheimer, Berit; Bloeschl, Guenter; Baratti, Emanuele; Capell, Rene; Castellarin, Attilio; Freer, Jim; Han, Dawei; Hrachowitz, Markus; Hundecha, Yeshewatesfa; Hutton, Christopher; Lindström, Goran; Montanari, Alberto; Nijzink, Remko; Parajka, Juraj; Toth, Elena; Viglione, Alberto; Wagener, Thorsten
2015-04-01
Reproducibility and repeatability of experiments are the fundamental prerequisites that allow researchers to validate results and share hydrological knowledge, experience and expertise in the light of global water management problems. Virtual laboratories offer new opportunities to enable these prerequisites since they allow experimenters to share data, tools and pre-defined experimental procedures (i.e. protocols). Here we present the outcomes of a first collaborative numerical experiment undertaken by five different international research groups in a virtual laboratory to address the key issues of reproducibility and repeatability. Moving from the definition of accurate and detailed experimental protocols, a rainfall-runoff model was independently applied to 15 European catchments by the research groups and model results were collectively examined through a web-based discussion. We found that a detailed modelling protocol was crucial to ensure the comparability and reproducibility of the proposed experiment across groups. Our results suggest that sharing comprehensive and precise protocols and running the experiments within a controlled environment (e.g. virtual laboratory) is as fundamental as sharing data and tools for ensuring experiment repeatability and reproducibility across the broad scientific community and thus advancing hydrology in a more coherent way.
Dishevelled links basal body docking and orientation in ciliated epithelial cells
Vladar, Eszter K.; Axelrod, Jeffrey D.
2014-01-01
Some epithelia contain cells with multiple, motile cilia that beat in a concerted fashion. New tools and experimental systems have facilitated molecular studies of cilium biogenesis and of the coordinated planar polarization of cilia that leads to their concerted motility. Recent, elegant work by Park and colleagues, using embryonic frog epidermis, demonstrates that Dishevelled (Dvl), a key regulator of both the Wnt/β-catenin and Planar Cell Polarity (PCP) pathways, controls both the docking and planar polarization of ciliary basal bodies. PMID:18819800
MODELING MICROBUBBLE DYNAMICS IN BIOMEDICAL APPLICATIONS*
CHAHINE, Georges L.; HSIAO, Chao-Tsung
2012-01-01
Controlling microbubble dynamics to produce desirable biomedical outcomes when and where necessary, and to avoid deleterious effects, requires advanced knowledge, which can be achieved only through a combination of experimental and numerical/analytical techniques. The present communication presents a multi-physics approach to study the dynamics, combining viscous-inviscid effects, liquid and structure dynamics, and multi-bubble interaction. While complex numerical tools are developed and used, the study aims at identifying the key parameters influencing the dynamics, which need to be included in simpler models. PMID:22833696
2008-12-01
tools capable of reducing fratricide and collateral damage. The theory of recognition-by-components developed by Dr. Irving Biederman presented a...trainer. The key to thermal combat identification was discovered in an unusual place: chick sexing. Biederman and Shiffrar [11] conducted object...professional sexers was .82. Biederman and Shiffrar conclude that “…after instruction the performance of the naïve subjects more closely resemble that of the
Some aspects of precise laser machining - Part 2: Experimental
NASA Astrophysics Data System (ADS)
Grabowski, Marcin; Wyszynski, Dominik; Ostrowski, Robert
2018-05-01
The paper describes the role of laser beam polarization in the quality of laser-beam-machined cutting tool edges. In micromachining, the preparation of the cutting tools plays a key role in dimensional accuracy, sharpness and the quality of the cutting edges. In order to assure quality and dimensional accuracy of the cutting tool edge, it is necessary to apply laser polarization control. A diode-pumped, pulsed Nd:YAG laser at 532 nm was used in the research. The laser beam polarization was linear (horizontal or vertical). The goal of the research was to describe the impact of laser beam polarization on the efficiency of the cutting process and the quality of machined parts (edge, surface) made of polycrystalline diamond (PCD) and cubic boron nitride (cBN). The application of precise cutting tools in micromachining has a significant impact on the minimum uncut chip thickness and the quality of the parts. The research was carried out within the INNOLOT program funded by the National Centre for Research and Development.
The Diesel Combustion Collaboratory: Combustion Researchers Collaborating over the Internet
DOE Office of Scientific and Technical Information (OSTI.GOV)
C. M. Pancerella; L. A. Rahn; C. Yang
2000-02-01
The Diesel Combustion Collaboratory (DCC) is a pilot project to develop and deploy collaborative technologies to combustion researchers distributed throughout the DOE national laboratories, academia, and industry. The result is a problem-solving environment for combustion research. Researchers collaborate over the Internet using DCC tools, which include: a distributed execution management system for running combustion models on widely distributed computers, including supercomputers; web-accessible data archiving capabilities for sharing graphical experimental or modeling data; electronic notebooks and shared workspaces for facilitating collaboration; visualization of combustion data; and video-conferencing and data-conferencing among researchers at remote sites. Security is a key aspect of the collaborative tools. In many cases, the authors have integrated these tools to allow data, including large combustion data sets, to flow seamlessly, for example, from modeling tools to data archives. In this paper the authors describe the work of a larger collaborative effort to design, implement and deploy the DCC.
[Coffee as a hepatoprotective factor].
Szántová, Mária; Ďurkovičová, Zuzana
Views on coffee have changed in light of recent studies and meta-analyses of the last years. A consensual protective effect of coffee on the progression of chronic liver diseases (NASH, viral hepatitis, liver cirrhosis, hepatocellular carcinoma), together with a decrease in mortality, has been detected in experimental, clinical and large population studies. Antioxidant, antifibrotic, insulin-sensitizing and anticarcinogenic effects of coffee have been detected. The main mechanisms are modulation of the gene expression of key enzymes of fatty acid synthesis, modulation of mRNAs involved in autophagy, and reduction of endoplasmic reticulum stress, together with a decrease in proinflammatory cytokines and in fibrogenesis. Chlorogenic acids, diterpenes (cafestol, kahweol), caffeine, polyphenols and melanoidins are the key protective components of coffee. An inverse dose-dependent correlation of coffee consumption with liver diseases was found in clinical and population studies. Coffee is a non-pharmacological tool for the primary and secondary prevention of chronic liver diseases. A review of the published data together with the proposed mechanisms of hepatoprotection is given. Key words: coffee - hepatoprotective effect - meta-analysis.
The evolution of tumour phylogenetics: principles and practice
Schwartz, Russell; Schäffer, Alejandro A.
2018-01-01
Rapid advances in high-throughput sequencing and a growing realization of the importance of evolutionary theory to cancer genomics have led to a proliferation of phylogenetic studies of tumour progression. These studies have yielded not only new insights but also a plethora of experimental approaches, sometimes reaching conflicting or poorly supported conclusions. Here, we consider this body of work in light of the key computational principles underpinning phylogenetic inference, with the goal of providing practical guidance on the design and analysis of scientifically rigorous tumour phylogeny studies. We survey the range of methods and tools available to the researcher, their key applications, and the various unsolved problems, closing with a perspective on the prospects and broader implications of this field. PMID:28190876
Kazaura, Kamugisha; Omae, Kazunori; Suzuki, Toshiji; Matsumoto, Mitsuji; Mutafungwa, Edward; Korhonen, Timo O; Murakami, Tadaaki; Takahashi, Koichi; Matsumoto, Hideki; Wakamori, Kazuhiko; Arimoto, Yoshinori
2006-06-12
The deterioration and deformation of a free-space optical beam wave-front as it propagates through the atmosphere can reduce link availability and may introduce burst errors, thus degrading system performance. We investigate the suitability of soft-computing (SC) based tools for improving the performance of free-space optical (FSO) communications systems. The SC based tools are used for the prediction of key parameters of an FSO communications system. Measured data collected from an experimental FSO communication system are used as training and testing data for a proposed multi-layer neural network predictor (MNNP) used to predict future parameter values. The predicted parameters are essential for reducing transmission errors by improving the antenna's accuracy in tracking data beams, particularly during periods of strong atmospheric turbulence. The parameter values predicted using the proposed tool show acceptable conformity with the original measurements.
New educational tools to encourage high-school students' activity in stem
NASA Astrophysics Data System (ADS)
Mayorova, Vera; Grishko, Dmitriy; Leonov, Victor
2018-01-01
Many students have to choose their future profession during their last years in high school and therefore choose a university where they will get a proper education. That choice may define their professional life for many years ahead, or probably for the rest of their lives. Bauman Moscow State Technical University conducts various events to introduce future professions to high-school students. Such activity helps them to pick a specialization in line with their interests and motivates them to study key scientific subjects. The paper focuses on newly developed educational tools to encourage high-school students' interest in STEM disciplines. These tools include laboratory courses developed in the fields of physics, information technologies and mathematics. More than 2000 high-school students have already participated in these experimental courses. These activities are aimed at improving the quality of STEM learning, which will result in better training of future engineers.
An Atlas of annotations of Hydra vulgaris transcriptome.
Evangelista, Daniela; Tripathi, Kumar Parijat; Guarracino, Mario Rosario
2016-09-22
RNA sequencing takes advantage of Next Generation Sequencing (NGS) technologies for analyzing RNA transcript counts with excellent accuracy. Interpreting this huge amount of data as biological information is still a key issue, which makes the creation of web resources for its analysis highly desirable. Building on a previous work, Transcriptator, we present the Atlas of Hydra vulgaris, an extensible web tool in which its complete transcriptome is annotated. In order to provide users with an advantageous resource that includes the whole functionally annotated transcriptome of the Hydra vulgaris water polyp, we implemented the Atlas web tool, which contains 31,988 accessible and downloadable transcripts of this non-reference model organism. Atlas, as a freely available resource, can be considered a valuable tool to rapidly retrieve functional annotation for transcripts differentially expressed in Hydra vulgaris exposed to distinct experimental treatments. WEB RESOURCE URL: http://www-labgtp.na.icar.cnr.it/Atlas .
Improving human activity recognition and its application in early stroke diagnosis.
Villar, José R; González, Silvia; Sedano, Javier; Chira, Camelia; Trejo-Gabriel-Galan, Jose M
2015-06-01
The development of efficient stroke-detection methods is of significant importance in today's society due to the effects and impact of stroke on health and economy worldwide. This study focuses on Human Activity Recognition (HAR), which is a key component in developing an early stroke-diagnosis tool. The proposed global approach, able to discriminate normal resting from stroke-related paralysis, is detailed. The main contributions include an extension of the Genetic Fuzzy Finite State Machine (GFFSM) method and a new hybrid feature selection (FS) algorithm that combines Principal Component Analysis (PCA) with a voting scheme aggregating the cross-validation results. Experimental results show that the proposed approach is a well-performing HAR tool that can be successfully embedded in devices.
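The abstract does not spell out the hybrid FS scheme, so the sketch below is only a plausible reconstruction of a PCA-plus-voting selector: within each cross-validation fold, features are scored by the magnitude of their loadings on the leading principal components, and each feature earns one vote per fold in which it ranks in the top k; the most-voted features are kept.

import numpy as np
from collections import Counter
from sklearn.decomposition import PCA
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 12))        # placeholder accelerometer features

k, votes = 4, Counter()
for train, _ in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    pca = PCA(n_components=3).fit(X[train])
    # Score each feature by its summed absolute loading on the top PCs.
    scores = np.abs(pca.components_).sum(axis=0)
    votes.update(np.argsort(scores)[-k:].tolist())

selected = [feature for feature, _ in votes.most_common(k)]
print("selected features:", selected)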
Data management routines for reproducible research using the G-Node Python Client library
Sobolev, Andrey; Stoewer, Adrian; Pereira, Michael; Kellner, Christian J.; Garbers, Christian; Rautenberg, Philipp L.; Wachtler, Thomas
2014-01-01
Structured, efficient, and secure storage of experimental data and associated meta-information constitutes one of the most pressing technical challenges in modern neuroscience, and does so particularly in electrophysiology. The German INCF Node aims to provide open-source solutions for this domain that support the scientific data management and analysis workflow, and thus facilitate future data access and reproducible research. G-Node provides a data management system, accessible through an application interface, that is based on a combination of standardized data representation and flexible data annotation to account for the variety of experimental paradigms in electrophysiology. The G-Node Python Library exposes these services to the Python environment, enabling researchers to organize and access their experimental data using their familiar tools while gaining the advantages that a centralized storage entails. The library provides powerful query features, including data slicing and selection by metadata, as well as fine-grained permission control for collaboration and data sharing. Here we demonstrate key actions in working with experimental neuroscience data, such as building a metadata structure, organizing recorded data in datasets, annotating data, or selecting data regions of interest, that can be automated to large degree using the library. Compliant with existing de-facto standards, the G-Node Python Library is compatible with many Python tools in the field of neurophysiology and thus enables seamless integration of data organization into the scientific data workflow. PMID:24634654
RNA-SeQC: RNA-seq metrics for quality control and process optimization.
DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad
2012-06-01
RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
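Two of the listed metrics are easy to illustrate directly on read sequences. The sketch below computes a duplication rate and GC content for a few placeholder reads; it illustrates the definitions only, not RNA-SeQC's implementation, which operates on aligned BAM files.

from collections import Counter

reads = ["ACGTACGTGG", "ACGTACGTGG", "TTGCAGGCAT", "GGGCCCATAT"]  # placeholders

# Duplication rate: fraction of reads whose sequence was already seen.
dup_rate = 1 - len(Counter(reads)) / len(reads)

# GC content: fraction of G/C bases over all reads.
bases = "".join(reads)
gc_content = sum(base in "GC" for base in bases) / len(bases)

print(f"duplication rate: {dup_rate:.2f}, GC content: {gc_content:.2f}")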
The future challenge for aeropropulsion
NASA Technical Reports Server (NTRS)
Rosen, Robert; Bowditch, David N.
1992-01-01
NASA's research in aeropropulsion is focused on improving the efficiency, capability, and environmental compatibility for all classes of future aircraft. The development of innovative concepts, and theoretical, experimental, and computational tools provide the knowledge base for continued propulsion system advances. Key enabling technologies include advances in internal fluid mechanics, structures, light-weight high-strength composite materials, and advanced sensors and controls. Recent emphasis has been on the development of advanced computational tools in internal fluid mechanics, structural mechanics, reacting flows, and computational chemistry. For subsonic transport applications, very high bypass ratio turbofans with increased engine pressure ratio are being investigated to increase fuel efficiency and reduce airport noise levels. In a joint supersonic cruise propulsion program with industry, the critical environmental concerns of emissions and community noise are being addressed. NASA is also providing key technologies for the National Aerospaceplane, and is studying propulsion systems that provide the capability for aircraft to accelerate to and cruise in the Mach 4-6 speed range. The combination of fundamental, component, and focused technology development underway at NASA will make possible dramatic advances in aeropropulsion efficiency and environmental compatibility for future aeronautical vehicles.
Development of a Scale-up Tool for Pervaporation Processes
Thiess, Holger; Strube, Jochen
2018-01-01
In this study, an engineering tool for the design and optimization of pervaporation processes is developed based on physico-chemical modelling coupled with laboratory/mini-plant experiments. The model incorporates the solution-diffusion-mechanism, polarization effects (concentration and temperature), axial dispersion, pressure drop and the temperature drop in the feed channel due to vaporization of the permeating components. The permeance, being the key model parameter, was determined via dehydration experiments on a mini-plant scale for the binary mixtures ethanol/water and ethyl acetate/water. A second set of experimental data was utilized for the validation of the model for two chemical systems. The industrially relevant ternary mixture, ethanol/ethyl acetate/water, was investigated close to its azeotropic point and compared to a simulation conducted with the determined binary permeance data. Experimental and simulation data proved to agree very well for the investigated process conditions. In order to test the scalability of the developed engineering tool, large-scale data from an industrial pervaporation plant used for the dehydration of ethanol was compared to a process simulation conducted with the validated physico-chemical model. Since the membranes employed in both mini-plant and industrial scale were of the same type, the permeance data could be transferred. The comparison of the measured and simulated data proved the scalability of the derived model. PMID:29342956
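At the core of the solution-diffusion description is a partial flux driven by the difference between the partial vapor pressure on the feed side and the partial pressure on the permeate side, scaled by the permeance. A minimal sketch with invented numbers for a water/ethanol-style dehydration case:

def partial_flux(Q, x, gamma, p_sat, y, p_perm):
    # Solution-diffusion flux J_i = Q_i * (x_i * gamma_i * p_sat_i - y_i * p_perm),
    # with permeance Q_i, feed/permeate mole fractions x_i/y_i,
    # activity coefficient gamma_i, and pressures in consistent units.
    return Q * (x * gamma * p_sat - y * p_perm)

# Illustrative (invented) values for water through a hydrophilic membrane:
J_water = partial_flux(Q=0.05, x=0.10, gamma=2.3, p_sat=200.0, y=0.95, p_perm=20.0)
print(f"water partial flux: {J_water:.2f} (arbitrary units)")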
NASA Astrophysics Data System (ADS)
Persico, Marco; Fattorusso, Roberto; Taglialatela-Scafati, Orazio; Chianese, Giuseppina; de Paola, Ivan; Zaccaro, Laura; Rondinelli, Francesca; Lombardo, Marco; Quintavalla, Arianna; Trombini, Claudio; Fattorusso, Ernesto; Fattorusso, Caterina; Farina, Biancamaria
2017-04-01
In the present work we performed a combined experimental and computational study on the interaction of the natural antimalarial endoperoxide plakortin and its synthetic analogue 4a with heme. Obtained results indicate that the studied compounds produce reactive carbon radical species after being reductively activated by heme. In particular, similarly to artemisinin, the formation of radicals prone to inter-molecular reactions should represent the key event responsible for Plasmodium death. To our knowledge this is the first experimental investigation on the reductive activation of simple antimalarial endoperoxides (1,2-dioxanes) by heme and results were compared to the ones previously obtained from the reaction with FeCl2. The obtained experimental data and the calculated molecular interaction models represent crucial tools for the rational optimization of our promising class of low-cost synthetic antimalarial endoperoxides.
mHealth for HIV Treatment & Prevention: A Systematic Review of the Literature
Catalani, Caricia; Philbrick, William; Fraser, Hamish; Mechael, Patricia; Israelski, Dennis M.
2013-01-01
This systematic review assesses the published literature to describe the landscape of mobile health technology (mHealth) for HIV/AIDS and the evidence supporting the use of these tools to address the HIV prevention, care, and treatment cascade. The speed of innovation, broad range of initiatives and tools, and heterogeneity in reporting have made it difficult to uncover and synthesize knowledge on how mHealth tools might be effective in addressing the HIV pandemic. To address this gap, a team of reviewers collected literature on the use of mobile technology for HIV/AIDS among health, engineering, and social science literature databases and analyzed a final set of 62 articles. Articles were systematically coded, assessed for scientific rigor, and sorted for HIV programmatic relevance. The review revealed evidence that mHealth tools support HIV programmatic priorities, including: linkage to care, retention in care, and adherence to antiretroviral treatment. In terms of technical features, mHealth tools facilitate alerts and reminders, data collection, direct voice communication, educational messaging, information on demand, and more. Studies were mostly descriptive, with a growing number of quasi-experimental and experimental designs. There was a lack of evidence around the use of mHealth tools to address the needs of key populations, including pregnant mothers, sex workers, users of injection drugs, and men who have sex with men. The science and practice of mHealth for HIV are evolving rapidly, but are still in their early stages. Small-scale efforts, pilot projects, and preliminary descriptive studies are advancing, and there is a promising trend toward implementing mHealth innovation that is feasible and acceptable within low-resource settings, positive program outcomes, operational improvements, and rigorous study design. PMID:24133558
Principles for valid histopathologic scoring in research
Gibson-Corley, Katherine N.; Olivier, Alicia K.; Meyerholz, David K.
2013-01-01
Histopathologic scoring is a tool by which semi-quantitative data can be obtained from tissues. Initially, a thorough understanding of the experimental design, study objectives and methods are required to allow the pathologist to appropriately examine tissues and develop lesion scoring approaches. Many principles go into the development of a scoring system such as tissue examination, lesion identification, scoring definitions and consistency in interpretation. Masking (a.k.a. “blinding”) of the pathologist to experimental groups is often necessary to constrain bias and multiple mechanisms are available. Development of a tissue scoring system requires appreciation of the attributes and limitations of the data (e.g. nominal, ordinal, interval and ratio data) to be evaluated. Incidence, ordinal and rank methods of tissue scoring are demonstrated along with key principles for statistical analyses and reporting. Validation of a scoring system occurs through two principal measures: 1) validation of repeatability and 2) validation of tissue pathobiology. Understanding key principles of tissue scoring can help in the development and/or optimization of scoring systems so as to consistently yield meaningful and valid scoring data. PMID:23558974
Surgical tool detection and tracking in retinal microsurgery
NASA Astrophysics Data System (ADS)
Alsheakhali, Mohamed; Yigitsoy, Mehmet; Eslami, Abouzar; Navab, Nassir
2015-03-01
Visual tracking of surgical instruments is an essential part of eye surgery and plays an important role for surgeons, as well as being a key component of robotic assistance during the operation. The difficulty of detecting and tracking medical instruments in in-vivo images comes from their deformable shape, changes in brightness, and the presence of the instrument shadow. This paper introduces a new approach to detect the tip of a surgical tool and its width regardless of its head shape and the presence of shadows or vessels. The approach relies on integrating structural information about the strong edges from the RGB color model and tool location-based information from the L*a*b color model. The probabilistic Hough transform is applied to get the strongest straight lines in the RGB images, and based on information from the L* and a* channels, one of these candidate lines is selected as the edge of the tool shaft. Based on that line, the tool slope, the tool centerline and the tool tip can be detected. Tracking is performed by keeping track of the last detected tool tip and tool slope, and filtering the Hough lines within a box around the last detected tool tip based on the slope differences. Experimental results demonstrate the high accuracy achieved in terms of detecting the tool tip position, the tool joint point position, and the tool centerline. The approach also meets real-time requirements.
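A skeletal version of the line-candidate stage with OpenCV: edge detection on the RGB frame feeds a probabilistic Hough transform, and the L* and a* channels are then available for filtering the candidates. The image path and all thresholds are arbitrary placeholders, and the paper's actual selection criterion is left as a comment.

import cv2
import numpy as np

frame = cv2.imread("frame.png")                    # placeholder image path

# Strong straight-line candidates from the RGB frame.
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=60, maxLineGap=5)

# L*a*b channels for location-based filtering of the candidates.
L, a, _ = cv2.split(cv2.cvtColor(frame, cv2.COLOR_BGR2LAB))

for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
    slope = np.arctan2(y2 - y1, x2 - x1)
    # ... score this candidate from L/a statistics along the segment,
    # keep the best line as the shaft edge, then derive tip and centerline.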
Li, Hongyu; Walker, David; Yu, Guoyu; Sayle, Andrew; Messelink, Wilhelmus; Evans, Rob; Beaucamp, Anthony
2013-01-14
Edge mis-figure is regarded as one of the most difficult technical issues in manufacturing the segments of extremely large telescopes, and it can dominate key aspects of performance. A novel edge-control technique has been developed, based on the 'Precessions' polishing technique, for which accurate and stable edge tool influence functions (TIFs) are crucial. In the first paper in this series [D. Walker, Opt. Express 20, 19787-19798 (2012)], multiple parameters were experimentally optimized using an extended set of experiments. The first purpose of this new work is to 'short circuit' this procedure through modeling. This also gives the prospect of optimizing local (as distinct from global) polishing for edge mis-figure, now under separate development. This paper presents a model that can predict edge TIFs based on surface-speed profiles and pressure distributions over the polishing spot at the edge of the part, the latter calculated by finite element analysis and verified by direct force measurement. This paper also presents a hybrid-measurement method for edge TIFs to verify the simulation results. Experimental and simulation results show good agreement.
Tang, Hua; Chen, Wei; Lin, Hao
2016-04-01
Immunoglobulins, also called antibodies, are a group of cell surface proteins that are produced by the immune system in response to the presence of a foreign substance (called an antigen). They play key roles in many medical, diagnostic and biotechnological applications. Correct identification of immunoglobulins is crucial to the comprehension of humoral immune function. With the avalanche of protein sequences identified in the postgenomic age, it is highly desirable to develop computational methods to identify immunoglobulins in a timely manner. In view of this, we designed a predictor called "IGPred" by formulating protein sequences with the pseudo amino acid composition, into which nine physiochemical properties of amino acids were incorporated. Jackknife cross-validated results showed that 96.3% of immunoglobulins and 97.5% of non-immunoglobulins can be correctly predicted, indicating that IGPred holds very high potential to become a useful tool for antibody analysis. For the convenience of experimental scientists, a web server for IGPred was established at http://lin.uestc.edu.cn/server/IGPred. We believe that the web server will become a powerful tool to study immunoglobulins and to guide related experimental validations.
A bioinformatics expert system linking functional data to anatomical outcomes in limb regeneration
Lobo, Daniel; Feldman, Erica B.; Shah, Michelle; Malone, Taylor J.
2014-01-01
Amphibians and molting arthropods have the remarkable capacity to regenerate amputated limbs, as described by an extensive literature of experimental cuts, amputations, grafts, and molecular techniques. Despite a rich history of experimental effort, no comprehensive mechanistic model exists that can account for the pattern regulation observed in these experiments. While bioinformatics algorithms have revolutionized the study of signaling pathways, no such tools have heretofore been available to assist scientists in formulating testable models of large-scale morphogenesis that match published data in the limb regeneration field. Major barriers preventing an algorithmic approach are the lack of formal descriptions for experimental regenerative information and of a repository to centralize storage and mining of functional data on limb regeneration. Establishing a new bioinformatics of shape would significantly accelerate the discovery of key insights into the mechanisms that implement complex regeneration. Here, we describe a novel mathematical ontology for limb regeneration to unambiguously encode phenotype, manipulation, and experiment data. Based on this formalism, we present the first centralized formal database of published limb regeneration experiments together with a user-friendly expert system tool to facilitate its access and mining. These resources are freely available for the community and will assist both human biologists and artificial intelligence systems to discover testable, mechanistic models of limb regeneration. PMID:25729585
Fletcher, Jason M; Conley, Dalton
2013-10-01
The integration of genetics and the social sciences will lead to a more complex understanding of the articulation between social and biological processes, although the empirical difficulties inherent in this integration are large. One key challenge is the implications of moving "outside the lab" and away from the experimental tools available for research with model organisms. Social science research methods used to examine human behavior in nonexperimental, real-world settings to date have not been fully taken advantage of during this disciplinary integration, especially in the form of gene-environment interaction research. This article outlines and provides examples of several prominent research designs that should be used in gene-environment research and highlights a key benefit to geneticists of working with social scientists.
Austvoll-Dahlgren, Astrid; Nsangi, Allen; Semakula, Daniel
2016-12-29
People's ability to appraise claims about treatment effects is crucial for informed decision-making. Our objective was to systematically map this area of research in order to (a) provide an overview of interventions targeting key concepts that people need to understand to assess treatment claims and (b) identify assessment tools used to evaluate people's understanding of these concepts. The findings of this review provide a starting point for decisions about which key concepts to address when developing new interventions and which assessment tools should be considered. We conducted a systematic mapping review of interventions and assessment tools addressing key concepts important for people to be able to assess treatment claims. A systematic literature search was done by a research librarian in relevant databases. Judgements about inclusion of studies and data collection were made by at least two researchers. We included all quantitative study designs targeting one or more of the key concepts, and targeting patients, healthy members of the public, or health professionals. The studies were divided into four categories: risk communication and decision aids, evidence-based medicine and critical appraisal, understanding of controlled trials, and science education. Findings were summarised descriptively. We included 415 studies; the interventions and assessment tools we identified covered only a handful of the key concepts. The most common key concepts in interventions were "Treatments usually have beneficial and harmful effects," "Treatment comparisons should be fair," "Compare like with like," and "Single studies can be misleading." A variety of assessment tools were identified, but only four assessment tools included 10 or more key concepts. There is great potential for developing learning and assessment tools targeting key concepts that people need to understand to assess claims about treatment effects. There is currently no instrument covering assessment of all these key concepts.
Manijak, Mieszko P; Nielsen, Henrik B
2011-06-11
Although systematic analysis of gene annotation is a powerful tool for interpreting gene expression data, it is sometimes blurred by incomplete gene annotation, missing expression responses of key genes, and secondary gene expression responses. These shortcomings may be partially circumvented by instead matching gene expression signatures to signatures of other experiments. To facilitate this we present the Functional Association Response by Overlap (FARO) server, which matches input signatures to a compendium of 242 gene expression signatures, extracted from more than 1700 Arabidopsis microarray experiments. We hereby present a publicly available tool for robust characterization of Arabidopsis gene expression experiments which can point to similar experimental factors in other experiments. The server is available at http://www.cbs.dtu.dk/services/faro/.
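The abstract does not spell out FARO's scoring details, but a common way to score the overlap between two gene signatures is a Fisher's exact test on the 2x2 overlap table; the sketch below, with made-up gene identifiers, illustrates that generic approach.

```python
from scipy.stats import fisher_exact

def signature_overlap(sig_a, sig_b, n_genes_total):
    """Score the overlap of two gene signatures with Fisher's exact test."""
    a, b = set(sig_a), set(sig_b)
    both = len(a & b)
    only_a = len(a) - both
    only_b = len(b) - both
    neither = n_genes_total - both - only_a - only_b
    _, p = fisher_exact([[both, only_a], [only_b, neither]],
                        alternative="greater")
    return both, p

# toy example: two small signatures drawn from a 25k-gene genome
overlap, p = signature_overlap(["AT1G01060", "AT5G61380", "AT2G46830"],
                               ["AT1G01060", "AT5G61380", "AT3G09600"],
                               n_genes_total=25000)
print(f"{overlap} shared genes, p = {p:.2e}")
```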
NASA Astrophysics Data System (ADS)
Gregorčič, Peter; Sedlaček, Marko; Podgornik, Bojan; Reif, Jürgen
2016-11-01
Laser-induced periodic surface structures (LIPSS) are produced on cold work tool steel by irradiation with a low number of picosecond laser pulses. As expected, the ripples, with a period of about 90% of the laser wavelength, are oriented perpendicular to the laser polarization. Subsequent irradiation with the polarization rotated by 45° or 90° results in a corresponding rotation of the ripples. This is visible already with the first pulse and becomes almost complete, erasing the previous orientation, after as few as three pulses. The phenomenon is observed not only for single-spot irradiation but also for writing long coherent traces. The experimental results strongly challenge the role of surface plasmon-polaritons as the predominant key to LIPSS formation.
“Elegant Tool” Delivers Genome-Level Science for Electrolytes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keith Arterburn
Now, a ‘disruptive, virtual scientific simulation tool’ delivers a new, genome-level investigation for electrolytes to develop better, more efficient batteries. Dr. Kevin Gering, an Idaho National Laboratory researcher, has developed the Advanced Electrolyte Model (AEM), a copyrighted molecular-based simulation tool that has been scientifically proven and validated using at least a dozen ‘real-world’ physical metrics. Nominated for the 2014 international R&D 100 Award, AEM revolutionizes electrolyte materials selection, optimizing combinations and key design elements to make battery design and experimentation quick, accurate and responsive to specific needs.
A Perspective on DNA Microarrays in Pathology Research and Practice
Pollack, Jonathan R.
2007-01-01
DNA microarray technology matured in the mid-1990s, and the past decade has witnessed a tremendous growth in its application. DNA microarrays have provided powerful tools for pathology researchers seeking to describe, classify, and understand human disease. There has also been great expectation that the technology would advance the practice of pathology. This review highlights some of the key contributions of DNA microarrays to experimental pathology, focusing on the area of cancer research. Also discussed are some of the current challenges in translating utility to clinical practice. PMID:17600117
(−) Arctigenin and (+) Pinoresinol Are Antagonists of the Human Thyroid Hormone Receptor β
2015-01-01
Lignans are important biologically active dietary polyphenolic compounds. Consumption of foods that are rich in lignans is associated with positive health effects. Using modeling tools to probe the ligand-binding pockets of molecular receptors, we found that lignans have high docking affinity for the human thyroid hormone receptor β. Follow-up experimental results show that lignans (−) arctigenin and (+) pinoresinol are antagonists of the human thyroid hormone receptor β. The modeled complexes show key plausible interactions between the two ligands and important amino acid residues of the receptor. PMID:25383984
An Update on Design Tools for Optimization of CMC 3D Fiber Architectures
NASA Technical Reports Server (NTRS)
Lang, J.; DiCarlo, J.
2012-01-01
Objective: Describe and update progress on NASA's efforts to develop 3D architectural design tools for CMCs in general and for SiC/SiC composites in particular. Describe past and current sequential work efforts aimed at: understanding key fiber and tow physical characteristics in conventional 2D and 3D woven architectures as revealed by microstructures in the literature; developing an Excel program for down-selecting and predicting key geometric properties and resulting key fiber-controlled properties for various conventional 3D architectures; developing a software tool for accurately visualizing all the key geometric details of conventional 3D architectures; validating the tools by visualizing and predicting the internal geometry and key mechanical properties of a NASA SiC/SiC panel with a 3D orthogonal architecture; and applying the predictive and visualization tools toward advanced 3D orthogonal SiC/SiC composites, and combining them into a user-friendly software program.
OralCard: a bioinformatic tool for the study of oral proteome.
Arrais, Joel P; Rosa, Nuno; Melo, José; Coelho, Edgar D; Amaral, Diana; Correia, Maria José; Barros, Marlene; Oliveira, José Luís
2013-07-01
The molecular complexity of the human oral cavity can only be clarified through identification of the components that participate within it. However, current proteomic techniques produce high volumes of information that are dispersed over several online databases. Collecting all of this data and using an integrative approach capable of identifying unknown associations is still an unsolved problem. This is the main motivation for this work. We present the online bioinformatic tool OralCard, which comprises results from 55 manually curated articles reflecting the oral molecular ecosystem (OralPhysiOme). It comprises experimental information available from the oral proteome both of human (OralOme) and microbial origin (MicroOralOme), structured by protein, disease and organism. This tool is a key resource for researchers to understand the molecular foundations implicated in biology and disease mechanisms of the oral cavity. The usefulness of this tool is illustrated with the analysis of the oral proteome associated with diabetes mellitus type 2. OralCard is available at http://bioinformatics.ua.pt/oralcard. Copyright © 2013 Elsevier Ltd. All rights reserved.
Using Galaxy to Perform Large-Scale Interactive Data Analyses
Hillman-Jackson, Jennifer; Clements, Dave; Blankenberg, Daniel; Taylor, James; Nekrutenko, Anton
2012-01-01
Innovations in biomedical research technologies continue to provide experimental biologists with novel and increasingly large genomic and high-throughput data resources to be analyzed. As creating and obtaining data has become easier, the key decision faced by many researchers is a practical one: where and how should an analysis be performed? Datasets are large and analysis tool set-up and use is riddled with complexities outside of the scope of core research activities. The authors believe that Galaxy (galaxyproject.org) provides a powerful solution that simplifies data acquisition and analysis in an intuitive web-application, granting all researchers access to key informatics tools previously only available to computational specialists working in Unix-based environments. We will demonstrate through a series of biomedically relevant protocols how Galaxy specifically brings together 1) data retrieval from public and private sources, for example, UCSC’s Eukaryote and Microbial Genome Browsers (genome.ucsc.edu), 2) custom tools (wrapped Unix functions, format standardization/conversions, interval operations) and 3rd party analysis tools, for example, Bowtie/Tuxedo Suite (bowtie-bio.sourceforge.net), Lastz (www.bx.psu.edu/~rsharris/lastz/), SAMTools (samtools.sourceforge.net), FASTX-toolkit (hannonlab.cshl.edu/fastx_toolkit), and MACS (liulab.dfci.harvard.edu/MACS), and creates results formatted for visualization in tools such as the Galaxy Track Browser (GTB, galaxyproject.org/wiki/Learn/Visualization), UCSC Genome Browser (genome.ucsc.edu), Ensembl (www.ensembl.org), and GeneTrack (genetrack.bx.psu.edu). Galaxy has rapidly become the most popular choice for integrated next generation sequencing (NGS) analytics and collaboration, where users can perform, document, and share complex analyses within a single interface in an unprecedented number of ways. PMID:18428782
Correction tool for Active Shape Model based lumbar muscle segmentation.
Valenzuela, Waldo; Ferguson, Stephen J; Ignasiak, Dominika; Diserens, Gaelle; Vermathen, Peter; Boesch, Chris; Reyes, Mauricio
2015-08-01
In the clinical environment, accuracy and speed of the image segmentation process play a key role in the analysis of pathological regions. Despite advances in anatomic image segmentation, time-effective correction tools are commonly needed to improve segmentation results. Therefore, these tools must provide fast corrections with a low number of interactions, and a user-independent solution. In this work we present a new interactive method for correcting image segmentations. Given an initial segmentation and the original image, our tool provides a 2D/3D environment that enables 3D shape correction through simple 2D interactions. Our scheme is based on direct manipulation of free-form deformation adapted to a 2D environment. This approach enables an intuitive and natural correction of 3D segmentation results. The method has been implemented into a software tool and has been evaluated for the task of lumbar muscle segmentation from Magnetic Resonance Images. Experimental results show that full segmentation correction could be performed within an average correction time of 6±4 minutes and an average of 68±37 interactions, while maintaining the quality of the final segmentation result at an average Dice coefficient of 0.92±0.03.
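Direct manipulation of a free-form deformation can be illustrated with a minimal 2D lattice warp: the user drags one contour point, the four surrounding control points absorb the displacement (a least-norm update), and the whole contour is warped. This sketch is a deliberate simplification, not the authors' implementation (which uses free-form deformation in a full 2D/3D environment).

```python
import numpy as np

def bilinear_w(p, h):
    """Lower-left cell index and 2x2 bilinear weight stencil of point p
    on a regular lattice with spacing h."""
    i, j = int(p[0] // h), int(p[1] // h)
    u, v = p[0] / h - i, p[1] / h - j
    w = np.array([[(1 - u) * (1 - v), (1 - u) * v],
                  [u * (1 - v),       u * v]])
    return i, j, w

def drag(ctrl_disp, picked, delta, h):
    """Direct manipulation: update the four surrounding control points so
    that `picked` moves exactly by `delta` (least-norm solution)."""
    i, j, w = bilinear_w(picked, h)
    ctrl_disp[i:i+2, j:j+2] += w[..., None] * delta / np.sum(w ** 2)

def warp(points, ctrl_disp, h):
    """Apply the lattice displacement field to a set of contour points."""
    out = points.copy()
    for k, p in enumerate(points):
        i, j, w = bilinear_w(p, h)
        out[k] += np.tensordot(w, ctrl_disp[i:i+2, j:j+2], axes=2)
    return out

h = 10.0                                  # lattice spacing (pixels)
ctrl = np.zeros((12, 12, 2))              # control-point displacements
contour = np.array([[34.0, 51.0], [36.0, 55.0], [38.0, 58.0]])
drag(ctrl, contour[1], np.array([3.0, -2.0]), h)  # user drags middle point
print(warp(contour, ctrl, h))             # middle point moves by (3, -2)
```

Because the update is least-norm, the dragged point lands exactly on its target while neighbouring contour points deform smoothly.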
ICAT: Integrating data infrastructure for facilities based science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flannery, Damian; Matthews, Brian; Griffin, Tom
2009-12-21
Scientific facilities, in particular large-scale photon and neutron sources, have demanding requirements to manage the increasing quantities of experimental data they generate in a systematic and secure way. In this paper, we describe the ICAT infrastructure for cataloguing facility-generated experimental data, which has been in development within STFC and DLS for several years. We consider the factors which have influenced its design and describe its architecture and metadata model, a key tool in the management of data. We go on to give an outline of its current implementation and use, with plans for its future development.
NASA Astrophysics Data System (ADS)
Bijeljic, Branko; Icardi, Matteo; Prodanović, Maša
2018-05-01
Substantial progress has been made over the last few decades in understanding the physics of multiphase flow and reactive transport phenomena in subsurface porous media. The confluence of advances in experimental techniques (including micromodels, X-ray microtomography, and Nuclear Magnetic Resonance (NMR)) and in computational power has made it possible to observe static and dynamic multi-scale flow, transport and reactive processes, thus stimulating the development of a new generation of modelling tools from pore to field scale. One of the key challenges is to make experiments and models as complementary as possible, with continuously improving experimental methods, in order to increase the predictive capabilities of theoretical models across scales. This creates a need to establish rigorous benchmark studies of flow, transport and reaction in porous media which can then serve as the basis for introducing more complex phenomena in future developments.
Water facilities in retrospect and prospect: An illuminating tool for vehicle design
NASA Technical Reports Server (NTRS)
Erickson, G. E.; Peak, D. J.; Delfrate, J.; Skow, A. M.; Malcolm, G. N.
1986-01-01
Water facilities play a fundamental role in the design of air, ground, and marine vehicles by providing a qualitative, and sometimes quantitative, description of complex flow phenomena. Water tunnels, channels, and tow tanks used as flow-diagnostic tools have experienced a renaissance in recent years in response to the increased complexity of designs suitable for advanced technology vehicles. These vehicles are frequently characterized by large regions of steady and unsteady three-dimensional flow separation and ensuing vortical flows. The visualization and interpretation of the complicated fluid motions about isolated vehicle components and complete configurations in a time and cost effective manner in hydrodynamic test facilities is a key element in the development of flow control concepts, and, hence, improved vehicle designs. A historical perspective of the role of water facilities in the vehicle design process is presented. The application of water facilities to specific aerodynamic and hydrodynamic flow problems is discussed, and the strengths and limitations of these important experimental tools are emphasized.
A new scoring function for top-down spectral deconvolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kou, Qiang; Wu, Si; Liu, Xiaowen
2014-12-18
Background: Top-down mass spectrometry plays an important role in intact protein identification and characterization. Top-down mass spectra are more complex than bottom-up mass spectra because they often contain many isotopomer envelopes from highly charged ions, which may overlap with one another. As a result, spectral deconvolution, which converts a complex top-down mass spectrum into a monoisotopic mass list, is a key step in top-down spectral interpretation. Results: In this paper, we propose a new scoring function, L-score, for evaluating isotopomer envelopes. By combining L-score with MS-Deconv, a new software tool, MS-Deconv+, was developed for top-down spectral deconvolution. Experimental results showed that MS-Deconv+ outperformed existing software tools in top-down spectral deconvolution. Conclusions: L-score shows high discriminative ability in the identification of isotopomer envelopes. Using L-score, MS-Deconv+ reports many correct monoisotopic masses missed by other software tools, which are valuable for proteoform identification and characterization.
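L-score itself is not specified in the abstract; as a generic illustration of envelope scoring, the sketch below compares an observed peak list against a Poisson-approximated theoretical isotopomer envelope. The "averagine-style" rate constant and the example intensities are assumptions, and the cosine similarity is a stand-in for the paper's actual scoring function.

```python
import numpy as np
from scipy.stats import poisson

def theoretical_envelope(mono_mass, n_peaks=6, lam_per_da=5.9e-4):
    """Approximate isotopomer envelope of an averagine-like molecule with a
    Poisson distribution (the rate constant is illustrative)."""
    env = poisson.pmf(np.arange(n_peaks), lam_per_da * mono_mass)
    return env / env.sum()

def envelope_similarity(observed, mono_mass):
    """Cosine similarity between observed peak intensities and the
    theoretical envelope, a generic stand-in for an envelope score."""
    theo = theoretical_envelope(mono_mass, n_peaks=len(observed))
    obs = np.asarray(observed, dtype=float)
    return float(obs @ theo / (np.linalg.norm(obs) * np.linalg.norm(theo)))

# a clean envelope scores near 1; a noise pick scores much lower
print(envelope_similarity([0.41, 0.37, 0.16, 0.05, 0.01, 0.00], 1500.0))
print(envelope_similarity([0.90, 0.02, 0.01, 0.05, 0.01, 0.01], 1500.0))
```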
Remote Control and Data Acquisition: A Case Study
NASA Technical Reports Server (NTRS)
DeGennaro, Alfred J.; Wilkinson, R. Allen
2000-01-01
This paper details software tools developed to remotely command experimental apparatus, and to acquire and visualize the associated data in soft real time. The work was undertaken because commercial products failed to meet the needs. This work has identified six key factors intrinsic to development of quality research laboratory software. Capabilities include access to all new instrument functions without any programming or dependence on others to write drivers or virtual instruments, simple full screen text-based experiment configuration and control user interface, months of continuous experiment run-times, order of 1% CPU load for condensed matter physics experiment described here, very little imposition of software tool choices on remote users, and total remote control from anywhere in the world over the Internet or from home on a 56 Kb modem as if the user is sitting in the laboratory. This work yielded a set of simple robust tools that are highly reliable, resource conserving, extensible, and versatile, with a uniform simple interface.
Braid, Francesca; Williams, Sarah B; Weller, Renate
2012-01-01
Recognition of anatomical landmarks in live animals (and humans) is key for clinical practice, but students often find it difficult to translate knowledge from dissection-based anatomy onto the live animal and struggle to acquire this vital skill. The purpose of this study was to create and evaluate the use of an equine anatomy rug ("Anato-Rug") depicting topographical anatomy and key areas of lung, heart, and gastrointestinal auscultation, which could be used together with a live horse to aid learning of "live animal" anatomy. Over the course of 2 weeks, 38 third-year veterinary students were randomly allocated into an experimental group, revising topographical anatomy from the "Anato-Rug," or a control group, learning topographical anatomy from a textbook. Immediately post activity, both groups underwent a test on live anatomy knowledge and were retested 1 week later. Both groups then completed a questionnaire to ascertain their perceptions of their learning experiences. Results showed that the experimental group scored significantly higher than the control group at the first testing session, experienced more enjoyment during the activity, and gained more confidence in identifying anatomical landmarks than the control group. There was no significant difference in scores between groups at the second testing session. The findings indicate that the anatomy rug is an effective learning tool that aids understanding, confidence, and enjoyment in learning equine thorax and abdominal anatomy; however, it was not better than traditional methods with regard to longer-term memory recall. Copyright © 2012 American Association of Anatomists.
Metabonomics: its potential as a tool in toxicology for safety assessment and data integration.
Griffin, J L; Bollard, M E
2004-10-01
The functional genomic techniques of transcriptomics and proteomics promise unparalleled global information during the drug development process. However, if these technologies are used in isolation, the large multivariate data sets produced are often difficult to interpret and risk missing key metabolic events (e.g. as a result of experimental noise in the system). To better understand the significance of these megavariate data, the temporal changes in phenotype must be described. High-resolution 1H NMR spectroscopy used in conjunction with pattern recognition provides one such tool for defining the dynamic phenotype of a cell, organ or organism in terms of a metabolic phenotype. In this review the benefits of this metabonomics/metabolomics approach to problems in toxicology will be discussed. One of the major benefits of this approach is its high-throughput nature and cost effectiveness on a per-sample basis. Using such a method, the Consortium for Metabonomic Toxicology (COMET) is currently investigating approximately 150 model liver and kidney toxins. This investigation will allow the generation of expert systems where liver and kidney toxicity can be predicted for model drug compounds, providing a new research tool in the field of drug metabolism. The review will also cover how metabonomics may be used to investigate co-responses with transcripts and proteins involved in metabolism and stress responses, such as during drug-induced fatty liver disease. By using data integration to combine metabolite analysis and gene expression profiling, key perturbed metabolic pathways can be identified and used as a tool to investigate drug function.
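The "pattern recognition" step typically begins with an unsupervised projection such as PCA of the binned NMR spectra; the sketch below uses synthetic data to show that generic workflow (it is not the COMET pipeline, and the marker positions are invented).

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# X: samples x spectral bins (e.g. binned 1H NMR urine spectra); random here
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200))
X[:20, 10:15] += 2.0   # pretend the first 20 animals show a toxin marker

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
# control and treated animals separate along PC1 in this toy example
print(scores[:20, 0].mean(), scores[20:, 0].mean())
```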
2013-01-01
Background: Multicellular organisms consist of cells of many different types that are established during development. Each type of cell is characterized by the unique combination of expressed gene products as a result of spatiotemporal gene regulation. Currently, a fundamental challenge in regulatory biology is to elucidate the gene expression controls that generate the complex body plans during development. Recent advances in high-throughput biotechnologies have generated spatiotemporal expression patterns for thousands of genes in the model organism fruit fly Drosophila melanogaster. Existing qualitative methods, enhanced by the quantitative analysis based on the computational tools we present in this paper, would provide promising ways to address key scientific questions. Results: We develop a set of computational methods and open source tools for identifying co-expressed embryonic domains and the associated genes simultaneously. To map the expression patterns of many genes into the same coordinate space and account for the embryonic shape variations, we develop a mesh generation method to deform a meshed generic ellipse to each individual embryo. We then develop a co-clustering formulation to cluster the genes and the mesh elements, thereby identifying co-expressed embryonic domains and the associated genes simultaneously. Experimental results indicate that the gene and mesh co-clusters can be correlated to key developmental events during the stages of embryogenesis we study. The open source software tool has been made available at http://compbio.cs.odu.edu/fly/. Conclusions: Our mesh generation and machine learning methods and tools improve upon the flexibility, ease-of-use and accuracy of existing methods. PMID:24373308
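The co-clustering idea, grouping genes and mesh elements simultaneously from a genes-by-mesh-elements expression matrix, can be sketched with a generic spectral co-clustering; this is an illustration of the formulation, not the paper's exact algorithm, and the planted block structure is synthetic.

```python
import numpy as np
from sklearn.cluster import SpectralCoclustering

# rows = genes, columns = mesh elements; entries = expression intensity
rng = np.random.default_rng(1)
X = rng.random((60, 150)) * 0.1
X[:30, :75] += 1.0    # a planted gene/embryonic-domain co-cluster
X[30:, 75:] += 1.0    # a second co-cluster

model = SpectralCoclustering(n_clusters=2, random_state=0).fit(X)
genes_in_0 = np.where(model.rows_[0])[0]        # genes in co-cluster 0
elements_in_0 = np.where(model.columns_[0])[0]  # mesh elements in co-cluster 0
print(len(genes_in_0), len(elements_in_0))
```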
The FTS atomic spectrum tool (FAST) for rapid analysis of line spectra
NASA Astrophysics Data System (ADS)
Ruffoni, M. P.
2013-07-01
The FTS Atomic Spectrum Tool (FAST) is an interactive graphical program designed to simplify the analysis of atomic emission line spectra obtained from Fourier transform spectrometers. Calculated, predicted and/or known experimental line parameters are loaded alongside experimentally observed spectral line profiles for easy comparison between new experimental data and existing results. Many such line profiles, which could span numerous spectra, may be viewed simultaneously to help the user detect problems from line blending or self-absorption. Once the user has determined that their experimental line profile fits are good, a key feature of FAST is the ability to calculate atomic branching fractions, transition probabilities, and oscillator strengths (and their uncertainties), which is not provided by existing analysis packages.
Program Summary
Program title: FAST: The FTS Atomic Spectrum Tool
Catalogue identifier: AEOW_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOW_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License version 3
No. of lines in distributed program, including test data, etc.: 293058
No. of bytes in distributed program, including test data, etc.: 13809509
Distribution format: tar.gz
Programming language: C++.
Computer: Intel x86-based systems.
Operating system: Linux/Unix/Windows.
RAM: 8 MB minimum. About 50-200 MB for a typical analysis.
Classification: 2.2, 2.3, 21.2.
Nature of problem: Visualisation of atomic line spectra including the comparison of theoretical line parameters with experimental atomic line profiles. Accurate intensity calibration of experimental spectra, and the determination of observed relative line intensities that are needed for calculating atomic branching fractions and oscillator strengths.
Solution method: FAST is centred around a graphical interface, where a user may view sets of experimental line profiles and compare them to calculated data (such as from the Kurucz database [1]), predicted line parameters, and/or previously known experimental results. With additional information on the spectral response of the spectrometer, obtained from a calibrated standard light source, FT spectra may be intensity calibrated. In turn, this permits the user to calculate atomic branching fractions and oscillator strengths, and their respective uncertainties.
Running time: Open ended. Defined by the user.
References: [1] R.L. Kurucz (2007). URL http://kurucz.harvard.edu/atoms/.
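The branching-fraction arithmetic that FAST automates is compact enough to sketch: with calibrated relative intensities for all lines from one upper level and that level's lifetime, the fractions, transition probabilities and gf-values follow directly. The numerical inputs below are illustrative, and the uncertainty propagation FAST performs is omitted.

```python
import numpy as np

def branching_fractions(intensities, lifetime_s, wavelengths_A, g_upper):
    """Branching fractions, transition probabilities and gf-values for all
    lines from one upper level, given calibrated relative intensities
    (photons/s) and the upper-level lifetime."""
    I = np.asarray(intensities, dtype=float)
    bf = I / I.sum()           # branching fractions sum to 1
    A = bf / lifetime_s        # transition probabilities (s^-1)
    # standard conversion: gf = 1.4992e-16 * lambda[Angstrom]^2 * g_u * A_ul
    gf = 1.4992e-16 * np.asarray(wavelengths_A) ** 2 * g_upper * A
    return bf, A, gf

bf, A, gf = branching_fractions(intensities=[1000.0, 350.0, 120.0],
                                lifetime_s=8.0e-9,
                                wavelengths_A=[3719.9, 3859.9, 4404.8],
                                g_upper=11)
print(bf, A, gf)
```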
Cyclostationarity approach for monitoring chatter and tool wear in high speed milling
NASA Astrophysics Data System (ADS)
Lamraoui, M.; Thomas, M.; El Badaoui, M.
2014-02-01
Detection of chatter and tool wear is crucial in the machining process, and their monitoring is a key issue for (1) ensuring better surface quality, (2) increasing productivity and (3) protecting both the machine and the workpiece. This paper presents an investigation of chatter and tool wear using the cyclostationary method to process vibration signals acquired from high speed milling. Experimental cutting tests were performed in slot milling of an aluminum alloy. The experimental set-up is designed for the acquisition of accelerometer signals and of position information picked up from an encoder. The encoder signal is used for re-sampling the accelerometer signals in the angular domain, using a specific algorithm developed in the LASPI laboratory. Cyclostationary analysis of the accelerometer signals has been applied to monitoring chatter and tool wear in high speed milling. Cyclostationarity appears in the average properties (first order) of signals and in the energetic properties (second order), and it generates spectral lines at cyclic frequencies in the spectral correlation. Angular power and kurtosis are used to analyze chatter phenomena. The formation of chatter is characterized by unstable, chaotic motion of the tool and strong anomalous fluctuations of cutting forces. Results show that stable machining generates only very few second-order cyclostationary components, while chatter is strongly correlated with second-order cyclostationary components. When machining in the unstable region, chatter results in flat angular kurtosis and flat angular power, like a pseudo-random (white) signal with a flat spectrum. Results reveal that the spectral correlation and the Wigner-Ville spectrum (or integrated Wigner-Ville) issued from second-order cyclostationarity are efficient parameters for the early diagnosis of faults in high speed machining, such as chatter, tool wear and bearing faults, compared to traditional stationary methods. The Wigner-Ville representation of the residual signal shows that the energy corresponding to the tooth passing decreases when the chatter phenomenon occurs. The effect of tool wear and of the number of broken teeth on the excitation of structure resonances also appears in the Wigner-Ville representation.
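The angular re-sampling step underpins the whole analysis. A minimal version is sketched below, assuming one encoder pulse per revolution (real encoders emit many pulses per revolution, but the interpolation is the same), followed by per-angle power and kurtosis across revolutions; the signal and spindle parameters are invented.

```python
import numpy as np
from scipy.stats import kurtosis

def angular_resample(t, accel, pulse_times, samples_per_rev=512):
    """Re-sample a time-domain vibration signal onto a uniform shaft-angle
    grid using encoder pulse times (one pulse per revolution assumed)."""
    revs = np.arange(len(pulse_times))            # shaft angle in revolutions
    angle_of_t = np.interp(t, pulse_times, revs)  # angle at each sample time
    n_rev = int(angle_of_t[-1])
    uniform_angle = np.arange(n_rev * samples_per_rev) / samples_per_rev
    resampled = np.interp(uniform_angle, angle_of_t, accel)
    return resampled.reshape(n_rev, samples_per_rev)

fs = 20000.0
t = np.arange(0, 2.0, 1 / fs)
# tooth-passing component of a 4-tooth cutter at 50 rev/s, plus noise
accel = (np.sin(2 * np.pi * 4 * 50.0 * t)
         + 0.1 * np.random.default_rng(2).normal(size=t.size))
pulses = np.arange(0, 2.0, 1 / 50.0)              # encoder: 50 rev/s spindle
cycles = angular_resample(t, accel, pulses)

ang_power = (cycles ** 2).mean(axis=0)            # angular power profile
ang_kurt = kurtosis(cycles, axis=0)               # angular kurtosis profile
print(ang_power.shape, ang_kurt.shape)            # chatter flattens both
```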
Contribution of proteomics to the study of plant pathogenic fungi.
Gonzalez-Fernandez, Raquel; Jorrin-Novo, Jesus V
2012-01-01
Phytopathogenic fungi are among the most damaging plant parasites, and can cause serious diseases and important yield losses in crops. The study of the biology of these microorganisms and of their interaction with their hosts has experienced great advances in recent years due to the development of modern, holistic and high-throughput -omic techniques, together with the increasing number of genome sequencing projects and the development of mutants and reverse genetics tools. Among these -omic techniques, we highlight the importance of proteomics, which has become a relevant tool in plant-fungus pathosystem research. Proteomics aims to identify gene products with a key role in pathogenicity and virulence. These studies would help in the search for key protein targets and in the development of agrochemicals, which may open new ways for crop disease diagnosis and protection. In this review, we provide an overview of the contribution of proteomics to the knowledge of the life cycle, infection mechanisms, and virulence of plant pathogenic fungi. Data from the current, innovative literature, organized according to both methodological and experimental systems, are summarized and discussed. Specific sections are devoted to the most studied fungal phytopathogens: Botrytis cinerea, Sclerotinia sclerotiorum, and Fusarium graminearum.
Peak provoked craving: an alternative to smoking cue-reactivity.
Sayette, Michael A; Tiffany, Stephen T
2013-06-01
Smoking cue-exposure research has provided a powerful tool for examining cravings in the laboratory. A key attraction of this method is that tightly controlled experimental procedures can model craving experiences that are presumed to relate to addiction. Despite its appeal, key assumptions underlying the clinical relevance of smoking cue-reactivity studies have recently been questioned. For both conceptual and methodological reasons it may be difficult to tease apart cue-based and abstinence-based cravings. Moreover, conventional cue-reactivity procedures typically generate levels of craving with only minimal clinical relevance. We argue here that it is sometimes unfeasible, and in some instances conceptually misguided, to disentangle abstinence-based and cued components of cigarette cravings. In light of the challenges associated with cue-reactivity research, we offer an alternative approach to smoking cue-exposure experimental research focusing on peak provoked craving (PPC) states. The PPC approach uses nicotine-deprived smokers and focuses on urges during smoking cue-exposure without subtracting out urge ratings during control cue or baseline assessments. This design relies on two factors found in many cue-exposure studies, nicotine deprivation and exposure to explicit smoking cues, which, when combined, can create powerful craving states. The PPC approach retains key aspects of the cue-exposure method, and in many circumstances may be a viable design for studies examining robust laboratory-induced cravings. © 2012 The Authors, Addiction © 2012 Society for the Study of Addiction.
The future is now: single-cell genomics of bacteria and archaea
Blainey, Paul C.
2013-01-01
Interest in the expanding catalog of uncultivated microorganisms, increasing recognition of heterogeneity among seemingly similar cells, and technological advances in whole-genome amplification and single-cell manipulation are driving considerable progress in single-cell genomics. Here, the spectrum of applications for single-cell genomics, key advances in the development of the field, and emerging methodology for single-cell genome sequencing are reviewed by example with attention to the diversity of approaches and their unique characteristics. Experimental strategies transcending specific methodologies are identified and organized as a road map for future studies in single-cell genomics of environmental microorganisms. Over the next decade, increasingly powerful tools for single-cell genome sequencing and analysis will play key roles in accessing the genomes of uncultivated organisms, determining the basis of microbial community functions, and fundamental aspects of microbial population biology. PMID:23298390
NASA Astrophysics Data System (ADS)
Groeneweg, John F.; Sofrin, Thomas G.; Rice, Edward J.; Gliebe, Phillip R.
1991-08-01
Summarized here are key advances in experimental techniques and theoretical applications which point the way to a broad understanding and control of turbomachinery noise. On the experimental side, the development of effective inflow control techniques makes it possible to conduct, in ground based facilities, definitive experiments in internally controlled blade row interactions. Results can now be valid indicators of flight behavior and can provide a firm base for comparison with analytical results. Inflow control coupled with detailed diagnostic tools such as blade pressure measurements can be used to uncover the more subtle mechanisms such as rotor strut interaction, which can set tone levels for some engine configurations. Initial mappings of rotor wake-vortex flow fields have provided a data base for a first generation semiempirical flow disturbance model. Laser velocimetry offers a nonintrusive method for validating and improving the model. Digital data systems and signal processing algorithms are bringing mode measurement closer to a working tool that can be frequently applied to a real machine such as a turbofan engine. On the analytical side, models of most of the links in the chain from turbomachine blade source to far field observation point have been formulated. Three dimensional lifting surface theory for blade rows, including source noncompactness and cascade effects, blade row transmission models incorporating mode and frequency scattering, and modal radiation calculations, including hybrid numerical-analytical approaches, are tools which await further application.
Bengston, Sarah E; Dahan, Romain A; Donaldson, Zoe; Phelps, Steven M; van Oers, Kees; Sih, Andrew; Bell, Alison M
2018-06-01
Behaviour is a key interface between an animal's genome and its environment. Repeatable individual differences in behaviour have been extensively documented in animals, but the molecular underpinnings of behavioural variation among individuals within natural populations remain largely unknown. Here, we offer a critical review of when molecular techniques may yield new insights, and we provide specific guidance on how and whether the latest tools available are appropriate given different resources, system and organismal constraints, and experimental designs. Integrating molecular genetic techniques with other strategies to study the proximal causes of behaviour provides opportunities to expand rapidly into new avenues of exploration. Such endeavours will enable us to better understand how repeatable individual differences in behaviour have evolved, how they are expressed and how they can be maintained within natural populations of animals.
Public access management as an adaptive wildlife management tool
Ouren, Douglas S.; Watts, Raymond D.
2005-01-01
One key issue in the Black Mesa – Black Canyon area is the interaction between motorized vehicles and elk. The working hypothesis for this study is that early season elk movement onto private lands and the National Park is precipitated by increased use of Off Highway Vehicles (OHVs). Data on the intensity of motorized use are extremely limited. In this study, we monitor the intensity of motorized vehicle and trail use, track elk movements and habitat use, and analyze their interactions. If management agencies decide to alter accessibility, we will monitor wildlife responses to changes in the human-use regime. This provides a unique opportunity for adaptive management experimentation based on coordinated research and monitoring. The products from this project will provide natural resource managers across the nation with tools and information to better meet these resource challenges.
Rangannan, Vetriselvi; Bansal, Manju
2009-12-01
The rapid increase in genome sequence information has necessitated the annotation of functional elements, particularly those occurring in the non-coding regions, in the genomic context. The promoter region is the key regulatory region that enables a gene to be transcribed or repressed, but it is difficult to determine experimentally. Hence in silico identification of promoters is crucial in order to guide experimental work and to pinpoint the key region that controls the transcription initiation of a gene. In this analysis, we demonstrate that while promoter regions are in general less stable than the flanking regions, their average free energy varies depending on the GC composition of the flanking genomic sequence. We have therefore obtained a set of free energy threshold values for genomic DNA with varying GC content and used them as generic criteria for predicting promoter regions in several microbial genomes, using the in-house developed tool PromPredict. On applying it to predict promoter regions corresponding to the 1144 and 612 experimentally validated TSSs in E. coli (50.8% GC) and B. subtilis (43.5% GC), sensitivities of 99% and 95% and precision values of 58% and 60%, respectively, were achieved. For the limited data set of 81 TSSs available for M. tuberculosis (65.6% GC), a sensitivity of 100% and a precision of 49% were obtained.
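PromPredict's exact GC-dependent thresholds are not given in the abstract, but the underlying computation, a sliding-window average of dinucleotide stacking free energies with less-stable windows flagged as promoter-like, can be sketched generically. The nearest-neighbour values below are SantaLucia-style unified parameters quoted from memory and should be verified before any real use; the threshold rule is illustrative.

```python
import numpy as np

# Nearest-neighbour stacking free energies (kcal/mol at 37 C); treat the
# exact values as an assumption to verify against the literature.
NN = {"AA": -1.00, "TT": -1.00, "AT": -0.88, "TA": -0.58,
      "CA": -1.45, "TG": -1.45, "GT": -1.44, "AC": -1.44,
      "CT": -1.28, "AG": -1.28, "GA": -1.30, "TC": -1.30,
      "CG": -2.17, "GC": -2.24, "GG": -1.84, "CC": -1.84}

def window_energy(seq, win=100, step=1):
    """Average stacking free energy in sliding windows along a sequence."""
    dg = np.array([NN[seq[i:i + 2]] for i in range(len(seq) - 1)])
    cum = np.concatenate([[0.0], np.cumsum(dg)])
    starts = np.arange(0, len(dg) - win + 1, step)
    return starts, (cum[starts + win] - cum[starts]) / win

seq = "".join(np.random.default_rng(3).choice(list("ACGT"), size=2000))
starts, e = window_energy(seq)
# promoters tend to be LESS stable, i.e. less negative average free energy
threshold = e.mean() + 0.5 * e.std()   # illustrative cutoff, not PromPredict's
candidates = starts[e > threshold]     # putative promoter-like regions
print(len(candidates))
```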
Stecher, Bärbel; Berry, David; Loy, Alexander
2013-09-01
The highly diverse intestinal microbiota forms a structured community engaged in constant communication with itself and its host and is characterized by extensive ecological interactions. A key benefit that the microbiota affords its host is its ability to protect against infections in a process termed colonization resistance (CR), which remains insufficiently understood. In this review, we connect basic concepts of CR with new insights from recent years and highlight key technological advances in the field of microbial ecology. We present a selection of statistical and bioinformatics tools used to generate hypotheses about synergistic and antagonistic interactions in microbial ecosystems from metagenomic datasets. We emphasize the importance of experimentally testing these hypotheses and discuss the value of gnotobiotic mouse models for investigating specific aspects related to microbiota-host-pathogen interactions in a well-defined experimental system. We further introduce new developments in the area of single-cell analysis using fluorescence in situ hybridization in combination with metabolic stable isotope labeling technologies for studying the in vivo activities of complex community members. These approaches promise to yield novel insights into the mechanisms of CR and intestinal ecophysiology in general, and give researchers the means to experimentally test hypotheses in vivo at varying levels of biological and ecological complexity. © 2013 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.
Optical scheme for simulating post-quantum nonlocality distillation.
Chu, Wen-Jing; Yang, Ming; Pan, Guo-Zhu; Yang, Qing; Cao, Zhuo-Liang
2016-11-28
An optical scheme for simulating nonlocality distillation in the post-quantum regime is proposed. The nonlocal boxes are simulated by measurements on appropriately pre- and post-selected polarization-entangled photon pairs, i.e. post-quantum nonlocality is simulated by exploiting the fair-sampling loophole in a Bell test. Mod 2 addition on the outputs of two nonlocal boxes, combined with pre- and post-selection operations, constitutes the key operation of simulating nonlocality distillation. This scheme provides a possible tool for the experimental study of nonlocality in the post-quantum regime and of the exact physical principle that precisely distinguishes physically realizable correlations from nonphysical ones.
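The arithmetic of the distillation step can be checked with a classical Monte Carlo: feed the same inputs into two correlated nonlocal boxes (PR-box behaviour with probability p, otherwise the deterministic box a = b = 0) and XOR the outputs. For p below 1/2 the CHSH value of the composite box exceeds that of a single box. This sketch illustrates the mod 2 protocol in the spirit of known distillation schemes, not the optical pre-/post-selection scheme itself.

```python
import numpy as np
rng = np.random.default_rng(4)

def box(x, y, p):
    """Correlated nonlocal box: PR behaviour (a XOR b = x AND y) with
    probability p, otherwise the deterministic box a = b = 0."""
    if rng.random() < p:
        a = int(rng.integers(2))
        return a, a ^ (x & y)
    return 0, 0

def chsh(sample, n=50_000):
    s = 0.0
    for x in (0, 1):
        for y in (0, 1):
            ab = np.array([sample(x, y) for _ in range(n)])
            corr = np.mean(1 - 2 * (ab[:, 0] ^ ab[:, 1]))  # P(a=b)-P(a!=b)
            s += -corr if x and y else corr
    return s

p = 0.3
single = lambda x, y: box(x, y, p)
def distilled(x, y):                     # mod 2 addition of two boxes' outputs
    a1, b1 = box(x, y, p)
    a2, b2 = box(x, y, p)
    return a1 ^ a2, b1 ^ b2

print(chsh(single), chsh(distilled))     # roughly 2.6 vs 2.84 for p = 0.3
```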
NASA Astrophysics Data System (ADS)
Bantes, B.; Bayadilov, D.; Beck, R.; Becker, M.; Bella, A.; Bieling, J.; Böse, S.; Braglieri, A.; Brinkmann, K.; Burdeynyi, D.; Curciarello, F.; de Leo, V.; di Salvo, R.; Dutz, H.; Elsner, D.; Fantini, A.; Frese, T.; Friedrick, S.; Frommberger, F.; Ganenko, V.; Gervino, G.; Ghio, F.; Giardina, G.; Girolami, B.; Glazier, D.; Goertz, S.; Gridnev, A.; Gutz, E.; Hammann, D.; Hannappel, J.; Hillert, W.; Ignatov, A.; Jahn, O.; Jahn, R.; Joosten, R.; Jude, T. C.; Klein, F.; Koop, K.; Krusche, B.; Lapik, A.; Levi Sandri, P.; Lopatin, I.; Mandaglio, G.; Messi, F.; Messi, R.; Metag, V.; Moricciani, D.; Nanova, M.; Nedorezov, V.; Noviskiy, D.; Pedroni, P.; Romaniuk, M.; Rostomyan, T.; Schaerf, C.; Schmieden, H.; Sumachev, V.; Tarakonov, V.; Vegna, V.; Vlasov, P.; Walther, D.; Watts, D.; Zaunick, H.-G.; Zimmermann, T.
2014-01-01
Meson photoproduction is a key tool for the experimental investigation of the nucleon excitation spectrum. To disentangle the specific couplings of resonances, in addition to the rather well measured pion and eta photoproduction channels it is mandatory to obtain information on channels involving strange and vector mesons and higher mass pseudoscalar mesons, and the associated multi-particle final states with both charged and neutral particles. In this respect, the new BGO-OD experiment at the ELSA accelerator of the University of Bonn's Physikalisches Institut provides unique instrumentation. We describe the experiment, present its status and the initial program of measurements.
Skyrmion morphology in ultrathin magnetic films
NASA Astrophysics Data System (ADS)
Gross, I.; Akhtar, W.; Hrabec, A.; Sampaio, J.; Martínez, L. J.; Chouaieb, S.; Shields, B. J.; Maletinsky, P.; Thiaville, A.; Rohart, S.; Jacques, V.
2018-02-01
Nitrogen-vacancy magnetic microscopy is employed in the quenching mode as a noninvasive, high-resolution tool to investigate the morphology of isolated skyrmions in ultrathin magnetic films. The skyrmion size and shape are found to be strongly affected by local pinning effects and magnetic field history. Micromagnetic simulations including a static disorder, based on the physical model of grain-to-grain thickness variations, reproduce all experimental observations and reveal the key role of disorder and magnetic history in the stabilization of skyrmions in ultrathin magnetic films. This work opens the way to an in-depth understanding of skyrmion dynamics in real, disordered media.
Recent advances in multidimensional ultrafast spectroscopy
NASA Astrophysics Data System (ADS)
Oliver, Thomas A. A.
2018-01-01
Multidimensional ultrafast spectroscopies are one of the premier tools to investigate condensed phase dynamics of biological, chemical and functional nanomaterial systems. As they reach maturity, the variety of frequency domains that can be explored has vastly increased, with experimental techniques capable of correlating excitation and emission frequencies from the terahertz through to the ultraviolet. Some of the most recent innovations also include extreme cross-peak spectroscopies that directly correlate the dynamics of electronic and vibrational states. This review article summarizes the key technological advances that have permitted these recent advances, and the insights gained from new multidimensional spectroscopic probes.
Mollica, Luca; Theret, Isabelle; Antoine, Mathias; Perron-Sierra, Françoise; Charton, Yves; Fourquez, Jean-Marie; Wierzbicki, Michel; Boutin, Jean A; Ferry, Gilles; Decherchi, Sergio; Bottegoni, Giovanni; Ducrot, Pierre; Cavalli, Andrea
2016-08-11
Ligand-target residence time is emerging as a key drug discovery parameter because it can reliably predict drug efficacy in vivo. Experimental approaches to binding and unbinding kinetics are nowadays available, but we still lack reliable computational tools for predicting kinetics and residence time. Most attempts have been based on brute-force molecular dynamics (MD) simulations, which are CPU-demanding and not yet particularly accurate. We recently reported a new scaled-MD-based protocol, which showed potential for residence time prediction in drug discovery. Here, we further challenged our procedure's predictive ability by applying our methodology to a series of glucokinase activators that could be useful for treating type 2 diabetes mellitus. We combined scaled MD with experimental kinetics measurements and X-ray crystallography, promptly checking the protocol's reliability by directly comparing computational predictions and experimental measures. The good agreement highlights the potential of our scaled-MD-based approach as an innovative method for computationally estimating and predicting drug residence times.
Interaction between IGFBP7 and insulin: a theoretical and experimental study
NASA Astrophysics Data System (ADS)
Ruan, Wenjing; Kang, Zhengzhong; Li, Youzhao; Sun, Tianyang; Wang, Lipei; Liang, Lijun; Lai, Maode; Wu, Tao
2016-04-01
Insulin-like growth factor binding protein 7 (IGFBP7) can bind to insulin with high affinity, which inhibits the early steps of insulin action. The lack of a known recognition mechanism impairs our understanding of how insulin is regulated before it binds to the insulin receptor. Here we combine computational simulations with experimental methods to investigate the interaction between IGFBP7 and insulin. Molecular dynamics simulations indicated that His200 and Arg198 in IGFBP7 are key residues. As verified by experimental data, the interaction remained strong in the single-mutation systems R198E and H200F but became weak in the double-mutation system R198E-H200F relative to wild-type IGFBP7. The results and methods of the present study could be adopted in future research on drug discovery through the disruption of protein-protein interactions in insulin signaling. Nevertheless, the accuracy, reproducibility, and cost of free-energy calculations are still problems that need to be addressed before computational methods can become standard binding-prediction tools in discovery pipelines.
Real-Time Leaky Lamb Wave Spectrum Measurement and Its Application to NDE of Composites
NASA Technical Reports Server (NTRS)
Lih, Shyh-Shiuh; Bar-Cohen, Yoseph
1999-01-01
Numerous analytical and theoretical studies of the behavior of leaky Lamb waves (LLW) in composite materials have been documented in the literature. One of the key issues constraining the application of this method as a practical tool is the amount of data that needs to be acquired and the slow process involved in such experiments. Recently, a methodology that allows quasi real-time acquisition of LLW dispersion data was developed. At each angle of incidence the reflection spectrum is available in real time from the experimental setup, and it can be used for rapid detection of defects. This technique can be used to rapidly acquire the various plate wave modes along various angles of incidence for the characterization of the material elastic properties. The experimental method and data acquisition technique are described in this paper. Experimental data were used to examine a series of flaws including porosity and delaminations, demonstrating the efficiency of the developed technique.
Simons, Jack
2008-07-24
The experimental and theoretical study of molecular anions has undergone explosive growth over the past 40 years. Advances in techniques used to generate anions in appreciable numbers as well as new ion-storage, ion-optics, and laser spectroscopic tools have been key on the experimental front. Theoretical developments on the electronic structure and molecular dynamics fronts now allow one to achieve higher accuracy and to study electronically metastable states, thus bringing theory in close collaboration with experiment in this field. In this article, many of the experimental and theoretical challenges specific to studying molecular anions are discussed. Results from many research groups on several classes of molecular anions are overviewed, and both literature citations and active (in online html and pdf versions) links to numerous contributing scientists' Web sites are provided. Specific focus is made on the following families of anions: dipole-bound, zwitterion-bound, double-Rydberg, multiply charged, metastable, cluster-based, and biological anions. In discussing each kind of anion, emphasis is placed on the structural, energetic, spectroscopic, and chemical-reactivity characteristics that make these anions novel, interesting, and important.
Employing immersive virtual environments for innovative experiments in health care communication.
Persky, Susan
2011-03-01
This report reviews the literature for studies that employ immersive virtual environment technology to conduct experimental studies in health care communication. Advantages and challenges of using these tools for research in this area are also discussed. A literature search was conducted using the Scopus database. Results were hand-searched to identify the body of studies, conducted since 1995, that are related to the report objective. The review identified four relevant studies stemming from two unique projects. One project focused on the impact of a clinician's characteristics and behavior on health care communication; the other focused on the characteristics of the patient. Both projects illustrate key methodological advantages conferred by immersive virtual environments, including the ability to maintain simultaneously high experimental control and realism, the ability to manipulate variables in new ways, and unique behavioral measurement opportunities. Though implementation challenges exist for immersive virtual environment-based research methods, given the technology's unique capabilities, the benefits can outweigh the costs in many instances. Immersive virtual environments may therefore prove an important addition to the array of tools available for advancing our understanding of communication in health care. Published by Elsevier Ireland Ltd.
Tools for Early Prediction of Drug Loading in Lipid-Based Formulations.
Alskär, Linda C; Porter, Christopher J H; Bergström, Christel A S
2016-01-04
Identification of the usefulness of lipid-based formulations (LBFs) for the delivery of poorly water-soluble drugs is to date mainly experimentally based. In this work we used a diverse drug data set and more than 2,000 solubility measurements to develop experimental and computational tools to predict the loading capacity of LBFs. Computational models were developed to enable in silico prediction of solubility, and hence drug loading capacity, in the LBFs. Drug solubility in the mixed mono-, di-, and triglycerides (Maisine 35-1 and Capmul MCM EP) correlated well (R² 0.89), as did drug solubility in Carbitol and other ethoxylated excipients (PEG400, R² 0.85; Polysorbate 80, R² 0.90; Cremophor EL, R² 0.93). A melting point below 150 °C was observed to result in a reasonable solubility in the glycerides. The loading capacity in LBFs was accurately calculated from solubility data in single excipients (R² 0.91). In silico models, without the demand for experimentally determined solubility, also gave good predictions of the loading capacity in these complex formulations (R² 0.79). The framework established here gives a better understanding of drug solubility in single excipients and of LBF loading capacity. The large data set studied revealed that experimental screening efforts can be rationalized by solubility measurements in key excipients or from solid state information. For the first time it was shown that loading capacity in complex formulations can be accurately predicted using molecular information extracted from calculated descriptors and thermal properties of the crystalline drug.
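The reported result that formulation loading capacity can be calculated from single-excipient solubilities is, at its core, a regression problem. A minimal sketch is given below; the solubility and loading numbers are invented placeholders, not data from the paper.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# log10 solubility (mg/g) of five drugs in two key excipients (columns),
# and measured log10 loading capacity in a blended LBF (targets)
X = np.array([[1.2, 1.9], [0.3, 0.8], [2.0, 2.4], [0.9, 1.5], [1.6, 2.0]])
y = np.array([1.5, 0.5, 2.2, 1.1, 1.8])

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)

new_drug = np.array([[1.0, 1.4]])   # solubility in the two excipients
print(model.predict(new_drug))      # predicted loading capacity
```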
Mysara, Mohamed; Elhefnawi, Mahmoud; Garibaldi, Jonathan M
2012-06-01
The investigation of small interfering RNA (siRNA) and its posttranscriptional gene regulation has become an extremely important research topic, both for fundamental reasons and for potential longer-term therapeutic benefits. Several factors affect the functionality of siRNA, including positional preferences, target accessibility and other thermodynamic features. State-of-the-art tools aim to optimize the selection of target siRNAs by identifying those that may have high experimental inhibition. Such tools implement artificial neural network models, such as Biopredsi and ThermoComposition21, and linear regression models, such as DSIR, i-Score and Scales, among others. However, all these models have limitations in performance. In this work, a new neural-network-trained siRNA scoring/efficacy prediction model was developed by combining two existing scoring algorithms (ThermoComposition21 and i-Score), together with the whole stacking energy (ΔG), in a multi-layer artificial neural network. These three parameters were chosen after a comparative combinatorial study of five well-known tools. Our model, 'MysiRNA', was trained on 2431 siRNA records and tested using three further datasets. MysiRNA was compared with 11 alternative scoring tools in an evaluation study assessing predicted versus experimental siRNA efficiency, where it achieved the highest performance both in terms of correlation coefficient (R²=0.600) and receiver operating characteristic analysis (AUC=0.808), improving prediction accuracy by up to 18% with respect to the sensitivity and specificity of the best available tools. MysiRNA is a novel, freely accessible model capable of predicting siRNA inhibition efficiency with improved specificity and sensitivity. This multiclassifier approach could help improve the performance of prediction in several bioinformatics areas. The MysiRNA model, part of the MysiRNA-Designer package [1], is expected to play a key role in siRNA selection and evaluation. Copyright © 2012 Elsevier Inc. All rights reserved.
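The architecture idea, a small neural network fed the two existing scores plus the stacking energy, can be sketched as follows. The data here are random stand-ins (the real model was trained on 2431 experimental records), and the hidden-layer size is an assumption.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# features per siRNA: [ThermoComposition21 score, i-Score, stacking dG];
# target: measured inhibition efficiency in [0, 1]
rng = np.random.default_rng(5)
X = rng.normal(size=(500, 3))
y = 1 / (1 + np.exp(-(0.8 * X[:, 0] + 0.6 * X[:, 1] - 0.4 * X[:, 2])))

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
).fit(X, y)
print(model.predict(X[:5]))   # predicted efficiencies for the first 5 siRNAs
```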
Tool use and affordance: Manipulation-based versus reasoning-based approaches.
Osiurak, François; Badets, Arnaud
2016-10-01
Tool use is a defining feature of the human species. Therefore, a fundamental issue is to understand the cognitive bases of human tool use. Given that people cannot use tools without manipulating them, proponents of the manipulation-based approach have argued that tool use might be supported by the simulation of past sensorimotor experiences, also sometimes called affordances. In the meanwhile, however, evidence has accumulated demonstrating the critical role of mechanical knowledge in tool use (i.e., the reasoning-based approach). The major goal of the present article is to examine the validity of the assumptions derived from the manipulation-based versus the reasoning-based approach. To do so, we identified 3 key issues on which the 2 approaches differ, namely, (a) the reference frame issue, (b) the intention issue, and (c) the action domain issue. These issues are addressed in light of studies in experimental psychology and neuropsychology that have provided valuable contributions to the topic (i.e., tool-use interaction, orientation effect, object-size effect, utilization behavior and anarchic hand, tool use and perception, apraxia of tool use, transport vs. use actions). To anticipate our conclusions, the reasoning-based approach seems promising for understanding the current literature, even if it is not fully satisfactory because a certain number of findings are easier to interpret within the manipulation-based approach. A new avenue for future research might be to develop a framework accommodating both approaches, thereby shedding new light on the cognitive bases of human tool use and affordances. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
2017-04-01
A Comparison of Predictive Thermo and Water Solvation Property Prediction Tools and Experimental Data for Selected...
Tools and technologies for expert systems: A human factors perspective
NASA Technical Reports Server (NTRS)
Rajaram, Navaratna S.
1987-01-01
It is widely recognized that technologies based on artificial intelligence (AI), especially expert systems, can make significant contributions to the productivity and effectiveness of operations of information- and knowledge-intensive organizations such as NASA. At the same time, these being relatively new technologies, there is the problem of transferring technology to key personnel of such organizations. The problems of examining the potential of expert systems and of technology transfer are addressed in the context of human factors applications. One of the topics of interest was the investigation of the potential use of expert system building tools, particularly NEXPERT, as a technology transfer medium. Two basic conclusions were reached in this regard. First, NEXPERT is an excellent tool for rapid prototyping of experimental expert systems, but not ideal as a delivery vehicle. Therefore, it is not a substitute for general-purpose system implementation languages such as LISP or C. This assertion probably holds for nearly all such tools on the market today. Second, an effective technology transfer mechanism is to formulate and implement expert systems for problems which members of the organization in question can relate to. For this purpose, the LIghting EnGineering Expert (LIEGE) was implemented using NEXPERT as the tool for technology transfer and to illustrate the value of expert systems to the activities of the Man-System Division.
NASA Astrophysics Data System (ADS)
McEver, Jimmie; Davis, Paul K.; Bigelow, James H.
2000-06-01
We have developed and used families of multiresolution and multiple-perspective models (MRM and MRMPM), both in our substantive analytic work for the Department of Defense and in learning more about how such models can be designed and implemented. This paper is a brief case history of our experience with a particular family of models addressing the use of precision fires in interdicting and halting an invading army. Our models were implemented as closed-form analytic solutions, in spreadsheets, and in the more sophisticated Analytica environment. We also drew on an entity-level simulation for data. The paper reviews the importance of certain key attributes of development environments (visual modeling, interactive languages, friendly use of array mathematics, facilities for experimental design and configuration control, statistical analysis tools, graphical visualization tools, interactive post-processing, and relational database tools). These can go a long way towards facilitating MRMPM work, but many of these attributes are not yet widely available (or available at all) in commercial model-development tools--especially for use with personal computers. We conclude with some lessons learned from our experience.
Kinematics of mechanical and adhesional micromanipulation under a scanning electron microscope
NASA Astrophysics Data System (ADS)
Saito, Shigeki; Miyazaki, Hideki T.; Sato, Tomomasa; Takahashi, Kunio
2002-11-01
In this paper, the kinematics of mechanical and adhesional micromanipulation using a needle-shaped tool under a scanning electron microscope is analyzed. A mode diagram is derived to indicate the possible micro-object behavior for the specified operational conditions. Based on the diagram, a reasonable method for pick-and-place operation is proposed. The keys to successful analysis are to introduce adhesional and rolling-resistance factors into the kinematic system consisting of a sphere, a needle-shaped tool, and a substrate, and to consider the time dependence of these factors due to electron-beam (EB) irradiation. The adhesional force and the lower limit of the maximum rolling resistance are evaluated quantitatively, both theoretically and experimentally. This analysis shows that it is possible to selectively control the fracture of either the tool-sphere or the substrate-sphere interface of the system through the tool-loading angle, and that such selective fracture of the interfaces enables reliable pick or place operations even under EB irradiation. Whereas conventional micromanipulation was not repeatable because it relied on empirically effective methods, this analysis provides a guideline for reliable micromanipulation.
EPA Funding Instruments and Authorities
Key historical information to enhance your knowledge, courtesy of the EPA Grants desktop resource tool at the Office of Grants and Debarment. This useful tool provides information about useful key terms.
Quantum computing on encrypted data
NASA Astrophysics Data System (ADS)
Fisher, K. A. G.; Broadbent, A.; Shalm, L. K.; Yan, Z.; Lavoie, J.; Prevedel, R.; Jennewein, T.; Resch, K. J.
2014-01-01
The ability to perform computations on encrypted data is a powerful tool for protecting privacy. Recently, protocols to achieve this on classical computing systems have been found. Here, we present an efficient solution to the quantum analogue of this problem that enables arbitrary quantum computations to be carried out on encrypted quantum data. We prove that an untrusted server can implement a universal set of quantum gates on encrypted quantum bits (qubits) without learning any information about the inputs, while the client, knowing the decryption key, can easily decrypt the results of the computation. We experimentally demonstrate, using single photons and linear optics, the encryption and decryption scheme on a set of gates sufficient for arbitrary quantum computations. As our protocol requires few extra resources compared with other schemes it can be easily incorporated into the design of future quantum servers. These results will play a key role in enabling the development of secure distributed quantum systems.
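The scheme's key property, that holding the classical key (a, b) suffices to undo the Pauli encryption X^a Z^b, can be sketched with a classical simulation of the standard quantum one-time pad. This is a textbook illustration under simplifying assumptions, not the paper's photonic implementation.

    # Quantum one-time pad sketch: encrypt a qubit with random X^a Z^b.
    import numpy as np

    X = np.array([[0, 1], [1, 0]])
    Z = np.diag([1, -1])

    rng = np.random.default_rng(7)
    a, b = rng.integers(0, 2, size=2)      # secret classical key bits
    psi = np.array([0.6, 0.8])             # plaintext qubit amplitudes

    enc = np.linalg.matrix_power(X, a) @ np.linalg.matrix_power(Z, b) @ psi
    # without (a, b), the encrypted qubit is maximally mixed on average
    dec = np.linalg.matrix_power(Z, b) @ np.linalg.matrix_power(X, a) @ enc
    print(np.allclose(dec, psi))           # True: the key holder decrypts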
Interferometry on a Balloon; Paving the Way for Space-based Interferometers
NASA Technical Reports Server (NTRS)
Rinehart, Stephen A.
2008-01-01
Astronomical studies at infrared wavelengths have dramatically improved our understanding of the universe, and observations with Spitzer, the upcoming Herschel mission, and SOFIA will continue to provide exciting new discoveries. The relatively low angular resolution of these missions, however, is insufficient to resolve the physical scale on which mid-to-far-infrared emission arises, resulting in source and structure ambiguities that limit our ability to answer key science questions. Interferometry enables high angular resolution at these wavelengths, a powerful tool for scientific discovery. We will build the Balloon Experimental Twin Telescope for Infrared Interferometry (BETTII), an eight-meter baseline Michelson stellar interferometer to fly on a high-altitude balloon. BETTII's spectral-spatial capability, provided by an instrument using double-Fourier techniques, will address key questions about the nature of disks in young star clusters and active galactic nuclei and the envelopes of evolved stars. BETTII will also lay the technological groundwork for future space interferometers.
Nilsson, Ingemar; Polla, Magnus O
2012-10-01
Drug design is a multi-parameter task, present both in the analysis of experimental data for synthesized compounds and in the prediction of new compounds with desired properties. This article describes the implementation of a binned scoring and composite ranking scheme for 11 experimental parameters that were identified as key drivers in the MC4R project. The composite ranking scheme was implemented in an AstraZeneca tool for analysis of project data, thereby providing an immediate re-ranking as new experimental data were added. The automated ranking also highlighted compounds overlooked by the project team. The successful implementation of a composite ranking on experimental data led to the development of an equivalent virtual score, based on Free-Wilson models of the parameters from the experimental ranking. The individual Free-Wilson models showed good to high predictive power, with correlation coefficients between 0.45 and 0.97 on the external test set. The virtual ranking adds value to the selection of compounds for synthesis, but error propagation must be controlled. The experimental ranking approach adds significant value, is parameter independent and can be tuned and applied to any drug discovery project.
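A minimal sketch of how a binned scoring and composite ranking scheme can work is given below; the parameter names, bin edges and scores are invented for illustration and are not the MC4R project's actual settings.

    # Hedged sketch: bin each parameter into a score, sum into a composite rank.
    import pandas as pd

    compounds = pd.DataFrame({
        "potency_nM":    [12, 150, 45],
        "solubility_uM": [80, 310, 15],
        "clearance":     [9, 22, 35],
    }, index=["cpd1", "cpd2", "cpd3"])

    bins = {  # (edges, scores): low potency/clearance and high solubility score best
        "potency_nM":    ([0, 50, 500, float("inf")], [3, 2, 1]),
        "solubility_uM": ([0, 50, 200, float("inf")], [1, 2, 3]),
        "clearance":     ([0, 15, 30, float("inf")], [3, 2, 1]),
    }

    scores = pd.DataFrame({
        col: pd.cut(compounds[col], edges, labels=labels).astype(int)
        for col, (edges, labels) in bins.items()
    })
    compounds["composite"] = scores.sum(axis=1)    # higher is better
    print(compounds.sort_values("composite", ascending=False))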
The SeaHorn Verification Framework
NASA Technical Reports Server (NTRS)
Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.
2015-01-01
In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state of the art in software model checking and abstract interpretation for verification, and (d) uses Horn clauses as an intermediate language to represent verification conditions, which simplifies interfacing with multiple verification tools based on Horn clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
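To make point (d) concrete, here is a toy example, not SeaHorn's actual encoding, of expressing a loop's safety as constrained Horn clauses and solving them with Z3's fixedpoint engine, where the unknown relation inv plays the role of a loop invariant.

    # Horn-clause sketch for: x = 0; while (x < 10) x++; assert(x <= 10);
    from z3 import Fixedpoint, Function, IntSort, BoolSort, Int, And

    fp = Fixedpoint()
    inv = Function('inv', IntSort(), BoolSort())
    x, x1 = Int('x'), Int('x1')
    fp.register_relation(inv)
    fp.declare_var(x, x1)

    fp.rule(inv(0))                                   # init: x = 0
    fp.rule(inv(x1), [inv(x), x < 10, x1 == x + 1])   # loop body
    print(fp.query(And(inv(x), x > 10)))              # unsat => assertion holds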
Quantum random oracle model for quantum digital signature
NASA Astrophysics Data System (ADS)
Shang, Tao; Lei, Qi; Liu, Jianwei
2016-10-01
The goal of this work is to provide a general security analysis tool, namely, the quantum random oracle (QRO), for facilitating the security analysis of quantum cryptographic protocols, especially protocols based on quantum one-way function. QRO is used to model quantum one-way function and different queries to QRO are used to model quantum attacks. A typical application of quantum one-way function is the quantum digital signature, whose progress has been hampered by the slow pace of the experimental realization. Alternatively, we use the QRO model to analyze the provable security of a quantum digital signature scheme and elaborate the analysis procedure. The QRO model differs from the prior quantum-accessible random oracle in that it can output quantum states as public keys and give responses to different queries. This tool can be a test bed for the cryptanalysis of more quantum cryptographic protocols based on the quantum one-way function.
Error-related negativity varies with the activation of gender stereotypes.
Ma, Qingguo; Shu, Liangchao; Wang, Xiaoyi; Dai, Shenyi; Che, Hongmin
2008-09-19
The error-related negativity (ERN) has been suggested to reflect the response-performance monitoring process. The purpose of this study is to investigate how the activation of gender stereotypes influences the ERN. Twenty-eight male participants were asked to complete a tool or kitchenware identification task. The prime stimulus was a picture of a male or female face and the target stimulus was either a kitchen utensil or a hand tool. The ERN amplitude on male-kitchenware trials was significantly larger than that on female-kitchenware trials, which reveals the low-level, automatic activation of gender stereotypes. The ERN elicited in this task has two sources--operation errors and the conflict between gender stereotype activation and non-prejudice beliefs--and gender stereotype activation may be the key factor leading to this difference in ERN. In other words, stereotype activation in this experimental paradigm may be indexed by the ERN.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, A. J.; Fanning, T. H.
The United States has extensive experience with the design, construction, and operation of sodium cooled fast reactors (SFRs) over the last six decades. Despite the closure of various facilities, the U.S. continues to dedicate research and development (R&D) efforts to the design of innovative experimental, prototype, and commercial facilities. Accordingly, in support of the rich operating history and ongoing design efforts, the U.S. has been developing and maintaining a series of tools with capabilities that envelope all facets of SFR design and safety analyses. This paper provides an overview of the current U.S. SFR analysis toolset, including codes such as SAS4A/SASSYS-1, MC2-3, SE2-ANL, PERSENT, NUBOW-3D, and LIFE-METAL, as well as the higher-fidelity tools (e.g. PROTEUS) being integrated into the toolset. Current capabilities of the codes are described and key ongoing development efforts are highlighted for some codes.
Toward Personalized Control of Human Gut Bacterial Communities.
David, Lawrence A
2018-01-01
A key challenge in microbiology will be developing tools for manipulating human gut bacterial communities. Our ability to predict and control the dynamics of these communities is now in its infancy. To manage human gut microbiota, I am developing methods in three research domains. First, I am refining in vitro tools to experimentally study gut microbes at high throughput and in controlled settings. Second, I am adapting "big data" techniques to overcome statistical challenges confronting microbiota modeling. Third, I am testing study designs that can streamline human testing of microbiota manipulations. Assembling these methods creates new challenges, including training scientists who can work across disciplines such as engineering, ecology, and medicine. Nevertheless, I envision that overcoming these obstacles will enable my group to construct platforms that can personalize microbiota treatments, particularly ones based on diet. More broadly, I anticipate that such platforms will have applications across fields such as agriculture, biotechnology, and environmental management.
KeyWare: an open wireless distributed computing environment
NASA Astrophysics Data System (ADS)
Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir
1995-12-01
Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist in LAN-based applications. A wireless distributed computing environment (KeyWare™) based on intelligent agents within a multiple-client multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline nodes facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.
2009-06-01
Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning
Powell, Walter A.
Typically, the development of tools and systems for the military is requirement driven; systems are developed to meet
HyQue: evaluating hypotheses using Semantic Web technologies.
Callahan, Alison; Dumontier, Michel; Shah, Nigam H
2011-05-17
Key to the success of e-Science is the ability to computationally evaluate expert-composed hypotheses for validity against experimental data. Researchers face the challenge of collecting, evaluating and integrating large amounts of diverse information to compose and evaluate a hypothesis. Confronted with rapidly accumulating data, researchers currently do not have the software tools to undertake the required information integration tasks. We present HyQue, a Semantic Web tool for querying scientific knowledge bases with the purpose of evaluating user submitted hypotheses. HyQue features a knowledge model to accommodate diverse hypotheses structured as events and represented using Semantic Web languages (RDF/OWL). Hypothesis validity is evaluated against experimental and literature-sourced evidence through a combination of SPARQL queries and evaluation rules. Inference over OWL ontologies (for type specifications, subclass assertions and parthood relations) and retrieval of facts stored as Bio2RDF linked data provide support for a given hypothesis. We evaluate hypotheses of varying levels of detail about the genetic network controlling galactose metabolism in Saccharomyces cerevisiae to demonstrate the feasibility of deploying such semantic computing tools over a growing body of structured knowledge in Bio2RDF. HyQue is a query-based hypothesis evaluation system that can currently evaluate hypotheses about the galactose metabolism in S. cerevisiae. Hypotheses as well as the supporting or refuting data are represented in RDF and directly linked to one another allowing scientists to browse from data to hypothesis and vice versa. HyQue hypotheses and data are available at http://semanticscience.org/projects/hyque.
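A tiny sketch of the underlying mechanism, checking one hypothesized event against RDF-encoded evidence with a SPARQL ASK query, is shown below using rdflib; the triples and URIs are invented and much simpler than HyQue's actual knowledge model.

    # Hedged sketch: does the evidence graph support 'GAL4 activates GAL1'?
    from rdflib import Graph

    g = Graph()
    g.parse(data="""
    @prefix ex: <http://example.org/> .
    ex:GAL4 ex:activates ex:GAL1 .
    """, format="turtle")

    res = g.query("""
    PREFIX ex: <http://example.org/>
    ASK { ex:GAL4 ex:activates ex:GAL1 }
    """)
    print(res.askAnswer)   # True: the hypothesized event is supported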
PyRhO: A Multiscale Optogenetics Simulation Platform
Evans, Benjamin D.; Jarvis, Sarah; Schultz, Simon R.; Nikolic, Konstantin
2016-01-01
Optogenetics has become a key tool for understanding the function of neural circuits and controlling their behavior. An array of directly light driven opsins have been genetically isolated from several families of organisms, with a wide range of temporal and spectral properties. In order to characterize, understand and apply these opsins, we present an integrated suite of open-source, multi-scale computational tools called PyRhO. The purpose of developing PyRhO is three-fold: (i) to characterize new (and existing) opsins by automatically fitting a minimal set of experimental data to three-, four-, or six-state kinetic models, (ii) to simulate these models at the channel, neuron and network levels, and (iii) provide functional insights through model selection and virtual experiments in silico. The module is written in Python with an additional IPython/Jupyter notebook based GUI, allowing models to be fit, simulations to be run and results to be shared through simply interacting with a webpage. The seamless integration of model fitting algorithms with simulation environments (including NEURON and Brian2) for these virtual opsins will enable neuroscientists to gain a comprehensive understanding of their behavior and rapidly identify the most suitable variant for application in a particular biological system. This process may thereby guide not only experimental design and opsin choice but also alterations of the opsin genetic code in a neuro-engineering feed-back loop. In this way, we expect PyRhO will help to significantly advance optogenetics as a tool for transforming biological sciences. PMID:27148037
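As a flavor of what such kinetic models look like, below is a minimal three-state opsin model (closed/open/desensitized) integrated with SciPy; the rate constants are invented placeholders, not fitted PyRhO parameters.

    # Hedged sketch: three-state photocycle C -> O -> D -> C under illumination.
    from scipy.integrate import solve_ivp

    def photocycle(t, y, Ga=0.5, Gd=0.1, Gr=0.02):   # rates in 1/ms, illustrative
        C, O, D = y
        return [Gr * D - Ga * C,     # C -> O driven by light
                Ga * C - Gd * O,     # O -> D desensitization
                Gd * O - Gr * D]     # D -> C slow recovery

    sol = solve_ivp(photocycle, (0, 200), [1.0, 0.0, 0.0])
    print(sol.y[1, -1])   # steady open-state fraction ~ photocurrent amplitude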
Han, Seong Kyu; Lee, Dongyeop; Lee, Heetak; Kim, Donghyo; Son, Heehwa G; Yang, Jae-Seong; Lee, Seung-Jae V; Kim, Sanguk
2016-08-30
Online application for survival analysis (OASIS) has served as a popular and convenient platform for the statistical analysis of various survival data, particularly in the field of aging research. With the recent advances in the fields of aging research that deal with complex survival data, we noticed a need for updates to the current version of OASIS. Here, we report OASIS 2 (http://sbi.postech.ac.kr/oasis2), which provides extended statistical tools for survival data and an enhanced user interface. In particular, OASIS 2 enables the statistical comparison of maximal lifespans, which is potentially useful for determining key factors that limit the lifespan of a population. Furthermore, OASIS 2 provides statistical and graphical tools that compare values in different conditions and times. That feature is useful for comparing age-associated changes in physiological activities, which can be used as indicators of "healthspan." We believe that OASIS 2 will serve as a standard platform for survival analysis with advanced and user-friendly statistical tools for experimental biologists in the field of aging research.
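The core statistical operation such platforms automate can be sketched in a few lines; the lifespan values below are invented, and the lifelines package stands in for OASIS 2's server-side statistics.

    # Hedged sketch: log-rank comparison of two lifespan samples.
    from lifelines.statistics import logrank_test

    control = [18, 20, 21, 22, 25, 26, 28]   # lifespans in days, illustrative
    treated = [22, 24, 27, 29, 30, 33, 35]
    res = logrank_test(control, treated)
    print(res.p_value)   # small p-value => the survival curves differ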
Lesaint, Florian; Sigaud, Olivier; Khamassi, Mehdi
2014-01-01
Animals, including humans, are prone to develop persistent maladaptive and suboptimal behaviours. Some of these behaviours have been suggested to arise from interactions between brain systems of Pavlovian conditioning, the acquisition of responses to initially neutral stimuli previously paired with rewards, and instrumental conditioning, the acquisition of active behaviours leading to rewards. However, the mechanics of these systems and their interactions are still unclear. While extensively studied independently, few models have been developed to account for these interactions. In some experiments, pigeons have been observed to display a maladaptive behaviour suggested to involve conflicts between Pavlovian and instrumental conditioning. In a procedure referred to as negative automaintenance, a key light is paired with the subsequent delivery of food; however, any peck towards the key light results in the omission of the reward. Studies showed that in such a procedure some pigeons persisted in pecking to a substantial level despite its negative consequence, while others learned to refrain from pecking and maximized their cumulative rewards. Furthermore, the pigeons that were unable to refrain from pecking could nevertheless shift their pecks towards a harmless alternative key light. We confronted a computational model that combines dual-learning systems and factored representations, recently developed to account for sign-tracking and goal-tracking behaviours in rats, with these negative automaintenance experimental data. We show that it can explain the variability of the observed behaviours and the capacity of alternative key lights to distract pigeons from their detrimental behaviours. These results confirm the proposed model as an interesting tool for reproducing experiments that could involve interactions between Pavlovian and instrumental conditioning. The model allows us to draw predictions that may be experimentally verified, which could help further investigate the neural mechanisms underlying these interactions. PMID:25347531
Optimization Techniques for Design Problems in Selected Areas in WSNs: A Tutorial
Ibrahim, Ahmed; Alfa, Attahiru
2017-01-01
This paper is intended to serve as an overview of, and mostly a tutorial to illustrate, the optimization techniques used in several different key design aspects that have been considered in the literature on wireless sensor networks (WSNs). It targets researchers who are new to the mathematical optimization tool and wish to apply it to WSN design problems. We hence divide the paper into two main parts. One part is dedicated to introducing optimization theory and an overview of some of its techniques that could be helpful in design problems in WSNs. In the second part, we present a number of design aspects that we came across in the WSN literature in which mathematical optimization methods have been used in the design. For each design aspect, a key paper is selected, and for each we explain the formulation techniques and the solution methods implemented. We also provide in-depth analyses and assessments of the problem formulations, the corresponding solution techniques and experimental procedures in some of these papers. The analyses and assessments, which are provided in the form of comments, are meant to reflect the points that we believe should be taken into account when using optimization as a tool for design purposes. PMID:28763039
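As a concrete taste of the kind of formulation the tutorial covers, the sketch below casts minimum-energy sensor activation under coverage constraints as a small integer linear program; the instance and costs are invented for illustration.

    # Hedged sketch: activate a cheapest subset of sensors covering all targets.
    import numpy as np
    from scipy.optimize import milp, LinearConstraint, Bounds

    energy = np.array([3.0, 2.0, 4.0])        # activation cost per sensor
    cover = np.array([[1, 0, 1],              # cover[i, j] = 1 if sensor j
                      [1, 1, 0],              # covers target i
                      [0, 1, 1]])

    res = milp(c=energy,
               constraints=LinearConstraint(cover, lb=1),  # each target covered
               integrality=np.ones(3),
               bounds=Bounds(0, 1))
    print(res.x)   # e.g. [1, 1, 0]: sensors 0 and 1 suffice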
The WEIZMASS spectral library for high-confidence metabolite identification
NASA Astrophysics Data System (ADS)
Shahaf, Nir; Rogachev, Ilana; Heinig, Uwe; Meir, Sagit; Malitsky, Sergey; Battat, Maor; Wyner, Hilary; Zheng, Shuning; Wehrens, Ron; Aharoni, Asaph
2016-08-01
Annotation of metabolites is an essential, yet problematic, aspect of mass spectrometry (MS)-based metabolomics assays. The current repertoire of definitive annotations of metabolite spectra in public MS databases is limited and suffers from lack of chemical and taxonomic diversity. Furthermore, the heterogeneity of the data prevents the development of universally applicable metabolite annotation tools. Here we present a combined experimental and computational platform to advance this key issue in metabolomics. WEIZMASS is a unique reference metabolite spectral library developed from high-resolution MS data acquired from a structurally diverse set of 3,540 plant metabolites. We also present MatchWeiz, a multi-module strategy using a probabilistic approach to match library and experimental data. This strategy allows efficient and high-confidence identification of dozens of metabolites in model and exotic plants, including metabolites not previously reported in plants or found in few plant species to date.
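While MatchWeiz itself is probabilistic and multi-module, the basic operation of matching an experimental spectrum against a library entry can be illustrated with a simple binned cosine similarity; the peak lists and bin width below are invented.

    # Hedged sketch: cosine match between a query and a library MS/MS spectrum.
    import numpy as np

    def to_vector(peaks, bin_width=0.01, max_mz=500.0):
        v = np.zeros(int(max_mz / bin_width))
        for mz, intensity in peaks:
            v[int(mz / bin_width)] += intensity   # bin peaks on m/z
        return v

    library = [(121.05, 1.0), (149.06, 0.4), (177.05, 0.2)]
    query   = [(121.05, 0.9), (149.06, 0.5)]

    a, b = to_vector(library), to_vector(query)
    print(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))   # ~1 => good match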
NASA Astrophysics Data System (ADS)
Yamazaki, Kenji; Maehara, Yosuke; Gohara, Kazutoshi
2018-06-01
The number of layers affects the electronic properties of graphene owing to its unique band structure, called the Dirac cone. Raman spectroscopy is a key diagnostic tool for identifying the number of graphene layers and for determining their physical properties. Here, we observed moiré structures in transmission electron microscopy (TEM) observations; these are signature patterns of multilayers, although the Raman spectra showed a 2D/G peak intensity typical of a monolayer. We also performed a multi-slice TEM image simulation to compare the 3D atomic structures of the two graphene membranes with experimental TEM images. We found that the experimental moiré image corresponded to a 9-12 Å interlayer distance between the graphene membranes. This structure was produced by transferring CVD-grown graphene films that formed on both sides of the Cu substrate at once.
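For orientation, the relation between twist angle and moiré period for two identical lattices is standard geometry (not a result specific to this paper) and gives a feel for the length scales involved:

    # Moire period L = a / (2 sin(theta/2)) for twist angle theta.
    import numpy as np

    a = 0.246   # graphene lattice constant in nm
    for theta_deg in (1.1, 5.0, 10.0):
        theta = np.radians(theta_deg)
        L = a / (2 * np.sin(theta / 2))
        print(f"{theta_deg:5.1f} deg -> moire period {L:6.2f} nm")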
Micro- and nano-mechanics in China: A brief review of recent progress and perspectives
NASA Astrophysics Data System (ADS)
Xu, ZhiPing; Zheng, QuanShui
2018-07-01
The past three decades have witnessed the explosion of nanoscience and technology, where notable research efforts have been made in synthesizing nanomaterials and controlling nanostructures of bulk materials. The uncovered mechanical behaviors of structures and materials with reduced sizes and dimensions pose open questions to the community of mechanicians, which expand the framework of continuum mechanics by advancing the theory, as well as modeling and experimental tools. Researchers in China have been actively involved into this exciting area, making remarkable contributions to the understanding of nanoscale mechanical processes, the development of multi-scale, multi-field modeling and experimental techniques to resolve the processing-microstructures-properties relationship of materials, and the interdisciplinary studies that broaden the subjects of mechanics. This article reviews selected progress made by this community, with the aim to clarify the key concepts, methods and applications of micro- and nano-mechanics, and to outline the perspectives in this fast-evolving field.
NASA Astrophysics Data System (ADS)
Huang, Yuanyuan; Hou, Panyu; Yuan, Xinxing; Chang, Xiuying; Zu, Chong; He, Li; Duan, Luming (Center for Quantum Information, IIIS, Tsinghua University, Beijing 100084, PR China; Department of Physics, University of Michigan, Ann Arbor, Michigan 48109, USA)
2016-05-01
Quantum teleportation is of great importance to various quantum technologies, and has been realized between light beams, trapped atoms, superconducting qubits, and defect spins in solids. Here we report an experimental demonstration of quantum teleportation from light beams to vibrational states of a macroscopic diamond under ambient conditions. In our experiment, ultrafast laser technology provides the key tool for fast processing and detection of quantum states within their short lifetime in macroscopic objects consisting of many strongly interacting atoms coupled to the environment. We demonstrate an average teleportation fidelity of (90.6 ± 1.0)%, clearly exceeding the classical limit of 2/3. Quantum control of the optomechanical coupling may provide efficient ways to realize transduction of quantum signals, processing of quantum information, and sensing of small mechanical vibrations.
High Level Analysis, Design and Validation of Distributed Mobile Systems with CoreASM
NASA Astrophysics Data System (ADS)
Farahbod, R.; Glässer, U.; Jackson, P. J.; Vajihollahi, M.
System design is a creative activity calling for abstract models that facilitate reasoning about the key system attributes (desired requirements and resulting properties) so as to ensure these attributes are properly established prior to actually building a system. We explore here the practical side of using the abstract state machine (ASM) formalism in combination with the CoreASM open source tool environment for high-level design and experimental validation of complex distributed systems. Emphasizing the early phases of the design process, a guiding principle is to support freedom of experimentation by minimizing the need for encoding. CoreASM has been developed and tested building on a broad scope of applications, spanning computational criminology, maritime surveillance and situation analysis. We critically reexamine here the CoreASM project in light of three different application scenarios.
Neural-network quantum state tomography
NASA Astrophysics Data System (ADS)
Torlai, Giacomo; Mazzola, Guglielmo; Carrasquilla, Juan; Troyer, Matthias; Melko, Roger; Carleo, Giuseppe
2018-05-01
The experimental realization of increasingly complex synthetic quantum systems calls for the development of general theoretical methods to validate and fully exploit quantum resources. Quantum state tomography (QST) aims to reconstruct the full quantum state from simple measurements, and therefore provides a key tool to obtain reliable analytics [1-3]. However, exact brute-force approaches to QST place a high demand on computational resources, making them unfeasible for anything except small systems [4,5]. Here we show how machine learning techniques can be used to perform QST of highly entangled states with more than a hundred qubits, to a high degree of accuracy. We demonstrate that machine learning allows one to reconstruct traditionally challenging many-body quantities—such as the entanglement entropy—from simple, experimentally accessible measurements. This approach can benefit existing and future generations of devices ranging from quantum computers to ultracold-atom quantum simulators [6-8].
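For contrast with the machine-learning approach, the brute-force reconstruction is easy to state for a single qubit, where the density matrix follows directly from the three Pauli expectation values; it is the exponential growth of this procedure with qubit number that motivates the neural-network ansatz. The measured values below are illustrative.

    # Linear-inversion tomography of one qubit: rho = (I + r . sigma) / 2.
    import numpy as np

    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.diag([1.0 + 0j, -1.0])

    rx, ry, rz = 0.70, 0.00, 0.70          # measured <X>, <Y>, <Z>
    rho = 0.5 * (np.eye(2) + rx * sx + ry * sy + rz * sz)
    print(np.linalg.eigvalsh(rho))         # nonnegative => physical state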
NASA Astrophysics Data System (ADS)
Farroni, Flavio
2016-05-01
The most powerful engine, the most sophisticated aerodynamic devices or the most complex control systems will not improve vehicle performance if the forces exchanged with the road are not optimized by proper employment and knowledge of the tires. The vehicle's interface with the ground consists of a few small contact patches, each about as wide as a palm, in which the tire/road interaction forces are exchanged. From this it is clear that the optimization of tire behavior is a key factor in the definition of the best setup of the whole vehicle. Nowadays, people and companies playing a role in the automotive sector are looking for the optimal solution to model and understand tire behavior in both experimental and simulation environments. The studies carried out and the tool developed herein demonstrate a new approach to tire characterization and vehicle simulation procedures. This enables the reproduction of the dynamic response of a tire through the use of specific track sessions, carried out with the aim of employing the vehicle as a moving lab. The final product, named the TRICK tool (Tire/Road Interaction Characterization and Knowledge), comprises a vehicle model which processes experimental signals acquired from the vehicle CAN bus and from additional sideslip-angle estimation instrumentation. The output of the tool is several extra "virtual telemetry" channels, based on the time history of the acquired signals and containing force and slip estimations, useful for providing tire interaction characteristics. TRICK results can be integrated with the physical models developed by the Vehicle Dynamics UniNa research group, providing a multitude of working solutions and constituting an ideal instrument for the prediction and simulation of real tire dynamics.
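One example of a "virtual telemetry" quantity that can be derived from standard CAN-bus signals is the longitudinal slip ratio per wheel; the sketch below uses invented signal values and is not TRICK's actual channel definition.

    # Hedged sketch: longitudinal slip ratio from wheel and vehicle speeds.
    import numpy as np

    v = 27.8                                       # vehicle speed, m/s
    omega = np.array([88.0, 88.5, 92.0, 91.5])     # wheel speeds, rad/s
    r_eff = 0.31                                   # effective rolling radius, m

    slip = (omega * r_eff - v) / max(v, 1e-3)
    print(slip)   # positive under traction, negative under braking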
Lakshminarayana, Indumathy; Wall, David; Bindal, Taruna; Goodyear, Helen M
2015-05-01
Leading a ward round is an essential skill for hospital consultants and senior trainees but is rarely assessed during training. To investigate the key attributes of ward round leadership and to use these results to develop a multisource feedback (MSF) tool to assess the ward round leadership skills of senior specialist trainees. A panel of experts comprising four senior paediatric consultants and two nurse managers was interviewed from May to August 2009. From analysis of the interview transcripts, 10 key themes emerged. A structured questionnaire based on the key themes was designed and sent electronically to paediatric consultants, nurses and trainees at a large university hospital (June-October 2010). 81 consultants, nurses and trainees responded to the survey. The internal consistency of this tool was high (Cronbach's α 0.95). Factor analysis showed that five factors accounted for 72% of variance. The five key areas for ward round leadership were communication skills, preparation and organisation, teaching and enthusiasm, team working and punctuality; communication was the most important key theme. An MSF tool for ward round leadership skills was developed with these areas as five domains. We believe that this tool will add to the current assessment tools available by providing feedback about ward round leadership skills.
2011-01-01
Background To make sense out of gene expression profiles, such analyses must be pushed beyond the mere listing of affected genes. For example, if a group of genes persistently displays similar changes in expression levels under particular experimental conditions, and the proteins encoded by these genes interact and function in the same cellular compartments, this could be taken as a very strong indicator of co-regulated protein complexes. One of the key requirements is having appropriate tools to detect such regulatory patterns. Results We have analyzed the global adaptations in gene expression patterns in the budding yeast when the Hsp90 molecular chaperone complex is perturbed either pharmacologically or genetically. We integrated these results with publicly accessible expression, protein-protein interaction and intracellular localization data. Most importantly, all experimental conditions were simultaneously and dynamically visualized with an animation. This critically facilitated the detection of patterns of gene expression changes that suggested underlying regulatory networks which a standard analysis by pairwise comparison and clustering could not have revealed. Conclusions The results of the animation-assisted detection of changes in gene regulatory patterns make predictions about the potential roles of Hsp90 and its co-chaperone p23 in regulating whole sets of genes. The simultaneous dynamic visualization of microarray experiments, represented in networks built by integrating one's own experimental data with publicly accessible data, represents a powerful discovery tool that allows the generation of new interpretations and hypotheses. PMID:21672238
Thermodynamic fingerprints of non-Markovianity in a system of coupled superconducting qubits
NASA Astrophysics Data System (ADS)
Hamedani Raja, Sina; Borrelli, Massimo; Schmidt, Rebecca; Pekola, Jukka P.; Maniscalco, Sabrina
2018-03-01
The exploitation and characterization of memory effects arising from the interaction between system and environment is a key prerequisite for quantum reservoir engineering beyond the standard Markovian limit. In this paper we investigate a prototype of non-Markovian dynamics experimentally implementable with superconducting qubits. We rigorously quantify non-Markovianity, highlighting the effects of the environmental temperature on the Markovian to non-Markovian crossover. We investigate how memory effects influence, and specifically suppress, the ability to perform work on the driven qubit. We show that the average work performed on the qubit can be used as a diagnostic tool to detect the presence or absence of memory effects.
Ribosome profiling reveals the what, when, where and how of protein synthesis.
Brar, Gloria A; Weissman, Jonathan S
2015-11-01
Ribosome profiling, which involves the deep sequencing of ribosome-protected mRNA fragments, is a powerful tool for globally monitoring translation in vivo. The method has facilitated discovery of the regulation of gene expression underlying diverse and complex biological processes, of important aspects of the mechanism of protein synthesis, and even of new proteins, by providing a systematic approach for experimental annotation of coding regions. Here, we introduce the methodology of ribosome profiling and discuss examples in which this approach has been a key factor in guiding biological discovery, including its prominent role in identifying thousands of novel translated short open reading frames and alternative translation products.
Sathyasaikumar, Korrapati V; Breda, Carlo; Schwarcz, Robert; Giorgini, Flaviano
2018-01-01
The link between disturbances in kynurenine pathway (KP) metabolism and Huntington's disease (HD) pathogenesis has been explored for a number of years. Several novel genetic and pharmacological tools have recently been developed to modulate key regulatory steps in the KP such as the reaction catalyzed by the enzyme kynurenine 3-monooxygenase (KMO). This insight has offered new options for exploring the mechanistic link between this metabolic pathway and HD, and provided novel opportunities for the development of candidate drug-like compounds. Here, we present an overview of the field, focusing on some novel approaches for interrogating the pathway experimentally.
Fourteen Years of R/qtl: Just Barely Sustainable
Broman, Karl W.
2014-01-01
R/qtl is an R package for mapping quantitative trait loci (genetic loci that contribute to variation in quantitative traits) in experimental crosses. Its development began in 2000. There have been 38 software releases since 2001. The latest release contains 35k lines of R code and 24k lines of C code, plus 15k lines of code for the documentation. Challenges in the development and maintenance of the software are discussed. A key to the success of R/qtl is that it remains a central tool for the chief developer's own research work, and so its maintenance is of selfish importance. PMID:25364504
Simplified models for dark matter face their consistent completions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonçalves, Dorival; Machado, Pedro A. N.; No, Jose Miguel
Simplified dark matter models have been recently advocated as a powerful tool to exploit the complementarity between dark matter direct detection, indirect detection and LHC experimental probes. Focusing on pseudoscalar mediators between the dark and visible sectors, we show that the simplified dark matter model phenomenology departs significantly from that of consistent SU(2)_L × U(1)_Y gauge-invariant completions. We discuss the key physics simplified models fail to capture, and its impact on LHC searches. Notably, we show that resonant mono-Z searches provide competitive sensitivities to standard mono-jet analyses at the 13 TeV LHC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarchalski, M.; Pytel, K.; Wroblewska, M.
2015-07-01
Precise computational determination of nuclear heating, which consists predominantly of gamma heating (more than 80%), is one of the challenges in material testing reactor exploitation. Due to the sophisticated construction and conditions of the experimental programs planned in JHR, it became essential to use the most accurate and precise gamma heating model available. Before the JHR starts to operate, gamma heating evaluation methods need to be developed and qualified in other experimental reactor facilities. This is done inter alia using the OSIRIS, MINERVE or EOLE research reactors in France. Furthermore, MARIA - the Polish material testing reactor - has been chosen to contribute to the qualification of gamma heating calculation schemes/tools. This reactor has some characteristics close to those of JHR (beryllium usage, fuel element geometry). To evaluate gamma heating in the JHR and MARIA reactors, both simulation tools and an experimental program have been developed and performed. For gamma heating simulation, a new calculation scheme and gamma heating model of MARIA have been developed using the TRIPOLI4 and APOLLO2 codes. Calculation outcomes have been verified by comparison to experimental measurements in the MARIA reactor. To obtain more precise results, the model of MARIA in TRIPOLI4 has been built using the whole geometry of the core. This has been done for the first time in the history of the MARIA reactor and was complex due to the cut-cone shape of all its elements. Material compositions of burnt fuel elements have been implemented from APOLLO2 calculations. An experiment for nuclear heating measurements and calculation verification was performed in September 2014. This involved neutron, photon and nuclear heating measurements at selected locations in the MARIA reactor using, in particular, a Rh SPND, an Ag SPND, an ionization chamber (all three from CEA), the KAROLINA calorimeter (NCBJ) and a gamma thermometer (CEA/SCK CEN). Measurements were done at forty points using four channels. The maximal nuclear heating evaluated from the measurements is of the order of 2.5 W/g at half of the possible MARIA power, 15 MW. The approach and the detailed program for experimental verification of calculations will be presented. The following points will be discussed: development of a gamma heating model of the MARIA reactor with TRIPOLI4 (coupled neutron-photon mode) and an APOLLO2 model taking into account key parameters (configuration of the core, experimental loading, control rod location, reactor power, fuel depletion); design of specific measurement tools for the MARIA experiments, including a new single-cell calorimeter called the KAROLINA calorimeter; description of the MARIA experimental program and a preliminary analysis of results; comparison of calculations for the JHR and MARIA cores with experimental verification analysis, calculation behavior and n-γ 'environments'. (authors)
The use of clinical trials in comparative effectiveness research on mental health
Blanco, Carlos; Rafful, Claudia; Olfson, Mark
2013-01-01
Objectives A large body of research on comparative effectiveness research (CER) focuses on the use of observational and quasi-experimental approaches. We sought to examine the use of clinical trials as a tool for CER, particularly in mental health. Study Design and Setting Examination of three ongoing randomized clinical trials in psychiatry that address issues which would pose difficulties for non-experimental CER methods. Results Existing statistical approaches to non-experimental data appear insufficient to compensate for biases that may arise when the pattern of missing data cannot be properly modeled such as when there are no standards for treatment, when affected populations have limited access to treatment, or when there are high rates of treatment dropout. Conclusions Clinical trials should retain an important role in CER, particularly in cases of high disorder prevalence, large expected effect sizes, difficult to reach populations or when examining sequential treatments or stepped-care algorithms. Progress in CER in mental health will require careful consideration of appropriate selection between clinical trials and non-experimental designs and on allocation of research resources to optimally inform key treatment decisions for each individual patient. PMID:23849150
ATtRACT-a database of RNA-binding proteins and associated motifs.
Giudice, Girolamo; Sánchez-Cabo, Fátima; Torroja, Carlos; Lara-Pezzi, Enrique
2016-01-01
RNA-binding proteins (RBPs) play a crucial role in key cellular processes, including RNA transport, splicing, polyadenylation and stability. Understanding the interaction between RBPs and RNA is key to improving our knowledge of RNA processing, localization and regulation in a global manner. Despite advances in recent years, a unified non-redundant resource that includes information on experimentally validated motifs, RBPs and integrated tools to exploit this information has been lacking. Here, we developed a database named ATtRACT (available at http://attract.cnic.es) that compiles information on 370 RBPs and 1583 RBP consensus binding motifs, 192 of which are not present in any other database. To populate ATtRACT we (i) extracted and hand-curated experimentally validated data from the CISBP-RNA, SpliceAid-F and RBPDB databases, (ii) integrated and updated the unavailable ASD database and (iii) extracted information from protein-RNA complexes present in the Protein Data Bank through computational analyses. ATtRACT also provides efficient algorithms to search for a specific motif and scan one or more RNA sequences at a time. It also allows discovering de novo motifs enriched in a set of related sequences and comparing them with the motifs included in the database. Database URL: http://attract.cnic.es
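The scanning functionality can be illustrated with a one-motif toy example; the consensus string and RNA sequence below are invented, and real ATtRACT motifs are position-specific rather than exact strings.

    # Hedged sketch: find (possibly overlapping) occurrences of one RNA motif.
    import re

    motif = "UGCAUG"                        # an invented exact consensus
    rna = "ACGUUGCAUGGAUUUGCAUGCA"
    hits = [m.start() for m in re.finditer(f"(?={motif})", rna)]
    print(hits)                             # start positions of each match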
DOE Office of Scientific and Technical Information (OSTI.GOV)
Im, Piljae; Bhandari, Mahabir S.; New, Joshua Ryan
This document describes the Oak Ridge National Laboratory (ORNL) multiyear experimental plan for validation and uncertainty characterization of whole-building energy simulation for a multi-zone research facility using a traditional rooftop unit (RTU) as a baseline heating, ventilating, and air conditioning (HVAC) system. The project's overarching objective is to increase the accuracy of energy simulation tools by enabling empirical validation of key inputs and algorithms. Doing so is required to inform the design of increasingly integrated building systems and to enable accountability for performance gaps between design and operation of a building. The project will produce documented data sets that can be used to validate key functionality in different energy simulation tools and to identify errors and inadequate assumptions in simulation engines so that developers can correct them. ASHRAE Standard 140, Method of Test for the Evaluation of Building Energy Analysis Computer Programs (ASHRAE 2004), currently consists primarily of tests to compare different simulation programs with one another. This project will generate sets of measured data to enable empirical validation, incorporate these test data sets in an extended version of Standard 140, and apply these tests to the Department of Energy's (DOE) EnergyPlus software (EnergyPlus 2016) to initiate the correction of any significant deficiencies. The fitness-for-purpose of the key algorithms in EnergyPlus will be established and demonstrated, and vendors of other simulation programs will be able to demonstrate the validity of their products. The data set will be equally applicable to validation of other simulation engines as well.
Sheridan, Kimberly M; Konopasky, Abigail W; Kirkwood, Sophie; Defeyter, Margaret A
2016-03-19
Research indicates that in experimental settings, young children of 3-7 years old are unlikely to devise a simple tool to solve a problem. This series of exploratory studies done in museums in the US and UK explores how environment and ownership of materials may improve children's ability and inclination for (i) tool material selection and (ii) innovation. The first study takes place in a children's museum, an environment where children can use tools and materials freely. We replicated a tool innovation task in this environment and found that while 3-4 year olds showed the predicted low levels of innovation, 4-7 year olds showed higher rates of innovation than the younger children and than reported in prior studies. The second study explores whether ownership of the experimental materials by the experimenter or the child affects tool selection and innovation. Results showed that 5-6 year olds and 6-7 year olds were more likely to select tool material they owned compared to tool material owned by the experimenter, although ownership had no effect on tool innovation. We argue that learning environments supporting tool exploration and invention and conveying ownership over materials may encourage successful tool innovation at earlier ages.
Kastner, Monika; Perrier, Laure; Hamid, Jemila; Tricco, Andrea C; Cardoso, Roberta; Ivers, Noah M; Liu, Barbara; Marr, Sharon; Holroyd-Leduc, Jayna; Wong, Geoff; Graves, Lisa; Straus, Sharon E
2015-01-01
Introduction The burden of chronic disease is a global phenomenon, particularly among people aged 65 years and older. More than half of older adults have more than one chronic disease and their care is not optimal. Chronic disease management (CDM) tools have the potential to meet this challenge but they are primarily focused on a single disease, which fails to address the growing number of seniors with multiple chronic conditions. Methods and analysis We will conduct a systematic review alongside a realist review to identify effective CDM tools that integrate one or more high-burden chronic diseases affecting older adults and to better understand for whom, under what circumstances, how and why they produce their outcomes. We will search MEDLINE, EMBASE, CINAHL, AgeLine and the Cochrane Library for experimental, quasi-experimental, observational and qualitative studies in any language investigating CDM tools that facilitate optimal disease management in one or more high-burden chronic diseases affecting adults aged ≥65 years. Study selection will involve calibration of reviewers to ensure reliability of screening and duplicate assessment of articles. Data abstraction and risk of bias assessment will also be performed independently. Analysis will include descriptive summaries of study and appraisal characteristics, effectiveness of each CDM tool (meta-analysis if appropriate); and a realist programme theory will be developed and refined to explain the outcome patterns within the included studies. Ethics and dissemination Ethics approval is not required for this study. We anticipate that our findings, pertaining to gaps in care across high-burden chronic diseases affecting seniors and highlighting specific areas that may require more research, will be of interest to a wide range of knowledge users and stakeholders. We will publish and present our findings widely, and also plan more active dissemination strategies such as workshops with our key stakeholders. Trial registration number Our protocol is registered with PROSPERO (registration number CRD42014014489). PMID:25649215
Providing Guidance in Virtual Lab Experimentation: The Case of an Experiment Design Tool
ERIC Educational Resources Information Center
Efstathiou, Charalampos; Hovardas, Tasos; Xenofontos, Nikoletta A.; Zacharia, Zacharias C.; deJong, Ton; Anjewierden, Anjo; van Riesen, Siswa A. N.
2018-01-01
The present study employed a quasi-experimental design to assess a computer-based tool, which was intended to scaffold the task of designing experiments when using a virtual lab for the process of experimentation. In particular, we assessed the impact of this tool on primary school students' cognitive processes and inquiry skills before and after…
Morosi, J; Berti, N; Akrout, A; Picozzi, A; Guasoni, M; Fatome, J
2018-01-22
In this manuscript, we experimentally and numerically investigate the chaotic dynamics of the state of polarization in a nonlinear optical fiber arising from the cross-interaction between an incident signal and its intense backward replica generated at the fiber end through an amplified reflective delayed loop. Thanks to the cross-polarization interaction between the two delayed counter-propagating waves, the output polarization exhibits fast temporal chaotic dynamics, which enables a powerful scrambling process with speeds up to 600 krad/s. The performance of this all-optical scrambler was then evaluated on a 10-Gbit/s On/Off Keying telecom signal, achieving an error-free transmission. We also describe how these temporal chaotic polarization fluctuations can be exploited as an all-optical random number generator. To this aim, a billion-bit sequence was experimentally generated and successfully tested against the dieharder statistical benchmarking tools. Our experimental analysis is supported by numerical simulations based on the resolution of counter-propagating coupled nonlinear propagation equations, which confirm the observed behaviors.
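For illustration, the random-number-generation step described above can be mimicked in a few lines: sample a chaotic trace, threshold it at the median, and apply von Neumann debiasing. This is only a minimal sketch of the general technique, not the authors' processing chain; the logistic-map stand-in for the measured polarization trace is an assumption, and a simple ones-fraction check stands in for the dieharder suite.

```python
import numpy as np

def bits_from_chaotic_trace(samples):
    """Threshold a sampled chaotic signal at its median to get raw bits,
    then apply von Neumann debiasing to remove residual bias."""
    raw = (samples > np.median(samples)).astype(np.uint8)
    pairs = raw[: len(raw) // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]   # von Neumann: 01 -> 0, 10 -> 1
    return pairs[keep, 0]

# Toy stand-in for a measured polarization trace (logistic map, chaotic regime)
x = np.empty(200_000)
x[0] = 0.3
for i in range(1, x.size):
    x[i] = 3.99 * x[i - 1] * (1.0 - x[i - 1])

bits = bits_from_chaotic_trace(x)
print(f"{bits.size} bits, ones fraction = {bits.mean():.4f}")  # ~0.5 expected
```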
Using a Mobile Dichotomous Key iPad Application as a Scaffolding Tool in a Museum Setting
ERIC Educational Resources Information Center
Knight, Kathryn; Davies, Randall S.
2016-01-01
This study tested an iPad application using a dichotomous key as a scaffolding tool to help students make more detailed observations as they identified various species of birds on display in a museum of natural science. The Mobile Dichotomous Key (MDK) iPad application was used by groups of fifth- and seventh-grade students. Analysis of the…
Morschett, Holger; Freier, Lars; Rohde, Jannis; Wiechert, Wolfgang; von Lieres, Eric; Oldiges, Marco
2017-01-01
Even though microalgae-derived biodiesel has regained interest within the last decade, industrial production is still challenging for economic reasons. Besides reactor design, as well as value chain and strain engineering, laborious and slow early-stage parameter optimization represents a major drawback. The present study introduces a framework for the accelerated development of phototrophic bioprocesses. A state-of-the-art micro-photobioreactor supported by a liquid-handling robot for automated medium preparation and product quantification was used. To take full advantage of the technology's experimental capacity, Kriging-assisted experimental design was integrated to enable highly efficient execution of screening applications. The resulting platform was used for medium optimization of a lipid production process using Chlorella vulgaris toward maximum volumetric productivity. Within only four experimental rounds, lipid production was increased approximately threefold to 212 ± 11 mg L⁻¹ d⁻¹. Besides nitrogen availability as a key parameter, magnesium, calcium and various trace elements were shown to be of crucial importance. Here, synergistic multi-parameter interactions as revealed by the experimental design introduced significant further optimization potential. The integration of parallelized microscale cultivation, laboratory automation and Kriging-assisted experimental design proved to be a fruitful tool for the accelerated development of phototrophic bioprocesses. By means of the proposed technology, the targeted optimization task was conducted in a very timely and material-efficient manner.
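As a hedged sketch of what one round of Kriging-assisted design involves, the snippet below fits a Gaussian process (the statistical core of Kriging) to a few measured points of an invented one-dimensional "medium component vs. productivity" surface and proposes the next experiment by expected improvement; it is not the authors' platform or response data.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)
f = lambda x: -(x - 2.1) ** 2 + 4.5          # hidden optimum at x = 2.1 (toy)

X = rng.uniform(0, 5, 6).reshape(-1, 1)      # 6 initial experiments
y = f(X).ravel() + rng.normal(0, 0.1, 6)     # noisy productivity readings

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.01)
gp.fit(X, y)

grid = np.linspace(0, 5, 501).reshape(-1, 1)
mu, sd = gp.predict(grid, return_std=True)
improve = mu - y.max()
ei = improve * norm.cdf(improve / sd) + sd * norm.pdf(improve / sd)
print(f"next experiment at x = {grid[np.argmax(ei)][0]:.2f}")   # near 2.1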
Single-molecule fluorescence microscopy review: shedding new light on old problems
Shashkova, Sviatlana
2017-01-01
Fluorescence microscopy is an invaluable tool in the biosciences, a genuine workhorse technique offering exceptional contrast in conjunction with high specificity of labelling with relatively minimal perturbation to biological samples compared with many competing biophysical techniques. Improvements in detector and dye technologies coupled to advances in image analysis methods have fuelled recent development towards single-molecule fluorescence microscopy, which can utilize light microscopy tools to enable the faithful detection and analysis of single fluorescent molecules used as reporter tags in biological samples. For example, the discovery of GFP, initiating the so-called ‘green revolution’, has pushed experimental tools in the biosciences to a completely new level of functional imaging of living samples, culminating in single fluorescent protein molecule detection. Today, fluorescence microscopy is an indispensable tool in single-molecule investigations, providing a high signal-to-noise ratio for visualization while still retaining the key features in the physiological context of native biological systems. In this review, we discuss some of the recent discoveries in the life sciences which have been enabled using single-molecule fluorescence microscopy, paying particular attention to the so-called ‘super-resolution’ fluorescence microscopy techniques in live cells, which are at the cutting-edge of these methods. In particular, how these tools can reveal new insights into long-standing puzzles in biology: old problems, which have been impossible to tackle using other more traditional tools until the emergence of new single-molecule fluorescence microscopy techniques. PMID:28694303
Facial expression: An under-utilised tool for the assessment of welfare in mammals.
Descovich, Kris A; Wathan, Jennifer; Leach, Matthew C; Buchanan-Smith, Hannah M; Flecknell, Paul; Farningham, David; Vick, Sarah-Jane
2017-01-01
Animal welfare is a key issue for industries that use or impact upon animals. The accurate identification of welfare states is particularly relevant to the field of bioscience, where the 3Rs framework encourages refinement of experimental procedures involving animal models. The assessment and improvement of welfare states in animals depends on reliable and valid measurement tools. Behavioral measures (activity, attention, posture and vocalization) are frequently used because they are immediate and non-invasive, however no single indicator can yield a complete picture of the internal state of an animal. Facial expressions are extensively studied in humans as a measure of psychological and emotional experiences but are infrequently used in animal studies, with the exception of emerging research on pain behavior. In this review, we discuss current evidence for facial representations of underlying affective states, and how communicative or functional expressions can be useful within welfare assessments. Validated tools for measuring facial movement are outlined, and the potential of expressions as honest signals is discussed, alongside other challenges and limitations to facial expression measurement within the context of animal welfare. We conclude that facial expression determination in animals is a useful but underutilized measure that complements existing tools in the assessment of welfare.
Experimental Evaluation of the Tools of the Mind Pre-K Curriculum. Technical Report. Working Paper
ERIC Educational Resources Information Center
Farran, Dale C.; Wilson, Sandra J.; Meador, Deanna; Norvell, Jennifer; Nesbitt, Kimberly
2015-01-01
The experimental evaluation of the "Tools of the Mind Pre-K Curriculum" described in this report was designed to examine the effectiveness of the "Tools of the Mind" ("Tools") curriculum for enhancing children's self-regulation skills and their academic preparation for kindergarten when compared to the usual…
Ernecoff, Natalie C; Witteman, Holly O; Chon, Kristen; Chen, Yanquan Iris; Buddadhumaruk, Praewpannarai; Chiarchiaro, Jared; Shotsberger, Kaitlin J; Shields, Anne-Marie; Myers, Brad A; Hough, Catherine L; Carson, Shannon S; Lo, Bernard; Matthay, Michael A; Anderson, Wendy G; Peterson, Michael W; Steingrub, Jay S; Arnold, Robert M; White, Douglas B
2016-06-01
Although barriers to shared decision making in intensive care units are well documented, there are currently no easily scaled interventions to overcome these problems. We sought to assess stakeholders' perceptions of the acceptability, usefulness, and design suggestions for a tablet-based tool to support communication and shared decision making in ICUs. We conducted in-depth semi-structured interviews with 58 key stakeholders (30 surrogates and 28 ICU care providers). Interviews explored stakeholders' perceptions about the acceptability of a tablet-based tool to support communication and shared decision making, including the usefulness of modules focused on orienting families to the ICU, educating them about the surrogate's role, completing a question prompt list, eliciting patient values, educating about treatment options, eliciting perceptions about prognosis, and providing psychosocial support resources. The interviewer also elicited stakeholders' design suggestions for such a tool. We used constant comparative methods to identify key themes that arose during the interviews. Overall, 95% (55/58) of participants perceived the proposed tool to be acceptable, with 98% (57/58) of interviewees finding six or more of the seven content domains acceptable. Stakeholders identified several potential benefits of the tool including that it would help families prepare for the surrogate role and for family meetings as well as give surrogates time and a framework to think about the patient's values and treatment options. Key design suggestions included: conceptualize the tool as a supplement to rather than a substitute for surrogate-clinician communication; make the tool flexible with respect to how, where, and when surrogates can access the tool; incorporate interactive exercises; use video and narration to minimize the cognitive load of the intervention; and build an extremely simple user interface to maximize usefulness for individuals with low computer literacy. There is broad support among stakeholders for the use of a tablet-based tool to improve communication and shared decision making in ICUs. Eliciting the perspectives of key stakeholders early in the design process yielded important insights to create a tool tailored to the needs of surrogates and care providers in ICUs. Copyright © 2016 Elsevier Inc. All rights reserved.
2014-01-01
Background Indoor Residual Spraying (IRS) and Long-Lasting Insecticidal nets (LLINs) are major malaria vector control tools in Ethiopia. However, recent reports from different parts of the country showed that populations of Anopheles arabiensis, the principal malaria vector, have developed resistance to most families of insecticides recommended for public health use which may compromise the efficacy of both of these key vector control interventions. Thus, this study evaluated the efficacy of DDT IRS and LLINs against resistant populations of An. arabiensis using experimental huts in Asendabo area, southwestern Ethiopia. Methods The susceptibility status of populations of An. arabiensis was assessed using WHO test kits to DDT, deltamethrin, malathion, lambda-cyhalothrin, fenitrothion and bendiocarb. The efficacy of LLIN (PermaNet® 2.0), was evaluated using the WHO cone bioassay. Moreover, the effect of the observed resistance against malaria vector control interventions (DDT IRS and LLINs) were assessed using experimental huts. Results The findings of this study revealed that populations of An. arabiensis were resistant to DDT, deltamethrin, lambda-cyhalothrin and malathion with mortality rates of 1.3%, 18.8%, 36.3% and 72.5%, respectively but susceptible to fenitrothion and bendiocarb with mortality rates of 98.81% and 97.5%, respectively. The bio-efficacy test of LLIN (PermaNet® 2.0) against An. arabiensis revealed that the mosquito population showed moderate knockdown (64%) and mortality (78%). Moreover, mosquito mortalities in DDT sprayed huts and in huts with LLINs were not significantly different (p > 0.05) from their respective controls. Conclusion The evaluation of the efficacy of DDT IRS and LLINs using experimental huts showed that both vector control tools had only low to moderate efficacy against An. arabiensis populations from Ethiopia. Despite DDT being replaced by carbamates for IRS, the low efficacy of LLINs against the resistant population of An. arabiensis is still a problem. Thus, there is a need for alternative vector control tools and implementation of appropriate insecticide resistance management strategies as part of integrated vector management by the national malaria control program. PMID:24678605
HyQue: evaluating hypotheses using Semantic Web technologies
2011-01-01
Background Key to the success of e-Science is the ability to computationally evaluate expert-composed hypotheses for validity against experimental data. Researchers face the challenge of collecting, evaluating and integrating large amounts of diverse information to compose and evaluate a hypothesis. Confronted with rapidly accumulating data, researchers currently do not have the software tools to undertake the required information integration tasks. Results We present HyQue, a Semantic Web tool for querying scientific knowledge bases with the purpose of evaluating user submitted hypotheses. HyQue features a knowledge model to accommodate diverse hypotheses structured as events and represented using Semantic Web languages (RDF/OWL). Hypothesis validity is evaluated against experimental and literature-sourced evidence through a combination of SPARQL queries and evaluation rules. Inference over OWL ontologies (for type specifications, subclass assertions and parthood relations) and retrieval of facts stored as Bio2RDF linked data provide support for a given hypothesis. We evaluate hypotheses of varying levels of detail about the genetic network controlling galactose metabolism in Saccharomyces cerevisiae to demonstrate the feasibility of deploying such semantic computing tools over a growing body of structured knowledge in Bio2RDF. Conclusions HyQue is a query-based hypothesis evaluation system that can currently evaluate hypotheses about the galactose metabolism in S. cerevisiae. Hypotheses as well as the supporting or refuting data are represented in RDF and directly linked to one another allowing scientists to browse from data to hypothesis and vice versa. HyQue hypotheses and data are available at http://semanticscience.org/projects/hyque. PMID:21624158
ERIC Educational Resources Information Center
Meador, Deanna; Nesbitt, Kimberly; Farran, Dale
2015-01-01
The "Experimental Evaluation of the Tools of the Mind Pre-K Curriculum" study was designed to compare the effectiveness of the "Tools of the Mind" ("Tools") curriculum to the curricula the school system is currently using in enhancing children's self-regulation skills and their academic preparation for kindergarten.…
Tol, Wietse; Jordans, Mark; Zangana, Goran Sabir; Amin, Ahmed Mohammed; Bolton, Paul; Bass, Judith; Bonilla-Escobar, Fransisco Javier; Thornicroft, Graham
2014-01-01
The burden of mental health problems in (post)conflict low- and middle-income countries (LMIC) is substantial. Despite growing evidence for the effectiveness of selected mental health programs in conflict-affected LMIC and growing policy support, actual uptake and implementation have been slow. A key direction for future research, and a new frontier within science and practice, is Dissemination and Implementation (DI) which directly addresses the movement of evidence-based, effective health care approaches from experimental settings into routine use. This paper outlines some key implementation challenges, and strategies to address these, while implementing evidence-based treatments in conflict-affected LMIC based on the authors’ collective experiences. Dissemination and implementation evaluation and research in conflict settings is an essential new research direction. Future DI work in LMIC should include: 1) defining concepts and developing measurement tools, 2) the measurement of DI outcomes for all programming, and 3) the systematic evaluation of specific implementation strategies. PMID:28316559
Stem cells in clinical practice: applications and warnings.
Lodi, Daniele; Iannitti, Tommaso; Palmieri, Beniamino
2011-01-17
Stem cells are a relevant source of information about cellular differentiation, molecular processes and tissue homeostasis, but they are also one of the most promising biological tools for treating degenerative diseases. This review focuses on the clinical and experimental applications of human stem cells. Our aim is to give a correct view of the available stem cell subtypes and their rational use in the medical area, with a specific focus on their therapeutic benefits and side effects. We have reviewed the main clinical trials, grouping them according to their clinical applications and taking into account the ethical issues associated with stem cell therapy. We searched Pubmed/Medline for clinical trials involving the use of human stem cells, using the key words "stem cells" combined with the key words "transplantation", "pathology", "guidelines", "properties" and "risks". All the relevant clinical trials have been included. The results have been divided into different categories based on the way stem cells have been employed in different pathological conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boscá, A., E-mail: alberto.bosca@upm.es; Dpto. de Ingeniería Electrónica, E.T.S.I. de Telecomunicación, Universidad Politécnica de Madrid, Madrid 28040; Pedrós, J.
2015-01-28
Due to its intrinsically high mobility, graphene has proved to be a suitable material for high-speed electronics, where the graphene field-effect transistor (GFET) has shown excellent properties. In this work, we present a method for extracting relevant electrical parameters from GFET devices using a simple electrical characterization and model fitting. With experimental data from the device output characteristics, the method allows parameters such as the mobility, the contact resistance, and the fixed charge to be calculated. Differentiated electron and hole mobilities and a direct connection with intrinsic material properties are some of the key aspects of this method. Moreover, the method's output values can be correlated with several issues arising during key fabrication steps, such as graphene growth and transfer, the lithographic steps, or the metallization processes, providing a flexible tool for quality control in GFET fabrication, as well as valuable feedback for improving the material-growth process.
Perspectives on biotechnological applications of archaea
Schiraldi, Chiara; Giuliano, Mariateresa; De Rosa, Mario
2002-01-01
Many archaea colonize extreme environments. They include hyperthermophiles, sulfur-metabolizing thermophiles, extreme halophiles and methanogens. Because extremophilic microorganisms have unusual properties, they are a potentially valuable resource in the development of novel biotechnological processes. Despite extensive research, however, there are few existing industrial applications of either archaeal biomass or archaeal enzymes. This review summarizes current knowledge about the biotechnological uses of archaea and archaeal enzymes with special attention to potential applications that are the subject of current experimental evaluation. Topics covered include cultivation methods, recent achievements in genomics, which are of key importance for the development of new biotechnological tools, and the application of wild-type biomasses, engineered microorganisms, enzymes and specific metabolites in particular bioprocesses of industrial interest. PMID:15803645
Imanbaew, Dimitri; Lang, Johannes; Gelin, Maxim F; Kaufhold, Simon; Pfeffer, Michael G; Rau, Sven; Riehn, Christoph
2017-05-08
We present a proof of concept that ultrafast dynamics combined with photochemical stability information of molecular photocatalysts can be acquired by electrospray ionization mass spectrometry combined with time-resolved femtosecond laser spectroscopy in an ion trap. This pump-probe "fragmentation action spectroscopy" gives straightforward access to information that usually requires high purity compounds and great experimental efforts. Results of gas-phase studies on the electronic dynamics of two supramolecular photocatalysts compare well to previous findings in solution and give further evidence for a directed electron transfer, a key process for photocatalytic hydrogen generation. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Sah, Jay P.; Ross, Michael S.; Snyder, James R.; Ogurcak, Danielle E.
2010-01-01
In fire-dependent forests, managers are interested in predicting the consequences of prescribed burning on postfire tree mortality. We examined the effects of prescribed fire on tree mortality in Florida Keys pine forests, using a factorial design with understory type, season, and year of burn as factors. We also used logistic regression to model the effects of burn season, fire severity, and tree dimensions on individual tree mortality. Despite limited statistical power due to problems in carrying out the full suite of planned experimental burns, associations with tree and fire variables were observed. Post-fire pine tree mortality was negatively correlated with tree size and positively correlated with char height and percent crown scorch. Unlike post-fire mortality, tree mortality associated with storm surge from Hurricane Wilma was greater in the large size classes. Due to their influence on population structure and fuel dynamics, the size-selective mortality patterns following fire and storm surge have practical importance for using fire as a management tool in Florida Keys pinelands in the future, particularly when the threats to their continued existence from tropical storms and sea level rise are expected to increase.
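As a hedged illustration of the logistic-regression step described above, the sketch below fits post-fire mortality against tree size, char height and crown scorch on synthetic records; the variable names (dbh_cm, char_height_m, crown_scorch_pct) and all coefficients are invented, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical post-fire tree records: DBH, char height, % crown scorch, death
rng = np.random.default_rng(0)
n = 400
dbh = rng.uniform(5, 35, n)                  # cm
char_h = rng.uniform(0, 3, n)                # m
scorch = rng.uniform(0, 100, n)              # %
logit = -1.0 - 0.08 * dbh + 0.9 * char_h + 0.03 * scorch
died = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([dbh, char_h, scorch]))
model = sm.Logit(died, X).fit(disp=False)
print(model.summary(xname=["const", "dbh_cm", "char_height_m",
                           "crown_scorch_pct"]))
```

The fitted signs mirror the abstract's findings: mortality falls with tree size and rises with char height and crown scorch.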
Predicting the Consequences of Workload Management Strategies with Human Performance Modeling
NASA Technical Reports Server (NTRS)
Mitchell, Diane Kuhl; Samma, Charneta
2011-01-01
Human performance modelers at the US Army Research Laboratory have developed an approach for establishing Soldier high-workload conditions that can be used in analyses of proposed system designs. Their technique includes three key components. To implement the approach in an experiment, the researcher would create two experimental conditions: a baseline and a design alternative. Next, they would identify a scenario in which the test participants perform all their representative concurrent interactions with the system. This scenario should include any events that would trigger a different set of goals for the human operators. They would collect workload values during both the baseline and alternative design conditions to see if the alternative increased workload and decreased performance. They have successfully implemented this approach for military vehicle designs using the human performance modeling tool IMPRINT. Although ARL researchers use IMPRINT to implement their approach, it can be applied to any workload analysis. Researchers using other modeling and simulation tools or conducting experiments or field tests can use the same approach.
Maia, Joaquim; Rodríguez-Bernaldo de Quirós, Ana; Sendón, Raquel; Cruz, José Manuel; Seiler, Annika; Franz, Roland; Simoneau, Catherine; Castle, Laurence; Driffield, Malcolm; Mercea, Peter; Oldring, Peter; Tosa, Valer; Paseiro, Perfecto
2016-01-01
The mass transport process (migration) of a model substance, benzophenone (BZP), from LDPE into selected foodstuffs at three temperatures was studied. A mathematical model based on Fick's Second Law of Diffusion was used to simulate the migration process and a good correlation between experimental and predicted values was found. The acquired results contribute to a better understanding of this phenomenon and the parameters so-derived were incorporated into the migration module of the recently launched FACET tool (Flavourings, Additives and Food Contact Materials Exposure Tool). The migration tests were carried out at different time-temperature conditions, and BZP was extracted from LDPE and analysed by HPLC-DAD. With all data, the parameters for migration modelling (diffusion and partition coefficients) were calculated. Results showed that the diffusion coefficients (within both the polymer and the foodstuff) are greatly affected by the temperature and food's physical state, whereas the partition coefficient was affected significantly only by food characteristics, particularly fat content.
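A minimal sketch of the kind of Fickian migration model described above, assuming a 1-D film with one sealed face and a perfect-sink food contact face; the diffusion coefficient, film thickness and boundary conditions are illustrative, not FACET parameters.

```python
import numpy as np

# Explicit finite-difference solution of Fick's second law for migrant
# diffusion out of a polymer film into food treated as a perfect sink.
D = 1e-13          # diffusion coefficient in LDPE, m^2/s (assumed)
L = 50e-6          # film thickness, m
N = 100            # spatial nodes
dx = L / (N - 1)
dt = 0.4 * dx**2 / D          # stability requires dt <= dx^2 / (2D)
c = np.ones(N)                # initial migrant concentration (normalized)

t, t_end = 0.0, 24 * 3600.0   # simulate one day of contact
m0 = c.sum() * dx
while t < t_end:
    c[0] = c[1]               # zero-flux at the sealed face
    c[-1] = 0.0               # perfect sink at the food contact face
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    t += dt

print(f"fraction migrated after 24 h: {1 - c.sum() * dx / m0:.3f}")
```

In practice the food-side boundary would use a partition coefficient rather than a perfect sink, which is exactly the parameter the study fits from the migration tests.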
Fast and Accurate Metadata Authoring Using Ontology-Based Recommendations.
Martínez-Romero, Marcos; O'Connor, Martin J; Shankar, Ravi D; Panahiazar, Maryam; Willrett, Debra; Egyedi, Attila L; Gevaert, Olivier; Graybeal, John; Musen, Mark A
2017-01-01
In biomedicine, high-quality metadata are crucial for finding experimental datasets, for understanding how experiments were performed, and for reproducing those experiments. Despite the recent focus on metadata, the quality of metadata available in public repositories continues to be extremely poor. A key difficulty is that the typical metadata acquisition process is time-consuming and error prone, with weak or nonexistent support for linking metadata to ontologies. There is a pressing need for methods and tools to speed up the metadata acquisition process and to increase the quality of metadata that are entered. In this paper, we describe a methodology and set of associated tools that we developed to address this challenge. A core component of this approach is a value recommendation framework that uses analysis of previously entered metadata and ontology-based metadata specifications to help users rapidly and accurately enter their metadata. We performed an initial evaluation of this approach using metadata from a public metadata repository.
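A toy version of such a value recommendation idea, assuming a simple frequency count over previously entered records: suggest values for a target field ranked by how often they co-occurred with the values already filled in. Field and value names are invented and this is not the authors' implementation.

```python
from collections import Counter

def recommend(records, filled, target, top_n=3):
    """Rank candidate values for `target` by co-occurrence with `filled`."""
    scores = Counter()
    for rec in records:
        if rec.get(target) and all(rec.get(f) == v for f, v in filled.items()):
            scores[rec[target]] += 1
    return scores.most_common(top_n)

records = [
    {"organism": "Homo sapiens", "tissue": "liver", "assay": "RNA-seq"},
    {"organism": "Homo sapiens", "tissue": "liver", "assay": "RNA-seq"},
    {"organism": "Homo sapiens", "tissue": "brain", "assay": "ChIP-seq"},
    {"organism": "Mus musculus", "tissue": "liver", "assay": "RNA-seq"},
]
print(recommend(records, {"organism": "Homo sapiens", "tissue": "liver"},
                "assay"))   # [('RNA-seq', 2)]
```

The paper's framework additionally constrains suggestions with ontology-based metadata specifications, which a frequency count alone does not capture.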
Wang, Mengmeng; Ong, Lee-Ling Sharon; Dauwels, Justin; Asada, H Harry
2018-04-01
Cell migration is a key feature of living organisms. Image analysis tools are useful in studying cell migration in three-dimensional (3-D) in vitro environments. We consider angiogenic vessels formed in 3-D microfluidic devices (MFDs) and develop an image analysis system to extract cell behaviors from experimental phase-contrast microscopy image sequences. The proposed system initializes tracks with the end-point confocal nuclei coordinates. We apply convolutional neural networks to detect cell candidates and combine backward Kalman filtering with multiple hypothesis tracking to link the cell candidates at each time step. These hypotheses incorporate prior knowledge on vessel formation and cell proliferation rates. The association accuracy reaches 86.4% for the proposed algorithm, indicating that the proposed system is able to associate cells more accurately than existing approaches. Cell culture experiments in 3-D MFDs have shown considerable promise for improving biology research. The proposed system is expected to be a useful quantitative tool for microscopy-based studies in MFDs.
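To make the tracking step concrete, here is a minimal constant-velocity Kalman filter linking one cell's detections between frames. The backward filtering, CNN detection and multiple-hypothesis logic of the paper are omitted, and the noise settings and units are assumptions.

```python
import numpy as np

dt = 1.0  # one frame
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
              [0, 0, 1, 0], [0, 0, 0, 1]], float)   # constant-velocity model
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)   # we observe position only
Q, R = 0.01 * np.eye(4), 1.0 * np.eye(2)            # process / detection noise

x, P = np.zeros(4), 10.0 * np.eye(4)                # state: (px, py, vx, vy)
detections = [(1.0, 0.9), (2.1, 2.0), (2.9, 3.1), (4.2, 4.0)]

for z in detections:
    x, P = F @ x, F @ P @ F.T + Q                   # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    x = x + K @ (np.array(z) - H @ x)               # update with detection
    P = (np.eye(4) - K @ H) @ P

print("estimated position/velocity:", np.round(x, 2))
```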
Cell culture medium improvement by rigorous shuffling of components using media blending.
Jordan, Martin; Voisard, Damien; Berthoud, Antoine; Tercier, Laetitia; Kleuser, Beate; Baer, Gianni; Broly, Hervé
2013-01-01
A novel high-throughput methodology for the simultaneous optimization of many cell culture media components is presented. The method is based on the media blending approach which has several advantages as it works with ready-to-use media. In particular it allows precise pH and osmolarity adjustments and eliminates the need of concentrated stock solutions, a frequent source of serious solubility issues. In addition, media blending easily generates a large number of new compositions providing a remarkable screening tool. However, media blending designs usually do not provide information on distinct factors or components that are causing the desired improvements. This paper addresses this last point by considering the concentration of individual medium components to fix the experimental design and for the interpretation of the results. The extended blending strategy was used to reshuffle the 20 amino acids in one round of experiments. A small set of 10 media was specifically designed to generate a large number of mixtures. 192 mixtures were then prepared by media blending and tested on a recombinant CHO cell line expressing a monoclonal antibody. A wide range of performances (titers and viable cell density) was achieved from the different mixtures with top titers significantly above our previous results seen with this cell line. In addition, information about major effects of key amino acids on cell densities and titers could be extracted from the experimental results. This demonstrates that the extended blending approach is a powerful experimental tool which allows systematic and simultaneous reshuffling of multiple medium components.
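A small sketch of the blending bookkeeping, assuming each stock medium is a vector of component concentrations so that every mixture's composition is recoverable as the fraction-weighted average. The media and mixture counts follow the abstract, but all component names and concentrations are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
components = ["Gln", "Leu", "Lys", "Ser"]            # subset of 20 amino acids
stock_media = rng.uniform(0.5, 4.0, size=(10, 4))    # 10 media x 4 components, mM

# 192 blends: rows of mixing fractions that sum to 1
fractions = rng.dirichlet(np.ones(10), size=192)
mixtures = fractions @ stock_media                   # resulting compositions, mM

print("mixture 0 composition (mM):",
      dict(zip(components, np.round(mixtures[0], 2))))
```

Because each mixture's component concentrations are known exactly, effects of individual amino acids can be regressed out of the blending results, which is the extension the paper proposes.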
The Macro Dynamics of Weapon System Acquisition: Shaping Early Decisions to Get Better Outcomes
2012-05-17
[Briefing-slide fragments] • Defects and rework • Design tools and processes • Lack of feedback to key design and SE processes • Lack of quantified risk and uncertainty at key... • Tools for rapid exploration of the physical design space, coupling operability, interoperability, and physical feasibility analyses ("a game changer") • Training • Quantified margins and uncertainties at each critical decision point • M&S / RDT&E: a continuum of tools underpinned with...
NASA Astrophysics Data System (ADS)
Buhari, Abudhahir; Zukarnain, Zuriati Ahmad; Khalid, Roszelinda; Zakir Dato', Wira Jaafar Ahmad
2016-11-01
The applications of quantum information science are moving to ever greater heights for next-generation technology. In particular, in the fields of quantum cryptography and quantum computation, the world has already witnessed various ground-breaking tangible products and promising results. Quantum cryptography is one of the more mature fields of quantum mechanics, and products are already available in the market. Nevertheless, quantum cryptography is still the subject of active research aimed at reaching the maturity of digital cryptography. Its complexity is higher because it combines hardware and software. The lack of an effective simulation tool for designing and analyzing quantum cryptography experiments delays progress in the field. In this paper, we propose a framework for an effective non-entanglement-based quantum cryptography simulation tool. We apply a hybrid simulation technique, i.e. discrete-event, continuous-event and system-dynamics simulation. We also highlight the limitations of experiments based on a commercial photonic simulation tool. Finally, we discuss ideas for achieving a one-stop simulation package for quantum-based secure key distribution experiments. All the modules of the simulation framework are viewed from the computer science perspective.
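As a hedged illustration of non-entanglement-based QKD simulation, the sketch below runs an idealized BB84 sifting round in a discrete-event style; it is not the proposed framework and models neither channel noise nor an eavesdropper.

```python
import numpy as np

# Minimal BB84 sketch: Alice sends random bits in random bases, Bob measures
# in random bases, and both keep only the positions where bases match.
rng = np.random.default_rng(42)
n = 10_000

alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)        # 0 = rectilinear, 1 = diagonal
bob_bases = rng.integers(0, 2, n)

match = alice_bases == bob_bases
# With matching bases Bob reads the bit correctly; otherwise his outcome is
# random, and those positions are discarded during sifting anyway.
bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n))

sifted_key = alice_bits[match]
qber = np.mean(sifted_key != bob_bits[match])   # 0 in this noiseless model
print(f"sifted key length: {sifted_key.size} (~n/2), QBER = {qber:.3f}")
```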
Anthropology and cultural neuroscience: creating productive intersections in parallel fields.
Brown, R A; Seligman, R
2009-01-01
Partly due to the failure of anthropology to productively engage the fields of psychology and neuroscience, investigations in cultural neuroscience have occurred largely without the active involvement of anthropologists or anthropological theory. Dramatic advances in the tools and findings of social neuroscience have emerged in parallel with significant advances in anthropology that connect social and political-economic processes with fine-grained descriptions of individual experience and behavior. We describe four domains of inquiry that follow from these recent developments, and provide suggestions for intersections between anthropological tools - such as social theory, ethnography, and quantitative modeling of cultural models - and cultural neuroscience. These domains are: the sociocultural construction of emotion, status and dominance, the embodiment of social information, and the dual social and biological nature of ritual. Anthropology can help locate unique or interesting populations and phenomena for cultural neuroscience research. Anthropological tools can also help "drill down" to investigate key socialization processes accountable for cross-group differences. Furthermore, anthropological research points at meaningful underlying complexity in assumed relationships between social forces and biological outcomes. Finally, ethnographic knowledge of cultural content can aid with the development of ecologically relevant stimuli for use in experimental protocols.
Mutant mice: experimental organisms as materialised models in biomedicine.
Huber, Lara; Keuck, Lara K
2013-09-01
Animal models have received particular attention as key examples of material models. In this paper, we argue that the specificities of establishing animal models-acknowledging their status as living beings and as epistemological tools-necessitate a more complex account of animal models as materialised models. This becomes particularly evident in animal-based models of diseases that only occur in humans: in these cases, the representational relation between animal model and human patient needs to be generated and validated. The first part of this paper presents an account of how disease-specific animal models are established by drawing on the example of transgenic mice models for Alzheimer's disease. We will introduce an account of validation that involves a three-fold process including (1) from human being to experimental organism; (2) from experimental organism to animal model; and (3) from animal model to human patient. This process draws upon clinical relevance as much as scientific practices and results in disease-specific, yet incomplete, animal models. The second part of this paper argues that the incompleteness of models can be described in terms of multi-level abstractions. We qualify this notion by pointing to different experimental techniques and targets of modelling, which give rise to a plurality of models for a specific disease. Copyright © 2013 Elsevier Ltd. All rights reserved.
Mishra, Bud; Daruwala, Raoul-Sam; Zhou, Yi; Ugel, Nadia; Policriti, Alberto; Antoniotti, Marco; Paxia, Salvatore; Rejali, Marc; Rudra, Archisman; Cherepinsky, Vera; Silver, Naomi; Casey, William; Piazza, Carla; Simeoni, Marta; Barbano, Paolo; Spivak, Marina; Feng, Jiawu; Gill, Ofer; Venkatesh, Mysore; Cheng, Fang; Sun, Bing; Ioniata, Iuliana; Anantharaman, Thomas; Hubbard, E Jane Albert; Pnueli, Amir; Harel, David; Chandru, Vijay; Hariharan, Ramesh; Wigler, Michael; Park, Frank; Lin, Shih-Chieh; Lazebnik, Yuri; Winkler, Franz; Cantor, Charles R; Carbone, Alessandra; Gromov, Mikhael
2003-01-01
We collaborate in a research program aimed at creating a rigorous framework, experimental infrastructure, and computational environment for understanding, experimenting with, manipulating, and modifying a diverse set of fundamental biological processes at multiple scales and spatio-temporal modes. The novelty of our research is based on an approach that (i) requires coevolution of experimental science and theoretical techniques and (ii) exploits a certain universality in biology guided by a parsimonious model of evolutionary mechanisms operating at the genomic level and manifesting at the proteomic, transcriptomic, phylogenic, and other higher levels. Our current program in "systems biology" endeavors to marry large-scale biological experiments with the tools to ponder and reason about large, complex, and subtle natural systems. To achieve this ambitious goal, ideas and concepts are combined from many different fields: biological experimentation, applied mathematical modeling, computational reasoning schemes, and large-scale numerical and symbolic simulations. From a biological viewpoint, the basic issues are many: (i) understanding common and shared structural motifs among biological processes; (ii) modeling biological noise due to interactions among a small number of key molecules or loss of synchrony; (iii) explaining the robustness of these systems in spite of such noise; and (iv) cataloging multistatic behavior and adaptation exhibited by many biological processes.
Simulated maximum likelihood method for estimating kinetic rates in gene expression.
Tian, Tianhai; Xu, Songlin; Gao, Junbin; Burrage, Kevin
2007-01-01
Kinetic rate in gene expression is a key measurement of the stability of gene products and gives important information for the reconstruction of genetic regulatory networks. Recent developments in experimental technologies have made it possible to measure the numbers of transcripts and protein molecules in single cells. Although estimation methods based on deterministic models have been proposed aimed at evaluating kinetic rates from experimental observations, these methods cannot tackle noise in gene expression that may arise from discrete processes of gene expression, small numbers of mRNA transcript, fluctuations in the activity of transcriptional factors and variability in the experimental environment. In this paper, we develop effective methods for estimating kinetic rates in genetic regulatory networks. The simulated maximum likelihood method is used to evaluate parameters in stochastic models described by either stochastic differential equations or discrete biochemical reactions. Different types of non-parametric density functions are used to measure the transitional probability of experimental observations. For stochastic models described by biochemical reactions, we propose to use the simulated frequency distribution to evaluate the transitional density based on the discrete nature of stochastic simulations. The genetic optimization algorithm is used as an efficient tool to search for optimal reaction rates. Numerical results indicate that the proposed methods can give robust estimations of kinetic rates with good accuracy.
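A toy sketch of simulated maximum likelihood for a one-gene birth-death model: transition probabilities are approximated by simulated frequency distributions, as in the paper's discrete-reaction case, and the production rate is recovered by a grid search standing in for the genetic algorithm. The degradation rate is held at its true value for brevity, and all rates and the tau-leaping scheme are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, g_true = 1.0, 0.1   # observation interval and (fixed) degradation rate

def step(x0, k, g, n_rep):
    """Simulate n_rep realizations of one observation interval (tau-leaping)."""
    x = np.full(n_rep, float(x0))
    for _ in range(10):                              # 10 leaps per interval
        births = rng.poisson(k * dt / 10, n_rep)
        deaths = rng.poisson(g * x * dt / 10)
        x = np.maximum(x + births - deaths, 0.0)
    return x

obs = [20.0]                                         # synthetic data, true k = 5
for _ in range(50):
    obs.append(step(obs[-1], 5.0, g_true, 1)[0])

def log_lik(k, n_rep=2000):
    ll = 0.0
    for x0, x1 in zip(obs[:-1], obs[1:]):
        sims = step(x0, k, g_true, n_rep)
        freq = np.mean(np.abs(sims - x1) <= 1.0)     # simulated frequency bin
        ll += np.log(max(freq, 1e-6))
    return ll

grid = np.linspace(2, 8, 13)
print("k_hat =", grid[np.argmax([log_lik(k) for k in grid])])   # near 5
```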
Miller, Douglass R.; Rung, Alessandra; Parikh, Grishma
2014-01-01
Abstract We provide a general overview of features and technical specifications of an online, interactive tool for the identification of scale insects of concern to the U.S.A. ports-of-entry. Full lists of terminal taxa included in the keys (of which there are four), a list of features used in them, and a discussion of the structure of the tool are provided. We also briefly discuss the advantages of interactive keys for the identification of potential scale insect pests. The interactive key is freely accessible on http://idtools.org/id/scales/index.php PMID:25152668
Rosen, Jacob; Brown, Jeffrey D; Barreca, Marco; Chang, Lily; Hannaford, Blake; Sinanan, Mika
2002-01-01
Minimally invasive surgery (MIS) involves a multi-dimensional series of tasks requiring a synthesis between visual information and the kinematics and dynamics of the surgical tools. Analysis of these sources of information is a key step in mastering MIS but may also be used to define objective criteria for characterizing surgical performance. The BlueDRAGON is a new system for acquiring the kinematics and dynamics of two endoscopic tools along with the visual view of the surgical scene. It includes two four-bar mechanisms equipped with position and force/torque sensors for measuring the positions and orientations (P/O) of two endoscopic tools along with the forces and torques applied by the surgeon's hands. The methodology of decomposing the surgical task is based on a fully connected, finite-state (28 states) Markov model, where each state corresponds to a fundamental tool/tissue interaction based on the tool kinematics and associated with unique F/T signatures. The experimental protocol included seven MIS tasks performed on an animal model (pig) by 30 surgeons at different levels of their residency training. Preliminary analysis of these data showed that the major differences between residents at different skill levels were: (i) the types of tool/tissue interactions being used, (ii) the transitions between tool/tissue interactions being applied by each hand, (iii) time spent while performing each tool/tissue interaction, (iv) the overall completion time, and (v) the variable F/T magnitudes being applied by the subjects through the endoscopic tools. Systems like surgical robots or virtual reality simulators that inherently measure the kinematics and dynamics of the surgical tool may benefit from inclusion of the proposed methodology for analysis of efficacy and objective evaluation of surgical skills during training.
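To illustrate the Markov-model decomposition, the sketch below estimates a transition matrix and mean dwell times from a labeled state sequence; the four states and the label stream are invented stand-ins, not BlueDRAGON data.

```python
import numpy as np

def fit_markov(seq, n_states):
    """Estimate a transition probability matrix from a state label sequence."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

states = ["idle", "grasp", "pull", "spread"]          # 4 of the 28 states
seq = [0, 0, 1, 1, 1, 2, 2, 0, 3, 3, 1, 2, 2, 2, 0]   # example label stream

P = fit_markov(seq, len(states))
print(np.round(P, 2))
# Expected dwell time in frames for a Markov chain: 1 / (1 - P[i, i])
for i, s in enumerate(states):
    if P[i, i] < 1:
        print(f"{s}: mean dwell ~ {1 / (1 - P[i, i]):.1f} frames")
```

Comparing such matrices and dwell times across trainees is precisely the kind of skill contrast items (ii) and (iii) above describe.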
Ball Bearing Analysis with the ORBIS Tool
NASA Technical Reports Server (NTRS)
Halpin, Jacob D.
2016-01-01
Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest the ORBIS code closely correlates to predictions on bearing internal load distributions, stiffness, deflection and stresses.
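A hedged sketch of the core computation such bearing tools perform: solving for the ring deflection at which the Hertzian ball loads balance an applied pure radial load. The stiffness constant, clearance and ball count are illustrative, and this is not the ORBIS code.

```python
import numpy as np

Z, Kn, Pd = 12, 8.0e9, 10e-6        # balls, Hertz constant N/m^1.5, clearance m
Fr = 2000.0                          # applied radial load, N
psi = 2 * np.pi * np.arange(Z) / Z   # ball azimuth angles

def radial_reaction(delta_r):
    d = delta_r * np.cos(psi) - Pd / 2       # per-ball elastic approach
    q = Kn * np.clip(d, 0, None) ** 1.5      # only compressed balls carry load
    return np.sum(q * np.cos(psi))

lo, hi = 0.0, 1e-3                           # bisection on ring deflection, m
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if radial_reaction(mid) < Fr else (lo, mid)

d = lo * np.cos(psi) - Pd / 2
print(f"deflection = {lo*1e6:.2f} um, loaded balls = {(d > 0).sum()} of {Z}")
```

Stiffness, torque and life estimates all follow from this internal load distribution, which is why the paper emphasizes getting it right.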
Using robots to understand animal cognition.
Frohnwieser, Anna; Murray, John C; Pike, Thomas W; Wilkinson, Anna
2016-01-01
In recent years, robotic animals and humans have been used to answer a variety of questions related to behavior. In the case of animal behavior, these efforts have largely been in the field of behavioral ecology. They have proved to be a useful tool for this enterprise as they allow the presentation of naturalistic social stimuli whilst providing the experimenter with full control of the stimulus. In interactive experiments, the behavior of robots can be controlled in a manner that is impossible with real animals, making them ideal instruments for the study of social stimuli in animals. This paper provides an overview of the current state of the field and considers the impact that the use of robots could have on fundamental questions related to comparative psychology: namely, perception, spatial cognition, social cognition, and early cognitive development. We make the case that the use of robots to investigate these key areas could have an important impact on the field of animal cognition. © 2016 Society for the Experimental Analysis of Behavior.
Quantum probabilities from quantum entanglement: experimentally unpacking the Born rule
Harris, Jérémie; Bouchard, Frédéric; Santamato, Enrico; ...
2016-05-11
The Born rule, a foundational axiom used to deduce probabilities of events from wavefunctions, is indispensable in the everyday practice of quantum physics. It is also key in the quest to reconcile the ostensibly inconsistent laws of the quantum and classical realms, as it confers physical significance to reduced density matrices, the essential tools of decoherence theory. Following Bohr's Copenhagen interpretation, textbooks postulate the Born rule outright. But, recent attempts to derive it from other quantum principles have been successful, holding promise for simplifying and clarifying the quantum foundational bedrock. Moreover, a major family of derivations is based on envariance, a recently discovered symmetry of entangled quantum states. Here, we identify and experimentally test three premises central to these envariance-based derivations, thus demonstrating, in the microworld, the symmetries from which the Born rule is derived. Furthermore, we demonstrate envariance in a purely local quantum system, showing its independence from relativistic causality.
Co-digestion of solid waste: Towards a simple model to predict methane production.
Kouas, Mokhles; Torrijos, Michel; Schmitz, Sabine; Sousbie, Philippe; Sayadi, Sami; Harmand, Jérôme
2018-04-01
Modeling methane production is a key issue for solid waste co-digestion. Here, the effect of a step-wise increase in the organic loading rate (OLR) on reactor performance was investigated, and four new models were evaluated to predict methane yields using data acquired in batch mode. Four co-digestion experiments with mixtures of two solid substrates were conducted in semi-continuous mode. Experimental methane yields were always higher than the BMP values of the mixtures calculated from the BMP of each substrate, highlighting the importance of endogenous production (methane produced from auto-degradation of the microbial community and generated solids). The experimental methane production under increasing OLRs corresponded well to the data modeled using constant endogenous production and kinetics identified from 80% of the total batch time. This model provides a simple and useful tool for technical design consultancies and plant operators to optimize co-digestion and the choice of OLRs. Copyright © 2018 Elsevier Ltd. All rights reserved.
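A minimal sketch of the best-performing model structure as described: first-order substrate kinetics toward the BMP plus a constant endogenous term, with substrate contributions adding linearly for a mixture. All parameter values below are invented for illustration.

```python
import numpy as np

def methane_yield(t, bmp, k, r_endo):
    """Cumulative methane (NmL/gVS) at times t (days): first-order kinetics
    toward the ultimate yield bmp, plus constant endogenous production."""
    return bmp * (1 - np.exp(-k * t)) + r_endo * t

t = np.linspace(0, 30, 7)
bmp_a, k_a = 320.0, 0.25        # substrate A: BMP and first-order constant
bmp_b, k_b = 180.0, 0.40        # substrate B
r_endo = 2.0                    # endogenous production, NmL/gVS/day (assumed)

# Co-digestion of a 60:40 (VS basis) mixture: substrate terms add linearly,
# endogenous production is counted once for the whole culture.
mix = (0.6 * methane_yield(t, bmp_a, k_a, 0)
       + 0.4 * methane_yield(t, bmp_b, k_b, 0)
       + r_endo * t)
print(np.round(mix, 1))
```

The endogenous term is what lets such a model exceed the simple BMP-weighted sum, matching the abstract's observation that measured yields were always higher than the calculated mixture BMP.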
Modelling the effect of GRP78 on anti-oestrogen sensitivity and resistance in breast cancer
Parmar, Jignesh H.; Cook, Katherine L.; Shajahan-Haq, Ayesha N.; Clarke, Pamela A. G.; Tavassoly, Iman; Clarke, Robert; Tyson, John J.; Baumann, William T.
2013-01-01
Understanding the origins of resistance to anti-oestrogen drugs is of critical importance to many breast cancer patients. Recent experiments show that knockdown of GRP78, a key gene in the unfolded protein response (UPR), can re-sensitize resistant cells to anti-oestrogens, and overexpression of GRP78 in sensitive cells can cause them to become resistant. These results appear to arise from the operation and interaction of three cellular systems: the UPR, autophagy and apoptosis. To determine whether our current mechanistic understanding of these systems is sufficient to explain the experimental results, we built a mathematical model of the three systems and their interactions. We show that the model is capable of reproducing previously published experimental results and some new data gathered specifically for this paper. The model provides us with a tool to better understand the interactions that bring about anti-oestrogen resistance and the effects of GRP78 on both sensitive and resistant breast cancer cells. PMID:24511377
Cocco, Daniele; Idir, Mourad; Morton, Daniel; ...
2018-03-20
Experiments using high brightness X-rays are on the forefront of science due to the vast variety of knowledge they can provide. New Synchrotron Radiation (SR) and Free Electron Laser (FEL) light sources provide unique tools for advanced studies using X-rays. Top-level scientists from around the world are attracted to these beamlines to perform unprecedented experiments. High brightness, low emittance light sources allow beamline scientists the possibility to dream up cutting-edge experimental stations. X-ray optics play a key role in bringing the beam from the source to the experimental stations. This paper explores the recent developments in X-ray optics. It touches on simulations, diagnostics, metrology and adaptive optics, giving an overview of the role X-ray optics have played in the recent past. It will also touch on future developments for one of the most active fields in X-ray science.
A formula for half-life of proton radioactivity
NASA Astrophysics Data System (ADS)
Zhang, Zhi-Xing; Dong, Jian-Min
2018-01-01
We present a formula for proton radioactivity half-lives of spherical proton emitters with the inclusion of the spectroscopic factor. The coefficients in the formula are calibrated with the available experimental data. As an input to calculate the half-life, the spectroscopic factor that characterizes the important information on nuclear structure should be obtained with a nuclear many-body approach. This formula is found to work quite well, and in better agreement with experimental measurements than other theoretical models. Therefore, it can be used as a powerful tool in the investigation of proton emission, in particular for experimentalists. Supported by National Natural Science Foundation of China (11435014, 11405223, 11675265, 11575112), the 973 Program of China (2013CB834401, 2013CB834405), National Key Program for S&T Research and Development (2016YFA0400501), the Knowledge Innovation Project (KJCX2-EW-N01) of Chinese Academy of Sciences, the Funds for Creative Research Groups of China (11321064) and the Youth Innovation Promotion Association of Chinese Academy of Sciences
Li, Hongkun; Zhang, Xuefeng; Xu, Fujian
2013-09-18
Centrifugal compressors are a key piece of equipment for modern production. Among the components of the centrifugal compressor, the impeller is a pivotal part as it is used to transform kinetic energy into pressure energy. Blade crack condition monitoring and classification has been broadly investigated in the industrial and academic area. In this research, a pressure pulsation (PP) sensor arranged in close vicinity to the crack area and the corresponding casing vibration signals are used to monitor blade crack information. As these signals cannot directly demonstrate the blade crack, the method employed in this research is based on the extraction of weak signal characteristics that are induced by blade cracking. A method for blade crack classification based on the signals monitored by using a squared envelope spectrum (SES) is presented. Experimental investigations on blade crack classification are carried out to verify the effectiveness of this method. The results show that it is an effective tool for blade crack classification in centrifugal compressors.
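For reference, a squared envelope spectrum is straightforward to compute from the analytic signal; the sketch below recovers a 25 Hz modulation planted on a synthetic blade-passing tone. All frequencies and amplitudes are illustrative, not compressor data.

```python
import numpy as np
from scipy.signal import hilbert

fs = 10_000.0
t = np.arange(0, 1.0, 1 / fs)
carrier = np.sin(2 * np.pi * 1500 * t)               # blade-passing tone
am = 1 + 0.3 * np.sin(2 * np.pi * 25 * t)            # 25 Hz fault modulation
x = am * carrier + 0.1 * np.random.default_rng(0).standard_normal(t.size)

env2 = np.abs(hilbert(x)) ** 2                       # squared envelope
env2 -= env2.mean()                                  # drop the DC component
ses = np.abs(np.fft.rfft(env2)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peak = freqs[np.argmax(ses[freqs < 200])]            # search below 200 Hz
print(f"dominant modulation frequency: {peak:.1f} Hz")   # ~25 Hz expected
```

Demodulating in this way is why weak crack-induced signatures buried under the dominant blade-passing components become visible in the SES.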
NASA Technical Reports Server (NTRS)
Shen, Hayley H.
1991-01-01
The liquid fuel combustion process is greatly affected by the rate of droplet evaporation. The heat and mass exchanges between gas and liquid couple the dynamics of both phases in all aspects: mass, momentum, and energy. Correct prediction of the evaporation rate is therefore a key issue in the engineering design of liquid combustion devices. Current analytical tools for characterizing the behavior of these devices are based on results from a single isolated droplet. Numerous experimental studies have challenged the applicability of these results in a dense spray. To account for the droplets' interaction in a dense spray, a number of theories have been developed in the past decade. Herein, two tasks are examined: one is to study how to implement the existing theoretical results, and the other is to explore the possibility of experimental verification. The current theoretical results of group evaporation are given for a monodispersed cluster subject to adiabatic conditions. The time evolution of the fluid mechanic and thermodynamic behavior of this cluster is derived. The results given are not in the form of a subscale model for CFD codes.
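The single-droplet baseline that group-evaporation theories correct is the classical d-squared law, d(t)² = d₀² − K·t; a minimal sketch under an assumed evaporation-rate constant:

```python
import numpy as np

d0 = 50e-6            # initial droplet diameter, m
K = 1.0e-7            # evaporation-rate constant, m^2/s (illustrative)

t_life = d0**2 / K    # droplet lifetime implied by the d^2 law
t = np.linspace(0, t_life, 6)
d = np.sqrt(np.clip(d0**2 - K * t, 0, None))
for ti, di in zip(t, d):
    print(f"t = {ti*1e3:6.3f} ms   d = {di*1e6:5.1f} um")
```

Group-evaporation models effectively reduce K for droplets deep inside a dense cluster, where the surrounding vapor is closer to saturation.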
NASA Astrophysics Data System (ADS)
Stojadinović, Bojana; Nestorović, Zorica; Djurić, Biljana; Tenne, Tamar; Zikich, Dragoslav; Žikić, Dejan
2017-03-01
The velocity by which a disturbance moves through the medium is the wave velocity. Pulse wave velocity is among the key parameters in hemodynamics. Investigation of wave propagation through the fluid-filled elastic tube has a great importance for the proper biophysical understanding of the nature of blood flow through the cardiovascular system. Here, we present a laboratory model of the cardiovascular system. We have designed an experimental setup which can help medical and nursing students to properly learn and understand basic fluid hemodynamic principles, pulse wave and the phenomenon of wave propagation in blood vessels. Demonstration of wave propagation allowed a real time observation of the formation of compression and expansion waves by students, thus enabling them to better understand the difference between the two waves, and also to measure the pulse wave velocity for different fluid viscosities. The laboratory model of the cardiovascular system could be useful as an active learning methodology and a complementary tool for understanding basic principles of hemodynamics.
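A small sketch of how pulse wave velocity can be extracted from such a setup: record pressure at two points a known distance apart along the tube and take the transit time from the cross-correlation peak. The signals, spacing and delay below are synthetic.

```python
import numpy as np

fs, dist = 2000.0, 0.50               # sample rate (Hz), sensor spacing (m)
t = np.arange(0, 2.0, 1 / fs)

pulse = lambda tt, t0: np.exp(-((tt - t0) / 0.02) ** 2)   # Gaussian pulse
true_delay = 0.050                                        # s -> PWV = 10 m/s
p1 = sum(pulse(t, t0) for t0 in (0.2, 1.0, 1.8))          # upstream sensor
p2 = sum(pulse(t, t0 + true_delay) for t0 in (0.2, 1.0, 1.8))  # downstream

xc = np.correlate(p2 - p2.mean(), p1 - p1.mean(), mode="full")
lag = (np.argmax(xc) - (t.size - 1)) / fs
print(f"transit time = {lag*1e3:.1f} ms, PWV = {dist/lag:.2f} m/s")
```

Repeating the measurement with fluids of different viscosity, as in the described laboratory exercise, changes the damping of the pulse but leaves this distance-over-transit-time estimate applicable.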
NASA Technical Reports Server (NTRS)
Groleau, Nicolas; Frainier, Richard; Colombano, Silvano; Hazelton, Lyman; Szolovits, Peter
1993-01-01
This paper describes portions of a novel system called MARIKA (Model Analysis and Revision of Implicit Key Assumptions) that automatically revises a model of the normal human orientation system. The revision is based on analysis of discrepancies between experimental results and computer simulations. The discrepancies are calculated from qualitative analysis of quantitative simulations. The experimental and simulated time series are first discretized into time segments. Each segment is then approximated by linear combinations of simple shapes. The domain theory and knowledge are represented as a constraint network. Incompatibilities detected during constraint propagation within the network yield both parameter and structural model alterations. Interestingly, MARIKA diagnosed a data set from the Massachusetts Eye and Ear Infirmary Vestibular Laboratory as abnormal even though the data was tagged as normal; published results from other laboratories confirmed the finding. These encouraging results could lead to a useful clinical vestibular tool and to a scientific discovery system for space vestibular adaptation.
Translation and Its Discontents: Key Concepts in English and German History Education
ERIC Educational Resources Information Center
Seixas, Peter
2016-01-01
Key terms and concepts are crucial tools in teaching and learning in the disciplines. Different linguistic traditions approach such tools in diverse ways. This paper offers an initial contribution by a monolingual Anglophone history educator in dialogue with German history educators. It presents three different scenarios for the potential of…
Illustrated Plant Identification Keys: An Interactive Tool to Learn Botany
ERIC Educational Resources Information Center
Silva, Helena; Pinho, Rosa; Lopes, Lisia; Nogueira, Antonio J. A.; Silveira, Paulo
2011-01-01
An Interactive Dichotomous Key (IDK) for 390 "taxa" of vascular plants from the Ria de Aveiro, available on a website, was developed to help teach botany to school and university students. This multimedia tool includes several links to Descriptive and Illustrated Glossaries. Questionnaires answered by high-school and undergraduate students about…
Gomis, Melissa Ines; Wang, Zhanyun; Scheringer, Martin; Cousins, Ian T
2015-02-01
Long-chain perfluoroalkyl carboxylic acids (PFCAs) and perfluoroalkane sulfonic acids (PFSAs) are persistent, bioaccumulative, and toxic contaminants that are globally present in the environment, wildlife and humans. Phase-out actions and use restrictions to reduce the environmental release of long-chain PFCAs, PFSAs and their precursors have been taken since 2000. In particular, long-chain poly- and perfluoroalkyl substances (PFASs) are being replaced with shorter-chain homologues or other fluorinated or non-fluorinated alternatives. A key question is: are these alternatives, particularly the structurally similar fluorinated alternatives, less hazardous to humans and the environment than the substances they replace? Several fluorinated alternatives, including perfluoroether carboxylic acids (PFECAs) and perfluoroether sulfonic acids (PFESAs), have recently been identified. However, the scarcity of experimental data prevents hazard and risk assessments for these substances. In this study, we use state-of-the-art in silico tools to estimate key properties of these newly identified fluorinated alternatives. (i) COSMOtherm and SPARC are used to estimate physicochemical properties, and the US EPA EPISuite software package is used to predict degradation half-lives in air, water and soil. (ii) In combination with the estimated chemical properties, a fugacity-based multimedia mass-balance unit-world model - the OECD Overall Persistence (POV) and Long-Range Transport Potential (LRTP) Screening Tool - is used to assess the likely environmental fate of these alternatives. Even though the fluorinated alternatives contain some structural differences, their physicochemical properties are not significantly different from those of their predecessors. Furthermore, most of the alternatives are estimated to be similarly persistent and mobile in the environment as the long-chain PFASs. The models therefore predict that the fluorinated alternatives will become globally distributed in the environment much like their predecessors. Although such in silico methods carry uncertainties, this preliminary assessment provides enough cause for concern to warrant experimental work to better determine the properties of these fluorinated alternatives.
Chaudhuri, Rima; Sadrieh, Arash; Hoffman, Nolan J; Parker, Benjamin L; Humphrey, Sean J; Stöckli, Jacqueline; Hill, Adam P; James, David E; Yang, Jean Yee Hwa
2015-08-19
Most biological processes are influenced by protein post-translational modifications (PTMs). Identifying novel PTM sites in different organisms, including humans and model organisms, has expedited our understanding of key signal transduction mechanisms. However, with the increasing availability of deep, quantitative datasets in diverse species, there is a growing need for tools to facilitate cross-species comparison of PTM data. This is particularly important because functionally important modification sites are more likely to be evolutionarily conserved; yet cross-species comparison of PTMs is difficult since they often lie in structurally disordered protein domains. Current tools that address this can only map known PTMs between species based on known orthologous phosphosites, and do not enable the cross-species mapping of newly identified modification sites. Here, we addressed this by developing a web-based software tool, PhosphOrtholog (www.phosphortholog.com), that accurately maps protein modification sites between different species. This facilitates the comparison of datasets derived from multiple species, and should be a valuable tool for the proteomics community. PhosphOrtholog is the only generic and automated tool that enables cross-species comparison of large-scale PTM datasets without relying on existing PTM databases; this is achieved through pairwise sequence alignment of orthologous protein residues. To demonstrate its utility we apply it to two sets of human and rat muscle phosphoproteomes generated following insulin and exercise stimulation, respectively, and one publicly available mouse phosphoproteome following cellular stress, revealing high mapping and coverage efficiency. Although coverage statistics are dataset dependent, PhosphOrtholog more than doubled the number of cross-species mapped sites in all our example data sets compared with those recovered using existing resources such as PhosphoSitePlus. PhosphOrtholog is the first tool that enables mapping of thousands of novel and known protein phosphorylation sites across species, accessible through an easy-to-use web interface. Identification of conserved PTMs across species from large-scale experimental data increases our knowledge base of functional PTM sites. Moreover, PhosphOrtholog is generic, being applicable to other PTM datasets such as acetylation, ubiquitination and methylation.
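The core operation described above, translating a modified-site index through a pairwise alignment of orthologous sequences, can be sketched in a few lines of Python. This is an illustrative Needleman–Wunsch toy, not PhosphOrtholog's actual scoring scheme or pipeline; the sequences in the usage note are hypothetical:

    # Toy cross-species PTM site mapping via global alignment (illustrative).
    def nw_align(a, b, match=1, mismatch=-1, gap=-2):
        """Global alignment; returns the two gapped strings."""
        n, m = len(a), len(b)
        F = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            F[i][0] = i * gap
        for j in range(1, m + 1):
            F[0][j] = j * gap
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                s = match if a[i - 1] == b[j - 1] else mismatch
                F[i][j] = max(F[i - 1][j - 1] + s, F[i - 1][j] + gap, F[i][j - 1] + gap)
        out_a, out_b, i, j = [], [], n, m
        while i > 0 or j > 0:
            if i > 0 and j > 0 and F[i][j] == F[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch):
                out_a.append(a[i - 1]); out_b.append(b[j - 1]); i -= 1; j -= 1
            elif i > 0 and F[i][j] == F[i - 1][j] + gap:
                out_a.append(a[i - 1]); out_b.append('-'); i -= 1
            else:
                out_a.append('-'); out_b.append(b[j - 1]); j -= 1
        return ''.join(reversed(out_a)), ''.join(reversed(out_b))

    def map_site(a, b, site_a):
        """Map 1-based residue position site_a in sequence a onto sequence b."""
        ga, gb = nw_align(a, b)
        ia = ib = 0
        for ca, cb in zip(ga, gb):
            ia += ca != '-'
            ib += cb != '-'
            if ia == site_a and ca != '-':
                return ib if cb != '-' else None   # None: site falls in a gap
        return None

    # e.g. map_site("MKSPSLDE", "MKSPTLDE", 5) -> 5  (hypothetical orthologs)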
ERIC Educational Resources Information Center
Dong, Nianbo; Maynard, Rebecca
2013-01-01
This paper and the accompanying tool are intended to complement existing supports for conducting power analysis by offering a tool based on the framework of Minimum Detectable Effect Size (MDES) formulae that can be used in determining sample size requirements and in estimating minimum detectable effect sizes for a range of individual- and…
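As a concrete illustration of the MDES framework referenced above (the record is truncated in the source), here is a hedged sketch of the standard multiplier form for an individually randomized two-arm trial; the parameter names and defaults are illustrative, and the tool itself may handle clustering and covariates differently:

    # MDES for an individually randomized trial (standard multiplier form).
    from scipy import stats

    def mdes(n, p_treat=0.5, r2=0.0, alpha=0.05, power=0.80, covariates=0):
        """Minimum detectable effect size in standard-deviation units."""
        df = n - covariates - 2
        multiplier = stats.t.ppf(1 - alpha / 2, df) + stats.t.ppf(power, df)
        return multiplier * ((1 - r2) / (p_treat * (1 - p_treat) * n)) ** 0.5

    # e.g. mdes(n=400, r2=0.5) -> roughly 0.20 for a balanced two-arm trial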
NASA Astrophysics Data System (ADS)
Cordova, Martin; Serio, Andrew; Meza, Francisco; Arriagada, Gustavo; Swett, Hector; Ball, Jesse; Collins, Paul; Masuda, Neal; Fuentes, Javier
2016-07-01
In 2014 Gemini Observatory started the base facility operations (BFO) project. The project's goal was to provide the ability to operate the two Gemini telescopes from their base facilities (respectively Hilo, HI at Gemini North, and La Serena, Chile at Gemini South). BFO was identified as a key project for Gemini's transition program, as it created an opportunity to reduce operational costs. In November 2015, the Gemini North telescope started operating from the base facility in Hilo, Hawaii. In order to provide the remote operator the tools to work from the base, many of the activities that were normally performed by the night staff at the summit were replaced with new systems and tools. This paper describes some of the key systems and tools implemented for environmental monitoring, and the design used in the implementation at the Gemini North telescope.
Development of TIF based figuring algorithm for deterministic pitch tool polishing
NASA Astrophysics Data System (ADS)
Yi, Hyun-Su; Kim, Sug-Whan; Yang, Ho-Soon; Lee, Yun-Woo
2007-12-01
Pitch is perhaps the oldest material used for optical polishing, leaving superior surface texture, and has been used widely on the optics shop floor. However, because of its unpredictable removal characteristics, pitch tool polishing has rarely been analysed quantitatively, and many optics shops rely heavily on the optician's "feel" even today. In order to bring a degree of process controllability to pitch tool polishing, we added motorized tool motions to a conventional Draper-type polishing machine and modelled the tool path in the absolute machine coordinate system. We then produced a number of Tool Influence Functions (TIFs), both from an analytical model and from a series of experimental polishing runs using the pitch tool. The theoretical TIFs agreed well with the experimental TIFs, to a profile accuracy of 79% in shape. A surface figuring algorithm was then developed in-house utilizing both theoretical and experimental TIFs. We are currently undertaking a series of trial figuring experiments to prove the performance of the polishing algorithm, and the early results indicate that highly deterministic material removal control with the pitch tool can be achieved down to a certain level of form error. The machine renovation, TIF theory and experimental confirmation, and figuring simulation results are reported, together with implications for deterministic polishing.
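The figuring step described above is, at its core, a deconvolution: the desired removal map equals the dwell-time map convolved with the TIF. A minimal, hedged Python sketch of that inversion (a generic regularized Fourier deconvolution, not the authors' in-house algorithm; it assumes the TIF is sampled on the same grid as the removal map with its peak centered):

    # Dwell-time computation for TIF-based figuring (illustrative sketch).
    import numpy as np

    def dwell_map(removal, tif, eps=1e-3):
        """Dwell map such that convolving it with the TIF gives `removal`."""
        R = np.fft.fft2(removal)
        T = np.fft.fft2(np.fft.ifftshift(tif))       # move TIF peak to origin
        D = R * np.conj(T) / (np.abs(T) ** 2 + eps)  # Tikhonov-style inverse
        dwell = np.real(np.fft.ifft2(D))
        return np.clip(dwell, 0.0, None)             # dwell times must be >= 0

    def predict_removal(dwell, tif):
        """Forward model, used to iterate on the residual form error."""
        T = np.fft.fft2(np.fft.ifftshift(tif))
        return np.real(np.fft.ifft2(np.fft.fft2(dwell) * T))

In practice one iterates: compute a dwell map, predict the removal, and refine on the residual, with the regularization weight eps trading smoothness of the dwell solution against residual form error.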
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoogmartens, Rob, E-mail: rob.hoogmartens@uhasselt.be; Van Passel, Steven, E-mail: steven.vanpassel@uhasselt.be; Van Acker, Karel, E-mail: karel.vanacker@lrd.kuleuven.be
Increasing interest in sustainability has led to the development of sustainability assessment tools such as Life Cycle Analysis (LCA), Life Cycle Costing (LCC) and Cost–Benefit Analysis (CBA). Due to the methodological disparity of these three tools, conflicting assessment results generate confusion for many policy and business decisions. In order to interpret and integrate assessment results, the paper provides a framework that clarifies the connections and coherence between the included assessment methodologies. Building on this framework, the paper further focuses on key aspects of adapting any of the methodologies to full sustainability assessments. Aspects dealt with in the review include, for example, the reported metrics, the scope, data requirements, discounting, product- or project-orientation, and approaches with respect to scarcity and labor requirements. In addition to these key aspects, the review shows that important connections exist: (i) the three tools can cope with social inequality, (ii) processes such as valuation techniques for LCC and CBA are common, (iii) Environmental Impact Assessment (EIA) is used as input in both LCA and CBA and (iv) LCA can be used in parallel with LCC. Furthermore, the most integrated sustainability approach combines elements of LCA and LCC to achieve the Life Cycle Sustainability Assessment (LCSA). The key aspects and connections referred to in the review are illustrated with a case study on the treatment of end-of-life automotive glass. - Highlights: • Proliferation of assessment tools creates ambiguity and confusion. • The developed assessment framework clarifies connections between assessment tools. • Broadening LCA, key aspects are metric and data requirements. • Broadening LCC, key aspects are scope, time frame and discounting. • Broadening CBA, focus point, timespan, references, labor and scarcity are key.
National Fusion Collaboratory: Grid Computing for Simulations and Experiments
NASA Astrophysics Data System (ADS)
Greenwald, Martin
2004-05-01
The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.
Experimental study on internal cooling system in hard turning of HCWCI using CBN tools
NASA Astrophysics Data System (ADS)
Ravi, A. M.; Murigendrappa, S. M.
2018-04-01
In recent times, hard turning has emerged as a key technique in manufacturing processes, especially for cutting very hard materials like high chrome white cast iron (HCWCI). Cubic boron nitride (CBN), pCBN and carbide tools are the most appropriate for shearing such metals, but they are uneconomical. Since hard turning is carried out in dry conditions, lowering tool wear by minimizing tool temperature is the only solution. The literature reveals that no effective cooling systems are available so far to enhance the tool life of the cutting tools and to improve machinability characteristics. The detrimental effect of cutting parameters on cutting temperature is generally controlled by their proper selection. The objective of this paper is to develop a new cooling system to control the tool tip temperature, thereby minimizing the cutting forces and tool wear rates. The material chosen for this work was HCWCI, and the cutting tools were CBN inserts. Intricate cavities were machined on the periphery of the tool holder for the easy flow of cold water. Taguchi techniques were adopted to design and carry out the experiments. The experimental results confirm a considerable reduction in the cutting forces and tool wear rates.
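A hedged sketch of the Taguchi analysis step implied above: compute smaller-the-better signal-to-noise ratios (appropriate for responses such as cutting force or tool wear) over an L9 orthogonal array and average them per factor level. The design, factor roles and wear values below are hypothetical, not the paper's data:

    # Taguchi L9 smaller-the-better S/N analysis (illustrative values).
    import numpy as np

    def sn_smaller_is_better(y):
        """S/N = -10 log10(mean(y^2)); higher is better for small responses."""
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(y ** 2))

    # 9 runs x 3 factors (e.g. speed, feed, depth), levels coded 0..2
    L9 = np.array([[0, 0, 0], [0, 1, 1], [0, 2, 2], [1, 0, 1], [1, 1, 2],
                   [1, 2, 0], [2, 0, 2], [2, 1, 0], [2, 2, 1]])
    wear = np.array([0.21, 0.25, 0.30, 0.24, 0.28, 0.22, 0.33, 0.26, 0.27])

    sn = np.array([sn_smaller_is_better([w]) for w in wear])
    for f in range(3):
        means = [sn[L9[:, f] == lvl].mean() for lvl in range(3)]
        print(f"factor {f}: mean S/N per level = {np.round(means, 2)}")
    # The level with the highest mean S/N is the preferred setting per factor.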
Cadena, Natalia L; Cue-Sampedro, Rodrigo; Siller, Héctor R; Arizmendi-Morquecho, Ana M; Rivera-Solorio, Carlos I; Di-Nardo, Santiago
2013-05-24
The manufacture of medical and aerospace components made of titanium alloys and other difficult-to-cut materials requires the parallel development of high performance cutting tools coated with materials that provide enhanced tribological and wear-resistance properties. To this end, a thin nanocomposite film made of AlCrN (aluminum-chromium-nitride) was studied in this research, with experimental work covering the deposition process and its characterization. A heat-treated monolayer coating, competitive with other coatings in the machining of titanium alloys, was analyzed. The manufactured coating was characterized by scanning electron microscopy and energy-dispersive X-ray spectroscopy (SEM-EDXS), and X-ray diffraction (XRD). Furthermore, the mechanical behavior of the coating was evaluated through hardness testing and pin-on-disk tribology to quantify the friction coefficient and wear rate. Finally, machinability tests using coated tungsten carbide cutting tools were executed in order to determine performance through wear resistance, which is a key issue for cutting tools in high-end cutting at elevated temperatures. It was demonstrated that the specimen (with a lower friction coefficient than in previous research) is more efficient in machinability tests on Ti6Al4V alloys. Furthermore, the heat-treated monolayer coating performed better than a conventional AlCrN monolayer coating.
NASA Technical Reports Server (NTRS)
Simoneau, Robert J.; Strazisar, Anthony J.; Sockol, Peter M.; Reid, Lonnie; Adamczyk, John J.
1987-01-01
The discipline research in turbomachinery, which is directed toward building the tools needed to understand such a complex flow phenomenon, is based on the fact that flow in turbomachinery is fundamentally unsteady, or time dependent. Success in building a reliable inventory of analytic and experimental tools will depend on how time and time-averages are treated, as well as on how space and space-averages are treated. The raw tools at our disposal (both experimental and computational) are truly powerful, and their numbers are growing at a staggering pace. As a result of this power, a case can be made that information is outstripping understanding. The challenge is to develop a set of computational and experimental tools which genuinely increase understanding of the fluid flow and heat transfer in a turbomachine. Viewgraphs outline a philosophy based on working up a stairstep hierarchy of mathematical and experimental complexity to build a system of tools which enables one to aggressively design the turbomachinery of the next century. Examples of the types of computational and experimental tools under current development at Lewis, with progress to date, are examined. The examples include work in both the time-resolved and time-averaged domains. Finally, an attempt is made to identify the proper place for Lewis in this continuum of research.
Expanding a dynamic flux balance model of yeast fermentation to genome-scale
2011-01-01
Background Yeast is considered to be a workhorse of the biotechnology industry for the production of many value-added chemicals, alcoholic beverages and biofuels. Optimization of the fermentation is a challenging task that greatly benefits from dynamic models able to accurately describe and predict the fermentation profile and resulting products under different genetic and environmental conditions. In this article, we developed and validated a genome-scale dynamic flux balance model, using experimentally determined kinetic constraints. Results Appropriate equations for maintenance, biomass composition, anaerobic metabolism and nutrient uptake are key to improve model performance, especially for predicting glycerol and ethanol synthesis. Prediction profiles of synthesis and consumption of the main metabolites involved in alcoholic fermentation closely agreed with experimental data obtained from numerous lab and industrial fermentations under different environmental conditions. Finally, fermentation simulations of genetically engineered yeasts closely reproduced previously reported experimental results regarding final concentrations of the main fermentation products such as ethanol and glycerol. Conclusion A useful tool to describe, understand and predict metabolite production in batch yeast cultures was developed. The resulting model, if used wisely, could help to search for new metabolic engineering strategies to manage ethanol content in batch fermentations.
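A minimal, hedged sketch of the dynamic flux balance idea described above: an inner linear program chooses fluxes that maximize growth subject to a Michaelis–Menten uptake bound, and an outer Euler loop integrates biomass and glucose. The two-variable "stoichiometry" and all constants are hypothetical toys, not the paper's genome-scale model:

    # Toy dynamic FBA loop (illustrative; not the genome-scale model).
    from scipy.optimize import linprog

    def toy_dfba(X0=0.05, G0=20.0, dt=0.1, t_end=12.0, Vmax=10.0, Km=0.5, Y=0.08):
        X, G, t, traj = X0, G0, 0.0, []
        while t < t_end and G > 1e-6:
            v_up = Vmax * G / (Km + G)          # glucose uptake bound, mmol/gDW/h
            # variables [v_glc, mu]; maximize mu subject to mu <= Y * v_glc
            res = linprog(c=[0.0, -1.0], A_ub=[[-Y, 1.0]], b_ub=[0.0],
                          bounds=[(0.0, v_up), (0.0, None)])
            v_glc, mu = res.x
            X += mu * X * dt                    # biomass, gDW/L
            G = max(G - v_glc * X * dt, 0.0)    # glucose, mmol/L
            t += dt
            traj.append((round(t, 2), X, G))
        return traj

    # Usage: for t, X, G in toy_dfba()[::20]: print(t, round(X, 3), round(G, 2))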
Synthetic biology as it relates to CAM photosynthesis: challenges and opportunities.
DePaoli, Henrique C; Borland, Anne M; Tuskan, Gerald A; Cushman, John C; Yang, Xiaohan
2014-07-01
To meet future food and energy security needs, which are amplified by increasing population growth and reduced natural resource availability, metabolic engineering efforts have moved from manipulating single genes/proteins to introducing multiple genes and novel pathways to improve photosynthetic efficiency in a more comprehensive manner. Biochemical carbon-concentrating mechanisms such as crassulacean acid metabolism (CAM), which improves photosynthetic, water-use, and possibly nutrient-use efficiency, represent a strategic target for synthetic biology to engineer more productive C3 crops for a warmer and drier world. One key challenge for introducing multigene traits like CAM onto a background of C3 photosynthesis is to gain a better understanding of the dynamic spatial and temporal regulatory events that underpin photosynthetic metabolism. With the aid of systems and computational biology, vast amounts of experimental data encompassing transcriptomics, proteomics, and metabolomics can be related in a network to create dynamic models. Such models can undergo simulations to discover key regulatory elements in metabolism and suggest strategic substitution or augmentation by synthetic components to improve photosynthetic performance and water-use efficiency in C3 crops. Another key challenge in the application of synthetic biology to photosynthesis research is to develop efficient systems for multigene assembly and stacking. Here, we review recent progress in computational modelling as applied to plant photosynthesis, with attention to the requirements for CAM, and recent advances in synthetic biology tool development. Lastly, we discuss possible options for multigene pathway construction in plants with an emphasis on CAM-into-C3 engineering.
Currie, Richard A.; Peffer, Richard C.; Goetz, Amber K.; Omiecinski, Curtis J.; Goodman, Jay I.
2014-01-01
Toxicogenomics (TGx) is employed frequently to investigate the underlying molecular mechanisms of a compound of interest and, thus, has become an aid to mode of action determination. However, the results and interpretation of a TGx dataset are influenced by the experimental design and methods of analysis employed. This article describes an evaluation and reanalysis, by two independent laboratories, of previously published TGx mouse liver microarray data for a triazole fungicide, propiconazole (PPZ), and the anticonvulsant drug phenobarbital (PB). Propiconazole produced an increased incidence of liver tumors in male CD-1 mice only at a dose that exceeded the maximum tolerated dose (2500 ppm). First, we illustrate how experimental design differences between two in vivo studies with PPZ and PB may impact the comparison of TGx results. Second, we demonstrate that different researchers using different pathway analysis tools can come to different conclusions on specific mechanistic pathways, even when using the same datasets. Finally, despite these differences, the results across the three analyses show a striking degree of similarity for PPZ- and PB-treated livers when the expression data are viewed in terms of the major signaling pathways and cell processes affected. Additional studies described here show that the postulated key event of hepatocellular proliferation was observed in CD-1 mice for both PPZ and PB, and that PPZ is also a potent activator of the mouse CAR nuclear receptor. Thus, with regard to the events which are hallmarks of CAR-induced effects that are key events in the mode of action (MOA) of mouse liver carcinogenesis with PB, PPZ-induced tumors can be viewed as being promoted by a similar PB-like CAR-dependent MOA.
Guitton, Yann; Tremblay-Franco, Marie; Le Corguillé, Gildas; Martin, Jean-François; Pétéra, Mélanie; Roger-Mele, Pierrick; Delabrière, Alexis; Goulitquer, Sophie; Monsoor, Misharl; Duperier, Christophe; Canlet, Cécile; Servien, Rémi; Tardivel, Patrick; Caron, Christophe; Giacomoni, Franck; Thévenot, Etienne A
2017-12-01
Metabolomics is a key approach in modern functional genomics and systems biology. Due to the complexity of metabolomics data, the variety of experimental designs, and the multiplicity of bioinformatics tools, providing experimenters with a simple and efficient resource to conduct comprehensive and rigorous analysis of their data is of utmost importance. In 2014, we launched the Workflow4Metabolomics (W4M; http://workflow4metabolomics.org) online infrastructure for metabolomics built on the Galaxy environment, which offers user-friendly features to build and run data analysis workflows including preprocessing, statistical analysis, and annotation steps. Here we present the new W4M 3.0 release, which contains twice as many tools as the first version and provides two features which are, to our knowledge, unique among online resources. First, data from the four major metabolomics technologies (i.e., LC-MS, FIA-MS, GC-MS, and NMR) can be analyzed on a single platform. Using three studies in human physiology, alga evolution, and animal toxicology, we demonstrate how the 40 available tools can be easily combined to address biological issues. Second, the full analysis (including the workflow, the parameter values, the input data and output results) can be referenced with a permanent digital object identifier (DOI). Publication of data analyses is of major importance for robust and reproducible science, and the publicly shared workflows are of high value for e-learning and training. The Workflow4Metabolomics 3.0 e-infrastructure thus not only offers a unique online environment for analysis of data from the main metabolomics technologies, but is also the first reference repository for metabolomics workflows.
Strengthening Ecological Mindfulness through Hybrid Learning in Vital Coalitions
ERIC Educational Resources Information Center
Sol, Jifke; Wals, Arjen E. J.
2015-01-01
In this contribution a key policy "tool" used in the Dutch Environmental Education and Learning for Sustainability Policy framework is introduced as a means to develop a sense of place and associated ecological mindfulness. The key elements of this tool, called the vital coalition, are described while an example of its use in practice,…
Keys and the crisis in taxonomy: extinction or reinvention?
Walter, David Evans; Winterton, Shaun
2007-01-01
Dichotomous keys that follow a single pathway of character state choices to an end point have been the primary tools for the identification of unknown organisms for more than two centuries. However, a revolution in computer diagnostics is now under way that may result in the replacement of traditional keys by matrix-based computer interactive keys that have many paths to a correct identification and make extensive use of hypertext to link to images, glossaries, and other support material. Progress is also being made on replacing keys entirely by optical matching of specimens to digital databases and DNA sequences. These new tools may go some way toward alleviating the taxonomic impediment to biodiversity studies and other ecological and evolutionary research, especially with better coordination between those who produce keys and those who use them and by integrating interactive keys into larger biological Web sites.
Randomized test of an implementation intention-based tool to reduce stress-induced eating.
O'Connor, Daryl B; Armitage, Christopher J; Ferguson, Eamonn
2015-06-01
Stress may indirectly contribute to disease (e.g. cardiovascular disease, cancer) by producing deleterious changes to diet. The purpose of this study was to test the effectiveness of a stress management support (SMS) tool to reduce stress-related unhealthy snacking and to promote stress-related healthy snacking. Participants were randomized to complete an SMS tool with instruction to link stressful situations with healthy snack alternatives (experimental) or an SMS tool without a linking instruction (control). On-line daily reports of stressors and snacking were completed for 7 days. Daily stressors were associated with unhealthy snack consumption in the control condition but not in the experimental condition. Participants highly motivated towards healthy eating consumed a greater number of healthy snacks on stressful days in the experimental condition compared with participants in the experimental condition with low or average levels of motivation. This tool is an effective, theory-driven intervention that helps to protect against stress-induced high-calorie snack consumption.
2011-01-01
Background Multiple types of assays allow sensitive detection of virus-specific neutralizing antibodies. For example, the extent of antibody neutralization of HIV-1, SIV and SHIV can be measured in the TZM-bl cell line through the degree of luciferase reporter gene expression after infection. In the past, neutralization curves and titers for this standard assay have been calculated using an Excel macro. Updating all instances of such a macro with new techniques can be unwieldy and introduce non-uniformity across multi-lab teams. Using Excel also poses challenges in centrally storing, sharing and associating raw data files and results. Results We present LabKey Server's NAb tool for organizing, analyzing and securely sharing data, files and results for neutralizing antibody (NAb) assays, including the luciferase-based TZM-bl NAb assay. The customizable tool supports high-throughput experiments and includes a graphical plate template designer, allowing researchers to quickly adapt calculations to new plate layouts. The tool calculates the percent neutralization for each serum dilution based on luminescence measurements, fits a range of neutralization curves to titration results and uses these curves to estimate the neutralizing antibody titers for benchmark dilutions. Results, curve visualizations and raw data files are stored in a database and shared through a secure, web-based interface. NAb results can be integrated with other data sources based on sample identifiers. It is simple to make results public after publication by updating folder security settings. Conclusions Standardized tools for analyzing, archiving and sharing assay results can improve the reproducibility, comparability and reliability of results obtained across many labs. LabKey Server and its NAb tool are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. Many members of the HIV research community can also access the LabKey Server NAb tool without installing the software by using the Atlas Science Portal (https://atlas.scharp.org). Atlas is an installation of LabKey Server.
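A hedged sketch of the curve-fitting step described above: fit a four-parameter logistic to percent neutralization versus serum dilution and read off the 50% titer. The dilution series and responses below are hypothetical, and the NAb tool's actual curve family and fitting options are not reproduced:

    # Four-parameter logistic fit of a neutralization titration (illustrative).
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(log_dil, bottom, top, log_id50, slope):
        """Percent neutralization as a function of log10 serum dilution."""
        return bottom + (top - bottom) / (1 + 10 ** ((log_dil - log_id50) * slope))

    dilutions = np.array([20, 60, 180, 540, 1620, 4860], dtype=float)
    pct_neut = np.array([95, 88, 70, 42, 18, 6], dtype=float)  # hypothetical

    x = np.log10(dilutions)
    popt, _ = curve_fit(four_pl, x, pct_neut,
                        p0=[0.0, 100.0, np.log10(500.0), 1.0])
    id50 = 10 ** popt[2]
    print(f"estimated 50% neutralization titer ~ 1:{id50:.0f}")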
Ferber, Julia; Schneider, Gudrun; Havlik, Linda; Heuft, Gereon; Friederichs, Hendrik; Schrewe, Franz-Bernhard; Schulz-Steinel, Andrea; Burgmer, Markus
2014-01-01
To improve the synergy of established methods of teaching, the Department of Psychosomatics and Psychotherapy, University Hospital Münster, developed a web-based e-learning tool using video clips of standardized patients. The effect of this blended-learning approach was evaluated. A multiple-choice test was performed by a naive cohort of medical students (without the e-learning tool) and an experimental cohort (with the tool) to test the groups' expertise in psychosomatics. In addition, participants' satisfaction with the new tool was evaluated (numeric rating scale of 0-10). The experimental cohort was more satisfied with the curriculum and more interested in psychosomatics. Furthermore, the experimental cohort scored significantly better in the multiple-choice test. The new tool proved to be an important addition to the classical curriculum as a blended-learning approach that improves students' satisfaction and knowledge in psychosomatics.
Motorized wellbore fishing tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, L.E.; Schasteen, T.
1989-08-15
This patent describes a fishing tool for retrieving an article located in a wellbore, wherein the fishing tool may be lowered into the wellbore by means connected to one end of the fishing tool. The fishing tool comprising: an elongated tubular body; an inner sleeve member secured to the body and extending axially within the body; a ball key disposed within each of the openings and movable at least partially into the bore in locking registration with a fishing head connected to the article; an outer sleeve member disposed in sleeved relationship around the inner sleeve member and movable axially between first and second positions with respect to the inner sleeve member. The outer sleeve member being operable to prevent, in the first position, radial outward movement of the ball keys out of the bore. The outer sleeve member including recess means formed thereon such that in the second position of the outer sleeve member the recess means is adjacent to the circumferentially spaced openings to allow limited radial outward movement of the ball keys; and means for axially moving the outer sleeve member between the first and second positions for engaging and releasing the fishing head with respect to the tool.
ERIC Educational Resources Information Center
Maseda, F. J.; Martija, I.; Martija, I.
2012-01-01
This paper describes a novel Electrical Machine and Power Electronic Training Tool (EM&PE_TT), a methodology for using it, and associated experimental educational activities. The training tool is implemented by recreating a whole power electronics system, divided into modular blocks. This process is similar to that applied when…
Progress in modelling agricultural impacts of and adaptations to climate change.
Rötter, R P; Hoffmann, M P; Koch, M; Müller, C
2018-06-01
Modelling is a key tool to explore agricultural impacts of and adaptations to climate change. Here we report recent progress, especially within the large project initiatives MACSUR and AgMIP, in particular in modelling potential crop impacts from field to global scale using multi-model ensembles. We identify two main fields where further progress is necessary: a more mechanistic understanding of climate impacts and management options for adaptation and mitigation; and a focus on cropping systems and integrative multi-scale assessments instead of single seasons and crops, especially in complex tropical and neglected but important cropping systems. Stronger linking of experimentation with statistical and eco-physiological crop modelling could facilitate the necessary methodological advances.
Carol-Visser, Jeroen; van der Schans, Marcel; Fidder, Alex; Hulst, Albert G; van Baar, Ben L M; Irth, Hubertus; Noort, Daan
2008-07-01
Rapid monitoring and retrospective verification are key issues in protection against and non-proliferation of chemical warfare agents (CWA). Such monitoring and verification are adequately accomplished by the analysis of persistent protein adducts of these agents. Liquid chromatography-mass spectrometry (LC-MS) is the tool of choice in the analysis of such protein adducts, but the overall experimental procedure is quite elaborate. Therefore, an automated on-line pepsin digestion-LC-MS configuration has been developed for the rapid determination of CWA protein adducts. The utility of this configuration is demonstrated by the analysis of specific adducts of sarin and sulfur mustard to human butyryl cholinesterase and human serum albumin, respectively.
Bioimage informatics for experimental biology
Swedlow, Jason R.; Goldberg, Ilya G.; Eliceiri, Kevin W.
2012-01-01
Over the last twenty years there have been great advances in light microscopy, with the result that multi-dimensional imaging has driven a revolution in modern biology. New approaches to data acquisition are reported frequently, and yet the significant data management and analysis challenges presented by these new, complex datasets remain largely unsolved. As in the well-developed field of genome bioinformatics, central repositories are and will be key resources, but there is a critical need for informatics tools in individual laboratories to help manage, share, visualize, and analyze image data. In this article we present the recent efforts by the bioimage informatics community to tackle these challenges and discuss our own vision for the future development of bioimage informatics solutions.
Gianni, Stefano; Dogan, Jakob; Jemth, Per
2014-01-01
The Φ value analysis is a method to analyze the structure of metastable states in reaction pathways. The methodology is based on the quantitative analysis of the effect of point mutations on the kinetics and thermodynamics of the probed reaction. The Φ value analysis is routinely used in protein folding studies and is potentially an extremely powerful tool to analyze the mechanism of binding-induced folding of intrinsically disordered proteins. In this review we recapitulate the key equations and experimental advice for performing the Φ value analysis, with attention to the possible caveats arising in intrinsically disordered systems. Finally, we briefly discuss the few examples already available in the literature.
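For reference, the key relations behind the analysis in standard notation (a generic statement, not the review's own formulation): the equilibrium and activation free-energy perturbations caused by a point mutation are obtained from equilibrium and rate constants, and their ratio defines Φ.

    % Phi-value relations from kinetics and thermodynamics of wt vs. mutant
    \Delta\Delta G_{\mathrm{eq}} = -RT\,\ln\frac{K_{\mathrm{mut}}}{K_{\mathrm{wt}}},
    \qquad
    \Delta\Delta G^{\ddagger} = -RT\,\ln\frac{k_{\mathrm{mut}}}{k_{\mathrm{wt}}},
    \qquad
    \Phi = \frac{\Delta\Delta G^{\ddagger}}{\Delta\Delta G_{\mathrm{eq}}}

A Φ near 1 indicates that the mutated site is as structured in the transition state as in the native (or bound) state, while a Φ near 0 indicates it is as unstructured as in the denatured (or free) state.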
Folding and unfolding single RNA molecules under tension
Woodside, Michael T; García-García, Cuauhtémoc; Block, Steven M
2010-01-01
Single-molecule force spectroscopy constitutes a powerful method for probing RNA folding: it allows the kinetic, energetic, and structural properties of intermediate and transition states to be determined quantitatively, yielding new insights into folding pathways and energy landscapes. Recent advances in experimental and theoretical methods, including fluctuation theorems, kinetic theories, novel force clamps, and ultrastable instruments, have opened new avenues for study. These tools have been used to probe folding in simple model systems, for example, RNA and DNA hairpins. Knowledge gained from such systems is helping to build our understanding of more complex RNA structures composed of multiple elements, as well as how nucleic acids interact with proteins involved in key cellular activities, such as transcription and translation.
Spontaneous generation of frequency combs in QD lasers
NASA Astrophysics Data System (ADS)
Columbo, Lorenzo Luigi; Bardella, Paolo; Gioannini, Mariangela
2018-02-01
We report a systematic analysis of the phenomenon of self-generation of optical frequency combs in single-section Fabry-Perot Quantum Dot lasers using a Time Domain Travelling Wave model. We show that the carrier grating due to the standing wave pattern (spatial hole burning) peculiar to Quantum Dot lasers, together with Four Wave Mixing, are the key ingredients explaining spontaneous Optical Frequency Combs in these devices. Our results agree well with recent experimental evidence reported for semiconductor lasers based on Quantum Dot and Quantum Dash active material and pave the way to the development of a simulation tool for the design of these comb laser sources for innovative applications in the field of high-data-rate optical communications.
Measurement of W + bb and a search for MSSM Higgs bosons with the CMS detector at the LHC
NASA Astrophysics Data System (ADS)
O'Connor, Alexander Pinpin
Tooling used to cure composite laminates in the aerospace and automotive industries must provide a dimensionally stable geometry throughout the thermal cycle applied during the part curing process. This requires that the Coefficient of Thermal Expansion (CTE) of the tooling materials match that of the composite being cured. The traditional tooling material for production applications is a nickel alloy. Poor machinability and high material costs increase the expense of metallic tooling made from nickel alloys such as 'Invar 36' or 'Invar 42'. Currently, metallic tooling is unable to meet the needs of applications requiring rapid, affordable tooling solutions. In applications where the tooling is not required to have the durability provided by metals, such as small-area repair, an opportunity exists for non-metallic tooling materials like graphite, carbon foams, composites, or ceramics and machinable glasses. Nevertheless, efficient machining of brittle, non-metallic materials is challenging due to low ductility, porosity, and high hardness. The machining of a layup tool comprises a large portion of its final cost, and achieving maximum process economy requires optimization of the machining process in the given tooling material. Machinability of the tooling material is therefore a critical aspect of the overall cost of the tool. In this work, three commercially available, brittle/porous, non-metallic candidate tooling materials were selected, namely: Autoclaved Aerated Concrete (AAC), CB1100 ceramic block and Cfoam carbon foam. Machining tests were conducted in order to evaluate the machinability of these materials using end milling. Chip formation, cutting forces, cutting tool wear, machining induced damage, surface quality and surface integrity were investigated using High Speed Steel (HSS), carbide, diamond abrasive and Polycrystalline Diamond (PCD) cutting tools. Cutting forces were found to be random in magnitude, a result of material porosity. The abrasive nature of Cfoam produced rapid tool wear when using HSS and PCD cutting tools, whereas tool wear was not significant in AAC or CB1100 regardless of the type of cutting edge. Machining induced damage was observed in the form of macro-scale chipping and fracture in combination with micro-scale cracking. Transverse rupture test results revealed significant reductions in residual strength and damage tolerance in CB1100; in contrast, AAC and Cfoam showed no correlation between machining induced damage and a reduction in surface integrity. Cutting forces in machining were modeled for all materials. Cutting force regression models were developed based on Design of Experiments and Analysis of Variance, and a mechanistic cutting force model was proposed based upon conventional end milling force models and statistical distributions of material porosity. In order to validate the model, predicted cutting forces were compared to experimental results and agreed well with the measurements. Furthermore, over the range of cutting conditions tested, the proposed model was shown to have predictive accuracy comparable to the empirically produced regression models, greatly reducing the number of cutting tests required to simulate cutting forces. This work thus demonstrates a key adaptation of metallic cutting force models to brittle porous materials, a vital step in research into the machining of these materials using end milling.
The EPA Control Strategy Tool (CoST) is a software tool for projecting potential future control scenarios, their effects on emissions and estimated costs. This tool uses the NEI and the Control Measures Dataset as key inputs. CoST outputs are projections of future control scenarios.
Kastner, Monika; Perrier, Laure; Hamid, Jemila; Tricco, Andrea C; Cardoso, Roberta; Ivers, Noah M; Liu, Barbara; Marr, Sharon; Holroyd-Leduc, Jayna; Wong, Geoff; Graves, Lisa; Straus, Sharon E
2015-02-03
The burden of chronic disease is a global phenomenon, particularly among people aged 65 years and older. More than half of older adults have more than one chronic disease and their care is not optimal. Chronic disease management (CDM) tools have the potential to meet this challenge but they are primarily focused on a single disease, which fails to address the growing number of seniors with multiple chronic conditions. We will conduct a systematic review alongside a realist review to identify effective CDM tools that integrate one or more high-burden chronic diseases affecting older adults and to better understand for whom, under what circumstances, how and why they produce their outcomes. We will search MEDLINE, EMBASE, CINAHL, AgeLine and the Cochrane Library for experimental, quasi-experimental, observational and qualitative studies in any language investigating CDM tools that facilitate optimal disease management in one or more high-burden chronic diseases affecting adults aged ≥65 years. Study selection will involve calibration of reviewers to ensure reliability of screening and duplicate assessment of articles. Data abstraction and risk of bias assessment will also be performed independently. Analysis will include descriptive summaries of study and appraisal characteristics, effectiveness of each CDM tool (meta-analysis if appropriate); and a realist programme theory will be developed and refined to explain the outcome patterns within the included studies. Ethics approval is not required for this study. We anticipate that our findings, pertaining to gaps in care across high-burden chronic diseases affecting seniors and highlighting specific areas that may require more research, will be of interest to a wide range of knowledge users and stakeholders. We will publish and present our findings widely, and also plan more active dissemination strategies such as workshops with our key stakeholders. Our protocol is registered with PROSPERO (registration number CRD42014014489).
Lobo, Daniel; Levin, Michael
2015-01-01
Transformative applications in biomedicine require the discovery of complex regulatory networks that explain the development and regeneration of anatomical structures, and reveal what external signals will trigger desired changes of large-scale pattern. Despite recent advances in bioinformatics, extracting mechanistic pathway models from experimental morphological data is a key open challenge that has resisted automation. The fundamental difficulty of manually predicting the emergent behavior of even simple networks has limited the models invented by human scientists to pathway diagrams that show necessary subunit interactions but do not reveal the dynamics that are sufficient for complex, self-regulating pattern to emerge. To finally bridge the gap between high-resolution genetic data and the ability to understand and control patterning, it is critical to develop computational tools to efficiently extract regulatory pathways from the resultant experimental shape phenotypes. For example, planarian regeneration has been studied for over a century, but despite increasing insight into the pathways that control its stem cells, no constructive, mechanistic model has yet been found by human scientists that explains more than one or two key features of its remarkable ability to regenerate its correct anatomical pattern after drastic perturbations. We present a method to infer the molecular products, topology, and spatial and temporal non-linear dynamics of regulatory networks, recapitulating in silico the rich dataset of morphological phenotypes resulting from genetic, surgical, and pharmacological experiments. We demonstrated our approach by inferring complete regulatory networks explaining the outcomes of the main functional regeneration experiments in the planarian literature; by analyzing all the datasets together, our system inferred the first comprehensive systems-biology dynamical model explaining patterning in planarian regeneration. This method provides an automated, highly generalizable framework for identifying the underlying control mechanisms responsible for the dynamic regulation of growth and form.
Experimental validation of the RATE tool for inferring HLA restrictions of T cell epitopes.
Paul, Sinu; Arlehamn, Cecilia S Lindestam; Schulten, Veronique; Westernberg, Luise; Sidney, John; Peters, Bjoern; Sette, Alessandro
2017-06-21
The RATE tool was recently developed to computationally infer the HLA restriction of given epitopes from immune response data of HLA-typed subjects, without additional cumbersome experimentation. Here, RATE was validated using experimentally defined restriction data for a set of 191 tuberculosis-derived epitopes and 63 healthy individuals with MTB infection from the Western Cape Region of South Africa. Using this experimental dataset, the parameters utilized by the RATE tool to infer restriction were optimized; these included the relative frequency (RF) of subjects responding to a given epitope and expressing a given allele, as compared to the general test population, and the associated p-value in a Fisher's exact test. We also examined the potential for further optimization based on the predicted binding affinity of epitopes to potential restricting HLA alleles, and on the absolute number of individuals expressing a given allele and responding to the specific epitope. Different statistical measures, including Matthew's correlation coefficient, accuracy, sensitivity and specificity, were used to evaluate the performance of RATE as a function of these criteria. Based on our results we recommend selecting HLA restrictions with cutoffs of p-value < 0.01 and RF ≥ 1.3. The usefulness of the tool was demonstrated by inferring new HLA restrictions for epitope sets where restrictions could not be experimentally determined due to a lack of the necessary cell lines, and for an additional data set related to the recognition of pollen-derived epitopes in allergic patients. In summary, experimental data were used to validate the RATE tool, its inference parameters were optimized, and new HLA restrictions were identified using the optimized tool.
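A hedged sketch of the inference rule described above: for each epitope-allele pair, test whether responders are enriched for the allele with Fisher's exact test, compute the relative frequency (RF), and keep pairs passing the recommended cutoffs (p < 0.01, RF ≥ 1.3). The counts in the usage line are hypothetical, and RATE's exact contingency-table construction may differ:

    # Epitope-allele restriction call via Fisher's exact test + RF (illustrative).
    from scipy.stats import fisher_exact

    def infer_restriction(n_resp_allele, n_resp, n_allele, n_total,
                          p_cut=0.01, rf_cut=1.3):
        """2x2 table: responders vs. non-responders, allele+ vs. allele-."""
        table = [[n_resp_allele, n_resp - n_resp_allele],
                 [n_allele - n_resp_allele,
                  n_total - n_resp - (n_allele - n_resp_allele)]]
        _, p = fisher_exact(table, alternative="greater")
        # RF: allele frequency among responders relative to the whole cohort
        rf = (n_resp_allele / n_resp) / (n_allele / n_total)
        return (p < p_cut and rf >= rf_cut), p, rf

    # e.g. 9 of 12 responders carry the allele; 15 of 63 subjects carry it:
    print(infer_restriction(9, 12, 15, 63))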
Bersini, Simone; Gilardi, Mara; Arrigoni, Chiara; Talò, Giuseppe; Zamai, Moreno; Zagra, Luigi; Caiolfa, Valeria; Moretti, Matteo
2016-01-01
The generation of functional, vascularized tissues is a key challenge for both tissue engineering applications and the development of advanced in vitro models analyzing interactions among circulating cells, endothelium and organ-specific microenvironments. Since vascularization is a complex process guided by multiple synergic factors, it is critical to analyze the specific role that different experimental parameters play in the generation of physiological tissues. Our goals were to design a novel meso-scale model, bridging the gap between microfluidic and macro-scale studies, and to screen in high throughput the effects of multiple variables on the vascularization of bone-mimicking tissues. We investigated the influence of endothelial cell (EC) density (3-5 Mcells/ml), cell ratio among ECs, mesenchymal stem cells (MSCs) and osteo-differentiated MSCs (1:1:0, 10:1:0, 10:1:1), culture medium (endothelial, endothelial + angiopoietin-1, 1:1 endothelial/osteo), hydrogel type (100% fibrin, 60% fibrin + 40% collagen), and tissue geometry (2 × 2 × 2, 2 × 2 × 5 mm³). We optimized the geometry and oxygen gradient inside the hydrogels through computational simulations and analyzed microvascular network features including total network length/area and vascular branch number/length. In particular, we employed the "Design of Experiment" statistical approach to identify key differences among experimental conditions. We combined the generation of 3D functional tissue units with fine control over the local microenvironment (e.g. oxygen gradients), and developed an effective strategy to enable the high-throughput screening of multiple experimental parameters. Our approach allowed us to identify synergic correlations among critical parameters driving microvascular network development within a bone-mimicking environment and could be translated to any vascularized tissue.
Experimental evaluation of tool run-out in micro milling
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta
2018-05-01
This paper deals with the micro milling process, focusing on tool run-out measurement. Among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges, where the phase measurement is based on analysis of the force signal. The developed procedure has been tested on data from micro milling experiments performed on a Ti6Al4V sample. The results showed that the developed procedure can be successfully used for tool run-out estimation.
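A rough, hedged reading of how those three inputs combine (illustrative geometry only, not the authors' analytical model): with run-out, a two-flute tool sweeps a channel wider than its nominal diameter, so the run-out offset can be taken as half the width excess, while the deviation of the measured edge-to-edge phase angle from the ideal 180° reflects the run-out orientation.

    # Toy run-out estimate from channel width, tool diameter and phase angle.
    # Hypothetical geometric reading of the inputs named above, not the
    # paper's model; all numbers in the usage line are invented.
    def runout_estimate(channel_width_um, tool_diameter_um, phase_deg):
        """Return (offset_um, orientation_deg) of the run-out vector."""
        offset = 0.5 * (channel_width_um - tool_diameter_um)  # width excess / 2
        # ideal 2-flute spacing is 180 deg; half the deviation locates run-out
        orientation = 0.5 * (phase_deg - 180.0)
        return offset, orientation

    # e.g. runout_estimate(208.0, 200.0, 186.0) -> (4.0, 3.0)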
Video analysis of projectile motion using tablet computers as experimental tools
NASA Astrophysics Data System (ADS)
Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.
2014-01-01
Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.
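A minimal version of the analysis the article describes, assuming the app exports (t, y) samples of the vertical component: fit a quadratic to the position data and read g off the leading coefficient. The sample data below are hypothetical:

    # Extract g and the launch speed from vertical-position samples.
    import numpy as np

    t = np.array([0.00, 0.05, 0.10, 0.15, 0.20, 0.25, 0.30])           # s
    y = np.array([0.000, 0.112, 0.199, 0.262, 0.300, 0.313, 0.302])    # m

    a2, a1, a0 = np.polyfit(t, y, 2)   # y = a2*t^2 + a1*t + a0
    g = -2.0 * a2                      # since y = y0 + v0*t - g*t^2/2
    v0 = a1
    print(f"g ~ {g:.2f} m/s^2, launch speed v0 ~ {v0:.2f} m/s")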
ERIC Educational Resources Information Center
Wendt, Oliver; Miller, Bridget
2012-01-01
Critical appraisal of the research literature is an essential step in informing and implementing evidence-based practice. Quality appraisal tools that assess the methodological quality of experimental studies provide a means to identify the most rigorous research suitable for evidence-based decision-making. In single-subject experimental research,…
Experimental and analytical tools for evaluation of Stirling engine rod seal behavior
NASA Technical Reports Server (NTRS)
Krauter, A. I.; Cheng, H. S.
1979-01-01
The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.
Spatial cognition in a virtual reality home-cage extension for freely moving rodents
Kaupert, Ursula; Frei, Katja; Bagorda, Francesco; Schatz, Alexej; Tocker, Gilad; Rapoport, Sophie; Derdikman, Dori
2017-01-01
Virtual reality (VR) environments are a powerful tool to investigate brain mechanisms involved in the behavior of animals. With this technique, animals are usually head-fixed or secured in a harness, and training for cognitively more complex VR paradigms is time consuming. A VR apparatus allowing free animal movement and constant, operator-independent training of tasks would enable many new applications. Key prospective usages include brain imaging of animal behavior when carrying a miniaturized mobile device such as a fluorescence microscope or an optetrode. Here, we introduce the Servoball, a spherical VR treadmill based on the closed-loop tracking of a freely moving animal and feedback counterrotation of the ball. Furthermore, we present the complete integration of this experimental system with the animals' group home cage, from which single individuals can voluntarily enter the arena through a tunnel with radio-frequency identification (RFID)-automated access control and commence experiments. This automated animal sorter functions as a mechanical replacement of the experimenter. We automatically trained rats using visual or acoustic cues to solve spatial cognitive tasks and recorded spatially modulated entorhinal cells. In electrophysiological extracellular recordings from awake, behaving rats, head fixation can dramatically alter results and makes any complex behavior that requires head movement impossible to achieve. We circumvented this problem by using the Servoball in open-field scenarios, as it allows the combination of open-field behavior with the recording of nerve cells, along with all the flexibility that a virtual environment brings. This integrated home cage with a VR arena permits highly efficient experimentation for complex cognitive experiments. NEW & NOTEWORTHY Virtual reality (VR) environments are a powerful tool for the investigation of brain mechanisms. We introduce the Servoball, a VR treadmill for freely moving rodents. The Servoball is integrated with the animals' group home cage. Single individuals voluntarily enter using automated access control. Training is highly time-efficient, even for cognitively complex VR paradigms. PMID:28077665
Proximity matching for ArF and KrF scanners
NASA Astrophysics Data System (ADS)
Kim, Young Ki; Pohling, Lua; Hwee, Ng Teng; Kim, Jeong Soo; Benyon, Peter; Depre, Jerome; Hong, Jongkyun; Serebriakov, Alexander
2009-03-01
Many IC manufacturers around the world use various exposure systems and work to very demanding requirements in order to establish and maintain stable lithographic processes at 65 nm, 45 nm and below. Once a process is established, the manufacturer wants to be able to run it on the different tools that are available. This is why proximity matching plays a key role in maximizing tool utilization and productivity across different types of exposure tools. In this paper, we investigate the sources of error that cause optical proximity mismatch and evaluate several approaches for proximity matching of different types of 193 nm and 248 nm scanner systems, such as set-get sigma calibration, contrast adjustment, and, finally, tuning imaging parameters by optimization with Manual Scanner Matcher. First, to monitor the proximity mismatch, we collect CD measurement data for the reference tool and for the tool-to-be-matched. Normally, the measurement is performed for a set of line or space through-pitch structures. Secondly, by simulation or experiment, we determine the sensitivity of the critical structures with respect to small adjustments of exposure settings such as NA, sigma inner, sigma outer, dose, focus scan range, etc., the so-called 'proximity tuning knobs'. Then, with the help of special optimization software, we compute the proximity knob adjustment that has to be applied to the tool-to-be-matched to match the reference tool. Finally, we verify successful matching by exposing on the tool-to-be-matched with the tuned exposure settings. This procedure is applicable to inter- and intra-scanner-type matching, and possibly also to process transfers to the design targets. To illustrate the approach, we show experimental data as well as results of imaging simulations demonstrating successful matching of critical structures for ArF scanners of different tool generations.
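The knob-optimization step can be illustrated with an ordinary least-squares solve. The sensitivity matrix and mismatch values below are hypothetical placeholders; the actual work used dedicated optimization software (Manual Scanner Matcher):

```python
import numpy as np

# Sensitivity matrix S: rows = monitored CD structures (through pitch),
# columns = tuning knobs (NA, sigma_inner, sigma_outer, dose).
# Entries are hypothetical, in nm of CD change per unit knob change.
S = np.array([
    [12.0, -8.0,  5.0, 1.5],
    [ 9.0, -6.5,  4.2, 1.4],
    [ 4.0, -2.0,  1.8, 1.3],
    [-3.0,  1.5, -0.9, 1.2],
])

# Measured CD mismatch (nm) between tool-to-be-matched and reference.
delta_cd = np.array([2.1, 1.6, 0.4, -0.7])

# Least-squares knob adjustment that best cancels the mismatch:
# minimize || S @ dk + delta_cd ||.
dk, *_ = np.linalg.lstsq(S, -delta_cd, rcond=None)
print("knob adjustments:", dk)
print("residual mismatch (nm):", S @ dk + delta_cd)
```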
Collaborating with Youth to Inform and Develop Tools for Psychotropic Decision Making
Murphy, Andrea; Gardner, David; Kutcher, Stan; Davidson, Simon; Manion, Ian
2010-01-01
Introduction: Youth-oriented and youth-informed resources designed to support psychopharmacotherapeutic decision-making are essentially unavailable. This article outlines the approach taken to design such resources, the product that resulted, and the lessons learned from the process. Methods: A project team with psychopharmacology expertise was assembled. The project team reviewed best practices regarding medication educational materials and related tools to support decisions. Collaboration with key stakeholders, regarded as the primary end-users and target groups, occurred. A graphic designer and a plain language consultant were also retained. Results: Through an iterative and collaborative process over approximately 6 months, Med Ed and Med Ed Passport were developed. Literature and input from key stakeholders, in particular youth, were instrumental to the development of the tools and materials within Med Ed. A training program utilizing a train-the-trainer model was developed to facilitate the implementation of Med Ed in Ontario, which is currently ongoing. Conclusion: An evidence-informed process that includes youth and key stakeholder engagement is required for developing tools to support psychopharmacotherapeutic decision-making. The development process fostered an environment of reciprocity between the project team and key stakeholders. PMID:21037916
Optical metrology for testing an all-composite 2-meter diameter mirror
NASA Technical Reports Server (NTRS)
Catanzaro, B.; Thomas, James A.; Small, D.; Johnston, R.; Barber, D.; Connell, S.; Whitmore, S.; Cohen, E.
2001-01-01
The Herschel Space Observatory (formerly known as FIRST) consists of a 3.5 m space telescope designed for use in the long IR and sub-millimeter wavebands. To demonstrate the viability of a carbon fiber composite telescope for this application, Composite Optics Incorporated (COI) manufactured a fast (f/1), large (2 m), lightweight (10.1 kg/m²) demonstration mirror. A key challenge in demonstrating the performance of this novel mirror was to characterize the surface accuracy at cryogenic (70 K) temperatures. A wide variety of optical metrology techniques were investigated, and a brief survey of empirical test results and limitations of the various techniques is presented in this paper. Two complementary infrared (IR) techniques operating at a wavelength of 10.6 microns were chosen for further development: (1) IR Twyman-Green Phase Shifting Interferometry (IR PSI) and (2) IR Shack-Hartmann (IR SH) Wavefront Sensing. Innovative design modifications made to an existing IR PSI to achieve high-resolution, scannable, infrared measurements of the composite mirror are described. The modified interferometer was capable of measuring surface gradients larger than 350 microradians. The design and results of measurements made with a custom-built IR SH Wavefront Sensor operating at 10.6 microns are also presented. A compact experimental setup permitting simultaneous operation of both the IR PSI and IR SH tools is shown. The advantages and the limitations of the two key IR metrology tools are discussed.
Sah, Jay P.; Ross, Michael S.; Snyder, James R.; ...
2010-01-01
In fire-dependent forests, managers are interested in predicting the consequences of prescribed burning on postfire tree mortality. We examined the effects of prescribed fire on tree mortality in Florida Keys pine forests, using a factorial design with understory type, season, and year of burn as factors. We also used logistic regression to model the effects of burn season, fire severity, and tree dimensions on individual tree mortality. Despite limited statistical power due to problems in carrying out the full suite of planned experimental burns, associations with tree and fire variables were observed. Post-fire pine tree mortality was negatively correlated with tree size and positively correlated with char height and percent crown scorch. Unlike post-fire mortality, tree mortality associated with storm surge from Hurricane Wilma was greater in the large size classes. Due to their influence on population structure and fuel dynamics, the size-selective mortality patterns following fire and storm surge have practical importance for using fire as a management tool in Florida Keys pinelands in the future, particularly when the threats to their continued existence from tropical storms and sea level rise are expected to increase.
Mid-frequency Band Dynamics of Large Space Structures
NASA Technical Reports Server (NTRS)
Coppolino, Robert N.; Adams, Douglas S.
2004-01-01
High and low intensity dynamic environments experienced by a spacecraft during launch and on-orbit operations, respectively, induce structural loads and motions, which are difficult to reliably predict. Structural dynamics in low- and mid-frequency bands are sensitive to component interface uncertainty and non-linearity as evidenced in laboratory testing and flight operations. Analytical tools for prediction of linear system response are not necessarily adequate for reliable prediction of mid-frequency band dynamics and analysis of measured laboratory and flight data. A new MATLAB toolbox, designed to address the key challenges of mid-frequency band dynamics, is introduced in this paper. Finite-element models of major subassemblies are defined following rational frequency-wavelength guidelines. For computational efficiency, these subassemblies are described as linear, component mode models. The complete structural system model is composed of component mode subassemblies and linear or non-linear joint descriptions. Computation and display of structural dynamic responses are accomplished employing well-established, stable numerical methods, modern signal processing procedures and descriptive graphical tools. Parametric sensitivity and Monte-Carlo based system identification tools are used to reconcile models with experimental data and investigate the effects of uncertainties. Models and dynamic responses are exported for employment in applications, such as detailed structural integrity and mechanical-optical-control performance analyses.
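A minimal sketch of the Monte Carlo uncertainty propagation mentioned above, using a toy two-degree-of-freedom model with an uncertain joint stiffness. The model, stiffness values, and ±30% spread are illustrative assumptions, not the MATLAB toolbox itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def natural_freqs(k_joint, m=1.0, k=1.0e4):
    # Toy 2-DOF chain: two unit masses coupled through an uncertain
    # joint stiffness k_joint, one of them grounded through stiffness k.
    K = np.array([[k + k_joint, -k_joint],
                  [-k_joint,     k_joint]])
    M = np.diag([m, m])
    evals = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sort(np.sqrt(np.abs(evals))) / (2 * np.pi)  # Hz

# Joint stiffness known only to within +/-30%: sample and propagate.
samples = rng.uniform(0.7e4, 1.3e4, size=2000)
f2 = np.array([natural_freqs(kj)[1] for kj in samples])
print(f"2nd mode: {f2.mean():.1f} Hz +/- {f2.std():.1f} Hz")
```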
Microstructure Modeling of 3rd Generation Disk Alloys
NASA Technical Reports Server (NTRS)
Jou, Herng-Jeng
2010-01-01
The objective of this program is to model, validate, and predict the precipitation microstructure evolution, using PrecipiCalc (QuesTek Innovations LLC) software, for 3rd generation Ni-based gas turbine disc superalloys during processing and service, with a set of logical and consistent experiments and characterizations. Furthermore, within this program, the originally research-oriented microstructure simulation tool will be further improved and implemented to be a useful and user-friendly engineering tool. In this report, the key accomplishment achieved during the second year (2008) of the program is summarized. The activities of this year include final selection of multicomponent thermodynamics and mobility databases, precipitate surface energy determination from nucleation experiment, multiscale comparison of predicted versus measured intragrain precipitation microstructure in quench samples showing good agreement, isothermal coarsening experiment and interaction of grain boundary and intergrain precipitates, primary microstructure of subsolvus treatment, and finally the software implementation plan for the third year of the project. In the following year, the calibrated models and simulation tools will be validated against an independently developed experimental data set, with actual disc heat treatment process conditions. Furthermore, software integration and implementation will be developed to provide material engineers valuable information in order to optimize the processing of the 3rd generation gas turbine disc alloys.
JCDSA: a joint covariate detection tool for survival analysis on tumor expression profiles.
Wu, Yiming; Liu, Yanan; Wang, Yueming; Shi, Yan; Zhao, Xudong
2018-05-29
Survival analysis on tumor expression profiles has always been a key issue for subsequent biological experimental validation. It is crucial to select features that closely correspond to survival time, and equally important to select features that best discriminate between low-risk and high-risk groups of patients. Common features derived from the two aspects may provide candidate variables for cancer prognosis. Based on this two-step feature selection strategy, we developed a joint covariate detection tool for survival analysis on tumor expression profiles. Significant features, which are not only consistent with survival time but also associated with the categories of patients with different survival risks, are chosen. Using the miRNA expression data (Level 3) of 548 patients with glioblastoma multiforme (GBM) as an example, miRNA candidates for cancer prognosis were selected. The reliability of the miRNAs selected using this tool is demonstrated by 100 simulations. Furthermore, it was discovered that significant covariates are not directly composed of individually significant variables. Joint covariate detection provides a viewpoint for selecting variables that are not individually but jointly significant. Besides, it helps to select features that are not only consistent with survival time but also associated with prognosis risk. The software is available at http://bio-nefu.com/resource/jcdsa .
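The two-step screening strategy can be sketched with the lifelines package: a univariate Cox model tests consistency with survival time, and a median-split log-rank test checks discrimination between risk groups. This is an illustrative analogue, not the JCDSA implementation; the column names and 0.05 thresholds are assumptions:

```python
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

def jointly_screen(df, feature, duration_col="time", event_col="event"):
    # df: pandas DataFrame with one expression column per feature plus
    # survival-time and event-indicator columns (names are assumptions).
    # Step 1: consistency with survival time (univariate Cox model).
    cph = CoxPHFitter()
    cph.fit(df[[feature, duration_col, event_col]],
            duration_col=duration_col, event_col=event_col)
    p_cox = cph.summary.loc[feature, "p"]

    # Step 2: discrimination between risk groups, splitting the cohort
    # at the median expression level and comparing survival curves.
    hi = df[feature] >= df[feature].median()
    lr = logrank_test(df.loc[hi, duration_col], df.loc[~hi, duration_col],
                      event_observed_A=df.loc[hi, event_col],
                      event_observed_B=df.loc[~hi, event_col])

    # Retain the feature only if it passes both screens.
    return p_cox < 0.05 and lr.p_value < 0.05
```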
Lavis, John N; Bärnighausen, Till; El-Jardali, Fadi
2017-09-01
To describe the infrastructure available to support the production of policy-relevant health systems research syntheses, particularly those incorporating quasi-experimental evidence, and the tools available to support the use of these syntheses. Literature review. The general challenges associated with the available infrastructure include their sporadic nature or limited coverage of issues and countries, whereas the specific ones related to policy-relevant syntheses of quasi-experimental evidence include the lack of mechanism to register synthesis titles and scoping review protocols, the limited number of groups preparing user-friendly summaries, and the difficulty of finding quasi-experimental studies for inclusion in rapid syntheses and research syntheses more generally. Although some new tools have emerged in recent years, such as guidance workbooks and citizen briefs and panels, challenges related to using available tools to support the use of policy-relevant syntheses of quasi-experimental evidence arise from such studies potentially being harder for policymakers and stakeholders to commission and understand. Policymakers, stakeholders, and researchers need to expand the coverage and institutionalize the use of the available infrastructure and tools to support the use of health system research syntheses containing quasi-experimental evidence. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris
2011-01-01
A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated against a variety of experimental data sets, such as UH-60A data, DNW test data and HART II test data.
Protein Structure Prediction by Protein Threading
NASA Astrophysics Data System (ADS)
Xu, Ying; Liu, Zhijie; Cai, Liming; Xu, Dong
The seminal work of Bowie, Lüthy, and Eisenberg (Bowie et al., 1991) on "the inverse protein folding problem" laid the foundation of protein structure prediction by protein threading. By using simple measures for fitness of different amino acid types to local structural environments defined in terms of solvent accessibility and protein secondary structure, the authors derived a simple and yet profoundly novel approach to assessing if a protein sequence fits well with a given protein structural fold. Their follow-up work (Elofsson et al., 1996; Fischer and Eisenberg, 1996; Fischer et al., 1996a,b) and the work by Jones, Taylor, and Thornton (Jones et al., 1992) on protein fold recognition led to the development of a new brand of powerful tools for protein structure prediction, which we now term "protein threading." These computational tools have played a key role in extending the utility of all the experimentally solved structures by X-ray crystallography and nuclear magnetic resonance (NMR), providing structural models and functional predictions for many of the proteins encoded in the hundreds of genomes that have been sequenced up to now.
Binding-Site Compatible Fragment Growing Applied to the Design of β2-Adrenergic Receptor Ligands.
Chevillard, Florent; Rimmer, Helena; Betti, Cecilia; Pardon, Els; Ballet, Steven; van Hilten, Niek; Steyaert, Jan; Diederich, Wibke E; Kolb, Peter
2018-02-08
Fragment-based drug discovery is intimately linked to fragment extension approaches that can be accelerated using software for de novo design. Although computers allow for the facile generation of millions of suggestions, synthetic feasibility is however often neglected. In this study we computationally extended, chemically synthesized, and experimentally assayed new ligands for the β2-adrenergic receptor (β2AR) by growing fragment-sized ligands. In order to address the synthetic tractability issue, our in silico workflow aims at derivatized products based on robust organic reactions. The study started from the predicted binding modes of five fragments. We suggested a total of eight diverse extensions that were easily synthesized, and further assays showed that four products had an improved affinity (up to 40-fold) compared to their respective initial fragment. The described workflow, which we call "growing via merging" and for which the key tools are available online, can improve early fragment-based drug discovery projects, making it a useful creative tool for medicinal chemists during structure-activity relationship (SAR) studies.
Strategies for single-point diamond machining a large format germanium blazed immersion grating
NASA Astrophysics Data System (ADS)
Montesanti, R. C.; Little, S. L.; Kuzmenko, P. J.; Bixler, J. V.; Jackson, J. L.; Lown, J. G.; Priest, R. E.; Yoxall, B. E.
2016-07-01
A large format germanium immersion grating was flycut with a single-point diamond tool on the Precision Engineering Research Lathe (PERL) at the Lawrence Livermore National Laboratory (LLNL) in November-December 2015. The grating, referred to as 002u, has an area of 59 mm x 67 mm (along-groove and cross-groove directions), a line pitch of 88 lines/mm, and a blaze angle of 32 degrees. Based on total groove length, the 002u grating is five times larger than the previous largest grating (ZnSe) cut on PERL, and forty-five times larger than the previous largest germanium grating cut on PERL. The key risks associated with cutting the 002u grating were tool wear and keeping the PERL machine running uninterrupted in a stable machining environment. This paper presents the strategies employed to mitigate these risks, introduces pre-machining of the as-etched grating substrate to produce a smooth, flat, damage-free surface into which the grooves are cut, and reports on trade-offs that drove decisions and experimental results.
A Genome-Scale Metabolic Reconstruction of Mycoplasma genitalium, iPS189
Suthers, Patrick F.; Dasika, Madhukar S.; Kumar, Vinay Satish; Denisov, Gennady; Glass, John I.; Maranas, Costas D.
2009-01-01
With a genome size of ∼580 kb and approximately 480 protein coding regions, Mycoplasma genitalium is one of the smallest known self-replicating organisms and, additionally, has extremely fastidious nutrient requirements. The reduced genomic content of M. genitalium has led researchers to suggest that the molecular assembly contained in this organism may be a close approximation to the minimal set of genes required for bacterial growth. Here, we introduce a systematic approach for the construction and curation of a genome-scale in silico metabolic model for M. genitalium. Key challenges included estimation of biomass composition, handling of enzymes with broad specificities, and the lack of a defined medium. Computational tools were subsequently employed to identify and resolve connectivity gaps in the model as well as growth prediction inconsistencies with gene essentiality experimental data. The curated model, M. genitalium iPS189 (262 reactions, 274 metabolites), is 87% accurate in recapitulating in vivo gene essentiality results for M. genitalium. Approaches and tools described herein provide a roadmap for the automated construction of in silico metabolic models of other organisms. PMID:19214212
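In silico gene essentiality screening of the kind used to curate such models can be sketched with the COBRApy library. The file name iPS189.xml and the 5% growth threshold are assumptions for illustration; the original work predates COBRApy and used its own toolchain:

```python
from cobra.io import read_sbml_model
from cobra.flux_analysis import single_gene_deletion

# Load a genome-scale model (hypothetical local SBML file of iPS189).
model = read_sbml_model("iPS189.xml")

# Simulate single-gene knockouts: a gene is called essential in silico
# if its deletion drops growth below a threshold fraction of wild type.
wild_type = model.optimize().objective_value
results = single_gene_deletion(model)
essential = results[results["growth"] < 0.05 * wild_type]
print(f"{len(essential)} of {len(results)} genes predicted essential")
```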
Haak, Andrew J; Girtman, Megan A; Ali, Mohamed F; Carmona, Eva M; Limper, Andrew H; Tschumperlin, Daniel J
2017-09-15
Pirfenidone recently received FDA approval as one of the first two drugs designed to treat idiopathic pulmonary fibrosis. While the clinical data continue to support the efficacy of pirfenidone, the specific molecular mechanism of action of this drug has not been fully defined. From a chemical perspective, the comparatively simple and lipophilic structure of pirfenidone, combined with its administration at high doses both experimentally and clinically, complicates some of the basic tenets of drug action and drug design. Our objective here was to identify a commercially available structural mimic of pirfenidone that retains key aspects of its physical chemical properties but does not display any of its antifibrotic effects. We tested these molecules using lung fibroblasts derived from patients with idiopathic pulmonary fibrosis and found phenylpyrrolidine-based analogs of pirfenidone that were non-toxic and lacked antifibrotic activity even when applied at millimolar concentrations. Based on our findings, these molecules represent pharmacological tools for future studies delineating pirfenidone's mechanism of action. Copyright © 2017 Elsevier B.V. All rights reserved.
Altan, Irem; Charbonneau, Patrick; Snell, Edward H.
2016-01-01
Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed. PMID:26792536
PetriScape - A plugin for discrete Petri net simulations in Cytoscape.
Almeida, Diogo; Azevedo, Vasco; Silva, Artur; Baumbach, Jan
2016-06-04
Systems biology plays a central role for biological network analysis in the post-genomic era. Cytoscape is the standard bioinformatics tool offering the community an extensible platform for computational analysis of the emerging cellular network together with experimental omics data sets. However, only few apps/plugins/tools are available for simulating network dynamics in Cytoscape 3. Many approaches of varying complexity exist but none of them have been integrated into Cytoscape as app/plugin yet. Here, we introduce PetriScape, the first Petri net simulator for Cytoscape. Although discrete Petri nets are quite simplistic models, they are capable of modeling global network properties and simulating their behaviour. In addition, they are easily understood and well visualizable. PetriScape comes with the following main functionalities: (1) import of biological networks in SBML format, (2) conversion into a Petri net, (3) visualization as Petri net, and (4) simulation and visualization of the token flow in Cytoscape. PetriScape is the first Cytoscape plugin for Petri nets. It allows a straightforward Petri net model creation, simulation and visualization with Cytoscape, providing clues about the activity of key components in biological networks.
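For readers unfamiliar with the formalism, a minimal discrete Petri net simulation (places, tokens, and transition firing) fits in a few lines. This standalone Python sketch only illustrates the token-flow semantics that PetriScape visualizes inside Cytoscape; the places and transitions are hypothetical:

```python
# Minimal discrete Petri net: places hold integer token counts; a
# transition is enabled when every input place holds enough tokens.
marking = {"A": 3, "B": 2, "C": 0}               # initial tokens
transitions = [
    {"in": {"A": 1, "B": 1}, "out": {"C": 1}},   # A + B -> C
    {"in": {"C": 2},         "out": {"A": 1}},   # 2C -> A
]

def enabled(t, m):
    return all(m[p] >= n for p, n in t["in"].items())

def fire(t, m):
    # Consume input tokens, then produce output tokens.
    for p, n in t["in"].items():
        m[p] -= n
    for p, n in t["out"].items():
        m[p] += n

for step in range(10):
    ready = [t for t in transitions if enabled(t, marking)]
    if not ready:
        break                                    # deadlock: nothing fires
    fire(ready[0], marking)                      # deterministic choice
    print(step, marking)
```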
Maidenbaum, Shachar; Abboud, Sami; Amedi, Amir
2014-04-01
Sensory substitution devices (SSDs) have come a long way since first developed for visual rehabilitation. They have produced exciting experimental results, and have furthered our understanding of the human brain. Unfortunately, they are still not used for practical visual rehabilitation, and are currently considered as reserved primarily for experiments in controlled settings. Over the past decade, our understanding of the neural mechanisms behind visual restoration has changed as a result of converging evidence, much of which was gathered with SSDs. This evidence suggests that the brain is more than a pure sensory-machine but rather is a highly flexible task-machine, i.e., brain regions can maintain or regain their function in vision even with input from other senses. This complements a recent set of more promising behavioral achievements using SSDs and new promising technologies and tools. All these changes strongly suggest that the time has come to revive the focus on practical visual rehabilitation with SSDs and we chart several key steps in this direction such as training protocols and self-train tools. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Enhancing biomedical text summarization using semantic relation extraction.
Shang, Yue; Li, Yanpeng; Lin, Hongfei; Yang, Zhihao
2011-01-01
Automatic text summarization for a biomedical concept can help researchers to get the key points of a certain topic from large amount of biomedical literature efficiently. In this paper, we present a method for generating text summary for a given biomedical concept, e.g., H1N1 disease, from multiple documents based on semantic relation extraction. Our approach includes three stages: 1) We extract semantic relations in each sentence using the semantic knowledge representation tool SemRep. 2) We develop a relation-level retrieval method to select the relations most relevant to each query concept and visualize them in a graphic representation. 3) For relations in the relevant set, we extract informative sentences that can interpret them from the document collection to generate text summary using an information retrieval based method. Our major focus in this work is to investigate the contribution of semantic relation extraction to the task of biomedical text summarization. The experimental results on summarization for a set of diseases show that the introduction of semantic knowledge improves the performance and our results are better than the MEAD system, a well-known tool for text summarization.
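The retrieval stage can be approximated with a standard TF-IDF ranking. This sketch is only a generic analogue of the paper's information-retrieval step, with hypothetical sentences; it assumes SemRep relations have already been extracted upstream:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Rank candidate sentences by similarity to a relation rendered as text.
relation = "H1N1 virus CAUSES influenza"
sentences = [
    "The H1N1 virus was identified as the cause of the 2009 influenza pandemic.",
    "Vaccination campaigns were organized in many countries.",
    "Influenza caused by H1N1 spreads through respiratory droplets.",
]

vec = TfidfVectorizer().fit([relation] + sentences)
sims = cosine_similarity(vec.transform([relation]),
                         vec.transform(sentences))[0]
for score, sent in sorted(zip(sims, sentences), reverse=True):
    print(f"{score:.2f}  {sent}")
```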
ERIC Educational Resources Information Center
Lahm, Elizabeth A.; Morrissette, Sandra K.
This collection of materials describes different types of computer applications and software that can help students with disabilities. It contains information on: (1) Easy Access, a feature of the systems software on every Macintosh computer that allows use of the keypad instead of the mouse, options for slow keys, and options for sticky keys; (2)…
Armstrong, Elizabeth M; Ciccone, Natalie; Hersh, Deborah; Katzenellebogen, Judith; Coffin, Juli; Thompson, Sandra; Flicker, Leon; Hayward, Colleen; Woods, Deborah; McAllister, Meaghan
2017-06-01
Acquired communication disorders (ACD), following stroke and traumatic brain injury, may not be correctly identified in Aboriginal Australians due to a lack of linguistically and culturally appropriate assessment tools. Within this paper we explore key issues that were considered in the development of the Aboriginal Communication Assessment After Brain Injury (ACAABI) - a screening tool designed to assess the presence of ACD in Aboriginal populations. A literature review and consultation with key stakeholders were undertaken to explore the directions needed to develop a new tool, based on existing tools and recommendations for future developments. The literature searches revealed no existing screening tool for ACD in these populations, but identified tools in the areas of cognition and social-emotional wellbeing. The articles retrieved described details of the content and style of these tools, with recommendations for the development and administration of a new tool. The findings from the interviews and focus groups were consistent with the approach recommended in the literature. A screening tool for ACD needs to be developed, but any such tool must be informed by knowledge of Aboriginal language and culture, with community input, in order to be acceptable and valid.
Data Assimilation at FLUXNET to Improve Models towards Ecological Forecasting (Invited)
NASA Astrophysics Data System (ADS)
Luo, Y.
2009-12-01
Dramatically increased volumes of data from observational and experimental networks such as FLUXNET call for transformation of ecological research to increase its emphasis on quantitative forecasting. Ecological forecasting will also meet the societal need to develop better strategies for natural resource management in a world of ongoing global change. Traditionally, ecological forecasting has been based on process-based models, informed by data in largely ad hoc ways. Although most ecological models incorporate some representation of mechanistic processes, today’s ecological models are generally not adequate to quantify real-world dynamics and provide reliable forecasts with accompanying estimates of uncertainty. A key tool to improve ecological forecasting is data assimilation, which uses data to inform initial conditions and to help constrain a model during simulation to yield results that approximate reality as closely as possible. In an era with dramatically increased availability of data from observational and experimental networks, data assimilation is a key technique that helps convert the raw data into ecologically meaningful products so as to accelerate our understanding of ecological processes, test ecological theory, forecast changes in ecological services, and better serve the society. This talk will use examples to illustrate how data from FLUXNET have been assimilated with process-based models to improve estimates of model parameters and state variables; to quantify uncertainties in ecological forecasting arising from observations, models and their interactions; and to evaluate information contributions of data and model toward short- and long-term forecasting of ecosystem responses to global change.
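As a toy illustration of data assimilation, the following sketch constrains the turnover rate of a one-pool carbon model with synthetic observations via a Metropolis sampler. The model, noise level, and proposal width are assumptions, far simpler than any real FLUXNET assimilation system:

```python
import numpy as np

rng = np.random.default_rng(1)

# One-pool carbon model: dC/dt = u - k*C, observed as noisy pool sizes.
def simulate(k, u=5.0, c0=100.0, dt=1.0, n=50):
    c = np.empty(n)
    c[0] = c0
    for i in range(1, n):
        c[i] = c[i - 1] + dt * (u - k * c[i - 1])
    return c

k_true = 0.08
obs = simulate(k_true) + rng.normal(0, 2.0, 50)    # synthetic data

def log_like(k):
    # Gaussian likelihood with assumed observation noise sigma = 2.
    return -0.5 * np.sum((simulate(k) - obs) ** 2) / 2.0 ** 2

# Metropolis sampling of the turnover rate k given the observations.
k, chain = 0.05, []
for _ in range(5000):
    prop = k + rng.normal(0, 0.005)
    if prop > 0 and np.log(rng.uniform()) < log_like(prop) - log_like(k):
        k = prop
    chain.append(k)

post = np.array(chain[1000:])                      # discard burn-in
print(f"k ~ {post.mean():.3f} +/- {post.std():.3f} (true {k_true})")
```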
Electromagnetomechanical elastodynamic model for Lamb wave damage quantification in composites
NASA Astrophysics Data System (ADS)
Borkowski, Luke; Chattopadhyay, Aditi
2014-03-01
Physics-based wave propagation computational models play a key role in structural health monitoring (SHM) and the development of improved damage quantification methodologies. Guided waves (GWs), such as Lamb waves, provide the capability to monitor large plate-like aerospace structures with limited actuators and sensors and are sensitive to small scale damage; however due to the complex nature of GWs, accurate and efficient computation tools are necessary to investigate the mechanisms responsible for dispersion, coupling, and interaction with damage. In this paper, the local interaction simulation approach (LISA) coupled with the sharp interface model (SIM) solution methodology is used to solve the fully coupled electro-magneto-mechanical elastodynamic equations for the piezoelectric and piezomagnetic actuation and sensing of GWs in fiber reinforced composite material systems. The final framework provides the full three-dimensional displacement as well as electrical and magnetic potential fields for arbitrary plate and transducer geometries and excitation waveform and frequency. The model is validated experimentally and proven computationally efficient for a laminated composite plate. Studies are performed with surface bonded piezoelectric and embedded piezomagnetic sensors to gain insight into the physics of experimental techniques used for SHM. The symmetric collocation of piezoelectric actuators is modeled to demonstrate mode suppression in laminated composites for the purpose of damage detection. The effect of delamination and damage (i.e., matrix cracking) on the GW propagation is demonstrated and quantified. The developed model provides a valuable tool for the improvement of SHM techniques due to its proven accuracy and computational efficiency.
NASA Research Center Contributions to Space Shuttle Return to Flight (SSRTF)
NASA Technical Reports Server (NTRS)
Cockrell, Charles E., Jr.; Barnes, Robert S.; Belvin, Harry L.; Allmen, John; Otero, Angel
2005-01-01
Contributions provided by the NASA Research Centers to key Space Shuttle return-to-flight milestones, with an emphasis on debris and Thermal Protection System (TPS) damage characterization, are described herein. Several CAIB recommendations and Space Shuttle Program directives deal with the mitigation of external tank foam insulation as a debris source, including material characterization as well as potential design changes, and an understanding of Orbiter TPS material characteristics, damage scenarios, and repair options. Ames, Glenn, and Langley Research Centers have performed analytic studies, conducted experimental testing, and developed new technologies, analysis tools, and hardware to contribute to each of these recommendations. For the External Tank (ET), these include studies of spray-on foam insulation (SOFI), investigations of potential design changes, and applications of advanced non-destructive evaluation (NDE) technologies to understand ET TPS shedding during liftoff and ascent. The end-to-end debris assessment included transport analysis to determine the probabilities of impact for various debris sources. For the Orbiter, methods were developed, and validated through experimental testing, to determine thresholds for potential damage of Orbiter TPS components. Analysis tools were developed and validated for on-orbit TPS damage assessments, especially in the area of aerothermal environments. Advanced NDE technologies were also applied to the Orbiter TPS components, including sensor technologies to detect wing leading edge impacts during liftoff and ascent. Work is continuing to develop certified TPS repair options and to develop improved methodologies for reinforced carbon-carbon (RCC) damage progression to assist in on-orbit repair decision philosophy.
Paton, C; Hansen, M; Fernandez-Luque, L; Lau, A Y S
2012-01-01
This paper explores the range of self-tracking devices and social media platforms used by the self-tracking community, and examines the implications of widespread adoption of these tools for scientific progress in health informatics. A literature review was performed to investigate the use of social media and self-tracking technologies in the health sector. An environmental scan identified a range of products and services which were used to exemplify three levels of self-tracking: self-experimentation, social sharing of data and patient controlled electronic health records. There appears to be an increase in the use of self-tracking tools, particularly in the health and fitness sector, but also used in the management of chronic diseases. Evidence of efficacy and effectiveness is limited to date, primarily due to the health and fitness focus of current solutions as opposed to their use in disease management. Several key technologies are converging to produce a trend of increased personal health surveillance and monitoring, social connectedness and sharing, and integration of regional and national health information systems. These trends are enabling new applications of scientific techniques, from personal experimentation to e-epidemiology, as data gathered by individuals are aggregated and shared across increasingly connected healthcare networks. These trends also raise significant new ethical and scientific issues that will need to be addressed, both by health informatics researchers and the communities of self-trackers themselves.
The design, synthesis and pharmacological characterization of novel β2-adrenoceptor antagonists
Hothersall, J Daniel; Black, James; Caddick, Stephen; Vinter, Jeremy G; Tinker, Andrew; Baker, James R
2011-01-01
BACKGROUND AND PURPOSE Selective and potent antagonists for the β2-adrenoceptor are potentially interesting as experimental and clinical tools, and we sought to identify novel ligands with this pharmacology. EXPERIMENTAL APPROACH A range of pharmacological assays was used to assess potency, affinity, selectivity (β2-adrenoceptor vs. β1-adrenoceptor) and efficacy. KEY RESULTS Ten novel compounds were identified but none had as high affinity as the prototypical β2-adrenoceptor blocker ICI-118,551, although one of the novel compounds was more selective for β2-adrenoceptors. Most of the ligands were inverse agonists for β2-adrenoceptor-cAMP signalling, although one (5217377) was a partial agonist and another a neutral antagonist (7929193). None of the ligands were efficacious with regard to β2-adrenoceptor-β-arrestin signalling. The (2S,3S) enantiomers were identified as the most active, although unusually the racemates were the most selective for the β2-adrenoceptors. This was taken as evidence for some unusual enantiospecific behaviour. CONCLUSIONS AND IMPLICATIONS In terms of improving on the pharmacology of the ligand ICI-118,551, one of the compounds was more selective (racemic JB-175), while one was a neutral antagonist (7929193), although none had as high an affinity. The results substantiate the notion that β-blockers do more than simply inhibit receptor activation, and differences between the ligands could provide useful tools to investigate receptor biology. PMID:21323900
Changes to Quantum Cryptography
NASA Astrophysics Data System (ADS)
Sakai, Yasuyuki; Tanaka, Hidema
Quantum cryptography has become a subject of widespread interest. In particular, quantum key distribution, which provides a secure key agreement by using quantum systems, is believed to be the most important application of quantum cryptography. Quantum key distribution has the potential to achieve the “unconditionally” secure infrastructure. We also have many cryptographic tools that are based on “modern cryptography” at the present time. They are being used in an effort to guarantee secure communication over open networks such as the Internet. Unfortunately, their ultimate efficacy is in doubt. Quantum key distribution systems are believed to be close to practical and commercial use. In this paper, we discuss what we should do to apply quantum cryptography to our communications. We also discuss how quantum key distribution can be combined with or used to replace cryptographic tools based on modern cryptography.
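The core of quantum key distribution, the BB84 sifting step, can be simulated classically in a few lines. This idealized sketch omits eavesdropping, channel noise, error estimation, and privacy amplification:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20  # raw key length before sifting

# BB84 (idealized): Alice encodes random bits in random bases.
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)      # 0 = rectilinear, 1 = diagonal
bob_bases = rng.integers(0, 2, n)

# Bob measures: correct bit when bases match, random bit otherwise.
bob_bits = np.where(bob_bases == alice_bases,
                    alice_bits, rng.integers(0, 2, n))

# Public basis comparison: keep only positions with matching bases.
keep = alice_bases == bob_bases
sifted_key = alice_bits[keep]
assert np.array_equal(sifted_key, bob_bits[keep])   # noiseless channel
print("sifted key:", sifted_key)
```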
Dubovenko, Alexey; Nikolsky, Yuri; Rakhmatulin, Eugene; Nikolskaya, Tatiana
2017-01-01
Analysis of NGS and other sequencing data, gene variants, gene expression, proteomics, and other high-throughput (OMICs) data is challenging because of its biological complexity and high level of technical and biological noise. One way to deal with both problems is to perform analysis with a high fidelity annotated knowledgebase of protein interactions, pathways, and functional ontologies. This knowledgebase has to be structured in a computer-readable format and must include software tools for managing experimental data, analysis, and reporting. Here, we present MetaCore™ and Key Pathway Advisor (KPA), an integrated platform for functional data analysis. On the content side, MetaCore and KPA encompass a comprehensive database of molecular interactions of different types, pathways, network models, and ten functional ontologies covering human, mouse, and rat genes. The analytical toolkit includes tools for gene/protein list enrichment analysis, statistical "interactome" tool for the identification of over- and under-connected proteins in the dataset, and a biological network analysis module made up of network generation algorithms and filters. The suite also features Advanced Search, an application for combinatorial search of the database content, as well as a Java-based tool called Pathway Map Creator for drawing and editing custom pathway maps. Applications of MetaCore and KPA include molecular mode of action of disease research, identification of potential biomarkers and drug targets, pathway hypothesis generation, analysis of biological effects for novel small molecule compounds and clinical applications (analysis of large cohorts of patients, and translational and personalized medicine).
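Gene-list enrichment of the kind performed by such toolkits is commonly based on the hypergeometric test. A generic sketch with hypothetical counts (not MetaCore's proprietary scoring) is:

```python
from scipy.stats import hypergeom

# Over-representation of a pathway's genes in a hit list:
# N genes in the background, K of them in the pathway,
# n genes in the experimental hit list, k of those in the pathway.
N, K, n, k = 20000, 150, 300, 12

# P(X >= k) under the hypergeometric null distribution.
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value: {p_value:.2e}")
```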
Learning to merge: a new tool for interactive mapping
NASA Astrophysics Data System (ADS)
Porter, Reid B.; Lundquist, Sheng; Ruggiero, Christy
2013-05-01
The task of turning raw imagery into semantically meaningful maps and overlays is a key area of remote sensing activity. Image analysts, in applications ranging from environmental monitoring to intelligence, use imagery to generate and update maps of terrain, vegetation, road networks, buildings and other relevant features. Often these tasks can be cast as a pixel labeling problem, and several interactive pixel labeling tools have been developed. These tools exploit training data, which is generated by analysts using simple and intuitive paint-program annotation tools, in order to tailor the labeling algorithm for the particular dataset and task. In other cases, the task is best cast as a pixel segmentation problem. Interactive pixel segmentation tools have also been developed, but these tools typically do not learn from training data like the pixel labeling tools do. In this paper we investigate tools for interactive pixel segmentation that also learn from user input. The input has the form of segment merging (or grouping). Merging examples are 1) easily obtained from analysts using vector annotation tools, and 2) more challenging to exploit than traditional labels. We outline the key issues in developing these interactive merging tools, and describe their application to remote sensing.
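Learning from merge examples can be cast as binary classification over segment-pair features. The features, labels, and threshold strategy below are hypothetical stand-ins, sketching the idea rather than the paper's method:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)

# Hypothetical training set: each row describes a pair of adjacent
# segments (mean-color difference, shared-boundary length, texture
# difference); the label records whether the analyst merged the pair.
X = rng.random((200, 3))
y = (X[:, 0] + 0.5 * X[:, 2] < 0.6).astype(int)   # synthetic merge rule

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# At run time, score new adjacent-segment pairs and merge greedily
# wherever the predicted merge probability exceeds a threshold.
pair_features = rng.random((5, 3))
print(clf.predict_proba(pair_features)[:, 1])
```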
Discovering mechanisms relevant for radiation damage evolution
Uberuaga, Blas Pedro; Martinez, Enrique Saez; Perez, Danny; ...
2018-02-22
The response of a material to irradiation is a consequence of the kinetic evolution of defects produced during energetic damage events. Thus, accurate predictions of radiation damage evolution require knowing the atomic scale mechanisms associated with those defects. Atomistic simulations are a key tool in providing insight into the types of mechanisms possible. Further, by extending the time scale beyond what is achievable with conventional molecular dynamics, even greater insight can be obtained. Here, we provide examples in which such simulations have revealed new kinetic mechanisms that were not obvious before performing the simulations. We also demonstrate, through the coupling with higher level models, how those mechanisms impact experimental observables in irradiated materials. Lastly, we discuss the importance of these types of simulations in the context of predicting material behavior.
The bright future of single-molecule fluorescence imaging
Juette, Manuel F.; Terry, Daniel S.; Wasserman, Michael R.; Zhou, Zhou; Altman, Roger B.; Zheng, Qinsi; Blanchard, Scott C.
2014-01-01
Single-molecule Förster resonance energy transfer (smFRET) is an essential and maturing tool to probe biomolecular interactions and conformational dynamics in vitro and, increasingly, in living cells. Multi-color smFRET enables the correlation of multiple such events and the precise dissection of their order and timing. However, the requirements for good spectral separation, high time resolution, and extended observation times place extraordinary demands on the fluorescent labels used in such experiments. Together with advanced experimental designs and data analysis, the development of long-lasting, non-fluctuating fluorophores is therefore proving key to progress in the field. Recently developed strategies for obtaining ultra-stable organic fluorophores spanning the visible spectrum are underway that will enable multi-color smFRET studies to deliver on their promise of previously unachievable biological insights. PMID:24956235
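For reference, the basic smFRET quantities follow directly from the donor and acceptor intensities. The sketch below uses an idealized efficiency (no correction factors) and an assumed Förster radius of 5 nm:

```python
# FRET efficiency from donor/acceptor intensities, and the implied
# donor-acceptor distance (idealized; real analyses apply corrections).
def fret_efficiency(i_donor, i_acceptor):
    return i_acceptor / (i_acceptor + i_donor)

def distance_from_efficiency(e, r0=5.0):
    # E = 1 / (1 + (r/R0)^6)  ->  r = R0 * (1/E - 1)^(1/6)
    return r0 * (1.0 / e - 1.0) ** (1.0 / 6.0)

e = fret_efficiency(i_donor=300.0, i_acceptor=700.0)
print(f"E = {e:.2f}, r = {distance_from_efficiency(e):.2f} nm")
```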
Mechanism and experimental research on ultra-precision grinding of ferrite
NASA Astrophysics Data System (ADS)
Ban, Xinxing; Zhao, Huiying; Dong, Longchao; Zhu, Xueliang; Zhang, Chupeng; Gu, Yawen
2017-02-01
Ultra-precision grinding of ferrite is conducted to investigate the material removal mechanism. The effect of the accuracy of key machine tool components on ground surface quality is analyzed, and a surface generation model for ferrite ultra-precision grinding is established. To reveal the surface formation mechanism of ferrite in ultra-precision grinding, a calculation model for the ground surface roughness is proposed and its accuracy is verified. An orthogonal experiment is designed using a high-precision aerostatic turntable and aerostatic spindle for ferrite, a typical hard and brittle material. Based on the experimental results, the factors influencing the ultra-precision ground surface of ferrite, and their governing laws, are discussed through analysis of the surface roughness. The results show that surface quality is optimal at a wheel speed of 20,000 r/min, a feed rate of 10 mm/min, a grinding depth of 0.005 mm, and a turntable rotary speed of 5 r/min, where the surface roughness Ra reaches 75 nm.
Reconstruction of Tissue-Specific Metabolic Networks Using CORDA
Schultz, André; Qutub, Amina A.
2016-01-01
Human metabolism involves thousands of reactions and metabolites. To interpret this complexity, computational modeling becomes an essential experimental tool. One of the most popular techniques to study human metabolism as a whole is genome scale modeling. A key challenge to applying genome scale modeling is identifying critical metabolic reactions across diverse human tissues. Here we introduce a novel algorithm called Cost Optimization Reaction Dependency Assessment (CORDA) to build genome scale models in a tissue-specific manner. CORDA performs more efficiently computationally, shows better agreement to experimental data, and displays better model functionality and capacity when compared to previous algorithms. CORDA also returns reaction associations that can greatly assist in any manual curation to be performed following the automated reconstruction process. Using CORDA, we developed a library of 76 healthy and 20 cancer tissue-specific reconstructions. These reconstructions identified which metabolic pathways are shared across diverse human tissues. Moreover, we identified changes in reactions and pathways that are differentially included and present different capacity profiles in cancer compared to healthy tissues, including up-regulation of folate metabolism, the down-regulation of thiamine metabolism, and tight regulation of oxidative phosphorylation. PMID:26942765
Fundamental Physics with Electroweak Probes of Nuclei
NASA Astrophysics Data System (ADS)
Pastore, Saori
2018-02-01
The past decade has witnessed tremendous progress in the theoretical and computational tools that produce our understanding of nuclei. A number of microscopic calculations of nuclear electroweak structure and reactions have successfully explained the available experimental data, yielding a complex picture of the way nuclei interact with electroweak probes. This achievement is of great interest from the pure nuclear-physics point of view. But it is of much broader interest too, because the level of accuracy and confidence reached by these calculations opens up the concrete possibility of using nuclei to address open questions in other sub-fields of physics, such as understanding the fundamental properties of neutrinos, or the particle nature of dark matter. In this talk, I will review recent progress in microscopic calculations of electroweak properties of light nuclei, including electromagnetic moments, form factors, and transitions between low-lying nuclear states, along with preliminary studies of single- and double-beta decay rates. I will illustrate the key dynamical features required to explain the available experimental data, and, if time permits, present a novel framework to calculate neutrino-nucleus cross sections for A > 12 nuclei.
Patel, Disha; Bauman, Joseph D.; Arnold, Eddy
2015-01-01
X-ray crystallography has been an under-appreciated screening tool for fragment-based drug discovery due to the perception of low throughput and technical difficulty. Investigators in industry and academia have overcome these challenges by taking advantage of key factors that contribute to a successful crystallographic screening campaign. Efficient cocktail design and soaking methodologies have evolved to maximize throughput while minimizing false positives/negatives. In addition, technical improvements at synchrotron beamlines have dramatically increased data collection rates thus enabling screening on a timescale comparable to other techniques. The combination of available resources and efficient experimental design has resulted in many successful crystallographic screening campaigns. The three-dimensional crystal structure of the bound fragment complexed to its target, a direct result of the screening effort, enables structure-based drug design while revealing insights regarding protein dynamics and function not readily obtained through other experimental approaches. Furthermore, this “chemical interrogation” of the target protein crystals can lead to the identification of useful reagents for improving diffraction resolution or compound solubility. PMID:25117499
A Novel Method for Tracking Individuals of Fruit Fly Swarms Flying in a Laboratory Flight Arena.
Cheng, Xi En; Qian, Zhi-Ming; Wang, Shuo Hong; Jiang, Nan; Guo, Aike; Chen, Yan Qiu
2015-01-01
The growing interest in studying social behaviours of swarming fruit flies, Drosophila melanogaster, has heightened the need for developing tools that provide quantitative motion data. To achieve such a goal, multi-camera three-dimensional tracking technology is the key experimental gateway. We have developed a novel tracking system for tracking hundreds of fruit flies flying in a confined cubic flight arena. In addition to the proposed tracking algorithm, this work offers additional contributions in three aspects: body detection, orientation estimation, and data validation. To demonstrate the opportunities that the proposed system offers for generating high-throughput quantitative motion data, we conducted experiments on five experimental configurations. We also performed quantitative analysis on the kinematics and the spatial structure and the motion patterns of fruit fly swarms. We found that there exists an asymptotic distance between fruit flies in swarms as the population density increases. Further, we discovered the evidence for repulsive response when the distance between fruit flies approached the asymptotic distance. Overall, the proposed tracking system presents a powerful method for studying flight behaviours of fruit flies in a three-dimensional environment.
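Multi-camera 3D tracking rests on triangulating matched detections across views. A linear (DLT) triangulation sketch with hypothetical camera matrices is shown below, independent of the authors' full tracking pipeline:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    # Linear (DLT) triangulation: each camera contributes two rows of
    # A X = 0 derived from x = P X; solve for X by SVD.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]       # from homogeneous to 3D coordinates

# Hypothetical 3x4 projection matrices for two calibrated cameras
# viewing the flight arena, plus one matched fly detection per view.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
print(triangulate(P1, P2, x1=(0.10, 0.05), x2=(0.08, 0.05)))
```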
Regenerative tissue remodeling in planarians - The mysteries of morphallaxis.
Pellettieri, Jason
2018-04-19
Biologists have long marveled at the ability of planarian flatworms to regenerate any part of their bodies in just a little over a week. While great progress has been made in deciphering the mechanisms by which new tissue is formed at sites of amputation, we know relatively little about the complementary remodeling response that occurs in uninjured tissues to restore anatomical scale and proportion. This review explores the mysterious biology of this process, first described in hydra by the father of experimental zoology, Abraham Trembley, and later termed 'morphallaxis' by the father of experimental genetics, Thomas Hunt Morgan. The perceptive work of these early pioneers, together with recent studies using modern tools, has revealed some of the key features of regenerative tissue remodeling, including repatterning of the body axes, reproportioning of organs like the brain and gut, and a major increase in the rate of cell death. Yet a mechanistic solution to this longstanding problem in the field will require further study by the next generation of planarian researchers. Copyright © 2018 Elsevier Ltd. All rights reserved.
Salomonis, Nathan; Dexheimer, Phillip J; Omberg, Larsson; Schroll, Robin; Bush, Stacy; Huo, Jeffrey; Schriml, Lynn; Ho Sui, Shannan; Keddache, Mehdi; Mayhew, Christopher; Shanmukhappa, Shiva Kumar; Wells, James; Daily, Kenneth; Hubler, Shane; Wang, Yuliang; Zambidis, Elias; Margolin, Adam; Hide, Winston; Hatzopoulos, Antonis K; Malik, Punam; Cancelas, Jose A; Aronow, Bruce J; Lutzko, Carolyn
2016-07-12
The rigorous characterization of distinct induced pluripotent stem cells (iPSC) derived from multiple reprogramming technologies, somatic sources, and donors is required to understand potential sources of variability and downstream potential. To achieve this goal, the Progenitor Cell Biology Consortium performed comprehensive experimental and genomic analyses of 58 iPSC lines from ten laboratories, generated using a variety of reprogramming genes, vectors, and cells. Associated global molecular characterization studies identified functionally informative correlations in gene expression, DNA methylation, and/or copy-number variation among key developmental and oncogenic regulators as a result of donor, sex, line stability, reprogramming technology, and cell of origin. Furthermore, X-chromosome inactivation in PSC produced highly correlated differences in teratoma-lineage staining and regulator expression upon differentiation. All experimental results, together with the raw and processed data and metadata from these analyses, including powerful tools, are interactively accessible from a new online portal at https://www.synapse.org to serve as a reusable resource for the stem cell community. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.
Fong, Stephen S
2014-08-01
Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.
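As a minimal sketch of the prospective, model-driven approach described above, one can run flux balance analysis and an in silico knockout screen with the COBRApy package. The sketch assumes COBRApy is installed and that an SBML genome-scale model file is available; the file name `e_coli_core.xml` is illustrative, not prescribed by the paper.

```python
# Minimal sketch of prospective in silico strain design with COBRApy.
# Assumes cobrapy is installed and an SBML model file is available;
# the file name is illustrative.
import cobra

model = cobra.io.read_sbml_model("e_coli_core.xml")

# Baseline flux balance analysis: maximize the biomass objective.
wild_type = model.optimize()
print("wild-type growth rate:", wild_type.objective_value)

# Screen single-gene knockouts for their predicted effect on growth.
for gene in model.genes[:10]:   # first few genes, for brevity
    with model:                 # changes are reverted on exiting the block
        gene.knock_out()
        growth = model.slim_optimize(error_value=0.0)
        print(gene.id, round(growth, 3))
```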
RNA-seq Data: Challenges in and Recommendations for Experimental Design and Analysis.
Williams, Alexander G; Thomas, Sean; Wyman, Stacia K; Holloway, Alisha K
2014-10-01
RNA-seq is widely used to determine differential expression of genes or transcripts as well as identify novel transcripts, identify allele-specific expression, and precisely measure translation of transcripts. Thoughtful experimental design and choice of analysis tools are critical to ensure high-quality data and interpretable results. Important considerations for experimental design include number of replicates, whether to collect paired-end or single-end reads, sequence length, and sequencing depth. Common analysis steps in all RNA-seq experiments include quality control, read alignment, assigning reads to genes or transcripts, and estimating gene or transcript abundance. Our aims are two-fold: to make recommendations for common components of experimental design and assess tool capabilities for each of these steps. We also test tools designed to detect differential expression, since this is the most widespread application of RNA-seq. We hope that these analyses will help guide those who are new to RNA-seq and will generate discussion about remaining needs for tool improvement and development. Copyright © 2014 John Wiley & Sons, Inc.
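To make the abundance-estimation step concrete, the toy sketch below normalizes a count matrix to counts per million and computes a naive log2 fold change. It is not a substitute for the dedicated differential-expression tools the review benchmarks (e.g. DESeq2, edgeR); the counts and sample labels are invented for illustration.

```python
# Toy sketch of two common RNA-seq steps: library-size normalization
# (counts per million) and a naive log2 fold change. Counts and sample
# labels are invented; real studies should use dedicated DE tools.
import numpy as np
import pandas as pd

counts = pd.DataFrame(
    {"ctrl_1": [120, 15, 0], "ctrl_2": [100, 20, 2],
     "trt_1": [300, 5, 1], "trt_2": [280, 8, 0]},
    index=["geneA", "geneB", "geneC"],
)

cpm = counts / counts.sum(axis=0) * 1e6   # counts per million per library
log_cpm = np.log2(cpm + 1)                # pseudocount stabilizes zeros

ctrl = log_cpm[["ctrl_1", "ctrl_2"]].mean(axis=1)
trt = log_cpm[["trt_1", "trt_2"]].mean(axis=1)
print((trt - ctrl).rename("log2_fold_change"))
```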
Resonance Parameter Adjustment Based on Integral Experiments
Sobes, Vladimir; Leal, Luiz; Arbanas, Goran; ...
2016-06-02
Our project seeks to allow coupling of differential and integral data evaluation in a continuous-energy framework and to use the generalized linear least-squares (GLLS) methodology in the TSURFER module of the SCALE code package to update the parameters of a resolved resonance region evaluation. Recognizing that the GLLS methodology in TSURFER is identical to the mathematical description of a Bayesian update in SAMMY, the SAMINT code was created to use the mathematical machinery of SAMMY to update resolved resonance parameters based on integral data. Traditionally, SAMMY used differential experimental data to adjust nuclear data parameters, while integral experimental data, such as those in the International Criticality Safety Benchmark Experiments Project, remained a tool for validation of completed nuclear data evaluations. SAMINT extracts information from integral benchmarks to aid the nuclear data evaluation process. Integral data can then be used to resolve any remaining ambiguity between differential data sets, highlight troublesome energy regions, determine key nuclear data parameters for integral benchmark calculations, and improve the nuclear data covariance matrix evaluation. Moreover, SAMINT is not intended to bias nuclear data toward specific integral experiments but should be used to supplement the evaluation of differential experimental data. Using GLLS ensures proper weight is given to the differential data.
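The GLLS update used here (equivalently, a linearized Bayesian update) can be written as x' = x + C S^T (S C S^T + V)^-1 (m - k), where C is the prior parameter covariance, S the sensitivity matrix, V the covariance of the integral measurements m, and k the calculated responses. A generic numerical sketch follows; all matrices and values are invented for illustration and are not from the codes named above.

```python
# Generic sketch of a generalized linear least-squares (GLLS) update,
# the linear-Bayesian form used to adjust parameters against integral
# responses. All numbers are invented for illustration.
import numpy as np

x = np.array([1.0, 2.0])                  # prior parameter values
C = np.diag([0.04, 0.09])                 # prior parameter covariance
S = np.array([[0.5, 1.2],                 # sensitivities d(response)/d(param)
              [0.8, -0.3]])
k = S @ x                                 # calculated responses (linear model)
m = np.array([3.1, 0.1])                  # measured integral responses
V = np.diag([0.01, 0.01])                 # measurement covariance

G = C @ S.T @ np.linalg.inv(S @ C @ S.T + V)   # gain matrix
x_post = x + G @ (m - k)                       # updated parameters
C_post = C - G @ S @ C                         # updated covariance
print(x_post, np.diag(C_post))
```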
IsoDesign: a software for optimizing the design of 13C-metabolic flux analysis experiments.
Millard, Pierre; Sokol, Serguei; Letisse, Fabien; Portais, Jean-Charles
2014-01-01
The growing demand for 13C-metabolic flux analysis (13C-MFA) in the field of metabolic engineering and systems biology is driving the need to rationalize expensive and time-consuming 13C-labeling experiments. Experimental design is a key step in improving both the number of fluxes that can be calculated from a set of isotopic data and the precision of flux values. We present IsoDesign, a software that enables these parameters to be maximized by optimizing the isotopic composition of the label input. It can be applied to 13C-MFA investigations using a broad panel of analytical tools (MS, MS/MS, 1H NMR, 13C NMR, etc.) individually or in combination. It includes a visualization module to intuitively select the optimal label input depending on the biological question to be addressed. Applications of IsoDesign are described, with an example of the entire 13C-MFA workflow from experimental design to flux map, including important practical considerations. IsoDesign makes the experimental design of 13C-MFA experiments more accessible to a wider biological community. IsoDesign is distributed under an open source license at http://metasys.insa-toulouse.fr/software/isodes/ © 2013 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Toledo Fuentes, A.; Kipfmueller, M.; José Prieto, M. A.
2017-10-01
Mobile manipulators are becoming a key instrument for increasing flexibility in industrial processes. Their requirements include handling objects of different weights and sizes and their “fast” transportation, without jeopardizing production workers or machines. Compensation of the forces affecting the system dynamics is therefore needed to avoid unwanted oscillations and tilting caused by sudden accelerations and decelerations. One general solution is the implementation of external positioning elements to actively stabilize the system. To realize this approach, the dynamic behaviour of a robotic arm and a mobile platform was investigated in order to develop the stabilization mechanism using multibody simulations. The methodology was divided into two phases for each subsystem: first, their natural frequencies and modal shapes were obtained using experimental modal analyses; then, based on these experimental results, multibody simulation (MBS) models were set up and their dynamical parameters adjusted. The modal shapes, together with the obtained natural frequencies, allowed a quantitative and qualitative analysis. In summary, the MBS models were successfully validated against the real subsystems, with a maximum error of 15%. These models will serve as the basis for future steps in the design of the external actuators and their control strategy using a co-simulation tool.
Determination of the core temperature of a Li-ion cell during thermal runaway
NASA Astrophysics Data System (ADS)
Parhizi, M.; Ahmed, M. B.; Jain, A.
2017-12-01
Safety and performance of Li-ion cells are severely affected by thermal runaway, in which exothermic processes within the cell cause uncontrolled temperature rise, eventually leading to catastrophic failure. Most past experimental papers on thermal runaway report only surface temperature measurements, while the core temperature of the cell remains largely unknown. This paper presents an experimentally validated method, based on thermal conduction analysis, to determine the core temperature of a Li-ion cell during thermal runaway using surface temperature and chemical kinetics data. Experiments conducted on a thermal test cell show that the core temperature computed using this method is in good agreement with independent thermocouple-based measurements over a wide range of experimental conditions. The validated method is used to predict core temperature as a function of time for several previously reported thermal runaway tests. In each case, the predicted peak core temperature is found to be several hundred degrees Celsius higher than the measured surface temperature. This shows that surface temperature alone is not sufficient for thermally characterizing the cell during thermal runaway. Besides providing key insights into the fundamental nature of thermal runaway, the ability to determine the core temperature shown here may lead to practical tools for characterizing and mitigating thermal runaway.
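The kind of conduction analysis described can be illustrated with a one-dimensional radial finite-difference model of a cylindrical cell with internal heat generation. The geometry, material properties, and generation rate below are placeholders, not values from the paper.

```python
# Sketch of 1-D transient radial conduction in a cylindrical cell with
# volumetric heat generation, solved by explicit finite differences.
# Geometry, properties, and the generation rate are placeholders.
import numpy as np

R, N = 0.009, 60                      # cell radius (m), radial nodes
k, rho, cp = 0.2, 2500.0, 1000.0      # radial conductivity, density, cp (SI)
q_gen = 2e6                           # volumetric heat generation (W/m^3)
alpha = k / (rho * cp)
dr = R / (N - 1)
dt = 0.2 * dr**2 / alpha              # stable explicit time step

r = np.linspace(0.0, R, N)
T = np.full(N, 25.0)                  # initial temperature (deg C)
for _ in range(int(10.0 / dt)):       # simulate 10 s
    lap = np.zeros(N)
    lap[1:-1] = (T[2:] - 2*T[1:-1] + T[:-2]) / dr**2 \
                + (T[2:] - T[:-2]) / (2 * dr * r[1:-1])
    lap[0] = 4 * (T[1] - T[0]) / dr**2        # symmetry at the axis
    T[:-1] += dt * (alpha * lap[:-1] + q_gen / (rho * cp))
    T[-1] = 25.0                              # fixed (measured) surface temperature
print("core-surface difference:", T[0] - T[-1], "deg C")
```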
NASA Astrophysics Data System (ADS)
Sateesh Kumar, Ch; Patel, Saroj Kumar; Das, Anshuman
2018-03-01
Temperature generation in cutting tools is one of the major causes of tool failure, especially during hard machining, where machining forces are quite high and result in elevated temperatures. The present work therefore investigates temperature generation during hard machining of AISI 52100 steel (62 HRC hardness) with uncoated and PVD AlTiN-coated Al2O3/TiCN mixed ceramic cutting tools. The experiments were performed on a heavy-duty lathe with both coated and uncoated cutting tools under a dry cutting environment. The temperature of the cutting zone was measured using an infrared thermometer, and a finite element model was adopted to predict the temperature distribution in the cutting tools during machining, for comparative assessment against the measured temperature. The experimental and numerical results revealed a significant reduction of cutting zone temperature during machining with PVD AlTiN-coated cutting tools compared to uncoated tools in every experimental run. The main reason for the decrease in temperature with AlTiN-coated tools is the lower coefficient of friction offered by the coating material, which allows free flow of the chips on the rake surface compared with uncoated cutting tools. Further, the superior wear behaviour of the AlTiN coating resulted in a reduction of cutting temperature.
Envelope: interactive software for modeling and fitting complex isotope distributions.
Sykes, Michael T; Williamson, James R
2008-10-20
An important aspect of proteomic mass spectrometry involves quantifying and interpreting the isotope distributions arising from mixtures of macromolecules with different isotope labeling patterns. These patterns can be quite complex, in particular with in vivo metabolic labeling experiments producing fractional atomic labeling or fractional residue labeling of peptides or other macromolecules. In general, it can be difficult to distinguish the contributions of species with different labeling patterns to an experimental spectrum and difficult to calculate a theoretical isotope distribution to fit such data. There is a need for interactive and user-friendly software that can calculate and fit the entire isotope distribution of a complex mixture while comparing these calculations with experimental data and extracting the contributions from the differently labeled species. Envelope has been developed to be user-friendly while still being as flexible and powerful as possible. Envelope can simultaneously calculate the isotope distributions for any number of different labeling patterns for a given peptide or oligonucleotide, while automatically summing these into a single overall isotope distribution. Envelope can handle fractional or complete atom or residue-based labeling, and the contribution from each different user-defined labeling pattern is clearly illustrated in the interactive display and is individually adjustable. At present, Envelope supports labeling with 2H, 13C, and 15N, and supports adjustments for baseline correction, an instrument accuracy offset in the m/z domain, and peak width. Furthermore, Envelope can display experimental data superimposed on calculated isotope distributions, and calculate a least-squares goodness of fit between the two. All of this information is displayed on the screen in a single graphical user interface. Envelope supports high-quality output of experimental and calculated distributions in PNG or PDF format. Beyond simply comparing calculated distributions to experimental data, Envelope is useful for planning or designing metabolic labeling experiments, by visualizing hypothetical isotope distributions in order to evaluate the feasibility of a labeling strategy. Envelope is also useful as a teaching tool, with its real-time display capabilities providing a straightforward way to illustrate the key variable factors that contribute to an observed isotope distribution. Envelope is a powerful tool for the interactive calculation and visualization of complex isotope distributions for comparison to experimental data. It is available under the GNU General Public License from http://williamson.scripps.edu/envelope/.
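The core calculation a tool like Envelope performs, an isotope distribution for a given elemental composition and labeling pattern, can be sketched by convolving per-atom isotope abundance vectors. The natural abundances below are standard values; the 50% 13C enrichment and the peptide-like composition are invented for illustration and are not taken from the paper.

```python
# Sketch of computing an isotope distribution by convolving per-atom
# abundance vectors. The 50% 13C enrichment and the composition are
# invented labeling examples; heavier isotopes are neglected for brevity.
import numpy as np

def atom_dist(abundances, count):
    """Distribution of extra mass units contributed by `count` identical atoms."""
    dist = np.array([1.0])
    for _ in range(count):
        dist = np.convolve(dist, abundances)
    return dist

# [M, M+1] abundance vectors
C_nat = [0.9893, 0.0107]
C_lab = [0.5, 0.5]            # fractional 13C labeling (invented)
N_nat = [0.9964, 0.0036]
H_nat = [0.99988, 0.00012]

# e.g. a peptide-like composition C50 H80 N14, half the carbons labeled
dist = atom_dist(C_nat, 25)
dist = np.convolve(dist, atom_dist(C_lab, 25))
dist = np.convolve(dist, atom_dist(N_nat, 14))
dist = np.convolve(dist, atom_dist(H_nat, 80))

for i, p in enumerate(dist[:8]):
    print(f"M+{i}: {p:.4f}")
```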
Experimental demonstration of subcarrier multiplexed quantum key distribution system.
Mora, José; Ruiz-Alba, Antonio; Amaya, Waldimar; Martínez, Alfonso; García-Muñoz, Víctor; Calvo, David; Capmany, José
2012-06-01
We provide, to our knowledge, the first experimental demonstration of the feasibility of sending several parallel keys by exploiting the technique of subcarrier multiplexing (SCM) widely employed in microwave photonics. This approach brings several advantages, such as high spectral efficiency compatible with current secure key rates, the sharing of the optical faint pulse by all the quantum multiplexed channels (reducing system complexity), and the possibility of upgrading with wavelength division multiplexing in a two-tier scheme to increase the number of parallel keys. Two independent quantum SCM channels featuring a sifted key rate of 10 kb/s per channel over a link with a quantum bit error rate below 2% are reported.
A free-access online key to identify Amazonian ferns.
Zuquim, Gabriela; Tuomisto, Hanna; Prado, Jefferson
2017-01-01
There is urgent need for more data on species distributions in order to improve conservation planning. A crucial but challenging aspect of producing high-quality data is the correct identification of organisms. Traditional printed floras and dichotomous keys are difficult to use for someone not familiar with the technical jargon. In poorly known areas, such as Amazonia, they also become quickly outdated as new species are described or ranges extended. Recently, online tools have allowed developing dynamic, interactive, and accessible keys that make species identification possible for a broader public. In order to facilitate identifying plants collected in field inventories, we developed an internet-based free-access tool to identify Amazonian fern species. We focused on ferns, because they are easy to collect and their edaphic affinities are relatively well known, so they can be used as an indicator group for habitat mapping. Our key includes 302 terrestrial and aquatic entities mainly from lowland Amazonian forests. It is a free-access key, so the user can freely choose which morphological features to use and in which order to assess them. All taxa are richly illustrated, so specimens can be identified by a combination of character choices, visual comparison, and written descriptions. The identification tool was developed in Lucid 3.5 software and it is available at http://keyserver.lucidcentral.org:8080/sandbox/keys.jsp.
Perioperative leadership: managing change with insights, priorities, and tools.
Taylor, David L
2014-07-01
The personal leadership of the perioperative director is a critical factor in the success of any change management initiative. This article presents an approach to perioperative nursing leadership that addresses obstacles that prevent surgical departments from achieving high performance in clinical and financial outcomes. This leadership approach consists of specific insights, priorities, and tools: key insights include self-understanding of personal barriers to leadership and accuracy at understanding economic and strategic considerations related to the OR environment; key priorities include creating a customer-centered organization, focusing on process improvement, and concentrating on culture change; and key tools include using techniques (e.g., direct engagement, collaborative leadership) to align surgical organizations with leadership priorities and mitigate specific perioperative management risks. Included in this article is a leadership development plan for perioperative directors. Copyright © 2014 AORN, Inc. Published by Elsevier Inc. All rights reserved.
Requirements Document for Development of a Livermore Tomography Tools Interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seetho, I. M.
In this document, we outline an exercise performed at LLNL to evaluate the user-interface deficits of an LLNL-developed CT reconstruction software package, Livermore Tomography Tools (LTT). We observe that a difficult-to-use command-line interface and the lack of support functions combine to create a bottleneck in the CT reconstruction process when the input parameters to key functions are not well known. Through the exercise of systems engineering best practices, we generate key performance parameters for an LTT interface refresh, and specify a combination of back-end (“test-mode” functions) and front-end (graphical user interface visualization and command scripting tools) solutions to LTT's poor user interface that aim to mitigate issues and lower the costs associated with CT reconstruction using LTT. Key functional and non-functional requirements and risk mitigation strategies for the solution are outlined and discussed.
Improving the seismic small-scale modelling by comparison with numerical methods
NASA Astrophysics Data System (ADS)
Pageot, Damien; Leparoux, Donatienne; Le Feuvre, Mathieu; Durand, Olivier; Côte, Philippe; Capdeville, Yann
2017-10-01
Experimental seismic modelling at reduced scale provides an intermediate step between numerical tests and geophysical campaigns on field sites. Recent technologies such as laser interferometers offer the opportunity to acquire data without any coupling effects. This kind of device is used in the Mesures Ultrasonores Sans Contact (MUSC) measurement bench, whose automated support system makes it possible to generate multisource, multireceiver seismic data at laboratory scale. Experimental seismic modelling becomes a valuable stage in validating imaging processes if (1) the experimental measurement chain is perfectly mastered, so that the experimental data can be reproduced exactly with a numerical tool, and (2) the effective source is reproducible across the measurement setup. For devices combining piezoelectric sources with a laser interferometer, these aspects of quantitative validation have not yet been studied in the published literature. As a new stage in the experimental modelling approach, these two key issues are therefore tackled in the proposed paper in order to define precisely the quality of the experimental small-scale data provided by the MUSC bench, which are available to the scientific community. Both validation steps are treated independently of any imaging technique, so that geophysicists who wish to use the data (which are freely distributed) can know their quality precisely before testing an imaging method. First, to avoid the 2-D-3-D correction usually applied in seismic processing when comparing 2-D numerical data with 3-D experimental measurements, we refined the comparison between numerical and experimental data by generating accurate experimental line sources, removing the need for a geometrical-spreading correction of 3-D point-source data. The comparison with 2-D and 3-D numerical modelling is based on the Spectral Element Method. The approach confirms the relevance of building a line source by sampling several source points, apart from boundary effects at later arrival times. Indeed, the experimental results reproduce the amplitude behaviour and the π/4 phase delay of a line source in the same manner as the numerical data. In contrast, 2-D corrections applied to 3-D data showed discrepancies that are larger for experimental data than for numerical ones, owing to the source wavelet shape and interference between different arrivals. The experimental results from the proposed approach show that these discrepancies are avoided, especially for the reflected echoes. Concerning the second point, the experimental reproducibility of the source is assessed by computing correlation coefficients between recordings of repeated source impacts on a homogeneous model. The resulting coefficients, higher than 0.98, allow a mean source wavelet to be calculated by inversion of a mean data set. Results obtained on a more realistic model simulating clays over limestones confirmed the reproducibility of the source impact.
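The line-source construction described above can be mimicked numerically by superposing 3-D point-source contributions sampled along a line, which reproduces the 2-D behaviour (1/sqrt(t) coda and π/4 phase delay) without any spreading correction. The sketch below is a purely illustrative superposition; all values are invented.

```python
# Sketch of synthesizing a 2-D line-source response by summing 3-D
# point-source contributions sampled along a line, as in the procedure
# described above. All numbers are illustrative.
import numpy as np

c = 2000.0                        # wave speed (m/s)
d = 0.5                           # source-receiver offset (m)
ys = np.linspace(-1.0, 1.0, 201)  # point-source positions along the line
dt = 1e-6
t = np.arange(0.0, 2e-3, dt)

trace = np.zeros_like(t)
for y in ys:
    r = np.hypot(d, y)                        # distance to receiver
    idx = int(round(r / c / dt))              # arrival-time sample
    if idx < t.size:
        trace[idx] += 1.0 / (4 * np.pi * r)   # 3-D geometrical spreading

# The summed trace shows the slowly decaying tail characteristic of a
# 2-D line source rather than the sharp impulse of a single 3-D point source.
print("first-arrival sample:", trace.nonzero()[0][0], "of", trace.size)
```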
Cementitious Barriers Partnership FY2013 End-Year Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, G. P.; Langton, C. A.; Burns, H. H.
2013-11-01
In FY2013, the Cementitious Barriers Partnership (CBP) demonstrated continued tangible progress toward fulfilling the objective of developing a set of software tools to improve understanding and prediction of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. In November 2012, the CBP released “Version 1.0” of the CBP Software Toolbox, a suite of software for simulating reactive transport in cementitious materials and important degradation phenomena. In addition, the CBP completed development of new software for the “Version 2.0” Toolbox to be released in early FY2014 and demonstrated use of the Version 1.0 Toolbox on DOE applications. The current primary software components in both Versions 1.0 and 2.0 are LeachXS/ORCHESTRA, STADIUM, and a GoldSim interface for probabilistic analysis of selected degradation scenarios. The CBP Software Toolbox Version 1.0 supports analysis of external sulfate attack (including damage mechanics), carbonation, and primary constituent leaching. Version 2.0 adds analysis of chloride attack and of dual-regime flow and contaminant migration in fractured and non-fractured cementitious material. The LeachXS component embodies an extensive material property measurement database along with chemical speciation and reactive mass transport simulation cases, with emphasis on leaching of major, trace and radionuclide constituents from cementitious materials used in DOE facilities, such as Saltstone (Savannah River) and Cast Stone (Hanford), tank closure grouts, and barrier concretes. STADIUM focuses on the physical and structural service life of materials and components based on chemical speciation and reactive mass transport of major cement constituents and aggressive species (e.g., chloride, sulfate). THAMES is a planned future CBP Toolbox component focused on simulation of the microstructure of cementitious materials and calculation of the resultant hydraulic and constituent mass transfer parameters needed in modeling. Two CBP software demonstrations were conducted in FY2013, one supporting the Saltstone Disposal Facility (SDF) at SRS and the other on a representative Hanford high-level waste tank. The CBP Toolbox demonstration on the SDF provided analysis of the most probable degradation mechanisms of the cementitious vault enclosure caused by sulfate and carbonation ingress. This analysis was documented and resulted in the issuance of an SDF Performance Assessment Special Analysis by Liquid Waste Operations this fiscal year. The two new software tools supporting chloride attack and dual-regime flow will provide additional degradation tools to better evaluate the performance of DOE and commercial cementitious barriers. The CBP SRNL experimental program produced two patent applications and field data that will be used in the development and calibration of CBP software tools being developed in FY2014. The CBP software and simulation tools differ from other efforts in that all the tools are based upon specific and relevant experimental research on cementitious materials utilized in DOE applications. The CBP FY2013 program involved continuing research to improve and enhance the simulation tools as well as developing new tools that model other key degradation phenomena not addressed in Version 1.0. Efforts to verify the various simulation tools through laboratory experiments and analysis of field specimens are ongoing and will continue into FY2014 to quantify and reduce the uncertainty associated with performance assessments. This end-year report summarizes FY2013 software development efforts and the various experimental programs that are providing data for calibration and validation of the CBP-developed software.
DOT National Transportation Integrated Search
2013-06-01
As demand for public transportation grows, planning tools used by Florida Department of Transportation (FDOT) and other transit agencies must evolve to effectively predict changing patterns of ridership. A key tool for this purpose is the Transit Boa...
NASA Astrophysics Data System (ADS)
Haddag, B.; Kagnaya, T.; Nouari, M.; Cutard, T.
2013-01-01
Modelling machining operations allows the estimation of cutting parameters that are difficult to obtain experimentally, in particular quantities characterizing the tool-workpiece interface. Temperature is one such quantity; because it has an impact on tool wear, its estimation is important. This study deals with a new modelling strategy, based on two calculation steps, for analysing heat transfer into the cutting tool. Unlike classical methods, which consider only the cutting tool and apply at the cutting face an approximate heat flux estimated from experimental data (e.g. measured cutting force, cutting power), the proposed approach consists of two successive 3D finite element calculations and is fully independent of experimental measurements; only the definition of the behaviour of the tool-workpiece couple is necessary. The first is a 3D thermomechanical model of the chip formation process, which provides estimates of the cutting forces, chip morphology, and chip flow direction. The second is a 3D thermal model of heat diffusion into the cutting tool, using an adequate thermal loading (an applied uniform or non-uniform heat flux). This loading is estimated from quantities obtained in the first step, such as the contact pressure and sliding velocity distributions and the contact area. Comparisons between experimental data and the first calculation on the one hand, and between temperatures measured with embedded thermocouples and the second calculation on the other, show good agreement in terms of cutting forces, chip morphology and cutting temperature.
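The classical single-step estimate that this two-step strategy replaces derives the tool heat flux from measured cutting quantities. A back-of-the-envelope version is sketched below; the force, speed, partition fraction, and contact area are all invented values.

```python
# Back-of-the-envelope estimate of the heat flux applied to a tool's
# rake face from measured cutting data, the classical approach the
# two-step strategy above replaces. All values are invented.
Fc = 900.0       # measured cutting force (N)
vc = 3.0         # cutting speed (m/s)
beta = 0.3       # assumed fraction of cutting power entering the tool
A = 1.2e-6       # assumed tool-chip contact area (m^2)

P = Fc * vc                    # total cutting power (W)
q = beta * P / A               # mean heat flux into the tool (W/m^2)
print(f"cutting power {P:.0f} W, tool heat flux {q:.2e} W/m^2")
```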
2012-01-01
Background: The impact of weather and climate on malaria transmission has attracted considerable attention in recent years, yet uncertainties around future disease trends under climate change remain. Mathematical models provide powerful tools for addressing such questions and understanding the implications for interventions and eradication strategies, but these require realistic modeling of the vector population dynamics and its response to environmental variables. Methods: Published and unpublished field and experimental data are used to develop new formulations for modeling the relationships between key aspects of vector ecology and environmental variables. These relationships are integrated within a validated deterministic model of Anopheles gambiae s.s. population dynamics to provide a valuable tool for understanding vector response to biotic and abiotic variables. Results: A novel, parsimonious framework for assessing the effects of rainfall, cloudiness, wind speed, desiccation, temperature, relative humidity and density-dependence on vector abundance is developed, allowing ease of construction, analysis, and integration into malaria transmission models. Model validation shows good agreement with longitudinal vector abundance data from Tanzania, suggesting that recent malaria reductions in certain areas of Africa could be due to changing environmental conditions affecting vector populations. Conclusions: Mathematical models provide a powerful, explanatory means of understanding the role of environmental variables on mosquito populations and hence for predicting future malaria transmission under global change. The framework developed provides a valuable advance in this respect, but also highlights key research gaps that need to be resolved if we are to better understand future malaria risk in vulnerable communities. PMID:22877154
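A deterministic vector-population model of the general kind described above can be sketched as an ODE with temperature-dependent rates and density dependence. The functional forms and parameters below are invented for illustration and are not the paper's formulations.

```python
# Minimal sketch of a deterministic vector-population model with
# temperature-dependent rates and density dependence, in the spirit of
# the framework above. Functional forms and parameters are invented.
import numpy as np
from scipy.integrate import solve_ivp

def birth_rate(T):       # recruits per adult per day (invented form)
    return max(0.0, 0.4 * np.exp(-((T - 26.0) / 8.0) ** 2))

def death_rate(T):       # adult mortality per day (invented form)
    return 0.05 + 0.0002 * (T - 22.0) ** 2

def rhs(t, N, K=1e4):
    T = 24.0 + 4.0 * np.sin(2 * np.pi * t / 365.0)  # seasonal temperature
    return birth_rate(T) * N * (1 - N / K) - death_rate(T) * N

sol = solve_ivp(rhs, (0, 730), [100.0])
print("abundance after two years:", float(sol.y[0, -1]))
```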
NASA Astrophysics Data System (ADS)
Odedeyi, P. B.; Abou-El-Hossein, K.; Liman, M.
2017-05-01
Stainless steel 316 is a difficult-to-machine iron-based alloy, containing a minimum of about 12% chromium, commonly used in the marine and aerospace industries. This paper presents an experimental study of tool wear propagation in the end milling of stainless steel 316 with coated carbide inserts. The milling tests were conducted at three different cutting speeds, with feed rates of 0.02, 0.06 and 0.1 mm/rev and depths of cut of 1, 2 and 3 mm. The cutting tools used were TiAlN PVD multi-layer coated carbides. The effects of cutting speed, cutting tool coating top layer, and workpiece material on tool life were investigated. The results showed that cutting speed significantly affected the measured flank wear values: with increasing cutting speed, the flank wear values decreased. The experimental results also showed that flank wear was the major and predominant failure mode affecting tool life.
Mixed-Initiative Constraint-Based Activity Planning for Mars Exploration Rovers
NASA Technical Reports Server (NTRS)
Bresina, John; Jonsson, Ari K.; Morris, Paul H.; Rajan, Kanna
2004-01-01
In January 2004, two NASA rovers, named Spirit and Opportunity, successfully landed on Mars, starting an unprecedented exploration of the Martian surface. Power and thermal concerns constrained the duration of this mission, leading to an aggressive plan for commanding both rovers every day. As part of the process for generating these command loads, the MAPGEN tool provides engineers and scientists an intelligent activity planning tool that allows them to more effectively generate complex plans that maximize the science return each day. The key to the effectiveness of the MAPGEN tool is an underlying artificial intelligence plan and constraint reasoning engine. In this paper we outline the design and functionality of the MAPGEN tool and focus on some of the key capabilities it offers to the MER mission engineers.
Computational Analysis of Material Flow During Friction Stir Welding of AA5059 Aluminum Alloys
2011-01-01
tool material (AISI H13 tool steel) is modeled as an isotropic linear-elastic material. Within the analysis, the effects of some of the FSW key process... threads/m; (b) tool material = AISI H13 tool steel; (c) workpiece material = AA5059; (d) tool rotation speed = 500 rpm; (e) tool travel speed... the strain-hardening term is augmented to take into account the effect of dynamic recrystallization) while the FSW tool material (AISI H13
SCOUT: A Fast Monte-Carlo Modeling Tool of Scintillation Camera Output
Hunter, William C. J.; Barrett, Harrison H.; Lewellen, Thomas K.; Miyaoka, Robert S.; Muzi, John P.; Li, Xiaoli; McDougald, Wendy; MacDonald, Lawrence R.
2011-01-01
We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:22072297
Zimmer, Christoph
2016-01-01
Computational modeling is a key technique for analyzing models in systems biology. There are well-established methods for the estimation of kinetic parameters in models of ordinary differential equations (ODE). Experimental design techniques aim at devising experiments that maximize the information encoded in the data. For ODE models there are well-established approaches for experimental design, and even software tools. However, data from single-cell experiments on signaling pathways in systems biology often show intrinsic stochastic effects, prompting the development of specialized methods. While simulation methods have been developed for decades and parameter estimation has been targeted in recent years, only very few articles focus on experimental design for stochastic models. The Fisher information matrix is the central measure for experimental design, as it evaluates the information an experiment provides for parameter estimation. This article suggests an approach to calculate a Fisher information matrix for models containing intrinsic stochasticity and high nonlinearity. The approach makes use of a recently suggested multiple shooting for stochastic systems (MSS) objective function. The Fisher information matrix is calculated by evaluating pseudo data with the MSS technique. The performance of the approach is evaluated with simulation studies on an Immigration-Death, a Lotka-Volterra, and a Calcium oscillation model. The Calcium oscillation model is a particularly appropriate case study, as it contains the challenges inherent to signaling pathways: high nonlinearity, intrinsic stochasticity, qualitatively different behavior from an ODE solution, and partial observability. The computational speed of the MSS approach for the Fisher information matrix allows for application in realistically sized models.
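For the simpler deterministic case with Gaussian observation noise, the Fisher information matrix reduces to S^T V^-1 S, where S holds the sensitivities of the predicted observables to the parameters. The finite-difference sketch below illustrates that underlying quantity for a toy decay model; the stochastic MSS construction in the article is more involved, and all numbers here are invented.

```python
# Finite-difference sketch of a Fisher information matrix for a simple
# exponential-decay model with Gaussian noise: FIM = S^T V^-1 S. This
# only illustrates the underlying quantity; all numbers are invented.
import numpy as np

t = np.linspace(0.0, 10.0, 20)
sigma = 0.05
theta = np.array([2.0, 0.3])            # amplitude, decay rate

def model(theta):
    a, k = theta
    return a * np.exp(-k * t)

# Sensitivities d(model)/d(theta_j) by central differences
eps = 1e-6
S = np.column_stack([
    (model(theta + eps * e) - model(theta - eps * e)) / (2 * eps)
    for e in np.eye(len(theta))
])

FIM = S.T @ S / sigma**2                # V = sigma^2 * I
print("FIM:\n", FIM)
print("Cramer-Rao bound (std devs):", np.sqrt(np.diag(np.linalg.inv(FIM))))
```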
Experimental validation of predicted cancer genes using FRET
NASA Astrophysics Data System (ADS)
Guala, Dimitri; Bernhem, Kristoffer; Ait Blal, Hammou; Jans, Daniel; Lundberg, Emma; Brismar, Hjalmar; Sonnhammer, Erik L. L.
2018-07-01
Huge amounts of data are generated in genome-wide experiments designed to investigate diseases with complex genetic causes. Follow-up of all potential leads produced by such experiments is currently cost-prohibitive and time-consuming. Gene prioritization tools alleviate these constraints by directing further experimental efforts towards the most promising candidate targets. Recently, a gene prioritization tool called MaxLink was shown to outperform other widely used state-of-the-art prioritization tools in a large-scale in silico benchmark. An experimental validation of predictions made by MaxLink has however been lacking. In this study we used Fluorescence Resonance Energy Transfer, an established experimental technique for detection of protein-protein interactions, to validate potential cancer genes predicted by MaxLink. Our results provide confidence in the use of MaxLink for selection of new targets in the battle with polygenic diseases.
Experimental extraction of secure correlations from a noisy private state.
Dobek, K; Karpiński, M; Demkowicz-Dobrzański, R; Banaszek, K; Horodecki, P
2011-01-21
We report experimental generation of a noisy entangled four-photon state that exhibits a separation between the secure key contents and distillable entanglement, a hallmark feature of the recently established quantum theory of private states. The privacy analysis, based on the full tomographic reconstruction of the prepared state, is utilized in a proof-of-principle key generation. The inferiority of distillation-based strategies to extract the key is exposed by an implementation of an entanglement distillation protocol for the produced state.
Experimental demonstration of counterfactual quantum key distribution
NASA Astrophysics Data System (ADS)
Ren, M.; Wu, G.; Wu, E.; Zeng, H.
2011-04-01
Counterfactual quantum key distribution provides a natural advantage against eavesdropping on the actual signal particles. It can prevent the photon-number-splitting attack when a weak coherent light source is used for the practical implementation. We experimentally realized counterfactual quantum key distribution in an unbalanced Mach-Zehnder interferometer over a 12.5-km-long quantum channel, with a high fringe visibility of 97.4%. According to the security analysis, the system was robust against the photon-number-splitting attack.
The effect of strength training on quality of prolonged basic cardiopulmonary resuscitation.
Abelairas-Gómez, Cristian; Barcala-Furelos, Roberto; Szarpak, Łukasz; García-García, Óscar; Paz-Domínguez, Álvaro; López-García, Sergio; Rodríguez-Núñez, Antonio
2017-01-01
Providing high-quality chest compressions and rescue breaths are key elements in the effectiveness of cardiopulmonary resuscitation. To investigate the effects of a strength training programme on the quality of prolonged basic cardiopulmonary resuscitation on a manikin, we conducted a quasi-experimental trial. Thirty-nine participants with prior basic life support knowledge were randomised to an experimental or a control group. They then performed a test of 10 min of chest compressions and mouth-to-mouth ventilation on manikins equipped with a skill-reporter tool (baseline, or test 1). The experimental group participated in a four-week strength training programme focused on the muscles involved in chest compressions. Both groups were subsequently tested again (test 2). After training, the experimental group significantly increased the mean depth of compression (53.7 ± 2.3 mm vs. 49.9 ± 5.9 mm; p = 0.003) and the correct compression fraction (68.2 ± 21.0% vs. 46.4 ± 29.1%; p = 0.004). Trained subjects maintained chest compression quality over time better than the control group. The mean tidal volume delivered was higher in the experimental than in the control group (701.5 ± 187.0 mL vs. 584.8 ± 113.6 mL; p = 0.040) and above the current resuscitation guidelines. In test 2, the percentage of rescue breaths with excessive volume was higher in the experimental group than in the controls (31.5 ± 19.6% vs. 15.6 ± 13.0%; p = 0.007). A simple strength training programme has a significant impact on the quality of chest compressions and its maintenance over time. Additional training is needed to avoid over-ventilation of potential patients.
van Roekel, Hendrik W H; Rosier, Bas J H M; Meijer, Lenny H H; Hilbers, Peter A J; Markvoort, Albert J; Huck, Wilhelm T S; de Greef, Tom F A
2015-11-07
Living cells are able to produce a wide variety of biological responses when subjected to biochemical stimuli. It has become apparent that these biological responses are regulated by complex chemical reaction networks (CRNs). Unravelling the function of these circuits is a key topic of both systems biology and synthetic biology. Recent progress at the interface of chemistry and biology together with the realisation that current experimental tools are insufficient to quantitatively understand the molecular logic of pathways inside living cells has triggered renewed interest in the bottom-up development of CRNs. This builds upon earlier work of physical chemists who extensively studied inorganic CRNs and showed how a system of chemical reactions can give rise to complex spatiotemporal responses such as oscillations and pattern formation. Using purified biochemical components, in vitro synthetic biologists have started to engineer simplified model systems with the goal of mimicking biological responses of intracellular circuits. Emulation and reconstruction of system-level properties of intracellular networks using simplified circuits are able to reveal key design principles and molecular programs that underlie the biological function of interest. In this Tutorial Review, we present an accessible overview of this emerging field starting with key studies on inorganic CRNs followed by a discussion of recent work involving purified biochemical components. Finally, we review recent work showing the versatility of programmable biochemical reaction networks (BRNs) in analytical and diagnostic applications.
Experimental quantum key distribution with finite-key security analysis for noisy channels.
Bacco, Davide; Canale, Matteo; Laurenti, Nicola; Vallone, Giuseppe; Villoresi, Paolo
2013-01-01
In quantum key distribution implementations, each session is typically chosen long enough so that the secret key rate approaches its asymptotic limit. However, this choice may be constrained by the physical scenario, as in the perspective use with satellites, where the passage of one terminal over the other is restricted to a few minutes. Here we demonstrate experimentally the extraction of secure keys leveraging an optimal design of the prepare-and-measure scheme, according to recent finite-key theoretical tight bounds. The experiment is performed in different channel conditions, and assuming two distinct attack models: individual attacks or general quantum attacks. The request on the number of exchanged qubits is then obtained as a function of the key size and of the ambient quantum bit error rate. The results indicate that viable conditions for effective symmetric, and even one-time-pad, cryptography are achievable.
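In the asymptotic limit that finite-key bounds approach as the block size grows, the BB84 secret-key fraction reduces to r = 1 - 2*h2(Q) for quantum bit error rate Q, where h2 is the binary entropy; finite-key analyses, like the one used in this experiment, subtract further terms that vanish with increasing block size. A sketch of the asymptotic formula (QBER values illustrative):

```python
# Sketch of the asymptotic BB84 secret-key fraction r = 1 - 2*h2(Q),
# the limit that finite-key bounds approach as the block size grows.
# QBER values are illustrative.
import math

def h2(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for qber in (0.01, 0.05, 0.11, 0.12):
    r = 1.0 - 2.0 * h2(qber)
    print(f"QBER {qber:.2f}: key fraction {max(r, 0.0):.3f}")
```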
Revolutions in Neuroscience: Tool Development
Bickle, John
2016-01-01
Thomas Kuhn’s famous model of the components and dynamics of scientific revolutions is still dominant to this day across science, philosophy, and history. The guiding philosophical theme of this article is that, concerning actual revolutions in neuroscience over the past 60 years, Kuhn’s account is wrong. There have been revolutions, and new ones are brewing, but they do not turn on competing paradigms, anomalies, or the like. Instead, they turn exclusively on the development of new experimental tools. I adopt a metascientific approach and examine in detail the development of two recent neuroscience revolutions: the impact of engineered genetically mutated mammals in the search for causal mechanisms of “higher” cognitive functions; and the more recent impact of optogenetics and designer receptors exclusively activated by designer drugs (DREADDs). The two key metascientific concepts I derive from these case studies are a revolutionary new tool’s motivating problem, and its initial and second-phase hook experiments. These concepts hardly exhaust a detailed metascience of tool-development experiments in neuroscience, but they get that project off to a useful start and distinguish the subsequent account of neuroscience revolutions clearly from Kuhn’s famous model. I close with a brief remark about the general importance of molecular biology for a current philosophical understanding of science, as comparable to the place physics occupied when Kuhn formulated his famous theory of scientific revolutions. PMID:27013992
Experimental anti-GBM disease as a tool for studying spontaneous lupus nephritis.
Fu, Yuyang; Du, Yong; Mohan, Chandra
2007-08-01
Lupus nephritis is an immune-mediated disease, where antibodies and T cells both play pathogenic roles. Since spontaneous lupus nephritis in mouse models takes 6-12 months to manifest, there is an urgent need for a mouse model that can be used to delineate the pathogenic processes that lead to immune nephritis, over a quicker time frame. We propose that the experimental anti-glomerular basement membrane (GBM) disease model might be a suitable tool for uncovering some of the molecular steps underlying lupus nephritis. This article reviews the current evidence that supports the use of the experimental anti-GBM nephritis model for studying spontaneous lupus nephritis. Importantly, out of about 25 different molecules that have been specifically examined in the experimental anti-GBM model and also spontaneous lupus nephritis, all influence both diseases concordantly, suggesting that the experimental model might be a useful tool for unraveling the molecular basis of spontaneous lupus nephritis. This has important clinical implications, both from the perspective of genetic susceptibility as well as clinical therapeutics.
The New, Improved 2016 SmartWay Truck Carrier Tool
This EPA presentation provides information on the SmartWay Transport Partnership Program, including key information about EPA, Partners' roles, benefits, tools, partner recognition, awards, and brand value. Transcript available
Auditing and Mapping Key Skills within University Curricula
ERIC Educational Resources Information Center
Tariq, Vicki N.; Scott, Eileen M.; Cochrane, A. Clive; Lee, Maria; Ryles, Linda
2004-01-01
Universities are encouraged to embed key skills in their undergraduate curricula, yet there is often little support on how to identify skills development and progression. This paper describes a tool that facilitates colleagues in auditing key skills and career/employability skills within individual modules and mapping these skills across degree…
Strategy Keys as Tools for Problem Solving
ERIC Educational Resources Information Center
Herold-Blasius, Raja
2017-01-01
Problem solving is one of the main competences we seek to teach students at school for use in their future lives. However, when dealing with mathematical problems, teachers encounter a wide variety of difficulties. To foster students' problem-solving skills, the authors developed "strategy keys." Strategy keys can serve as material to…
NASA Astrophysics Data System (ADS)
Dutheil, Sylvain; Pibarot, Julien; Tran, Dac; Vallee, Jean-Jacques; Tribot, Jean-Pierre
2016-07-01
With the aim of placing Europe among the world's space players in the strategic area of atmospheric re-entry, several studies on experimental vehicle concepts and improvements of critical re-entry technologies have paved the way for the flight of an experimental spacecraft. The successful flight of the Intermediate eXperimental Vehicle (IXV), under ESA's Future Launchers Preparatory Programme (FLPP), is definitively a significant step forward from the Atmospheric Reentry Demonstrator flight (1998), establishing Europe as a key player in this field. The IXV project objectives were the design, development, manufacture, and ground and flight verification of an autonomous European lifting and aerodynamically controlled re-entry system, which is highly flexible and maneuverable. The paper presents the role of aerodynamics and aerothermodynamics among the key technologies for designing an atmospheric re-entry spacecraft and securing a successful flight.
Acquisition of delayed matching in the pigeon.
Berryman, R; Cumming, W W; Nevin, J A
1963-01-01
Pigeons were exposed to three successive matching-to-sample procedures. On a given trial, the sample (red, green or blue light) appeared on a center key; observing responses to this key produced the comparison stimuli on two side keys. Seven different experimental conditions could govern the temporal relations between the sample and comparison stimuli. In the "simultaneous" condition, the center key response was followed immediately by illumination of the side key comparison stimuli, with the center key remaining on. In "zero delay" the center key response simultaneously turned the side keys on and the center key off, while in the "variable delay" conditions, intervals of 1, 2, 4, 10, and 24 sec were interposed between the offset of the sample and the appearance of the comparison stimuli on the side keys. In all conditions, a response to the side key of matching hue produced reinforcement, while a response to the non-matching side key was followed by a blackout. In Procedure I, all seven experimental conditions were presented in randomly permuted order. After nine sessions of exposure (at 191 trials per session, for a total of 1719 trials) the birds gave no evidence of acquisition in any of the conditions. They were therefore transferred to Procedure II, which required them to match only in the "simultaneous" condition, with both the sample and comparison stimuli present at the same time. With the exception of one bird, all subjects acquired this performance to near 100% levels. Next, in Procedure III, they were once more exposed to presentation of all seven experimental conditions in random order. In contrast to Procedure I, they now acquired the delay performance, and were able to match effectively at delays of about 4 sec.
Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools
ERIC Educational Resources Information Center
Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.
2014-01-01
Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…
ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES
Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials...
Sandberg, Troy E; Pedersen, Margit; LaCroix, Ryan A; Ebrahim, Ali; Bonde, Mads; Herrgard, Markus J; Palsson, Bernhard O; Sommer, Morten; Feist, Adam M
2014-10-01
Adaptive laboratory evolution (ALE) has emerged as a valuable method by which to investigate microbial adaptation to a desired environment. Here, we performed ALE to 42 °C of ten parallel populations of Escherichia coli K-12 MG1655 grown in glucose minimal media. Tightly controlled experimental conditions allowed selection based on exponential-phase growth rate, yielding strains that uniformly converged toward a similar phenotype along distinct genetic paths. Adapted strains possessed as few as 6 and as many as 55 mutations, and of the 144 genes that mutated in total, 14 arose independently across two or more strains. This mutational recurrence pointed to the key genetic targets underlying the evolved fitness increase. Genome engineering was used to introduce the novel ALE-acquired alleles in random combinations into the ancestral strain, and competition between these engineered strains reaffirmed the impact of the key mutations on the growth rate at 42 °C. Interestingly, most of the identified key gene targets differed significantly from those found in similar temperature adaptation studies, highlighting the sensitivity of genetic evolution to experimental conditions and ancestral genotype. Additionally, transcriptomic analysis of the ancestral and evolved strains revealed a general trend for restoration of the global expression state back toward preheat stressed levels. This restorative effect was previously documented following evolution to metabolic perturbations, and thus may represent a general feature of ALE experiments. The widespread evolved expression shifts were enabled by a comparatively scant number of regulatory mutations, providing a net fitness benefit but causing suboptimal expression levels for certain genes, such as those governing flagellar formation, which then became targets for additional ameliorating mutations. Overall, the results of this study provide insight into the adaptation process and yield lessons important for the future implementation of ALE as a tool for scientific research and engineering. © The Author 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Bonato, Lucio; Minelli, Alessandro; Lopresti, Massimo; Cerretti, Pierfilippo
2014-01-01
ChiloKey is a matrix-based, interactive key to all 179 species of Geophilomorpha (Chilopoda) recorded from Europe, including species of uncertain identity and those whose morphology is only partially known. The key is intended to assist in the identification of subadult and adult specimens, by means of microscopy and simple dissection techniques whenever necessary. The key is freely available through the web at: http://www.biologia.unipd.it/chilokey/ and at http://www.interactive-keys.eu/chilokey/.
Model Rocketry: University-Level Educational Tool
ERIC Educational Resources Information Center
Barrowman, James S.
1974-01-01
Describes how model rocketry can be a useful educational tool at the university level as a practical application of theoretical aerodynamic concepts and as a tool for students in experimental research. (BR)
Experimental eavesdropping attack against Ekert's protocol based on Wigner's inequality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bovino, F. A.; Colla, A. M.; Castagnoli, G.
2003-09-01
We experimentally implemented an eavesdropping attack against the Ekert protocol for quantum key distribution based on the Wigner inequality. We demonstrate a serious lack of security of this protocol when the eavesdropper gains total control of the source. In addition we tested a modified Wigner inequality which should guarantee a secure quantum key distribution.
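For context, the textbook (unmodified) form of Wigner's inequality for a spin singlet can be checked numerically; the exact modified variant tested in the paper is not reproduced here.

```python
import numpy as np

# For a spin singlet, the probability that both particles pass analyzers
# separated by angle theta is P(theta) = 0.5 * sin^2(theta / 2)
# (spin-1/2 convention). Wigner's inequality for settings a, b, c reads
#   P(a, b) <= P(a, c) + P(c, b)
# for any local hidden-variable model; quantum mechanics violates it.
P = lambda theta: 0.5 * np.sin(np.radians(theta) / 2) ** 2

a, c, b = 0.0, 60.0, 120.0  # c midway between a and b maximizes violation
lhs = P(b - a)
rhs = P(c - a) + P(b - c)
print(f"LHS = {lhs:.3f}, RHS = {rhs:.3f}, violated: {lhs > rhs}")
```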
Molecular simulations of self-assembly processes in metal-organic frameworks: Model dependence
NASA Astrophysics Data System (ADS)
Biswal, Debasmita; Kusalik, Peter G.
2017-07-01
Molecular simulation is a powerful tool for investigating microscopic behavior in various chemical systems, where the use of suitable models is critical to successfully reproduce the structural and dynamic properties of the real systems of interest. In this context, molecular dynamics simulation studies of self-assembly processes in metal-organic frameworks (MOFs), a well-known class of porous materials with interesting chemical and physical properties, are relatively challenging, where a reasonably accurate representation of metal-ligand interactions is anticipated to play an important role. In the current study, we both investigate the performance of some existing models and introduce and test new models to help explore the self-assembly in an archetypal Zn-carboxylate MOF system. To this end, the behavior of six different Zn-ion models, three solvent models, and two ligand models was examined and validated against key experimental structural parameters. To explore longer time scale ordering events during MOF self-assembly via explicit solvent simulations, it is necessary to identify a suitable combination of simplified model components representing metal ions, organic ligands, and solvent molecules. It was observed that an extended cationic dummy atom (ECDA) Zn-ion model combined with an all-atom carboxylate ligand model and a simple dipolar solvent model can reproduce characteristic experimental structures for the archetypal MOF system. The successful use of these models in extensive sets of molecular simulations, which provide key insights into the early stages of the self-assembly mechanism of this archetypal MOF system, has recently been reported.
PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.
Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter
2016-04-01
Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface, and at the same time, achieve the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite than in an already existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss proposed design, test another design, etc.). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
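A minimal sketch of the underlying computation, evaluating candidate designs by a function of the Fisher Information Matrix and searching a discrete grid, assuming a toy exponential-elimination model rather than any model shipped with PopED lite:

```python
import itertools
import numpy as np

# Toy one-compartment elimination model y(t) = A * exp(-k * t) with
# additive noise; A, k, sigma are assumed values for illustration.
A, k, sigma = 10.0, 0.3, 0.5

def fim(times):
    # Sensitivities of the model w.r.t. (A, k) at the sampling times.
    t = np.asarray(times, dtype=float)
    J = np.column_stack([np.exp(-k * t),            # dy/dA
                         -A * t * np.exp(-k * t)])  # dy/dk
    return J.T @ J / sigma**2

# D-optimality: pick the pair of sampling times maximizing log det FIM
# over a coarse grid, mimicking discrete optimization over constrained
# candidate designs.
grid = np.arange(0.5, 12.0, 0.5)
best = max(itertools.combinations(grid, 2),
           key=lambda d: np.linalg.slogdet(fim(d))[1])
print("D-optimal sampling times:", best)
```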
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalantari, Alireza; Sullivan-Lewis, Elliot; McDonell, Vincent
Due to increasingly stringent air quality requirements, stationary power gas turbines have moved to lean-premixed operation, which reduces pollutant emissions but can result in flashback. Curtailing flashback can be difficult with hydrocarbon fuels and becomes even more challenging when hydrogen is used as the fuel. In fact, flashback is a key operability issue associated with low emission combustion of high hydrogen content fuels. Flashback can cause serious damage to the premixer hardware. Hence, design tools to predict flashback propensity are of interest. Such a design tool has been developed, based on data gathered in an experimental study, to predict boundary layer flashback using non-dimensional parameters. The flashback propensity of a premixed jet flame has been studied experimentally. Boundary layer flashback has been investigated under turbulent flow conditions at elevated pressures and temperatures (i.e., 3 atm to 8 atm and 300 K to 500 K). The data presented in this study are for hydrogen fuel at various Reynolds numbers, which are representative of practical gas turbine premixer conditions and are significantly higher than results currently available in the literature. Three burner heads constructed of different materials (stainless steel, copper, and zirconia ceramic) were used to evaluate the effect of tip temperature, a parameter found previously to be an important factor in triggering flashback. This study characterizes flashback systematically by developing a comprehensive non-dimensional model which takes into account all parameters affecting boundary layer flashback propensity. The model was optimized for the new data and captures the behavior of the new results well. Further, comparison of the model with the single existing study of high pressure jet flame flashback also indicates good agreement. The model developed using the high pressure test rig is able to predict flashback tendencies for a commercial gas turbine engine and can thus serve as a design tool for identifying when flashback is likely to occur for a given geometry and condition.
Evaluating a bedside tool for neuroanatomical localization with extended-matching questions.
Tan, Kevin; Chin, Han Xin; Yau, Christine W L; Lim, Erle C H; Samarasekera, Dujeepa; Ponnamperuma, Gominda; Tan, Nigel C K
2018-05-06
Neuroanatomical localization (NL) is a key skill in neurology, but learners often have difficulty with it. This study aims to evaluate a concise NL tool (NLT) developed to help teach and learn NL. To evaluate the NLT, an extended-matching questions (EMQ) test to assess NL was designed and validated. The EMQ was validated with fourth-year medical students and internal medicine and neurology residents. The NLT's usability was evaluated with third- and fourth-year students, and the effectiveness was evaluated with an experimental study of second-year students, using the EMQ as the outcome measure. Students were taught how to use both the NLT and textbook algorithms (control) to perform NL, then randomized into either group, and only allowed to use their assigned tool to complete the EMQ. Primary outcome was the difference in mean EMQ scores expressed as a percentage of total score. For EMQ validation, students (n = 56) scored lower than residents (n = 50) (76.7% ± 1.7 vs. 83.0% ± 1.6; mean ± standard error of mean, P < 0.009). The EMQ demonstrated good reliability (Cronbach's α 0.85) and generalizability (G-coefficient 0.85). Third- (n = 77) and fourth-year (n = 42) students found the NLT user-friendly and helpful in their learning of NL. In the experimental study, scores were significantly higher for NLT group (n = 94) than for controls (n = 101) (42.5 vs. 37.0%, P = 0.014); the effect size (Cohen's d) was 0.36. The EMQ is validated to reliably assess NL and is generalizable, feasible, practical, and of low cost. The concise and user-friendly NLT for NL was effective in aiding medical student performance of NL. Anat Sci Educ 11: 262-269. © 2017 American Association of Anatomists.
Pelagic Habitat Analysis Module (PHAM) for GIS Based Fisheries Decision Support
NASA Technical Reports Server (NTRS)
Kiefer, D. A.; Armstrong, Edward M.; Harrison, D. P.; Hinton, M. G.; Kohin, S.; Snyder, S.; O'Brien, F. J.
2011-01-01
We have assembled a system that integrates satellite and model output with fisheries data. We have developed tools that allow analysis of the interaction between species and key environmental variables, and demonstrated the capacity to accurately map habitat of thresher sharks (Alopias vulpinus and A. pelagicus). Their seasonal migration along the California Current is at least partly driven by the seasonal migration of sardine, key prey of the sharks.
Cowley, Benjamin R.; Kaufman, Matthew T.; Butler, Zachary S.; Churchland, Mark M.; Ryu, Stephen I.; Shenoy, Krishna V.; Yu, Byron M.
2014-01-01
Objective. Analyzing and interpreting the activity of a heterogeneous population of neurons can be challenging, especially as the number of neurons, experimental trials, and experimental conditions increases. One approach is to extract a set of latent variables that succinctly captures the prominent co-fluctuation patterns across the neural population. A key problem is that the number of latent variables needed to adequately describe the population activity is often greater than three, thereby preventing direct visualization of the latent space. By visualizing a small number of 2-d projections of the latent space or each latent variable individually, it is easy to miss salient features of the population activity. Approach. To address this limitation, we developed a Matlab graphical user interface (called DataHigh) that allows the user to quickly and smoothly navigate through a continuum of different 2-d projections of the latent space. We also implemented a suite of additional visualization tools (including playing out population activity timecourses as a movie and displaying summary statistics, such as covariance ellipses and average timecourses) and an optional tool for performing dimensionality reduction. Main results. To demonstrate the utility and versatility of DataHigh, we used it to analyze single-trial spike count and single-trial timecourse population activity recorded using a multi-electrode array, as well as trial-averaged population activity recorded using single electrodes. Significance. DataHigh was developed to fulfill a need for visualization in exploratory neural data analysis, which can provide intuition that is critical for building scientific hypotheses and models of population activity. PMID:24216250
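The central idea, smoothly navigating a continuum of 2-d projections of a higher-dimensional latent space, can be sketched as follows (mock data and a generic QR-based construction; this is not DataHigh's actual Matlab code):

```python
import numpy as np

rng = np.random.default_rng(0)
latents = rng.standard_normal((500, 8))  # mock population activity, 8 latent dims

def random_plane(dim, rng):
    # Orthonormal basis for a random 2-d projection plane (QR trick).
    q, _ = np.linalg.qr(rng.standard_normal((dim, 2)))
    return q  # shape (dim, 2)

# Interpolate between two planes to emulate smooth navigation of the
# continuum of 2-d views; re-orthonormalizing keeps each step a projection.
P0, P1 = random_plane(8, rng), random_plane(8, rng)
for alpha in np.linspace(0.0, 1.0, 5):
    P, _ = np.linalg.qr((1 - alpha) * P0 + alpha * P1)
    view = latents @ P  # (500, 2) -- one 2-d view of the latent space
    print(f"alpha={alpha:.2f}, view variance={view.var(axis=0).round(2)}")
```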
Michielsen, Kristien; De Meyer, Sara; Ivanova, Olena; Anderson, Ragnar; Decat, Peter; Herbiet, Céline; Kabiru, Caroline W; Ketting, Evert; Lees, James; Moreau, Caroline; Tolman, Deborah L; Vanwesenbeeck, Ine; Vega, Bernardo; Verhetsel, Elizabeth; Chandra-Mouli, Venkatraman
2016-01-13
On December 4th 2014, the International Centre for Reproductive Health (ICRH) at Ghent University organized an international conference on adolescent sexual and reproductive health (ASRH) and well-being. This viewpoint highlights two key messages of the conference--(1) ASRH promotion is broadening on different levels and (2) this broadening has important implications for research and interventions--that can guide this research field into the next decade. Adolescent sexuality has long been equated with risk and danger. However, throughout the presentations, it became clear that ASRH and related promotion efforts are broadening on different levels: from risk to well-being, from targeted and individual to comprehensive and structural, from knowledge transfer to innovative tools. However, indicators to measure adolescent sexuality that should accompany this broadening trend are lacking. While public health related indicators (HIV/STIs, pregnancies) and their behavioral proxies (e.g., condom use, number of partners) are well developed and documented, there is a lack of consensus on indicators for the broader construct of adolescent sexuality, including sexual well-being and aspects of positive sexuality. Furthermore, the debate during the conference clearly indicated that experimental designs may not be the only appropriate study design to measure the effectiveness of comprehensive, context-specific and long-term ASRH programmes, and that alternatives need to be identified and applied. Presenters at the conference clearly expressed the need to develop validated tools to measure different sub-constructs of adolescent sexuality and environmental factors. There was a plea to combine (quasi-)experimental effectiveness studies with evaluations of the development and implementation of ASRH promotion initiatives.
Robust Informatics Infrastructure Required For ICME: Combining Virtual and Experimental Data
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Holland, Frederic A. Jr.; Bednarcyk, Brett A.
2014-01-01
With the increased emphasis on reducing the cost and time to market of new materials, the need for robust automated materials information management system(s) enabling sophisticated data mining tools is increasing, as evidenced by the emphasis on Integrated Computational Materials Engineering (ICME) and the recent establishment of the Materials Genome Initiative (MGI). This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic and/or multi-scale models requires both the processing of large volumes of test data and the complex materials data necessary to establish processing-microstructure-property-performance relationships. Fortunately, material information management systems have kept pace with the growing user demands and evolved to enable: (i) the capture of both pointwise data and full spectra of raw data curves; (ii) data management functions such as access, version, and quality controls; (iii) a wide range of data import, export and analysis capabilities; (iv) data pedigree traceability mechanisms; (v) data searching, reporting and viewing tools; and (vi) access to the information via a wide range of interfaces. This paper discusses key principles for the development of a robust materials information management system to enable the connections at various length scales to be made between experimental data and corresponding multiscale modeling toolsets to enable ICME. In particular, it describes NASA Glenn's efforts toward establishing such a database for capturing constitutive modeling behavior for both monolithic and composite materials.
Optimal CCD readout by digital correlated double sampling
NASA Astrophysics Data System (ADS)
Alessandri, C.; Abusleme, A.; Guzman, D.; Passalacqua, I.; Alvarez-Fontecilla, E.; Guarini, M.
2016-01-01
Digital correlated double sampling (DCDS), a readout technique for charge-coupled devices (CCD), is gaining popularity in astronomical applications. By using an oversampling ADC and a digital filter, a DCDS system can achieve a better performance than traditional analogue readout techniques at the expense of a more complex system analysis. Several attempts to analyse and optimize a DCDS system have been reported, but most of the work presented in the literature has been experimental. Some approximate analytical tools have been presented for independent parameters of the system, but the overall performance and trade-offs have not been yet modelled. Furthermore, there is disagreement among experimental results that cannot be explained by the analytical tools available. In this work, a theoretical analysis of a generic DCDS readout system is presented, including key aspects such as the signal conditioning stage, the ADC resolution, the sampling frequency and the digital filter implementation. By using a time-domain noise model, the effect of the digital filter is properly modelled as a discrete-time process, thus avoiding the imprecision of continuous-time approximations that have been used so far. As a result, an accurate, closed-form expression for the signal-to-noise ratio at the output of the readout system is reached. This expression can be easily optimized in order to meet a set of specifications for a given CCD, thus providing a systematic design methodology for an optimal readout system. Simulated results are presented to validate the theory, obtained with both time- and frequency-domain noise generation models for completeness.
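The gist of DCDS, averaging oversampled points on the reference and signal levels and differencing, can be illustrated with a white-noise-only toy model (it deliberately omits the 1/f and reset-noise terms a full analysis must include):

```python
import numpy as np

rng = np.random.default_rng(1)
n_px, N = 20000, 32  # number of pixels, oversampled points per level

# Per-pixel CCD video waveform: a reset (reference) level with noise,
# then the signal level offset by the pixel charge. Values illustrative.
signal_e = 100.0
ref = rng.normal(0.0, 5.0, (n_px, N))            # white readout noise
sig = signal_e + rng.normal(0.0, 5.0, (n_px, N))

# Digital CDS: average N oversampled points on each level and subtract.
# Averaging attenuates the white-noise component by sqrt(N).
estimate = sig.mean(axis=1) - ref.mean(axis=1)
print(f"bias: {estimate.mean() - signal_e:.3f} e-")
print(f"read noise: {estimate.std():.2f} e- "
      f"(single-sample CDS would give ~{5.0 * np.sqrt(2):.2f} e-)")
```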
NASA Astrophysics Data System (ADS)
Cowley, Benjamin R.; Kaufman, Matthew T.; Butler, Zachary S.; Churchland, Mark M.; Ryu, Stephen I.; Shenoy, Krishna V.; Yu, Byron M.
2013-12-01
Objective. Analyzing and interpreting the activity of a heterogeneous population of neurons can be challenging, especially as the number of neurons, experimental trials, and experimental conditions increases. One approach is to extract a set of latent variables that succinctly captures the prominent co-fluctuation patterns across the neural population. A key problem is that the number of latent variables needed to adequately describe the population activity is often greater than 3, thereby preventing direct visualization of the latent space. By visualizing a small number of 2-d projections of the latent space or each latent variable individually, it is easy to miss salient features of the population activity. Approach. To address this limitation, we developed a Matlab graphical user interface (called DataHigh) that allows the user to quickly and smoothly navigate through a continuum of different 2-d projections of the latent space. We also implemented a suite of additional visualization tools (including playing out population activity timecourses as a movie and displaying summary statistics, such as covariance ellipses and average timecourses) and an optional tool for performing dimensionality reduction. Main results. To demonstrate the utility and versatility of DataHigh, we used it to analyze single-trial spike count and single-trial timecourse population activity recorded using a multi-electrode array, as well as trial-averaged population activity recorded using single electrodes. Significance. DataHigh was developed to fulfil a need for visualization in exploratory neural data analysis, which can provide intuition that is critical for building scientific hypotheses and models of population activity.
Cowley, Benjamin R; Kaufman, Matthew T; Butler, Zachary S; Churchland, Mark M; Ryu, Stephen I; Shenoy, Krishna V; Yu, Byron M
2013-12-01
Analyzing and interpreting the activity of a heterogeneous population of neurons can be challenging, especially as the number of neurons, experimental trials, and experimental conditions increases. One approach is to extract a set of latent variables that succinctly captures the prominent co-fluctuation patterns across the neural population. A key problem is that the number of latent variables needed to adequately describe the population activity is often greater than 3, thereby preventing direct visualization of the latent space. By visualizing a small number of 2-d projections of the latent space or each latent variable individually, it is easy to miss salient features of the population activity. To address this limitation, we developed a Matlab graphical user interface (called DataHigh) that allows the user to quickly and smoothly navigate through a continuum of different 2-d projections of the latent space. We also implemented a suite of additional visualization tools (including playing out population activity timecourses as a movie and displaying summary statistics, such as covariance ellipses and average timecourses) and an optional tool for performing dimensionality reduction. To demonstrate the utility and versatility of DataHigh, we used it to analyze single-trial spike count and single-trial timecourse population activity recorded using a multi-electrode array, as well as trial-averaged population activity recorded using single electrodes. DataHigh was developed to fulfil a need for visualization in exploratory neural data analysis, which can provide intuition that is critical for building scientific hypotheses and models of population activity.
The Role of Quantum Decoherence in FRET.
Nelson, Philip C
2018-02-16
Resonance energy transfer has become an indispensable experimental tool for single-molecule and single-cell biophysics. Its physical underpinnings, however, are subtle: it involves a discrete jump of excitation from one molecule to another, and so we regard it as a strongly quantum-mechanical process. And yet its kinetics differ from what many of us were taught about two-state quantum systems: quantum superpositions of the states do not seem to arise, and so on. Although J. R. Oppenheimer and T. Förster navigated these subtleties successfully, it remains hard to find an elementary derivation in modern language. The key step involves acknowledging quantum decoherence. Appreciating that aspect can be helpful when we attempt to extend our understanding to situations in which Förster's original analysis is not applicable. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
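For reference, the standard Förster result, which any such derivation must ultimately recover, is (textbook material, not derived in the abstract):

```latex
k_{\mathrm{ET}}(r) = \frac{1}{\tau_D}\left(\frac{R_0}{r}\right)^{6},
\qquad
E = \frac{k_{\mathrm{ET}}}{k_{\mathrm{ET}} + 1/\tau_D}
  = \frac{1}{1 + (r/R_0)^{6}},
```

where \tau_D is the donor excited-state lifetime, R_0 the Förster radius, r the donor-acceptor distance, and E the transfer efficiency.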
Bertanza, Giorgio; Papa, Matteo; Canato, Matteo; Collivignarelli, Maria Cristina; Pedrazzani, Roberta
2014-05-01
A key issue in the operation of biological Waste Water Treatment Plants (WWTPs) is sludge management. Mechanical dewatering is a crucial stage for sludge volume reduction; however, being a costly operation, it requires optimization. We developed an original experimental methodology to evaluate the technical (dewatering efficiency) and financial (total treatment costs) performance of dewatering devices, which might be used as a DSS (Decision Support System) for WWTP managers. This tool was then applied to two real case studies, comparing, respectively, three industrial-size centrifuges and two different operation modes of the same machine (fixed installation vs. outsourcing service). In both cases, the best option was identified, based jointly on economic and (site-specific) technical evaluations. Copyright © 2014 Elsevier Ltd. All rights reserved.
Two-dimensional vacuum ultraviolet images in different MHD events on the EAST tokamak
NASA Astrophysics Data System (ADS)
Zhijun, WANG; Xiang, GAO; Tingfeng, MING; Yumin, WANG; Fan, ZHOU; Feifei, LONG; Qing, ZHUANG; EAST Team
2018-02-01
A high-speed vacuum ultraviolet (VUV) imaging telescope system has been developed to measure the edge plasma emission (including the pedestal region) in the Experimental Advanced Superconducting Tokamak (EAST). The key optics of the high-speed VUV imaging system consists of three parts: an inverse Schwarzschild-type telescope, a micro-channel plate (MCP) and a visible imaging high-speed camera. The VUV imaging system has been operated routinely in the 2016 EAST experiment campaign. The dynamics of two-dimensional (2D) images of magnetohydrodynamic (MHD) instabilities, such as edge localized modes (ELMs), tearing-like modes and disruptions, have been observed using this system. The related VUV images are presented in this paper, indicating that the VUV imaging system is a promising tool that can be applied successfully under various plasma conditions.
Scientific workflows as productivity tools for drug discovery.
Shon, John; Ohkawa, Hitomi; Hammer, Juergen
2008-05-01
Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike most scientific data handling and application integration approaches, scientific workflows are applied by researchers to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.
MosaicSolver: a tool for determining recombinants of viral genomes from pileup data
Wood, Graham R.; Ryabov, Eugene V.; Fannon, Jessica M.; Moore, Jonathan D.; Evans, David J.; Burroughs, Nigel
2014-01-01
Viral recombination is a key evolutionary mechanism, aiding escape from host immunity, contributing to changes in tropism and possibly assisting transmission across species barriers. The ability to determine whether recombination has occurred and to locate associated specific recombination junctions is thus of major importance in understanding emerging diseases and pathogenesis. This paper describes a method for determining recombinant mosaics (and their proportions) originating from two parent genomes, using high-throughput sequence data. The method involves setting the problem geometrically and the use of appropriately constrained quadratic programming. Recombinants of the honeybee deformed wing virus and the Varroa destructor virus-1 are inferred to illustrate the method from both siRNAs and reads sampling the viral genome population (cDNA library); our results are confirmed experimentally. Matlab software (MosaicSolver) is available. PMID:25120266
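The "appropriately constrained quadratic programming" step can be pictured as a small least-squares problem over mosaic proportions. A hedged toy version (made-up design matrix and frequencies, and a generic SLSQP solver in place of MosaicSolver's own formulation):

```python
import numpy as np
from scipy.optimize import minimize

# Estimate proportions p of candidate genomes (columns of M) from observed
# per-site variant frequencies f in the pileup. M[i, j] = 1 if candidate j
# carries the parent-B allele at site i, else 0. Data below are invented.
M = np.array([[0, 1, 1],
              [0, 1, 0],
              [0, 0, 1],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)
f = np.array([0.62, 0.33, 0.28, 0.58, 0.31])

# Constrained quadratic program: minimize ||M p - f||^2 subject to
# p >= 0 and sum(p) = 1 (proportions of a genome population).
loss = lambda p: np.sum((M @ p - f) ** 2)
res = minimize(loss, x0=np.full(3, 1 / 3), method="SLSQP",
               bounds=[(0.0, 1.0)] * 3,
               constraints={"type": "eq", "fun": lambda p: p.sum() - 1.0})
print("estimated proportions:", res.x.round(3))
```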
Simultaneous AFM topography and recognition imaging at the plasma membrane of mammalian cells.
Chtcheglova, Lilia A; Hinterdorfer, Peter
2018-01-01
Elucidating the nano-organization of membrane proteins at/within the plasma membrane is probably the most demanding and still challenging task in cell biology, since it requires experimental approaches with nanoscale resolution. During the last decade, atomic force microscopy (AFM)-based simultaneous topography and recognition imaging (TREC) has become a powerful tool to quickly obtain local receptor nano-maps on complex heterogeneous biosurfaces such as cells and membranes. Here we emphasize the TREC technique and explain how to unravel the nano-landscape of mammalian cells. We describe the procedures for all steps of the experiment, including tip functionalization with ligand molecules, sample preparation, and localization of key molecules on the cell surface. We also discuss the current limitations and future perspectives of this technique. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Comparisons of non-Gaussian statistical models in DNA methylation analysis.
Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun
2014-06-16
As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
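A minimal illustration of the bounded-support point: on simulated methylation beta-values, a beta fit (which respects the (0, 1) support) outperforms a Gaussian fit in log-likelihood. This is an assumption-laden toy, not the specific models compared in the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Simulated methylation beta-values: bounded on (0, 1), hence non-Gaussian.
x = np.concatenate([rng.beta(0.5, 8.0, 300),   # mostly unmethylated sites
                    rng.beta(9.0, 0.7, 200)])  # mostly methylated sites
x = np.clip(x, 1e-6, 1 - 1e-6)  # guard against exact 0/1

# A Gaussian fit ignores the bounded support; a beta fit respects it.
mu, sd = stats.norm.fit(x)
a, b, loc, scale = stats.beta.fit(x, floc=0, fscale=1)
print(f"normal logL: {stats.norm(mu, sd).logpdf(x).sum():.1f}")
print(f"beta   logL: {stats.beta(a, b).logpdf(x).sum():.1f}")
```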
Comparisons of Non-Gaussian Statistical Models in DNA Methylation Analysis
Ma, Zhanyu; Teschendorff, Andrew E.; Yu, Hong; Taghia, Jalil; Guo, Jun
2014-01-01
As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance. PMID:24937687
A hybrid brain-computer interface-based mail client.
Yu, Tianyou; Li, Yuanqing; Long, Jinyi; Li, Feng
2013-01-01
Brain-computer interface-based communication plays an important role in brain-computer interface (BCI) applications; electronic mail is one of the most common communication tools. In this study, we propose a hybrid BCI-based mail client that implements electronic mail communication by means of real-time classification of multimodal features extracted from scalp electroencephalography (EEG). With this BCI mail client, users can receive, read, write, and attach files to their mail. Using a BCI mouse that utilizes hybrid brain signals, that is, motor imagery and P300 potential, the user can select and activate the function keys and links on the mail client graphical user interface (GUI). An adaptive P300 speller is employed for text input. The system has been tested with 6 subjects, and the experimental results validate the efficacy of the proposed method.
A Hybrid Brain-Computer Interface-Based Mail Client
Yu, Tianyou; Li, Yuanqing; Long, Jinyi; Li, Feng
2013-01-01
Brain-computer interface-based communication plays an important role in brain-computer interface (BCI) applications; electronic mail is one of the most common communication tools. In this study, we propose a hybrid BCI-based mail client that implements electronic mail communication by means of real-time classification of multimodal features extracted from scalp electroencephalography (EEG). With this BCI mail client, users can receive, read, write, and attach files to their mail. Using a BCI mouse that utilizes hybrid brain signals, that is, motor imagery and P300 potential, the user can select and activate the function keys and links on the mail client graphical user interface (GUI). An adaptive P300 speller is employed for text input. The system has been tested with 6 subjects, and the experimental results validate the efficacy of the proposed method. PMID:23690880
Systems Biology in Immunology – A Computational Modeling Perspective
Germain, Ronald N.; Meier-Schellersheim, Martin; Nita-Lazar, Aleksandra; Fraser, Iain D. C.
2011-01-01
Systems biology is an emerging discipline that combines high-content, multiplexed measurements with informatic and computational modeling methods to better understand biological function at various scales. Here we present a detailed review of the methods used to create computational models and conduct simulations of immune function. We provide descriptions of the key data-gathering techniques employed to generate the quantitative and qualitative data required for such modeling and simulation and summarize the progress to date in applying these tools and techniques to questions of immunological interest, including infectious disease. We include comments on what insights modeling can provide that complement information obtained from the more familiar experimental discovery methods used by most investigators and why quantitative methods are needed to eventually produce a better understanding of immune system operation in health and disease. PMID:21219182
CDinFusion – Submission-Ready, On-Line Integration of Sequence and Contextual Data
Hankeln, Wolfgang; Wendel, Norma Johanna; Gerken, Jan; Waldmann, Jost; Buttigieg, Pier Luigi; Kostadinov, Ivaylo; Kottmann, Renzo; Yilmaz, Pelin; Glöckner, Frank Oliver
2011-01-01
State of the art (DNA) sequencing methods applied in “Omics” studies grant insight into the ‘blueprints’ of organisms from all domains of life. Sequencing is carried out around the globe and the data is submitted to the public repositories of the International Nucleotide Sequence Database Collaboration. However, the context in which these studies are conducted often gets lost, because experimental data, as well as information about the environment, are rarely submitted along with the sequence data. If these contextual data, or metadata, are missing, key opportunities for comparison and analysis across studies and habitats are hampered or even impossible. To address this problem, the Genomic Standards Consortium (GSC) promotes checklists and standards to better describe our sequence data collection and to promote the capturing, exchange and integration of sequence data with contextual data. In a recent community effort, the GSC has developed a series of recommendations for contextual data that should be submitted along with sequence data. To help the scientific community significantly enhance the quality and quantity of contextual data in the public sequence data repositories, specialized software tools are needed. In this work we present CDinFusion, a web-based tool to integrate contextual and sequence data in (Multi)FASTA format prior to submission. The tool is open source and available under the Lesser GNU Public License 3. A public installation is hosted and maintained at the Max Planck Institute for Marine Microbiology at http://www.megx.net/cdinfusion. The tool may also be installed locally using the open source code available at http://code.google.com/p/cdinfusion. PMID:21935468
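The underlying idea, fusing contextual (meta)data into (Multi)FASTA headers so it travels with the sequences, might be sketched as follows (hypothetical tag format; CDinFusion's actual output conventions are not reproduced here):

```python
# Minimal sketch of the idea behind CDinFusion (not its actual code or
# format): embed contextual metadata into (Multi)FASTA headers before
# submission, so environment information travels with the sequences.
contextual = {"lat_lon": "54.18 N 7.90 E", "depth_m": "1", "habitat": "seawater"}

def fuse(records, metadata):
    # records: iterable of (identifier, sequence) tuples.
    tags = " ".join(f"[{k}={v}]" for k, v in metadata.items())
    return "\n".join(f">{ident} {tags}\n{seq}" for ident, seq in records)

print(fuse([("read_0001", "ACGTACGTTAGC"), ("read_0002", "GGATCCTTAACG")],
           contextual))
```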
Heart rate variability in normal and pathological sleep.
Tobaldini, Eleonora; Nobili, Lino; Strada, Silvia; Casali, Karina R; Braghiroli, Alberto; Montano, Nicola
2013-10-16
Sleep is a physiological process involving different biological systems, from the molecular to the organ level; its integrity is essential for maintaining health and homeostasis in human beings. Although in the past sleep was considered a state of quiet, experimental and clinical evidence suggests a noteworthy activation of different biological systems during sleep. A key role is played by the autonomic nervous system (ANS), whose modulation regulates cardiovascular functions during sleep onset and different sleep stages. Therefore, interest in evaluating autonomic cardiovascular control in health and disease by means of linear and non-linear heart rate variability (HRV) analyses is growing. The application of classical tools for ANS analysis, such as HRV during physiological sleep, showed that the rapid eye movement (REM) stage is characterized by a likely sympathetic predominance associated with a vagal withdrawal, while the opposite trend is observed during non-REM sleep. More recently, the use of non-linear tools, such as entropy-derived indices, has provided new insight into cardiac autonomic regulation, revealing for instance changes in cardiovascular complexity during REM sleep and supporting the hypothesis of a reduced capability of the cardiovascular system to deal with stress challenges. Interestingly, different HRV tools have been applied to characterize autonomic cardiac control in different pathological conditions, from neurological sleep disorders to sleep disordered breathing (SDB). In summary, linear and non-linear analyses of HRV are reliable approaches to assess changes of autonomic cardiac modulation during sleep, both in health and disease. The use of these tools could provide important information of clinical and prognostic relevance.
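As a concrete illustration of the linear and non-linear indices mentioned, a sketch computing SDNN, RMSSD, and a simple sample entropy from a mock RR-interval series (a common textbook formulation, not a clinically validated implementation):

```python
import numpy as np

def sdnn_rmssd(rr_ms):
    # Classical (linear) time-domain HRV indices from RR intervals in ms.
    rr = np.asarray(rr_ms, dtype=float)
    return rr.std(ddof=1), np.sqrt(np.mean(np.diff(rr) ** 2))

def sample_entropy(rr_ms, m=2, r_frac=0.2):
    # Non-linear index: SampEn, a common entropy-derived HRV measure.
    x = np.asarray(rr_ms, dtype=float)
    r = r_frac * x.std(ddof=1)
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return (d <= r).sum() - len(t)  # exclude self-matches
    return -np.log(matches(m + 1) / matches(m))

rr = 800 + 50 * np.random.default_rng(3).standard_normal(300)  # mock RR series
sdnn, rmssd = sdnn_rmssd(rr)
print(f"SDNN={sdnn:.1f} ms, RMSSD={rmssd:.1f} ms, "
      f"SampEn={sample_entropy(rr):.2f}")
```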
Study of AMPK-Regulated Metabolic Fluxes in Neurons Using the Seahorse XFe Analyzer.
Marinangeli, Claudia; Kluza, Jérome; Marchetti, Philippe; Buée, Luc; Vingtdeux, Valérie
2018-01-01
AMP-activated protein kinase (AMPK) is the intracellular master energy sensor and metabolic regulator. AMPK is involved in cell energy homeostasis through the regulation of glycolytic flux and mitochondrial biogenesis. Interestingly, metabolic dysfunctions and AMPK deregulations are observed in many neurodegenerative diseases, including Alzheimer's. While these deregulations could play a key role in the development of these diseases, the study of metabolic fluxes has remained quite challenging and time-consuming. In this chapter, we describe the Seahorse XFe respirometry assay as a fundamental experimental tool to investigate the role of AMPK in controlling and modulating cell metabolic fluxes in living and intact differentiated primary neurons. The Seahorse XFe respirometry assay allows the real-time monitoring of glycolytic flux and mitochondrial respiration from different kind of cells, tissues, and isolated mitochondria. Here, we specify a protocol optimized for primary neuronal cells using several energy substrates such as glucose, pyruvate, lactate, glutamine, and ketone bodies. Nevertheless, this protocol can easily be adapted to monitor metabolic fluxes from other types of cells, tissues, or isolated mitochondria by taking into account the notes proposed for each key step of this assay.
Yu, Qiang; Wu, Honghui; Wang, Zhengwen; Flynn, Dan F. B.; Yang, Hao; Lü, Fumei; Smith, Melinda; Han, Xingguo
2015-01-01
Limitation of disturbances, such as grazing and fire, is a key tool for nature reserve management and ecological restoration. While the role of these disturbances in shaping ecosystem structure and functioning has been intensively studied, less is known about the consequences of long-term prevention of grazing and fire. Based on a 31-year study, we show that relative biomass of the dominant grass, Leymus chinensis, of grasslands in northern China declined dramatically, but only after 21 years of exclusion of fire and grazing. However, aboveground net primary productivity (ANPP) did not decline accordingly due to compensatory responses of several subdominant grass species. The decline in dominance of L. chinensis was not related to gradually changing climate during the same period, whereas experimentally imposed litter removal (simulating fire), mowing (simulating grazing), fire and moderate grazing enhanced dominance of L. chinensis significantly. Thus, our findings show that disturbances can be critical to maintain the dominance of key grass species in semiarid grassland, but that the collapse of a dominant species does not necessarily result in significant change in ANPP if there are species in the community capable of compensating for loss of a dominant. PMID:26388168
Yu, Qiang; Wu, Honghui; Wang, Zhengwen; Flynn, Dan F B; Yang, Hao; Lü, Fumei; Smith, Melinda; Han, Xingguo
2015-09-21
Limitation of disturbances, such as grazing and fire, is a key tool for nature reserve management and ecological restoration. While the role of these disturbances in shaping ecosystem structure and functioning has been intensively studied, less is known about the consequences of long-term prevention of grazing and fire. Based on a 31-year study, we show that relative biomass of the dominant grass, Leymus chinensis, of grasslands in northern China declined dramatically, but only after 21 years of exclusion of fire and grazing. However, aboveground net primary productivity (ANPP) did not decline accordingly due to compensatory responses of several subdominant grass species. The decline in dominance of L. chinensis was not related to gradually changing climate during the same period, whereas experimentally imposed litter removal (simulating fire), mowing (simulating grazing), fire and moderate grazing enhanced dominance of L. chinensis significantly. Thus, our findings show that disturbances can be critical to maintain the dominance of key grass species in semiarid grassland, but that the collapse of a dominant species does not necessarily result in significant change in ANPP if there are species in the community capable of compensating for loss of a dominant.
Population-based 3D genome structure analysis reveals driving forces in spatial genome organization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tjong, Harianto; Li, Wenyuan; Kalhor, Reza
Conformation capture technologies (e.g., Hi-C) chart physical interactions between chromatin regions on a genome-wide scale. However, the structural variability of the genome between cells poses a great challenge to interpreting ensemble-averaged Hi-C data, particularly for long-range and interchromosomal interactions. Here, we present a probabilistic approach for deconvoluting Hi-C data into a model population of distinct diploid 3D genome structures, which facilitates the detection of chromatin interactions likely to co-occur in individual cells. Our approach incorporates the stochastic nature of chromosome conformations and allows a detailed analysis of alternative chromatin structure states. For example, we predict and experimentally confirm the presence of large centromere clusters with distinct chromosome compositions varying between individual cells. The stability of these clusters varies greatly with their chromosome identities. We show that these chromosome-specific clusters can play a key role in the overall chromosome positioning in the nucleus and in stabilizing specific chromatin interactions. By explicitly considering genome structural variability, our population-based method provides an important tool for revealing novel insights into the key factors shaping the spatial genome organization.
Computational Tools for Interpreting Ion Channel pH-Dependence.
Sazanavets, Ivan; Warwicker, Jim
2015-01-01
Activity in many biological systems is mediated by pH, involving proton titratable groups with pKas in the relevant pH range. Experimental analysis of pH-dependence in proteins focusses on particular sidechains, often with mutagenesis of histidine, due to its pKa near to neutral pH. The key question for algorithms that predict pKas is whether they are sufficiently accurate to effectively narrow the search for molecular determinants of pH-dependence. Through analysis of inwardly rectifying potassium (Kir) channels and acid-sensing ion channels (ASICs), mutational effects on pH-dependence are probed, distinguishing between groups described as pH-coupled or pH-sensor. Whereas mutation can lead to a shift in transition pH between open and closed forms for either type of group, only for pH-sensor groups does mutation modulate the amplitude of the transition. It is shown that a hybrid Finite Difference Poisson-Boltzmann (FDPB) - Debye-Hückel continuum electrostatic model can filter mutation candidates, providing enrichment for key pH-coupled and pH-sensor residues in both ASICs and Kir channels, in comparison with application of FDPB alone.
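The notion of a "transition pH" can be grounded in the Henderson-Hasselbalch relation; a sketch showing how shifting a single pKa moves the transition of a coupled observable (illustrative values only, not the channels analyzed above):

```python
import numpy as np

def protonated_fraction(pH, pKa):
    # Henderson-Hasselbalch: fraction of a titratable group that is
    # protonated at a given pH.
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

# Toy picture of the abstract's distinction: shifting a group's pKa moves
# the transition pH of a coupled observable, e.g. an open probability tied
# to deprotonation of a single hypothetical sensor group.
for pKa in (6.0, 6.8):  # "wild-type" vs hypothetical mutant
    pH = np.linspace(4, 9, 6)
    p_open = 1.0 - protonated_fraction(pH, pKa)
    print(f"pKa={pKa}: p_open at pH 4..9 ->", p_open.round(2))
```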
Population-based 3D genome structure analysis reveals driving forces in spatial genome organization
Tjong, Harianto; Li, Wenyuan; Kalhor, Reza; ...
2016-03-07
Conformation capture technologies (e.g., Hi-C) chart physical interactions between chromatin regions on a genome-wide scale. However, the structural variability of the genome between cells poses a great challenge to interpreting ensemble-averaged Hi-C data, particularly for long-range and interchromosomal interactions. Here, we present a probabilistic approach for deconvoluting Hi-C data into a model population of distinct diploid 3D genome structures, which facilitates the detection of chromatin interactions likely to co-occur in individual cells. Our approach incorporates the stochastic nature of chromosome conformations and allows a detailed analysis of alternative chromatin structure states. For example, we predict and experimentally confirm the presence of large centromere clusters with distinct chromosome compositions varying between individual cells. The stability of these clusters varies greatly with their chromosome identities. We show that these chromosome-specific clusters can play a key role in the overall chromosome positioning in the nucleus and in stabilizing specific chromatin interactions. By explicitly considering genome structural variability, our population-based method provides an important tool for revealing novel insights into the key factors shaping the spatial genome organization.
Computational Tools for Interpreting Ion Channel pH-Dependence
Sazanavets, Ivan; Warwicker, Jim
2015-01-01
Activity in many biological systems is mediated by pH, involving proton titratable groups with pKas in the relevant pH range. Experimental analysis of pH-dependence in proteins focusses on particular sidechains, often with mutagenesis of histidine, due to its pKa near to neutral pH. The key question for algorithms that predict pKas is whether they are sufficiently accurate to effectively narrow the search for molecular determinants of pH-dependence. Through analysis of inwardly rectifying potassium (Kir) channels and acid-sensing ion channels (ASICs), mutational effects on pH-dependence are probed, distinguishing between groups described as pH-coupled or pH-sensor. Whereas mutation can lead to a shift in transition pH between open and closed forms for either type of group, only for pH-sensor groups does mutation modulate the amplitude of the transition. It is shown that a hybrid Finite Difference Poisson-Boltzmann (FDPB) – Debye-Hückel continuum electrostatic model can filter mutation candidates, providing enrichment for key pH-coupled and pH-sensor residues in both ASICs and Kir channels, in comparison with application of FDPB alone. PMID:25915903
Mechanistic Design of Chemically Diverse Polymers with Applications in Oral Drug Delivery.
Mosquera-Giraldo, Laura I; Borca, Carlos H; Meng, Xiangtao; Edgar, Kevin J; Slipchenko, Lyudmila V; Taylor, Lynne S
2016-11-14
Polymers play a key role in stabilizing amorphous drug formulations, a recent strategy employed to improve the solubility and bioavailability of orally delivered drugs. However, the molecular mechanism of stabilization is unclear; therefore, the rational design of new crystallization-inhibiting excipients remains a substantial challenge. This article presents a combined experimental and computational approach to elucidate the molecular features that improve the effectiveness of cellulose polymers as solution crystallization inhibitors, a crucial first step toward their rational design. Polymers with chemically diverse substituents including carboxylic acids, esters, ethers, alcohols, amides, amines, and sulfides were synthesized. Measurements of nucleation induction times of the model drug, telaprevir, show that the only effective polymers contained carboxylate groups in combination with an optimal hydrocarbon chain length. Computational results indicate that polymer conformation as well as solvation free energy are important determinants of effectiveness at inhibiting crystallization, and show that simulations are a promising predictive tool in the screening of polymers. This study suggests that polymers need adequate hydrophilicity to promote solvation in an aqueous environment, and sufficient hydrophobic regions to drive interactions with the drug. In particular, the right balance between key substituent groups and hydrocarbon side-chain lengths is needed to create effective materials.
Impact of ADC parameters on linear optical sampling systems
NASA Astrophysics Data System (ADS)
Nguyen, Trung-Hien; Gay, Mathilde; Gomez-Agis, Fausto; Lobo, Sébastien; Sentieys, Olivier; Simon, Jean-Claude; Peucheret, Christophe; Bramerie, Laurent
2017-11-01
Linear optical sampling (LOS), based on the coherent photodetection of an optical signal under test with a low repetition-rate signal originating from a pulsed local oscillator (LO), enables the characterization of the temporal electric field of optical sources. Thanks to this technique, low-speed photodetectors and analog-to-digital converters (ADCs) can be integrated in the LOS system, providing a cost-effective tool for characterizing high-speed signals. However, the impact of photodetector and ADC parameters on such LOS systems has not been explored in detail so far. These parameters, including the integration time of the track-and-hold function, the effective number of bits (ENOB) of the ADC, as well as the combined limited bandwidth of the photodetector and ADC, are experimentally and numerically investigated in a LOS system for the first time. More specifically, by reconstructing 10-Gbit/s non-return-to-zero on-off keying (NRZ-OOK) and 10-Gbaud NRZ-quadrature phase-shift-keying (QPSK) signals, it is shown that a short integration time provides better fidelity in the recovered signal. Furthermore, an ENOB of 6 bits and an ADC bandwidth normalized to the sampling rate of 2.8 are found to be sufficient in order to reliably monitor the considered signals.
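The ENOB finding can be related to the textbook quantization limit SNR ≈ 6.02·ENOB + 1.76 dB. A sketch that quantizes a full-scale tone and measures the ratio (an idealized quantizer, ignoring the track-and-hold and bandwidth effects studied in the paper):

```python
import numpy as np

def measured_snr_db(enob, n=2**16):
    # Quantize a full-scale sine with 2**enob levels and measure the SNR;
    # it should approach the ideal 6.02*ENOB + 1.76 dB rule of thumb.
    t = np.arange(n)
    x = np.sin(2 * np.pi * 1001 * t / n)   # full-scale test tone
    q = 2.0 / 2**enob                       # LSB size on [-1, 1]
    xq = np.clip(np.round(x / q) * q, -1, 1 - q)
    err = xq - x
    return 10 * np.log10(np.mean(x**2) / np.mean(err**2))

for enob in (4, 6, 8):
    print(f"ENOB={enob}: {measured_snr_db(enob):.1f} dB "
          f"(ideal {6.02 * enob + 1.76:.1f} dB)")
```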
SYRCLE’s risk of bias tool for animal studies
2014-01-01
Background. Systematic Reviews (SRs) of experimental animal studies are not yet common practice, but awareness of the merits of conducting such SRs is steadily increasing. As animal intervention studies differ from randomized clinical trials (RCTs) in many aspects, the methodology for SRs of clinical trials needs to be adapted and optimized for animal intervention studies. The Cochrane Collaboration developed a Risk of Bias (RoB) tool to establish consistency and avoid discrepancies in assessing the methodological quality of RCTs. A similar initiative is warranted in the field of animal experimentation. Methods. We provide an RoB tool for animal intervention studies (SYRCLE’s RoB tool). This tool is based on the Cochrane RoB tool and has been adjusted for aspects of bias that play a specific role in animal intervention studies. To enhance transparency and applicability, we formulated signalling questions to facilitate judgment. Results. The resulting RoB tool for animal studies contains 10 entries. These entries are related to selection bias, performance bias, detection bias, attrition bias, reporting bias and other biases. Half these items are in agreement with the items in the Cochrane RoB tool. Most of the variations between the two tools are due to differences in design between RCTs and animal studies. Shortcomings in, or unfamiliarity with, specific aspects of experimental design of animal studies compared to clinical studies also play a role. Conclusions. SYRCLE’s RoB tool is an adapted version of the Cochrane RoB tool. Widespread adoption and implementation of this tool will facilitate and improve critical appraisal of evidence from animal studies. This may subsequently enhance the efficiency of translating animal research into clinical practice and increase awareness of the necessity of improving the methodological quality of animal studies. PMID:24667063
Freely Accessible Chemical Database Resources of Compounds for in Silico Drug Discovery.
Yang, JingFang; Wang, Di; Jia, Chenyang; Wang, Mengyao; Hao, GeFei; Yang, GuangFu
2018-05-07
In silico drug discovery has proved to be a solidly established key component in early drug discovery. However, this task is hampered by limitations in the quantity and quality of compound databases for screening. In order to overcome these obstacles, freely accessible database resources of compounds have bloomed in recent years. Nevertheless, how to choose appropriate tools to treat these freely accessible databases is crucial. To the best of our knowledge, this is the first systematic review on this issue. The advantages and drawbacks of chemical databases were analyzed and summarized, based on the six categories of freely accessible chemical databases collected from the literature in this review. Suggestions on how and under which conditions the usage of these databases could be reasonable were provided. Tools and procedures for building 3D-structure chemical libraries were also introduced. In this review, we described the freely accessible chemical database resources for in silico drug discovery. In particular, the chemical information for building chemical databases appears as an attractive resource for drug design to alleviate experimental pressure. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Review of hardware-in-the-loop simulation and its prospects in the automotive area
NASA Astrophysics Data System (ADS)
Fathy, Hosam K.; Filipi, Zoran S.; Hagena, Jonathan; Stein, Jeffrey L.
2006-05-01
Hardware-in-the-loop (HIL) simulation is rapidly evolving from a control prototyping tool to a system modeling, simulation, and synthesis paradigm synergistically combining many advantages of both physical and virtual prototyping. This paper provides a brief overview of the key enablers and numerous applications of HIL simulation, focusing on its metamorphosis from a control validation tool into a system development paradigm. It then describes a state-of-the-art engine-in-the-loop (EIL) simulation facility that highlights the use of HIL simulation for the system-level experimental evaluation of powertrain interactions and the development of strategies for clean and efficient propulsion. The facility comprises a real diesel engine coupled to accurate real-time driver, driveline, and vehicle models through a highly responsive dynamometer. This enables the verification of both performance and fuel economy predictions of different conventional and hybrid powertrains. Furthermore, the facility can both replicate the highly dynamic interactions occurring within a real powertrain and measure their influence on transient emissions and visual signature through state-of-the-art instruments. The viability of this facility for integrated powertrain system development is demonstrated through a case study exploring the development of advanced High Mobility Multipurpose Wheeled Vehicle (HMMWV) powertrains.
Microstructure Modeling of 3rd Generation Disk Alloy
NASA Technical Reports Server (NTRS)
Jou, Herng-Jeng
2008-01-01
The objective of this initiative, funded by NASA's Aviation Safety Program, is to model, validate, and predict, with high fidelity, the microstructural evolution of third-generation high-refractory Ni-based disk superalloys during heat treating and service conditions. This initiative is a natural extension of the DARPA-AIM (Accelerated Insertion of Materials) initiative with GE/Pratt-Whitney and with other process simulation tools. Strong collaboration with the NASA Glenn Research Center (GRC) is a key component of this initiative, and the focus of this program is on industrially relevant disk alloys and heat treatment processes identified by GRC. Employing QuesTek's Computational Materials Dynamics technology and PrecipiCalc precipitation simulator, physics-based models are being used to achieve high predictive accuracy and precision. Combining these models with experimental data and probabilistic analysis, "virtual alloy design" can be performed. The predicted microstructures can be optimized to promote desirable features and concurrently eliminate nondesirable phases that can limit the reliability and durability of the alloys. The well-calibrated and well-integrated software tools that are being applied under the proposed program will help gas turbine disk alloy manufacturers, processing facilities, and NASA to efficiently and effectively improve the performance of current and future disk materials.
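PrecipiCalc itself is a proprietary multicomponent simulator, but the flavor of precipitation-kinetics modeling can be conveyed with the classical Johnson-Mehl-Avrami-Kolmogorov (JMAK) relation for transformed fraction; the rate constant and exponent below are illustrative values only, not calibrated alloy data:

```python
import math

k, n = 0.002, 2.5   # illustrative JMAK rate constant (1/s) and Avrami exponent

# transformed (precipitated) fraction X(t) = 1 - exp(-(k t)^n)
for t in (60, 300, 1800, 7200):          # isothermal hold times in seconds
    X = 1.0 - math.exp(-((k * t) ** n))
    print(f"t = {t:5d} s   transformed fraction = {X:.3f}")
```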
Supporting and structuring "contributing student pedagogy" in Computer Science curricula
NASA Astrophysics Data System (ADS)
Falkner, Katrina; Falkner, Nickolas J. G.
2012-12-01
Contributing student pedagogy (CSP) builds upon social constructivist and community-based learning principles to create engaging and productive learning experiences. What makes CSP different from other, related, learning approaches is that it involves students both learning from and explicitly valuing the contributions of other students. The creation of such a learning community builds upon established educational psychology that encourages deep learning, reflection and engagement. Our school has recently completed a review and update of its curriculum, incorporating student content-creation and collaboration into the design of key courses across the curriculum. Our experiences, based on several years of experimentation and development, support CSP-based curriculum design to reinforce the value of the student perspective, the clear description of their own transformative pathway to knowledge and the importance of establishing student-to-student networks in which students are active and willing participants. In this paper, we discuss the tools and approaches that we have employed to guide, support and structure student collaboration across a range of courses and year levels. By providing an account of our intentions, our approaches and tools, we hope to provide useful and transferable knowledge that can be readily used by other academics who are considering this approach.
Wang, Qinghua; Ross, Karen E; Huang, Hongzhan; Ren, Jia; Li, Gang; Vijay-Shanker, K; Wu, Cathy H; Arighi, Cecilia N
2017-01-01
Post-translational modifications (PTMs) are one of the main contributors to the diversity of proteoforms in the proteomic landscape. In particular, protein phosphorylation represents an essential regulatory mechanism that plays a role in many biological processes. Protein kinases, the enzymes catalyzing this reaction, are key participants in metabolic and signaling pathways. Their activation or inactivation dictates downstream events: which substrates are modified and the subsequent impact (e.g., activation state, localization, protein-protein interactions (PPIs)). The biomedical literature continues to be the main source of evidence for experimental information about protein phosphorylation. Automatic methods to bring together phosphorylation events and phosphorylation-dependent PPIs can help to summarize current knowledge and to expose hidden connections. In this chapter, we demonstrate two text mining tools, RLIMS-P and eFIP, for the retrieval and extraction of kinase-substrate-site data and phosphorylation-dependent PPIs from the literature. These tools offer several advantages over a literature search in PubMed, as their results are specific to phosphorylation. RLIMS-P and eFIP results can be sorted, organized, and viewed in multiple ways to answer relevant biological questions, and the protein mentions are linked to UniProt identifiers.
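A minimal caricature of this kind of kinase-substrate-site extraction, far simpler than RLIMS-P's actual NLP pipeline and using a made-up example sentence, might look like this:

```python
import re

# toy pattern: "<Kinase> phosphorylates <Substrate> at/on <Ser|Thr|Tyr><position>"
PATTERN = re.compile(
    r"(?P<kinase>\b[A-Z][A-Za-z0-9-]+\b)\s+phosphorylates\s+"
    r"(?P<substrate>\b[A-Z][A-Za-z0-9-]+\b)"
    r"(?:\s+(?:at|on)\s+(?P<site>(?:Ser|Thr|Tyr)-?\d+))?"
)

sentence = "AKT1 phosphorylates FOXO3 at Thr-32, blocking its nuclear import."
m = PATTERN.search(sentence)
if m:
    # prints: AKT1 FOXO3 Thr-32
    print(m.group("kinase"), m.group("substrate"), m.group("site"))
```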
Rogalewicz, Vladimír; Barták, Miroslav
The paper summarizes the criticism of the QALY concept's use in health-economic evaluations, which has been growing stronger in recent years. Despite its limitations, the QALY concept has been routinely used in many countries, including the Czech Republic. However, some states have rejected QALYs as an optimization criterion at the level of their political decisions. The critical reflection concerns both theoretical and experimental issues. Based on a literature review, fundamental arguments against the concept are summarized, and a synthesis of the material objections is presented. The critical arguments focus on the foundations of the QALY concept in economic theory, some ethical principles, inconsistencies and technical imperfections of the quality-of-life measurement tools used in QALY calculations, the substitution rule, differences between various diagnoses, and the disregard of some other important parameters. As a whole, the critics' arguments can be judged as quite strong. The future will show whether the critical arguments summarized in this paper will lead to the development of alternative tools that have the potential to eliminate the imperfections in QALYs, and consequently provide more complete data for the decision process. Key words: cost-effectiveness - health technology assessment - HTA - QALY - utility measure for medical interventions.
Computational crystallization.
Altan, Irem; Charbonneau, Patrick; Snell, Edward H
2016-07-15
Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.
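Most of the predictive tools the article mentions reduce screening results to a binary outcome (crystal or no crystal). A toy version of such a predictor, with hypothetical sequence-derived features and labels and scikit-learn assumed, is sketched below:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# hypothetical features per construct: [pI, GRAVY hydropathy, fraction disordered]
X = np.array([[5.2, -0.4, 0.10],
              [9.1, -0.8, 0.45],
              [6.0, -0.2, 0.05],
              [8.3, -0.9, 0.50]])
y = np.array([1, 0, 1, 0])   # 1 = crystallized, 0 = no crystal (toy labels)

clf = LogisticRegression().fit(X, y)
# estimated probability of crystallization for a new target
print(clf.predict_proba([[6.5, -0.3, 0.12]])[0, 1])
```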
SmedGD 2.0: The Schmidtea mediterranea genome database
Robb, Sofia M.C.; Gotting, Kirsten; Ross, Eric; Sánchez Alvarado, Alejandro
2016-01-01
Planarians have emerged as excellent models for the study of key biological processes such as stem cell function and regulation, axial polarity specification, regeneration, and tissue homeostasis, among others. The most widely used organism for these studies is the free-living flatworm Schmidtea mediterranea. In 2007, the Schmidtea mediterranea Genome Database (SmedGD) was first released to provide a much needed resource for the small, but growing planarian community. SmedGD 1.0 has been a repository for genome sequence, a draft assembly, and related experimental data (e.g., RNAi phenotypes, in situ hybridization images, and differential gene expression results). We report here a comprehensive update to SmedGD (SmedGD 2.0) that aims to expand its role as an interactive community resource. The new database includes more recent and up-to-date transcription data, and provides tools that enhance interconnectivity between different genome assemblies and transcriptomes, including next-generation assemblies for both the sexual and asexual biotypes of S. mediterranea. SmedGD 2.0 (http://smedgd.stowers.org) not only provides significantly improved gene annotations, but also tools for data sharing, attributes that will help both the planarian and biomedical communities to more efficiently mine the genomics and transcriptomics of S. mediterranea. PMID:26138588
Kostal, Jakub; Voutchkova-Kostal, Adelina
2016-01-19
Using computer models to accurately predict toxicity outcomes is considered a major challenge. However, state-of-the-art computational chemistry techniques can now be incorporated into predictive models, supported by advances in mechanistic toxicology and the exponential growth of computing resources witnessed over the past decade. The CADRE (Computer-Aided Discovery and REdesign) platform relies on quantum-mechanical modeling of molecular interactions that represent key biochemical triggers in toxicity pathways. Here, we present an external validation exercise for CADRE-SS, a variant developed to predict the skin sensitization potential of commercial chemicals. CADRE-SS is a hybrid model that evaluates skin permeability using Monte Carlo simulations, assigns reactive centers in a molecule and possible biotransformations via expert rules, and determines reactivity with skin proteins via quantum-mechanical modeling. The results were promising, with an overall concordance of 93% between experimental and predicted values. Comparison to performance metrics yielded by other tools available for this endpoint suggests that CADRE-SS offers distinct advantages for first-round screenings of chemicals and could be used as an in silico alternative to animal tests where permissible by legislative programs.
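Concordance here is simply the fraction of chemicals for which prediction and experiment agree; together with the two error types it can be computed as below (toy labels, not the paper's validation set):

```python
experimental = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]  # 1 = sensitizer (toy labels)
predicted    = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]

tp = sum(e == p == 1 for e, p in zip(experimental, predicted))  # true positives
tn = sum(e == p == 0 for e, p in zip(experimental, predicted))  # true negatives
fp = sum(p == 1 and e == 0 for e, p in zip(experimental, predicted))
fn = sum(p == 0 and e == 1 for e, p in zip(experimental, predicted))

print("concordance =", (tp + tn) / len(experimental))
print("sensitivity =", tp / (tp + fn), "specificity =", tn / (tn + fp))
```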
POD evaluation using simulation: A phased array UT case on a complex geometry part
NASA Astrophysics Data System (ADS)
Dominguez, Nicolas; Reverdy, Frederic; Jenson, Frederic
2014-02-01
The use of Probability of Detection (POD) for NDT performance demonstration is a key link in product lifecycle management. The POD approach applies the given NDT procedure to a series of known flaws in order to estimate the probability of detection as a function of flaw size. A POD is relevant if and only if NDT operations are carried out within the range of variability authorized by the procedure. Such experimental campaigns require datasets large enough to cover the range of variability with sufficient occurrences to build reliable POD statistics, making POD curves expensive to obtain. In the last decade, research activities have been conducted in the USA by the MAPOD group and later in Europe within the SISTAE and PICASSO projects, based on the idea of using models and simulation tools to feed POD estimations. This paper proposes an example of application of POD using simulation on the inspection procedure of a complex (full 3D) geometry part using phased-array ultrasonic testing. It illustrates the methodology and the associated tools developed in the CIVA software. The paper finally provides elements of further progress in the domain.
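For signal-response inspections, the standard "â versus a" POD model fits a log-linear relation between response â and flaw size a and derives the POD curve from the fitted scatter; a minimal sketch with synthetic numbers (not data from the paper) follows:

```python
import numpy as np
from scipy.stats import norm

a    = np.array([0.5, 0.8, 1.0, 1.5, 2.0, 3.0])   # flaw sizes (mm), synthetic
ahat = np.array([0.9, 1.6, 2.1, 3.4, 4.1, 6.2])   # NDT responses, synthetic
threshold = 2.0                                    # detection threshold on ahat

b1, b0 = np.polyfit(np.log(a), np.log(ahat), 1)    # log-linear response model
resid = np.log(ahat) - (b0 + b1 * np.log(a))
sigma = resid.std(ddof=2)                          # scatter about the fit

def pod(flaw_size):
    # POD(a) = P(ahat > threshold) under the fitted lognormal response model
    mu = b0 + b1 * np.log(flaw_size)
    return norm.sf((np.log(threshold) - mu) / sigma)

print(pod(1.0), pod(2.0))  # POD at 1 mm and 2 mm
```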
Bouhifd, Mounir; Beger, Richard; Flynn, Thomas; Guo, Lining; Harris, Georgina; Hogberg, Helena; Kaddurah-Daouk, Rima; Kamp, Hennicke; Kleensang, Andre; Maertens, Alexandra; Odwin-DaCosta, Shelly; Pamies, David; Robertson, Donald; Smirnova, Lena; Sun, Jinchun; Zhao, Liang; Hartung, Thomas
2017-01-01
Summary: Metabolomics promises a holistic phenotypic characterization of biological responses to toxicants. This technology is based on advanced chemical analytical tools with reasonable throughput, including mass spectrometry and NMR. Quality assurance, however – from experimental design, sample preparation, and metabolite identification to bioinformatics data-mining – is urgently needed to assure both the quality of metabolomics data and the reproducibility of biological models. In contrast to microarray-based transcriptomics, where consensus on quality assurance and reporting standards has been fostered over the last two decades, quality assurance of metabolomics is only now emerging. Regulatory use in the safety sciences, and even proper scientific use of these technologies, demands quality assurance. In an effort to promote this discussion, an expert workshop discussed the quality assurance needs of metabolomics. The goals of the workshop were 1) to consider the challenges associated with metabolomics as an emerging science, with an emphasis on its application in toxicology, and 2) to identify the key issues to be addressed in order to establish and implement quality assurance procedures in metabolomics-based toxicology. Consensus has still to be achieved regarding best practices to make sure sound, useful, and relevant information is derived from these new tools. PMID:26536290
Enhancing Biomedical Text Summarization Using Semantic Relation Extraction
Shang, Yue; Li, Yanpeng; Lin, Hongfei; Yang, Zhihao
2011-01-01
Automatic text summarization for a biomedical concept can help researchers to get the key points of a certain topic from a large amount of biomedical literature efficiently. In this paper, we present a method for generating a text summary for a given biomedical concept, e.g., H1N1 disease, from multiple documents based on semantic relation extraction. Our approach includes three stages: 1) we extract semantic relations in each sentence using the semantic knowledge representation tool SemRep; 2) we develop a relation-level retrieval method to select the relations most relevant to each query concept and visualize them in a graphic representation; 3) for relations in the relevant set, we extract informative sentences that can interpret them from the document collection to generate a text summary using an information retrieval based method. Our major focus in this work is to investigate the contribution of semantic relation extraction to the task of biomedical text summarization. The experimental results on summarization for a set of diseases show that the introduction of semantic knowledge improves the performance, and our results are better than those of the MEAD system, a well-known tool for text summarization. PMID:21887336
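The third stage, ranking candidate sentences by how well they cover the relevant semantic relations, can be caricatured in a few lines (made-up relations and sentences; SemRep itself is a much richer extractor):

```python
# toy relation set, as (subject, predicate, object) triples
relations = {("H1N1", "CAUSES", "pneumonia"), ("oseltamivir", "TREATS", "H1N1")}

sentences = [
    "Oseltamivir treats H1N1 infection in most adult patients.",
    "The 2009 outbreak received wide media coverage.",
    "Severe H1N1 cases can cause viral pneumonia.",
]

def score(sentence):
    # count relation arguments mentioned in the sentence
    s = sentence.lower()
    return sum(arg.lower() in s
               for subj, _, obj in relations
               for arg in (subj, obj))

summary = sorted(sentences, key=score, reverse=True)[:2]  # top-2 extract
print(summary)
```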
MoKey: A versatile exergame creator for everyday usage.
Eckert, Martina; López, Marcos; Lázaro, Carlos; Meneses, Juan
2017-11-27
Currently, virtual applications for physical exercise are highly appreciated as rehabilitation instruments. This article presents a middleware called "MoKey" (Motion Keyboard), which converts standard off-the-shelf software into exergames (exercise games). A configurable set of gestures, captured by a motion-capture camera, is translated into the keystrokes required by the chosen software. The present study assesses the tool's usability and viability with a heterogeneous group of 11 participants, aged 5 to 51, with moderate to severe disabilities, most of them wheelchair users. In comparison with FAAST (the Flexible Action and Articulated Skeleton Toolkit), MoKey achieved better results in terms of ease of use and computational load. Its viability as an exergame creation tool was proven with the help of four applications (PowerPoint®, an e-book reader, Skype®, and Tetris). Success rates of up to 91% were achieved, and subjective perception was rated at 4.5 points (on a 0-5 scale). The middleware increases motivation through the use of favorite software and the advantage of exploiting it for exercise. Used together with communication software or online games, social inclusion can be stimulated. Therapists can employ the tool to monitor the correctness and progress of the exercises.
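The core trick, mapping recognized gestures to synthetic keystrokes so that unmodified software becomes an exergame, is easy to sketch. The example below is illustrative only (not MoKey's code); it assumes the pynput library for key injection and stubs out the gesture detection:

```python
from pynput.keyboard import Controller, Key

keyboard = Controller()
GESTURE_TO_KEY = {            # configurable gesture-to-key mapping
    "raise_left_arm":  Key.left,
    "raise_right_arm": Key.right,
    "lean_forward":    Key.up,
}

def detect_gesture():
    return "raise_left_arm"   # stub: a motion-capture camera would supply this

gesture = detect_gesture()
if gesture in GESTURE_TO_KEY:
    key = GESTURE_TO_KEY[gesture]
    keyboard.press(key)       # the foreground application sees a normal keystroke
    keyboard.release(key)
```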
Westergaard, Gregory C; Liv, Chanya; Rocca, Andrea M; Cleveland, Allison; Suomi, Stephen J
2004-01-01
This research examined exchange and value attribution in tufted capuchin monkeys (Cebus apella). We presented subjects with opportunities to obtain various foods and a tool from an experimenter in exchange for the foods or tool in the subjects' possession. The times elapsed before the first chow biscuits were expelled and/or an exchange took place were recorded as the dependent measures. Laboratory chow biscuits, grapes, apples, and a metal bolt (a tool used to probe for syrup) were used as experimental stimuli. The subjects demonstrated the ability to recognize that exchanges could occur when an experimenter was present with a desirable food. Results indicate that subjects exhibited significant variation in their willingness to barter based upon the types of foods that were both in their possession and presented by the experimenter. Subjects more readily traded chow biscuits for fruit, and more readily traded apples for grapes than grapes for apples. During the exchange of tools and food, the subjects preferred the following in descending order when the probing apparatus was baited with sweet syrup: grapes, metal bolts, and chow biscuits. However, when the apparatus was not baited, the values changed to the following in descending order: grapes, chow, and metal bolts. These results indicate that tufted capuchins recognize opportunities to exchange and engage in a simple barter system whereby low-valued foods are readily traded for more highly valued food. Furthermore, these capuchins demonstrate that their value for a tool changes depending upon its utility.
"Key to Freshwater Algae": A Web-Based Tool to Enhance Understanding of Microscopic Biodiversity
ERIC Educational Resources Information Center
Shayler, Hannah A.; Siver, Peter A.
2006-01-01
The Freshwater Ecology Laboratory at Connecticut College has developed an interactive, Web-based identification key to freshwater algal genera using the Lucid Professional and Lucid 3 software developed by the Centre for Biological Information Technology at the University of Queensland, Brisbane, Australia. The "Key to Freshwater Algae"…
Another Strategy for Teaching Histology to A&P Students: Classification versus Memorization.
ERIC Educational Resources Information Center
Bavis, Ryan W.; Seveyka, Jerred; Shigeoka, Cassie A.
2000-01-01
Defines dichotomous keys as common learning tools based on identification rather than memorization. Provides an example of a dichotomous key developed for introducing histology in human anatomy and physiology (A&P) courses and explains how students can use the dichotomous key. Discusses the goals of the exercises and the process of…
ERIC Educational Resources Information Center
Klein, P.; Hirth, M.; Gröber, S.; Kuhn, J.; Müller, A.
2014-01-01
Smartphones and tablets are used as experimental tools and for quantitative measurements in two traditional laboratory experiments for undergraduate physics courses. The Doppler effect is analyzed and the speed of sound is determined with an accuracy of about 5% using ultrasonic frequency and two smartphones, which serve as rotating sound emitter…
3D TRUMP - A GBI launch window tool
NASA Astrophysics Data System (ADS)
Karels, Steven N.; Hancock, John; Matchett, Gary
3D TRUMP is a novel GPS and communications-link software analysis tool developed for the SDIO's Ground-Based Interceptor (GBI) program. Built around a computationally efficient analysis engine, 3D TRUMP provides key GPS-based performance measures for the reentry-vehicle and interceptor trajectories of an entire GBI mission. Algorithms and sample outputs are presented.
Early Learning Assessment Innovation in South Africa: A Locally Appropriate Monitoring Tool
ERIC Educational Resources Information Center
Dawes, Andrew; Biersteker, Linda; Girdwood, Elizabeth; Snelling, Matthew; Tredoux, C. G.
2018-01-01
In 2015, Innovation Edge commissioned the development of South Africa's first national-level preschool child assessment tool. The project's key innovations were that the tool should fairly assess children from across the cultural and socio-economic spectrum, be inexpensive in terms of equipment and administration costs, and be administered in…
Development and Evaluation of Computer-Based Laboratory Practical Learning Tool
ERIC Educational Resources Information Center
Gandole, Y. B.
2006-01-01
Effective evaluation of educational software is a key issue for the successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…
SS-mPMG and SS-GA: tools for finding pathways and dynamic simulation of metabolic networks.
Katsuragi, Tetsuo; Ono, Naoaki; Yasumoto, Keiichi; Altaf-Ul-Amin, Md; Hirai, Masami Y; Sriyudthsak, Kansuporn; Sawada, Yuji; Yamashita, Yui; Chiba, Yukako; Onouchi, Hitoshi; Fujiwara, Toru; Naito, Satoshi; Shiraishi, Fumihide; Kanaya, Shigehiko
2013-05-01
Metabolomics analysis tools can provide quantitative information on the concentration of metabolites in an organism. In this paper, we propose the minimum pathway model generator tool for simulating the dynamics of metabolite concentrations (SS-mPMG) and a tool for parameter estimation by genetic algorithm (SS-GA). SS-mPMG can extract a subsystem of the metabolic network from the genome-scale pathway maps to reduce the complexity of the simulation model and automatically construct a dynamic simulator to evaluate the experimentally observed behavior of metabolites. Using this tool, we show that stochastic simulation can reproduce experimentally observed dynamics of amino acid biosynthesis in Arabidopsis thaliana. In this simulation, SS-mPMG extracts the metabolic network subsystem from published databases. The parameters needed for the simulation are determined using a genetic algorithm to fit the simulation results to the experimental data. We expect that SS-mPMG and SS-GA will help researchers to create relevant metabolic networks and carry out simulations of metabolic reactions derived from metabolomics data.
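SS-mPMG's stochastic simulations follow the standard chemical-kinetics approach; the essential loop of Gillespie's stochastic simulation algorithm, shown here for a toy two-reaction network rather than an extracted pathway, is:

```python
import random

# toy network: S1 -> S2 (rate k1), S2 -> degraded (rate k2)
k1, k2 = 0.1, 0.05
s1, s2, t = 100, 0, 0.0

while t < 50.0:
    a1, a2 = k1 * s1, k2 * s2      # reaction propensities
    a0 = a1 + a2
    if a0 == 0:                    # nothing left to react
        break
    t += random.expovariate(a0)    # exponentially distributed waiting time
    if random.random() * a0 < a1:  # pick which reaction fires
        s1, s2 = s1 - 1, s2 + 1
    else:
        s2 -= 1
print(t, s1, s2)                   # time and final copy numbers
```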
Tools for Embedded Computing Systems Software
NASA Technical Reports Server (NTRS)
1978-01-01
A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis and the key figures of each workshop presentation, together with chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.
Continuous variable quantum key distribution with modulated entangled states.
Madsen, Lars S; Usenko, Vladyslav C; Lassen, Mikael; Filip, Radim; Andersen, Ulrik L
2012-01-01
Quantum key distribution enables two remote parties to grow a shared key, which they can use for unconditionally secure communication over a certain distance. The maximal distance depends on the loss and the excess noise of the connecting quantum channel. Several quantum key distribution schemes based on coherent states and continuous variable measurements are resilient to high loss in the channel, but are strongly affected by small amounts of channel excess noise. Here we propose and experimentally address a continuous variable quantum key distribution protocol that uses modulated fragile entangled states of light to greatly enhance the robustness to channel noise. We experimentally demonstrate that the resulting quantum key distribution protocol can tolerate more noise than the benchmark set by the ideal continuous variable coherent state protocol. Our scheme represents a very promising avenue for extending the distance for which secure communication is possible.
The experimental verification on the shear bearing capacity of exposed steel column foot
NASA Astrophysics Data System (ADS)
Xijin, LIU
2017-04-01
Regarding the shear bearing capacity of exposed steel column feet, much research has been done both at home and abroad. However, most of it is limited to theoretical analysis, and little includes experimental analysis. Using an industrial plant in Beijing as the prototype, this paper designs an experimental model composed of six steel structural members in two groups: three members without a shear key and three members with a shear key. The paper tests the shear bearing capacity of the two groups under different axial forces. The experiments show that the anchor bolts of an exposed steel column foot carry a relatively large share of the shear, which cannot be neglected. The results derived from the calculation methods proposed in this paper for the two configurations match the experimental results for the shear bearing capacity of the steel column foot. The paper also proposes suggestions for revising the Code for Design of Steel Structures with respect to setting a shear key in the steel column foot.
DNA AND PROTEIN RECOVERY FROM WASHED EXPERIMENTAL STONE TOOLS
DNA residues may preserve on ancient stone tools used to process animals. We studied 24 stone tools recovered from the Bugas-Holding site in northwestern Wyoming. Nine tools that yielded DNA included five bifaces, two side scrapers, one end scraper, and one utilized flake. The...
Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu
2013-01-04
Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have already been proposed (e.g., Sequest, OMSSA, X!Tandem, and Mascot), they are mainly based on statistical models that consider only peak matches between experimental and theoretical spectra, not peak intensity information. Moreover, different algorithms give different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and thus enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
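The binomial idea is that matching k of n predicted fragment peaks purely by chance is exceedingly unlikely, so the binomial tail probability yields a match score. A bare-bones version (toy numbers; ProVerB's actual scoring function also weights peak intensities) is:

```python
import math
from scipy.stats import binom

n_theoretical = 20   # fragment peaks predicted for a candidate peptide
k_matched = 12       # predicted peaks found in the experimental spectrum
p_random = 0.05      # chance of matching any one peak at random (toy value)

# binomial tail: probability of >= k_matched matches arising by chance
p_value = binom.sf(k_matched - 1, n_theoretical, p_random)
score = -math.log10(p_value)   # higher score = less likely a random match
print(f"p = {p_value:.2e}, score = {score:.1f}")
```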
The '3Is' of animal experimentation.
2012-05-29
Animal experimentation in scientific research is a good thing: important, increasing and often irreplaceable. Careful experimental design and reporting are at least as important as attention to welfare in ensuring that the knowledge we gain justifies using live animals as experimental tools.
Schulz, Matthias; Short, Michael D; Peters, Gregory M
2012-01-01
Water supply is a key consideration in sustainable urban planning. Ideally, detailed quantitative sustainability assessments are undertaken during the planning stage to inform the decision-making process. In reality, however, the significant time and cost associated with undertaking such detailed environmental and economic assessments is often cited as a barrier to wider implementation of these key decision support tools, particularly for decisions made at the local or regional government level. In an attempt to overcome this barrier of complexity, 4 water service providers in Melbourne, Australia, funded the development of a publicly available streamlined Environmental Sustainability Assessment Tool, which is aimed at a wide range of decision makers to assist them in broadening the type and number of water servicing options that can be considered for greenfield or backlog developments. The Environmental Sustainability Assessment Tool consists of a simple user interface and draws on life cycle inventory data to allow for rapid estimation of the environmental and economic performance of different water servicing scenarios. Scenario options can then be further prioritized by means of an interactive multicriteria analysis. The intent of this article is to identify the key issues to be considered in a streamlined sustainability assessment tool for the urban water industry, and to demonstrate the feasibility of generating accurate life cycle assessments and life cycle costings, using such a tool. We use a real-life case study example consisting of 3 separate scenarios for a planned urban development to show that this kind of tool can emulate life cycle assessments and life cycle costings outcomes obtained through more detailed studies. This simplified approach is aimed at supporting "sustainability thinking" early in the decision-making process, thereby encouraging more sustainable water and sewerage infrastructure solutions.
NASA Astrophysics Data System (ADS)
Equeter, Lucas; Ducobu, François; Rivière-Lorphèvre, Edouard; Abouridouane, Mustapha; Klocke, Fritz; Dehombreux, Pierre
2018-05-01
Industrial concerns arise regarding the significant cost of cutting tools in the machining process. In particular, an improper replacement policy can lead either to scrap or to premature tool replacements that waste serviceable tools. ISO 3685 provides the flank-wear end-of-life criterion. Flank wear is also the nominal type of wear giving the longest tool lifetimes under optimal cutting conditions. Its consequences include poor surface roughness and dimensional discrepancies. To aid the replacement decision process, several tool condition monitoring techniques have been suggested. Force signals have been shown in the literature to be strongly linked with tool flank wear. It can therefore be assumed that force signals are highly relevant for monitoring the condition of cutting tools and providing decision-aid information in the framework of their maintenance and replacement. The objective of this work is to correlate tool flank wear with numerically computed force signals. The present work uses a Finite Element Model with a Coupled Eulerian-Lagrangian approach. The geometry of the tool is changed for different runs of the model in order to obtain results specific to a given level of wear. The model is assessed by comparison with experimental data gathered earlier on fresh tools. Using the model at constant cutting parameters, force signals under different tool wear states are computed for each studied tool geometry. These signals are qualitatively compared with relevant data from the literature. At this point, no quantitative comparison could be performed on worn tools because the reviewed literature does not provide similar studies in this material, either numerical or experimental. Further development of this work should therefore include experimental campaigns aimed at collecting cutting force signals and assessing the numerical results achieved through this work.
A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.
Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy
2016-12-01
Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements was identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
Xu, Wenjun; Tang, Chen; Gu, Fan; Cheng, Jiajia
2017-04-01
Removing the heavy speckle noise from electronic speckle pattern interferometry (ESPI) fringe patterns is a key processing step. Among spatial-domain filtering methods, oriented partial differential equations have been demonstrated to be a powerful tool; among transform-domain filtering methods, the shearlet transform is a state-of-the-art method. In this paper, we propose a filtering method for denoising ESPI fringe patterns that combines a second-order oriented partial differential equation (SOOPDE) with the shearlet transform, named SOOPDE-Shearlet. Here, the shearlet transform is introduced into ESPI fringe-pattern denoising for the first time. This combination takes advantage of the fact that the spatial-domain filtering method SOOPDE and the transform-domain shearlet transform benefit from each other. We test the proposed SOOPDE-Shearlet on five experimentally obtained ESPI fringe patterns of poor quality and compare our method with SOOPDE, the shearlet transform, windowed Fourier filtering (WFF), and coherence-enhancing diffusion (CEDPDE). Among them, WFF and CEDPDE are the state-of-the-art methods for ESPI fringe-pattern denoising in the transform domain and spatial domain, respectively. The experimental results demonstrate the good performance of the proposed SOOPDE-Shearlet.
Carvalho, Diana P S R P; Azevedo, Isabelle C; Cruz, Giovanna K P; Mafra, Gabriela A C; Rego, Anna L C; Vitor, Allyne F; Santos, Viviane E P; Cogo, Ana L P; Ferreira Júnior, Marcos A
2017-10-01
This systematic review identifies the strategies used to promote critical thinking (CT) in undergraduate nursing education. Five electronic databases were searched without language, publication-time or geographic filters, including experimental studies that considered at least one teaching strategy to promote critical thinking in undergraduate nursing students. The search proceeded in three phases: title and abstract review, full-text review, and application of a selection form according to predetermined criteria. All included studies were assessed for quality using a classification tool for experimental studies. Six studies were selected. The results were grouped into three key themes: the quality assessment of the selected studies, the characterization of the studies, and the strategies used to promote critical thinking. All selected studies were in English, with substantial conceptual similarity in their definitions of critical thinking and a predominant focus on clinical nursing education, with emphasis on the nursing process. The most widely used teaching intervention was problem-based learning. Nursing education mediated by strategies that stimulate CT makes a positive difference in undergraduate curricula.
Preliminary Experimental Results for Charge Drag in a Simulated Low Earth Orbit Environment
NASA Astrophysics Data System (ADS)
Azema-Rovira, Monica
Interest in the Low Earth Orbit (LEO) environment is growing in the science community as well as in the private sector. The number of spacecraft launched at these altitudes (150-700 km) keeps growing, and this region is accumulating space debris. In this scenario, precise knowledge of the location of all LEO objects is a key factor in avoiding catastrophic collisions and safely performing station-keeping maneuvers. Detailed study of atmospheric models in LEO can improve the calculation of disturbance forces on an orbiting object. Recent numerical studies indicate that one of the largest non-conservative forces on a spacecraft, the charge drag phenomenon, is underestimated. Validating these numerical models experimentally will help to improve them for future spacecraft mission design. For this reason, the motivation of this thesis is to characterize a plasma source to later be used for charged drag measurements. The characterization was done at the University of Colorado Colorado Springs in the Chamber for Atmospheric and Orbital Space Simulation. In the characterization process, a nano-Newton thrust stand was characterized as a plasma diagnostic tool and compared with Langmuir probe data.
The Dynamics of Drug Resistance: A Mathematical Perspective
Lavi, Orit; Gottesman, Michael M.; Levy, Doron
2012-01-01
Resistance to chemotherapy is a key impediment to successful cancer treatment that has been intensively studied for the last three decades. Several central mechanisms have been identified as contributing to the resistance. In the case of multidrug resistance (MDR), the cell becomes resistant to a variety of structurally and mechanistically unrelated drugs in addition to the drug initially administered. Mathematical models of drug resistance have dealt with many of the known aspects of this field, such as pharmacologic sanctuary and location/diffusion resistance, intrinsic resistance that is therapy independent, therapy-dependent cellular alterations including induced resistance (dose-dependent) and acquired resistance (dose-independent). In addition, there are mathematical models that take into account the kinetic/phase resistance, and models that investigate intra-cellular mechanisms based on specific biological functions (such as ABC transporters, apoptosis and repair mechanisms). This review covers aspects of MDR that have been mathematically studied, and explains how, from a methodological perspective, mathematics can be used to study drug resistance. We discuss quantitative approaches of mathematical analysis, and demonstrate how mathematics can be used in combination with other experimental and clinical tools. We emphasize the potential benefits of integrating analytical and mathematical methods into future clinical and experimental studies of drug resistance. PMID:22387162
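Many of the model classes the review surveys start from coupled growth equations for drug-sensitive and drug-resistant subpopulations. A minimal sketch of such a model (illustrative parameter values only, SciPy assumed) is:

```python
import numpy as np
from scipy.integrate import solve_ivp

r, K, kill, mut = 0.05, 1e9, 0.08, 1e-6   # growth, capacity, drug kill, mutation (toy)

def rhs(t, y):
    S, R = y                               # sensitive and resistant cell counts
    total = S + R
    dS = r * S * (1 - total / K) - kill * S - mut * S
    dR = r * R * (1 - total / K) + mut * S  # resistant cells escape the drug
    return [dS, dR]

sol = solve_ivp(rhs, (0, 365), [1e6, 0.0], t_eval=[0, 90, 180, 365])
print(sol.y[:, -1])   # sensitive vs resistant tumor burden after one year
```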
Control of the Pore Texture in Nanoporous Silicon via Chemical Dissolution.
Secret, Emilie; Wu, Chia-Chen; Chaix, Arnaud; Galarneau, Anne; Gonzalez, Philippe; Cot, Didier; Sailor, Michael J; Jestin, Jacques; Zanotti, Jean-Marc; Cunin, Frédérique; Coasne, Benoit
2015-07-28
The surface and textural properties of porous silicon (pSi) control many of its physical properties essential to its performance in key applications such as optoelectronics, energy storage, luminescence, sensing, and drug delivery. Here, we combine experimental and theoretical tools to demonstrate that the surface roughness at the nanometer scale of pSi can be tuned in a controlled fashion using partial thermal oxidation followed by removal of the resulting silicon oxide layer with hydrofluoric acid (HF) solution. Such a process is shown to smooth the pSi surface by means of nitrogen adsorption, electron microscopy, and small-angle X-ray and neutron scattering. Statistical mechanics Monte Carlo simulations, which are consistent with the experimental data, support the interpretation that the pore surface is initially rough and that the oxidation/oxide removal procedure diminishes the surface roughness while increasing the pore diameter. As a specific example considered in this work, the initial roughness ξ ∼ 3.2 nm of pSi pores having a diameter of 7.6 nm can be decreased to 1.0 nm following the simple procedure above. This study allows envisioning the design of pSi samples with optimal surface properties toward a specific process.
Computational mechanics of viral capsids.
Gibbons, Melissa M; Perotti, Luigi E; Klug, William S
2015-01-01
Viral capsids undergo significant mechanical deformations during their assembly, maturation, and infective life-span. In order to characterize the mechanics of viral capsids, their response to applied external forces is analyzed in several experimental studies using, for instance, Atomic Force Microscope (AFM) indentation experiments. In recent years, a broader approach to study the mechanics of viral capsids has leveraged the theoretical tools proper of continuum mechanics. Even though the theory of continuum elasticity is most commonly used to study deformable bodies at larger macroscopic length scales, it has been shown that this very rich theoretical field can still offer useful insights into the mechanics of viral structures at the nanometer scale. Here we show the construction of viral capsid continuum mechanics models starting from different forms of experimental data. We will discuss the kinematics assumptions, the issue of the reference configuration, the material constitutive laws, and the numerical discretization necessary to construct a complete Finite Element capsid mechanical model. Some examples in the second part of the chapter will show the predictive capabilities of the constructed models and underline useful practical aspects related to efficiency and accuracy. We conclude each example by collecting several key findings discovered by simulating AFM indentation experiments using the constructed numerical models.
Perspectives for on-line analysis of bauxite by neutron irradiation
NASA Astrophysics Data System (ADS)
Beurton, Gabriel; Ledru, Bertrand; Letourneur, Philippe
1995-03-01
The interest in bauxite as a major source of alumina results in a strong demand for on-line instrumentation suitable for sorting, blending, and processing operations at the bauxite mine and for monitoring instrumentation in the Bayer process. The results of laboratory experiments based on neutron interactions with bauxite are described. The technique was chosen in order to overcome the problem of spatial heterogeneity in bulk mineral analysis. The evaluated elements contributed to approximately 99.5% of the sample weight. In addition, the measurements provide valuable information on physical parameters such as density, hygrometry, and material flow. Using a pulsed generator, the analysis system offers potential for on-line measurements (borehole logging or conveyor belt). An overall description of the experimental set-up is given. The experimental data include measurements of natural radioactivity, delayed radioactivity induced by activation, and prompt gamma rays following neutron reaction. In situ applications of neutron interactions provide continuous analysis and produce results which are more statistically significant. The key factors contributing to advances in industrial applications are the development of high count rate gamma spectroscopy and computational tools to design measurement systems and interpret their results.
A Novel Method for Tracking Individuals of Fruit Fly Swarms Flying in a Laboratory Flight Arena
Cheng, Xi En; Qian, Zhi-Ming; Wang, Shuo Hong; Jiang, Nan; Guo, Aike; Chen, Yan Qiu
2015-01-01
The growing interest in studying social behaviours of swarming fruit flies, Drosophila melanogaster, has heightened the need for developing tools that provide quantitative motion data. To achieve such a goal, multi-camera three-dimensional tracking technology is the key experimental gateway. We have developed a novel tracking system for tracking hundreds of fruit flies flying in a confined cubic flight arena. In addition to the proposed tracking algorithm, this work offers additional contributions in three aspects: body detection, orientation estimation, and data validation. To demonstrate the opportunities that the proposed system offers for generating high-throughput quantitative motion data, we conducted experiments on five experimental configurations. We also performed quantitative analysis on the kinematics and the spatial structure and the motion patterns of fruit fly swarms. We found that there exists an asymptotic distance between fruit flies in swarms as the population density increases. Further, we discovered the evidence for repulsive response when the distance between fruit flies approached the asymptotic distance. Overall, the proposed tracking system presents a powerful method for studying flight behaviours of fruit flies in a three-dimensional environment. PMID:26083385
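The swarm-structure analysis hinges on nearest-neighbour distances between tracked flies. Given reconstructed 3D positions, that statistic is nearly a one-liner with a k-d tree (synthetic positions below, not the paper's data):

```python
import numpy as np
from scipy.spatial import cKDTree

# 200 flies in a 0.4 m cubic arena (synthetic positions)
positions = np.random.rand(200, 3) * 0.4

tree = cKDTree(positions)
dists, _ = tree.query(positions, k=2)   # k=2: the first hit is the point itself
print("mean nearest-neighbour distance:", dists[:, 1].mean())
```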
ClinicalKey: a point-of-care search engine.
Vardell, Emily
2013-01-01
ClinicalKey is a new point-of-care resource for health care professionals. Through controlled vocabulary, ClinicalKey offers a cross section of resources on diseases and procedures, from journals to e-books and practice guidelines to patient education. A sample search was conducted to demonstrate the features of the database, and a comparison with similar tools is presented.
Influence of speed on wear and cutting forces in end-milling nickel alloy
NASA Astrophysics Data System (ADS)
Estrems, M.; Sánchez, H. T.; Kurfess, T.; Bunget, C.
2012-04-01
The effect of speed on the flank wear of the cutting tool when milling a nickel alloy is studied. From the analysis of the measured forces, a dynamic semi-experimental model is developed, based on the parallelism between the thrust-force curve of the unworn tool and the curves obtained when the tool flank is worn. Based on the change in contact geometry at the worn flank face, an indentation theory of the tool on the workpiece is formulated such that, upon applying the equations of contact mechanics, a good approximation of the experimental results is obtained.
Steele, James; Ferrari, Pier Francesco; Fogassi, Leonardo
2012-01-01
The papers in this Special Issue examine tool use and manual gestures in primates as a window on the evolution of the human capacity for language. Neurophysiological research has supported the hypothesis of a close association between some aspects of human action organization and of language representation, in both phonology and semantics. Tool use provides an excellent experimental context to investigate analogies between action organization and linguistic syntax. Contributors report and contextualize experimental evidence from monkeys, great apes, humans and fossil hominins, and consider the nature and the extent of overlaps between the neural representations of tool use, manual gestures and linguistic processes. PMID:22106422
Zimmer, Christoph
2016-01-01
Background: Computational modeling is a key technique for analyzing models in systems biology. There are well-established methods for the estimation of kinetic parameters in models of ordinary differential equations (ODEs). Experimental design techniques aim at devising experiments that maximize the information encoded in the data. For ODE models there are well-established approaches for experimental design, and even software tools. However, data from single-cell experiments on signaling pathways in systems biology often show intrinsic stochastic effects, prompting the development of specialized methods. While simulation methods have been developed for decades and parameter estimation has been targeted in recent years, only very few articles focus on experimental design for stochastic models.
Methods: The Fisher information matrix is the central measure for experimental design, as it evaluates the information an experiment provides for parameter estimation. This article suggests an approach to calculating a Fisher information matrix for models containing intrinsic stochasticity and high nonlinearity. The approach makes use of a recently suggested multiple shooting for stochastic systems (MSS) objective function. The Fisher information matrix is calculated by evaluating pseudo data with the MSS technique.
Results: The performance of the approach is evaluated in simulation studies on an Immigration-Death, a Lotka-Volterra, and a Calcium oscillation model. The Calcium oscillation model is a particularly appropriate case study as it contains the challenges inherent to signaling pathways: high nonlinearity, intrinsic stochasticity, a qualitatively different behavior from an ODE solution, and partial observability. The computational speed of the MSS approach for the Fisher information matrix allows for application to models of realistic size.
PMID:27583802
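For deterministic models with Gaussian noise, the Fisher information matrix reduces to F = JᵀJ/σ², with J the sensitivity (Jacobian) of the model output with respect to the parameters. The sketch below computes it by central finite differences for a toy exponential model; this is the classical construction, not the MSS variant the paper develops:

```python
import numpy as np

def model(theta, t):
    a, b = theta
    return a * np.exp(-b * t)            # toy observable

theta0 = np.array([2.0, 0.3])            # nominal parameter values
t = np.linspace(0, 10, 25)               # measurement times
sigma, eps = 0.1, 1e-6                   # noise level, finite-difference step

J = np.empty((t.size, theta0.size))
for j in range(theta0.size):
    dp = np.zeros_like(theta0)
    dp[j] = eps
    J[:, j] = (model(theta0 + dp, t) - model(theta0 - dp, t)) / (2 * eps)

F = J.T @ J / sigma**2                   # Fisher information matrix
print(np.linalg.inv(F))                  # Cramer-Rao bound on parameter covariance
```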
Kunjara, Sirilaksana; Greenbaum, A Leslie; McLean, Patricia; Grønbaek, Henning; Flyvbjerg, Allan
2012-06-01
The availability of growth hormone (GH)-deficient dwarf rats with otherwise normal pituitary function provides a powerful tool to examine the relative role of hyperglycaemia and the reordering of hormonal factors in the hypertrophy-hyperfunction of the adrenal gland that is seen in experimental diabetes. Here, we examine the effects of long-term (6 months) experimental diabetes on the growth of the adrenal glands; their content of phosphoribosyl pyrophosphate (PRPP); and the activity of the PRPP synthetase, G6P dehydrogenase and 6PG dehydrogenase enzymes in GH-deficient dwarf rats compared to heterozygous controls. These parameters were selected in view of the known role of PRPP in both de novo and salvage pathways of purine and pyrimidine synthesis and in the formation of NAD, and in view of the role of the oxidative enzymes of the pentose phosphate pathway in both R5P formation and the generation of the NADPH that is required in reductive synthetic reactions. This study shows that GH deficiency prevents the increase in adrenal gland weight, PRPP synthetase, PRPP content and G6P dehydrogenase and 6PG dehydrogenase. This contrasts sharply with the heterozygous group that showed the expected increase in these parameters. The blood glucose levels of the groups of long-term diabetic rats, both GH-deficient and heterozygous, remained at an elevated level throughout the experiment. These results are fully in accord with earlier evidence from studies with somatostatin analogues which showed that the GH-insulin-like growth factor I (IGF-I)-axis plays a key role in the adrenal diabetic hypertrophy-hyperfunction syndrome.
Overview of the Machine-Tool Task Force
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sutton, G.P.
1981-06-08
The Machine Tool Task Force (MTTF) surveyed the state of the art of machine tool technology for material removal over two and one-half years. This overview gives a brief summary of the approach, the specific subjects covered, the principal conclusions, and some of the key recommendations aimed at improving the technology and advancing the productivity of machine tools. The Task Force consisted of 123 experts from the US and other countries. Their findings are documented in a five-volume report, Technology of Machine Tools.
A Research Writing Tool: Designing an Online Resource for Supervisors and Students
ERIC Educational Resources Information Center
Economou, Dorothy; James, Bronwyn
2017-01-01
This paper presents and discusses the key findings of a needs analysis, and its impact on the design for an innovative online Research Writing Tool, which aims to facilitate higher degree research writing development in medically related fields. Unlike other resources which target students only, this tool aims to support both supervisors and…
Data Visualization: An Exploratory Study into the Software Tools Used by Businesses
ERIC Educational Resources Information Center
Diamond, Michael; Mattia, Angela
2017-01-01
Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…
Machine and Woodworking Tool Safety. Module SH-24. Safety and Health.
ERIC Educational Resources Information Center
Center for Occupational Research and Development, Inc., Waco, TX.
This student module on machine and woodworking tool safety is one of 50 modules concerned with job safety and health. This module discusses specific practices and precautions concerned with the efficient operation and use of most machine and woodworking tools in use today. Following the introduction, 13 objectives (each keyed to a page in the…
Andrew Moldenke; Becky Fichter
1988-01-01
A fully illustrated key is presented for identifying genera of oribatid mites known from or suspected of occurring in the Pacific Northwest. The manual includes an introduction detailing sampling methodology; an illustrated glossary of all terminology used; two color plates of all taxa from the H. J. Andrews Experimental Forest; a diagrammatic key to the 16 major...
NASA Astrophysics Data System (ADS)
Zhang, P. P.; Guo, Y.; Wang, B.
2017-05-01
The main problems in milling difficult-to-machine materials are the high cutting temperature and rapid tool wear, yet tool wear cannot be observed directly during machining. Tool wear and chip formation are two of the most important indicators of machining efficiency and quality. The purpose of this paper is to develop a model relating tool wear to chip formation (chip width and chip radian) in difficult-to-machine materials, so that tool wear can be monitored through chip formation. A milling experiment on a machining centre with three sets of cutting parameters was performed to obtain chip formation and tool wear data. The experimental results show that tool wear increases gradually as cutting proceeds, while chip width and chip radian decrease. The model is developed by fitting the experimental data and applying formula transformations. Most of the monitoring errors for tool wear estimated from chip formation are less than 10%; the smallest error is 0.2%. Overall, the errors based on chip radian are smaller than those based on chip width. This offers a new way to monitor and detect tool wear through chip formation when milling difficult-to-machine materials.
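The paper's model is obtained by curve fitting; one plausible form of such a fit (synthetic numbers, not the paper's data, and not necessarily the authors' chosen functional form) is an exponential relation between chip width and flank wear, fitted in log space:

```python
import numpy as np

chip_width = np.array([2.0, 1.9, 1.8, 1.7, 1.6])        # mm, synthetic measurements
flank_wear = np.array([0.05, 0.09, 0.14, 0.20, 0.27])   # mm (VB), synthetic

# assume wear = exp(b0 + b1 * width); fit the log-transformed data linearly
b1, b0 = np.polyfit(chip_width, np.log(flank_wear), 1)

width_now = 1.75                                         # a newly measured chip width
wear_estimate = np.exp(b0 + b1 * width_now)
print(f"estimated flank wear: {wear_estimate:.3f} mm")
```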
These nuclear data resources also include an interactive chart of nuclides with a level plotting tool; XUNDL (Experimental Unevaluated Nuclear Data List), covering experimental nuclear structure and decay data from more than 2,500 recent references; and CSISRS (alias EXFOR), covering experimental nuclear reaction data.
Gaseous Sulfate Solubility in Glass: Experimental Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bliss, Mary
2013-11-30
Sulfate solubility in glass is a key parameter in many commercial glasses and nuclear waste glasses. This report summarizes key publications specific to sulfate solubility experimental methods and the underlying physical chemistry calculations. The published methods and experimental data are used to verify the calculations in this report and are expanded to a range of current technical interest. The calculations and experimental methods described in this report will guide several experiments on sulfate solubility and saturation for the Hanford Waste Treatment Plant Enhanced Waste Glass Models effort. There are several tables of sulfate gas equilibrium values at high temperature to guide experimental gas mixing and to achieve desired SO3 levels. This report also describes the necessary equipment and best practices to perform sulfate saturation experiments for molten glasses. Results and findings will be published when experimental work is finished and this report is validated from the data obtained.
Space physiology IV: mathematical modeling of the cardiovascular system in space exploration.
Keith Sharp, M; Batzel, Jerry Joseph; Montani, Jean-Pierre
2013-08-01
Mathematical modeling represents an important tool for analyzing cardiovascular function during spaceflight. This review describes how modeling of the cardiovascular system can contribute to space life science research and illustrates this process via modeling efforts to study postflight orthostatic intolerance (POI), a key issue for spaceflight. Examining this application also provides a context for considering broader applications of modeling techniques to the challenges of bioastronautics. POI, which affects a large fraction of astronauts in stand tests upon return to Earth, presents as dizziness, fainting and other symptoms, which can diminish crew performance and cause safety hazards. POI on the Moon or Mars could be more critical. In the field of bioastronautics, POI has been the dominant application of cardiovascular modeling for more than a decade, and a number of mechanisms for POI have been investigated. Modeling approaches include computational models with a range of incorporated factors and hemodynamic sophistication, and also physical models tested in parabolic and orbital flight. Mathematical methods such as parameter sensitivity analysis can help identify key system mechanisms. In the case of POI, this could lead to more effective countermeasures. Validation is a persistent issue in modeling efforts, and key considerations and needs for experimental data to synergistically improve understanding of cardiovascular responses are outlined. Future directions in cardiovascular modeling include subject-specific assessment of system status, as well as research on integrated physiological responses, leading, for instance, to assessment of subject-specific susceptibility to POI or effects of cardiovascular alterations on muscular, vision and cognitive function.
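As a toy illustration of the parameter sensitivity analysis mentioned above, the sketch below applies normalized finite differences to a hypothetical two-compartment pressure model; the equations, parameter values, and output metric are invented stand-ins for the cardiovascular models reviewed:

    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy two-compartment model: arterial and venous pressures coupled
    # through a resistance R, compliance C, and an inflow 'drive'.
    def model(t, p, R, C, drive):
        p_a, p_v = p
        q = (p_a - p_v) / R                 # flow through the resistance
        return [(drive - q) / C, (q - 0.5 * p_v) / C]

    def output(R, C, drive):
        sol = solve_ivp(model, (0, 20), [80.0, 10.0], args=(R, C, drive))
        return sol.y[0, -1]                 # final arterial pressure

    # Normalized finite-difference sensitivities around a nominal point
    nominal = {"R": 1.0, "C": 1.5, "drive": 70.0}
    base = output(**nominal)
    for name, val in nominal.items():
        pert = dict(nominal)
        pert[name] = val * 1.01             # +1% perturbation
        print(name, (output(**pert) - base) / (0.01 * base))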
Mufford, J T; Paetkau, M J; Flood, N J; Regev-Shoshani, G; Miller, C C; Church, J S
2016-08-01
Many behavioral and physiological studies of laboratory mice employ invasive methods such as radio telemetry to measure key aspects of behavior and physiology. Radio telemetry requires surgical implants, which may impact mouse health and behavior, and thus reduce the reliability of the data collected. We developed a method to measure key aspects of thermoregulatory behavior without compromising animal welfare. We examined the thermoregulatory response to heat stress in a custom-built arena that permitted the use of simultaneous and continuous infrared thermography (IRT) and video capture. This allowed us to measure changes in surface body temperature and determine total distance traveled using EthoVision XT animal tracking software. Locomotor activity and surface body temperature differed between heat-stressed mice and mice kept within their thermal comfort zone. The former had an increase in surface body temperature and a decline in locomotor activity, whereas the latter had a stable surface body temperature and showed greater activity levels. Surface body temperature and locomotor activity are conventionally quantified by measurements taken at regular intervals, which limit the use and accuracy of the data. We obtained data of high resolution (i.e., recorded continuously) and accuracy that allowed for the examination of key physiological measurements such as energy expenditure and peripheral vasomotor tone. This novel experimental method for studying thermoregulatory behavior in mice using non-invasive tools has advantages over radio-telemetry and represents an improvement in laboratory animal welfare. Copyright © 2015 Elsevier B.V. All rights reserved.
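Total distance traveled from tracking output reduces to summing frame-to-frame displacements of the animal centroid. A minimal sketch, assuming hypothetical x/y coordinates exported per frame (not EthoVision's actual file format):

    import numpy as np

    def total_distance(x, y):
        """Sum of Euclidean frame-to-frame displacements (units of x, y)."""
        dx, dy = np.diff(np.asarray(x)), np.diff(np.asarray(y))
        return float(np.sum(np.hypot(dx, dy)))

    # Hypothetical centroid track (cm) sampled at a fixed frame rate
    x = [0.0, 1.2, 2.5, 2.9]
    y = [0.0, 0.4, 0.3, 1.1]
    print(total_distance(x, y))  # ~3.46 cm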
Modeling and Tool Wear in Routing of CFRP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.
2011-01-17
This paper presents the prediction and evaluation of feed force in routing of carbon composite material. In order to extend tool life and improve quality of the machined surface, a better understanding of uncoated and coated tool behaviors is required. This work describes (1) the optimization of the geometry of multiple-teeth tools minimizing the tool wear and the feed force, (2) the optimization of tool coating and (3) the development of a phenomenological model between the feed force, the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple-teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve the tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on an abrasive behavior of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.
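A phenomenological model of this kind is often fitted as a power law in the process parameters by linear regression in log space. The sketch below uses invented observations and placeholder exponents, not the paper's fitted model:

    import numpy as np

    # Hypothetical observations: feed rate f (mm/tooth), speed v (m/min),
    # depth of cut d (mm), flank wear w (mm), measured feed force F (N).
    data = np.array([
        [0.05, 100.0, 1.0, 0.05,  62.0],
        [0.10, 100.0, 1.5, 0.10, 118.0],
        [0.05, 150.0, 1.0, 0.15,  75.0],
        [0.10, 150.0, 1.5, 0.20, 150.0],
        [0.08, 120.0, 1.2, 0.12, 101.0],
    ])
    X, y = np.log(data[:, :4]), np.log(data[:, 4])

    # Fit log F = log K + a log f + b log v + c log d + e log w
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    K, (a, b, c, e) = np.exp(coef[0]), coef[1:]
    print(f"F ~ {K:.1f} * f^{a:.2f} * v^{b:.2f} * d^{c:.2f} * w^{e:.2f}")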
Appiah-Brempong, Emmanuel; Okyere, Paul; Owusu-Addo, Ebenezer; Cross, Ruth
2014-01-01
The study sought to assess the effectiveness of Motivational Interviewing (MI) interventions in reducing alcohol consumption among college students, as compared to no intervention or alternative interventions. It also sought to identify the potential moderators to MI intervention effects. Database sources consulted included Cochrane Central Register of Control Trials, PsycINFO, PsycARTICLE, PsycLIT, CINAHL, and MEDLINE. Included studies were (1) underpinned by experimental, quasi-experimental, and nonexperimental designs; (2) studies in which participants were either college males only or females only or both; and (3) studies in which adaptations of MI were based on key MI principles. Excluded studies were (1) non-English language studies; (2) studies not published from 2000-2012; (3) studies in which participants were not college students; (4) studies in which intervention was not delivered by face-to-face approach; and (5) studies that failed to embark on postintervention follow-ups. A total of 115 abstracts were screened. These were narrowed down to 13 studies from which data for the study were extracted. Selected studies were underpinned by experimental, quasi-experimental, and nonexperimental designs. Owing to the heterogeneity in selected studies, a narrative synthesis was used. MI interventions were found to be effective in reducing alcohol consumption among college students, when compared to alternative interventions or no intervention. Potential moderators of MI intervention effects were identified to include practitioner's adherence to MI techniques and individual's drinking motives. MI presents itself as a promising tool that can augment the many existing social-environmental strategies of health promotion.
Ali, A; Carré, A; Hassler, C; Spilka, S; Vanier, A; Barry, C; Berthoz, S
2016-06-01
The prevention of addictions in young people is a challenge for Mental and Public Health policies, and requires specific risk-screening tools. Specific personality traits, as assessed using the Substance Use Risk Profile Scale (SURPS), could play a key role in the onset and escalation of substance use. This study aimed to examine (1) measurement invariance across age and gender and (2) the effects of age and gender on associations between SURPS scores and the most frequently-consumed substances. Analyses were based on the responses from 5069 participants (aged 14-20 years) from the 2011 ESPAD-France dataset. Substance-use outcomes were experimentation and current frequency of alcohol, tobacco and cannabis use, and drunkenness. Our approach, consisting of analysing measurement and structural invariance and interaction terms, established the stability of (i) SURPS profiles, and (ii) relationships between these scores and substance experimentation and use over a developmental period ranging from mid-adolescence to early adulthood. Measurement invariance across genders was also confirmed despite the absence of scalar invariance for 2 items. Significant interactions between gender and SURPS factors were established, highlighting differential vulnerability, especially concerning Hopelessness and experimentation with alcohol and drunkenness, or Impulsivity and tobacco experimentation. Finally, Anxiety Sensitivity could be protective against substance use, especially for cannabis in girls. Our results suggest the relevance of the SURPS to assess vulnerability towards drug use, and underline the need to consider gender differences in addiction risks. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Schlötterer, C; Kofler, R; Versace, E; Tobler, R; Franssen, S U
2015-05-01
Evolve and resequence (E&R) is a new approach to investigate the genomic responses to selection during experimental evolution. By using whole genome sequencing of pools of individuals (Pool-Seq), this method can identify selected variants in controlled and replicable experimental settings. Reviewing the current state of the field, we show that E&R can be powerful enough to identify causative genes and possibly even single-nucleotide polymorphisms. We also discuss how the experimental design and the complexity of the trait could result in a large number of false positive candidates. We suggest experimental and analytical strategies to maximize the power of E&R to uncover the genotype-phenotype link and serve as an important research tool for a broad range of evolutionary questions.
NASA Astrophysics Data System (ADS)
Adesta, Erry Yulian T.; Riza, Muhammad; Avicena
2018-03-01
Tool wear prediction plays a significant role in the machining industry for proper planning and control of machining parameters and optimization of cutting conditions. This paper aims to investigate the effect of tool path strategies, namely contour-in and zigzag, on tool wear during the pocket milling process. The experiments were carried out on a CNC vertical machining centre using PVD-coated carbide inserts. Cutting speed, feed rate and depth of cut were set to vary. For this experiment with three factors at three levels, a Response Surface Method (RSM) design of experiment with a standard Central Composite Design (CCD) was employed. The results indicate that tool wear increases significantly at the higher range of feed per tooth compared to cutting speed and depth of cut. This experimental result is then supported statistically by developing an empirical model. The prediction model for tool wear under the contour-in strategy developed in this research shows good agreement with the experimental work.
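For three factors at three levels, a face-centered central composite design can be generated in a few lines. The factor ranges below are invented for illustration and are not the study's actual cutting conditions:

    import itertools
    import numpy as np

    # Face-centered CCD in coded units for 3 factors:
    # 8 factorial corners + 6 axial (face-center) points + 1 center point
    corners = list(itertools.product([-1, 1], repeat=3))
    axial = [tuple(a * np.eye(3, dtype=int)[i])
             for i in range(3) for a in (-1, 1)]
    design = np.array(corners + axial + [(0, 0, 0)], dtype=float)

    # Map coded levels to hypothetical ranges: cutting speed (m/min),
    # feed per tooth (mm), depth of cut (mm)
    lo = np.array([80.0, 0.05, 0.5])
    hi = np.array([160.0, 0.15, 1.5])
    runs = lo + (design + 1.0) / 2.0 * (hi - lo)
    print(runs.shape)  # (15, 3): fifteen experimental runs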
Refining Pathways: A Model Comparison Approach
Moffa, Giusi; Erdmann, Gerrit; Voloshanenko, Oksana; Hundsrucker, Christian; Sadeh, Mohammad J.; Boutros, Michael; Spang, Rainer
2016-01-01
Cellular signalling pathways consolidate multiple molecular interactions into working models of signal propagation, amplification, and modulation. They are described and visualized as networks. Adjusting network topologies to experimental data is a key goal of systems biology. While network reconstruction algorithms like nested effects models are well established tools of computational biology, their data requirements can be prohibitive for their practical use. In this paper we suggest focussing on well defined aspects of a pathway and develop the computational tools to do so. We adapt the framework of nested effect models to focus on a specific aspect of activated Wnt signalling in HCT116 colon cancer cells: Does the activation of Wnt target genes depend on the secretion of Wnt ligands or do mutations in the signalling molecule β-catenin make this activation independent from them? We framed this question into two competing classes of models: Models that depend on Wnt ligands secretion versus those that do not. The model classes translate into restrictions of the pathways in the network topology. Wnt dependent models are more flexible than Wnt independent models. Bayes factors are the standard Bayesian tool to compare different models fairly on the data evidence. In our analysis, the Bayes factors depend on the number of potential Wnt signalling target genes included in the models. Stability analysis with respect to this number showed that the data strongly favours Wnt ligands dependent models for all realistic numbers of target genes. PMID:27248690
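A Bayes factor compares two model classes through the ratio of their marginal likelihoods. The sketch below estimates a marginal likelihood by simple Monte Carlo averaging over the prior; the Gaussian likelihood and priors are stand-ins, not the nested effects models of the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(0.8, 1.0, size=20)        # toy observations

    def loglik(theta):
        return -0.5 * np.sum((data - theta) ** 2)   # unit-variance Gaussian

    def log_marginal(prior_sampler, n=50_000):
        """Monte Carlo estimate of log E_prior[likelihood]."""
        ll = np.array([loglik(t) for t in prior_sampler(n)])
        m = ll.max()
        return m + np.log(np.mean(np.exp(ll - m)))  # log-sum-exp trick

    # M1: effect present, theta ~ N(0, 1);  M0: no effect, theta = 0
    lm1 = log_marginal(lambda n: rng.normal(0.0, 1.0, n))
    lm0 = loglik(0.0)
    print("log Bayes factor, M1 vs M0:", lm1 - lm0)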
Optimal Design of Multitype Groundwater Monitoring Networks Using Easily Accessible Tools.
Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang
2016-11-01
Monitoring networks are expensive to establish and to maintain. In this paper, we extend an existing data-worth estimation method from the suite of PEST utilities with a global optimization method for optimal sensor placement (called optimal design) in groundwater monitoring networks. Design optimization can include multiple simultaneous sensor locations and multiple sensor types; both location and sensor type are treated simultaneously as decision variables. Our method combines linear uncertainty quantification with a modified genetic algorithm for discrete multilocation, multitype search. The efficiency of the global optimization is enhanced by an archive of past samples and parallel computing. We demonstrate our methodology for a groundwater monitoring network at the Steinlach experimental site, south-western Germany, which has been established to monitor river-groundwater exchange processes. The target of optimization is the best possible exploration for minimum variance in predicting the mean travel time of the hyporheic exchange. Our results demonstrate that the information gain of monitoring network designs can be explored efficiently and with easily accessible tools prior to taking new field measurements or installing additional measurement points. The proposed methods proved to be efficient and can be applied for model-based optimal design of any type of monitoring network in approximately linear systems. Our key contributions are (1) the use of easy-to-implement tools for an otherwise complex task and (2) the consideration of data-worth interdependencies in the simultaneous optimization of multiple sensor locations and sensor types. © 2016, National Ground Water Association.
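The discrete multilocation search can be illustrated with a bare-bones, mutation-only evolutionary loop over candidate sensor sets. The fitness function below is a random placeholder for the variance-reduction criterion; none of PEST's actual data-worth machinery is reproduced:

    import random

    random.seed(1)
    CANDIDATES = list(range(30))    # candidate sensor locations
    K = 5                           # number of sensors to place

    # Placeholder per-location scores; in practice these would come from
    # linear uncertainty analysis of the groundwater model.
    worth = {c: random.random() for c in CANDIDATES}

    def fitness(design):
        return sum(worth[c] for c in design)    # stand-in objective

    def mutate(design):
        out = set(design)
        out.discard(random.choice(list(out)))   # drop one location...
        out.add(random.choice([c for c in CANDIDATES if c not in out]))
        return frozenset(out)                   # ...and add another

    pop = [frozenset(random.sample(CANDIDATES, K)) for _ in range(40)]
    for _ in range(100):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:10]                        # keep the best designs
        pop = elite + [mutate(random.choice(elite)) for _ in range(30)]
    print(sorted(max(pop, key=fitness)))        # best sensor set found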
On Nonlinear Combustion Instability in Liquid Propellant Rocket Motors
NASA Technical Reports Server (NTRS)
Sims, J. D. (Technical Monitor); Flandro, Gary A.; Majdalani, Joseph; Sims, Joseph D.
2004-01-01
All liquid propellant rocket instability calculations in current use have limited value in the predictive sense and serve mainly as a correlating framework for the available data sets. The well-known n-τ model first introduced by Crocco and Cheng in 1956 is still used as the primary analytical tool of this type. A multitude of attempts to establish practical analytical methods have achieved only limited success. These methods usually produce only stability boundary maps that are of little use in making critical design decisions in new motor development programs. Recent progress in understanding the mechanisms of combustion instability in solid propellant rockets provides a firm foundation for a new approach to prediction, diagnosis, and correction of the closely related problems in liquid motor instability. For predictive tools to be useful in the motor design process, they must have the capability to accurately determine: 1) time evolution of the pressure oscillations and limit amplitude, 2) critical triggering pulse amplitude, and 3) unsteady heat transfer rates at injector surfaces and chamber walls. The method described in this paper relates these critical motor characteristics directly to system design parameters. Inclusion of mechanisms such as wave steepening, vorticity production and transport, and unsteady detonation wave phenomena greatly enhances the representation of key features of motor chamber oscillatory behavior. The basic theoretical model is described and preliminary computations are compared to experimental data. A plan to develop the new predictive method into a comprehensive analysis tool is also described.
The Zebrafish Xenograft Platform: Evolution of a Novel Cancer Model and Preclinical Screening Tool.
Wertman, Jaime; Veinotte, Chansey J; Dellaire, Graham; Berman, Jason N
2016-01-01
Animal xenografts of human cancers represent a key preclinical tool in the field of cancer research. While mouse xenografts have long been the gold standard, investigators have begun to use zebrafish (Danio rerio) xenotransplantation as a relatively rapid, robust and cost-effective in vivo model of human cancers. There are several important methodological considerations in the design of an informative and efficient zebrafish xenotransplantation experiment. Various transgenic fish strains have been created that facilitate microscopic observation, ranging from the completely transparent casper fish to the Tg(fli1:eGFP) fish that expresses fluorescent GFP protein in its vascular tissue. While human cancer cell lines have been used extensively in zebrafish xenotransplantation studies, several reports have also used primary patient samples as the donor material. The zebrafish is ideally suited for transplanting primary patient material by virtue of the relatively low number of cells required for each embryo (between 50 and 300 cells), the absence of an adaptive immune system in the early zebrafish embryo, and the short experimental timeframe (5-7 days). Following xenotransplantation into the fish, cells can be tracked using in vivo or ex vivo measures of cell proliferation and migration, facilitated by fluorescence or human-specific protein expression. Importantly, assays have been developed that allow for the reliable detection of in vivo human cancer cell growth or inhibition following administration of drugs of interest. The zebrafish xenotransplantation model is a unique and effective tool for the study of cancer cell biology.
Ward, Adam S.; Kelleher, Christa A.; Mason, Seth J. K.; Wagener, Thorsten; McIntyre, Neil; McGlynn, Brian L.; Runkel, Robert L.; Payn, Robert A.
2017-01-01
Researchers and practitioners alike often need to understand and characterize how water and solutes move through a stream in terms of the relative importance of in-stream and near-stream storage and transport processes. In-channel and subsurface storage processes are highly variable in space and time and difficult to measure. Storage estimates are commonly obtained using transient-storage models (TSMs) of the experimentally obtained solute-tracer test data. The TSM equations represent key transport and storage processes with a suite of numerical parameters. Parameter values are estimated via inverse modeling, in which parameter values are iteratively changed until model simulations closely match observed solute-tracer data. Several investigators have shown that TSM parameter estimates can be highly uncertain. When this is the case, parameter values cannot be used reliably to interpret stream-reach functioning. However, authors of most TSM studies do not evaluate or report parameter certainty. Here, we present a software tool linked to the One-dimensional Transport with Inflow and Storage (OTIS) model that enables researchers to conduct uncertainty analyses via Monte-Carlo parameter sampling and to visualize uncertainty and sensitivity results. We demonstrate application of our tool to 2 case studies and compare our results to output obtained from more traditional implementation of the OTIS model. We conclude by suggesting best practices for transient-storage modeling and recommend that future applications of TSMs include assessments of parameter certainty to support comparisons and more reliable interpretations of transport processes.
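The Monte-Carlo parameter sampling described can be sketched independently of OTIS: draw parameter sets from a prior, run a forward model, and retain 'behavioral' sets whose simulated tracer curves fit the observations. The exponential-decay forward model below is a stand-in for a real transient-storage solution:

    import numpy as np

    rng = np.random.default_rng(42)
    t = np.linspace(0.0, 10.0, 50)
    obs = np.exp(-0.5 * t) + rng.normal(0, 0.02, t.size)  # synthetic tracer data

    def forward(k):
        """Stand-in forward model; a real TSM run would solve the OTIS equations."""
        return np.exp(-k * t)

    # Monte Carlo sampling of the parameter from a broad uniform prior
    ks = rng.uniform(0.01, 2.0, 10_000)
    rmse = np.array([np.sqrt(np.mean((forward(k) - obs) ** 2)) for k in ks])

    behavioral = ks[rmse < 1.5 * rmse.min()]    # loosely 'acceptable' sets
    print(behavioral.size, behavioral.min(), behavioral.max())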
Degradation of metallic materials studied by correlative tomography
NASA Astrophysics Data System (ADS)
Burnett, T. L.; Holroyd, N. J. H.; Lewandowski, J. J.; Ogurreck, M.; Rau, C.; Kelley, R.; Pickering, E. J.; Daly, M.; Sherry, A. H.; Pawar, S.; Slater, T. J. A.; Withers, P. J.
2017-07-01
There is a huge array of characterization techniques available today, and increasingly powerful computing resources allow for the effective analysis and modelling of large datasets. However, each experimental and modelling tool only spans limited time and length scales. Correlative tomography can be thought of as the extension of correlative microscopy into three dimensions, connecting different techniques, each providing different types of information, or covering different time or length scales. Here the focus is on the linking of time-lapse X-ray computed tomography (CT) and serial-section electron tomography using the focussed ion beam (FIB)-scanning electron microscope to study the degradation of metals. Correlative tomography can provide new levels of detail by delivering a multiscale 3D picture of key regions of interest. Specifically, the Xe+ Plasma FIB is used as an enabling tool for large-volume high-resolution serial sectioning of materials, and also as a tool for preparation of microscale test samples and samples for nanoscale X-ray CT imaging. The exemplars presented illustrate general aspects relating to correlative workflows, as well as the time-lapse characterisation of metal microstructures during various failure mechanisms, including ductile fracture of steel and the corrosion of aluminium and magnesium alloys. Correlative tomography is already providing significant insights into materials behaviour, linking together information from different instruments across different scales. Multiscale and multifaceted workflows will become increasingly routine, providing a feed into multiscale materials models as well as illuminating other areas, particularly where hierarchical structures are of interest.
Microstructure Modeling of Third Generation Disk Alloys
NASA Technical Reports Server (NTRS)
Jou, Herng-Jeng
2010-01-01
The objective of this program was to model, validate, and predict the precipitation microstructure evolution, using PrecipiCalc (QuesTek Innovations LLC) software, for 3rd generation Ni-based gas turbine disc superalloys during processing and service, with a set of logical and consistent experiments and characterizations. Furthermore, within this program, the originally research-oriented microstructure simulation tool was to be further improved and implemented to be a useful and user-friendly engineering tool. In this report, the key accomplishments achieved during the third year (2009) of the program are summarized. The activities of this year included: further development of the multistep precipitation simulation framework for gamma prime microstructure evolution during heat treatment; calibration and validation of gamma prime microstructure modeling with supersolvus heat treated LSHR; modeling of the microstructure evolution of the minor phases, particularly carbides, during isothermal aging, representing long-term microstructure stability during thermal exposure; and the implementation of software tools. During the research and development efforts to extend the precipitation microstructure modeling and prediction capability in this 3-year program, we identified a hurdle related to the slow gamma prime coarsening rate, for which no satisfactory scientific explanation is currently available. It is desirable to raise this issue with the Ni-based superalloys research community, in the hope that a mechanistic understanding and physics-based treatment will eventually overcome the hurdle. In the meantime, an empirical correction factor was developed in this modeling effort to capture the experimental observations.
NASA Astrophysics Data System (ADS)
Hockicko, Peter; Krišťák, Ľuboš; Němec, Miroslav
2015-03-01
Video analysis, using the program Tracker (Open Source Physics), in the educational process introduces a new creative method of teaching physics and makes natural sciences more interesting for students. This way of exploring the laws of nature can amaze students because this illustrative and interactive educational software inspires them to think creatively, improves their performance and helps them in studying physics. This paper deals with increasing the key competencies in engineering by analysing real-life situation videos - physical problems - by means of video analysis and the modelling tools using the program Tracker and simulations of physical phenomena from The Physics Education Technology (PhET™) Project (VAS method of problem tasks). The statistical testing using the t-test confirmed the significance of the differences in the knowledge of the experimental and control groups, which were the result of interactive method application.
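The significance test mentioned is a standard two-sample comparison; a minimal sketch with invented score vectors (not the study's data):

    from scipy import stats

    # Hypothetical post-test scores for the two groups (not the study's data)
    experimental = [72, 80, 68, 90, 77, 85, 74, 81]
    control = [65, 70, 58, 75, 69, 66, 71, 62]

    t, p = stats.ttest_ind(experimental, control, equal_var=False)  # Welch's t-test
    print(f"t = {t:.2f}, p = {p:.4f}")  # p < 0.05 suggests a significant difference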
Molecular optoelectronics: the interaction of molecular conduction junctions with light.
Galperin, Michael; Nitzan, Abraham
2012-07-14
The interaction of light with molecular conduction junctions is attracting growing interest as a challenging experimental and theoretical problem on one hand, and because of its potential application as a characterization and control tool on the other. It stands at the interface between two important fields, molecular electronics and molecular plasmonics and has attracted attention as a challenging scientific problem with potentially important technological consequences. Here we review the present state of the art of this field, focusing on several key phenomena and applications: using light as a switching device, using light to control junction transport in the adiabatic and non-adiabatic regimes, light generation in biased junctions and Raman scattering from such systems. This field has seen remarkable progress in the past decade, and the growing availability of scanning tip configurations that can combine optical and electrical probes suggests that further progress towards the goal of realizing molecular optoelectronics on the nanoscale is imminent.
How to design PET experiments to study neurochemistry: application to alcoholism.
Morris, Evan D; Lucas, Molly V; Petrulli, J Ryan; Cosgrove, Kelly P
2014-03-01
Positron Emission Tomography (PET) (and the related Single Photon Emission Computed Tomography) is a powerful imaging tool with a molecular specificity and sensitivity that are unique among imaging modalities. PET excels in the study of neurochemistry in three ways: 1) It can detect and quantify neuroreceptor molecules; 2) it can detect and quantify changes in neurotransmitters; and 3) it can detect and quantify exogenous drugs delivered to the brain. To carry out any of these applications, the user must harness the power of kinetic modeling. Further, the quality of the information gained is only as good as the soundness of the experimental design. This article reviews the concepts behind the three main uses of PET, the rationale behind kinetic modeling of PET data, and some of the key considerations when planning a PET experiment. Finally, some examples of PET imaging related to the study of alcoholism are discussed and critiqued.
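Kinetic modeling of PET data commonly starts from a one-tissue compartment model, dC_t/dt = K1*C_p(t) - k2*C_t(t). A minimal sketch with an illustrative plasma input function and rate constants (not values for any real tracer):

    import numpy as np
    from scipy.integrate import odeint

    t = np.linspace(0.0, 60.0, 241)              # minutes
    c_plasma = 5.0 * t * np.exp(-t / 4.0)        # toy arterial input function

    def one_tissue(ct, ti, K1, k2):
        cp = np.interp(ti, t, c_plasma)          # plasma activity at time ti
        return K1 * cp - k2 * ct                 # dCt/dt

    K1, k2 = 0.1, 0.05                           # illustrative rate constants
    ct = odeint(one_tissue, 0.0, t, args=(K1, k2)).ravel()
    print("peak tissue activity:", ct.max())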
A proto-architecture for innate directionally selective visual maps.
Adams, Samantha V; Harris, Chris M
2014-01-01
Self-organizing artificial neural networks are a popular tool for studying visual system development, in particular the cortical feature maps present in real systems that represent properties such as ocular dominance (OD), orientation-selectivity (OR) and direction selectivity (DS). They are also potentially useful in artificial systems, for example robotics, where the ability to extract and learn features from the environment in an unsupervised way is important. In this computational study we explore a DS map that is already latent in a simple artificial network. This latent selectivity arises purely from the cortical architecture without any explicit coding for DS and prior to any self-organising process facilitated by spontaneous activity or training. We find DS maps with local patchy regions that exhibit features similar to maps derived experimentally and from previous modeling studies. We explore the consequences of changes to the afferent and lateral connectivity to establish the key features of this proto-architecture that support DS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patil, Abhijit A.; Pandey, Yogendra Narayan; Doxastakis, Manolis
2014-10-01
The acid-catalyzed deprotection of glassy poly(4-hydroxystyrene-co-tert-butyl acrylate) films was studied with infrared absorbance spectroscopy and stochastic simulations. Experimental data were interpreted with a simple description of subdiffusive acid transport coupled to second-order acid loss. This model predicts key attributes of observed deprotection rates, such as fast reaction at short times, slow reaction at long times, and a nonlinear dependence on acid loading. Fickian diffusion is approached by increasing the post-exposure bake temperature or adding plasticizing agents to the polymer resin. These findings demonstrate that acid mobility and overall deprotection kinetics are coupled to glassy matrix dynamics. To complement the analysis of bulk kinetics, acid diffusion lengths were calculated from the anomalous transport model and compared with nanopattern line widths. The consistent scaling between experiments and simulations suggests that the anomalous diffusion model could be further developed into a predictive lithography tool.
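Subdiffusive transport of the kind invoked here is often represented by a continuous-time random walk with heavy-tailed waiting times, giving a mean-square displacement that grows as t^alpha with alpha < 1. A toy sketch of that single ingredient (not the paper's full reaction-transport model):

    import numpy as np

    rng = np.random.default_rng(7)
    alpha = 0.7                  # anomalous exponent; alpha < 1 is subdiffusive
    n_walkers, n_steps = 2000, 400

    # Pareto-distributed waiting times give a CTRW with MSD ~ t^alpha
    waits = rng.pareto(alpha, (n_walkers, n_steps)) + 1.0
    times = np.cumsum(waits, axis=1)
    pos = np.cumsum(rng.choice([-1.0, 1.0], (n_walkers, n_steps)), axis=1)

    # Displacement of each walker at a common observation time
    t_obs = np.median(times[:, -1]) / 2.0
    idx = np.clip((times <= t_obs).sum(axis=1) - 1, 0, None)
    msd = np.mean(pos[np.arange(n_walkers), idx] ** 2)
    print(f"MSD at t = {t_obs:.0f}: {msd:.1f} (slower than normal diffusion)")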
A toolbox of immunoprecipitation-grade monoclonal antibodies to human transcription factors.
Venkataraman, Anand; Yang, Kun; Irizarry, Jose; Mackiewicz, Mark; Mita, Paolo; Kuang, Zheng; Xue, Lin; Ghosh, Devlina; Liu, Shuang; Ramos, Pedro; Hu, Shaohui; Bayron Kain, Diane; Keegan, Sarah; Saul, Richard; Colantonio, Simona; Zhang, Hongyan; Behn, Florencia Pauli; Song, Guang; Albino, Edisa; Asencio, Lillyann; Ramos, Leonardo; Lugo, Luvir; Morell, Gloriner; Rivera, Javier; Ruiz, Kimberly; Almodovar, Ruth; Nazario, Luis; Murphy, Keven; Vargas, Ivan; Rivera-Pacheco, Zully Ann; Rosa, Christian; Vargas, Moises; McDade, Jessica; Clark, Brian S; Yoo, Sooyeon; Khambadkone, Seva G; de Melo, Jimmy; Stevanovic, Milanka; Jiang, Lizhi; Li, Yana; Yap, Wendy Y; Jones, Brittany; Tandon, Atul; Campbell, Elliot; Montelione, Gaetano T; Anderson, Stephen; Myers, Richard M; Boeke, Jef D; Fenyö, David; Whiteley, Gordon; Bader, Joel S; Pino, Ignacio; Eichinger, Daniel J; Zhu, Heng; Blackshaw, Seth
2018-03-19
A key component of efforts to address the reproducibility crisis in biomedical research is the development of rigorously validated and renewable protein-affinity reagents. As part of the US National Institutes of Health (NIH) Protein Capture Reagents Program (PCRP), we have generated a collection of 1,406 highly validated immunoprecipitation- and/or immunoblotting-grade mouse monoclonal antibodies (mAbs) to 737 human transcription factors, using an integrated production and validation pipeline. We used HuProt human protein microarrays as a primary validation tool to identify mAbs with high specificity for their cognate targets. We further validated PCRP mAbs by means of multiple experimental applications, including immunoprecipitation, immunoblotting, chromatin immunoprecipitation followed by sequencing (ChIP-seq), and immunohistochemistry. We also conducted a meta-analysis that identified critical variables that contribute to the generation of high-quality mAbs. All validation data, protocols, and links to PCRP mAb suppliers are available at http://proteincapture.org.
Planting molecular functions in an ecological context with Arabidopsis thaliana.
Krämer, Ute
2015-03-25
The vascular plant Arabidopsis thaliana is a central genetic model and universal reference organism in plant and crop science. The successful integration of different fields of research in the study of A. thaliana has made a large contribution to our molecular understanding of key concepts in biology. The availability and active development of experimental tools and resources, in combination with the accessibility of a wealth of cumulatively acquired knowledge about this plant, support the most advanced systems biology approaches among all land plants. Research in molecular ecology and evolution has also brought the natural history of A. thaliana into the limelight. This article showcases our current knowledge of the natural history of A. thaliana from the perspective of the most closely related plant species, providing an evolutionary framework for interpreting novel findings and for developing new hypotheses based on our knowledge of this plant.
Numerical Simulation and Mechanical Design for TPS Electron Beam Position Monitors
NASA Astrophysics Data System (ADS)
Hsueh, H. P.; Kuan, C. K.; Ueng, T. S.; Hsiung, G. Y.; Chen, J. R.
2007-01-01
A comprehensive study of the mechanical design and numerical simulation of high-resolution electron beam position monitors is a key step in building the newly proposed 3rd generation synchrotron radiation research facility, Taiwan Photon Source (TPS). With an advanced electromagnetic simulation tool like MAFIA, tailored specifically for particle accelerators, the design of the high-resolution electron beam position monitors can be tested in such an environment before being tested experimentally. The design goal of our high-resolution electron beam position monitors is to obtain the best resolution through sensitivity and signal optimization. The definitions of, and differences between, resolution and sensitivity of electron beam position monitors are explained, as are the design considerations. A prototype design has been carried out and the related simulations were performed with MAFIA. The results are presented here. A sensitivity as high as 200 has been achieved in the x direction at 500 MHz.
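The position read-out of a button-type beam position monitor is conventionally formed from the difference-over-sum of opposing electrode signals, x ≈ (1/S_x)(A - B)/(A + B), where S_x is the sensitivity being optimized. A minimal sketch with made-up signal amplitudes:

    def bpm_position(v_a, v_b, sensitivity):
        """Difference-over-sum position estimate; sensitivity in 1/mm."""
        return (v_a - v_b) / (v_a + v_b) / sensitivity

    # Hypothetical 500 MHz button amplitudes (arbitrary units)
    print(bpm_position(1.05, 0.95, sensitivity=0.2))  # 0.25 mm offset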
Advances in Light Microscopy for Neuroscience
Wilt, Brian A.; Burns, Laurie D.; Ho, Eric Tatt Wei; Ghosh, Kunal K.; Mukamel, Eran A.
2010-01-01
Since the work of Golgi and Cajal, light microscopy has remained a key tool for neuroscientists to observe cellular properties. Ongoing advances have enabled new experimental capabilities using light to inspect the nervous system across multiple spatial scales, including ultrastructural scales finer than the optical diffraction limit. Other progress permits functional imaging at faster speeds, at greater depths in brain tissue, and over larger tissue volumes than previously possible. Portable, miniaturized fluorescence microscopes now allow brain imaging in freely behaving mice. Complementary progress on animal preparations has enabled imaging in head-restrained behaving animals, as well as time-lapse microscopy studies in the brains of live subjects. Mouse genetic approaches permit mosaic and inducible fluorescence-labeling strategies, whereas intrinsic contrast mechanisms allow in vivo imaging of animals and humans without use of exogenous markers. This review surveys such advances and highlights emerging capabilities of particular interest to neuroscientists. PMID:19555292
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pern, F.J.; Glick, S.H.; Czanderna, A.W.
The stabilization effects of various superstrate materials against UV-induced EVA discoloration and the effect of photocurrent enhancement by white light-reflecting substrates are summarized. Based on the results, some alternative PV module encapsulation schemes are proposed for improved module performance, where the current or modified formulations of EVA encapsulants still can be used so that the typical processing tools and conditions need not be changed significantly. The schemes are designed in an attempt to eliminate or minimize the EVA yellow-browning and to increase the module power output. Four key experimental results from the studies of EVA discoloration and encapsulation are to employ: (1) UV-absorbing (filtering) glasses as superstrates to protect EVA from UV-induced discoloration, (2) gas-permeable polymer films as superstrates and/or substrates to prevent EVA yellowing by permitting photobleaching reactions, (3) modified EVA formulations, and (4) internal reflection of the light by white substrates. © 1996 American Institute of Physics.
The Regulatory Function of Eosinophils
Wen, Ting; Rothenberg, Marc E.
2016-01-01
Eosinophils are a minority circulating granulocyte classically viewed as being involved in host defense against parasites and promoting allergic reactions. However, a series of new regulatory functions for these cells have been identified in the past decade. During homeostasis, eosinophils develop in the bone marrow and migrate from the blood into target tissues following an eotaxin gradient, with IL-5 being a key cytokine for eosinophil proliferation, survival and priming. In multiple target tissues, eosinophils actively regulate a variety of immune functions through their vast arsenal of granule products and cytokines, as well as direct cellular interaction with cells in proximity. The immunologic regulation of eosinophils extends from innate immunity to adaptive immunity and also involves non-immune cells. Herein, we summarize recent findings regarding novel roles of murine and human eosinophils focused on interactions with other hematopoietic cells. We also review new experimental tools available and remaining questions to uncover a greater understanding of this enigmatic cell. PMID:27780017
Jamshidian, Farid; Hubbard, Alan E; Jewell, Nicholas P
2014-06-01
There is a rich literature on the role of placebos in experimental design and evaluation of therapeutic agents or interventions. The importance of masking participants, investigators and evaluators to treatment assignment (treatment or placebo) has long been stressed as a key feature of a successful trial design. Nevertheless, there is considerable variability in the technical definition of the placebo effect and the impact of treatment assignments being unmasked. We suggest a formal concept of a 'perception effect' and define unmasking and placebo effects in the context of randomised trials. We employ modern tools from causal inference to derive semi-parametric estimators of such effects. The methods are illustrated on a motivating example from a recent pain trial where the occurrence of treatment-related side effects acts as a proxy for unmasking. © The Author(s) 2011 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Indirect human impacts turn off reciprocal feedbacks and decrease ecosystem resilience.
Bertness, Mark D; Brisson, Caitlin P; Crotty, Sinead M
2015-05-01
Creek bank salt marsh die-off is a conservation problem in New England, driven by predator depletion, which releases herbivores from consumer control. Many marshes, however, have begun to recover from die-off. We examined the hypothesis that the loss of the foundation species Spartina alterniflora has decreased facilitator populations, weakening reciprocal positive plant/animal feedbacks, resilience, and slowing recovery. Field surveys and experiments revealed that loss of Spartina leads to decreased biodiversity, and increased mortality and decreased growth of the ribbed mussel Geukensia demissa, a key facilitator of Spartina. Experimental addition of Geukensia facilitators to creek banks accelerated Spartina recovery, showing that their loss limits recovery and the reciprocal feedbacks that drive community resilience. Reciprocal positive feedbacks involving foundation species, often lost to human impacts, may be a common, but generally overlooked mechanism of ecosystem resilience, making their reestablishment a valuable restoration tool.
Endocannabinoid signaling and synaptic function
Castillo, Pablo E.; Younts, Thomas J.; Chávez, Andrés E.; Hashimotodani, Yuki
2012-01-01
Endocannabinoids are key modulators of synaptic function. By activating cannabinoid receptors expressed in the central nervous system, these lipid messengers can regulate several neural functions and behaviors. As experimental tools advance, the repertoire of known endocannabinoid-mediated effects at the synapse, and their underlying mechanism, continues to expand. Retrograde signaling is the principal mode by which endocannabinoids mediate short- and long-term forms of plasticity at both excitatory and inhibitory synapses. However, growing evidence suggests that endocannabinoids can also signal in a non-retrograde manner. In addition to mediating synaptic plasticity, the endocannabinoid system is itself subject to plastic changes. Multiple points of interaction with other neuromodulatory and signaling systems have now been identified. Synaptic endocannabinoid signaling is thus mechanistically more complex and diverse than originally thought. In this review, we focus on new advances in endocannabinoid signaling and highlight their role as potent regulators of synaptic function in the mammalian brain. PMID:23040807
Fundamentals of bipolar high-frequency surgery.
Reidenbach, H D
1993-04-01
In endoscopic surgery a very precise surgical dissection technique and an efficient hemostasis are of decisive importance. The bipolar technique may be regarded as a method which satisfies both requirements, especially regarding a high safety standard in application. In this context the biophysical and technical fundamentals of this method, which have been known in principle for a long time, are described with regard to the special demands of a newly developed field of modern surgery. After classification of this method into a general and a quasi-bipolar mode, various technological solutions of specific bipolar probes, in a strict and in a generalized sense, are characterized in terms of indication. Experimental results obtained with different bipolar instruments and probes are given. The application of modern microprocessor-controlled high-frequency surgery equipment and, wherever necessary, the integration of additional ancillary technology into the specialized bipolar instruments may result in most useful and efficient tools of a key technology in endoscopic surgery.
NASA Astrophysics Data System (ADS)
Callewaert, Vincent; Shastry, K.; Saniz, Rolando; Makkonen, Ilja; Barbiellini, Bernardo; Assaf, Badih A.; Heiman, Donald; Moodera, Jagadeesh S.; Partoens, Bart; Bansil, Arun; Weiss, A. H.
2016-09-01
Topological insulators are attracting considerable interest due to their potential for technological applications and as platforms for exploring wide-ranging fundamental science questions. In order to exploit, fine-tune, control, and manipulate the topological surface states, spectroscopic tools which can effectively probe their properties are of key importance. Here, we demonstrate that positrons provide a sensitive probe for topological states and that the associated annihilation spectrum provides a technique for characterizing these states. Firm experimental evidence for the existence of a positron surface state near Bi2Te2Se with a binding energy of Eb=2.7 ±0.2 eV is presented and is confirmed by first-principles calculations. Additionally, the simulations predict a significant signal originating from annihilation with the topological surface states and show the feasibility to detect their spin texture through the use of spin-polarized positron beams.
Gapinski, Mary Ann; Sheetz, Anne H
2014-10-01
The National Association of School Nurses' research priorities include the recommendation that data reliability, quality, and availability be addressed to advance research in child and school health. However, identifying a national school nursing data set has remained a challenge for school nurses, school nursing leaders, school nurse professional organizations, and state school nurse consultants. While there is much agreement that school nursing data (with associated data integrity) is an incredibly powerful tool for multiple uses, the content of a national data set must be developed. In 1993, recognizing the unique power of data, Massachusetts began addressing the need for consistent school nurse data collection. With more than 20 years' experience--and much experimentation, pilot testing, and system modification--Massachusetts is now ready to share its data collection system and certain key indicators with other states, thus offering a beginning foundation for a national school nursing data set. © The Author(s) 2014.
Challenges and Opportunities in Interdisciplinary Materials Research Experiences for Undergraduates
NASA Astrophysics Data System (ADS)
Vohra, Yogesh; Nordlund, Thomas
2009-03-01
The University of Alabama at Birmingham (UAB) offers a broad range of interdisciplinary materials research experiences to undergraduate students with diverse backgrounds in physics, chemistry, applied mathematics, and engineering. The research projects offered cover a broad range of topics including high pressure physics, microelectronic materials, nano-materials, laser materials, bioceramics and biopolymers, cell-biomaterials interactions, planetary materials, and computer simulation of materials. The students welcome the opportunity to work with an interdisciplinary team of basic science, engineering, and biomedical faculty, but the challenge lies in learning the key vocabulary for interdisciplinary collaborations, mastering experimental tools, and working in an independent capacity. The career development workshops dealing with the graduate school application process and entrepreneurial business activities were found to be most effective. The interdisciplinary university-wide poster session helped students broaden their horizons in research careers. The synergy of the REU program with other concurrently running high school summer programs on the UAB campus will also be discussed.
Atomistic simulations of carbon diffusion and segregation in liquid silicon
NASA Astrophysics Data System (ADS)
Luo, Jinping; Alateeqi, Abdullah; Liu, Lijun; Sinno, Talid
2017-12-01
The diffusivity of carbon atoms in liquid silicon and their equilibrium distribution between the silicon melt and crystal phases are key, but unfortunately not precisely known parameters for the global models of silicon solidification processes. In this study, we apply a suite of molecular simulation tools, driven by multiple empirical potential models, to compute diffusion and segregation coefficients of carbon at the silicon melting temperature. We generally find good consistency across the potential model predictions, although some exceptions are identified and discussed. We also find good agreement with the range of available experimental measurements of segregation coefficients. However, the carbon diffusion coefficients we compute are significantly lower than the values typically assumed in continuum models of impurity distribution. Overall, we show that currently available empirical potential models may be useful, at least semi-quantitatively, for studying carbon (and possibly other impurity) transport in silicon solidification, especially if a multi-model approach is taken.
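Diffusivities of this kind are typically extracted from molecular-dynamics trajectories via the Einstein relation, D = lim MSD(t)/(6t) in three dimensions. A minimal sketch on a synthetic random-walk trajectory standing in for actual MD output:

    import numpy as np

    rng = np.random.default_rng(3)
    dt = 1e-3                                   # ps per step (illustrative)
    # Synthetic 3D random walk standing in for one carbon atom's MD track (nm)
    traj = np.cumsum(rng.normal(0.0, 0.05, (10_000, 3)), axis=0)

    # Mean-square displacement versus lag time
    lags = np.arange(1, 2000, 50)
    msd = np.array([np.mean(np.sum((traj[l:] - traj[:-l]) ** 2, axis=1))
                    for l in lags])

    # Einstein relation in 3D: MSD = 6 D t, so D is the fitted slope / 6
    D = np.polyfit(lags * dt, msd, 1)[0] / 6.0
    print(f"D ~ {D:.3f} nm^2/ps")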
Shock waves in aviation security and safety
NASA Astrophysics Data System (ADS)
Settles, G. S.; Keane, B. T.; Anderson, B. W.; Gatto, J. A.
Accident investigations such as of Pan Am 103 and TWA 800 reveal the key role of shock-wave propagation in destroying the aircraft when an on-board explosion occurs. This paper surveys shock wave propagation inside an aircraft fuselage, caused either by a terrorist device or by accident, and provides some new experimental results. While aircraft-hardening research has been under way for more than a decade, no such experiments to date have used the crucial tool of high-speed optical imaging to visualize shock motion. Here, Penn State's Full-Scale Schlieren flow visualization facility yields the first shock-motion images in aviation security scenarios: 1) Explosions beneath full-size aircraft seats occupied by mannequins, 2) Explosions inside partially-filled luggage containers, and 3) Luggage-container explosions resulting in hull-holing. Both single-frame images and drum-camera movies are obtained. The implications of these results are discussed, though the overall topic must still be considered in its infancy.
Tong, Xuming; Chen, Jinghang; Miao, Hongyu; Li, Tingting; Zhang, Le
2015-01-01
Agent-based models (ABM) and differential equations (DE) are two commonly used methods for immune system simulation. However, it is difficult for ABM to estimate key parameters of the model by incorporating experimental data, whereas the differential equation model is incapable of describing the complicated immune system in detail. To overcome these problems, we developed an integrated ABM regression model (IABMR). It combines the advantages of ABM and DE by employing ABM to mimic the multi-scale immune system with various phenotypes and types of cells, and by using the input and output of the ABM to build a Loess regression for key parameter estimation. Next, we employed a greedy algorithm to estimate the key parameters of the ABM with respect to the same experimental data set and used the ABM to describe a 3D immune system similar to previous studies that employed the DE model. These results indicate that IABMR not only has the potential to simulate the immune system at various scales, phenotypes and cell types, but can also accurately infer key parameters as a DE model does. Therefore, this study develops a complex-system modelling mechanism that can simulate the complicated immune system in detail, as ABM does, and validate the reliability and efficiency of the model by fitting experimental data, as DE does. PMID:26535589
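The regression step, fitting a smooth surrogate of ABM output as a function of a key parameter and then searching it, can be sketched with a Loess smoother. The quadratic toy ABM, noise level, and target value below are placeholders:

    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    rng = np.random.default_rng(5)

    def toy_abm(k):
        """Stand-in for an ABM run: noisy response to a key parameter k."""
        return -(k - 0.6) ** 2 + rng.normal(0.0, 0.01)

    # Sample ABM input/output pairs, then fit a Loess surrogate
    ks = np.linspace(0.0, 1.0, 60)
    ys = np.array([toy_abm(k) for k in ks])
    fit = lowess(ys, ks, frac=0.3)      # returns columns (k, smoothed output)

    # Pick the parameter whose surrogate output is closest to a target value
    target = 0.0                        # e.g. an observed experimental output
    best_k = fit[np.argmin(np.abs(fit[:, 1] - target)), 0]
    print(f"estimated key parameter: {best_k:.2f}")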
This procedure is designed to support the collection of potentially responsive information using automated E-Discovery tools that rely on keywords, key phrases, index queries, or other technological assistance to retrieve Electronically Stored Information
Model-based high-throughput design of ion exchange protein chromatography.
Khalaf, Rushd; Heymann, Julia; LeSaout, Xavier; Monard, Florence; Costioli, Matteo; Morbidelli, Massimo
2016-08-12
This work describes the development of a model-based high-throughput design (MHD) tool for the operating space determination of a chromatographic cation-exchange protein purification process. Based on a previously developed thermodynamic mechanistic model, the MHD tool generates a large amount of system knowledge and thereby permits minimizing the required experimental workload. In particular, each new experiment is designed to generate information needed to help refine and improve the model. Unnecessary experiments that do not increase system knowledge are avoided. Instead of aspiring to a perfectly parameterized model, the goal of this design tool is to use early model parameter estimates to find interesting experimental spaces, and to refine the model parameter estimates with each new experiment until a satisfactory set of process parameters is found. The MHD tool is split into four sections: (1) prediction, high throughput experimentation using experiments in (2) diluted conditions and (3) robotic automated liquid handling workstations (robotic workstation), and (4) operating space determination and validation. (1) Protein and resin information, in conjunction with the thermodynamic model, is used to predict protein resin capacity. (2) The predicted model parameters are refined based on gradient experiments in diluted conditions. (3) Experiments on the robotic workstation are used to further refine the model parameters. (4) The refined model is used to determine operating parameter space that allows for satisfactory purification of the protein of interest on the HPLC scale. Each section of the MHD tool is used to define the adequate experimental procedures for the next section, thus avoiding any unnecessary experimental work. We used the MHD tool to design a polishing step for two proteins, a monoclonal antibody and a fusion protein, on two chromatographic resins, in order to demonstrate it has the ability to strongly accelerate the early phases of process development. Copyright © 2016 Elsevier B.V. All rights reserved.
Lodewyck, Jérôme; Debuisschert, Thierry; García-Patrón, Raúl; Tualle-Brouri, Rosa; Cerf, Nicolas J; Grangier, Philippe
2007-01-19
An intercept-resend attack on a continuous-variable quantum-key-distribution protocol is investigated experimentally. By varying the interception fraction, one can implement a family of attacks where the eavesdropper totally controls the channel parameters. In general, such attacks add excess noise in the channel, and may also result in non-Gaussian output distributions. We implement and characterize the measurements needed to detect these attacks, and evaluate experimentally the information rates available to the legitimate users and the eavesdropper. The results are consistent with the optimality of Gaussian attacks resulting from the security proofs.
Safety with Hand and Portable Power Tools. Module SH-14. Safety and Health.
ERIC Educational Resources Information Center
Center for Occupational Research and Development, Inc., Waco, TX.
This student module on safety with hand and portable power tools is one of 50 modules concerned with job safety and health. This module discusses the proper use and maintenance of tools, including the need for protective equipment for the worker. Following the introduction, 16 objectives (each keyed to a page in the text) the student is expected…
ERIC Educational Resources Information Center
Hope, Kempe Ronald, Sr.
2013-01-01
The purpose of this article is to provide an assessment and analysis of public sector performance contracting as a performance management tool in Kenya. It aims to demonstrate that performance contracting remains a viable and important tool for improving public sector performance as a key element of the on-going public sector transformation…
A prototype system for perinatal knowledge engineering using an artificial intelligence tool.
Sokol, R J; Chik, L
1988-01-01
Though several perinatal expert systems are extant, the use of artificial intelligence has, as yet, had minimal impact in medical computing. In this evaluation of the potential of AI techniques in the development of a computer-based "Perinatal Consultant," a "top-down" approach to the development of a perinatal knowledge base was taken, using as a source for such a knowledge base a 30-page manuscript of a chapter concerning high-risk pregnancy. The UNIX utility "style" was used to parse sentences and obtain key words and phrases, both as part of a natural language interface and to identify key perinatal concepts. Compared with the "gold standard" of sentences containing key facts as chosen by the experts, a semiautomated method using a nonmedical speller to identify key words and phrases in context functioned with a sensitivity of 79%, i.e., approximately 8 in 10 key sentences were detected as the basis for PROLOG rules and facts for the knowledge base. These encouraging results suggest that functional perinatal expert systems may well be expedited by using programming utilities in conjunction with AI tools and published literature.
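The reported 79% figure is the usual sensitivity (recall) of the extraction step: key sentences detected divided by all expert-marked key sentences. A one-line sketch with hypothetical sentence-ID sets:

    def sensitivity(detected, gold):
        """Recall: fraction of expert-marked key sentences that were detected."""
        return len(detected & gold) / len(gold)

    gold = set(range(100))              # sentence IDs marked key by experts
    detected = set(range(79)) | {150}   # IDs flagged by the keyword method
    print(f"{sensitivity(detected, gold):.0%}")  # 79%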
Will, Thorsten; Helms, Volkhard
2017-04-04
Differential analysis of cellular conditions is a key approach towards understanding the consequences and driving causes behind biological processes such as developmental transitions or diseases. The progress of whole-genome expression profiling has made it convenient to capture the state of a cell's transcriptome and to detect the characteristic features that distinguish cells in specific conditions. In contrast, mapping the physical protein interactome for many samples is experimentally infeasible at the moment. For an understanding of the whole system, however, it is equally important to know how the interactions of proteins are rewired between cellular states. To overcome this deficiency, we recently showed how condition-specific protein interaction networks that even consider alternative splicing can be inferred from transcript expression data. Here, we present the differential network analysis tool PPICompare, which was specifically designed for isoform-sensitive protein interaction networks. Besides detecting significant rewiring events between the interactomes of grouped samples, PPICompare infers which alterations to the transcriptome caused each rewiring event and the minimal set of alterations necessary to explain all between-group changes. When applied to the development of blood cells, we verified that a reasonable number of rewiring events were reported by the tool and found that differential gene expression was the major determinant of cellular adjustments to the interactome. Alternative splicing events were consistently necessary in each developmental step to explain all significant alterations and were especially important for rewiring in the context of transcriptional control. Applying PPICompare enabled us to investigate the dynamics of the human protein interactome during developmental transitions. A platform-independent implementation of the tool PPICompare is available at https://sourceforge.net/projects/ppicompare/.
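At its core, detecting rewiring between two groups of condition-specific networks reduces to comparing how often each edge occurs in each group. A minimal sketch of that idea; the frequency-shift threshold is an illustrative stand-in for PPICompare's actual statistical test.

    # Hedged sketch: edges whose occurrence differs strongly between two groups
    # of condition-specific networks (each network is a set of edges).
    def rewired_edges(group_a, group_b, min_shift=0.5):
        def freq(edge, group):
            return sum(edge in net for net in group) / len(group)
        edges = set().union(*group_a, *group_b)
        return {e: (freq(e, group_a), freq(e, group_b)) for e in edges
                if abs(freq(e, group_a) - freq(e, group_b)) >= min_shift}

    a = [{("P1", "P2"), ("P2", "P3")}, {("P1", "P2")}]  # samples, condition A
    b = [{("P2", "P3")}, {("P2", "P3"), ("P3", "P4")}]  # samples, condition B
    print(rewired_edges(a, b))  # ('P1','P2') lost; ('P2','P3'), ('P3','P4') gained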
Ruiz, Daniel; Poveda, Germán; Vélez, Iván D; Quiñones, Martha L; Rúa, Guillermo L; Velásquez, Luz E; Zuluaga, Juan S
2006-01-01
Background Malaria has recently re-emerged as a public health burden in Colombia. Although the problem seems to be climate-driven, there remain significant knowledge gaps in the understanding of the complexity of malaria transmission, which have motivated attempts to develop a comprehensive model. Methods The mathematical tool was applied to represent Plasmodium falciparum malaria transmission in two endemic areas. Entomological exogenous variables were estimated through field campaigns and laboratory experiments. The availability of breeding places was included to represent fluctuations in vector densities. Diverse scenarios, sensitivity analyses and instability cases were considered during the experimentation-validation process. Results Correlation coefficients and mean square errors between observed and modelled incidences reached 0.897–0.668 (P > 0.95) and 0.0002–0.0005, respectively. Temperature was the most relevant climatic parameter driving the final incidence. Accordingly, malaria outbreaks are possible during the favourable epochs following the onset of El Niño warm events. The sporogonic and gonotrophic cycles proved to be the key entomological variables controlling the transmission potential of the mosquito population. Simulation results also showed that seasonality of vector density is an important factor in understanding disease transmission. Conclusion The model constitutes a promising tool to deepen the understanding of the multiple interactions related to malaria transmission conducive to outbreaks. In the foreseeable future it could be implemented as a tool to diagnose possible dynamical patterns of malaria incidence under several scenarios, as well as a decision-making tool for the early detection and control of outbreaks. The model can also be merged with forecasts of El Niño events to provide a National Malaria Early Warning System. PMID:16882349
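The strong temperature dependence of the sporogonic cycle noted in the results is often captured with a degree-day model. A minimal sketch using commonly cited Detinova-type constants for P. falciparum (about 111 degree-days above a 16 degC threshold); these values are illustrative assumptions, not the calibrated parameters of this particular model.

    # Hedged sketch: degree-day estimate of the sporogonic cycle duration.
    DEGREE_DAYS = 111.0  # commonly cited for P. falciparum (assumption)
    T_MIN = 16.0         # developmental threshold in degC (assumption)

    def sporogonic_days(temp_c):
        if temp_c <= T_MIN:
            return float("inf")  # sporogony never completes
        return DEGREE_DAYS / (temp_c - T_MIN)

    for t in (18, 22, 26, 30):
        print(f"{t} degC -> {sporogonic_days(t):.1f} days")
    # Warmer epochs, e.g. after the onset of El Nino events, shorten the cycle.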
Quantum exhaustive key search with simplified-DES as a case study.
Almazrooie, Mishal; Samsudin, Azman; Abdullah, Rosni; Mutter, Kussay N
2016-01-01
To evaluate the security of a symmetric cryptosystem against any quantum attack, the symmetric algorithm must first be implemented on a quantum platform. In this study, a quantum implementation of a classical block cipher is presented. A quantum circuit of polynomial size in quantum gates for a classical block cipher is proposed. The entire work has been tested on a quantum mechanics simulator called libquantum. First, the functionality of the proposed quantum cipher is verified and the experimental results are compared with those of the original classical version. Then, quantum attacks are conducted by using Grover's algorithm to recover the secret key. The proposed quantum cipher is used as a black box for the quantum search. The quantum oracle is then queried over the produced ciphertext to mark the quantum state, which consists of plaintext and key qubits. The experimental results show that for a key of n-bit size and a key space of N such that N = 2^n, the key can be recovered in O(sqrt(N)) computational steps.
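The quoted square-root speed-up translates directly into an iteration count: Grover search over a key space of N = 2^n needs roughly (pi/4)*sqrt(N) oracle queries. A minimal sketch of that arithmetic:

    import math

    # Grover iteration count for an n-bit key (key space N = 2**n).
    def grover_iterations(n_bits):
        N = 2 ** n_bits
        return math.floor((math.pi / 4) * math.sqrt(N))

    for n in (10, 56, 128):
        print(f"{n}-bit key: about {grover_iterations(n):.3e} oracle queries")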
Improved understanding of the relationship between hydraulic properties and streaming potentials
NASA Astrophysics Data System (ADS)
Cassiani, G.; Brovelli, A.
2009-12-01
Streaming potential (SP) measurements have been satisfactorily used in a number of recent studies as a non-invasive tool to monitor fluid movement in both the vadose and the saturated zone. SPs are generated from the coupling between two independent physical processes occurring at the pore level, namely water flow and the excess of ions at the negatively charged solid matrix-water interface. The intensity of the measured potentials depends on physical properties of the medium, including the internal micro-geometry of the system, the charge density of the interface and the composition of the pore fluid, which affects its ionic strength, pH and redox potential. The goal of this work is to investigate whether a relationship between the intensity of the SPs and the saturated hydraulic conductivity can be identified. Both properties are, at least to some extent, dependent on the pore-size distribution and the connectivity of the pores, and therefore some degree of correlation is expected. We used a pore-scale numerical model previously developed to simulate both the bulk hydraulic conductivity and the intensity of the SPs generated in a three-dimensional pore network. The chemical-physical properties of both the interface (zeta potential) and the aqueous phase are computed using an analytical, physically based model that has shown good agreement with experimental data. Modelling results were satisfactorily compared with experimental data, showing that the model, although simplified, retains the key properties and mechanisms that control SP generation. A sensitivity analysis with respect to the key geometrical and chemical parameters was conducted to evaluate how the correlation between the two studied variables changes, and to ascertain whether the bulk hydraulic conductivity can be estimated from SP measurements alone.
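At the continuum scale, the pore-level coupling described above is usually summarized by the Helmholtz-Smoluchowski relation, C = (epsilon * zeta) / (eta * sigma), linking the coupling coefficient to permittivity, zeta potential, viscosity and fluid conductivity. A minimal numeric sketch; all values are illustrative assumptions.

    # Hedged sketch: Helmholtz-Smoluchowski streaming-potential coupling.
    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def coupling_coefficient(rel_perm, zeta_v, viscosity, conductivity):
        """C = eps * zeta / (eta * sigma), in V/Pa."""
        return (rel_perm * EPS0 * zeta_v) / (viscosity * conductivity)

    # Illustrative values: water at ~25 degC with a mildly saline pore fluid.
    C = coupling_coefficient(rel_perm=80.0, zeta_v=-0.02,  # zeta = -20 mV
                             viscosity=1.0e-3,             # Pa s
                             conductivity=0.01)            # S/m
    print(f"C = {C * 1e6:.2f} microvolts per pascal")      # about -1.4 uV/Pa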
NASA Technical Reports Server (NTRS)
Fatig, Michael
1993-01-01
Flight operations and the preparation for it have become increasingly complex as mission complexities increase. Further, the mission model dictates that a significant increase in flight operations activities is upon us. Finally, there is a need for process improvement and economy in the operations arena. It is therefore time that we recognize flight operations as a complex process requiring a defined, structured, life-cycle approach vitally linked to space segment, ground segment, and science operations processes. With this recognition, an FOT Tool Kit was developed, consisting of six major components designed to provide tools that guide flight operations activities throughout the mission life cycle. The major components of the FOT Tool Kit and the concepts behind the flight operations life-cycle process, as developed at NASA's GSFC for GSFC-based missions, are addressed. The Tool Kit is intended to improve the productivity, quality, cost performance, and schedule performance of flight operations tasks through the use of documented, structured methodologies; knowledge of past lessons learned and upcoming new technology; and reuse and sharing of key products and special application programs made possible through the development of standardized key-product and special-program directories.
Zhai, Jing-Xuan; Cao, Tian-Jie; An, Ji-Yong; Bian, Yong-Tao
2017-11-07
Determining whether proteins interact with their partners is a challenging task in fundamental research. Protein self-interaction (SIP) is a special case of protein-protein interaction (PPI) that plays a key role in the regulation of cellular functions. Because of the limitations of experimental self-interaction identification, it is important to develop effective computational tools for predicting SIPs from protein sequences. In this study, we developed a novel computational method called RVM-AB that combines the Relevance Vector Machine (RVM) model with Average Blocks (AB) for detecting SIPs from protein sequences. First, the Average Blocks feature extraction method is employed to represent protein sequences on a Position Specific Scoring Matrix (PSSM). Second, Principal Component Analysis (PCA) is used to reduce the dimension of the AB vector and the influence of noise. Then, using the Relevance Vector Machine algorithm, the performance of RVM-AB is assessed and compared with the state-of-the-art support vector machine (SVM) classifier and other existing methods on yeast and human datasets. In five-fold cross-validation experiments, the RVM-AB model achieved high accuracies of 93.01% and 97.72% on the yeast and human datasets respectively, significantly better than the SVM-based method and other previous methods. The experimental results show that the RVM-AB prediction model is efficient and robust, and can serve as an automatic decision-support tool for detecting SIPs. To facilitate future proteomics research, the RVM-AB server is freely available for academic use at http://219.219.62.123:8888/SIP_AB. Copyright © 2017 Elsevier Ltd. All rights reserved.
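The feature pipeline described above (PSSM -> Average Blocks -> PCA -> classifier) is easy to prototype. A minimal sketch with scikit-learn; because scikit-learn ships no Relevance Vector Machine, an SVM stands in for the RVM here, and random matrices stand in for real PSSM profiles.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    def average_blocks(pssm, n_blocks=20):
        # Split an (L x 20) PSSM into n_blocks row blocks and average each,
        # giving a fixed-length n_blocks * 20 feature vector.
        blocks = np.array_split(pssm, n_blocks, axis=0)
        return np.concatenate([b.mean(axis=0) for b in blocks])

    rng = np.random.default_rng(0)
    # Placeholder data: 100 "proteins" with random PSSMs and binary SIP labels.
    X = np.array([average_blocks(rng.normal(size=(rng.integers(50, 300), 20)))
                  for _ in range(100)])
    y = rng.integers(0, 2, size=100)

    model = make_pipeline(PCA(n_components=50), SVC())  # SVM as RVM stand-in
    print(cross_val_score(model, X, y, cv=5).mean())    # five-fold accuracy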
The KUPNetViz: a biological network viewer for multiple -omics datasets in kidney diseases.
Moulos, Panagiotis; Klein, Julie; Jupp, Simon; Stevens, Robert; Bascands, Jean-Loup; Schanstra, Joost P
2013-07-24
Constant technological advances have allowed scientists in biology to migrate from conventional single-omics to multi-omics experimental approaches, challenging bioinformatics to bridge this multi-tiered information. Ongoing research in renal biology is no exception. The results of large-scale and/or high-throughput experiments, presenting a wealth of information on kidney disease, are scattered across the web. To tackle this problem, we recently presented the KUPKB, a multi-omics data repository for renal diseases. In this article, we describe KUPNetViz, a biological graph exploration tool allowing the exploration of KUPKB data through the visualization of biomolecule interactions. KUPNetViz enables the integration of multi-layered experimental data over different species, renal locations and renal diseases to protein-protein interaction networks and allows association with biological functions, biochemical pathways and other functional elements such as miRNAs. KUPNetViz focuses on the simplicity of its usage and the clarity of resulting networks by reducing and/or automating advanced functionalities present in other biological network visualization packages. In addition, it allows the extrapolation of biomolecule interactions across different species, leading to the formulation of plausible new hypotheses, adequate experiment design, and the suggestion of novel biological mechanisms. We demonstrate the value of KUPNetViz with two usage examples: the integration of calreticulin as a key player in a larger interaction network in renal graft rejection and the novel observation of the strong association of interleukin-6 with polycystic kidney disease. KUPNetViz is an interactive and flexible biological network visualization and exploration tool. It provides renal biologists with biological network snapshots of the complex integrated data of the KUPKB, allowing the formulation of new hypotheses in a user-friendly manner.
NASA Astrophysics Data System (ADS)
Smith, Arthur R.
2012-02-01
Future technological advances at the frontier of electronics will increasingly rely on the use of the spin property of the electron at ever smaller length scales. As a result, it is critical to make substantial efforts towards understanding and ultimately controlling spin and magnetism at the nanoscale. In SPIRE, the goal is to achieve these important scientific advancements through a unique combination of experimental and theoretical techniques, as well as complementary expertise and coherent efforts across three continents. The key experimental tool of choice is spin-polarized scanning tunneling microscopy, the premier method for accessing the spin structure of surfaces and nanostructures with resolution down to the atomic scale. At the same time, atom and molecule deposition and manipulation schemes are added in order to both atomically engineer, and precisely investigate, novel nanoscale spin structures. These efforts are being applied to an array of physical systems, including single magnetic atomic layers, self-assembled 2-D molecular arrays, single adatoms and molecules, and alloyed spintronic materials. Efforts are aimed at exploring complex spin structures and phenomena occurring in these systems. At the same time, the problems are approached, and in some cases guided, by the use of leading theoretical tools, including analytical approaches such as renormalization group theory, and computational approaches such as first-principles density functional theory. The scientific goals of the project are achieved by a collaborative effort with the international partners, engaging students at all levels who, through their research experiences both at home and abroad, gain international research outlooks as well as understanding of cultural differences by working on intriguing problems of mutual interest. A novel scientific journalism internship program based at Ohio University furthers the project's broader impacts.
Computer simulation models as a tool to investigate the role of microRNAs in osteoarthritis
Smith, Graham R.
2017-01-01
The aim of this study was to show how computational models can be used to increase our understanding of the role of microRNAs in osteoarthritis (OA) using miR-140 as an example. Bioinformatics analysis and experimental results from the literature were used to create and calibrate models of gene regulatory networks in OA involving miR-140 along with key regulators such as NF-κB, SMAD3, and RUNX2. The individual models were created with the modelling standard, Systems Biology Markup Language, and integrated to examine the overall effect of miR-140 on cartilage homeostasis. Down-regulation of miR-140 may have either detrimental or protective effects for cartilage, indicating that the role of miR-140 is complex. Studies of individual networks in isolation may therefore lead to different conclusions. This indicated the need to combine the five chosen individual networks involving miR-140 into an integrated model. This model suggests that the overall effect of miR-140 is to change the response to an IL-1 stimulus from a prolonged increase in matrix degrading enzymes to a pulse-like response so that cartilage degradation is temporary. Our current model can easily be modified and extended as more experimental data become available about the role of miR-140 in OA. In addition, networks of other microRNAs that are important in OA could be incorporated. A fully integrated model could not only aid our understanding of the mechanisms of microRNAs in ageing cartilage but could also provide a useful tool to investigate the effect of potential interventions to prevent cartilage loss. PMID:29095952
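The pulse-like response attributed to miR-140 above is the signature of an incoherent feed-forward loop: IL-1 induces both a matrix-degrading enzyme and miR-140, and the accumulating miR-140 then represses the enzyme. A minimal ODE sketch of that motif; species names and parameters are illustrative, not the calibrated SBML models.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Hedged sketch: IL-1 induces enzyme E and repressor M (a miR-140 stand-in);
    # as M accumulates it shuts E down, turning a sustained stimulus into a pulse.
    def rhs(t, y, il1=1.0, a=0.2, dm=0.05, b=1.0, K=1.0, h=4, de=0.3):
        M, E = y
        dM = a * il1 - dm * M                       # repressor induction and decay
        dE = b * il1 / (1 + (M / K) ** h) - de * E  # induction, repressed by M
        return [dM, dE]

    sol = solve_ivp(rhs, (0, 100), [0.0, 0.0], dense_output=True)
    t = np.linspace(0, 100, 11)
    M, E = sol.sol(t)
    print(np.round(E, 2))  # E rises, peaks, then decays: a pulse-like response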
Nilsson, Lisbeth; Durkin, Josephine
2017-10-01
To explore the knowledge necessary for adoption and implementation of the Assessment of Learning Powered mobility use (ALP) tool in different practice settings, for both adults and children. To consult with a diverse population of professionals working with adults and children, in different countries and various settings, who were learning about or using the ALP tool as part of exploring and implementing research findings. Classical grounded theory with a rigorous comparative analysis of data from informants, together with reflections on our own rich experiences of powered mobility practice and comparisons with the literature. A core category, learning tool use, and a new theory of cognizing tool use, with its interdependent properties (motivation, confidence, permissiveness, attentiveness and co-construction), emerged, which explains in greater depth what enables the application of the ALP tool. The scientific knowledge base on tool-use learning and the new theory convey the information practitioners need in order to apply the learning approach of the ALP tool and thereby enable tool-use learning through powered mobility practice as a therapeutic intervention in its own right. This opens up the possibility for more children and adults to have access to learning through powered mobility practice. Implications for rehabilitation: Tool-use learning through powered mobility practice is a therapeutic intervention in its own right. Powered mobility practice can be used as a rehabilitation tool with individuals who may not need to become powered wheelchair users. Motivation, confidence, permissiveness, attentiveness and co-construction are key properties for enabling the application of the learning approach of the ALP tool. Labelling and the use of language, together with honing observational skills through viewing video footage, are key to developing successful learning partnerships.
Using the Git Software Tool on the Peregrine System
Branch workflow:

Create a local branch called "experimental" based on the current master:

    git branch experimental

Use your branch (start working on that experimental branch):

    git checkout experimental
    git pull origin experimental
    # work, work, work, commit...

Send the local branch to the repo:

    git push
Applications of Nuclear and Particle Physics Technology: Particles & Detection — A Brief Overview
NASA Astrophysics Data System (ADS)
Weisenberger, Andrew G.
A brief overview of the technology applications with significant societal benefit that have their origins in nuclear and particle physics research is presented. It is shown through representative examples that applications of nuclear physics can be classified into two basic areas: 1) applying the results of experimental nuclear physics and 2) applying the tools of experimental nuclear physics. Examples of the application of the tools of experimental nuclear and particle physics research are provided in the fields of accelerator- and detector-based technologies, namely synchrotron light sources, nuclear medicine, ion implantation and radiation therapy.
ERIC Educational Resources Information Center
Cavus, Nadire; Ibrahim, Dogan
2007-01-01
The development of collaborative studies in learning has led to a renewed interest in the field of Web-based education. In this experimental study a highly interactive and collaborative virtual teaching environment was created by supporting the Moodle LMS with the collaborative learning tool GREWPtool. The aim of this experimental study has been to…
ERIC Educational Resources Information Center
Kuhn, Jochen; Vogt, Patrik
2013-01-01
New media technology is becoming more and more important for our daily lives as well as for teaching physics. Within the scope of our N.E.T. research project we develop experiments using New Media Experimental Tools (N.E.T.) in physics education and study their influence on students' learning abilities. We want to present the possibilities e.g. of…
NASA Astrophysics Data System (ADS)
Sampath Kumar, Bharath
The purpose of this study is to examine the role of a partnering visualization tool, such as simulation, in the development of students' concrete conceptual understanding of chemical equilibrium. Students find chemistry concepts abstract, especially at the microscopic level, and chemical equilibrium is one such topic. While research studies have explored the effectiveness of low-tech instructional strategies such as analogies, jigsaw, cooperative learning, and modeling blocks, fewer studies have explored the use of visualization tools such as simulations in the context of dynamic chemical equilibrium. Research studies have identified key reasons behind misconceptions, such as the lack of a systematic understanding of foundational chemistry concepts, failure to recognize that the system is dynamic, solving numerical problems on chemical equilibrium in an algorithmic fashion, and erroneous application of Le Chatelier's principle (LCP). Kress et al. (2001) suggested that external representation in the form of visualization is more than a tool for learning, because it enables learners to make meanings or express ideas that cannot readily be expressed through a verbal representation alone. A mixed-method study design was used for data collection. The qualitative portion of the study is aimed at understanding the change in students' mental models before and after the intervention. A quantitative instrument was developed based on common areas of misconception identified by research studies. A pilot study was conducted prior to the actual study to obtain feedback from students on the quantitative instrument and the simulation; participants for the pilot study were sampled from a single general chemistry class. Following the pilot study, the research study was conducted with a total of 27 students (N = 15 in the experimental group and N = 12 in the control group). Prior to participating in the study, students had completed their midterm test on the topic of chemical equilibrium. Qualitative interviews pre and post revealed students' mental models and thought processes regarding chemical equilibrium. The simulations used in the study were developed on the SCRATCH software platform. To test the effect of the visualization tool on students' conceptual understanding of chemical equilibrium, an ANCOVA analysis was conducted. Results from a one-factor ANCOVA showed posttest scores were significantly higher for the experimental group (adjusted M_post = 7.27, SD_post = 1.387) relative to the control group (adjusted M_post = 2.67, SD_post = 1.371) after adjusting for pretest scores, F(1,24) = 71.82, MSE = 1.497, p = 0.03, partial eta squared = 0.75, d = 3.33. Cohen's d was converted to an attenuated effect size d* using the procedure outlined in Thompson (2006). The adjusted (for pretest scores) group mean difference estimate without measurement error correction was 4.2, with a Cohen's d = 3.04. An alternate approach reported in Cho and Preacher (2015) was used to determine effect size. The adjusted group mean difference estimate with measurement error correction only for the posttest scores (but not for the pretest scores) was 4.99, with a Cohen's d = 3.61. Finally, the adjusted group mean difference estimate with measurement error correction for both pretest and posttest scores was 4.23, with a Cohen's d = 3.07.
From a quantitative perspective, these effect sizes indicate a strong relationship between the experimental intervention and students' conceptual understanding of chemical equilibrium concepts: students who received the intervention scored substantially higher. KEYWORDS: Chemical Equilibrium, Visualization, Alternate Conceptions, Ontological Shift, Simulations.
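The headline effect size follows directly from the adjusted group means and posttest standard deviations quoted above; a minimal check:

    import math

    # Cohen's d from the adjusted means and pooled posttest SDs in the text.
    m_exp, sd_exp = 7.27, 1.387  # experimental group (adjusted)
    m_ctl, sd_ctl = 2.67, 1.371  # control group (adjusted)

    sd_pooled = math.sqrt((sd_exp ** 2 + sd_ctl ** 2) / 2)
    d = (m_exp - m_ctl) / sd_pooled
    print(round(d, 2))  # ~3.33, matching the reported value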
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Hong-Wei; Zhengzhou Information Science and Technology Institute, Zhengzhou, 450004; Wang, Shuang
2011-12-15
It is well known that the unconditional security of quantum-key distribution (QKD) can be guaranteed by quantum mechanics. However, practical QKD systems have some imperfections, which can be controlled by the eavesdropper to attack the secret key. With current experimental technology, a realistic beam splitter, made by fused biconical technology, has a wavelength-dependent property. Based on this fatal security loophole, we propose a wavelength-dependent attacking protocol, which can be applied to all practical QKD systems with passive state modulation. Moreover, we experimentally attack a practical polarization-encoding QKD system to obtain all the secret key information at the cost of only increasing the quantum bit error rate from 1.3% to 1.4%.
Design of virtual simulation experiment based on key events
NASA Astrophysics Data System (ADS)
Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu
2018-06-01
Virtual simulation experiments often involve complex content and lack guidance, so key-event technology from VR narrative theory was introduced to enhance the fidelity and vividness of the experimental process. Based on VR narrative technology, an event-transition structure was designed to meet the needs of the experimental operation process, and an interactive event-processing model was used to generate key events in the interactive scene. The experiment "margin value of bees foraging", based on biological morphology, was taken as an example, and its many objects, behaviors and other contents were reorganized. The results show that this method can enhance the user's experience and ensure that the experimental process is complete and effective.
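An event-transition structure of this kind is essentially a small state machine: the current scene state plus a user interaction determines the next key event. A minimal sketch with a hypothetical transition table loosely modelled on the bee-foraging experiment; states and interactions are invented for illustration.

    # Hedged sketch: key-event transition table for a guided virtual experiment.
    TRANSITIONS = {
        ("intro", "start"): "select_bee",
        ("select_bee", "bee_chosen"): "foraging_trial",
        ("foraging_trial", "nectar_collected"): "record_margin_value",
        ("record_margin_value", "saved"): "summary",
    }

    def next_key_event(state, interaction):
        # Unknown interactions leave the state unchanged, keeping guidance on track.
        return TRANSITIONS.get((state, interaction), state)

    state = "intro"
    for action in ("start", "bee_chosen", "nectar_collected", "saved"):
        state = next_key_event(state, action)
        print("key event ->", state)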
NASA Astrophysics Data System (ADS)
Pratap, A.; Sahoo, P.; Patra, K.; Dyakonov, A. A.
2017-09-01
This study focuses on improving the grinding performance of BK-7 glass using a polycrystalline diamond (PCD) micro-tool. Micro-tools are modified using wire EDM, and the performance of the modified tools is compared with that of the as-received tool. Tool wear of the different tool types is observed; to quantify it, a method based on the weight loss of the tool is introduced in this study. Grinding forces increase with machining time due to tool wear; however, the modified tools produce lower forces and can thus improve the life of the PCD micro-grinding tool.
ERIC Educational Resources Information Center
Ezquerra, Ángel; Manso, Javier; Burgos, Mª Esther; Hallabrin, Carla
2014-01-01
New curricular plans based on key competences create the need for new educational proposals that allow their development. This article describes a proposal to develop key competences through project-based learning. The project's objective is the creation of a digital video. The following study was carried out with students in their final two years…
Quantum dense key distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Degiovanni, I.P.; Ruo Berchera, I.; Castelletto, S.
2004-03-01
This paper proposes a protocol for quantum dense key distribution. This protocol embeds the benefits of quantum dense coding and quantum key distribution and is able to generate shared secret keys four times more efficiently than the Bennett-Brassard 1984 protocol. We prove the security of this scheme against individual eavesdropping attacks, and we present preliminary experimental results showing its feasibility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamers, M.D.
One of the key needs in the advancement of geothermal energy is the availability of adequate subsurface measurements to aid the reservoir engineer in the development and operation of geothermal wells. Some current projects sponsored by the U.S. Department of Energy's Division of Geothermal Energy pertaining to the development of improved well-logging techniques, tools, and components are described. An attempt is made to show how these projects contribute to the improvement of geothermal logging technology and form key elements of the overall program goals.
A new method to study ferroelectrics using the remanent Henkel plots
NASA Astrophysics Data System (ADS)
Vopson, Melvin M.
2018-05-01
Analysis of experimental curves constructed from dc demagnetization and isothermal remanent magnetization, known as Henkel and delta M plots, has served for over 53 years as an important tool for characterizing interactions in ferromagnets. In this article we address the question of whether the same experimental technique can be applied to the study of ferroelectric systems. The successful measurement of the equivalent dc depolarization and isothermal remanent polarization curves, and the construction of the Henkel and delta P plots for ferroelectrics, is reported here. A full measurement protocol is provided, together with experimental examples for two ferroelectric ceramic samples. This new measurement technique is an invaluable experimental tool that could be used to further advance our understanding of ferroelectric materials and their applications.
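For reference, the magnetic construction being carried over is the Wohlfarth/Henkel relation; in normalized form, with the ferroelectric analogue the article measures written by analogy (a sketch of the standard definitions, not the article's derivation):

    % Normalized remanences: m_r from IRM, m_d from dc demagnetization.
    % Non-interacting (Stoner-Wohlfarth) particles obey
    m_d(H) = 1 - 2\,m_r(H),
    % so interactions appear as deviations of the delta M plot from zero:
    \delta M(H) = m_d(H) - \bigl[\,1 - 2\,m_r(H)\,\bigr].
    % Ferroelectric analogue, with normalized remanent polarizations p_r, p_d:
    \delta P(E) = p_d(E) - \bigl[\,1 - 2\,p_r(E)\,\bigr].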
SPOT-A SENSOR PLACEMENT OPTIMIZATION TOOL FOR ...
journal article
This paper presents SPOT, a Sensor Placement Optimization Tool. SPOT provides a toolkit that facilitates research in sensor placement optimization and enables the practical application of sensor placement solvers to real-world CWS design applications. This paper provides an overview of SPOT's key features, and then illustrates how this tool can be flexibly applied to solve a variety of different types of sensor placement problems.
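Sensor placement of this kind is typically posed as minimizing the expected impact over an ensemble of contamination scenarios; a greedy sketch conveys the flavor. The random impact matrix and the greedy heuristic are illustrative; SPOT itself provides several formulations and solvers.

    import numpy as np

    # Hedged sketch: greedy sensor placement on a scenario-impact matrix, where
    # impact[s, l] = damage if scenario s is first detected at location l.
    rng = np.random.default_rng(1)
    impact = rng.uniform(10, 100, size=(50, 8))  # 50 scenarios, 8 candidate sites

    def greedy_place(impact, n_sensors):
        chosen = []
        for _ in range(n_sensors):
            best = min((l for l in range(impact.shape[1]) if l not in chosen),
                       key=lambda l: impact[:, chosen + [l]].min(axis=1).mean())
            chosen.append(best)
        return chosen

    sensors = greedy_place(impact, n_sensors=3)
    print("sites:", sensors, "mean impact:",
          round(impact[:, sensors].min(axis=1).mean(), 1))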
The Effect of a Computer-Based Cartooning Tool on Children's Cartoons and Written Stories
ERIC Educational Resources Information Center
Madden, M.; Chung, P. W. H.; Dawson, C. W.
2008-01-01
This paper reports a study assessing a new computer tool for cartoon storytelling, created by the authors for a target audience in the upper half of the English and Welsh Key Stage 2 (years 5 and 6, covering ages 9-11 years). The tool attempts to provide users with more opportunities for expressive visualisation than previous educational software;…
New Technologies and Communications Tools
ERIC Educational Resources Information Center
Bouse, Jim
2016-01-01
Relationships are key to the success of everything higher education hopes to accomplish, from recruiting the next class to retaining them, guiding them to graduation, creating successful alumni, and fostering satisfied donors. Creation of those relationships can be engaged and facilitated by the technology, communications tools, and ideas…