Sample records for essential experimental tools

  1. Quality Appraisal of Single-Subject Experimental Designs: An Overview and Comparison of Different Appraisal Tools

    ERIC Educational Resources Information Center

    Wendt, Oliver; Miller, Bridget

    2012-01-01

    Critical appraisal of the research literature is an essential step in informing and implementing evidence-based practice. Quality appraisal tools that assess the methodological quality of experimental studies provide a means to identify the most rigorous research suitable for evidence-based decision-making. In single-subject experimental research,…

  2. Designing tools for oil exploration using nuclear modeling

    NASA Astrophysics Data System (ADS)

    Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike

    2017-09-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  3. TRIP-ID: A tool for a smart and interactive identification of Magic Formula tyre model parameters from experimental data acquired on track or test rig

    NASA Astrophysics Data System (ADS)

    Farroni, Flavio; Lamberti, Raffaele; Mancinelli, Nicolò; Timpone, Francesco

    2018-03-01

    Tyres play a key role in ground vehicle dynamics because they are responsible for traction, braking and cornering. A proper tyre-road interaction model is essential for a useful and reliable vehicle dynamics model. In the last two decades Pacejka's Magic Formula (MF) has become a standard in the simulation field. This paper presents a tool, called TRIP-ID (Tyre Road Interaction Parameters IDentification), developed to characterize and identify MF micro-parameters with a high degree of accuracy and reliability from experimental data derived from telemetry or from a test rig. The tool interactively guides the user through the identification process on the basis of strong diagnostic considerations about the experimental data, made evident by the tool itself. A motorsport application of the tool is shown as a case study.
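
    For orientation, a minimal sketch of the kind of least-squares identification described in this record, assuming the standard four-parameter Magic Formula and synthetic stand-in data; the coefficients B, C, D, E, the data, and the fitting routine are illustrative assumptions, not TRIP-ID's micro-parameters or code:

    ```python
    # Hedged sketch: fitting Pacejka Magic Formula macro-parameters (B, C, D, E)
    # to measured lateral force vs. slip angle with a generic least-squares call.
    import numpy as np
    from scipy.optimize import curve_fit

    def magic_formula(alpha, B, C, D, E):
        """Lateral force as a function of slip angle [rad] (standard MF form)."""
        return D * np.sin(C * np.arctan(B * alpha - E * (B * alpha - np.arctan(B * alpha))))

    # Synthetic example data standing in for telemetry / test-rig measurements.
    alpha = np.linspace(-0.15, 0.15, 200)            # slip angle [rad]
    rng = np.random.default_rng(0)
    fy_meas = magic_formula(alpha, 10.0, 1.6, 4500.0, 0.5) + rng.normal(0, 50, alpha.size)

    p0 = [8.0, 1.5, 4000.0, 0.0]                     # assumed initial guess
    popt, _ = curve_fit(magic_formula, alpha, fy_meas, p0=p0)
    print("Identified B, C, D, E:", popt)
    ```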

  4. Development and Validation of a Rubric for Diagnosing Students' Experimental Design Knowledge and Difficulties

    ERIC Educational Resources Information Center

    Dasgupta, Annwesa P.; Anderson, Trevor R.; Pelaez, Nancy

    2014-01-01

    It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological…

  5. Heterogeneity of the gut microbiome in mice: guidelines for optimizing experimental design.

    PubMed

    Laukens, Debby; Brinkman, Brigitta M; Raes, Jeroen; De Vos, Martine; Vandenabeele, Peter

    2016-01-01

    Targeted manipulation of the gut flora is increasingly being recognized as a means to improve human health. Yet, the temporal dynamics and intra- and interindividual heterogeneity of the microbiome represent experimental limitations, especially in human cross-sectional studies. Therefore, rodent models represent an invaluable tool to study the host-microbiota interface. Progress in technical and computational tools to investigate the composition and function of the microbiome has opened a new era of research and we gradually begin to understand the parameters that influence variation of host-associated microbial communities. To isolate true effects from confounding factors, it is essential to include such parameters in model intervention studies. Also, explicit journal instructions to include essential information on animal experiments are mandatory. The purpose of this review is to summarize the factors that influence microbiota composition in mice and to provide guidelines to improve the reproducibility of animal experiments. © FEMS 2015.

  6. Heterogeneity of the gut microbiome in mice: guidelines for optimizing experimental design

    PubMed Central

    Laukens, Debby; Brinkman, Brigitta M.; Raes, Jeroen; De Vos, Martine; Vandenabeele, Peter

    2015-01-01

    Targeted manipulation of the gut flora is increasingly being recognized as a means to improve human health. Yet, the temporal dynamics and intra- and interindividual heterogeneity of the microbiome represent experimental limitations, especially in human cross-sectional studies. Therefore, rodent models represent an invaluable tool to study the host–microbiota interface. Progress in technical and computational tools to investigate the composition and function of the microbiome has opened a new era of research and we gradually begin to understand the parameters that influence variation of host-associated microbial communities. To isolate true effects from confounding factors, it is essential to include such parameters in model intervention studies. Also, explicit journal instructions to include essential information on animal experiments are mandatory. The purpose of this review is to summarize the factors that influence microbiota composition in mice and to provide guidelines to improve the reproducibility of animal experiments. PMID:26323480

  7. Enhancing performance of next generation FSO communication systems using soft computing-based predictions.

    PubMed

    Kazaura, Kamugisha; Omae, Kazunori; Suzuki, Toshiji; Matsumoto, Mitsuji; Mutafungwa, Edward; Korhonen, Timo O; Murakami, Tadaaki; Takahashi, Koichi; Matsumoto, Hideki; Wakamori, Kazuhiko; Arimoto, Yoshinori

    2006-06-12

    The deterioration and deformation of a free-space optical beam wave-front as it propagates through the atmosphere can reduce link availability and may introduce burst errors, thus degrading the performance of the system. We investigate the suitability of soft-computing (SC) based tools for improving the performance of free-space optical (FSO) communication systems. The SC based tools are used for the prediction of key parameters of an FSO communication system. Measured data collected from an experimental FSO communication system are used as training and testing data for a proposed multi-layer neural network predictor (MNNP) used to predict future parameter values. The predicted parameters are essential for reducing transmission errors by improving the antenna's accuracy in tracking data beams. This is particularly important during periods of strong atmospheric turbulence. The parameter values predicted using the proposed tool show acceptable conformity with the original measurements.
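
    A minimal sketch of a neural-network one-step-ahead predictor of the kind this record describes, trained on a window of past samples; the window length, network size, and the synthetic stand-in series are assumptions, not the paper's MNNP architecture or measured data:

    ```python
    # Hedged sketch: predict the next sample of a measured link parameter from the
    # previous `window` samples with a small multi-layer perceptron regressor.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    signal = np.sin(np.linspace(0, 60, 3000)) + 0.1 * rng.standard_normal(3000)  # stand-in series

    window = 10
    X = np.array([signal[i:i + window] for i in range(len(signal) - window)])
    y = signal[window:]                      # next-sample target

    split = int(0.8 * len(X))
    model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
    model.fit(X[:split], y[:split])
    print("Test R^2:", model.score(X[split:], y[split:]))
    ```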

  8. Evaluation and Analysis of F-16XL Wind Tunnel Data From Static and Dynamic Tests

    NASA Technical Reports Server (NTRS)

    Kim, Sungwan; Murphy, Patrick C.; Klein, Vladislav

    2004-01-01

    A series of wind tunnel tests was conducted at the NASA Langley Research Center as part of an ongoing effort to develop and test mathematical models for aircraft rigid-body aerodynamics in nonlinear unsteady flight regimes. Analysis of measurement accuracy, especially for nonlinear dynamic systems that may exhibit complicated behaviors, is an essential component of this ongoing effort. In this report, tools for harmonic analysis of dynamic data and for assessing measurement accuracy are presented. A linear aerodynamic model is assumed that is appropriate for conventional forced-oscillation experiments, although more general models can be used with these tools. Application of the tools to experimental data is demonstrated, and results indicate the levels of uncertainty in output measurements that can arise from experimental setup, calibration procedures, mechanical limitations, and input errors.
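
    A minimal sketch of the basic harmonic-analysis step such tools perform on forced-oscillation time histories: a least-squares fit of the bias, in-phase, and out-of-phase components at the forcing frequency. The frequency, sampling, and synthetic signal below are illustrative assumptions, not the report's data or code:

    ```python
    # Hedged sketch: least-squares extraction of harmonic components at the
    # forcing frequency from a measured aerodynamic coefficient time history.
    import numpy as np

    f = 1.0                                    # assumed oscillation frequency [Hz]
    t = np.arange(0, 10, 0.01)                 # time vector [s]
    rng = np.random.default_rng(2)
    cz = (0.2 + 0.5 * np.sin(2 * np.pi * f * t) + 0.1 * np.cos(2 * np.pi * f * t)
          + 0.02 * rng.standard_normal(t.size))   # stand-in measured coefficient

    # Regressors: constant, sine, and cosine at the forcing frequency.
    A = np.column_stack([np.ones_like(t), np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)])
    coef, *_ = np.linalg.lstsq(A, cz, rcond=None)
    print("bias, in-phase, out-of-phase:", coef)
    ```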

  9. Many Ways to Make A Line

    ERIC Educational Resources Information Center

    Ferrell, Holly

    2006-01-01

    This article describes a middle school introductory art lesson that encourages experimentation as an essential part of the creative process. In this lesson, students experiment with different types of media and tools to create an abstract piece that focuses on the most basic element of art--line. Students focus on line quality, focal points,…

  10. Comparison of Numerically Simulated and Experimentally Measured Performance of a Rotating Detonation Engine

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.; Fotia, Matthew L.; Hoke, John; Schauer, Fred

    2015-01-01

    A quasi-two-dimensional, computational fluid dynamic (CFD) simulation of a rotating detonation engine (RDE) is described. The simulation operates in the detonation frame of reference and utilizes a relatively coarse grid such that only the essential primary flow field structure is captured. This construction and other simplifications yield rapidly converging, steady solutions. Viscous effects and heat transfer effects are modeled using source terms. The effects of potential inlet flow reversals are modeled using boundary conditions. Results from the simulation are compared to measured data from an experimental RDE rig with a converging-diverging nozzle added. The comparison is favorable for the two operating points examined. The utility of the code as a performance optimization tool and as a diagnostic tool is discussed.

  11. The use of continuous culture in systems biology investigations.

    PubMed

    Winder, Catherine L; Lanthaler, Karin

    2011-01-01

    When acquiring data for systems biology studies, it is essential to perform the experiments in controlled and reproducible conditions. Advances in the fields of proteomics and metabolomics allow the quantitative analysis of the components of the biological cell. It is essential to include a method in the experimental pipeline to culture the biological system in controlled and reproducible conditions to facilitate the acquisition of high-quality data. The employment of continuous culture methods for the growth of microorganisms is an ideal tool to achieve these objectives. This chapter will review the continuous culture approaches which may be applied in such studies, outline the experimental options which should be considered, and describe the approach applied in the production of steady-state cultures of Saccharomyces cerevisiae. Copyright © 2011 Elsevier Inc. All rights reserved.
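
    As a worked reference for the steady-state condition that continuous culture exploits, the standard Monod chemostat balances are given below; this is a textbook sketch, not the chapter's specific protocol (D is the dilution rate, s_in the feed substrate concentration, Y the biomass yield):

    ```latex
    \frac{dx}{dt} = \bigl(\mu(s) - D\bigr)\,x, \qquad
    \frac{ds}{dt} = D\,(s_{\mathrm{in}} - s) - \frac{\mu(s)\,x}{Y}, \qquad
    \mu(s) = \frac{\mu_{\max}\, s}{K_s + s}
    ```

    At steady state, \(\mu = D\), \(s^{*} = K_s D/(\mu_{\max} - D)\) and \(x^{*} = Y\,(s_{\mathrm{in}} - s^{*})\), so the experimenter sets the specific growth rate directly through the dilution rate, which is what makes the cultures controlled and reproducible.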

  12. Counterbalancing for serial order carryover effects in experimental condition orders.

    PubMed

    Brooks, Joseph L

    2012-12-01

    Reactions of neural, psychological, and social systems are rarely, if ever, independent of previous inputs and states. The potential for serial order carryover effects from one condition to the next in a sequence of experimental trials makes counterbalancing of condition order an essential part of experimental design. Here, a method is proposed for generating counterbalanced sequences for repeated-measures designs including those with multiple observations of each condition on one participant and self-adjacencies of conditions. Condition ordering is reframed as a graph theory problem. Experimental conditions are represented as vertices in a graph and directed edges between them represent temporal relationships between conditions. A counterbalanced trial order results from traversing an Euler circuit through such a graph in which each edge is traversed exactly once. This method can be generalized to counterbalance for higher order serial order carryover effects as well as to create intentional serial order biases. Modern graph theory provides tools for finding other types of paths through such graph representations, providing a tool for generating experimental condition sequences with useful properties. PsycINFO Database Record (c) 2013 APA, all rights reserved.
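
    A minimal sketch of the graph-traversal idea this record describes, assuming the simplest case in which every ordered pair of conditions (self-adjacencies included) should occur exactly once; this generic Hierholzer-style traversal is an illustration, not the paper's published algorithm or toolbox:

    ```python
    # Hedged sketch: build the complete directed graph on the condition set
    # (self-loops included) and walk an Euler circuit, so every ordered pair of
    # conditions appears exactly once in the resulting trial sequence.
    def counterbalanced_sequence(conditions):
        # One directed edge for every ordered pair (u, v), including u == v.
        adj = {u: [v for v in conditions] for u in conditions}
        stack = [conditions[0]]
        circuit = []
        while stack:
            u = stack[-1]
            if adj[u]:                    # an unused outgoing edge remains
                stack.append(adj[u].pop())
            else:                         # dead end: record vertex and backtrack
                circuit.append(stack.pop())
        return circuit[::-1]              # vertex sequence; each edge used once

    print(counterbalanced_sequence(["A", "B", "C"]))
    # ['A', 'C', 'C', 'B', 'C', 'A', 'B', 'B', 'A', 'A'] -- every ordered pair once
    ```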

  13. Databases and Associated Tools for Glycomics and Glycoproteomics.

    PubMed

    Lisacek, Frederique; Mariethoz, Julien; Alocci, Davide; Rudd, Pauline M; Abrahams, Jodie L; Campbell, Matthew P; Packer, Nicolle H; Ståhle, Jonas; Widmalm, Göran; Mullen, Elaine; Adamczyk, Barbara; Rojas-Macias, Miguel A; Jin, Chunsheng; Karlsson, Niclas G

    2017-01-01

    The access to biodatabases for glycomics and glycoproteomics has proven to be essential for current glycobiological research. This chapter presents available databases that are devoted to different aspects of glycobioinformatics. This includes oligosaccharide sequence databases, experimental databases, 3D structure databases (of both glycans and glycorelated proteins) and association of glycans with tissue, disease, and proteins. Specific search protocols are also provided using tools associated with experimental databases for converting primary glycoanalytical data to glycan structural information. In particular, researchers using glycoanalysis methods by U/HPLC (GlycoBase), MS (GlycoWorkbench, UniCarb-DB, GlycoDigest), and NMR (CASPER) will benefit from this chapter. In addition we also include information on how to utilize glycan structural information to query databases that associate glycans with proteins (UniCarbKB) and with interactions with pathogens (SugarBind).

  14. EXACT2: the semantics of biomedical protocols

    PubMed Central

    2014-01-01

    Background The reliability and reproducibility of experimental procedures is a cornerstone of scientific practice. There is a pressing technological need for the better representation of biomedical protocols to enable other agents (human or machine) to better reproduce results. A framework that ensures that all information required for the replication of experimental protocols is captured is essential to achieve reproducibility. Methods We have developed the ontology EXACT2 (EXperimental ACTions), which is designed to capture the full semantics of biomedical protocols required for their reproducibility. To construct EXACT2 we manually inspected hundreds of published and commercial biomedical protocols from several areas of biomedicine. After establishing a clear pattern for extracting the required information we utilized text-mining tools to translate the protocols into a machine amenable format. We have verified the utility of EXACT2 through the successful processing of previously 'unseen' (not used for the construction of EXACT2) protocols. Results The paper reports on a fundamentally new version of EXACT2 that supports the semantically-defined representation of biomedical protocols. The ability of EXACT2 to capture the semantics of biomedical procedures was verified through a text mining use case, in which EXACT2 is used as a reference model for text mining tools to identify terms pertinent to experimental actions, and their properties, in biomedical protocols expressed in natural language. An EXACT2-based framework for the translation of biomedical protocols to a machine amenable format is proposed. Conclusions The EXACT2 ontology is sufficient to record, in a machine processable form, the essential information about biomedical protocols. EXACT2 defines explicit semantics of experimental actions, and can be used by various computer applications. It can serve as a reference model for the translation of biomedical protocols in natural language into a semantically-defined format. PMID:25472549

  15. Laboratory preparation questionnaires as a tool for the implementation of the Just in Time Teaching in the Physics I laboratories: Research training

    NASA Astrophysics Data System (ADS)

    Miranda, David A.; Sanchez, Melba J.; Forero, Oscar M.

    2017-06-01

    The implementation of the JiTT (Just in Time Teaching) strategy is presented as a way to improve the prior preparation of students enrolled in the Physics Laboratory I course offered at the Industrial University of Santander (UIS), Colombia. In this study, a laboratory preparation questionnaire (CPL) was applied as a tool for the implementation of JiTT combined with elements of mediated learning. It was found that the CPL improves the students' experience regarding preparation for the laboratory and the development of the experimental session. These questionnaires were implemented in a learning management system (Moodle), and a web application (lab.ciencias.uis.edu.co) was used to publish the content essential for student preparation before each practical session. The most significant result was that the students approached the experimental session with the basic knowledge needed to improve their learning experience.

  16. Quantum violation of an instrumental test

    NASA Astrophysics Data System (ADS)

    Chaves, Rafael; Carvacho, Gonzalo; Agresti, Iris; Di Giulio, Valerio; Aolita, Leandro; Giacomini, Sandro; Sciarrino, Fabio

    2018-03-01

    Inferring causal relations from experimental observations is of primal importance in science. Instrumental tests provide an essential tool for that aim, as they allow one to estimate causal dependencies even in the presence of unobserved common causes. In view of Bell's theorem, which implies that quantum mechanics is incompatible with our most basic notions of causality, it is of utmost importance to understand whether and how paradigmatic causal tools obtained in a classical setting can be carried over to the quantum realm. Here we show that quantum effects imply radically different predictions in the instrumental scenario. Among other results, we show that an instrumental test can be violated by entangled quantum states. Furthermore, we demonstrate such violation using a photonic set-up with active feed-forward of information, thus providing an experimental proof of this new form of non-classical behaviour. Our findings have fundamental implications in causal inference and may also lead to new applications of quantum technologies.

  17. A novel three-dimensional tool for teaching human neuroanatomy.

    PubMed

    Estevez, Maureen E; Lindgren, Kristen A; Bergethon, Peter R

    2010-01-01

    Three-dimensional (3D) visualization of neuroanatomy can be challenging for medical students. This knowledge is essential in order for students to correlate cross-sectional neuroanatomy and whole brain specimens within neuroscience curricula and to interpret clinical and radiological information as clinicians or researchers. This study implemented and evaluated a new tool for teaching 3D neuroanatomy to first-year medical students at Boston University School of Medicine. Students were randomized into experimental and control classrooms. All students were taught neuroanatomy according to traditional 2D methods. Then, during laboratory review, the experimental group constructed 3D color-coded physical models of the periventricular structures, while the control group re-examined 2D brain cross-sections. At the end of the course, 2D and 3D spatial relationships of the brain and preferred learning styles were assessed in both groups. The overall quiz scores for the experimental group were significantly higher than the control group (t(85) = 2.02, P < 0.05). However, when the questions were divided into those requiring either 2D or 3D visualization, only the scores for the 3D questions were significantly higher in the experimental group (F(1,85) = 5.48, P = 0.02). When surveyed, 84% of students recommended repeating the 3D activity for future laboratories, and this preference was equally distributed across preferred learning styles (χ² = 0.14, n.s.). Our results suggest that our 3D physical modeling activity is an effective method for teaching spatial relationships of brain anatomy and will better prepare students for visualization of 3D neuroanatomy, a skill essential for higher education in neuroscience, neurology, and neurosurgery. Copyright © 2010 American Association of Anatomists.

  18. A Novel Three-Dimensional Tool for Teaching Human Neuroanatomy

    PubMed Central

    Estevez, Maureen E.; Lindgren, Kristen A.; Bergethon, Peter R.

    2011-01-01

    Three-dimensional (3-D) visualization of neuroanatomy can be challenging for medical students. This knowledge is essential in order for students to correlate cross-sectional neuroanatomy and whole brain specimens within neuroscience curricula and to interpret clinical and radiological information as clinicians or researchers. This study implemented and evaluated a new tool for teaching 3-D neuroanatomy to first-year medical students at Boston University School of Medicine. Students were randomized into experimental and control classrooms. All students were taught neuroanatomy according to traditional 2-D methods. Then, during laboratory review, the experimental group constructed 3-D color-coded physical models of the periventricular structures, while the control group re-examined 2-D brain cross-sections. At the end of the course, 2-D and 3-D spatial relationships of the brain and preferred learning styles were assessed in both groups. The overall quiz scores for the experimental group were significantly higher than the control group (t(85) = 2.02, P < 0.05). However, when the questions were divided into those requiring either 2-D or 3-D visualization, only the scores for the 3-D questions were significantly higher in the experimental group (F1,85 = 5.48, P = 0.02). When surveyed, 84% of students recommended repeating the 3-D activity for future laboratories, and this preference was equally distributed across preferred learning styles (χ2 = 0.14, n.s.). Our results suggest that our 3-D physical modeling activity is an effective method for teaching spatial relationships of brain anatomy and will better prepare students for visualization of 3-D neuroanatomy, a skill essential for higher education in neuroscience, neurology, and neurosurgery. PMID:20939033

  19. In-silico wear prediction for knee replacements--methodology and corroboration.

    PubMed

    Strickland, M A; Taylor, M

    2009-07-22

    The capability to predict in-vivo wear of knee replacements is a valuable pre-clinical analysis tool for implant designers. Traditionally, time-consuming experimental tests provided the principal means of investigating wear. Today, computational models offer an alternative. However, the validity of these models has not been demonstrated across a range of designs and test conditions, and several different formulas are in contention for estimating wear rates, limiting confidence in the predictive power of these in-silico models. This study collates and retrospectively simulates a wide range of experimental wear tests using fast rigid-body computational models with extant wear prediction algorithms, to assess the performance of current in-silico wear prediction tools. The number of tests corroborated gives a broader, more general assessment of the performance of these wear-prediction tools, and provides better estimates of the wear 'constants' used in computational models. High-speed rigid-body modelling allows a range of alternative algorithms to be evaluated. Whilst most cross-shear (CS)-based models perform comparably, the 'A/A+B' wear model appears to offer the best predictive power amongst existing wear algorithms. However, the range and variability of experimental data leaves considerable uncertainty in the results. More experimental data with reduced variability and more detailed reporting of studies will be necessary to corroborate these models with greater confidence. With simulation times reduced to only a few minutes, these models are ideally suited to large-volume 'design of experiment' or probabilistic studies (which are essential if pre-clinical assessment tools are to begin addressing the degree of variation observed clinically and in explanted components).
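
    A minimal sketch of the Archard-style update at the heart of such in-silico wear tools, with a cross-shear (CS) dependent wear factor; the linear form of k(CS), the constants, and the per-node inputs are illustrative assumptions, not the 'A/A+B' formulation or the study's calibrated constants:

    ```python
    # Hedged sketch: incremental Archard-type wear with a cross-shear dependent
    # wear factor, applied node-by-node over one load increment of a gait cycle.
    import numpy as np

    def wear_increment(pressure, slip, cross_shear, k0=1e-10, k1=4e-10):
        """Wear depth increment [mm]: dh = k(CS) * p * |slip|.

        pressure    : contact pressure [MPa]
        slip        : sliding distance in the increment [mm]
        cross_shear : cross-shear ratio in [0, 1]
        """
        k = k0 + k1 * cross_shear        # assumed wear factor [mm^3 / (N mm)]
        return k * pressure * np.abs(slip)

    # Toy inputs for three contact nodes.
    p = np.array([5.0, 12.0, 20.0])      # MPa
    s = np.array([0.8, 0.5, 0.2])        # mm
    cs = np.array([0.05, 0.2, 0.4])
    print("Wear depth increments [mm]:", wear_increment(p, s, cs))
    ```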

  20. Modeling NIF experimental designs with adaptive mesh refinement and Lagrangian hydrodynamics

    NASA Astrophysics Data System (ADS)

    Koniges, A. E.; Anderson, R. W.; Wang, P.; Gunney, B. T. N.; Becker, R.; Eder, D. C.; MacGowan, B. J.; Schneider, M. B.

    2006-06-01

    Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow in our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. This newly developed modeling tool, as well as traditional ALE simulations in two and three dimensions, is applied to NIF early-light target designs.

  1. Advanced Simulation and Computing Business Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rummel, E.

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of the programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with the industry partners upon whom it relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  2. Battery Pack Thermal Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pesaran, Ahmad

    This presentation describes the thermal design of battery packs at the National Renewable Energy Laboratory. A battery thermal management system is essential for xEVs, both for normal operation during daily driving (achieving life and performance) and for off-normal operation during abuse conditions (achieving safety). The battery thermal management system needs to be optimized with the right tools for the lowest cost. Experimental tools such as NREL's isothermal battery calorimeter, thermal imaging, and heat transfer setups are needed. Thermal models and computer-aided engineering tools are useful for robust designs. During abuse conditions, designs should prevent cell-to-cell propagation in a module/pack (i.e., keep the fire small and manageable). NREL's battery ISC device can be used for evaluating the robustness of a module/pack to cell-to-cell propagation.

  3. Evolution of the human hand: approaches to acquiring, analysing and interpreting the anatomical evidence

    PubMed Central

    MARZKE, MARY W.; MARZKE, R. F.

    2000-01-01

    The discovery of fossil hand bones from an early human ancestor at Olduvai Gorge in 1960, at the same level as primitive stone tools, generated a debate about the role of tools in the evolution of the human hand that has raged to the present day. Could the Olduvai hand have made the tools? Did the human hand evolve as an adaptation to tool making and tool use? The debate has been fueled by anatomical studies comparing living and fossil human and nonhuman primate hands, and by experimental observations. These have assessed the relative abilities of apes and humans to manufacture the Oldowan tools, but consensus has been hampered by disagreements about how to translate experimental data from living species into quantitative models for predicting the performance of fossil hands. Such models are now beginning to take shape as new techniques are applied to the capture, management and analysis of data on kinetic and kinematic variables ranging from hand joint structure, muscle mechanics, and the distribution and density of bone to joint movements and muscle recruitment during manipulative behaviour. The systematic comparative studies are highlighting a functional complex of features in the human hand facilitating a distinctive repertoire of grips that are apparently more effective for stone tool making than grips characterising various nonhuman primate species. The new techniques are identifying skeletal variables whose form may provide clues to the potential of fossil hominid hands for one-handed firm precision grips and fine precision manoeuvering movements, both of which are essential for habitual and effective tool making and tool use. PMID:10999274

  4. Avanti lipid tools: connecting lipids, technology, and cell biology.

    PubMed

    Sims, Kacee H; Tytler, Ewan M; Tipton, John; Hill, Kasey L; Burgess, Stephen W; Shaw, Walter A

    2014-08-01

    Lipid research is challenging owing to the complexity and diversity of the lipidome. Here we review a set of experimental tools developed for the seasoned lipid researcher as well as those who are new to the field of lipid research. Novel tools for probing protein-lipid interactions, applications for lipid binding antibodies, enhanced systems for the cellular delivery of lipids, improved visualization of lipid membranes using gold-labeled lipids, and advances in mass spectrometric analysis techniques will be discussed. Because lipid mediators are known to participate in a host of signal transduction and trafficking pathways within the cell, a comprehensive lipid toolbox that aids the science of lipidomics research is essential to better understand the molecular mechanisms of interactions between cellular components. This article is part of a Special Issue entitled Tools to study lipid functions. Copyright © 2014. Published by Elsevier B.V.

  5. On use of ZPR research reactors and associated instrumentation and measurement methods for reactor physics studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chauvin, J.P.; Blaise, P.; Lyoussi, A.

    2015-07-01

    The French Alternative Energies and Atomic Energy Commission (CEA) is strongly involved in research and development programs concerning the use of nuclear energy as a clean and reliable source of energy, and consequently is working on the present and future generations of reactors on various topics such as ageing plant management, optimization of the plutonium stockpile, waste management and innovative systems exploration. Core physics studies are an essential part of this comprehensive R and D effort. In particular, the Zero Power Reactors (ZPR) of the CEA (EOLE, MINERVE and MASURCA) play an important role in the validation of neutron (as well as photon) physics calculation tools (codes and nuclear data). The experimental programs defined in the CEA's ZPR facilities aim at improving the calculation routes by reducing the uncertainties of the experimental databases. They also provide accurate data on innovative systems in terms of new materials (moderating and decoupling materials) and new concepts (ADS, ABWR, new MTR (e.g. JHR), GENIV) involving new fuels, absorbers and coolant materials. Conducting such experimental R and D programs is based on determining and measuring the main parameters of the phenomena of interest in order to qualify calculation tools and nuclear data 'libraries'. Determining these parameters relies on the use of numerous and different experimental techniques using specific and appropriate instrumentation and detection tools. The main ZPR experimental programs at the CEA, their objectives and challenges will be presented and discussed. Future developments and perspectives regarding ZPR reactors and associated programs will also be presented. (authors)

  6. A Modelling Method of Bolt Joints Based on Basic Characteristic Parameters of Joint Surfaces

    NASA Astrophysics Data System (ADS)

    Yuansheng, Li; Guangpeng, Zhang; Zhen, Zhang; Ping, Wang

    2018-02-01

    Bolt joints are common in machine tools and have a direct impact on the overall performance of the tools. Therefore, the understanding of bolt joint characteristics is essential for improving machine design and assembly. Firstly, the stiffness curve formula was fitted to the experimental data. Secondly, a finite element model of unit bolt joints such as bolt flange joints, bolt head joints, and thread joints was constructed, and lastly, the stiffness parameters of the joint surfaces were implemented in the model through secondary development of ABAQUS. The finite element model of the bolt joint established by this method can simulate the contact state very well.
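
    A minimal sketch of the kind of stiffness-curve fit this record refers to, assuming the commonly used power-law pressure-deflection relation for joint surfaces; the functional form, the data values, and the resulting coefficients are illustrative assumptions, not the paper's fitted formula:

    ```python
    # Hedged sketch: fit p = c * d**m to measured joint-surface data by linear
    # regression in log-log space, then take dp/dd as the normal contact stiffness.
    import numpy as np

    d = np.array([0.5, 1.0, 2.0, 4.0, 8.0]) * 1e-6    # relative deflection [m] (assumed)
    p = np.array([0.8, 1.9, 4.5, 10.5, 24.0]) * 1e6   # contact pressure [Pa] (assumed)

    m, log_c = np.polyfit(np.log(d), np.log(p), 1)    # slope = exponent m
    c = np.exp(log_c)
    stiffness = c * m * d ** (m - 1)                  # k(d) = dp/dd
    print(f"p = {c:.3e} * d^{m:.3f}; stiffness at d[0]: {stiffness[0]:.3e} Pa/m")
    ```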

  7. Interagency Transition Team Development and Facilitation. Essential Tools.

    ERIC Educational Resources Information Center

    Stodden, Robert A.; Brown, Steven E.; Galloway, L. M.; Mrazek, Susan; Noy, Liora

    2005-01-01

    The purpose of this Essential Tool is to assist state-level transition coordinators and others responsible for forming, conducting, and evaluating the performance of interagency transition teams that are focused upon the school and post-school needs of youth with disabilities. This Essential Tool is designed to guide the coordination efforts of…

  8. The experiment editor: supporting inquiry-based learning with virtual labs

    NASA Astrophysics Data System (ADS)

    Galan, D.; Heradio, R.; de la Torre, L.; Dormido, S.; Esquembre, F.

    2017-05-01

    Inquiry-based learning is a pedagogical approach where students are motivated to pose their own questions when facing problems or scenarios. In physics learning, students are turned into scientists who carry out experiments, collect and analyze data, formulate and evaluate hypotheses, and so on. Lab experimentation is essential for inquiry-based learning, yet there is a drawback with traditional hands-on labs in the high costs associated with equipment, space, and maintenance staff. Virtual laboratories are helpful to reduce these costs. This paper enriches the virtual lab ecosystem by providing an integrated environment to automate experimentation tasks. In particular, our environment supports: (i) scripting and running experiments on virtual labs, and (ii) collecting and analyzing data from the experiments. The current implementation of our environment supports virtual labs created with the authoring tool Easy Java/Javascript Simulations. Since there are public repositories with hundreds of freely available labs created with this tool, the potential applicability to our environment is considerable.

  9. In Vivo Hyperthermic Stress Model: An Easy Tool to Study the Effects of Oxidative Stress on Neuronal Tau Functionality in Mouse Brain.

    PubMed

    Chauderlier, Alban; Delattre, Lucie; Buée, Luc; Galas, Marie-Christine

    2017-01-01

    Oxidative damage is an early event in neurodegenerative disorders such as Alzheimer disease. Increasing oxidative stress in AD-related mouse models is essential for studying the early mechanisms involved in the physiopathology of these diseases. In this chapter, we describe an experimental mouse model of transient and acute hyperthermic stress that induces, in vivo, an increase of oxidative stress in the brain of any kind of wild-type or transgenic mouse.

  10. Designing and Interpreting Limiting Dilution Assays: General Principles and Applications to the Latent Reservoir for Human Immunodeficiency Virus-1.

    PubMed

    Rosenbloom, Daniel I S; Elliott, Oliver; Hill, Alison L; Henrich, Timothy J; Siliciano, Janet M; Siliciano, Robert F

    2015-12-01

    Limiting dilution assays are widely used in infectious disease research. These assays are crucial for current human immunodeficiency virus (HIV)-1 cure research in particular. In this study, we offer new tools to help investigators design and analyze dilution assays based on their specific research needs. Limiting dilution assays are commonly used to measure the extent of infection, and in the context of HIV they represent an essential tool for studying latency and potential curative strategies. Yet standard assay designs may not discern whether an intervention reduces an already miniscule latent infection. This review addresses challenges arising in this setting and in the general use of dilution assays. We illustrate the major statistical method for estimating frequency of infectious units from assay results, and we offer an online tool for computing this estimate. We recommend a procedure for customizing assay design to achieve desired sensitivity and precision goals, subject to experimental constraints. We consider experiments in which no viral outgrowth is observed and explain how using alternatives to viral outgrowth may make measurement of HIV latency more efficient. Finally, we discuss how biological complications, such as probabilistic growth of small infections, alter interpretations of experimental results.
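
    A minimal sketch of the single-hit Poisson maximum-likelihood estimate that underlies frequency estimation from limiting dilution data; the assay layout, well counts, and grid-search approach below are illustrative assumptions, not the authors' online tool:

    ```python
    # Hedged sketch: under the single-hit Poisson model, a well seeded with n
    # cells is negative with probability exp(-f * n); maximize the likelihood
    # of the observed positive/negative wells over the frequency f.
    import numpy as np

    cells_per_well = np.array([1e6, 2e5, 4e4, 8e3])   # input cells per well (assumed)
    wells = np.array([12, 12, 12, 12])                # wells tested per dilution
    positive = np.array([11, 6, 2, 0])                # wells with viral outgrowth

    def neg_log_likelihood(f):
        p_pos = 1.0 - np.exp(-f * cells_per_well)     # P(well positive) per dilution
        p_pos = np.clip(p_pos, 1e-12, 1 - 1e-12)
        return -np.sum(positive * np.log(p_pos) + (wells - positive) * np.log(1 - p_pos))

    # Simple grid search over plausible frequencies (infectious units per cell).
    grid = np.logspace(-8, -3, 2000)
    f_hat = grid[np.argmin([neg_log_likelihood(f) for f in grid])]
    print(f"Estimated frequency: {f_hat:.3g} per cell ({f_hat * 1e6:.2f} per million cells)")
    ```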

  11. A Genome-Scale Metabolic Reconstruction of Mycoplasma genitalium, iPS189

    PubMed Central

    Suthers, Patrick F.; Dasika, Madhukar S.; Kumar, Vinay Satish; Denisov, Gennady; Glass, John I.; Maranas, Costas D.

    2009-01-01

    With a genome size of ∼580 kb and approximately 480 protein coding regions, Mycoplasma genitalium is one of the smallest known self-replicating organisms and, additionally, has extremely fastidious nutrient requirements. The reduced genomic content of M. genitalium has led researchers to suggest that the molecular assembly contained in this organism may be a close approximation to the minimal set of genes required for bacterial growth. Here, we introduce a systematic approach for the construction and curation of a genome-scale in silico metabolic model for M. genitalium. Key challenges included estimation of biomass composition, handling of enzymes with broad specificities, and the lack of a defined medium. Computational tools were subsequently employed to identify and resolve connectivity gaps in the model as well as growth prediction inconsistencies with gene essentiality experimental data. The curated model, M. genitalium iPS189 (262 reactions, 274 metabolites), is 87% accurate in recapitulating in vivo gene essentiality results for M. genitalium. Approaches and tools described herein provide a roadmap for the automated construction of in silico metabolic models of other organisms. PMID:19214212
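
    For context, a minimal sketch of the flux balance calculation that genome-scale metabolic models such as iPS189 rely on for in silico growth and gene-essentiality predictions, posed as a linear program; the toy network, bounds, and objective are illustrative assumptions, not the published model:

    ```python
    # Hedged sketch: maximize a biomass flux subject to steady-state mass balance
    # S v = 0 and flux bounds; a gene knockout is simulated by forcing the fluxes
    # of its reactions to zero and re-solving.
    import numpy as np
    from scipy.optimize import linprog

    # Toy stoichiometric matrix: metabolites (rows) x reactions (columns).
    # Reactions: uptake (-> A), conversion (A -> B), biomass drain (B ->).
    S = np.array([[ 1, -1,  0],
                  [ 0,  1, -1]])
    bounds = [(0, 10), (0, 10), (0, 10)]     # flux bounds; knockout => (0, 0)
    c = np.array([0, 0, -1])                 # maximize biomass flux v3 (minimize -v3)

    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
    print("Optimal biomass flux:", -res.fun)
    ```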

  12. Form and function in the Lower Palaeolithic: history, progress, and continued relevance.

    PubMed

    Key, Alastair; Lycett, Stephen

    2017-12-30

    Percussively flaked stone artefacts constitute a major source of evidence relating to hominin behavioural strategies and are, essentially, a product or byproduct of a past individual's decision to create a tool with respect to some broader goal. Moreover, it has long been noted that both differences and recurrent regularities exist within and between Palaeolithic stone artefact forms. Accordingly, archaeologists have frequently drawn links between form and functionality, with functional objectives and performance often being regarded consequential to a stone tool's morphological properties. Despite these factors, extensive reviews of the related concepts of form and function with respect to the Lower Palaeolithic remain surprisingly sparse. We attempt to redress this issue. First we stress the historical place of form-function concepts, and their role in establishing basic ideas that echo to this day. We then highlight methodological and conceptual progress in determining artefactual function in more recent years. Thereafter, we evaluate four specific issues that are of direct consequence for evaluating the ongoing relevance of form-function concepts, especially with respect to their relevance for understanding human evolution more generally. Our discussion highlights specifically how recent developments have been able to build on a long historical legacy, and demonstrate that direct, indirect, experimental, and evolutionary perspectives intersect in crucial ways, with each providing specific but essential insights for ongoing questions. We conclude by emphasising that our understanding of these issues and their interaction, has been, and will be, essential to accurately interpret the Lower Palaeolithic archaeological record, tool-form related behaviours of Lower Palaeolithic hominins, and their consequences for (and relationship to) wider questions of human evolution.

  13. Prediction Of Tensile And Shear Strength Of Friction Surfaced Tool Steel Deposit By Using Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Manzoor Hussain, M.; Pitchi Raju, V.; Kandasamy, J.; Govardhan, D.

    2018-04-01

    Friction surfacing is a well-established solid-state technology used to deposit abrasion- and corrosion-protection coatings on rigid materials. This novel process has a wide range of industrial applications, particularly in the field of reclamation and repair of damaged and worn engineering components. In this paper, we present the prediction of the tensile and shear strength of friction-surfaced tool steel deposits using an ANN trained on simulated results of the friction surfacing process. The experiments were carried out to obtain tool steel coatings on low carbon steel parts by varying the main process parameters, essentially friction pressure, rotational speed and welding speed. The simulation is performed with a 3³ factorial design that takes into account the maximum and minimum limits of the experimental work performed with the 2³ factorial design. Neural network structures, such as the Feed Forward Neural Network (FFNN), were used to predict the tensile and shear strength of the tool steel deposits produced by friction surfacing.

  14. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path.

    PubMed

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the problem of task scheduling in the cloud computing system, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). This algorithm applies a predecessor-task layer priority strategy to solve the problem of constraint relations among task nodes. The strategy assigns different priority values to every task node based on the scheduling order of task node as affected by the constraint relations among task nodes, and the task node list is generated by the different priority value. To address the scheduling order problem in which task nodes have the same priority value, the dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation cost and communication cost of task node in the scheduling process. The task node that has the longest dynamic essential path is scheduled first as the completion time of task graph is indirectly influenced by the finishing time of task nodes in the longest dynamic essential path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the task Makespan in most cases and meet a high quality performance objective.
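
    A minimal sketch of the longest-downstream-path ranking that the dynamic essential path strategy described above is built on, for a toy task graph; the graph, the costs, and the recursion are illustrative assumptions, not the paper's DDEP implementation (which updates the costs dynamically during scheduling):

    ```python
    # Hedged sketch: rank ready tasks by the length of their longest downstream
    # path (computation plus communication costs); the longest-path task is
    # scheduled first when priorities tie.
    from functools import lru_cache

    comp = {"t1": 4, "t2": 3, "t3": 2, "t4": 5}                 # computation cost per task
    comm = {("t1", "t3"): 2, ("t2", "t3"): 1, ("t3", "t4"): 3}  # edge communication cost
    succ = {"t1": ["t3"], "t2": ["t3"], "t3": ["t4"], "t4": []}

    @lru_cache(maxsize=None)
    def essential_path(task):
        """Longest path length from `task` to an exit node (own cost included)."""
        tails = [comm[(task, s)] + essential_path(s) for s in succ[task]]
        return comp[task] + (max(tails) if tails else 0)

    ready = ["t1", "t2"]                              # tasks whose predecessors are done
    ready.sort(key=essential_path, reverse=True)      # longest essential path first
    print(ready, {t: essential_path(t) for t in ready})
    ```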

  15. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path

    PubMed Central

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the problem of task scheduling in the cloud computing system, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). This algorithm applies a predecessor-task layer priority strategy to solve the problem of constraint relations among task nodes. The strategy assigns different priority values to every task node based on the scheduling order of task node as affected by the constraint relations among task nodes, and the task node list is generated by the different priority value. To address the scheduling order problem in which task nodes have the same priority value, the dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation cost and communication cost of task node in the scheduling process. The task node that has the longest dynamic essential path is scheduled first as the completion time of task graph is indirectly influenced by the finishing time of task nodes in the longest dynamic essential path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the task Makespan in most cases and meet a high quality performance objective. PMID:27490901

  16. Parameter Estimation of the Thermal Network Model of a Machine Tool Spindle by Self-made Bluetooth Temperature Sensor Module

    PubMed Central

    Lo, Yuan-Chieh; Hu, Yuh-Chung; Chang, Pei-Zen

    2018-01-01

    Thermal characteristic analysis is essential for machine tool spindles because sudden failures may occur due to unexpected thermal issues. This article presents a lumped-parameter Thermal Network Model (TNM) and its parameter estimation scheme, including hardware and software, in order to characterize both the steady-state and transient thermal behavior of machine tool spindles. For the hardware, the authors develop a Bluetooth Temperature Sensor Module (BTSM) which is accompanied by three types of temperature-sensing probes (magnetic, screw, and probe). In experimental tests, its specifications achieve a precision of ±(0.1 + 0.0029|t|) °C, a resolution of 0.00489 °C, a power consumption of 7 mW, and a size of Ø40 mm × 27 mm. For the software, the heat transfer characteristics of the machine tool spindle as they correlate with rotating speed are derived based on the theory of heat transfer and empirical formulas. The predictive TNM of the spindle was developed by grey-box estimation and experimental results. Even under such complicated operating conditions as various speeds and different initial conditions, the experiments validate that the present modeling methodology provides a robust and reliable tool for temperature prediction, with a normalized mean square error of 99.5% agreement, and the present approach is transferable to other spindles with a similar structure. For realizing edge computing in smart manufacturing, a reduced-order TNM is constructed by the Model Order Reduction (MOR) technique and implemented into the real-time embedded system. PMID:29473877
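
    A minimal sketch of how a lumped-parameter thermal network of this kind is integrated in time, for a two-node (bearing/housing) toy example; the node layout, capacitances, conductances, and heat input are illustrative assumptions, not the parameters identified for the spindle in the paper:

    ```python
    # Hedged sketch: two-node thermal network, C_i dT_i/dt = Q_i - G_12 (T_i - T_j)
    # - G_env,i (T_i - T_amb), integrated with forward Euler.
    import numpy as np

    C = np.array([300.0, 900.0])          # thermal capacitances [J/K] (assumed)
    G12 = 5.0                             # bearing-housing conductance [W/K]
    G_env = np.array([1.0, 8.0])          # node-to-ambient conductances [W/K]
    Q = np.array([60.0, 0.0])             # heat generation [W] (bearing friction)
    T_amb = 25.0

    T = np.array([25.0, 25.0])            # initial temperatures [deg C]
    dt, t_end = 1.0, 3600.0
    for _ in range(int(t_end / dt)):
        flow_12 = G12 * (T[0] - T[1])
        dT = np.array([Q[0] - flow_12 - G_env[0] * (T[0] - T_amb),
                       Q[1] + flow_12 - G_env[1] * (T[1] - T_amb)]) / C
        T = T + dT * dt
    print("Temperatures after 1 h [deg C]:", T.round(2))
    ```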

  17. Parameter Estimation of the Thermal Network Model of a Machine Tool Spindle by Self-made Bluetooth Temperature Sensor Module.

    PubMed

    Lo, Yuan-Chieh; Hu, Yuh-Chung; Chang, Pei-Zen

    2018-02-23

    Thermal characteristic analysis is essential for machine tool spindles because sudden failures may occur due to unexpected thermal issues. This article presents a lumped-parameter Thermal Network Model (TNM) and its parameter estimation scheme, including hardware and software, in order to characterize both the steady-state and transient thermal behavior of machine tool spindles. For the hardware, the authors develop a Bluetooth Temperature Sensor Module (BTSM) which is accompanied by three types of temperature-sensing probes (magnetic, screw, and probe). In experimental tests, its specifications achieve a precision of ±(0.1 + 0.0029|t|) °C, a resolution of 0.00489 °C, a power consumption of 7 mW, and a size of Ø40 mm × 27 mm. For the software, the heat transfer characteristics of the machine tool spindle as they correlate with rotating speed are derived based on the theory of heat transfer and empirical formulas. The predictive TNM of the spindle was developed by grey-box estimation and experimental results. Even under such complicated operating conditions as various speeds and different initial conditions, the experiments validate that the present modeling methodology provides a robust and reliable tool for temperature prediction, with a normalized mean square error of 99.5% agreement, and the present approach is transferable to other spindles with a similar structure. For realizing edge computing in smart manufacturing, a reduced-order TNM is constructed by the Model Order Reduction (MOR) technique and implemented into the real-time embedded system.

  18. Development and Validation of a Rubric for Diagnosing Students’ Experimental Design Knowledge and Difficulties

    PubMed Central

    Dasgupta, Annwesa P.; Anderson, Trevor R.

    2014-01-01

    It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students’ responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students’ experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. PMID:26086658

  19. Entanglement and Wigner Function Negativity of Multimode Non-Gaussian States

    NASA Astrophysics Data System (ADS)

    Walschaers, Mattia; Fabre, Claude; Parigi, Valentina; Treps, Nicolas

    2017-11-01

    Non-Gaussian operations are essential to exploit the quantum advantages in optical continuous variable quantum information protocols. We focus on mode-selective photon addition and subtraction as experimentally promising processes to create multimode non-Gaussian states. Our approach is based on correlation functions, as is common in quantum statistical mechanics and condensed matter physics, mixed with quantum optics tools. We formulate an analytical expression of the Wigner function after the subtraction or addition of a single photon, for arbitrarily many modes. It is used to demonstrate entanglement properties specific to non-Gaussian states and also leads to a practical and elegant condition for Wigner function negativity. Finally, we analyze the potential of photon addition and subtraction for an experimentally generated multimode Gaussian state.
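
    As a single-mode point of reference for the negativity discussed in this record (a textbook expression, not the paper's multimode result), the Wigner function of the one-photon Fock state, in the convention where the vacuum is W₀(x,p) = (1/π)e^{−(x²+p²)}, reads:

    ```latex
    W_{|1\rangle}(x,p) \;=\; \frac{1}{\pi}\,\bigl(2(x^{2}+p^{2})-1\bigr)\,e^{-(x^{2}+p^{2})},
    \qquad
    W_{|1\rangle}(0,0) \;=\; -\frac{1}{\pi} \;<\; 0 .
    ```

    The negative value at the origin is the simplest instance of the Wigner negativity that mode-selective photon addition and subtraction aim to produce in the multimode setting.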

  20. Entanglement and Wigner Function Negativity of Multimode Non-Gaussian States.

    PubMed

    Walschaers, Mattia; Fabre, Claude; Parigi, Valentina; Treps, Nicolas

    2017-11-03

    Non-Gaussian operations are essential to exploit the quantum advantages in optical continuous variable quantum information protocols. We focus on mode-selective photon addition and subtraction as experimentally promising processes to create multimode non-Gaussian states. Our approach is based on correlation functions, as is common in quantum statistical mechanics and condensed matter physics, mixed with quantum optics tools. We formulate an analytical expression of the Wigner function after the subtraction or addition of a single photon, for arbitrarily many modes. It is used to demonstrate entanglement properties specific to non-Gaussian states and also leads to a practical and elegant condition for Wigner function negativity. Finally, we analyze the potential of photon addition and subtraction for an experimentally generated multimode Gaussian state.

  1. Study on Platinum Coating Depth in Focused Ion Beam Diamond Cutting Tool Milling and Methods for Removing Platinum Layer.

    PubMed

    Choi, Woong Kirl; Baek, Seung Yub

    2015-09-22

    In recent years, nanomachining has attracted increasing attention in advanced manufacturing science and technology as a value-added process to control material structures, components, devices, and nanoscale systems. To make sub-micro patterns on these products, micro/nanoscale single-crystal diamond cutting tools are essential. Popular non-contact methods for the macro/micro processing of diamond composites are pulsed laser ablation (PLA) and electric discharge machining (EDM). However, for manufacturing nanoscale diamond tools, these machining methods are not appropriate. Despite diamond's extreme physical properties, diamond can be micro/nano machined relatively easily using a focused ion beam (FIB) technique. In the FIB milling process, the surface properties of the diamond cutting tool are affected by the amorphous damage layer caused by FIB gallium ion collision and implantation, which degrades the cutting edge sharpness and adds processing procedures. To protect the diamond substrate, a protective platinum (Pt) coating layer is essential in diamond FIB milling. In this study, the depth of the Pt coating layer that could decrease process-induced damage during FIB fabrication is investigated, along with methods for removing the Pt coating layer from diamond tools. The optimum Pt coating depth has been confirmed, which is very important for maintaining cutting tool edge sharpness and reducing processing procedures. The ultra-precision grinding method and etching with aqua regia have been investigated for removing the Pt coating layer. Experimental results show that when the diamond cutting tool width is larger than 500 nm, the ultra-precision grinding method is appropriate for removing the Pt coating layer on the diamond tool. However, the ultra-precision grinding method is not recommended for removing the Pt coating layer when the cutting tool width is smaller than 500 nm, because the possibility that the diamond cutting tool is damaged by the grinding process increases. Although the etching method requires more procedures to remove the Pt coating layer after FIB milling, it is a feasible method for diamond tools with widths under 500 nm.

  2. Commodities Trading: An Essential Economic Tool.

    ERIC Educational Resources Information Center

    Welch, Mary A., Ed.

    1989-01-01

    This issue focuses on commodities trading as an essential economic tool. Activities include critical thinking about marketing decisions and discussion on how futures markets and options are used as important economic tools. Discussion questions and a special student project are included. (EH)

  3. Research on Hygiene Based on Fieldwork and Experimental Studies.

    PubMed

    Yajima, Ichiro

    2017-01-01

    Several experimental studies on hygiene have recently been performed and fieldwork studies are also important and essential tools. However, the implementation of experimental studies is insufficient compared with that of fieldwork studies on hygiene. Here, we show our well-balanced implementation of both fieldwork and experimental studies of toxic-element-mediated diseases including skin cancer and hearing loss. Since the pollution of drinking well water by toxic elements induces various diseases including skin cancer, we performed both fieldwork and experimental studies to determine the levels of toxic elements and the mechanisms behind the development of toxic-element-related diseases and to develop a novel remediation system. Our fieldwork studies in several countries including Bangladesh, Vietnam and Malaysia demonstrated that drinking well water was polluted with high concentrations of several toxic elements including arsenic, barium, iron and manganese. Our experimental studies using the data from our fieldwork studies demonstrated that these toxic elements caused skin cancer and hearing loss. Further experimental studies resulted in the development of a novel remediation system that adsorbs toxic elements from polluted drinking water. A well-balanced implementation of both fieldwork and experimental studies is important for the prediction, prevention and therapy of toxic-element-mediated diseases.

  4. Modeling Viral Spread

    PubMed Central

    Graw, Frederik; Perelson, Alan S.

    2016-01-01

    The way in which a viral infection spreads within a host is a complex process that is not well understood. Different viruses, such as human immunodeficiency virus type 1 and hepatitis C virus, have evolved different strategies, including direct cell-to-cell transmission and cell-free transmission, to spread within a host. To what extent these two modes of transmission are exploited in vivo is still unknown. Mathematical modeling has been an essential tool to get a better systematic and quantitative understanding of viral processes that are difficult to discern through strictly experimental approaches. In this review, we discuss recent attempts that combine experimental data and mathematical modeling in order to determine and quantify viral transmission modes. We also discuss the current challenges for a systems-level understanding of viral spread, and we highlight the promises and challenges that novel experimental techniques and data will bring to the field. PMID:27618637
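
    For readers unfamiliar with the modeling approach being reviewed, the following is a minimal sketch of a target-cell-limited model extended with a cell-to-cell transmission term; this model structure is a common choice in the literature, and all parameter names and values here are hypothetical.

    ```python
    # Hypothetical sketch of a target-cell-limited model with two transmission
    # routes: cell-free infection (beta * T * V) and cell-to-cell spread
    # (gamma * T * I). Parameter values are illustrative only.
    import numpy as np
    from scipy.integrate import odeint

    def model(y, t, beta, gamma, p, c, delta):
        T, I, V = y                      # target cells, infected cells, free virus
        dT = -beta * T * V - gamma * T * I
        dI = beta * T * V + gamma * T * I - delta * I
        dV = p * I - c * V
        return [dT, dI, dV]

    t = np.linspace(0, 20, 2000)                       # days
    y0 = [1e7, 0.0, 10.0]                              # initial T, I, V
    sol = odeint(model, y0, t, args=(1e-8, 1e-7, 100.0, 5.0, 0.5))
    print("peak viral load:", sol[:, 2].max())
    ```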

  5. Selective laser melting in heat exchanger development - experimental investigation of heat transfer and pressure drop characteristics of wavy fins

    NASA Astrophysics Data System (ADS)

    Kuehndel, J.; Kerler, B.; Karcher, C.

    2018-04-01

    To improve the performance of heat exchangers for vehicle applications, it is necessary to increase the air-side heat transfer. Selective laser melting lends itself to fin development because of: i) independence from conventional tooling, ii) a fast way to conduct essential experimental studies, iii) high dimensional accuracy, and iv) freedom in design. Therefore, heat exchanger elements with wavy fins were examined in an experimental study. Experiments were conducted for an air-side Reynolds number range of 1400-7400, varying the wave amplitude and wavelength of the fins at a constant water flow rate of 9.0 m3/h. Heat transfer and pressure drop characteristics were evaluated with the Nusselt number Nu and Darcy friction factor ψ as functions of Reynolds number. Heat transfer and pressure drop correlations were derived from the measurement data by regression analysis.
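
    As a sketch of the regression step mentioned above (not the study's actual data or correlation), a power-law fit Nu = C·Re^m can be obtained in log-log space:

    ```python
    # Hedged sketch: fit a power-law correlation Nu = C * Re**m by linear
    # regression in log-log space. The data points below are made up for
    # illustration; they are not the measurements from the study.
    import numpy as np

    Re = np.array([1400, 2500, 3600, 4800, 6100, 7400], dtype=float)
    Nu = np.array([14.0, 21.5, 28.2, 34.9, 41.0, 47.3])   # hypothetical values

    m, logC = np.polyfit(np.log(Re), np.log(Nu), 1)
    C = np.exp(logC)
    print(f"Nu ~ {C:.3f} * Re^{m:.3f}")

    # The Darcy friction factor correlation psi(Re) can be fitted the same way.
    ```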

  6. Validation of an automated mite counter for Dermanyssus gallinae in experimental laying hen cages.

    PubMed

    Mul, Monique F; van Riel, Johan W; Meerburg, Bastiaan G; Dicke, Marcel; George, David R; Groot Koerkamp, Peter W G

    2015-08-01

    For integrated pest management (IPM) programs to be maximally effective, monitoring of the growth and decline of the pest populations is essential. Here, we present the validation results of a new automated monitoring device for the poultry red mite (Dermanyssus gallinae), a serious pest in laying hen facilities world-wide. This monitoring device (called an "automated mite counter") was validated in experimental laying hen cages with live birds and a growing population of D. gallinae. This validation study resulted in 17 data points of 'number of mites counted' by the automated mite counter and the 'number of mites present' in the experimental laying hen cages. The study demonstrated that the automated mite counter was able to track the D. gallinae population effectively. A wider evaluation showed that this automated mite counter can become a useful tool in IPM of D. gallinae in laying hen facilities.

  7. How to Monitor the Breathing of Laboratory Rodents: A Review of the Current Methods.

    PubMed

    Grimaud, Julien; Murthy, Venkatesh N

    2018-05-23

    Accurately measuring respiration in laboratory rodents is essential for many fields of research, including olfactory neuroscience, social behavior, learning and memory, and respiratory physiology. However, choosing the right technique to monitor respiration can be tricky, given the many criteria to take into account: reliability, precision, and invasiveness, to name a few. This review aims to assist experimenters in choosing the technique that will best fit their needs, by surveying the available tools, discussing their strengths and weaknesses, and offering suggestions for future improvements.

  8. Future directions in technology development - Increased use of space as a facility

    NASA Technical Reports Server (NTRS)

    Ambrus, Judith H.; Harris, Leonard A.; Levine, Jack; Tyson, Richard W.

    1988-01-01

    As human activities in space continue to grow in size and scope, the role of in-space technology experiments, as a necessary tool for essential technological development, will also grow. NASA has recognized the increasing importance of such experiments, and has instituted programs to plan, organize, and coordinate future in-space technology experiment activities within the overall space community. This paper discusses the history of in-space technology experiments, and expected future trends. It also describes NASA activities in this growing area of experimentation, and provides several examples of such experiments.

  9. Invited review article: the electrostatic plasma lens.

    PubMed

    Goncharov, Alexey

    2013-02-01

    The fundamental principles, experimental results, and potential applications of the electrostatic plasma lens for focusing and manipulating high-current, energetic, heavy ion beams are reviewed. First described almost 50 years ago, this optical beam device provides space charge neutralization of the ion beam within the lens volume, and thus provides an effective and unique tool for focusing high-current beams where a high degree of neutralization is essential to prevent beam blow-up. Short and long lenses have been explored, and a lens in which the magnetic field is provided by rare-earth permanent magnets has been demonstrated. Applications include the use of this kind of optical tool for laboratory ion beam manipulation, high-dose ion implantation, heavy ion accelerator injection, heavy ion fusion, and other high-technology areas.

  10. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    PubMed Central

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
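
    The pooling procedure named in the abstract can be illustrated with a short sketch; the effect sizes and variances below are invented, and this is not Meta-Essentials code, only the DerSimonian-Laird estimate of between-study variance with a Knapp-Hartung adjusted confidence interval.

    ```python
    # Sketch of random-effects pooling: DerSimonian-Laird estimate of tau^2
    # with a Knapp-Hartung adjusted confidence interval. All inputs invented.
    import numpy as np
    from scipy import stats

    yi = np.array([0.30, 0.12, 0.45, 0.26, 0.05])   # study effect sizes (hypothetical)
    vi = np.array([0.04, 0.03, 0.06, 0.02, 0.05])   # within-study variances (hypothetical)

    w = 1.0 / vi
    ybar = np.sum(w * yi) / np.sum(w)
    Q = np.sum(w * (yi - ybar) ** 2)
    k = len(yi)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))  # DerSimonian-Laird

    wstar = 1.0 / (vi + tau2)
    mu = np.sum(wstar * yi) / np.sum(wstar)

    # Knapp-Hartung variance and t-based confidence interval
    var_kh = np.sum(wstar * (yi - mu) ** 2) / ((k - 1) * np.sum(wstar))
    half = stats.t.ppf(0.975, k - 1) * np.sqrt(var_kh)
    print(f"pooled effect = {mu:.3f}, 95% CI = [{mu - half:.3f}, {mu + half:.3f}]")
    ```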

  11. The magic triangle goes MAD: experimental phasing with a bromine derivative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, Tobias, E-mail: tbeck@shelx.uni-ac.gwdg.de; Gruene, Tim; Sheldrick, George M.

    2010-04-01

    5-Amino-2,4,6-tribromoisophthalic acid is used as a phasing tool for protein structure determination by MAD phasing. It is the second representative of a novel class of compounds for heavy-atom derivatization that combine heavy atoms with amino and carboxyl groups for binding to proteins. Experimental phasing is an essential technique for the solution of macromolecular structures. Since many heavy-atom ion soaks suffer from nonspecific binding, a novel class of compounds has been developed that combines heavy atoms with functional groups for binding to proteins. The phasing tool 5-amino-2,4,6-tribromoisophthalic acid (B3C) contains three functional groups (two carboxylate groups and one amino group) that interact with proteins via hydrogen bonds. Three Br atoms suitable for anomalous dispersion phasing are arranged in an equilateral triangle and are thus readily identified in the heavy-atom substructure. B3C was incorporated into proteinase K and a multiwavelength anomalous dispersion (MAD) experiment at the Br K edge was successfully carried out. Radiation damage to the bromine–carbon bond was investigated. A comparison with the phasing tool I3C that contains three I atoms for single-wavelength anomalous dispersion (SAD) phasing was also carried out.

  12. OVERVIEW OF NEUTRON MEASUREMENTS IN JET FUSION DEVICE.

    PubMed

    Batistoni, P; Villari, R; Obryk, B; Packer, L W; Stamatelatos, I E; Popovichev, S; Colangeli, A; Colling, B; Fonnesu, N; Loreti, S; Klix, A; Klosowski, M; Malik, K; Naish, J; Pillon, M; Vasilopoulou, T; De Felice, P; Pimpinella, M; Quintieri, L

    2017-10-05

    The design and operation of the ITER experimental fusion reactor require the development of neutron measurement techniques and numerical tools to derive the fusion power and the radiation field in the device and in the surrounding areas. Nuclear analyses provide essential input to the conceptual design, optimisation, engineering and safety case in ITER and power plant studies. The required radiation transport calculations are extremely challenging because of the large physical extent of the reactor plant, the complexity of the geometry, and the combination of deep penetration and streaming paths. This article reports the experimental activities carried out at JET to validate the neutronics measurement methods and numerical tools used in ITER and power plant design. A new deuterium-tritium campaign is proposed in 2019 at JET: the unique 14 MeV neutron yields produced will be exploited as much as possible to validate measurement techniques, codes, procedures and data currently used in ITER design, thus reducing the related uncertainties and the associated risks in machine operation. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  13. Analysis of tablet compaction. I. Characterization of mechanical behavior of powder and powder/tooling friction.

    PubMed

    Cunningham, J C; Sinka, I C; Zavaliangos, A

    2004-08-01

    In this first of two articles on the modeling of tablet compaction, the experimental inputs related to the constitutive model of the powder and the powder/tooling friction are determined. The continuum-based analysis of tableting makes use of an elasto-plastic model, which incorporates the elements of yield, plastic flow potential, and hardening, to describe the mechanical behavior of microcrystalline cellulose over the range of densities experienced during tableting. Specifically, a modified Drucker-Prager/cap plasticity model, which includes material parameters such as cohesion, internal friction, and hydrostatic yield pressure that evolve with the internal state variable relative density, was applied. Linear elasticity is assumed with the elastic parameters, Young's modulus, and Poisson's ratio dependent on the relative density. The calibration techniques were developed based on a series of simple mechanical tests including diametrical compression, simple compression, and die compaction using an instrumented die. The friction behavior is measured using an instrumented die and the experimental data are analyzed using the method of differential slices. The constitutive model and frictional properties are essential experimental inputs to the finite element-based model described in the companion article. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association J Pharm Sci 93:2022-2039, 2004
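
    As a hedged illustration of the shear-failure branch of a Drucker-Prager/cap model (the cap surface and the actual calibrated parameters from the paper are omitted), the yield function can be written as F_s = q - p·tan β(RD) - d(RD), with cohesion and friction angle depending on relative density; the hardening laws below are placeholders.

    ```python
    # Hedged sketch of the shear-failure branch of a Drucker-Prager/cap model:
    #   F_s(p, q) = q - p * tan(beta(RD)) - d(RD)
    # Cohesion d and internal friction angle beta are given as illustrative
    # functions of relative density RD; the real calibration comes from the
    # diametrical/simple-compression and instrumented-die tests described above.
    import numpy as np

    def cohesion(rd):                 # MPa, hypothetical hardening law
        return 0.1 + 5.0 * rd ** 3

    def friction_angle(rd):           # degrees, hypothetical
        return 50.0 + 20.0 * rd

    def shear_yield(p, q, rd):
        beta = np.radians(friction_angle(rd))
        return q - p * np.tan(beta) - cohesion(rd)   # <= 0: elastic or on the surface

    p, q, rd = 10.0, 25.0, 0.8        # hydrostatic pressure, Mises stress, relative density
    print("yield function value:", shear_yield(p, q, rd))
    ```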

  14. Surgical tool detection and tracking in retinal microsurgery

    NASA Astrophysics Data System (ADS)

    Alsheakhali, Mohamed; Yigitsoy, Mehmet; Eslami, Abouzar; Navab, Nassir

    2015-03-01

    Visual tracking of surgical instruments is an essential part of eye surgery and plays an important role for the surgeons, as well as being a key component of robotic assistance during the operation. The difficulty of detecting and tracking medical instruments in in-vivo images comes from their deformable shape, changes in brightness, and the presence of the instrument shadow. This paper introduces a new approach to detect the tip of a surgical tool and its width regardless of its head shape and the presence of shadows or vessels. The approach relies on integrating structural information about the strong edges from the RGB color model and tool location-based information from the L*a*b color model. The probabilistic Hough transform is applied to obtain the strongest straight lines in the RGB images, and based on information from the L* and a* channels, one of these candidate lines is selected as the edge of the tool shaft. Based on that line, the tool slope, the tool centerline and the tool tip can be detected. Tracking is performed by keeping track of the last detected tool tip and tool slope, and filtering the Hough lines within a box around the last detected tool tip based on slope differences. Experimental results demonstrate high accuracy in terms of detecting the tool tip position, the tool joint point position, and the tool centerline. The approach also meets real-time requirements.
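
    A hedged sketch of this kind of pipeline is given below; the synthetic frame, thresholds and the "longest line" selection rule are placeholders rather than the paper's actual values, but the OpenCV calls (Canny edges, probabilistic Hough transform, L*a*b* conversion) correspond to the operations described.

    ```python
    # Hedged sketch: strong edges from the RGB frame, candidate lines from the
    # probabilistic Hough transform, and an L*a*b* conversion available for
    # colour-based filtering of candidates. The frame is synthetic and the
    # selection rule is a placeholder, not the paper's method.
    import cv2
    import numpy as np

    frame = np.zeros((480, 640, 3), np.uint8)
    cv2.line(frame, (100, 400), (500, 100), (200, 200, 200), 5)   # stand-in tool shaft

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)      # L*, a*, b* channels for filtering

    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=60, maxLineGap=10)

    if lines is not None:
        # Placeholder selection: pick the longest candidate as the shaft edge.
        x1, y1, x2, y2 = max(lines[:, 0, :],
                             key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
        slope = (y2 - y1) / (x2 - x1 + 1e-9)
        print("candidate shaft edge:", (x1, y1, x2, y2), "slope:", slope)
    ```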

  15. Experimental Raman adiabatic transfer of optical states in rubidium

    NASA Astrophysics Data System (ADS)

    Appel, Jürgen; Figueroa, Eden; Vewinger, Frank; Marzlin, Karl-Peter; Lvovsky, Alexander

    2007-06-01

    An essential element of a quantum optical communication network is a tool for transferring and/or distributing quantum information between optical modes (possibly of different frequencies) in a loss- and decoherence-free fashion. We present a theory [1] and an experimental demonstration [2] of a protocol for routing and frequency conversion of optical quantum information via electromagnetically-induced transparency in an atomic system with multiple excited levels. Transfer of optical states between different signal modes is implemented by adiabatically changing the control fields. The proof-of-principle experiment is performed using the hyperfine levels of the rubidium D1 line. [1] F. Vewinger, J. Appel, E. Figueroa, A. I. Lvovsky, quant-ph/0611181 [2] J. Appel, K.-P. Marzlin, A. I. Lvovsky, Phys. Rev. A 73, 013804 (2006)

  16. How to Train a Cell - Cutting-Edge Molecular Tools

    NASA Astrophysics Data System (ADS)

    Czapiński, Jakub; Kiełbus, Michał; Kałafut, Joanna; Kos, Michał; Stepulak, Andrzej; Rivero-Müller, Adolfo

    2017-03-01

    In biological systems, the formation of molecular complexes is the currency for all cellular processes. Traditionally, functional experimentation was targeted at single molecular players in order to understand their effects on a cell or animal phenotype. In the last few years, we have been experiencing rapid progress in the development of ground-breaking molecular biology tools that affect the metabolic, structural, morphological, and (epi)genetic instructions of cells by chemical, optical (optogenetic) and mechanical inputs. Such precise dissection of cellular processes is not only essential for a better understanding of biological systems, but will also allow us to better diagnose and fix common dysfunctions. Here, we present several of these emerging and innovative techniques by providing the reader with elegant examples of how these tools have been implemented in cells, and, in some cases, organisms, to unravel molecular processes in minute detail. We also discuss their advantages and disadvantages with particular focus on their translation to multicellular organisms for in vivo spatiotemporal regulation. We envision that further developments of these tools will not only help solve the processes of life, but will give rise to novel clinical and industrial applications.

  17. Experimental characterization of a quantum many-body system via higher-order correlations.

    PubMed

    Schweigler, Thomas; Kasper, Valentin; Erne, Sebastian; Mazets, Igor; Rauer, Bernhard; Cataldini, Federica; Langen, Tim; Gasenzer, Thomas; Berges, Jürgen; Schmiedmayer, Jörg

    2017-05-17

    Quantum systems can be characterized by their correlations. Higher-order (larger than second order) correlations, and the ways in which they can be decomposed into correlations of lower order, provide important information about the system, its structure, its interactions and its complexity. The measurement of such correlation functions is therefore an essential tool for reading, verifying and characterizing quantum simulations. Although higher-order correlation functions are frequently used in theoretical calculations, so far mainly correlations up to second order have been studied experimentally. Here we study a pair of tunnel-coupled one-dimensional atomic superfluids and characterize the corresponding quantum many-body problem by measuring correlation functions. We extract phase correlation functions up to tenth order from interference patterns and analyse whether, and under what conditions, these functions factorize into correlations of lower order. This analysis characterizes the essential features of our system, the relevant quasiparticles, their interactions and topologically distinct vacua. From our data we conclude that in thermal equilibrium our system can be seen as a quantum simulator of the sine-Gordon model, relevant for diverse disciplines ranging from particle physics to condensed matter. The measurement and evaluation of higher-order correlation functions can easily be generalized to other systems and to study correlations of any other observable such as density, spin and magnetization. It therefore represents a general method for analysing quantum many-body systems from experimental data.
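
    As a numerical illustration of the factorization test described (not the experiment's analysis code), the sketch below checks Wick factorization of a fourth-order correlator for simulated Gaussian "phase" data, where the connected part should vanish.

    ```python
    # Hedged illustration: for Gaussian fluctuations, the connected part of a
    # fourth-order correlation vanishes, i.e. <abcd> factorizes into products of
    # second-order correlators (Wick's theorem). Data are simulated, not the
    # experimental phase profiles.
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_points = 20000, 4
    L = np.tril(rng.normal(size=(n_points, n_points)))   # arbitrary covariance factor
    phi = rng.normal(size=(n_samples, n_points)) @ L.T   # correlated Gaussian "phases"

    a, b, c, d = phi.T
    G4 = np.mean(a * b * c * d)
    wick = (np.mean(a * b) * np.mean(c * d)
            + np.mean(a * c) * np.mean(b * d)
            + np.mean(a * d) * np.mean(b * c))
    print("G4 =", G4, " Wick factorization =", wick, " connected part ~", G4 - wick)
    ```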

  18. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.

  19. Requirements for clinical information modelling tools.

    PubMed

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling, and if we one day have a certified tools list, any tool that does not meet essential criteria would be excluded. Recommended requirements are those more advanced requirements that may be met by tools offering a superior product or only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a high level of agreement to enable the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support by end users. This list could also guide regulators in order to identify requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. Numerical analysis of thermal drilling technique on titanium sheet metal

    NASA Astrophysics Data System (ADS)

    Kumar, R.; Hynes, N. Rajesh Jesudoss

    2018-05-01

    Thermal drilling is a technique used for drilling sheet metal in various applications. It involves rotating a conical tool at high speed to penetrate the sheet metal and form a hole with a bushing below the surface of the sheet. This article investigates the finite element analysis of thermal drilling of Ti6Al4V alloy sheet metal. The analysis was carried out by means of the DEFORM-3D simulation software to simulate the performance characteristics of the thermal drilling technique. Because high-temperature deformation plays a large role in this technique, output characteristics that are difficult to measure experimentally can be obtained successfully by the finite element method. Therefore, the modeling and simulation of thermal drilling is an essential tool to predict the strain rate, stress distribution and temperature of the workpiece.

  1. VeriClick: an efficient tool for table format verification

    NASA Astrophysics Data System (ADS)

    Nagy, George; Tamhankar, Mangesh

    2012-01-01

    The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.

  2. Protein-protein interaction predictions using text mining methods.

    PubMed

    Papanikolaou, Nikolas; Pavlopoulos, Georgios A; Theodosiou, Theodosios; Iliopoulos, Ioannis

    2015-03-01

    It is beyond any doubt that proteins and their interactions play an essential role in most complex biological processes. The understanding of their function individually, but also in the form of protein complexes is of a great importance. Nowadays, despite the plethora of various high-throughput experimental approaches for detecting protein-protein interactions, many computational methods aiming to predict new interactions have appeared and gained interest. In this review, we focus on text-mining based computational methodologies, aiming to extract information for proteins and their interactions from public repositories such as literature and various biological databases. We discuss their strengths, their weaknesses and how they complement existing experimental techniques by simultaneously commenting on the biological databases which hold such information and the benchmark datasets that can be used for evaluating new tools. Copyright © 2014 Elsevier Inc. All rights reserved.
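
    The simplest text-mining strategy covered by such reviews, sentence-level co-occurrence of protein names, can be sketched in a few lines; the protein names and sentences are invented.

    ```python
    # Toy illustration of sentence-level co-occurrence of protein names as weak
    # evidence for an interaction. Names and sentences are invented; real systems
    # add named-entity recognition, normalization, and relation extraction.
    import itertools
    from collections import Counter

    proteins = {"TP53", "MDM2", "BRCA1", "RAD51"}
    sentences = [
        "MDM2 binds TP53 and promotes its degradation.",
        "BRCA1 interacts with RAD51 during homologous recombination.",
        "TP53 expression was measured after treatment.",
    ]

    pairs = Counter()
    for s in sentences:
        found = sorted({p for p in proteins if p in s})
        for a, b in itertools.combinations(found, 2):
            pairs[(a, b)] += 1

    print(pairs)   # e.g. {('MDM2', 'TP53'): 1, ('BRCA1', 'RAD51'): 1}
    ```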

  3. An automated benchmarking platform for MHC class II binding prediction methods.

    PubMed

    Andreatta, Massimo; Trolle, Thomas; Yan, Zhen; Greenbaum, Jason A; Peters, Bjoern; Nielsen, Morten

    2018-05-01

    Computational methods for the prediction of peptide-MHC binding have become an integral and essential component for candidate selection in experimental T cell epitope discovery studies. The sheer number of published prediction methods, and the often discordant reports on their performance, poses a considerable quandary to the experimentalist who needs to choose the best tool for their research. With the goal of providing an unbiased, transparent evaluation of the state-of-the-art in the field, we created an automated platform to benchmark peptide-MHC class II binding prediction tools. The platform evaluates the absolute and relative predictive performance of all participating tools on data newly entered into the Immune Epitope Database (IEDB) before they are made public, thereby providing a frequent, unbiased assessment of available prediction tools. The benchmark runs on a weekly basis, is fully automated, and displays up-to-date results on a publicly accessible website. The initial benchmark described here included six commonly used prediction servers, but other tools are encouraged to join with a simple sign-up procedure. Performance evaluation on 59 data sets composed of over 10 000 binding affinity measurements suggested that NetMHCIIpan is currently the most accurate tool, followed by NN-align and the IEDB consensus method. Weekly reports on the participating methods can be found online at: http://tools.iedb.org/auto_bench/mhcii/weekly/. mniel@bioinformatics.dtu.dk. Supplementary data are available at Bioinformatics online.

  4. Experimental metagenomics and ribosomal profiling of the human skin microbiome.

    PubMed

    Ferretti, Pamela; Farina, Stefania; Cristofolini, Mario; Girolomoni, Giampiero; Tett, Adrian; Segata, Nicola

    2017-03-01

    The skin is the largest organ in the human body, and it is populated by a large diversity of microbes, most of which have co-evolved with the host and live in symbiotic harmony. There is increasing evidence that the skin microbiome plays a crucial role in the defense against pathogens, immune system training and homoeostasis, and microbiome perturbations have been associated with pathological skin conditions. Studying the skin resident microbial community is thus essential to better understand the microbiome-host crosstalk and to associate its specific configurations with cutaneous diseases. Several community profiling approaches have proved successful in unravelling the composition of the skin microbiome and in overcoming the limitations of cultivation-based assays, but these tools remain largely inaccessible to the clinical and medical dermatology communities. The study of the skin microbiome is also characterized by specific technical challenges, such as the low amount of microbial biomass and the extensive human DNA contamination. Here, we review the available community profiling approaches to study the skin microbiome, specifically focusing on the practical experimental and analytical tools necessary to generate and analyse skin microbiome data. We describe all the steps from initial sample collection to the final data interpretation, with the goal of enabling clinicians and researchers who are not familiar with the microbiome field to perform skin profiling experiments. © 2016 The Authors. Experimental Dermatology Published by John Wiley & Sons Ltd.

  5. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  6. ITK: enabling reproducible research and open science.

    PubMed

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  7. Two states or not two states: Single-molecule folding studies of protein L

    NASA Astrophysics Data System (ADS)

    Aviram, Haim Yuval; Pirchi, Menahem; Barak, Yoav; Riven, Inbal; Haran, Gilad

    2018-03-01

    Experimental tools of increasing sophistication have been employed in recent years to study protein folding and misfolding. Folding is considered a complex process, and one way to address it is by studying small proteins, which seemingly possess a simple energy landscape with essentially only two stable states, either folded or unfolded. The B1-IgG binding domain of protein L (PL) is considered a model two-state folder, based on measurements using a wide range of experimental techniques. We applied single-molecule fluorescence resonance energy transfer (FRET) spectroscopy in conjunction with a hidden Markov model analysis to fully characterize the energy landscape of PL and to extract the kinetic properties of individual molecules of the protein. Surprisingly, our studies revealed the existence of a third state, hidden under the two-state behavior of PL due to its small population, ˜7%. We propose that this minority intermediate involves partial unfolding of the two C-terminal β strands of PL. Our work demonstrates that single-molecule FRET spectroscopy can be a powerful tool for a comprehensive description of the folding dynamics of proteins, capable of detecting and characterizing relatively rare metastable states that are difficult to observe in ensemble studies.
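
    A hedged sketch of the general hidden-Markov-model approach mentioned above (using the hmmlearn package on a simulated FRET-efficiency trace, not the authors' data or code) compares two- and three-state fits by likelihood:

    ```python
    # Hedged sketch: fit hidden Markov models with two and three states to a
    # simulated FRET-efficiency trace and compare likelihoods. This mirrors the
    # general HMM approach, not the authors' actual analysis.
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(1)
    # simulate a noisy trace that visits three FRET levels, one of them rare
    levels = rng.choice([0.2, 0.5, 0.8], size=5000, p=[0.45, 0.07, 0.48])
    trace = (levels + rng.normal(0, 0.05, size=levels.size)).reshape(-1, 1)

    for k in (2, 3):
        model = GaussianHMM(n_components=k, covariance_type="diag", n_iter=200,
                            random_state=0).fit(trace)
        print(f"{k} states: log-likelihood = {model.score(trace):.1f}")
    ```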

  8. Understanding THz spectra of aqueous solutions: glycine in light and heavy water.

    PubMed

    Sun, Jian; Niehues, Gudrun; Forbert, Harald; Decka, Dominique; Schwaab, Gerhard; Marx, Dominik; Havenith, Martina

    2014-04-02

    THz spectroscopy of aqueous solutions has been established as of recently to be a valuable and complementary experimental tool to provide direct insights into the solute-solvent coupling due to hydrogen-bond dynamics involving interfacial water. Despite much experimental progress, understanding THz spectra in terms of molecular motions, akin to mid-infrared spectra, still remains elusive. Here, using the osmoprotectant glycine as a showcase, we demonstrate how this can be achieved by combining THz absorption spectroscopy and ab initio molecular dynamics. The experimental THz spectrum is characterized by broad yet clearly discernible peaks. Based on substantial extensions of available mode-specific decomposition schemes, the experimental spectrum can be reproduced by theory and assigned on an essentially quantitative level. This joint effort reveals an unexpectedly clear picture of the individual contributions of molecular motion to the THz absorption spectrum in terms of distinct modes stemming from intramolecular vibrations, rigid-body-like hindered rotational and translational motion, and specific couplings to interfacial water molecules. The assignment is confirmed by the peak shifts observed in the THz spectrum of deuterated glycine in heavy water, which allow us to separate the distinct modes experimentally.
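
    A standard route from a molecular-dynamics dipole trajectory to a THz/IR line shape is the Fourier transform of the dipole autocorrelation function; the sketch below applies it to a synthetic signal and is only meant to illustrate the technique, not the study's mode-specific decomposition.

    ```python
    # Hedged sketch: absorption line shape from the Fourier transform of the
    # dipole autocorrelation function. The "trajectory" is synthetic (two damped
    # oscillations plus noise), not the simulation data of the study.
    import numpy as np

    dt = 2e-15                                   # time step: 2 fs
    t = np.arange(0, 20e-12, dt)                 # 20 ps trajectory
    rng = np.random.default_rng(2)
    mu = (np.cos(2 * np.pi * 1.5e12 * t) + 0.5 * np.cos(2 * np.pi * 5.0e12 * t)) \
         * np.exp(-t / 5e-12) + 0.05 * rng.normal(size=t.size)

    # dipole autocorrelation and its Fourier transform
    acf = np.correlate(mu, mu, mode="full")[mu.size - 1:]
    spec = np.abs(np.fft.rfft(acf * np.hanning(acf.size)))
    freq_THz = np.fft.rfftfreq(acf.size, d=dt) / 1e12

    peak = freq_THz[np.argmax(spec[1:]) + 1]
    print(f"dominant band near {peak:.2f} THz")
    ```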

  9. More than Solfège and Hand Signs: Philosophy, Tools, and Lesson Planning in the Authentic Kodály Classroom

    ERIC Educational Resources Information Center

    Bowyer, James

    2015-01-01

    Four components of the Kodály concept are delineated here: philosophy, objectives, essential tools, and lesson planning process. After outlining the tenets of the Kodály philosophy and objectives, the article presents the Kodály concept's essential tools, including singing, movable "do" solfège, rhythm syllables, hand signs, singing on…

  10. What's in Your Back Pack? Three Essential Items for Survival in the Tough and Changing World of Campus Construction

    ERIC Educational Resources Information Center

    del Monte, Rick

    2009-01-01

    As students know, the tools in their backpacks can influence success. If they are off to math class, a good calculator is essential. When on their way to English class, a laptop is fundamental. Building facility executives too have tools in their backpacks to assure the successful creation of educational buildings. Only their tools are…

  11. Predicting Essential Genes and Proteins Based on Machine Learning and Network Topological Features: A Comprehensive Review

    PubMed Central

    Zhang, Xue; Acencio, Marcio Luis; Lemke, Ney

    2016-01-01

    Essential proteins/genes are indispensable to the survival or reproduction of an organism, and the deletion of such essential proteins will result in lethality or infertility. The identification of essential genes is very important not only for understanding the minimal requirements for survival of an organism, but also for finding human disease genes and new drug targets. Experimental methods for identifying essential genes are costly, time-consuming, and laborious. With the accumulation of sequenced genomes data and high-throughput experimental data, many computational methods for identifying essential proteins are proposed, which are useful complements to experimental methods. In this review, we show the state-of-the-art methods for identifying essential genes and proteins based on machine learning and network topological features, point out the progress and limitations of current methods, and discuss the challenges and directions for further research. PMID:27014079
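
    A minimal sketch of the generic workflow surveyed here, topological features from a protein-protein interaction network fed to a supervised classifier, is shown below; the random graph and labels are placeholders, so the reported accuracy is meaningless by construction.

    ```python
    # Hedged sketch: derive network topological features from a (stand-in) PPI
    # graph and train a classifier on (placeholder) essentiality labels.
    import networkx as nx
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    G = nx.erdos_renyi_graph(300, 0.03, seed=0)        # stand-in PPI network
    deg = nx.degree_centrality(G)
    btw = nx.betweenness_centrality(G)
    clu = nx.clustering(G)

    X = np.array([[deg[n], btw[n], clu[n]] for n in G.nodes()])
    y = np.random.default_rng(0).integers(0, 2, size=X.shape[0])   # placeholder labels

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```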

  12. Computational Fluid Dynamics (CFD): Future role and requirements as viewed by an applied aerodynamicist. [computer systems design

    NASA Technical Reports Server (NTRS)

    Yoshihara, H.

    1978-01-01

    The problem of designing the wing-fuselage configuration of an advanced transonic commercial airliner and the optimization of a supercruiser fighter are sketched, pointing out the essential fluid mechanical phenomena that play an important role. Such problems suggest that for a numerical method to be useful, it must be able to treat highly three dimensional turbulent separations, flows with jet engine exhausts, and complex vehicle configurations. Weaknesses of the two principal tools of the aerodynamicist, the wind tunnel and the computer, suggest a complementing combined use of these tools, which is illustrated by the case of the transonic wing-fuselage design. The anticipated difficulties in developing an adequate turbulent transport model suggest that such an approach may have to suffice for an extended period. On a longer term, experimentation of turbulent transport in meaningful cases must be intensified to provide a data base for both modeling and theory validation purposes.

  13. SeqCompress: an algorithm for biological sequence compression.

    PubMed

    Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz; Bajwa, Hassan

    2014-10-01

    The growth of Next Generation Sequencing technologies presents significant research challenges, specifically to design bioinformatics tools that handle massive amounts of data efficiently. Biological sequence data storage cost has become a noticeable proportion of the total cost of data generation and analysis. In particular, the increase in DNA sequencing rate is significantly outstripping the rate of increase in disk storage capacity and may exceed available storage. It is essential to develop algorithms that handle large data sets via better memory management. This article presents a DNA sequence compression algorithm, SeqCompress, that copes with the space complexity of biological sequences. The algorithm is based on lossless data compression and uses a statistical model as well as arithmetic coding to compress DNA sequences. The proposed algorithm is compared with recent specialized compression tools for biological sequences. Experimental results show that the proposed algorithm achieves better compression gain than existing algorithms. Copyright © 2014 Elsevier Inc. All rights reserved.
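
    SeqCompress itself combines a statistical model with arithmetic coding; as a far simpler baseline illustration of why DNA is so compressible, the sketch below packs each base of an {A, C, G, T} sequence into 2 bits.

    ```python
    # Much simpler baseline than SeqCompress: pack each DNA base into 2 bits,
    # a 4x reduction over 1-byte-per-base storage for sequences over {A, C, G, T}.
    CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
    BASES = "ACGT"

    def pack(seq: str) -> bytes:
        bits = 0
        for base in seq:
            bits = (bits << 2) | CODE[base]
        n_bytes = (2 * len(seq) + 7) // 8
        return len(seq).to_bytes(4, "big") + bits.to_bytes(n_bytes, "big")

    def unpack(blob: bytes) -> str:
        n = int.from_bytes(blob[:4], "big")
        bits = int.from_bytes(blob[4:], "big")
        return "".join(BASES[(bits >> (2 * (n - 1 - i))) & 0b11] for i in range(n))

    seq = "ACGTACGGTTAACC"
    assert unpack(pack(seq)) == seq
    print(f"{len(seq)} bases stored in {len(pack(seq))} bytes (4-byte length header included)")
    ```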

  14. Implementation of quantum and classical discrete fractional Fourier transforms.

    PubMed

    Weimann, Steffen; Perez-Leija, Armando; Lebugle, Maxime; Keil, Robert; Tichy, Malte; Gräfe, Markus; Heilmann, René; Nolte, Stefan; Moya-Cessa, Hector; Weihs, Gregor; Christodoulides, Demetrios N; Szameit, Alexander

    2016-03-23

    Fourier transforms, integer and fractional, are ubiquitous mathematical tools in basic and applied science. Certainly, since the ordinary Fourier transform is merely a particular case of a continuous set of fractional Fourier domains, every property and application of the ordinary Fourier transform becomes a special case of the fractional Fourier transform. Despite the great practical importance of the discrete Fourier transform, implementation of fractional orders of the corresponding discrete operation has been elusive. Here we report classical and quantum optical realizations of the discrete fractional Fourier transform. In the context of classical optics, we implement discrete fractional Fourier transforms of exemplary wave functions and experimentally demonstrate the shift theorem. Moreover, we apply this approach in the quantum realm to Fourier transform separable and path-entangled biphoton wave functions. The proposed approach is versatile and could find applications in various fields where Fourier transforms are essential tools.
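
    One way to sanity-check the idea of a fractional Fourier operator numerically is to raise the unitary DFT matrix to a fractional power through its eigendecomposition; because the DFT eigenvalues are degenerate, published DFrFT definitions make specific eigenvector choices, so the sketch below is illustrative only and not the construction used in the paper.

    ```python
    # Hedged numerical illustration: a fractional power of the unitary DFT matrix
    # via eigendecomposition. Degenerate eigenvalues mean this construction is not
    # unique and differs from standard DFrFT definitions; it only demonstrates
    # order-1 recovery of the DFT and additivity of fractional orders.
    import numpy as np
    from scipy.linalg import dft

    N = 16
    F = dft(N, scale="sqrtn")                  # unitary DFT matrix

    vals, vecs = np.linalg.eig(F)              # eigenvalues lie in {1, -1, i, -i}
    theta = np.angle(vals)
    vecs_inv = np.linalg.inv(vecs)

    def dfrft_matrix(a):
        # raise each eigenvalue e^{i*theta} to the power a on a fixed branch
        return vecs @ np.diag(np.exp(1j * a * theta)) @ vecs_inv

    x = np.random.default_rng(3).normal(size=N)
    print(np.allclose(dfrft_matrix(1.0) @ x, F @ x))                        # order 1 = DFT
    print(np.allclose(dfrft_matrix(0.3) @ (dfrft_matrix(0.7) @ x), F @ x))  # additivity
    ```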

  15. Implementation of quantum and classical discrete fractional Fourier transforms

    PubMed Central

    Weimann, Steffen; Perez-Leija, Armando; Lebugle, Maxime; Keil, Robert; Tichy, Malte; Gräfe, Markus; Heilmann, René; Nolte, Stefan; Moya-Cessa, Hector; Weihs, Gregor; Christodoulides, Demetrios N.; Szameit, Alexander

    2016-01-01

    Fourier transforms, integer and fractional, are ubiquitous mathematical tools in basic and applied science. Certainly, since the ordinary Fourier transform is merely a particular case of a continuous set of fractional Fourier domains, every property and application of the ordinary Fourier transform becomes a special case of the fractional Fourier transform. Despite the great practical importance of the discrete Fourier transform, implementation of fractional orders of the corresponding discrete operation has been elusive. Here we report classical and quantum optical realizations of the discrete fractional Fourier transform. In the context of classical optics, we implement discrete fractional Fourier transforms of exemplary wave functions and experimentally demonstrate the shift theorem. Moreover, we apply this approach in the quantum realm to Fourier transform separable and path-entangled biphoton wave functions. The proposed approach is versatile and could find applications in various fields where Fourier transforms are essential tools. PMID:27006089

  16. SigWin-detector: a Grid-enabled workflow for discovering enriched windows of genomic features related to DNA sequences.

    PubMed

    Inda, Márcia A; van Batenburg, Marinus F; Roos, Marco; Belloum, Adam S Z; Vasunin, Dmitry; Wibisono, Adianto; van Kampen, Antoine H C; Breit, Timo M

    2008-08-08

    Chromosome location is often used as a scaffold to organize genomic information in both the living cell and molecular biological research. Thus, ever-increasing amounts of data about genomic features are stored in public databases and can be readily visualized by genome browsers. To perform in silico experimentation conveniently with this genomics data, biologists need tools to process and compare datasets routinely and explore the obtained results interactively. The complexity of such experimentation requires these tools to be based on an e-Science approach, hence generic, modular, and reusable. A virtual laboratory environment with workflows, workflow management systems, and Grid computation are therefore essential. Here we apply an e-Science approach to develop SigWin-detector, a workflow-based tool that can detect significantly enriched windows of (genomic) features in a (DNA) sequence in a fast and reproducible way. For proof-of-principle, we utilize a biological use case to detect regions of increased and decreased gene expression (RIDGEs and anti-RIDGEs) in human transcriptome maps. We improved the original method for RIDGE detection by replacing the costly step of estimation by random sampling with a faster analytical formula for computing the distribution of the null hypothesis being tested and by developing a new algorithm for computing moving medians. SigWin-detector was developed using the WS-VLAM workflow management system and consists of several reusable modules that are linked together in a basic workflow. The configuration of this basic workflow can be adapted to satisfy the requirements of the specific in silico experiment. As we show with the results from analyses in the biological use case on RIDGEs, SigWin-detector is an efficient and reusable Grid-based tool for discovering windows enriched for features of a particular type in any sequence of values. Thus, SigWin-detector provides the proof-of-principle for the modular e-Science based concept of integrative bioinformatics experimentation.
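
    The core windowing operation described above can be sketched with a moving median and a shuffle-based null threshold; the data, window size and significance cutoff are placeholders, not the RIDGE analysis itself.

    ```python
    # Hedged sketch: slide a window over per-gene values, compute the moving
    # median, and flag windows whose median exceeds a threshold derived from a
    # shuffled null distribution. Data and cutoffs are placeholders.
    import numpy as np
    from scipy.ndimage import median_filter

    rng = np.random.default_rng(4)
    expr = rng.lognormal(mean=0.0, sigma=1.0, size=5000)
    expr[1200:1300] *= 4.0                      # implant an "enriched" region

    window = 99
    moving_median = median_filter(expr, size=window, mode="nearest")

    # null distribution: moving medians of shuffled data
    null = median_filter(rng.permutation(expr), size=window, mode="nearest")
    threshold = np.quantile(null, 0.999)

    enriched = np.where(moving_median > threshold)[0]
    if enriched.size:
        print("enriched window spans positions", enriched.min(), "to", enriched.max())
    ```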

  17. Metazen – metadata capture for metagenomes

    PubMed Central

    2014-01-01

    Background As the impact and prevalence of large-scale metagenomic surveys grow, so does the acute need for more complete and standards compliant metadata. Metadata (data describing data) provides an essential complement to experimental data, helping to answer questions about its source, mode of collection, and reliability. Metadata collection and interpretation have become vital to the genomics and metagenomics communities, but considerable challenges remain, including exchange, curation, and distribution. Currently, tools are available for capturing basic field metadata during sampling, and for storing, updating and viewing it. Unfortunately, these tools are not specifically designed for metagenomic surveys; in particular, they lack the appropriate metadata collection templates, a centralized storage repository, and a unique ID linking system that can be used to easily port complete and compatible metagenomic metadata into widely used assembly and sequence analysis tools. Results Metazen was developed as a comprehensive framework designed to enable metadata capture for metagenomic sequencing projects. Specifically, Metazen provides a rapid, easy-to-use portal to encourage early deposition of project and sample metadata. Conclusions Metazen is an interactive tool that aids users in recording their metadata in a complete and valid format. A defined set of mandatory fields captures vital information, while the option to add fields provides flexibility. PMID:25780508

  18. Metazen - metadata capture for metagenomes.

    PubMed

    Bischof, Jared; Harrison, Travis; Paczian, Tobias; Glass, Elizabeth; Wilke, Andreas; Meyer, Folker

    2014-01-01

    As the impact and prevalence of large-scale metagenomic surveys grow, so does the acute need for more complete and standards compliant metadata. Metadata (data describing data) provides an essential complement to experimental data, helping to answer questions about its source, mode of collection, and reliability. Metadata collection and interpretation have become vital to the genomics and metagenomics communities, but considerable challenges remain, including exchange, curation, and distribution. Currently, tools are available for capturing basic field metadata during sampling, and for storing, updating and viewing it. Unfortunately, these tools are not specifically designed for metagenomic surveys; in particular, they lack the appropriate metadata collection templates, a centralized storage repository, and a unique ID linking system that can be used to easily port complete and compatible metagenomic metadata into widely used assembly and sequence analysis tools. Metazen was developed as a comprehensive framework designed to enable metadata capture for metagenomic sequencing projects. Specifically, Metazen provides a rapid, easy-to-use portal to encourage early deposition of project and sample metadata. Metazen is an interactive tool that aids users in recording their metadata in a complete and valid format. A defined set of mandatory fields captures vital information, while the option to add fields provides flexibility.

  19. How to Train a Cell–Cutting-Edge Molecular Tools

    PubMed Central

    Czapiński, Jakub; Kiełbus, Michał; Kałafut, Joanna; Kos, Michał; Stepulak, Andrzej; Rivero-Müller, Adolfo

    2017-01-01

    In biological systems, the formation of molecular complexes is the currency for all cellular processes. Traditionally, functional experimentation was targeted at single molecular players in order to understand their effects on a cell or animal phenotype. In the last few years, we have been experiencing rapid progress in the development of ground-breaking molecular biology tools that affect the metabolic, structural, morphological, and (epi)genetic instructions of cells by chemical, optical (optogenetic) and mechanical inputs. Such precise dissection of cellular processes is not only essential for a better understanding of biological systems, but will also allow us to better diagnose and fix common dysfunctions. Here, we present several of these emerging and innovative techniques by providing the reader with elegant examples of how these tools have been implemented in cells, and, in some cases, organisms, to unravel molecular processes in minute detail. We also discuss their advantages and disadvantages with particular focus on their translation to multicellular organisms for in vivo spatiotemporal regulation. We envision that further developments of these tools will not only help solve the processes of life, but will give rise to novel clinical and industrial applications. PMID:28344971

  20. Development and Usefulness of a District Health Systems Tool for Performance Improvement in Essential Public Health Functions in Botswana and Mozambique.

    PubMed

    Bishai, David; Sherry, Melissa; Pereira, Claudia C; Chicumbe, Sergio; Mbofana, Francisco; Boore, Amy; Smith, Monica; Nhambi, Leonel; Borse, Nagesh N

    2016-01-01

    This study describes the development of a self-audit tool for public health and the associated methodology for implementing a district health system self-audit tool that can provide quantitative data on how district governments perceive their performance of the essential public health functions. Development began with a consensus-building process to engage Ministry of Health and provincial health officers in Mozambique and Botswana. We then worked with lists of relevant public health functions as determined by these stakeholders to adapt a self-audit tool describing essential public health functions to each country's health system. We then piloted the tool across districts in both countries and conducted interviews with district health personnel to determine health workers' perception of the usefulness of the approach. Country stakeholders were able to develop consensus around 11 essential public health functions that were relevant in each country. Pilots of the self-audit tool enabled the tool to be effectively shortened. Pilots also disclosed a tendency to upcode during self-audits that was checked by group deliberation. Convening sessions at the district enabled better attendance and representative deliberation. Instant feedback from the audit was a feature that 100% of pilot respondents found most useful. The development of metrics that provide feedback on public health performance can be used as an aid in the self-assessment of health system performance at the district level. Measurements of practice can open the door to future applications for practice improvement and research into the determinants and consequences of better public health practice. The current tool can be assessed for its usefulness to district health managers in improving their public health practice. The tool can also be used by the Ministry of Health or external donors in the African region for monitoring the district-level performance of the essential public health functions.

  1. Development and Usefulness of a District Health Systems Tool for Performance Improvement in Essential Public Health Functions in Botswana and Mozambique

    PubMed Central

    Bishai, David; Sherry, Melissa; Pereira, Claudia C.; Chicumbe, Sergio; Mbofana, Francisco; Boore, Amy; Smith, Monica; Nhambi, Leonel; Borse, Nagesh N.

    2018-01-01

    Introduction This study describes the development of a self-audit tool for public health and the associated methodology for implementing a district health system self-audit tool that can provide quantitative data on how district governments perceive their own performance of the essential public health functions. Methods Development began with a consensus-building process to engage Ministry of Health and provincial health officers in Mozambique and Botswana. We then worked with lists of relevant public health functions as determined by these stakeholders to adapt a self-audit tool describing essential public health functions to each country’s health system. We then piloted the tool across districts in both countries and conducted interviews with district health personnel to determine health workers’ perception of the usefulness of the approach. Results Country stakeholders were able to develop consensus around eleven essential public health functions that were relevant in each country. Pilots of the self-audit tool enabled the tool to be effectively shortened. Pilots also disclosed a tendency to upcode during self-audits that was checked by group deliberation. Convening sessions at the district enabled better attendance and representative deliberation. Instant feedback from the audit was a feature that 100% of pilot respondents found most useful. Conclusions The development of metrics that provide feedback on public health performance can be used as an aid in the self-assessment of health system performance at the district level. Measurements of practice can open the door to future applications for practice improvement and research into the determinants and consequences of better public health practice. The current tool can be assessed for its usefulness to district health managers in improving their public health practice. The tool can also be used by the Ministry of Health or external donors in the African region for monitoring the district-level performance of the essential public health functions. PMID:27682727

  2. Designing and Evaluating an Interactive Multimedia Web-Based Simulation for Developing Nurses’ Competencies in Acute Nursing Care: Randomized Controlled Trial

    PubMed Central

    Wong, Lai Fun; Chan, Sally Wai-Chi; Ho, Jasmine Tze Yin; Mordiffi, Siti Zubaidah; Ang, Sophia Bee Leng; Goh, Poh Sun; Ang, Emily Neo Kim

    2015-01-01

    Background Web-based learning is becoming an increasingly important instructional tool in nursing education. Multimedia advancements offer the potential for creating authentic nursing activities for developing nursing competency in clinical practice. Objective This study aims to describe the design, development, and evaluation of an interactive multimedia Web-based simulation for developing nurses’ competencies in acute nursing care. Methods Authentic nursing activities were developed in a Web-based simulation using a variety of instructional strategies including animation video, multimedia instructional material, virtual patients, and online quizzes. A randomized controlled study was conducted on 67 registered nurses who were recruited from the general ward units of an acute care tertiary hospital. Following a baseline evaluation of all participants’ clinical performance in a simulated clinical setting, the experimental group received 3 hours of Web-based simulation and completed a survey to evaluate their perceptions of the program. All participants were re-tested for their clinical performances using a validated tool. Results The clinical performance posttest scores of the experimental group improved significantly (P<.001) from the pretest scores after the Web-based simulation. In addition, compared to the control group, the experimental group had significantly higher clinical performance posttest scores (P<.001) after controlling the pretest scores. The participants from the experimental group were satisfied with their learning experience and gave positive ratings for the quality of the Web-based simulation. Themes emerging from the comments about the most valuable aspects of the Web-based simulation include relevance to practice, instructional strategies, and fostering problem solving. Conclusions Engaging in authentic nursing activities using interactive multimedia Web-based simulation can enhance nurses’ competencies in acute care. Web-based simulations provide a promising educational tool in institutions where large groups of nurses need to be trained in acute nursing care and accessibility to repetitive training is essential for achieving long-term retention of clinical competency. PMID:25583029

  3. Designing and evaluating an interactive multimedia Web-based simulation for developing nurses' competencies in acute nursing care: randomized controlled trial.

    PubMed

    Liaw, Sok Ying; Wong, Lai Fun; Chan, Sally Wai-Chi; Ho, Jasmine Tze Yin; Mordiffi, Siti Zubaidah; Ang, Sophia Bee Leng; Goh, Poh Sun; Ang, Emily Neo Kim

    2015-01-12

    Web-based learning is becoming an increasingly important instructional tool in nursing education. Multimedia advancements offer the potential for creating authentic nursing activities for developing nursing competency in clinical practice. This study aims to describe the design, development, and evaluation of an interactive multimedia Web-based simulation for developing nurses' competencies in acute nursing care. Authentic nursing activities were developed in a Web-based simulation using a variety of instructional strategies including animation video, multimedia instructional material, virtual patients, and online quizzes. A randomized controlled study was conducted on 67 registered nurses who were recruited from the general ward units of an acute care tertiary hospital. Following a baseline evaluation of all participants' clinical performance in a simulated clinical setting, the experimental group received 3 hours of Web-based simulation and completed a survey to evaluate their perceptions of the program. All participants were re-tested for their clinical performances using a validated tool. The clinical performance posttest scores of the experimental group improved significantly (P<.001) from the pretest scores after the Web-based simulation. In addition, compared to the control group, the experimental group had significantly higher clinical performance posttest scores (P<.001) after controlling for the pretest scores. The participants from the experimental group were satisfied with their learning experience and gave positive ratings for the quality of the Web-based simulation. Themes emerging from the comments about the most valuable aspects of the Web-based simulation include relevance to practice, instructional strategies, and fostering problem solving. Engaging in authentic nursing activities using interactive multimedia Web-based simulation can enhance nurses' competencies in acute care. Web-based simulations provide a promising educational tool in institutions where large groups of nurses need to be trained in acute nursing care and accessibility to repetitive training is essential for achieving long-term retention of clinical competency.
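
    Comparing posttest scores between groups while controlling for pretest scores corresponds to an analysis of covariance. The sketch below is only a hypothetical illustration of that type of analysis with invented column names and data; it is not the authors' actual statistical workflow.

```python
# Hypothetical ANCOVA sketch: posttest performance compared between groups
# while controlling for the pretest score (all column names and values invented).
import pandas as pd
import statsmodels.formula.api as smf

scores = pd.DataFrame({
    "group":    ["web_sim"] * 4 + ["control"] * 4,
    "pretest":  [55, 60, 58, 62, 57, 61, 59, 63],
    "posttest": [78, 82, 80, 85, 64, 66, 65, 70],
})

# posttest ~ pretest + group: the group coefficient estimates the adjusted
# difference between the experimental and control groups.
model = smf.ols("posttest ~ pretest + C(group)", data=scores).fit()
print(model.summary())
```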

  4. Experimental validation of ultrasonic NDE simulation software

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Larche, Michael; Diaz, Aaron A.; Crawford, Susan L.; Prowant, Matthew S.; Anderson, Michael T.

    2016-02-01

    Computer modeling and simulation is becoming an essential tool for transducer design and insight into ultrasonic nondestructive evaluation (UT-NDE). As the popularity of simulation tools for UT-NDE increases, it becomes important to assess their reliability to model acoustic responses from defects in operating components and provide information that is consistent with in-field inspection data. This includes information about the detectability of different defect types for a given UT probe. Recently, a cooperative program between the Electrical Power Research Institute and the U.S. Nuclear Regulatory Commission was established to validate numerical modeling software commonly used for simulating UT-NDE of nuclear power plant components. In the first phase of this cooperative, extensive experimental UT measurements were conducted on machined notches with varying depth, length, and orientation in stainless steel plates. Then, the notches were modeled in CIVA, a semi-analytical NDE simulation platform developed by the French Commissariat a l'Energie Atomique, and their responses compared with the experimental measurements. Discrepancies between experimental and simulation results are due to either improper inputs to the simulation model, or to incorrect approximations and assumptions in the numerical models. To address the former, a variation study was conducted on the different parameters that are required as inputs for the model, specifically the specimen and transducer properties. Then, the ability of simulations to give accurate predictions regarding the detectability of the different defects was demonstrated. This includes the results in terms of the variations in defect amplitude indications, and the ratios between tip diffracted and specular signal amplitudes.
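
    Since the comparison above hinges on amplitude indications and on the ratio between tip-diffracted and specular signals, one simple way to quantify model-to-experiment agreement is to express both as decibel differences. The sketch below only illustrates that bookkeeping with invented amplitude values; it is not part of the CIVA workflow described in the record.

```python
# Illustrative comparison of simulated vs. measured UT amplitude ratios in dB.
# All amplitude values are invented; only the dB arithmetic is the point.
import math

def to_db(amplitude, reference):
    """Express an amplitude relative to a reference signal in decibels."""
    return 20.0 * math.log10(amplitude / reference)

# Hypothetical specular and tip-diffracted amplitudes (arbitrary units).
exp_specular, exp_tip = 1.00, 0.18
sim_specular, sim_tip = 1.00, 0.22

exp_ratio_db = to_db(exp_tip, exp_specular)   # tip/specular ratio, experiment
sim_ratio_db = to_db(sim_tip, sim_specular)   # tip/specular ratio, simulation

print(f"experiment: {exp_ratio_db:.1f} dB, simulation: {sim_ratio_db:.1f} dB, "
      f"discrepancy: {sim_ratio_db - exp_ratio_db:.1f} dB")
```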

  5. Learning of embodied interaction dynamics with recurrent neural networks: some exploratory experiments.

    PubMed

    Oubbati, Mohamed; Kord, Bahram; Koprinkova-Hristova, Petia; Palm, Günther

    2014-04-01

    A recent tendency in artificial intelligence suggests that intelligence must be seen as a result of the interaction between brains, bodies and environments. This view implies that designing sophisticated behaviour requires a primary focus on how agents are functionally coupled to their environments. From this perspective, we present early results on the application of reservoir computing as an efficient tool to understand how behaviour emerges from interaction. Specifically, we present reservoir computing models, inspired by imitation learning designs, that extract the essential components of behaviour resulting from agent-environment interaction dynamics. Experimental results using a mobile robot are reported to validate the learning architectures.

  6. Learning of embodied interaction dynamics with recurrent neural networks: some exploratory experiments

    NASA Astrophysics Data System (ADS)

    Oubbati, Mohamed; Kord, Bahram; Koprinkova-Hristova, Petia; Palm, Günther

    2014-04-01

    A recent tendency in artificial intelligence suggests that intelligence must be seen as a result of the interaction between brains, bodies and environments. This view implies that designing sophisticated behaviour requires a primary focus on how agents are functionally coupled to their environments. From this perspective, we present early results on the application of reservoir computing as an efficient tool to understand how behaviour emerges from interaction. Specifically, we present reservoir computing models, inspired by imitation learning designs, that extract the essential components of behaviour resulting from agent-environment interaction dynamics. Experimental results using a mobile robot are reported to validate the learning architectures.
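
    For readers unfamiliar with reservoir computing, the sketch below shows the generic echo-state-network recipe (a fixed random recurrent reservoir with a trained linear readout) on a toy prediction task. It is an illustration of the general technique only and makes no claim about the specific architectures or robot data used by the authors.

```python
# Minimal echo state network: only the linear readout is trained.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))       # fixed input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))          # scale spectral radius below 1

def run_reservoir(u_seq):
    """Drive the reservoir with an input sequence and return the state history."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict sin(t + dt) from sin(t).
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)
W_out, *_ = np.linalg.lstsq(X, y, rcond=None)      # least-squares readout
print("training MSE:", float(np.mean((X @ W_out - y) ** 2)))
```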

  7. The WEIZMASS spectral library for high-confidence metabolite identification

    NASA Astrophysics Data System (ADS)

    Shahaf, Nir; Rogachev, Ilana; Heinig, Uwe; Meir, Sagit; Malitsky, Sergey; Battat, Maor; Wyner, Hilary; Zheng, Shuning; Wehrens, Ron; Aharoni, Asaph

    2016-08-01

    Annotation of metabolites is an essential, yet problematic, aspect of mass spectrometry (MS)-based metabolomics assays. The current repertoire of definitive annotations of metabolite spectra in public MS databases is limited and suffers from lack of chemical and taxonomic diversity. Furthermore, the heterogeneity of the data prevents the development of universally applicable metabolite annotation tools. Here we present a combined experimental and computational platform to advance this key issue in metabolomics. WEIZMASS is a unique reference metabolite spectral library developed from high-resolution MS data acquired from a structurally diverse set of 3,540 plant metabolites. We also present MatchWeiz, a multi-module strategy using a probabilistic approach to match library and experimental data. This strategy allows efficient and high-confidence identification of dozens of metabolites in model and exotic plants, including metabolites not previously reported in plants or found in few plant species to date.
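
    MatchWeiz itself is described as a multi-module probabilistic matching strategy, the details of which are in the paper. Purely as an illustration of the underlying idea of scoring an experimental spectrum against library entries, the sketch below uses a simple cosine similarity over binned m/z peaks; this baseline is an assumption for illustration, not the published algorithm.

```python
# Toy spectral matching: bin peaks on an m/z grid and score by cosine similarity.
# This is NOT the MatchWeiz algorithm, just a common baseline for illustration.
import numpy as np

def binned_vector(peaks, mz_min=50.0, mz_max=1000.0, bin_width=0.01):
    """peaks: list of (mz, intensity) pairs. Returns a fixed-length intensity vector."""
    n_bins = int((mz_max - mz_min) / bin_width)
    vec = np.zeros(n_bins)
    for mz, intensity in peaks:
        idx = int((mz - mz_min) / bin_width)
        if 0 <= idx < n_bins:
            vec[idx] += intensity
    return vec

def cosine_score(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Invented library entries and query spectrum.
library = {
    "metabolite_A": [(163.04, 100.0), (145.03, 40.0)],
    "metabolite_B": [(181.05, 100.0), (163.04, 20.0)],
}
query = [(163.04, 90.0), (145.03, 35.0)]

q = binned_vector(query)
scores = {name: cosine_score(q, binned_vector(peaks)) for name, peaks in library.items()}
print(max(scores, key=scores.get), scores)
```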

  8. Quantitative Subsurface Atomic Structure Fingerprint for 2D Materials and Heterostructures by First-Principles-Calibrated Contact-Resonance Atomic Force Microscopy.

    PubMed

    Tu, Qing; Lange, Björn; Parlak, Zehra; Lopes, Joao Marcelo J; Blum, Volker; Zauscher, Stefan

    2016-07-26

    Interfaces and subsurface layers are critical for the performance of devices made of 2D materials and heterostructures. Facile, nondestructive, and quantitative ways to characterize the structure of atomically thin, layered materials are thus essential to ensure control of the resultant properties. Here, we show that contact-resonance atomic force microscopy-which is exquisitely sensitive to stiffness changes that arise from even a single atomic layer of a van der Waals-adhered material-is a powerful experimental tool to address this challenge. A combined density functional theory and continuum modeling approach is introduced that yields sub-surface-sensitive, nanomechanical fingerprints associated with specific, well-defined structure models of individual surface domains. Where such models are known, this information can be correlated with experimentally obtained contact-resonance frequency maps to reveal the (sub)surface structure of different domains on the sample.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romanov, A.; Edstrom, D.; Emanov, F. A.

    Precise beam based measurement and correction of magnetic optics is essential for the successful operation of accelerators. The LOCO algorithm is a proven and reliable tool, which in some situations can be improved by using a broader class of experimental data. The standard data sets for LOCO include the closed orbit responses to dipole corrector variation, dispersion, and betatron tunes. This paper discusses the benefits from augmenting the data with four additional classes of experimental data: the beam shape measured with beam profile monitors; responses of closed orbit bumps to focusing field variations; betatron tune responses to focusing field variations; BPM-to-BPM betatron phase advances and beta functions in BPMs from turn-by-turn coordinates of kicked beam. All of the described features were implemented in the Sixdsimulation software that was used to correct the optics of the VEPP-2000 collider, the VEPP-5 injector booster ring, and the FAST linac.

  10. DOT2: Macromolecular Docking With Improved Biophysical Models

    PubMed Central

    Roberts, Victoria A.; Thompson, Elaine E.; Pique, Michael E.; Perez, Martin S.; Eyck, Lynn Ten

    2015-01-01

    Computational docking is a useful tool for predicting macromolecular complexes, which are often difficult to determine experimentally. Here we present the DOT2 software suite, an updated version of the DOT intermolecular docking program. DOT2 provides straightforward, automated construction of improved biophysical models based on molecular coordinates, offering checkpoints that guide the user to include critical features. DOT has been updated to run more quickly, allow flexibility in grid size and spacing, and generate a complete list of favorable candidate configurations. Output can be filtered by experimental data and rescored by the sum of electrostatic and atomic desolvation energies. We show that this rescoring method improves the ranking of correct complexes for a wide range of macromolecular interactions, and demonstrate that biologically relevant models are essential for biologically relevant results. The flexibility and versatility of DOT2 accommodate realistic models of complex biological systems, improving the likelihood of a successful docking outcome. PMID:23695987
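
    The rescoring step described above sums an electrostatic term and an atomic desolvation term for each candidate configuration and re-ranks the list. The sketch below only schematizes that bookkeeping; the energy values and field names are invented, and the actual DOT2 energy calculation is far more involved.

```python
# Schematic re-ranking of docking candidates by electrostatic + desolvation energy.
# Energies are invented placeholders (kcal/mol); lower (more negative) ranks better.
candidates = [
    {"pose": 1, "electrostatic": -12.4, "desolvation": 3.1},
    {"pose": 2, "electrostatic": -8.9,  "desolvation": -1.2},
    {"pose": 3, "electrostatic": -15.0, "desolvation": 9.8},
]

for c in candidates:
    c["score"] = c["electrostatic"] + c["desolvation"]   # combined rescoring term

ranked = sorted(candidates, key=lambda c: c["score"])
for c in ranked:
    print(f"pose {c['pose']}: total score {c['score']:+.1f} kcal/mol")
```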

  11. Experimental statistical signature of many-body quantum interference

    NASA Astrophysics Data System (ADS)

    Giordani, Taira; Flamini, Fulvio; Pompili, Matteo; Viggianiello, Niko; Spagnolo, Nicolò; Crespi, Andrea; Osellame, Roberto; Wiebe, Nathan; Walschaers, Mattia; Buchleitner, Andreas; Sciarrino, Fabio

    2018-03-01

    Multi-particle interference is an essential ingredient for fundamental quantum mechanics phenomena and for quantum information processing to provide a computational advantage, as recently emphasized by boson sampling experiments. Hence, developing a reliable and efficient technique to witness its presence is pivotal in achieving the practical implementation of quantum technologies. Here, we experimentally identify genuine many-body quantum interference via a recent efficient protocol, which exploits statistical signatures at the output of a multimode quantum device. We successfully apply the test to validate three-photon experiments in an integrated photonic circuit, providing an extensive analysis on the resources required to perform it. Moreover, drawing upon established techniques of machine learning, we show how such tools help to identify the—a priori unknown—optimal features to witness these signatures. Our results provide evidence on the efficacy and feasibility of the method, paving the way for its adoption in large-scale implementations.

  12. X-ray crystallography, an essential tool for the determination of thermodynamic relationships between crystalline polymorphs.

    PubMed

    Céolin, R; Rietveld, I-B

    2016-01-01

    After a short review of the controversies surrounding the discovery of crystalline polymorphism in relation to our present day understanding, the methods of how to solve the stability hierarchy of different polymorphs will be briefly discussed. They involve either theoretical calculations, or, more commonly, experimental methods based on classical thermodynamics. The experimental approach is mainly carried out using heat-exchange data associated with the transition of one form into another. It will be demonstrated that work-related data associated with the phase transition should be taken into account and the role of X-ray crystallography therein will be discussed. X-ray crystallography has become increasingly precise and can nowadays provide specific volumes and their differences as a function of temperature, and also as a function of pressure, humidity, and time. Copyright © 2015 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.

  13. Experimental realization of a subwavelength optical potential based on atomic dark state

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Subhankar, Sarthak; Rolston, Steven; Porto, James

    2017-04-01

    As a well-established tool, the optical lattice (OL) provides a unique opportunity to exploit rich many-body physics. However, a "traditional" OL, created either via laser beam interference or by direct projection with a spatial light modulator, has a length scale around the wavelength (0.1–10 λ) that is set by diffraction, a fundamental limit arising from the wave nature of light. Recent theoretical proposals suggest an alternative route in which the geometric potential stemming from the light-atom interaction can be engineered to generate a much finer potential landscape, limited essentially by the wave nature of the slow-moving cold atoms. We report on progress towards an experimental realization of these ideas using degenerate fermionic ytterbium atoms. Such a subwavelength optical potential could open the way to studying physics beyond currently available parameter regimes, such as enhanced super-exchange coupling, magnetic dipolar coupling, and tunnel junctions in atomtronics.

  14. Mycobacterium tuberculosis Metabolism

    PubMed Central

    Warner, Digby F.

    2015-01-01

    Metabolism underpins the physiology and pathogenesis of Mycobacterium tuberculosis. However, although experimental mycobacteriology has provided key insights into the metabolic pathways that are essential for survival and pathogenesis, determining the metabolic status of bacilli during different stages of infection and in different cellular compartments remains challenging. Recent advances—in particular, the development of systems biology tools such as metabolomics—have enabled key insights into the biochemical state of M. tuberculosis in experimental models of infection. In addition, their use to elucidate mechanisms of action of new and existing antituberculosis drugs is critical for the development of improved interventions to counter tuberculosis. This review provides a broad summary of mycobacterial metabolism, highlighting the adaptation of M. tuberculosis as specialist human pathogen, and discusses recent insights into the strategies used by the host and infecting bacillus to influence the outcomes of the host–pathogen interaction through modulation of metabolic functions. PMID:25502746

  15. Investigating Mechanisms of Chronic Kidney Disease in Mouse Models

    PubMed Central

    Eddy, Allison A.; Okamura, Daryl M.; Yamaguchi, Ikuyo; López-Guisa, Jesús M.

    2011-01-01

    Animal models of chronic kidney disease (CKD) are important experimental tools that are used to investigate novel mechanistic pathways and to validate potential new therapeutic interventions prior to pre-clinical testing in humans. Over the past several years, mouse CKD models have been extensively used for these purposes. Despite significant limitations, the model of unilateral ureteral obstruction (UUO) has essentially become the high throughput in vivo model, as it recapitulates the fundamental pathogenetic mechanisms that typify all forms of CKD in a relatively short time span. In addition, several alternative mouse models are available that can be used to validate new mechanistic paradigms and/or novel therapies. Several models are reviewed – both genetic and experimentally induced – that provide investigators with an opportunity to include renal functional study end-points together with quantitative measures of fibrosis severity, something that is not possible with the UUO model. PMID:21695449

  16. Implementing Liberia's poverty reduction strategy: An assessment of emergency and essential surgical care.

    PubMed

    Sherman, Lawrence; Clement, Peter T; Cherian, Meena N; Ndayimirije, Nestor; Noel, Luc; Dahn, Bernice; Gwenigale, Walter T; Kushner, Adam L

    2011-01-01

    To document infrastructure, personnel, procedures performed, and supplies and equipment available at all county hospitals in Liberia using the World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care. Survey of county hospitals using the World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care. Sixteen county hospitals in Liberia. Infrastructure, personnel, procedures performed, and supplies and equipment available. Uniformly, gross deficiencies in infrastructure, personnel, and supplies and equipment were identified. The World Health Organization Tool for Situational Analysis of Emergency and Essential Surgical Care was useful in identifying baseline emergency and surgical conditions for evidence-based planning. To achieve the Poverty Reduction Strategy and delivery of the Basic Package of Health and Social Welfare Services, additional resources and manpower are needed to improve surgical and anesthetic care.

  17. Design and development of solid carbide step drill K34 for machining of CFRP and GFRP composite laminates

    NASA Astrophysics Data System (ADS)

    Rangaswamy, T.; Nagaraja, R.

    2018-04-01

    The study focused on the design and development of a solid carbide step drill K34 to drill holes in composite materials such as Carbon Fiber Reinforced Plastic (CFRP) and Glass Fiber Reinforced Plastic (GFRP). The step drill K34 replaces stepwise drilling of 6.5 mm and 9 mm diameter holes, which reduces setup time, cutting and feed-rate costs, and delamination, and increases the production rate. Several researchers have analyzed the effect of the drilling process on various fiber reinforced plastic composites using conventional tools and machinery. However, this operation can lead to different kinds of damage such as delamination, fiber pull-out, and local cracks. To avoid the problems encountered during drilling, suitable tool material and geometry are essential. This paper deals with the design and development of the K34 carbide step drill used to drill holes in CFRP and GFRP laminates. An experimental study was carried out to investigate the tool geometry, feed rate and cutting speed that avoid delamination and fiber breakage.

  18. Effectiveness of Ginger Essential Oil on Postoperative Nausea and Vomiting in Abdominal Surgery Patients.

    PubMed

    Lee, Yu Ri; Shin, Hye Sook

    2017-03-01

    The purpose of this study was to examine the effectiveness of aromatherapy with ginger essential oil on nausea and vomiting in abdominal surgery patients. This was a quasi-experimental study with a nonequivalent control group and repeated measures. The experimental group (n = 30) received ginger essential oil inhalation. The placebo control group (n = 30) received normal saline inhalation. The level of postoperative nausea and vomiting was measured using a Korean version of the Index of Nausea, Vomiting, and Retching (INVR) at baseline and at 6, 12, and 24 h after aromatherapy administration. The data were collected from July 23 to August 22, 2012. Nausea and vomiting scores were significantly lower in the experimental group with ginger essential oil inhalation than those in the placebo control group with normal saline. In the experimental group, the nausea and vomiting scores decreased considerably in the first 6 h after inhaled aromatherapy with ginger essential oil. Findings indicate that ginger essential oil inhalation has implications for alleviating postoperative nausea and vomiting in abdominal surgery patients.

  19. Use of ePortfolio Presentations in a Baccalaureate Nursing Program

    ERIC Educational Resources Information Center

    Feather, Rebecca; Ricci, Margaret

    2014-01-01

    Portfolios are an essential tool for demonstrating professional accomplishments and documenting professional growth in a variety of professions. Because of the competitive job market for new graduate nurses in health care, the development and use of an ePortfolio can be an essential tool for the application and interview process. The purpose of…

  20. [Current macro-diagnostic trends of forensic medicine in the Czech Republic].

    PubMed

    Frišhons, Jan; Kučerová, Štěpánka; Jurda, Mikoláš; Sokol, Miloš; Vojtíšek, Tomáš; Hejna, Petr

    2017-01-01

    Over the last few years, advanced diagnostic methods have penetrated the realm of forensic medicine, complementing standard autopsy techniques supported by traditional X-ray examination and macro-diagnostic laboratory tests. Despite the progress of imaging methods, the conventional autopsy has remained the basic and essential diagnostic tool in forensic medicine. Postmortem computed tomography and magnetic resonance imaging are by far the most progressive modern radiodiagnostic methods, setting the current trend of virtual autopsies all over the world. Up to now, only two institutes of forensic medicine in the Czech Republic have postmortem computed tomography available for routine diagnostic purposes. Postmortem magnetic resonance is currently unavailable for routine diagnostic use and has been employed only for experimental purposes. Photogrammetry is a digital method focused primarily on body surface imaging. Recently, the most fruitful results have come from interdisciplinary cooperation between forensic medicine and forensic anthropology, with the implementation of body scanning techniques and 3D printing. Non-invasive and minimally invasive investigative methods, such as postmortem sonography and postmortem endoscopy, have been tested, albeit unsystematically, for diagnostic performance, with good outcomes despite the limitations of these methods in postmortem application. Other futuristic methods, such as the use of a drone to inspect the crime scene, are still experimental tools. The authors of the article present a basic overview of both routinely and experimentally used investigative methods and current macro-diagnostic trends in forensic medicine in the Czech Republic.

  1. Models of protein–ligand crystal structures: trust, but verify

    PubMed Central

    Deller, Marc C.

    2015-01-01

    X-ray crystallography provides the most accurate models of protein–ligand structures. These models serve as the foundation of many computational methods including structure prediction, molecular modelling, and structure-based drug design. The success of these computational methods ultimately depends on the quality of the underlying protein–ligand models. X-ray crystallography offers the unparalleled advantage of a clear mathematical formalism relating the experimental data to the protein–ligand model. In the case of X-ray crystallography, the primary experimental evidence is the electron density of the molecules forming the crystal. The first step in the generation of an accurate and precise crystallographic model is the interpretation of the electron density of the crystal, typically carried out by construction of an atomic model. The atomic model must then be validated for fit to the experimental electron density and also for agreement with prior expectations of stereochemistry. Stringent validation of protein–ligand models has become possible as a result of the mandatory deposition of primary diffraction data, and many computational tools are now available to aid in the validation process. Validation of protein–ligand complexes has revealed some instances of overenthusiastic interpretation of ligand density. Fundamental concepts and metrics of protein–ligand quality validation are discussed and we highlight software tools to assist in this process. It is essential that end users select high quality protein–ligand models for their computational and biological studies, and we provide an overview of how this can be achieved. PMID:25665575

  2. Models of protein-ligand crystal structures: trust, but verify.

    PubMed

    Deller, Marc C; Rupp, Bernhard

    2015-09-01

    X-ray crystallography provides the most accurate models of protein-ligand structures. These models serve as the foundation of many computational methods including structure prediction, molecular modelling, and structure-based drug design. The success of these computational methods ultimately depends on the quality of the underlying protein-ligand models. X-ray crystallography offers the unparalleled advantage of a clear mathematical formalism relating the experimental data to the protein-ligand model. In the case of X-ray crystallography, the primary experimental evidence is the electron density of the molecules forming the crystal. The first step in the generation of an accurate and precise crystallographic model is the interpretation of the electron density of the crystal, typically carried out by construction of an atomic model. The atomic model must then be validated for fit to the experimental electron density and also for agreement with prior expectations of stereochemistry. Stringent validation of protein-ligand models has become possible as a result of the mandatory deposition of primary diffraction data, and many computational tools are now available to aid in the validation process. Validation of protein-ligand complexes has revealed some instances of overenthusiastic interpretation of ligand density. Fundamental concepts and metrics of protein-ligand quality validation are discussed and we highlight software tools to assist in this process. It is essential that end users select high quality protein-ligand models for their computational and biological studies, and we provide an overview of how this can be achieved.

  3. Computational tool for optimizing the essential oils utilization in inhibiting the bacterial growth

    PubMed Central

    El-Attar, Noha E; Awad, Wael A

    2017-01-01

    Day after day, the importance of relying on nature in fields such as the food, medical, and pharmaceutical industries is increasing. Essential oils (EOs) are considered one of the most significant natural products for use as antimicrobials, antioxidants, antitumorals, and anti-inflammatories. Optimizing the usage of EOs is a big challenge faced by scientific researchers because of the complexity of the chemical composition of every EO, in addition to the difficulty of determining which is best at inhibiting bacterial activity. The goal of this article is to present a new computational tool based on two methodologies: reduction by using rough sets and optimization with particle swarm optimization. The developed tool, dubbed the Essential Oil Reduction and Optimization Tool, is applied to 24 types of EOs that have been tested against 17 different species of bacteria. PMID:28919787
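
    Neither the rough-set reduction nor the particle swarm optimization step is spelled out in the abstract, so the sketch below shows only a generic, self-contained PSO loop minimizing a toy objective. It is an assumption-laden illustration of the optimization half of the approach, not the published Essential Oil Reduction and Optimization Tool.

```python
# Generic particle swarm optimization minimizing a toy 2-D objective.
import numpy as np

rng = np.random.default_rng(42)

def objective(x):                      # sphere function as a stand-in objective
    return float(np.sum(x ** 2))

n_particles, n_dim, n_iter = 20, 2, 100
w, c1, c2 = 0.7, 1.5, 1.5              # inertia and acceleration coefficients

pos = rng.uniform(-5, 5, (n_particles, n_dim))
vel = np.zeros((n_particles, n_dim))
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, n_dim)), rng.random((n_particles, n_dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best solution:", gbest, "objective:", objective(gbest))
```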

  4. Accurate thermoelastic tensor and acoustic velocities of NaCl

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcondes, Michel L., E-mail: michel@if.usp.br; Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455; Shukla, Gaurav, E-mail: shukla@physics.umn.edu

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  5. VISIONET: intuitive visualisation of overlapping transcription factor networks, with applications in cardiogenic gene discovery.

    PubMed

    Nim, Hieu T; Furtado, Milena B; Costa, Mauro W; Rosenthal, Nadia A; Kitano, Hiroaki; Boyd, Sarah E

    2015-05-01

    Existing de novo software platforms have largely overlooked a valuable resource, the expertise of the intended biologist users. Typical data representations such as long gene lists, or highly dense and overlapping transcription factor networks, often hinder biologists from relating these results to their expertise. VISIONET, a streamlined visualisation tool built from experimental needs, enables biologists to transform large and dense overlapping transcription factor networks into sparse human-readable graphs via numerical filtering. The VISIONET interface allows users without a computing background to interactively explore and filter their data, and empowers them to apply their specialist knowledge to far more complex and substantial data sets than is currently possible. Applying VISIONET to the Tbx20-Gata4 transcription factor network led to the discovery and validation of Aldh1a2, an essential developmental gene associated with various important cardiac disorders, as a healthy adult cardiac fibroblast gene co-regulated by the cardiogenic transcription factors Gata4 and Tbx20. We demonstrate with experimental validations the utility of VISIONET for expertise-driven gene discovery that opens new experimental directions that would not otherwise have been identified.
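
    The core operation described here, turning a dense weighted transcription-factor network into a sparse readable graph by numerical filtering, can be pictured in a few lines of networkx. The gene names, weights, and threshold below are placeholders, not data from the Tbx20-Gata4 study.

```python
# Filtering a dense weighted TF-target network down to strong edges (illustration only).
import networkx as nx

g = nx.DiGraph()
# (transcription factor, target gene, weight) triples with placeholder values
edges = [
    ("Gata4", "Aldh1a2", 0.92), ("Tbx20", "Aldh1a2", 0.88),
    ("Gata4", "GeneX",   0.11), ("Tbx20", "GeneY",   0.07),
]
g.add_weighted_edges_from(edges)

threshold = 0.5                       # keep only edges at or above this weight
strong = nx.DiGraph()
strong.add_edges_from(
    (u, v, d) for u, v, d in g.edges(data=True) if d["weight"] >= threshold
)
print(sorted(strong.edges(data="weight")))
```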

  6. Monte carlo simulations of the n_TOF lead spallation target with the Geant4 toolkit: A benchmark study

    NASA Astrophysics Data System (ADS)

    Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.; Meo, S. Lo; Massimi, C.; Barbagallo, M.; Colonna, N.; Mancussi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Balibrea, J.; Bečvář, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Cortés, G.; Cosentino, L.; Damone, L. A.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Göbel, K.; Gómez-Hornillos, M. B.; García, A. R.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Griesmayer, E.; Gunsing, F.; Harada, H.; Heinitz, S.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Kavrigin, P.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lonsdale, S. J.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Mastinu, P.; Mastromarco, M.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Musumarra, A.; Negret, A.; Nolte, R.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Radeck, D.; Rauscher, T.; Reifarth, R.; Rout, P. C.; Rubbia, C.; Ryan, J. A.; Saxena, A.; Schillebeeckx, P.; Schumann, D.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Valenta, S.; Variale, V.; Vaz, P.; Ventura, A.; Vlastou, R.; Wallner, A.; Warren, S.; Woods, P. J.; Wright, T.; Žugec, P.

    2017-09-01

    Monte Carlo (MC) simulations are an essential tool to determine fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes can not be measured or at least not in every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed to provide useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.

  7. Gene essentiality, conservation index and co-evolution of genes in cyanobacteria.

    PubMed

    Tiruveedula, Gopi Siva Sai; Wangikar, Pramod P

    2017-01-01

    Cyanobacteria, a group of photosynthetic prokaryotes, dominate the earth with ~10^15 g wet biomass. Despite diversity in habitats and an ancient origin, the cyanobacterial phylum has retained a significant core genome. Cyanobacteria are being explored for direct conversion of solar energy and carbon dioxide into biofuels. For this, efficient cyanobacterial strains will need to be designed via metabolic engineering. This will require identification of target knockouts to channelize the flow of carbon toward the product of interest while minimizing deletions of essential genes. We propose the “Gene Conservation Index” (GCI) as a quick measure to predict gene essentiality in cyanobacteria. GCI is based on the phylogenetic profile of a gene constructed with a reduced dataset of cyanobacterial genomes. GCI is the percentage of organism clusters in which the query gene is present in the reduced dataset. Of the 750 genes deemed to be essential in the experimental study on S. elongatus PCC 7942, we found 494 to be conserved across the phylum; these largely comprise the essential metabolic pathways. In contrast, the conserved but non-essential genes broadly comprise genes required under stress conditions. Exceptions to this rule include genes such as the glycogen synthesis and degradation enzymes, deoxyribose-phosphate aldolase (DERA), glucose-6-phosphate 1-dehydrogenase (zwf) and fructose-1,6-bisphosphatase class 1, which are conserved but non-essential. While the essential genes are to be avoided during gene knockout studies as potentially lethal deletions, the non-essential but conserved set of genes could be interesting targets for metabolic engineering. Further, we identify clusters of co-evolving genes (CCG), which provide insights that may be useful in annotation. Principal component analysis (PCA) plots of the CCGs are demonstrated as data visualization tools that are complementary to the conventional heatmaps. Our dataset consists of phylogenetic profiles for 23,643 non-redundant cyanobacterial genes. We believe that the data and the analysis presented here will be a great resource to the scientific community interested in cyanobacteria.
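
    Because GCI is defined as the percentage of organism clusters in the reduced genome set that contain the query gene, the metric itself is easy to reproduce from a presence/absence profile. The sketch below does exactly that with made-up clusters and gene names; it is an illustration of the definition, not the authors' dataset.

```python
# Gene Conservation Index: percentage of organism clusters containing the gene.
# Clusters and gene memberships below are invented for illustration.
clusters = {
    "cluster_01": {"psbA", "rbcL", "glgC"},
    "cluster_02": {"psbA", "rbcL"},
    "cluster_03": {"psbA", "glgC", "zwf"},
    "cluster_04": {"psbA", "rbcL", "zwf"},
}

def gene_conservation_index(gene, clusters):
    """Percentage of clusters whose gene set contains the query gene."""
    present = sum(1 for genes in clusters.values() if gene in genes)
    return 100.0 * present / len(clusters)

for gene in ("psbA", "glgC", "zwf"):
    print(f"GCI({gene}) = {gene_conservation_index(gene, clusters):.0f}%")
```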

  8. Digital Technology Snapshot of the Literacy and Essential Skills Field 2013. Summary Report

    ERIC Educational Resources Information Center

    Trottier, Vicki

    2013-01-01

    From January to March 2013, "Canadian Literacy and Learning Network" (CLLN) conducted a snapshot to provide information about how digital technology tools are being used in the Literacy and Essential Skills (L/ES) field. The snapshot focused primarily on digital tools and activities that meet the organizational needs of provincial and…

  9. The QSPR-THESAURUS: the online platform of the CADASTER project.

    PubMed

    Brandmaier, Stefan; Peijnenburg, Willie; Durjava, Mojca K; Kolar, Boris; Gramatica, Paola; Papa, Ester; Bhhatarai, Barun; Kovarich, Simona; Cassani, Stefano; Roy, Partha Pratim; Rahmberg, Magnus; Öberg, Tomas; Jeliazkova, Nina; Golsteijn, Laura; Comber, Mike; Charochkina, Larisa; Novotarskyi, Sergii; Sushko, Iurii; Abdelaziz, Ahmed; D'Onofrio, Elisa; Kunwar, Prakash; Ruggiu, Fiorella; Tetko, Igor V

    2014-03-01

    The aim of the CADASTER project (CAse Studies on the Development and Application of in Silico Techniques for Environmental Hazard and Risk Assessment) was to exemplify REACH-related hazard assessments for four classes of chemical compound, namely polybrominated diphenylethers, per- and polyfluorinated compounds, (benzo)triazoles, and musks and fragrances. The QSPR-THESAURUS website (http://qspr-thesaurus.eu) was established as the project's online platform to upload, store, apply, and also create, models within the project. We overview the main features of the website, such as model upload, experimental design and hazard assessment to support risk assessment, and integration with other web tools, all of which are essential parts of the QSPR-THESAURUS. 2014 FRAME.

  10. A simple method for processing data with least square method

    NASA Astrophysics Data System (ADS)

    Wang, Chunyan; Qi, Liqun; Chen, Yongxiang; Pang, Guangning

    2017-08-01

    The least squares method is widely used in data processing and error estimation. It has become an essential technique for parameter estimation, data processing, regression analysis and experimental data fitting, and a standard tool for statistical inference. In measurement data analysis, complex relationships are usually handled with the least squares principle, i.e., matrices are used to obtain the final estimate and to improve its accuracy. In this paper, a new method for obtaining the least squares solution is presented; it is based on algebraic computation and is relatively straightforward and easy to understand. The practicability of this method is illustrated by a concrete example.
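
    As a concrete reminder of what the standard least squares estimate looks like, the sketch below fits a straight line both through the explicit normal equations (the matrix formulation mentioned above) and with numpy's lstsq, and the two agree. The data points are arbitrary, and this is the textbook solution rather than the alternative method proposed in the record.

```python
# Least squares line fit: normal equations vs. numpy.linalg.lstsq (illustrative data).
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

A = np.column_stack([np.ones_like(x), x])      # design matrix for y = b0 + b1*x

# Normal equations: (A^T A) beta = A^T y
beta_normal = np.linalg.solve(A.T @ A, A.T @ y)

# Library solution for comparison
beta_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)

print("normal equations:", beta_normal)
print("lstsq:           ", beta_lstsq)
```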

  11. Available clinical markers of treatment outcome integrated in mathematical models to guide therapy in HIV infection.

    PubMed

    Vergu, Elisabeta; Mallet, Alain; Golmard, Jean-Louis

    2004-02-01

    Because treatment failure in many HIV-infected persons may be due to multiple causes, including resistance to antiretroviral agents, it is important to better tailor drug therapy to individual patients. This improvement requires the prediction of treatment outcome from baseline immunological or virological factors, and from results of resistance tests. Here, we review briefly the available clinical factors that have an impact on therapy outcome, and discuss the role of a predictive modelling approach integrating these factors proposed in a previous work. Mathematical and statistical models could become essential tools to address questions that are difficult to study clinically and experimentally, thereby guiding decisions in the choice of individualized drug regimens.

  12. Engineering Approaches to Illuminating Brain Structure and Dynamics

    PubMed Central

    Deisseroth, Karl; Schnitzer, Mark J.

    2017-01-01

    Historical milestones in neuroscience have come in diverse forms, ranging from the resolution of specific biological mysteries via creative experimentation to broad technological advances allowing neuroscientists to ask new kinds of questions. The continuous development of tools is driven with a special necessity by the complexity, fragility, and inaccessibility of intact nervous systems, such that inventive technique development and application drawing upon engineering and the applied sciences has long been essential to neuroscience. Here we highlight recent technological directions in neuroscience spurred by progress in optical, electrical, mechanical, chemical, and biological engineering. These research areas are poised for rapid growth and will likely be central to the practice of neuroscience well into the future. PMID:24183010

  13. Metazen – metadata capture for metagenomes

    DOE PAGES

    Bischof, Jared; Harrison, Travis; Paczian, Tobias; ...

    2014-12-08

    Background: As the impact and prevalence of large-scale metagenomic surveys grow, so does the acute need for more complete and standards compliant metadata. Metadata (data describing data) provides an essential complement to experimental data, helping to answer questions about its source, mode of collection, and reliability. Metadata collection and interpretation have become vital to the genomics and metagenomics communities, but considerable challenges remain, including exchange, curation, and distribution. Currently, tools are available for capturing basic field metadata during sampling, and for storing, updating and viewing it. These tools are not specifically designed for metagenomic surveys; in particular, they lack the appropriate metadata collection templates, a centralized storage repository, and a unique ID linking system that can be used to easily port complete and compatible metagenomic metadata into widely used assembly and sequence analysis tools. Results: Metazen was developed as a comprehensive framework designed to enable metadata capture for metagenomic sequencing projects. Specifically, Metazen provides a rapid, easy-to-use portal to encourage early deposition of project and sample metadata. Conclusion: Metazen is an interactive tool that aids users in recording their metadata in a complete and valid format. A defined set of mandatory fields captures vital information, while the option to add fields provides flexibility.

  14. Metazen – metadata capture for metagenomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bischof, Jared; Harrison, Travis; Paczian, Tobias

    Background: As the impact and prevalence of large-scale metagenomic surveys grow, so does the acute need for more complete and standards compliant metadata. Metadata (data describing data) provides an essential complement to experimental data, helping to answer questions about its source, mode of collection, and reliability. Metadata collection and interpretation have become vital to the genomics and metagenomics communities, but considerable challenges remain, including exchange, curation, and distribution. Currently, tools are available for capturing basic field metadata during sampling, and for storing, updating and viewing it. These tools are not specifically designed for metagenomic surveys; in particular, they lack the appropriate metadata collection templates, a centralized storage repository, and a unique ID linking system that can be used to easily port complete and compatible metagenomic metadata into widely used assembly and sequence analysis tools. Results: Metazen was developed as a comprehensive framework designed to enable metadata capture for metagenomic sequencing projects. Specifically, Metazen provides a rapid, easy-to-use portal to encourage early deposition of project and sample metadata. Conclusion: Metazen is an interactive tool that aids users in recording their metadata in a complete and valid format. A defined set of mandatory fields captures vital information, while the option to add fields provides flexibility.

  15. Development and experimental qualification of a calculation scheme for the evaluation of gamma heating in experimental reactors. Application to MARIA and Jules Horowitz (JHR) MTR Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tarchalski, M.; Pytel, K.; Wroblewska, M.

    2015-07-01

    Precise computational determination of nuclear heating, which consists predominantly of gamma heating (more than 80%), is one of the challenges in material testing reactor exploitation. Due to the sophisticated construction and conditions of the experimental programs planned in JHR, it became essential to use the most accurate and precise gamma heating model. Before the JHR starts to operate, gamma heating evaluation methods need to be developed and qualified in other experimental reactor facilities. This is done inter alia using the OSIRIS, MINERVE or EOLE research reactors in France. Furthermore, MARIA - the Polish material testing reactor - has been chosen to contribute to the qualification of gamma heating calculation schemes/tools. This reactor has some characteristics close to those of JHR (beryllium usage, fuel element geometry). To evaluate gamma heating in the JHR and MARIA reactors, both simulation tools and an experimental program have been developed and performed. For gamma heating simulation, a new calculation scheme and gamma heating model of MARIA have been developed using the TRIPOLI4 and APOLLO2 codes. The calculation outcome has been verified by comparison to experimental measurements in the MARIA reactor. To obtain more precise calculation results, the model of MARIA in TRIPOLI4 has been built using the whole geometry of the core. This has been done for the first time in the history of the MARIA reactor and was complex due to the cut-cone shape of all its elements. The material composition of burnt fuel elements has been implemented from APOLLO2 calculations. An experiment for nuclear heating measurements and calculation verification was carried out in September 2014. This involved neutron, photon and nuclear heating measurements at selected locations in the MARIA reactor using, in particular, Rh SPND, Ag SPND, an Ionization Chamber (all three from CEA), the KAROLINA calorimeter (NCBJ) and a Gamma Thermometer (CEA/SCK CEN). Measurements were made at forty points using four channels. The maximal nuclear heating evaluated from measurements is of the order of 2.5 W/g at half of the possible MARIA power (15 MW). The approach and the detailed program for experimental verification of calculations will be presented. The following points will be discussed: - development of a gamma heating model of the MARIA reactor with TRIPOLI4 (coupled neutron-photon mode) and an APOLLO2 model taking into account key parameters such as configuration of the core, experimental loading, control rod location, reactor power and fuel depletion; - design of specific measurement tools for MARIA experiments, including for instance a new single-cell calorimeter called the KAROLINA calorimeter; - description of the MARIA experimental program and a preliminary analysis of results; - comparison of calculations for the JHR and MARIA cores with experimental verification analysis, calculation behavior and n-γ 'environments'. (authors)

  16. Analysis methodology and development of a statistical tool for biodistribution data from internal contamination with actinides.

    PubMed

    Lamart, Stephanie; Griffiths, Nina M; Tchitchek, Nicolas; Angulo, Jaime F; Van der Meeren, Anne

    2017-03-01

    The aim of this work was to develop a computational tool that integrates several statistical analysis features for biodistribution data from internal contamination experiments. These data represent actinide levels in biological compartments as a function of time and are derived from activity measurements in tissues and excreta. These experiments aim at assessing the influence of different contamination conditions (e.g. intake route or radioelement) on the biological behavior of the contaminant. The ever increasing number of datasets and diversity of experimental conditions make the handling and analysis of biodistribution data difficult. This work sought to facilitate the statistical analysis of a large number of datasets and the comparison of results from diverse experimental conditions. Functional modules were developed using the open-source programming language R to facilitate specific operations: descriptive statistics, visual comparison, curve fitting, and implementation of biokinetic models. In addition, the structure of the datasets was harmonized using the same table format. Analysis outputs can be written in text files and updated data can be written in the consistent table format. Hence, a data repository is built progressively, which is essential for the optimal use of animal data. Graphical representations can be automatically generated and saved as image files. The resulting computational tool was applied using data derived from wound contamination experiments conducted under different conditions. In facilitating biodistribution data handling and statistical analyses, this computational tool ensures faster analyses and a better reproducibility compared with the use of multiple office software applications. Furthermore, re-analysis of archival data and comparison of data from different sources is made much easier. Hence this tool will help to understand better the influence of contamination characteristics on actinide biokinetics. Our approach can aid the optimization of treatment protocols and therefore contribute to the improvement of the medical response after internal contamination with actinides.
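
    One of the modules described above performs curve fitting on retention data (actinide level versus time). The tool itself was written in R; purely as an illustration of the same operation, the Python sketch below fits a single-exponential retention model to invented data. The model form, parameter values, and measurements are assumptions, not results from the study.

```python
# Illustrative retention-curve fit (single exponential) on invented biodistribution data.
import numpy as np
from scipy.optimize import curve_fit

def retention(t, a0, k):
    """Fraction of activity retained at time t for a one-compartment model."""
    return a0 * np.exp(-k * t)

t_days = np.array([1, 3, 7, 14, 30, 60], dtype=float)
activity = np.array([0.82, 0.70, 0.55, 0.41, 0.22, 0.09])   # made-up measurements

(a0, k), _ = curve_fit(retention, t_days, activity, p0=(1.0, 0.05))
print(f"fitted initial fraction a0 = {a0:.2f}, rate constant k = {k:.3f} /day")
print(f"biological half-time ~ {np.log(2) / k:.1f} days")
```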

  17. Upper-limb tremor suppression with a 7DOF exoskeleton power-assist robot.

    PubMed

    Kiguchi, Kazuo; Hayashi, Yoshiaki

    2013-01-01

    A tremor, one of the involuntary motions, is a somewhat rhythmic motion that may occur in various body parts. Although there are several kinds of tremor, essential tremor is the most common tremor disorder of the arm. Essential tremor is a disorder of unknown cause, and it is common in the elderly. It interferes with a patient's daily living activities, because it may occur during a voluntary motion. If a patient with essential tremor uses an EMG-based controlled power-assist robot, the robot might misinterpret the user's motion intention because of the effect of the tremor. In that case, upper-limb power-assist robots must carry out tremor suppression as well as power-assist, since a person performs various precise tasks with certain tools using the upper limb in daily living. Therefore, it is important to suppress the tremor at the hand and the grasped tool. However, with a tremor suppression control method that suppresses only the vibrations of the hand and the tip of the tool, vibration of other parts such as the elbow might occur. In this paper, a tremor suppression control method for an upper-limb power-assist robot is proposed. In the proposed method, the vibration of the elbow is suppressed in addition to that of the hand and the tip of the tool. The validity of the proposed method was verified by experiments.

  18. Market Intelligence Guide

    DTIC Science & Technology

    2012-01-05

    learn about the latest designs, trends in fashion, and scientific breakthroughs in chair ergonomics. Using this tradeshow, the Furnishings Commodity...these tools is essential to designing the optimal contract that reaps the most value from the exchange. Therefore, this market intelligence guide is...portfolio matrix) that are transferable to the not-for-profit sector are absent. Each of these tools is essential to designing the optimal contract that

  19. Numerical Simulation of Ground Coupling of Low Yield Nuclear Detonation

    DTIC Science & Technology

    2010-06-01

    Without nuclear testing, advanced simulation and experimental facilities, such as the National Ignition Facility (NIF), are essential to assuring safety, reliability, and effectiveness...in planning future experimental work at NIF. Subject terms: National Ignition Facility, GEODYN, Ground Coupling.

  20. Attenuation of cryocooler induced vibration using multimodal tuned dynamic absorbers

    NASA Astrophysics Data System (ADS)

    Veprik, Alexander; Babitsky, Vladimir; Tuito, Avi

    2017-05-01

    Modern infrared imagers often rely on split Stirling linear cryocoolers comprising a compressor and an expander, the relative position of which is governed by the optical design and packaging constraints. A force couple generated by imbalanced reciprocation of moving components inside both the compressor and the expander results in cryocooler-induced vibration comprising angular and translational tonal components, manifesting itself in the form of line-of-sight jitter and dynamic defocusing. Since a linear cryocooler is usually driven at a fixed and precisely adjustable frequency, a tuned dynamic absorber is a well-suited tool for vibration control. It is traditionally made in the form of a lightweight single-degree-of-freedom undamped mechanical resonator, the frequency of which is essentially matched with the driving frequency or vice versa. Unfortunately, the performance of such a traditional approach is limited in terms of simultaneously attenuating the translational and angular components of cooler-induced vibration. The authors enhance the traditional concept and consider a multimodal tuned dynamic absorber made in the form of a weakly damped mechanical resonator, where the frequencies of useful dynamic modes are essentially matched with the driving frequency. Dynamic analysis and experimental testing show that the dynamic reactions (forces and moments) produced by such a device may simultaneously attenuate both translational and angular components of cryocooler-induced vibration. The authors consider different embodiments and their suitability for different packaging concepts. The outcomes of theoretical predictions are supported by full-scale experimentation.
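
    The basic tuning rule behind such absorbers, matching the absorber's natural frequency to the fixed cryocooler drive frequency, reduces to choosing a spring stiffness for a given absorber mass. The sketch below shows that elementary single-mode calculation with placeholder numbers; it does not reproduce the multimodal design discussed in the paper.

```python
# Tuning a single-mode dynamic absorber to a fixed drive frequency (illustrative values).
import math

drive_freq_hz = 50.0        # cryocooler driving frequency (assumed)
absorber_mass_kg = 0.02     # small auxiliary mass (assumed)

# Natural frequency of a mass-spring absorber: f = (1 / 2*pi) * sqrt(k / m)
# => required stiffness k = m * (2*pi*f)^2
omega = 2.0 * math.pi * drive_freq_hz
stiffness = absorber_mass_kg * omega ** 2

print(f"required spring stiffness: {stiffness:.1f} N/m "
      f"to tune a {absorber_mass_kg*1000:.0f} g absorber to {drive_freq_hz:.0f} Hz")
```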

  1. Nucleation and microstructure development in Cr-Mo-V tool steel during gas atomization

    NASA Astrophysics Data System (ADS)

    Behúlová, M.; Grgač, P.; Čička, R.

    2017-11-01

    Nucleation studies of undercooled metallic melts are of essential interest for the understanding of phase selection, growth kinetics and microstructure development during their rapid non-equilibrium solidification. The paper deals with the modelling of nucleation processes and microstructure development in the hypoeutectic tool steel Ch12MF4 with the chemical composition of 2.37% C, 12.06 % Cr, 1.2% Mo, 4.0% V and balance Fe [wt. %] in the process of nitrogen gas atomization. Based on the classical theory of homogeneous nucleation, the nucleation temperature of molten rapidly cooled spherical particles from this alloy with diameter from 40 μm to 600 μm in the gas atomization process is calculated using various estimations of parameters influencing the nucleation process - the Gibbs free energy difference between solid and liquid phases and the solid/liquid interfacial energy. Results of numerical calculations are compared with experimentally measured nucleation temperatures during levitation experiments and microstructures developed in rapidly solidified powder particles from the investigated alloy.
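
    The classical homogeneous nucleation picture used in such calculations gives a critical radius r* = 2σ/ΔG_v and an energy barrier ΔG* = 16πσ³/(3ΔG_v²) from the solid/liquid interfacial energy σ and the volumetric driving force ΔG_v. The sketch below evaluates these textbook expressions with placeholder parameter values; the estimates in the study itself are composition- and undercooling-specific and are not reproduced here.

```python
# Classical homogeneous nucleation: critical radius and energy barrier.
# Textbook formulas with placeholder parameters (not those of the Ch12MF4 steel study).
import math

sigma = 0.30          # solid/liquid interfacial energy, J/m^2 (assumed)
delta_g_v = 1.0e8     # volumetric Gibbs free energy difference, J/m^3 (assumed)

r_crit = 2.0 * sigma / delta_g_v                              # critical nucleus radius, m
dg_crit = 16.0 * math.pi * sigma ** 3 / (3.0 * delta_g_v ** 2)  # nucleation barrier, J

k_B = 1.380649e-23    # Boltzmann constant, J/K
T = 1500.0            # assumed melt temperature, K

print(f"critical radius: {r_crit * 1e9:.2f} nm")
print(f"nucleation barrier: {dg_crit:.2e} J ({dg_crit / (k_B * T):.0f} k_B*T at {T:.0f} K)")
```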

  2. How medicine has become a science?

    PubMed

    Zieliński, Andrzej

    2014-01-01

    This historical review of medical practice draws attention to how late in its very long history therapies of proven effectiveness were introduced. The author attributes this to the late development of methods capable of determining causal relations, which would scientifically justify the identification of the causes and risk factors of diseases as well as the evaluation of the effectiveness of preventive and therapeutic procedures. Among the fundamental tools for scientific knowledge of the causes and mechanisms of diseases, the author points to the achievements of basic science and the development of epidemiological methods used to study causal relationships. In the author's opinion, the results of basic research are an essential source of variables among which the causes and risk factors of the studied conditions, including diseases, are more likely to be found. The author also stresses the role of medical technology, which is the primary source of potential medicines, other therapeutic procedures and diagnostic methods whose effectiveness is tested in experimental epidemiological studies. Medical technologies also create tools for the development of the basic sciences.

  3. TeamWATCH: Visualizing development activities using a 3-D city metaphor to improve conflict detection and team awareness

    PubMed Central

    Ye, Xin

    2018-01-01

    The awareness of others’ activities has been widely recognized as essential in facilitating coordination in a team among Computer-Supported Cooperative Work communities. Several field studies of software developers in large software companies such as Microsoft have shown that coworker and artifact awareness are the most common information needs for software developers; however, they are also two of the seven most frequently unsatisfied information needs. To address this problem, we built a workspace awareness tool named TeamWATCH to visualize developer activities using a 3-D city metaphor. In this paper, we discuss the importance of awareness in software development, review existing workspace awareness tools, present the design and implementation of TeamWATCH, and evaluate how it could help detect and resolve conflicts earlier and better maintain group awareness via a controlled experiment. The experimental results showed that the subjects using TeamWATCH performed significantly better with respect to early conflict detection and resolution. PMID:29558519

  4. Computational modelling of genome-scale metabolic networks and its application to CHO cell cultures.

    PubMed

    Rejc, Živa; Magdevska, Lidija; Tršelič, Tilen; Osolin, Timotej; Vodopivec, Rok; Mraz, Jakob; Pavliha, Eva; Zimic, Nikolaj; Cvitanović, Tanja; Rozman, Damjana; Moškon, Miha; Mraz, Miha

    2017-09-01

    Genome-scale metabolic models (GEMs) have become increasingly important in recent years. Currently, GEMs are the most accurate in silico representation of the genotype-phenotype link. They allow us to study complex networks from the systems perspective. Their application may drastically reduce the amount of experimental and clinical work, improve diagnostic tools and increase our understanding of complex biological phenomena. GEMs have also demonstrated high potential for the optimisation of bio-based production of recombinant proteins. Herein, we review the basic concepts, methods, resources and software tools used for the reconstruction and application of GEMs. We overview the evolution of the modelling efforts devoted to the metabolism of Chinese Hamster Ovary (CHO) cells. We present a case study on CHO cell metabolism under different amino acid depletions. This leads us to the identification of the most influential as well as essential amino acids in selected CHO cell lines. Copyright © 2017 Elsevier Ltd. All rights reserved.
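
    As a hedged illustration of how such an amino acid depletion study can be scripted against a GEM with the COBRApy toolbox (the model file name and the exchange-reaction identifiers below are placeholders, not those of the CHO model used in the paper): close the uptake bound of one amino acid at a time and compare the predicted growth with the baseline.

      import cobra

      model = cobra.io.read_sbml_model("CHO_model.xml")   # placeholder model file
      baseline = model.slim_optimize()                    # growth (objective value) with the full medium

      # Placeholder exchange-reaction IDs for glutamine, asparagine and arginine uptake.
      depletions = {}
      for ex_id in ["EX_gln__L_e", "EX_asn__L_e", "EX_arg__L_e"]:
          with model:                                     # the context manager restores bounds on exit
              model.reactions.get_by_id(ex_id).lower_bound = 0.0   # block uptake of this amino acid
              depletions[ex_id] = model.slim_optimize()

      for ex_id, growth in depletions.items():
          essential = (growth != growth) or (growth < 1e-6 * baseline)   # NaN (infeasible) or ~zero growth
          print(f"{ex_id}: growth = {growth}  ->  {'essential' if essential else 'dispensable'}")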

  5. Exploring the potential of a structural alphabet-based tool for mining multiple target conformations and target flexibility insight

    PubMed Central

    Chéron, Jean-Baptiste; Triki, Dhoha; Senac, Caroline; Flatters, Delphine; Camproux, Anne-Claude

    2017-01-01

    Protein flexibility is often implied in binding with different partners and is essential for protein function. The growing number of macromolecular structures in the Protein Data Bank (PDB) and their redundancy has become a major source of structural knowledge of the protein universe. The analysis of structural variability through available redundant structures of a target, called multiple target conformations (MTC), obtained using experimental or modeling methods and under different biological conditions or from different sources, is one way to explore protein flexibility. This analysis is essential to improve the understanding of the various mechanisms associated with protein target function and flexibility. In this study, we explored the structural variability of three biological targets by analyzing different MTC sets associated with these targets. To facilitate the study of these MTC sets, we have developed an efficient tool, SA-conf, dedicated to capturing and linking amino acid and local structure variability and to analyzing the target structural variability space. The advantage of SA-conf is that it can be applied to diverse sets composed of MTCs available in the PDB, obtained using NMR and crystallography, or of homology models. The tool can also be applied to MTC sets obtained by dynamics approaches. Our results showed that the SA-conf tool is effective for quantifying the structural variability of an MTC set and for localizing the structurally variable positions and regions of the target. By selecting suitable MTC subsets and comparing their variability as detected by SA-conf, we highlighted different sources of target flexibility, such as flexibility induced by a binding partner, flexibility induced by mutation, and intrinsic flexibility. Our results support the interest of mining the available structures associated with a target to offer valuable insight into target flexibility and interaction mechanisms. The SA-conf executable script, with a set of pre-compiled binaries, is available at http://www.mti.univ-paris-diderot.fr/recherche/plateformes/logiciels. PMID:28817602

  6. Exploring the potential of a structural alphabet-based tool for mining multiple target conformations and target flexibility insight.

    PubMed

    Regad, Leslie; Chéron, Jean-Baptiste; Triki, Dhoha; Senac, Caroline; Flatters, Delphine; Camproux, Anne-Claude

    2017-01-01

    Protein flexibility is often implied in binding with different partners and is essential for protein function. The growing number of macromolecular structures in the Protein Data Bank (PDB) and their redundancy has become a major source of structural knowledge of the protein universe. The analysis of structural variability through available redundant structures of a target, called multiple target conformations (MTC), obtained using experimental or modeling methods and under different biological conditions or from different sources, is one way to explore protein flexibility. This analysis is essential to improve the understanding of the various mechanisms associated with protein target function and flexibility. In this study, we explored the structural variability of three biological targets by analyzing different MTC sets associated with these targets. To facilitate the study of these MTC sets, we have developed an efficient tool, SA-conf, dedicated to capturing and linking amino acid and local structure variability and to analyzing the target structural variability space. The advantage of SA-conf is that it can be applied to diverse sets composed of MTCs available in the PDB, obtained using NMR and crystallography, or of homology models. The tool can also be applied to MTC sets obtained by dynamics approaches. Our results showed that the SA-conf tool is effective for quantifying the structural variability of an MTC set and for localizing the structurally variable positions and regions of the target. By selecting suitable MTC subsets and comparing their variability as detected by SA-conf, we highlighted different sources of target flexibility, such as flexibility induced by a binding partner, flexibility induced by mutation, and intrinsic flexibility. Our results support the interest of mining the available structures associated with a target to offer valuable insight into target flexibility and interaction mechanisms. The SA-conf executable script, with a set of pre-compiled binaries, is available at http://www.mti.univ-paris-diderot.fr/recherche/plateformes/logiciels.
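
    A generic way to quantify per-position local-structure variability across an MTC set (not SA-conf's actual metric, just an illustrative sketch over hypothetical, already-aligned structural-letter strings) is the Shannon entropy of the letters observed at each position.

      import math
      from collections import Counter

      # Hypothetical aligned structural-letter strings, one per conformation of the same target.
      conformations = ["AABCD", "AABCC", "AABDD", "AABCD"]

      length = len(conformations[0])
      for pos in range(length):
          letters = [c[pos] for c in conformations]
          counts = Counter(letters)
          n = len(letters)
          entropy = -sum((k / n) * math.log2(k / n) for k in counts.values())  # 0 bits = structurally rigid position
          print(f"position {pos}: {dict(counts)}  entropy = {entropy:.2f} bits")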

  7. Development and Validation of a Rubric for Diagnosing Students' Experimental Design Knowledge and Difficulties.

    PubMed

    Dasgupta, Annwesa P; Anderson, Trevor R; Pelaez, Nancy

    2014-01-01

    It is essential to teach students about experimental design, as this facilitates their deeper understanding of how most biological knowledge was generated and gives them tools to perform their own investigations. Despite the importance of this area, surprisingly little is known about what students actually learn from designing biological experiments. In this paper, we describe a rubric for experimental design (RED) that can be used to measure knowledge of and diagnose difficulties with experimental design. The development and validation of the RED was informed by a literature review and empirical analysis of undergraduate biology students' responses to three published assessments. Five areas of difficulty with experimental design were identified: the variable properties of an experimental subject; the manipulated variables; measurement of outcomes; accounting for variability; and the scope of inference appropriate for experimental findings. Our findings revealed that some difficulties, documented some 50 yr ago, still exist among our undergraduate students, while others remain poorly investigated. The RED shows great promise for diagnosing students' experimental design knowledge in lecture settings, laboratory courses, research internships, and course-based undergraduate research experiences. It also shows potential for guiding the development and selection of assessment and instructional activities that foster experimental design. © 2014 A. P. Dasgupta et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  8. A three-dimensional inverse finite element analysis of the heel pad.

    PubMed

    Chokhandre, Snehal; Halloran, Jason P; van den Bogert, Antonie J; Erdemir, Ahmet

    2012-03-01

    Quantification of the plantar tissue behavior of the heel pad is essential for developing computational models for predictive analysis of preventive treatment options, such as footwear, for patients with diabetes. Simulation-based studies in the past have generally adopted heel pad properties from the literature, in turn using heel-specific geometry with the material properties of a different heel. In exceptional cases, patient-specific material characterization was performed with simplified two-dimensional models, without further evaluation of the heel-specific response under different loading conditions. The aim of this study was to conduct an inverse finite element analysis of the heel in order to calculate heel-specific material properties in situ. Multidimensional experimental data available from a previous cadaver study by Erdemir et al. ("An Elaborate Data Set Characterizing the Mechanical Response of the Foot," ASME J. Biomech. Eng., 131(9), pp. 094502) were used for model development, optimization, and evaluation of material properties. A specimen-specific three-dimensional finite element representation was developed. Heel pad material properties were determined by inverse finite element analysis, fitting the model behavior to the experimental data. Compression-dominant loading, applied using a spherical indenter, was used to optimize the material properties. The optimized material properties were evaluated through simulations representative of a combined loading scenario (compression and anterior-posterior shear) with a spherical indenter and of compression-dominant loading applied using an elevated platform. The optimized heel pad material coefficients were 0.001084 MPa (μ) and 9.780 (α), with an effective Poisson's ratio (ν) of 0.475, for a first-order nearly incompressible Ogden material model. The model-predicted structural response of the heel pad was in good agreement for both the optimization (<1.05% maximum tool force, 0.9% maximum tool displacement) and validation cases (6.5% maximum tool force, 15% maximum tool displacement). The inverse analysis successfully predicted the material properties for the given specimen-specific heel pad using the experimental data for that specimen. The modeling framework and results can be used for accurate predictions of the three-dimensional interaction of the heel pad with its surroundings.
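
    For context, a minimal sketch evaluating the reported first-order Ogden coefficients in uniaxial compression (the strain-energy convention below, W = (2μ/α²)(λ1^α + λ2^α + λ3^α − 3), is one common form and may differ from the one used in the study; the stretch values are hypothetical):

      # Coefficients reported in the abstract for the first-order, nearly incompressible Ogden model.
      mu, alpha = 0.001084, 9.780     # [MPa], [-]

      def cauchy_stress_uniaxial(lam):
          """Uniaxial Cauchy stress of an incompressible first-order Ogden solid,
          assuming W = (2*mu/alpha**2) * (l1**alpha + l2**alpha + l3**alpha - 3)."""
          return (2.0 * mu / alpha) * (lam ** alpha - lam ** (-alpha / 2.0))

      for lam in (0.95, 0.90, 0.80):  # compressive stretches of the order seen in heel-pad indentation
          print(f"stretch {lam:.2f}: Cauchy stress = {cauchy_stress_uniaxial(lam) * 1e3:8.3f} kPa")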

  9. A Comparison of Predictive Thermo and Water Solvation Property Prediction Tools and Experimental Data for Selected Traditional Chemical Warfare Agents and Simulants II: COSMO RS and COSMOTherm

    DTIC Science & Technology

    2017-04-01


  10. Probing the toxicity of nanoparticles: a unified in silico machine learning model based on perturbation theory.

    PubMed

    Concu, Riccardo; Kleandrova, Valeria V; Speck-Planche, Alejandro; Cordeiro, M Natália D S

    2017-09-01

    Nanoparticles (NPs) are part of our daily life, having a wide range of applications in engineering, physics, chemistry, and biomedicine. However, there are serious concerns regarding the harmful effects that NPs can cause to different biological systems and their ecosystems. Toxicity testing is an essential step in assessing the potential risks of NPs, but the experimental assays are often very expensive and usually too slow to flag the number of NPs that may cause adverse effects. In silico models centered on quantitative structure-activity/toxicity relationships (QSAR/QSTR) are alternative tools that have become valuable supports to risk assessment, rationalizing the search for safer NPs. In this work, we develop a unified QSTR-perturbation model based on artificial neural networks, aimed at simultaneously predicting general toxicity profiles of NPs under diverse experimental conditions. The model is derived from 54,371 NP-NP pair cases generated by applying perturbation theory to a set of 260 unique NPs, and showed an accuracy higher than 97% in both the training and validation sets. A physicochemical interpretation of the different descriptors in the model is additionally provided. The QSTR-perturbation model is then employed to predict the toxic effects of several NPs not included in the original dataset. The theoretical results obtained for this independent set are strongly consistent with the experimental evidence found in the literature, suggesting that the present QSTR-perturbation model can be viewed as a promising and reliable computational tool for probing the toxicity of NPs.

  11. Quantum probabilities from quantum entanglement: experimentally unpacking the Born rule

    DOE PAGES

    Harris, Jérémie; Bouchard, Frédéric; Santamato, Enrico; ...

    2016-05-11

    The Born rule, a foundational axiom used to deduce probabilities of events from wavefunctions, is indispensable in the everyday practice of quantum physics. It is also key in the quest to reconcile the ostensibly inconsistent laws of the quantum and classical realms, as it confers physical significance to reduced density matrices, the essential tools of decoherence theory. Following Bohr's Copenhagen interpretation, textbooks postulate the Born rule outright. However, recent attempts to derive it from other quantum principles have been successful, holding promise for simplifying and clarifying the quantum foundational bedrock. Moreover, a major family of derivations is based on envariance, a recently discovered symmetry of entangled quantum states. Here, we identify and experimentally test three premises central to these envariance-based derivations, thus demonstrating, in the microworld, the symmetries from which the Born rule is derived. Furthermore, we demonstrate envariance in a purely local quantum system, showing its independence from relativistic causality.

  12. Using the Textpresso Site-Specific Recombinases Web server to identify Cre expressing mouse strains and floxed alleles.

    PubMed

    Condie, Brian G; Urbanski, William M

    2014-01-01

    Effective tools for searching the biomedical literature are essential for identifying reagents or mouse strains as well as for effective experimental design and informed interpretation of experimental results. We have built the Textpresso Site Specific Recombinases (Textpresso SSR) Web server to enable researchers who use mice to perform in-depth searches of a rapidly growing and complex part of the mouse literature. Our Textpresso Web server provides an interface for searching the full text of most of the peer-reviewed publications that report the characterization or use of mouse strains that express Cre or Flp recombinase. The database also contains most of the publications that describe the characterization or analysis of strains carrying conditional alleles or transgenes that can be inactivated or activated by site-specific recombinases such as Cre or Flp. Textpresso SSR complements the existing online databases that catalog Cre and Flp expression patterns by providing a unique online interface for the in-depth text mining of the site specific recombinase literature.

  13. Functional complexity and ecosystem stability: an experimental approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Voris, P.; O'Neill, R.V.; Shugart, H.H.

    1978-01-01

    The complexity-stability hypothesis was experimentally tested using intact terrestrial microcosms. Functional complexity was defined as the number and significance of component interactions (i.e., population interactions, physical-chemical reactions, biological turnover rates) influenced by nonlinearities, feedbacks, and time delays. It was postulated that functional complexity could be nondestructively measured through analysis of a signal generated by the system. The power spectrum of hourly CO2 efflux from eleven old-field microcosms was analyzed for the number of low-frequency peaks and used to rank the functional complexity of each system. Ranking of ecosystem stability was based on the capacity of the system to retain essential nutrients and was measured by the net loss of Ca after the system was stressed. Rank correlation supported the hypothesis that increasing ecosystem functional complexity leads to increasing ecosystem stability. The results indicated that complex functional dynamics can serve to stabilize the system. The results also demonstrated that microcosms are useful tools for system-level investigations.
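
    The spectral-ranking step described above can be sketched as follows (synthetic CO2 data and hypothetical thresholds, not the authors' original analysis): estimate the power spectral density of the hourly efflux signal and count its low-frequency peaks.

      import numpy as np
      from scipy.signal import welch, find_peaks

      # Hypothetical 30-day hourly CO2-efflux record; real microcosm data would replace this.
      rng = np.random.default_rng(0)
      hours = np.arange(24 * 30)
      co2 = (1.0
             + 0.3 * np.sin(2 * np.pi * hours / 24)        # diel cycle
             + 0.1 * np.sin(2 * np.pi * hours / (24 * 7))  # slower, week-scale component
             + 0.05 * rng.standard_normal(hours.size))

      # Power spectral density in cycles per hour; count low-frequency peaks as the complexity rank.
      f, pxx = welch(co2, fs=1.0, nperseg=256)
      low = f < 0.05                                        # keep periods longer than ~20 h
      peaks, _ = find_peaks(pxx[low], prominence=0.01)
      print(f"low-frequency spectral peaks: {peaks.size} at {f[low][peaks]} cycles/hour")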

  14. A new computational strategy for identifying essential proteins based on network topological properties and biological information.

    PubMed

    Qin, Chao; Sun, Yongqi; Dong, Yadong

    2017-01-01

    Essential proteins are the proteins that are indispensable to the survival and development of an organism. Deleting a single essential protein will cause lethality or infertility. Identifying and analysing essential proteins are key to understanding the molecular mechanisms of living cells. There are two types of methods for predicting essential proteins: experimental methods, which require considerable time and resources, and computational methods, which overcome the shortcomings of experimental methods. However, the prediction accuracy of computational methods for essential proteins requires further improvement. In this paper, we propose a new computational strategy named CoTB for identifying essential proteins based on a combination of topological properties, subcellular localization information and orthologous protein information. First, we introduce several topological properties of the protein-protein interaction (PPI) network. Second, we propose new methods for measuring orthologous information and subcellular localization and a new computational strategy that uses a random forest prediction model to obtain a probability score for the proteins being essential. Finally, we conduct experiments on four different Saccharomyces cerevisiae datasets. The experimental results demonstrate that our strategy for identifying essential proteins outperforms traditional computational methods and the most recently developed method, SON. In particular, our strategy improves the prediction accuracy to 89, 78, 79, and 85 percent on the YDIP, YMIPS, YMBD and YHQ datasets at the top 100 level, respectively.
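
    A minimal sketch of the scoring idea (synthetic features and labels, standing in for the CoTB implementation): train a random forest on topological and biological features and rank proteins by their predicted probability of being essential.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      # Hypothetical feature matrix: each row is a protein described by topological properties
      # (degree, betweenness), a subcellular-localization score and an orthology score.
      rng = np.random.default_rng(1)
      X = rng.random((500, 4))
      y = (X[:, 0] + 0.5 * X[:, 3] + 0.2 * rng.standard_normal(500) > 1.0).astype(int)  # toy labels

      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
      scores = clf.predict_proba(X)[:, 1]           # predicted probability of being essential
      top100 = np.argsort(scores)[::-1][:100]       # rank the proteins and keep the top 100
      print("fraction of true essentials in the top 100:", y[top100].mean())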

  15. [Quality of the pharmacotherapeutic recommendations for the integrated care procedures in Andalusia].

    PubMed

    Corte, Rosa María Muñoz; Estepa, Raúl García; Ramos, Bernardo Santos; Paloma, Francisco Javier Bautista

    2009-01-01

    To evaluate the quality of the pharmacotherapeutic recommendations included in the Integrated Care Procedures (PAIs, after their initials in Spanish) of the Andalusian Ministry of Health published up to March 2008, through the design and validation of an assessment tool. The tool was designed on the basis of similar instruments, specifically the AGREE instrument; other criteria were taken from various literature sources or devised by us. The tool was validated before use. After applying it to all the PAIs, we examined the degree of compliance with these pharmacotherapeutic criteria, both overall and by PAI subgroup. The developed tool is a questionnaire of 20 items divided into 4 sections. The first section consists of the essential criteria, and the rest refer to more specific, non-essential criteria: definition of the level of evidence, thoroughness of information and definition of indicators. It was found that 4 of the 60 PAIs do not contain any type of therapeutic recommendation. No PAI fulfils all the items listed in the tool; however, 70% of them fulfil the essential quality criteria established. There is great variability in the content of the pharmacotherapeutic recommendations of each PAI. Now that the validity of the tool has been demonstrated, it could be used to assess the quality of the therapeutic recommendations in clinical practice guidelines.

  16. An Automated, Experimenter-Free Method for the Standardised, Operant Cognitive Testing of Rats

    PubMed Central

    Rivalan, Marion; Munawar, Humaira; Fuchs, Anna; Winter, York

    2017-01-01

    Animal models of human pathology are essential for biomedical research. However, a recurring issue in the use of animal models is the poor reproducibility of behavioural and physiological findings within and between laboratories, and the most critical factor influencing this issue remains the experimenter themselves. One solution is the use of procedures devoid of human intervention. We present a novel approach to the experimenter-free testing of cognitive abilities in rats, combining undisturbed group housing with automated, standardized, individual operant testing. This experimenter-free system consisted of an automated operant system (Bussey-Saksida rat touch screen) connected to a home cage containing group-living rats via an automated animal sorter (PhenoSys). The automated animal sorter, which is based on radio-frequency identification (RFID) technology, functioned as a mechanical replacement for the experimenter. Rats learnt to enter the operant chamber regularly and individually and remained there only for the duration of the experimental session. Self-motivated rats acquired the complex touch screen task of trial-unique non-matching to location (TUNL) in half the time reported for animals that were manually placed into the operant chamber. Rat performance was similar between the two groups within our laboratory, and comparable to previously published results obtained elsewhere. This reproducibility, both within and between laboratories, confirms the validity of the approach. In addition, automation reduced daily experimental time by 80%, eliminated animal handling, and reduced equipment cost. This automated, experimenter-free setup is a promising tool of great potential for testing a large variety of functions with full automation in future studies. PMID:28060883

  17. Software Aids for radiologists: Part 1, Useful Photoshop skills.

    PubMed

    Gross, Joel A; Thapa, Mahesh M

    2012-12-01

    The purpose of this review is to describe the use of several essential techniques and tools in Adobe Photoshop image-editing software. The techniques shown expand on those previously described in the radiologic literature. Radiologists, especially those with minimal experience with image-editing software, can quickly apply a few essential Photoshop tools to minimize the frustration that can result from attempting to navigate a complex user interface.

  18. Molecular Treatment of Nano-Kaolinite Generations.

    PubMed

    Táborosi, Attila; Szilagyi, Robert K; Zsirka, Balázs; Fónagy, Orsolya; Horváth, Erzsébet; Kristóf, János

    2018-06-18

    A procedure is developed for defining a compositionally and structurally realistic, atomic-scale description of exfoliated clay nanoparticles from the kaolinite family of phylloaluminosilicates. By use of coordination chemical principles, chemical environments within a nanoparticle can be separated into inner, outer, and peripheral spheres. The edges of the molecular models of nanoparticles were protonated in a validated manner to achieve charge neutrality. Structural optimizations using semiempirical methods (NDDO Hamiltonians and DFTB formalism) and ab initio density functionals with a saturated basis set revealed previously overlooked molecular origins of morphological changes as a result of exfoliation. While the use of semiempirical methods is desirable for the treatment of nanoparticles composed of tens of thousands of atoms, the structural accuracy is rather modest in comparison to DFT methods. We report a comparative survey of our infrared data for untreated crystalline and various exfoliated states of kaolinite and halloysite. Given the limited availability of experimental techniques for providing direct structural information about nano-kaolinite, the vibrational spectra can be considered as an essential tool for validating structural models. The comparison of experimental and calculated stretching and bending frequencies further justified the use of the preferred level of theory. Overall, an optimal molecular model of the defect-free, ideal nano-kaolinite can be composed with respect to stationary structure and curvature of the potential energy surface using the PW91/SVP level of theory with empirical dispersion correction (PW91+D) and polarizable continuum solvation model (PCM) without the need for a scaled quantum chemical force field. This validated theoretical approach is essential in order to follow the formation of exfoliated clays and their surface reactivity that is experimentally unattainable.

  19. Metabonomics approaches and their potential application in food safety evaluation.

    PubMed

    Kuang, Hua; Li, Zhe; Peng, Chifang; Liu, Liqiang; Xu, Liguang; Zhu, Yingyue; Wang, Libing; Xu, Chuanlai

    2012-01-01

    It is essential that the novel biomarkers discovered by means of advanced, metabonomics-based detection tools can be used for long-term monitoring in food safety. By summarizing the common metabonomics-based biomarker-discovery workflow, this review evaluates the possible application of metabonomics to new biomarker discovery, especially in relation to food safety issues. Metabonomics has the advantages of decreasing detection limits and of allowing continuous monitoring. Although metabonomics is still at a developmental stage, we believe that its properties, such as noninvasiveness, sensitivity, and persistence, together with rigorous experimental designs, new technologies, increasingly accurate chemometrics and relational databases, will allow metabonomics to find extensive application in food safety in the post-genome period.

  20. Template method for fabricating interdigitate p-n heterojunction for organic solar cell

    PubMed Central

    2012-01-01

    Anodic aluminum oxide (AAO) templates are used to fabricate arrays of poly(3-hexylthiophene) (P3HT) pillars. This technique makes it possible to control the dimensions of the pillars, namely their diameters, intervals, and heights, on a tens-of-nanometer scale. These features are essential for enhancing carrier processes such as carrier generation, exciton diffusion, and carrier dissociation and transport. An interdigitated p-n junction between P3HT pillars and fullerene (C60) exhibits a photovoltaic effect. Although the device properties are still preliminary, the experimental results indicate that an AAO template is an effective tool with which to develop organic solar cells because highly regulated nanostructures can be produced on large areas exceeding 100 mm2. PMID:22908897

  1. ProtVista: visualization of protein sequence annotations.

    PubMed

    Watkins, Xavier; Garcia, Leyla J; Pundir, Sangya; Martin, Maria J

    2017-07-01

    ProtVista is a comprehensive visualization tool for the graphical representation of protein sequence features in the UniProt Knowledgebase, experimental proteomics and variation public datasets. The complexity and relationships in this wealth of data pose a challenge in interpretation. Integrative visualization approaches such as provided by ProtVista are thus essential for researchers to understand the data and, for instance, discover patterns affecting function and disease associations. ProtVista is a JavaScript component released as an open source project under the Apache 2 License. Documentation and source code are available at http://ebi-uniprot.github.io/ProtVista/ . martin@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  2. Engineering approaches to illuminating brain structure and dynamics.

    PubMed

    Deisseroth, Karl; Schnitzer, Mark J

    2013-10-30

    Historical milestones in neuroscience have come in diverse forms, ranging from the resolution of specific biological mysteries via creative experimentation to broad technological advances allowing neuroscientists to ask new kinds of questions. The continuous development of tools is driven with a special necessity by the complexity, fragility, and inaccessibility of intact nervous systems, such that inventive technique development and application drawing upon engineering and the applied sciences has long been essential to neuroscience. Here we highlight recent technological directions in neuroscience spurred by progress in optical, electrical, mechanical, chemical, and biological engineering. These research areas are poised for rapid growth and will likely be central to the practice of neuroscience well into the future. Copyright © 2013 Elsevier Inc. All rights reserved.

  3. KNOW ESSENTIALS: a tool for informed decisions in the absence of formal HTA systems.

    PubMed

    Mathew, Joseph L

    2011-04-01

    Most developing countries and resource-limited settings lack robust health technology assessment (HTA) systems. Because the development of locally relevant HTA is not immediately viable, and the extrapolation of external HTA is inappropriate, a new model for evaluating health technologies is required. The aim of this study was to describe the development and application of KNOW ESSENTIALS, a tool facilitating evidence-based decisions on health technologies by stakeholders in settings lacking formal HTA systems. Current HTA methodology was examined through literature search. Additional issues relevant to resource-limited settings, but not adequately addressed in current methodology, were identified through further literature search, appraisal of contextually relevant issues, discussion with healthcare professionals familiar with the local context, and personal experience. A set of thirteen elements important for evidence-based decisions was identified, selected and combined into a tool with the mnemonic KNOW ESSENTIALS. Detailed definitions for each element, coding for the elements, and a system to evaluate a given health technology using the tool were developed. Developing countries and resource-limited settings face several challenges to informed decision making. Models that are relevant and applicable in high-income countries are unlikely in such settings. KNOW ESSENTIALS is an alternative that facilitates evidence-based decision making by stakeholders without formal expertise in HTA. The tool could be particularly useful, as an interim measure, in healthcare systems that are developing HTA capacity. It could also be useful anywhere when rapid evidence-based decisions on health technologies are required.

  4. Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning

    DTIC Science & Technology

    2009-06-01

    Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning. Primary topic: Track 5 - Experimentation and Analysis. Walter A. Powell [STUDENT], GMU. Abstract: Typically, the development of tools and systems for the military is requirement driven; systems are developed to meet...

  5. Investigating water transport through the xylem network in vascular plants.

    PubMed

    Kim, Hae Koo; Park, Joonghyuk; Hwang, Ildoo

    2014-04-01

    Our understanding of physical and physiological mechanisms depends on the development of advanced technologies and tools to prove or re-evaluate established theories, and test new hypotheses. Water flow in land plants is a fascinating phenomenon, a vital component of the water cycle, and essential for life on Earth. The cohesion-tension theory (CTT), formulated more than a century ago and based on the physical properties of water, laid the foundation for our understanding of water transport in vascular plants. Numerous experimental tools have since been developed to evaluate various aspects of the CTT, such as the existence of negative hydrostatic pressure. This review focuses on the evolution of the experimental methods used to study water transport in plants, and summarizes the different ways to investigate the diversity of the xylem network structure and sap flow dynamics in various species. As water transport is documented at different scales, from the level of single conduits to entire plants, it is critical that new results be subjected to systematic cross-validation and that findings based on different organs be integrated at the whole-plant level. We also discuss the functional trade-offs between optimizing hydraulic efficiency and maintaining the safety of the entire transport system. Furthermore, we evaluate future directions in sap flow research and highlight the importance of integrating the combined effects of various levels of hydraulic regulation.

  6. Global Sensitivity Analysis as Good Modelling Practices tool for the identification of the most influential process parameters of the primary drying step during freeze-drying.

    PubMed

    Van Bockstal, Pieter-Jan; Mortier, Séverine Thérèse F C; Corver, Jos; Nopens, Ingmar; Gernaey, Krist V; De Beer, Thomas

    2018-02-01

    Pharmaceutical batch freeze-drying is commonly used to improve the stability of biological therapeutics. The primary drying step is regulated by the dynamic settings of the adaptable process variables, shelf temperature T_s and chamber pressure P_c. Mechanistic modelling of the primary drying step leads to the optimal dynamic combination of these adaptable process variables as a function of time. According to Good Modelling Practices, a Global Sensitivity Analysis (GSA) is essential for appropriate model building. In this study, both a regression-based and a variance-based GSA were conducted on a validated mechanistic primary drying model to estimate the impact of several model input parameters on two output variables, the product temperature at the sublimation front T_i and the sublimation rate ṁ_sub. T_s was identified as the most influential parameter for both T_i and ṁ_sub, followed by P_c and the dried-product mass transfer resistance α_Rp for T_i and ṁ_sub, respectively. The GSA findings were experimentally validated for ṁ_sub via a Design of Experiments (DoE) approach. The results indicate that GSA is a very useful tool for evaluating the impact of different process variables on the model outcome, leading to essential process knowledge without the need for time-consuming experiments (e.g., DoE). Copyright © 2017 Elsevier B.V. All rights reserved.
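
    As a hedged illustration of the variance-based GSA step (the sublimation-rate surrogate, the parameter ranges and the use of the SALib package below are placeholders, not the validated mechanistic model of the study), Sobol sensitivity indices can be estimated as follows:

      import numpy as np
      from SALib.sample import saltelli
      from SALib.analyze import sobol

      # Hypothetical toy surrogate for the sublimation rate as a function of shelf temperature,
      # chamber pressure and dried-product resistance.
      def sublimation_rate(Ts, Pc, Rp):
          return (0.05 * (Ts + 45.0) + 2e-4 * (10.0 - Pc)) / (1.0 + Rp)

      problem = {
          "num_vars": 3,
          "names": ["Ts", "Pc", "Rp"],
          "bounds": [[-40.0, 0.0],     # shelf temperature [deg C]
                     [5.0, 20.0],      # chamber pressure [Pa]
                     [1.0, 5.0]],      # dried-product resistance (arbitrary units)
      }

      X = saltelli.sample(problem, 1024)                     # variance-based sampling design
      Y = np.array([sublimation_rate(*row) for row in X])
      Si = sobol.analyze(problem, Y)
      for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
          print(f"{name}: first-order index = {s1:.2f}, total index = {st:.2f}")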

  7. X-ray absorption spectroscopy: EXAFS (Extended X-ray Absorption Fine Structure) and XANES (X-ray Absorption Near Edge Structure)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alp, E.E.; Mini, S.M.; Ramanathan, M.

    1990-04-01

    X-ray absorption spectroscopy (XAS) was an essential tool for gathering spectroscopic information about atomic energy-level structure in the early decades of this century. It also played an important role in the discovery and systematization of the rare-earth elements. The discovery of synchrotron radiation in 1952, and later the availability of broadly tunable synchrotron-based x-ray sources, have revitalized the technique since the 1970s. The correct interpretation of the oscillatory structure in the x-ray absorption cross-section above the absorption edge by Sayers et al. transformed XAS from a spectroscopic tool into a structural technique. EXAFS (Extended X-ray Absorption Fine Structure) yields information about interatomic distances, near-neighbor coordination numbers, and lattice dynamics. An excellent description of the principles and data analysis techniques of EXAFS is given by Teo. XANES (X-ray Absorption Near Edge Structure), on the other hand, gives information about the valence state, energy bandwidth and bond angles. Today, there are about 50 experimental stations at various synchrotrons around the world dedicated to collecting x-ray absorption data from the bulk and surfaces of solids and liquids. In this chapter, we give the basic principles of XAS, explain the information content of the two essentially different aspects of the absorption process leading to EXAFS and XANES, and discuss source and sample limitations.
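
    For reference, the single-scattering EXAFS equation that underlies this structural interpretation is commonly written (in one standard textbook form; notation varies between sources) as:

      \chi(k) = \sum_j \frac{N_j\, S_0^2\, |f_j(k)|}{k R_j^2}\,
                e^{-2k^2\sigma_j^2}\, e^{-2R_j/\lambda(k)}\,
                \sin\bigl(2kR_j + \delta_j(k)\bigr)

    where N_j is the coordination number of shell j, R_j the interatomic distance, σ_j² the mean-square relative displacement (Debye-Waller factor), |f_j(k)| the backscattering amplitude, δ_j(k) the total phase shift, λ(k) the photoelectron mean free path, and S_0² the amplitude reduction factor.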

  8. A Guerilla Guide to Common Problems in ‘Neurostatistics’: Essential Statistical Topics in Neuroscience

    PubMed Central

    Smith, Paul F.

    2017-01-01

    Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins. PMID:29371855

  9. A Guerilla Guide to Common Problems in 'Neurostatistics': Essential Statistical Topics in Neuroscience.

    PubMed

    Smith, Paul F

    2017-01-01

    Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins.
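
    One of the listed issues, the over-emphasis on binary significance, can be illustrated with a short sketch (hypothetical simulated data): with a large enough sample, a small standardized effect yields a very small p value while the effect size itself stays small.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      for n in (20, 2000):
          a = rng.normal(0.0, 1.0, n)
          b = rng.normal(0.2, 1.0, n)                # true standardized effect of 0.2 (small)
          t, p = stats.ttest_ind(a, b)
          pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2.0)
          d = (b.mean() - a.mean()) / pooled_sd      # Cohen's d (effect size)
          print(f"n = {n:5d}: p = {p:.4f}, Cohen's d = {d:.2f}")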

  10. 2013 R&D 100 Award: ‘Miniapps’ Bolster High Performance Computing

    ScienceCinema

    Belak, Jim; Richards, David

    2018-06-12

    Two Livermore computer scientists served on a Sandia National Laboratories-led team that developed Mantevo Suite 1.0, the first integrated suite of small software programs, also called "miniapps," to be made available to the high performance computing (HPC) community. These miniapps facilitate the development of new HPC systems and the applications that run on them. Miniapps (miniature applications) serve as stripped down surrogates for complex, full-scale applications that can require a great deal of time and effort to port to a new HPC system because they often consist of hundreds of thousands of lines of code. The miniapps are a prototype that contains some or all of the essentials of the real application but with many fewer lines of code, making the miniapp more versatile for experimentation. This allows researchers to more rapidly explore options and optimize system design, greatly improving the chances the full-scale application will perform successfully. These miniapps have become essential tools for exploring complex design spaces because they can reliably predict the performance of full applications.

  11. Gene Unprediction with Spurio: A tool to identify spurious protein sequences.

    PubMed

    Höps, Wolfram; Jeffryes, Matt; Bateman, Alex

    2018-01-01

    We now have access to the sequences of tens of millions of proteins. These protein sequences are essential for modern molecular biology and computational biology. The vast majority of protein sequences are derived from gene prediction tools and have no experimental supporting evidence for their translation.  Despite the increasing accuracy of gene prediction tools there likely exists a large number of spurious protein predictions in the sequence databases.  We have developed the Spurio tool to help identify spurious protein predictions in prokaryotes.  Spurio searches the query protein sequence against a prokaryotic nucleotide database using tblastn and identifies homologous sequences. The tblastn matches are used to score the query sequence's likelihood of being a spurious protein prediction using a Gaussian process model. The most informative feature is the appearance of stop codons within the presumed translation of homologous DNA sequences. Benchmarking shows that the Spurio tool is able to distinguish spurious from true proteins. However, transposon proteins are prone to be predicted as spurious because of the frequency of degraded homologs found in the DNA sequence databases. Our initial experiments suggest that less than 1% of the proteins in the UniProtKB sequence database are likely to be spurious and that Spurio is able to identify over 60 times more spurious proteins than the AntiFam resource. The Spurio software and source code is available under an MIT license at the following URL: https://bitbucket.org/bateman-group/spurio.

  12. GPS-CCD: A Novel Computational Program for the Prediction of Calpain Cleavage Sites

    PubMed Central

    Gao, Xinjiao; Ma, Qian; Ren, Jian; Xue, Yu

    2011-01-01

    As one of the most essential post-translational modifications (PTMs) of proteins, proteolysis, especially calpain-mediated cleavage, plays an important role in many biological processes, including cell death/apoptosis, cytoskeletal remodeling, and the cell cycle. Experimental identification of calpain targets with bona fide cleavage sites is fundamental for dissecting the molecular mechanisms and biological roles of calpain cleavage. In contrast to time-consuming and labor-intensive experimental approaches, computational prediction of calpain cleavage sites might more cheaply and readily provide useful information for further experimental investigation. In this work, we constructed a novel software package of GPS-CCD (Calpain Cleavage Detector) for the prediction of calpain cleavage sites, with an accuracy of 89.98%, sensitivity of 60.87% and specificity of 90.07%. With this software, we annotated potential calpain cleavage sites for hundreds of calpain substrates, for which the exact cleavage sites had not been previously determined. In this regard, GPS-CCD 1.0 is considered to be a useful tool for experimentalists. The online service and local packages of GPS-CCD 1.0 were implemented in JAVA and are freely available at: http://ccd.biocuckoo.org/. PMID:21533053

  13. Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview

    ERIC Educational Resources Information Center

    Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans

    2017-01-01

    Purpose: The purpose of this paper is to analyse 12 assessment tools for sustainability in universities and to develop the structure and contents of these tools to be more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…

  14. Accurate in silico prediction of species-specific methylation sites based on information gain feature optimization.

    PubMed

    Wen, Ping-Ping; Shi, Shao-Ping; Xu, Hao-Dong; Wang, Li-Na; Qiu, Jian-Ding

    2016-10-15

    As one of the most important reversible types of post-translational modification, protein methylation catalyzed by methyltransferases underlies many pivotal biological functions and essential biological processes. Identification of methylation sites is a prerequisite for decoding methylation regulatory networks in living cells and understanding their physiological roles. Experimental methods are labor-intensive and time-consuming. In silico approaches offer a cost-effective, high-throughput way to predict potential methylation sites, but previous predictors rely on a single mixed model and their prediction performance is not yet fully satisfactory. Recently, with the increasing availability of quantitative methylation datasets in diverse species (especially eukaryotes), there is a growing need for species-specific predictors. Here, we designed a tool named PSSMe, based on an information gain (IG) feature-optimization method, for species-specific methylation site prediction. The IG method was adopted to analyze the importance and contribution of each feature and then to select the most valuable feature dimensions to reconstitute a new, ordered feature vector, which was used to build the final prediction model. Our method improves prediction accuracy by about 15% compared with single features. Furthermore, our species-specific model significantly improves predictive performance compared with other general methylation prediction tools. Hence, our prediction results serve as a useful resource for elucidating the mechanism of arginine or lysine methylation and for facilitating hypothesis-driven experimental design and validation. The online service is implemented in C# and freely available at http://bioinfo.ncu.edu.cn/PSSMe.aspx. Contact: jdqiu@ncu.edu.cn. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
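
    A minimal sketch of the information-gain feature-optimization idea (synthetic features, and using scikit-learn's mutual-information estimator as a stand-in for the paper's IG computation): score every feature dimension against the class label, keep the most informative dimensions, and train the final model on the reduced matrix.

      import numpy as np
      from sklearn.feature_selection import mutual_info_classif

      # Hypothetical encoded peptide windows: rows are candidate sites, columns are
      # sequence-derived features; y marks experimentally verified methylation sites.
      rng = np.random.default_rng(3)
      X = rng.random((1000, 50))
      y = (X[:, 5] + X[:, 12] + 0.3 * rng.standard_normal(1000) > 1.0).astype(int)

      ig = mutual_info_classif(X, y, random_state=0)   # information gain of each feature
      ranking = np.argsort(ig)[::-1]
      keep = ranking[:10]                              # retain the 10 most informative dimensions
      X_reduced = X[:, keep]                           # matrix used to train the final predictor
      print("top feature indices:", keep)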

  15. Providing haptic feedback in robot-assisted minimally invasive surgery: a direct optical force-sensing solution for haptic rendering of deformable bodies.

    PubMed

    Ehrampoosh, Shervin; Dave, Mohit; Kia, Michael A; Rablau, Corneliu; Zadeh, Mehrdad H

    2013-01-01

    This paper presents an enhanced haptic-enabled master-slave teleoperation system which can be used to provide force feedback to surgeons in minimally invasive surgery (MIS). One of the research goals was to develop a combined-control architecture framework that included both direct force reflection (DFR) and position-error-based (PEB) control strategies. To achieve this goal, it was essential to measure accurately the direct contact forces between deformable bodies and a robotic tool tip. To measure the forces at a surgical tool tip and enhance the performance of the teleoperation system, an optical force sensor was designed, prototyped, and added to a robot manipulator. The enhanced teleoperation architecture was formulated by developing mathematical models for the optical force sensor, the extended slave robot manipulator, and the combined-control strategy. Human factor studies were also conducted to (a) examine experimentally the performance of the enhanced teleoperation system with the optical force sensor, and (b) study human haptic perception during the identification of remote object deformability. The first experiment was carried out to discriminate deformability of objects when human subjects were in direct contact with deformable objects by means of a laparoscopic tool. The control parameters were then tuned based on the results of this experiment using a gain-scheduling method. The second experiment was conducted to study the effectiveness of the force feedback provided through the enhanced teleoperation system. The results show that the force feedback increased the ability of subjects to correctly identify materials of different deformable types. In addition, the virtual force feedback provided by the teleoperation system comes close to the real force feedback experienced in direct MIS. The experimental results provide design guidelines for choosing and validating the control architecture and the optical force sensor.
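
    A hedged sketch of the two feedback strategies that such a combined architecture blends (the gain, blending weight and signal values below are hypothetical; this is not the paper's controller):

      def haptic_feedback(x_master, x_slave, f_sensor, k_p=200.0, beta=0.6):
          """Force displayed to the operator [N], blending position-error-based (PEB)
          feedback with direct force reflection (DFR) from the tool-tip force sensor."""
          f_peb = k_p * (x_slave - x_master)     # PEB: penalize master/slave position mismatch
          f_dfr = f_sensor                       # DFR: reflect the measured contact force
          return beta * f_dfr + (1.0 - beta) * f_peb

      # Example: the slave lags the master by 2 mm while the sensor reads 1.5 N of contact force.
      print(haptic_feedback(x_master=0.010, x_slave=0.008, f_sensor=1.5))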

  16. Steps Towards Understanding Large-scale Deformation of Gas Hydrate-bearing Sediments

    NASA Astrophysics Data System (ADS)

    Gupta, S.; Deusner, C.; Haeckel, M.; Kossel, E.

    2016-12-01

    Marine sediments bearing gas hydrates are typically characterized by heterogeneity in the gas hydrate distribution and anisotropy in the sediment-gas hydrate fabric properties. Gas hydrates also contribute to the strength and stiffness of the marine sediment, and any disturbance in the thermodynamic stability of the gas hydrates is likely to affect the geomechanical stability of the sediment. Understanding mechanisms and triggers of large-strain deformation and failure of marine gas hydrate-bearing sediments is an area of extensive research, particularly in the context of marine slope-stability and industrial gas production. The ultimate objective is to predict severe deformation events such as regional-scale slope failure or excessive sand production by using numerical simulation tools. The development of such tools essentially requires a careful analysis of thermo-hydro-chemo-mechanical behavior of gas hydrate-bearing sediments at lab-scale, and its stepwise integration into reservoir-scale simulators through definition of effective variables, use of suitable constitutive relations, and application of scaling laws. One of the focus areas of our research is to understand the bulk coupled behavior of marine gas hydrate systems with contributions from micro-scale characteristics, transport-reaction dynamics, and structural heterogeneity through experimental flow-through studies using high-pressure triaxial test systems and advanced tomographical tools (CT, ERT, MRI). We combine these studies to develop mathematical model and numerical simulation tools which could be used to predict the coupled hydro-geomechanical behavior of marine gas hydrate reservoirs in a large-strain framework. Here we will present some of our recent results from closely co-ordinated experimental and numerical simulation studies with an objective to capture the large-deformation behavior relevant to different gas production scenarios. We will also report on a variety of mechanically relevant test scenarios focusing on effects of dynamic changes in gas hydrate saturation, highly uneven gas hydrate distributions, focused fluid migration and gas hydrate production through depressurization and CO2 injection.

  17. Electrochemical reverse engineering: A systems-level tool to probe the redox-based molecular communication of biology.

    PubMed

    Li, Jinyang; Liu, Yi; Kim, Eunkyoung; March, John C; Bentley, William E; Payne, Gregory F

    2017-04-01

    The intestine is the site of digestion and forms a critical interface between the host and the outside world. This interface is composed of host epithelium and a complex microbiota which is "connected" through an extensive web of chemical and biological interactions that determine the balance between health and disease for the host. This biology and the associated chemical dialogues occur within a context of a steep oxygen gradient that provides the driving force for a variety of reduction and oxidation (redox) reactions. While some redox couples (e.g., catecholics) can spontaneously exchange electrons, many others are kinetically "insulated" (e.g., biothiols) allowing the biology to set and control their redox states far from equilibrium. It is well known that within cells, such non-equilibrated redox couples are poised to transfer electrons to perform reactions essential to immune defense (e.g., transfer from NADH to O2 for reactive oxygen species, ROS, generation) and protection from such oxidative stresses (e.g., glutathione-based reduction of ROS). More recently, it has been recognized that some of these redox-active species (e.g., H2O2) cross membranes and diffuse into the extracellular environment including lumen to transmit redox information that is received by atomically-specific receptors (e.g., cysteine-based sulfur switches) that regulate biological functions. Thus, redox has emerged as an important modality in the chemical signaling that occurs in the intestine and there have been emerging efforts to develop the experimental tools needed to probe this modality. We suggest that electrochemistry provides a unique tool to experimentally probe redox interactions at a systems level. Importantly, electrochemistry offers the potential to enlist the extensive theories established in signal processing in an effort to "reverse engineer" the molecular communication occurring in this complex biological system. Here, we review our efforts to develop this electrochemical tool for in vitro redox-probing. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Which DSM validated tools for diagnosing depression are usable in primary care research? A systematic literature review.

    PubMed

    Nabbe, P; Le Reste, J Y; Guillou-Landreat, M; Munoz Perez, M A; Argyriadou, S; Claveria, A; Fernández San Martín, M I; Czachowski, S; Lingner, H; Lygidakis, C; Sowinska, A; Chiron, B; Derriennic, J; Le Prielec, A; Le Floch, B; Montier, T; Van Marwijk, H; Van Royen, P

    2017-01-01

    Depression occurs frequently in primary care. Its broad clinical variability makes it difficult to diagnose, which makes it essential that family practitioner (FP) researchers have validated tools to minimize bias in studies of everyday practice. Which tools, validated against psychiatric examination according to the major depression criteria of DSM-IV or DSM-5, can be used for research purposes? An international FP team conducted a systematic review using the following databases: PubMed, Cochrane and Embase, from 2000/01/01 to 2015/10/01. The search of the three databases identified 770 abstracts; 546 abstracts were analyzed after duplicates had been removed (224 duplicates); 50 of the validity studies were eligible and 4 studies were included. In these 4 studies, the following tools were found: GDS-5, GDS-15, GDS-30, CESD-R, HADS, PSC-51 and HSCL-25. Sensitivity, specificity, positive predictive value and negative predictive value were collected, and the Youden index was calculated. Using efficiency data alone to compare these studies could be misleading; additional reliability, reproducibility and ergonomic data will be essential for making comparisons. This study selected seven tools, usable in primary care research, for the diagnosis of depression. Further research will be essential to define the best tools in terms of efficiency, reproducibility, reliability and ergonomics for research in primary care, and for care itself. Copyright © 2016. Published by Elsevier Masson SAS.
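
    The Youden index mentioned above summarizes a tool's diagnostic validity from its sensitivity and specificity, J = sensitivity + specificity − 1; a short sketch with hypothetical values:

      def youden_index(sensitivity, specificity):
          return sensitivity + specificity - 1.0

      # Placeholder values, not the figures reported in the reviewed validity studies.
      tools = {"Tool A": (0.85, 0.78), "Tool B": (0.72, 0.90)}
      for name, (se, sp) in tools.items():
          print(f"{name}: Youden index J = {youden_index(se, sp):.2f}")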

  19. Essentials of Career Interest Assessment. Essentials of Psychological Assessment Series.

    ERIC Educational Resources Information Center

    Prince, Jeffrey P.; Heiser, Lisa J.

    This book is a quick reference source to guide the career professional through the essentials of using the most popular career interest tools. It summarizes important technical aspects of each inventory, and offers step-by-step guidance in the interpretation and use of the various inventories. The chapters are: (1) "Overview"; (2)…

  20. The effects of environment and ownership on children's innovation of tools and tool material selection.

    PubMed

    Sheridan, Kimberly M; Konopasky, Abigail W; Kirkwood, Sophie; Defeyter, Margaret A

    2016-03-19

    Research indicates that in experimental settings, young children of 3-7 years old are unlikely to devise a simple tool to solve a problem. This series of exploratory studies, conducted in museums in the US and UK, explores how environment and ownership of materials may improve children's ability and inclination for (i) tool material selection and (ii) innovation. The first study took place in a children's museum, an environment where children can use tools and materials freely. We replicated a tool innovation task in this environment and found that while 3-4 year olds showed the predicted low rates of innovation, 4-7 year olds showed higher rates of innovation than both the younger children and those reported in prior studies. The second study explored the effect of ownership of the experimental materials (by the experimenter or by the child) on tool selection and innovation. Results showed that 5-6 year olds and 6-7 year olds were more likely to select tool material they owned compared to tool material owned by the experimenter, although ownership had no effect on tool innovation. We argue that learning environments that support tool exploration and invention and convey ownership over materials may encourage successful tool innovation at earlier ages. © 2016 The Author(s).

  1. Providing Guidance in Virtual Lab Experimentation: The Case of an Experiment Design Tool

    ERIC Educational Resources Information Center

    Efstathiou, Charalampos; Hovardas, Tasos; Xenofontos, Nikoletta A.; Zacharia, Zacharias C.; deJong, Ton; Anjewierden, Anjo; van Riesen, Siswa A. N.

    2018-01-01

    The present study employed a quasi-experimental design to assess a computer-based tool, which was intended to scaffold the task of designing experiments when using a virtual lab for the process of experimentation. In particular, we assessed the impact of this tool on primary school students' cognitive processes and inquiry skills before and after…

  2. Thermal-Diffusivity Measurements of Mexican Citrus Essential Oils Using Photoacoustic Methodology in the Transmission Configuration

    NASA Astrophysics Data System (ADS)

    Muñoz, G. A. López; González, R. F. López; López, J. A. Balderas; Martínez-Pérez, L.

    2011-05-01

    Photoacoustic methodology in the transmission configuration (PMTC) was used to study the thermophysical properties of Mexican citrus essential oils and their relation with composition, demonstrating the viability of using photothermal techniques for quality control and for the authentication of oils and detection of adulteration. Linear relations for the amplitude (on a semi-log scale) and phase, as functions of the sample's thickness, were obtained through a theoretical model fit to the experimental data for thermal-diffusivity measurements in Mexican orange, pink grapefruit, mandarin, and lime type A centrifuged essential oils, and in Mexican distilled lime essential oil. Gas chromatography for distilled lime essential oil and centrifuged lime essential oil type A is reported to complement the study. Experimental results showed close thermal-diffusivity values among the Mexican citrus essential oils obtained by centrifugation, but a significant difference between this physical property for distilled lime oil and the corresponding value obtained by centrifugation, owing to the different chemical compositions resulting from the extraction processes.
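
    A minimal sketch of how a thermal diffusivity can be extracted from such a linear amplitude-versus-thickness relation, assuming the signal decays as exp(-l*sqrt(pi*f/alpha)) in the thermally thick transmission regime; the frequency, diffusivity and thicknesses below are synthetic, not the paper's data:

      import numpy as np

      # Assumed model: thermally thick sample in transmission, amplitude ~ exp(-l*sqrt(pi*f/alpha)).
      f = 1.0                                   # modulation frequency in Hz (synthetic)
      alpha_true = 9.0e-8                       # m^2/s, typical order of magnitude for oils (assumed)
      l = np.linspace(100e-6, 600e-6, 12)       # sample thicknesses in m (synthetic)
      amp = np.exp(-l * np.sqrt(np.pi * f / alpha_true))

      slope, _ = np.polyfit(l, np.log(amp), 1)  # slope of the semi-log plot = -sqrt(pi*f/alpha)
      alpha_fit = np.pi * f / slope**2
      print(alpha_fit)                          # recovers ~9e-8 m^2/s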

  3. Pediatric intensive care unit admission tool: a colorful approach.

    PubMed

    Biddle, Amy

    2007-12-01

    This article discusses the development, implementation, and utilization of our institution's Pediatric Intensive Care Unit (PICU) Color-Coded Admission Status Tool. Rather than the historical method of identifying a maximum number of staffed beds, a tool was developed to color code the PICU's admission status. Previous methods had been ineffective and led to confusion between the PICU leadership team and the administration. The tool includes the previously missing components of staffing and acuity, which are essential in determining admission capability. The PICU tool has three colored levels: green indicates open for admissions; yellow indicates an admission alert, because of the number of available beds or because staffing does not match the projected patient numbers or required acuity; and red indicates admissions on hold, because only one trauma or arrest bed is available or staffing does not match the projected acuity. Yellow and red designations require specific actions and the medical director's approval. The tool has been highly successful and has significantly impacted nursing by including nurse staffing as an essential component in determining bed availability.
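
    A simplified sketch of the color-coded decision logic described above; the inputs and thresholds are assumptions for illustration, not the institution's actual tool:

      # Simplified, assumed decision rule; inputs and thresholds are illustrative only.
      def picu_status(open_beds: int, trauma_arrest_beds: int,
                      staffing_matches_census: bool, staffing_matches_acuity: bool) -> str:
          if trauma_arrest_beds <= 1 or not staffing_matches_acuity:
              return "red"     # admissions on hold; requires action and director approval
          if open_beds == 0 or not staffing_matches_census:
              return "yellow"  # admission alert; requires action and director approval
          return "green"       # open for admissions

      print(picu_status(open_beds=3, trauma_arrest_beds=2,
                        staffing_matches_census=True, staffing_matches_acuity=True))  # green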

  4. Anti-inflammatory effects of compounds alpha-humulene and (-)-trans-caryophyllene isolated from the essential oil of Cordia verbenacea.

    PubMed

    Fernandes, Elizabeth S; Passos, Giselle F; Medeiros, Rodrigo; da Cunha, Fernanda M; Ferreira, Juliano; Campos, Maria M; Pianowski, Luiz F; Calixto, João B

    2007-08-27

    This study evaluated the anti-inflammatory properties of two sesquiterpenes isolated from Cordia verbenacea's essential oil, alpha-humulene and (-)-trans-caryophyllene. Our results revealed that oral treatment with both compounds displayed marked inhibitory effects in different inflammatory experimental models in mice and rats. Alpha-humulene and (-)-trans-caryophyllene were effective in reducing platelet activating factor-, bradykinin- and ovalbumin-induced mouse paw oedema, while only alpha-humulene was able to diminish the oedema formation caused by histamine injection. Also, both compounds had important inhibitory effects on the mouse and rat carrageenan-induced paw oedema. Systemic treatment with alpha-humulene largely prevented both tumor necrosis factor-alpha (TNFalpha) and interleukin-1beta (IL-1beta) generation in carrageenan-injected rats, whereas (-)-trans-caryophyllene diminished only TNFalpha release. Furthermore, both compounds reduced the production of prostaglandin E(2) (PGE(2)), as well as inducible nitric oxide synthase (iNOS) and cyclooxygenase (COX-2) expression, induced by the intraplantar injection of carrageenan in rats. The anti-inflammatory effects of alpha-humulene and (-)-trans-caryophyllene were comparable to those observed in dexamethasone-treated animals, used as a positive control drug. All these findings indicate that alpha-humulene and (-)-trans-caryophyllene, derived from the essential oil of C. verbenacea, might represent important tools for the management and/or treatment of inflammatory diseases.

  5. Placing a Disrupted Degradation Motif at the C Terminus of Proteasome Substrates Attenuates Degradation without Impairing Ubiquitylation*

    PubMed Central

    Alfassy, Omri S.; Cohen, Itamar; Reiss, Yuval; Tirosh, Boaz; Ravid, Tommer

    2013-01-01

    Protein elimination by the ubiquitin-proteasome system requires the presence of a cis-acting degradation signal. Efforts to discern degradation signals of misfolded proteasome substrates thus far revealed a general mechanism whereby the exposure of cryptic hydrophobic motifs provides a degradation determinant. We have previously characterized such a determinant, employing the yeast kinetochore protein Ndc10 as a model substrate. Ndc10 is essentially a stable protein that is rapidly degraded upon exposure of a hydrophobic motif located at the C-terminal region. The degradation motif comprises two distinct and essential elements: DegA, encompassing two amphipathic helices, and DegB, a hydrophobic sequence within the loosely structured C-terminal tail of Ndc10. Here we show that the hydrophobic nature of DegB is irrelevant for the ubiquitylation of substrates containing the Ndc10 degradation motif, but is essential for proteasomal degradation. Mutant DegB, in which the hydrophobic sequence was disrupted, acted as a dominant degradation inhibitory element when expressed at the C-terminal regions of ubiquitin-dependent and -independent substrates of the 26S proteasome. This mutant stabilized substrates in both yeast and mammalian cells, indicative of a modular recognition moiety. The dominant function of the mutant DegB provides a powerful experimental tool for evaluating the physiological implications of stabilization of specific proteasome substrates in intact cells and for studying the associated pathological effects. PMID:23519465

  6. Polyubiquitin-sensor proteins reveal localization and linkage-type dependence of cellular ubiquitin signaling

    PubMed Central

    Sims, Joshua J.; Scavone, Francesco; Cooper, Eric M.; Kane, Lesley A.; Youle, Richard J.; Boeke, Jef D.; Cohen, Robert E.

    2012-01-01

    Polyubiquitin (polyUb) chain topology is thought to direct modified substrates to specific fates, but this function-topology relationship is poorly understood, as are the dynamics and subcellular locations of specific polyUb signals. Experimental access to these questions has been limited because linkage-specific inhibitors and in vivo sensors have been unavailable. Here we present a general strategy to track linkage-specific polyUb signals in yeast and mammalian cells, and to probe their functions. We designed several high-affinity lysine-63-polyUb-binding proteins and demonstrate their specificity both in vitro and in cells. We apply these tools as competitive inhibitors to dissect the polyUb-linkage dependence of NF-κB activation in several cell types, inferring the essential role of lysine-63-polyUb for signaling via the IL-1β and TNF-related weak inducer of apoptosis (TWEAK) but not TNF-α receptors. We anticipate live-cell imaging, proteomic, and biochemical applications for these tools, and extension of the design strategy to other polymeric ubiquitin-like protein modifications. PMID:22306808

  7. Experimental Evaluation of the Tools of the Mind Pre-K Curriculum. Technical Report. Working Paper

    ERIC Educational Resources Information Center

    Farran, Dale C.; Wilson, Sandra J.; Meador, Deanna; Norvell, Jennifer; Nesbitt, Kimberly

    2015-01-01

    The experimental evaluation of the "Tools of the Mind Pre-K Curriculum" described in this report was designed to examine the effectiveness of the "Tools of the Mind" ("Tools") curriculum for enhancing children's self-regulation skills and their academic preparation for kindergarten when compared to the usual…

  8. Impact of guided reciprocal peer questioning on nursing students' self-esteem and learning.

    PubMed

    Lakdizaji, Sima; Abdollahzadeh, Farahnaz; Hassankhanih, Hadi; Kalantari, Manizhe

    2013-07-01

    Self-esteem is essential for clinical judgment. Nursing students in clinical environments should bridge theoretical education and clinical practice. This study aimed to examine the effect of guided questioning in peer groups on nursing students' self-esteem and clinical learning. In this quasi-experimental study, all fourth-semester nursing students (n = 60) were selected. The autumn semester students (n = 28) were chosen as the control group, and the spring semester students (n = 32) as the experimental group. The experimental group took the cardiac medical-surgical training course using guided reciprocal peer questioning, while the control group was trained by lecture. After confirmation of the validity and reliability of the tools, including the Rosenberg Self-Esteem Scale and a researcher-made questionnaire, data were collected and analyzed with SPSS version 17.0. There was no significant difference in demographic and educational characteristics between the two groups. Mean self-esteem and learning scores did not differ significantly before teaching, but improved significantly after teaching in both the experimental (P < 0.001) and control (P < 0.05) groups, with a more considerable improvement in the experimental group. The results suggest that the inquiry method, owing to its more positive impact on self-esteem and students' learning, can be applied alone or in combination with other methods. Conducting this study with other students and for theoretical courses is suggested.

  9. Nanoscale Analysis of a Hierarchical Hybrid Solar Cell in 3D.

    PubMed

    Divitini, Giorgio; Stenzel, Ole; Ghadirzadeh, Ali; Guarnera, Simone; Russo, Valeria; Casari, Carlo S; Bassi, Andrea Li; Petrozza, Annamaria; Di Fonzo, Fabio; Schmidt, Volker; Ducati, Caterina

    2014-05-01

    A quantitative method for the characterization of nanoscale 3D morphology is applied to the investigation of a hybrid solar cell based on a novel hierarchical nanostructured photoanode. A cross section of the solar cell device is prepared by focused ion beam milling in a micropillar geometry, which allows a detailed 3D reconstruction of the titania photoanode by electron tomography. It is found that the hierarchical titania nanostructure facilitates polymer infiltration, thus favoring intermixing of the two semiconducting phases, essential for charge separation. The 3D nanoparticle network is analyzed with tools from stochastic geometry to extract information related to the charge transport in the hierarchical solar cell. In particular, the experimental dataset allows direct visualization of the percolation pathways that contribute to the photocurrent.
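
    A minimal sketch of the kind of connectivity analysis used to identify percolation pathways in a segmented 3D volume, here applied to a synthetic binary array rather than the reconstructed photoanode:

      import numpy as np
      from scipy import ndimage

      # Synthetic two-phase volume standing in for the segmented titania network.
      rng = np.random.default_rng(0)
      phase = rng.random((50, 50, 50)) < 0.35

      labels, n = ndimage.label(phase)                # 6-connected components
      top = np.unique(labels[0][labels[0] > 0])       # clusters touching the top face
      bottom = np.unique(labels[-1][labels[-1] > 0])  # clusters touching the bottom face
      percolating = np.intersect1d(top, bottom)       # clusters spanning the volume
      print(f"{n} clusters, {percolating.size} of them percolate")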

  10. Visualization of terahertz surface waves propagation on metal foils

    PubMed Central

    Wang, Xinke; Wang, Sen; Sun, Wenfeng; Feng, Shengfei; Han, Peng; Yan, Haitao; Ye, Jiasheng; Zhang, Yan

    2016-01-01

    Exploitation of surface plasmonic devices (SPDs) in the terahertz (THz) band is beneficial for broadening the application potential of THz technologies. To clarify the features of SPDs, a practical characterization method is essential for accurately observing the complex field distribution of a THz surface wave (TSW). Here, a THz digital holographic imaging system is employed to coherently exhibit the temporal variations and spectral properties of TSWs activated by a rectangular or semicircular slit structure on metal foils. Advantages of the imaging system are comprehensively elucidated, including the exclusive measurement of TSWs and the reduced acquisition time. Numerical simulations of the experimental procedures further verify the accuracy of the imaging measurement. It can be anticipated that this imaging system will provide a versatile tool for analyzing the performance and principles of SPDs. PMID:26729652

  11. Docking studies on NSAID/COX-2 isozyme complexes using Contact Statistics analysis

    NASA Astrophysics Data System (ADS)

    Ermondi, Giuseppe; Caron, Giulia; Lawrence, Raelene; Longo, Dario

    2004-11-01

    The selective inhibition of COX-2 isozymes should lead to a new generation of NSAIDs with significantly reduced side effects; e.g. celecoxib (Celebrex®) and rofecoxib (Vioxx®). To obtain inhibitors with higher selectivity it has become essential to gain additional insight into the details of the interactions between COX isozymes and NSAIDs. Although X-ray structures of COX-2 complexed with a small number of ligands are available, experimental data are missing for two well-known selective COX-2 inhibitors (rofecoxib and nimesulide) and docking results reported are controversial. We use a combination of a traditional docking procedure with a new computational tool (Contact Statistics analysis) that identifies the best orientation among a number of solutions to shed some light on this topic.
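
    A rough sketch of a contact-counting score for ranking docking poses; this is a generic illustration, not the published Contact Statistics method, and all coordinates are synthetic:

      import numpy as np

      # Generic contact count between a ligand pose and protein atoms within a cutoff (angstroms).
      def contact_count(ligand_xyz, protein_xyz, cutoff=4.0):
          d = np.linalg.norm(ligand_xyz[:, None, :] - protein_xyz[None, :, :], axis=-1)
          return int((d < cutoff).sum())

      rng = np.random.default_rng(1)
      protein = rng.uniform(0, 30, size=(500, 3))                 # synthetic coordinates
      poses = [rng.uniform(10, 20, size=(30, 3)) for _ in range(5)]
      best = max(range(len(poses)), key=lambda i: contact_count(poses[i], protein))
      print("best-ranked pose:", best)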

  12. The DIII-D Plasma Control System as a Scientific Research Tool

    NASA Astrophysics Data System (ADS)

    Hyatt, A. W.; Ferron, J. R.; Humphreys, D. A.; Leuer, J. A.; Walker, M. L.; Welander, A. S.

    2006-10-01

    The digital plasma control system (PCS) is an essential element of the DIII-D tokamak as a scientific research instrument, providing experimenters with real-time measurement and control of the plasma equilibrium, heating, current drive, transport, stability, and plasma-wall interactions. A wide range of sensors and actuators allow feedback control not only of global quantities such as discharge shape, plasma energy, and toroidal rotation, but also of non-axisymmetric magnetic fields and features of the internal profiles of temperature and current density. These diverse capabilities of the PCS improve the effectiveness of tokamak operation and enable unique physics experiments. We will present an overview of the PCS and the systems it controls and interacts with, and show examples of various plasma parameters controlled by the PCS and its actuators.

  13. Nanoscale Analysis of a Hierarchical Hybrid Solar Cell in 3D

    PubMed Central

    Divitini, Giorgio; Stenzel, Ole; Ghadirzadeh, Ali; Guarnera, Simone; Russo, Valeria; Casari, Carlo S; Bassi, Andrea Li; Petrozza, Annamaria; Di Fonzo, Fabio; Schmidt, Volker; Ducati, Caterina

    2014-01-01

    A quantitative method for the characterization of nanoscale 3D morphology is applied to the investigation of a hybrid solar cell based on a novel hierarchical nanostructured photoanode. A cross section of the solar cell device is prepared by focused ion beam milling in a micropillar geometry, which allows a detailed 3D reconstruction of the titania photoanode by electron tomography. It is found that the hierarchical titania nanostructure facilitates polymer infiltration, thus favoring intermixing of the two semiconducting phases, essential for charge separation. The 3D nanoparticle network is analyzed with tools from stochastic geometry to extract information related to the charge transport in the hierarchical solar cell. In particular, the experimental dataset allows direct visualization of the percolation pathways that contribute to the photocurrent. PMID:25834481

  14. The bright future of single-molecule fluorescence imaging

    PubMed Central

    Juette, Manuel F.; Terry, Daniel S.; Wasserman, Michael R.; Zhou, Zhou; Altman, Roger B.; Zheng, Qinsi; Blanchard, Scott C.

    2014-01-01

    Single-molecule Förster resonance energy transfer (smFRET) is an essential and maturing tool to probe biomolecular interactions and conformational dynamics in vitro and, increasingly, in living cells. Multi-color smFRET enables the correlation of multiple such events and the precise dissection of their order and timing. However, the requirements for good spectral separation, high time resolution, and extended observation times place extraordinary demands on the fluorescent labels used in such experiments. Together with advanced experimental designs and data analysis, the development of long-lasting, non-fluctuating fluorophores is therefore proving key to progress in the field. Recently developed strategies for obtaining ultra-stable organic fluorophores spanning the visible spectrum will enable multi-color smFRET studies to deliver on their promise of previously unachievable biological insights. PMID:24956235

  15. The impact of CmapTools utilization towards students' conceptual change on optics topic

    NASA Astrophysics Data System (ADS)

    Rofiuddin, Muhammad Rifqi; Feranie, Selly

    2017-05-01

    Science teachers need to help students identify their prior ideas and modify them based on scientific knowledge. This process is called conceptual change. One essential tool for analyzing students' conceptual change is the concept map. Concept maps are graphical representations of knowledge comprised of concepts and the relationships between them. Concept map construction was implemented by adopting technology to support the learning process, in line with Educational Ministry Regulation No. 68 of 2013. The Institute for Human and Machine Cognition (IHMC) has developed CmapTools, client-server software for easily constructing and visualizing concept maps. This research aims to investigate secondary students' conceptual change after experiencing a five-stage conceptual teaching model utilizing CmapTools in learning optics. A weak experimental method with a one-group pretest-posttest design was implemented to collect preliminary and post-instruction concept maps as qualitative data. The sample was taken purposively from 8th grade students (n = 22) at a private school in Bandung, West Java. Conceptual change was assessed by comparing preliminary and post-instruction concept map constructions against a rubric for concept map scoring and structure. Results show a significant conceptual change difference of 50.92%, elaborated across concept map elements: propositions and hierarchical levels in the high category, cross links in the medium category, and specific examples in the low category. All of the results are supported by students' positive responses toward CmapTools utilization, indicating improved motivation, interest, and behavior toward the physics lesson.

  16. Experimental Evaluation of the Tools of the Mind Pre-K Curriculum. Fidelity of Implementation Technical Report. Working Paper

    ERIC Educational Resources Information Center

    Meador, Deanna; Nesbitt, Kimberly; Farran, Dale

    2015-01-01

    The "Experimental Evaluation of the Tools of the Mind Pre-K Curriculum" study was designed to compare the effectiveness of the "Tools of the Mind" ("Tools") curriculum to the curricula the school system is currently using in enhancing children's self-regulation skills and their academic preparation for kindergarten.…

  17. Complementary roles for toxicologic pathology and mathematics in toxicogenomics, with special reference to data interpretation and oscillatory dynamics.

    PubMed

    Morgan, Kevin T; Pino, Michael; Crosby, Lynn M; Wang, Min; Elston, Timothy C; Jayyosi, Zaid; Bonnefoi, Marc; Boorman, Gary

    2004-01-01

    Toxicogenomics is an emerging multidisciplinary science that will profoundly impact the practice of toxicology. New generations of biologists, using evolving toxicogenomics tools, will generate massive data sets in need of interpretation. Mathematical tools are necessary to cluster and otherwise find meaningful structure in such data. The linking of this structure to gene functions and disease processes, and finally the generation of useful data interpretation remains a significant challenge. The training and background of pathologists make them ideally suited to contribute to the field of toxicogenomics, from experimental design to data interpretation. Toxicologic pathology, a discipline based on pattern recognition, requires familiarity with the dynamics of disease processes and interactions between organs, tissues, and cell populations. Optimal involvement of toxicologic pathologists in toxicogenomics requires that they communicate effectively with the many other scientists critical for the effective application of this complex discipline to societal problems. As noted by Petricoin III et al (Nature Genetics 32, 474-479, 2002), cooperation among regulators, sponsors and experts will be essential for realizing the potential of microarrays for public health. Following a brief introduction to the role of mathematics in toxicogenomics, "data interpretation" from the perspective of a pathologist is briefly discussed. Based on oscillatory behavior in the liver, the importance of an understanding of mathematics is addressed, and an approach to learning mathematics "later in life" is provided. An understanding of pathology by mathematicians involved in toxicogenomics is equally critical, as both mathematics and pathology are essential for transforming toxicogenomics data sets into useful knowledge.
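
    A minimal sketch of the clustering step mentioned above, grouping genes by the similarity of their expression profiles with standard hierarchical clustering; the expression matrix is synthetic:

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(2)
      expression = rng.normal(size=(200, 12))          # 200 genes x 12 arrays (synthetic)

      dist = pdist(expression, metric="correlation")   # 1 - Pearson correlation between genes
      tree = linkage(dist, method="average")           # agglomerative clustering
      clusters = fcluster(tree, t=5, criterion="maxclust")
      print(np.bincount(clusters)[1:])                 # cluster sizes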

  18. Evaluation of the co-registration capabilities of a MRI/PET compatible bed in an Experimental autoimmune encephalomyelitis (EAE) model

    NASA Astrophysics Data System (ADS)

    Esposito, Giovanna; D'angeli, Luca; Bartoli, Antonietta; Chaabane, Linda; Terreno, Enzo

    2013-02-01

    Positron Emission Tomography (PET) with 18F-FDG is a promising tool for the detection and evaluation of active inflammation in animal models of neuroinflammation. MRI is a complementary imaging technique with the high resolution and contrast needed to obtain the anatomical data required to analyze PET data. To combine the PET and MRI modalities, we developed a support bed system compatible with both scanners that allows imaging exams to be performed without repositioning the animal. With this approach, MRI and PET data were acquired in mice with experimental autoimmune encephalomyelitis (EAE). In this model, it was possible to measure a variation of 18F-FDG uptake proportional to the degree of disease severity, which is mainly related to central nervous system (CNS) inflammation. In contrast to the poorly resolved PET images, the co-registered MRI/PET images made it possible to distinguish the different brain structures and to obtain a more accurate tracer evaluation. This is essential in particular for brain regions whose size is of the order of the spatial resolution of PET.

  19. Integral nuclear data validation using experimental spent nuclear fuel compositions

    DOE PAGES

    Gauld, Ian C.; Williams, Mark L.; Michel-Sendis, Franco; ...

    2017-07-19

    Measurements of the isotopic contents of spent nuclear fuel provide experimental data that are a prerequisite for validating computer codes and nuclear data for many spent fuel applications. Under the auspices of the Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (NEA) and guidance of the Expert Group on Assay Data of Spent Nuclear Fuel of the NEA Working Party on Nuclear Criticality Safety, a new database of expanded spent fuel isotopic compositions has been compiled. The database, Spent Fuel Compositions (SFCOMPO) 2.0, includes measured data for more than 750 fuel samples acquired from 44 different reactors and representing eight different reactor technologies. Measurements for more than 90 isotopes are included. This new database provides data essential for establishing the reliability of code systems for inventory predictions, but it also has broader potential application to nuclear data evaluation. Furthermore, the database is described together with adjoint-based sensitivity and uncertainty tools for transmutation systems developed to quantify the importance of nuclear data on nuclide concentrations.
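
    A common way such assay data are used for validation is the calculated-to-experimental (C/E) ratio per nuclide; a minimal sketch with made-up numbers (not SFCOMPO data):

      # C/E = calculated / experimental concentration per nuclide; values near 1 mean good agreement.
      calculated = {"Cs-137": 1.52e-3, "Nd-148": 4.10e-4, "Pu-239": 5.60e-3}  # made-up values
      measured = {"Cs-137": 1.49e-3, "Nd-148": 4.22e-4, "Pu-239": 5.48e-3}    # made-up values

      for nuclide, calc in calculated.items():
          print(f"{nuclide}: C/E = {calc / measured[nuclide]:.3f}")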

  20. Reconstruction of Tissue-Specific Metabolic Networks Using CORDA

    PubMed Central

    Schultz, André; Qutub, Amina A.

    2016-01-01

    Human metabolism involves thousands of reactions and metabolites. To interpret this complexity, computational modeling becomes an essential experimental tool. One of the most popular techniques to study human metabolism as a whole is genome scale modeling. A key challenge to applying genome scale modeling is identifying critical metabolic reactions across diverse human tissues. Here we introduce a novel algorithm called Cost Optimization Reaction Dependency Assessment (CORDA) to build genome scale models in a tissue-specific manner. CORDA performs more efficiently computationally, shows better agreement to experimental data, and displays better model functionality and capacity when compared to previous algorithms. CORDA also returns reaction associations that can greatly assist in any manual curation to be performed following the automated reconstruction process. Using CORDA, we developed a library of 76 healthy and 20 cancer tissue-specific reconstructions. These reconstructions identified which metabolic pathways are shared across diverse human tissues. Moreover, we identified changes in reactions and pathways that are differentially included and present different capacity profiles in cancer compared to healthy tissues, including up-regulation of folate metabolism, the down-regulation of thiamine metabolism, and tight regulation of oxidative phosphorylation. PMID:26942765
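
    Genome scale reconstructions such as those produced by CORDA are typically analyzed with flux balance analysis; a minimal, generic sketch of that optimization (not CORDA itself) on a toy three-reaction network:

      import numpy as np
      from scipy.optimize import linprog

      # Toy network: v1 imports metabolite A, v2 converts A -> B, v3 exports B ("biomass").
      S = np.array([[1, -1, 0],
                    [0, 1, -1]])
      c = np.array([0.0, 0.0, -1.0])           # linprog minimizes, so maximize v3 via -v3
      bounds = [(0, 10), (0, 10), (0, 10)]     # flux bounds
      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
      print(res.x)                             # optimal flux distribution, [10, 10, 10]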

  1. Advances in single-cell experimental design made possible by automated imaging platforms with feedback through segmentation.

    PubMed

    Crick, Alex J; Cammarota, Eugenia; Moulang, Katie; Kotar, Jurij; Cicuta, Pietro

    2015-01-01

    Live optical microscopy has become an essential tool for studying the dynamical behaviors and variability of single cells, and cell-cell interactions. However, experiments and data analysis in this area are often extremely labor intensive, and it has often not been achievable or practical to perform properly standardized experiments on a statistically viable scale. We have addressed this challenge by developing automated live imaging platforms that help standardize experiments, increase throughput, and unlock previously impossible experiments. Our real-time cell tracking programs communicate in feedback with microscope and camera control software, and they are highly customizable, flexible, and efficient. As examples of our current research utilizing these automated platforms, we describe two quite different applications: egress-invasion interactions of malaria parasites and red blood cells, and imaging of immune cells that possess high motility and internal dynamics. The automated imaging platforms are able to track a large number of motile cells simultaneously, over hours or even days at a time, greatly increasing data throughput and opening up new experimental possibilities. Copyright © 2015 Elsevier Inc. All rights reserved.
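
    A minimal sketch of the segmentation step in such a feedback loop: threshold a frame, label candidate cells, and return centroids that a stage or autofocus controller could act on. The frame is synthetic and the function name is hypothetical:

      import numpy as np
      from scipy import ndimage

      # Hypothetical helper: threshold a frame, label connected regions, return their centroids.
      def find_cell_centroids(frame, threshold):
          labels, n = ndimage.label(frame > threshold)
          return np.array(ndimage.center_of_mass(frame, labels, range(1, n + 1)))

      frame = np.random.default_rng(3).random((256, 256))   # synthetic image
      centroids = find_cell_centroids(frame, 0.999)
      print(centroids[:5])                                   # coordinates a controller could act on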

  2. Integral nuclear data validation using experimental spent nuclear fuel compositions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauld, Ian C.; Williams, Mark L.; Michel-Sendis, Franco

    Measurements of the isotopic contents of spent nuclear fuel provide experimental data that are a prerequisite for validating computer codes and nuclear data for many spent fuel applications. Under the auspices of the Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (NEA) and guidance of the Expert Group on Assay Data of Spent Nuclear Fuel of the NEA Working Party on Nuclear Criticality Safety, a new database of expanded spent fuel isotopic compositions has been compiled. The database, Spent Fuel Compositions (SFCOMPO) 2.0, includes measured data for more than 750 fuel samples acquired from 44 different reactors and representing eight different reactor technologies. Measurements for more than 90 isotopes are included. This new database provides data essential for establishing the reliability of code systems for inventory predictions, but it also has broader potential application to nuclear data evaluation. Furthermore, the database is described together with adjoint-based sensitivity and uncertainty tools for transmutation systems developed to quantify the importance of nuclear data on nuclide concentrations.

  3. Development and utilization of complementary communication channels for treatment decision making and survivorship issues among cancer patients: The CIS Research Consortium Experience.

    PubMed

    Fleisher, Linda; Wen, Kuang Yi; Miller, Suzanne M; Diefenbach, Michael; Stanton, Annette L; Ropka, Mary; Morra, Marion; Raich, Peter C

    2015-11-01

    Cancer patients and survivors are assuming active roles in decision-making and digital patient support tools are widely used to facilitate patient engagement. As part of Cancer Information Service Research Consortium's randomized controlled trials focused on the efficacy of eHealth interventions to promote informed treatment decision-making for newly diagnosed prostate and breast cancer patients, and post-treatment breast cancer, we conducted a rigorous process evaluation to examine the actual use of and perceived benefits of two complementary communication channels -- print and eHealth interventions. The three Virtual Cancer Information Service (V-CIS) interventions were developed through a rigorous developmental process, guided by self-regulatory theory, informed decision-making frameworks, and health communications best practices. Control arm participants received NCI print materials; experimental arm participants received the additional V-CIS patient support tool. Actual usage data from the web-based V-CIS was also obtained and reported. Print materials were highly used by all groups. About 60% of the experimental group reported using the V-CIS. Those who did use the V-CIS rated it highly on improvements in knowledge, patient-provider communication and decision-making. The findings show that how patients actually use eHealth interventions either singularly or within the context of other communication channels is complex. Integrating rigorous best practices and theoretical foundations is essential and multiple communication approaches should be considered to support patient preferences.

  4. 20 CFR 416.1220 - Property essential to self-support; general.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... and supplies, motor vehicles, and tools, etc.) used in a trade or business (as defined in § 404.1066... activities. Liquid resources other than those used as part of a trade or business are not property essential...

  5. How does social essentialism affect the development of inter-group relations?

    PubMed

    Rhodes, Marjorie; Leslie, Sarah-Jane; Saunders, Katya; Dunham, Yarrow; Cimpian, Andrei

    2018-01-01

    Psychological essentialism is a pervasive conceptual bias to view categories as reflecting something deep, stable, and informative about their members. Scholars from diverse disciplines have long theorized that psychological essentialism has negative ramifications for inter-group relations, yet little previous empirical work has experimentally tested the social implications of essentialist beliefs. Three studies (N = 127, ages 4.5-6) found that experimentally inducing essentialist beliefs about a novel social category led children to share fewer resources with category members, but did not lead to the out-group dislike that defines social prejudice. These findings indicate that essentialism negatively influences some key components of inter-group relations, but does not lead directly to the development of prejudice. © 2017 John Wiley & Sons Ltd.

  6. The influence of essential oils on human attention. I: alertness.

    PubMed

    Ilmberger, J; Heuberger, E; Mahrhofer, C; Dessovic, H; Kowarik, D; Buchbauer, G

    2001-03-01

    Scientific research on the effects of essential oils on human behavior lags behind the promises made by popular aromatherapy. Nearly all aspects of human behavior are closely linked to processes of attention, the basic level being that of alertness, which ranges from sleep to wakefulness. In our study we measured the influence of essential oils and components of essential oils [peppermint, jasmine, ylang-ylang, 1,8-cineole (in two different dosages) and menthol] on this core attentional function, which can be experimentally defined as speed of information processing. Substances were administered by inhalation; levels of alertness were assessed by measuring motor and reaction times in a reaction time paradigm. The performances of the six experimental groups receiving substances (n = 20 in four groups, n = 30 in two groups) were compared with those of corresponding control groups receiving water. Between-group analyses, i.e. comparisons between experimental groups and their respective control groups, mostly did not reach statistical significance. However, within-group analyses showed complex correlations between subjective evaluations of the substances and objective performance, indicating that the effects of essential oils or their components on basic forms of attentional behavior are mainly psychological.

  7. OGEE v2: an update of the online gene essentiality database with special focus on differentially essential genes in human cancer cell lines.

    PubMed

    Chen, Wei-Hua; Lu, Guanting; Chen, Xiao; Zhao, Xing-Ming; Bork, Peer

    2017-01-04

    OGEE is an Online GEne Essentiality database. To enhance our understanding of the essentiality of genes, in OGEE we collected experimentally tested essential and non-essential genes, as well as associated gene properties known to contribute to gene essentiality. We focus on large-scale experiments, and complement our data with text-mining results. We organized tested genes into data sets according to their sources, and tagged those with variable essentiality statuses across data sets as conditionally essential genes, intending to highlight the complex interplay between gene functions and environments/experimental perturbations. Developments since the last public release include increased numbers of species and gene essentiality data sets, inclusion of non-coding essential sequences and genes with intermediate essentiality statuses. In addition, we included 16 essentiality data sets from cancer cell lines, corresponding to 9 human cancers; with OGEE, users can easily explore the shared and differentially essential genes within and between cancer types. These genes, especially those derived from cell lines that are similar to tumor samples, could reveal the oncogenic drivers, paralogous gene expression pattern and chromosomal structure of the corresponding cancer types, and can be further screened to identify targets for cancer therapy and/or new drug development. OGEE is freely available at http://ogee.medgenius.info. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
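
    A minimal sketch of how conditionally essential genes can be flagged from tabulated essentiality calls; the column names and records below are illustrative, not OGEE's actual schema:

      import pandas as pd

      records = pd.DataFrame({                      # illustrative records only
          "gene": ["g1", "g1", "g2", "g2", "g3"],
          "dataset": ["d1", "d2", "d1", "d2", "d1"],
          "essential": [True, False, True, True, False],
      })
      statuses = records.groupby("gene")["essential"].nunique()
      conditionally_essential = statuses[statuses > 1].index.tolist()
      print(conditionally_essential)                # ['g1']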

  8. OLS Client and OLS Dialog: Open Source Tools to Annotate Public Omics Datasets.

    PubMed

    Perez-Riverol, Yasset; Ternent, Tobias; Koch, Maximilian; Barsnes, Harald; Vrousgou, Olga; Jupp, Simon; Vizcaíno, Juan Antonio

    2017-10-01

    The availability of user-friendly software to annotate biological datasets and experimental details is becoming essential in data management practices, both in local storage systems and in public databases. The Ontology Lookup Service (OLS, http://www.ebi.ac.uk/ols) is a popular centralized service to query, browse and navigate biomedical ontologies and controlled vocabularies. Recently, the OLS framework has been completely redeveloped (version 3.0), including enhancements in the data model, like the added support for Web Ontology Language based ontologies, among many other improvements. However, the new OLS is not backwards compatible and new software tools are needed to enable access to this widely used framework now that the previous version is no longer available. We here present the OLS Client as a free, open-source Java library to retrieve information from the new version of the OLS. It enables rapid tool creation by providing a robust, pluggable programming interface and common data model to programmatically access the OLS. The library has already been integrated and is routinely used by several bioinformatics resources and related data annotation tools. Secondly, we also introduce an updated version of the OLS Dialog (version 2.0), a Java graphical user interface that can be easily plugged into Java desktop applications to access the OLS. The software and related documentation are freely available at https://github.com/PRIDE-Utilities/ols-client and https://github.com/PRIDE-Toolsuite/ols-dialog. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
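
    The OLS Client itself is a Java library; as a rough illustration of the same kind of lookup, the OLS REST service can be queried directly. The endpoint path and response fields below are assumptions based on the public service, not the library's API:

      import requests

      # Assumed endpoint path and response fields; adjust to the current OLS documentation.
      resp = requests.get("https://www.ebi.ac.uk/ols/api/ontologies", timeout=30)
      resp.raise_for_status()
      for ontology in resp.json().get("_embedded", {}).get("ontologies", [])[:5]:
          print(ontology.get("ontologyId"))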

  9. [Experimental model of tooth decay as an educational tool for school-age children].

    PubMed

    de Araújo Silva, Thiago Fernando; Feitosa, José Leonilson; Medeiros Dantas, Rodrigo Maristony; Dantas de Medeiros, Fabianna da Conceição; Cavalcanti Lima, Isabela Pinheiro; Guerra Seabra, Eduardo José

    2016-04-01

    Objective This work consisted of the construction of an educational in vitro model of dental caries, starting from an adaptation of Miller's classic experiment. Methods In a sterilized and sealed glass jar, a sample of human saliva was collected and a substrate of manufactured sugar (sucrose) was added, together with a human tooth with a healthy dental crown extracted during dental treatment. Research phase I included a negative control sample (tooth + saliva without added sucrose), and the other samples were opened after 1, 2, 3 and 4 months of incubation. This phase was essential for defining the incubation times of the next experimental phase. In phase II, two saliva donors with poor oral health habits were recruited. The incubation time (defined in phase I) was 2 and 3 months for each donor. Results These research data make it possible to build educational materials about the etiology of tooth decay and its clinical evolution. They also make possible the production of an explanatory sheet on how to reproduce this experimental model for use by school children in secondary education. Conclusions Doing this kind of work together at school can help reduce inequities in oral health, especially since it brings the discourses closer together, facilitating the process of information dissemination.

  10. A numerical study of mixing and combustion in hypervelocity flows through a scramjet combustor model

    NASA Technical Reports Server (NTRS)

    Krishamurthy, Ramesh

    1993-01-01

    Interest in high speed, air-breathing propulsion systems such as scramjets has revived in recent years fueled to a large extent by the National Aerospace Plane (NASP) program. These vehicles are expected to fly trans-atmospheric and as a consequence, the Mach number level within the engine/combustor would be rather high (M greater than 5). Ground based testing of such scramjet engines requires a facility that can not only achieve the right Mach number, but also have the proper pressures and temperatures to simulate the combustion processes. At present, only pulse type facilities can provide such high enthalpy flows. The newest of these is the free-piston shock tunnel, T5 located at GALCIT. Recently, a generic combustor model was tested in T5, and the experimental data from that study is analyzed in the present report. The available experimental data from T5 are essentially the static pressures on the injection wall and the one opposite to it. Thus, a principal aim of the present study was to validate the available experimental data by using a proven CFD tool and then investigate the performance characteristics of the combustor model, such as, the mixing efficiency and combustion efficiency. For this purpose, in this study, the code GASP has been used.

  11. A numerical study of mixing and combustion in hypervelocity flows through a scramjet combustor model

    NASA Astrophysics Data System (ADS)

    Krishamurthy, Ramesh

    1993-12-01

    Interest in high speed, air-breathing propulsion systems such as scramjets has revived in recent years fueled to a large extent by the National Aerospace Plane (NASP) program. These vehicles are expected to fly trans-atmospheric and as a consequence, the Mach number level within the engine/combustor would be rather high (M greater than 5). Ground based testing of such scramjet engines requires a facility that can not only achieve the right Mach number, but also have the proper pressures and temperatures to simulate the combustion processes. At present, only pulse type facilities can provide such high enthalpy flows. The newest of these is the free-piston shock tunnel, T5 located at GALCIT. Recently, a generic combustor model was tested in T5, and the experimental data from that study is analyzed in the present report. The available experimental data from T5 are essentially the static pressures on the injection wall and the one opposite to it. Thus, a principal aim of the present study was to validate the available experimental data by using a proven CFD tool and then investigate the performance characteristics of the combustor model, such as, the mixing efficiency and combustion efficiency. For this purpose, in this study, the code GASP has been used.

  12. Using high-performance ¹H NMR (HP-qNMR®) for the certification of organic reference materials under accreditation guidelines--describing the overall process with focus on homogeneity and stability assessment.

    PubMed

    Weber, Michael; Hellriegel, Christine; Rueck, Alexander; Wuethrich, Juerg; Jenks, Peter

    2014-05-01

    Quantitative NMR spectroscopy (qNMR) is gaining interest across both analytical and industrial research applications and has become an essential tool for content assignment and the quantitative determination of impurities. The key benefits of using qNMR as a measurement method for the purity determination of organic molecules are discussed, with emphasis on the ability to establish traceability to "The International System of Units" (SI). The work describes a routine certification procedure from the point of view of a commercial producer of certified reference materials (CRM) under ISO/IEC 17025 and ISO Guide 34 accreditation, which resulted in a set of essential references for (1)H qNMR measurements; the relevant application data for these substances are given. The overall process includes specific selection criteria, pre-tests, experimental conditions, and homogeneity and stability studies. The advantages of an accelerated stability study over the classical stability-test design are shown with respect to shelf-life determination and shipping conditions. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
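
    Content assignment by 1H qNMR rests on comparing an integrated analyte signal against an internal standard of known purity, P_a = (I_a/I_std) * (N_std/N_a) * (M_a/M_std) * (m_std/m_a) * P_std. A minimal sketch with purely illustrative numbers (not from the certification study):

      # P_a: analyte purity; I: integrals; N: nuclei per signal; M: molar masses;
      # m: weighed masses; P_std: certified purity of the internal standard.
      def qnmr_purity(I_a, I_std, N_a, N_std, M_a, M_std, m_a, m_std, P_std):
          return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_a) * P_std

      # Purely illustrative numbers:
      print(qnmr_purity(I_a=1.02, I_std=1.00, N_a=2, N_std=4,
                        M_a=194.2, M_std=172.6, m_a=10.1, m_std=10.0, P_std=0.9999))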

  13. Comparison of simulations with PHITS and HIBRAC with experimental data in the context of particle therapy monitoring

    PubMed Central

    Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine

    2014-01-01

    Therapeutic irradiation with protons and ions is advantageous over radiotherapy with photons due to its favorable dose deposition. Additionally, ion beams provide a higher relative biological effectiveness than photons. For this reason, an improved treatment of deep-seated tumors is achieved and normal tissue is spared. However, small deviations from the treatment plan can have a large impact on the dose distribution. Therefore, monitoring is required to assure the quality of the treatment. Particle therapy positron emission tomography (PT-PET) is the only clinically proven method that provides non-invasive monitoring of dose delivery. It makes use of the β+-activity produced by nuclear fragmentation during irradiation. In order to evaluate these PT-PET measurements, simulations of the β+-activity are necessary. Therefore, it is essential to know the yields of the β+-emitting nuclides at every position of the beam path as exactly as possible. We evaluated the three-dimensional Monte-Carlo simulation tool PHITS (version 2.30) [1] and the 1D deterministic simulation tool HIBRAC [2] with respect to the production of β+-emitting nuclides. The yields of the most important β+-emitting nuclides for carbon, lithium, helium and proton beams have been calculated. The results were then compared with experimental data obtained at GSI Helmholtzzentrum für Schwerionenforschung Darmstadt, Germany. GEANT4 simulations provide an additional benchmark [3]. For PHITS, the impact of different nuclear reaction models, total cross-section models and evaporation models on the β+-emitter production has been studied. In general, PHITS underestimates the yields of positron emitters and cannot compete with GEANT4 so far. The β+-emitters calculated with an extended HIBRAC code were in good agreement with the experimental data for carbon and proton beams and comparable to the GEANT4 results, see [4] and Fig. 1. Considering the simulation results and its speed compared with three-dimensional Monte-Carlo tools, HIBRAC is a good candidate for implementation in clinical routine PT-PET.
    Fig. 1: Depth-dependent yields of 11C and 15O production during proton irradiation of a PMMA target at 140 MeV [4].

  14. ASSET. Assessment Simplification System for Elementary Teachers.

    ERIC Educational Resources Information Center

    Kentucky State Dept. of Education, Frankfort.

    This document is designed to show the connections between assessment tools available for primary and intermediate grades in the Kentucky public schools. Sections of the document outline the essential assessment tools and give information about how they support and mirror each other. These tools can be used to bridge the knowledge of primary and…

  15. A Standards-Based Grading and Reporting Tool for Faculty: Design and Implications

    ERIC Educational Resources Information Center

    Sadik, Alaa M.

    2011-01-01

    The use of standards-based assessment, grading and reporting tools is essential to ensure that assessment meets acceptable levels of quality and standardization. This study reports the design, development and evaluation of a standards-based assessment tool for instructors at Sultan Qaboos University, Sultanate of Oman. The Rapid Applications…

  16. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    ERIC Educational Resources Information Center

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  17. Anti-inflammatory activity of leaf essential oil from Cinnamomum longepaniculatum (Gamble) N. Chao.

    PubMed

    Du, Yong-Hua; Feng, Rui-Zhang; Li, Qun; Wei, Qin; Yin, Zhong-Qiong; Zhou, Li-Jun; Tao, Cui; Jia, Ren-Yong

    2014-01-01

    The anti-inflammatory activity of the essential oil from C. longepaniculatum was evaluated in three experimental models: dimethyl benzene-induced ear edema in mice, carrageenan-induced paw edema in rats and acetic acid-induced vascular permeability in mice. The influence of the essential oil on histological changes and on prostaglandin E2 (PGE2), histamine and 5-hydroxytryptamine (5-HT) production associated with carrageenan-induced rat paw edema was also investigated. The essential oil (0.5, 0.25, 0.13 ml/kg b.w.) showed significant, dose-dependent inhibition of inflammation in the three experimental models. The anti-inflammatory activity of the essential oil occurred in both the early and late phases and peaked at 4 h after carrageenan injection. The essential oil resulted in a dose-dependent reduction of paw thickness, connective tissue injury and the infiltration of inflammatory cells. The essential oil also significantly reduced the production of PGE2, histamine and 5-HT in the exudates of edema paws induced by carrageenan. Both the essential oil and indomethacin produced a relatively lower percentage inhibition of histamine and 5-HT than of PGE2 at 4 h after carrageenan injection.

  18. The Basics in Pottery: Clay and Tools.

    ERIC Educational Resources Information Center

    Larson, Joan

    1985-01-01

    Art teachers at the middle school or junior high school level usually find themselves in a program teaching ceramics. The most essential tools needed for a ceramics class are discussed. Different kinds of clay are also discussed. (RM)

  19. A cross-sectional survey of essential surgical capacity in Somalia

    PubMed Central

    Elkheir, Natalie; Sharma, Akshay; Cherian, Meena; Saleh, Omar Abdelrahman; Everard, Marthe; Popal, Ghulam Rabani; Ibrahim, Abdi Awad

    2014-01-01

    Objective To assess life-saving and disability-preventing surgical services (including emergency, trauma, obstetrics, anaesthesia) of health facilities in Somalia and to assist in the planning of strategies for strengthening surgical care systems. Design Cross-sectional survey. Setting Health facilities in all 3 administrative zones of Somalia; northwest Somalia (NWS), known as Somaliland; northeast Somalia (NES), known as Puntland; and south/central Somalia (SCS). Participants 14 health facilities. Measures The WHO Tool for Situational Analysis to Assess Emergency and Essential Surgical Care was employed to capture a health facility's capacity to deliver surgical and anaesthesia services by investigating four categories of data: infrastructure, human resources, interventions available and equipment. Results The 14 facilities surveyed in Somalia represent 10 of the 18 districts throughout the country. The facilities serve an average patient population of 331 250 people, and 12 of the 14 identify as hospitals. While major surgical procedures were provided at many facilities (caesarean section, laparotomy, appendicectomy, etc), only 22% had fully available oxygen access, 50% fully available electricity and less than 30% had any management guidelines for emergency and surgical care. Furthermore, only 36% were able to provide general anaesthesia inhalation due to lack of skills, supplies and equipment. Basic supplies for airway management and the prevention of infection transmission were severely lacking in most facilities. Conclusions According to the results of the WHO Tool for Situational Analysis to Assess Emergency and Essential Surgical Care survey, there exist significant gaps in the capacity of emergency and essential surgical services in Somalia including inadequacies in essential equipment, service provision and infrastructure. The information provided by the WHO tool can serve as a basis for evidence-based decisions on country-level policy regarding the allocation of resources and provision of emergency and essential surgical services. PMID:24812189

  20. A cross-sectional survey of essential surgical capacity in Somalia.

    PubMed

    Elkheir, Natalie; Sharma, Akshay; Cherian, Meena; Saleh, Omar Abdelrahman; Everard, Marthe; Popal, Ghulam Rabani; Ibrahim, Abdi Awad

    2014-05-07

    To assess life-saving and disability-preventing surgical services (including emergency, trauma, obstetrics, anaesthesia) of health facilities in Somalia and to assist in the planning of strategies for strengthening surgical care systems. Cross-sectional survey. Health facilities in all 3 administrative zones of Somalia; northwest Somalia (NWS), known as Somaliland; northeast Somalia (NES), known as Puntland; and south/central Somalia (SCS). 14 health facilities. The WHO Tool for Situational Analysis to Assess Emergency and Essential Surgical Care was employed to capture a health facility's capacity to deliver surgical and anaesthesia services by investigating four categories of data: infrastructure, human resources, interventions available and equipment. The 14 facilities surveyed in Somalia represent 10 of the 18 districts throughout the country. The facilities serve an average patient population of 331 250 people, and 12 of the 14 identify as hospitals. While major surgical procedures were provided at many facilities (caesarean section, laparotomy, appendicectomy, etc), only 22% had fully available oxygen access, 50% fully available electricity and less than 30% had any management guidelines for emergency and surgical care. Furthermore, only 36% were able to provide general anaesthesia inhalation due to lack of skills, supplies and equipment. Basic supplies for airway management and the prevention of infection transmission were severely lacking in most facilities. According to the results of the WHO Tool for Situational Analysis to Assess Emergency and Essential Surgical Care survey, there exist significant gaps in the capacity of emergency and essential surgical services in Somalia including inadequacies in essential equipment, service provision and infrastructure. The information provided by the WHO tool can serve as a basis for evidence-based decisions on country-level policy regarding the allocation of resources and provision of emergency and essential surgical services.

  1. The DIMA web resource--exploring the protein domain network.

    PubMed

    Pagel, Philipp; Oesterheld, Matthias; Stümpflen, Volker; Frishman, Dmitrij

    2006-04-15

    Conserved domains represent essential building blocks of most known proteins. Owing to their role as modular components carrying out specific functions, they form a network based both on functional relations and on direct physical interactions. We have previously shown that domain interaction networks provide substantially novel information with respect to networks built on full-length protein chains. In this work we present a comprehensive web resource for interactively exploring the Domain Interaction MAp (DIMA). The tool aims at the integration of multiple data sources and prediction techniques, two of which have been implemented so far: domain phylogenetic profiling and experimentally demonstrated domain contacts from known three-dimensional structures. A powerful yet simple user interface enables the user to compute, visualize, navigate and download domain networks based on specific search criteria. http://mips.gsf.de/genre/proj/dima
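
    Phylogenetic profiling of the kind mentioned above scores how often two domains co-occur across genomes. The following minimal sketch (not the DIMA implementation; the domain names and profiles are hypothetical) shows one common way such a score can be computed from binary presence/absence vectors.

    ```python
    # Illustrative sketch of domain phylogenetic profiling (not the DIMA code):
    # two domains that co-occur across many genomes get a high similarity score.
    import numpy as np

    def profile_similarity(profile_a, profile_b):
        """Jaccard similarity of two binary presence/absence vectors
        (one entry per genome)."""
        a = np.asarray(profile_a, dtype=bool)
        b = np.asarray(profile_b, dtype=bool)
        union = np.logical_or(a, b).sum()
        if union == 0:
            return 0.0
        return np.logical_and(a, b).sum() / union

    # Hypothetical profiles over six genomes
    kinase_domain = [1, 1, 0, 1, 1, 0]
    sh2_domain    = [1, 1, 0, 1, 0, 0]
    print(profile_similarity(kinase_domain, sh2_domain))  # 0.75
    ```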

  2. Multiphysics and multiscale modelling, data-model fusion and integration of organ physiology in the clinic: ventricular cardiac mechanics.

    PubMed

    Chabiniok, Radomir; Wang, Vicky Y; Hadjicharalambous, Myrianthi; Asner, Liya; Lee, Jack; Sermesant, Maxime; Kuhl, Ellen; Young, Alistair A; Moireau, Philippe; Nash, Martyn P; Chapelle, Dominique; Nordsletten, David A

    2016-04-06

    With heart and cardiovascular diseases continually challenging healthcare systems worldwide, translating basic research on cardiac (patho)physiology into clinical care is essential. Exacerbating this already extensive challenge is the complexity of the heart, relying on its hierarchical structure and function to maintain cardiovascular flow. Computational modelling has been proposed and actively pursued as a tool for accelerating research and translation. Allowing exploration of the relationships between physics, multiscale mechanisms and function, computational modelling provides a platform for improving our understanding of the heart. Further integration of experimental and clinical data through data assimilation and parameter estimation techniques is bringing computational models closer to use in routine clinical practice. This article reviews developments in computational cardiac modelling and how their integration with medical imaging data is providing new pathways for translational cardiac modelling.

  3. A dual-heterodyne laser interferometer for simultaneous measurement of linear and angular displacements.

    PubMed

    Yan, Hao; Duan, Hui-Zong; Li, Lin-Tao; Liang, Yu-Rong; Luo, Jun; Yeh, Hsien-Chi

    2015-12-01

    Picometer laser interferometry is an essential tool for ultra-precision measurements in frontier scientific research and advanced manufacturing. In this paper, we present a dual-heterodyne laser interferometer for simultaneously measuring linear and angular displacements with resolutions of picometer and nanoradian, respectively. The phase measurement method is based on cross-correlation analysis and realized by a PXI-bus data acquisition system. By implementing a dual-heterodyne interferometer with a highly symmetric optical configuration, low frequency noises caused by the environmental fluctuations can be suppressed to very low levels via common-mode noise rejection. Experimental results for the dual-heterodyne interferometer configuration presented demonstrate that the noise levels of the linear and angular displacement measurements are approximately 1 pm/Hz(1/2) and 0.5 nrad/Hz(1/2) at 1 Hz.
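
    For readers unfamiliar with heterodyne phase measurement, the sketch below illustrates the general idea of recovering displacement from the phase difference between a measurement and a reference beat signal. It is a hedged, simplified illustration, not the authors' cross-correlation/PXI implementation; the sampling rate, heterodyne frequency, wavelength and the lambda/(4*pi) double-pass scaling are assumptions.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def beat_phase(x, fs, f_het):
        """Phase of a heterodyne beat signal via its analytic signal."""
        analytic = hilbert(x)                 # real signal -> complex analytic signal
        t = np.arange(len(x)) / fs
        return np.unwrap(np.angle(analytic * np.exp(-2j * np.pi * f_het * t)))

    def displacement(meas, ref, fs, f_het, wavelength=632.8e-9):
        """Linear displacement from the measurement/reference phase difference.
        The lambda/(4*pi) scaling assumes a double-pass measurement arm."""
        dphi = beat_phase(meas, fs, f_het) - beat_phase(ref, fs, f_het)
        return dphi * wavelength / (4 * np.pi)

    # Synthetic test: 2 nm sinusoidal motion at 500 Hz, 100 kHz beat, 1 MHz sampling
    fs, f_het, lam = 1.0e6, 100e3, 632.8e-9
    t = np.arange(20000) / fs
    x_true = 2e-9 * np.sin(2 * np.pi * 500 * t)
    meas = np.cos(2 * np.pi * f_het * t + 4 * np.pi * x_true / lam)
    ref = np.cos(2 * np.pi * f_het * t)
    d = displacement(meas, ref, fs, f_het, wavelength=lam)
    print(round(float(np.max(d[1000:-1000])) * 1e9, 2))   # roughly 2 (nm)
    ```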

  4. 2015 Stewardship Science Academic Programs Annual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, Terri; Mischo, Millicent

    The Stockpile Stewardship Academic Programs (SSAP) are essential to maintaining a pipeline of professionals to support the technical capabilities that reside at the National Nuclear Security Administration (NNSA) national laboratories, sites, and plants. Since 1992, the United States has observed the moratorium on nuclear testing while significantly decreasing the nuclear arsenal. To accomplish this without nuclear testing, NNSA and its laboratories developed a science-based Stockpile Stewardship Program to maintain and enhance the experimental and computational tools required to ensure the continued safety, security, and reliability of the stockpile. NNSA launched its academic program portfolio more than a decade ago to engage students skilled in specific technical areas of relevance to stockpile stewardship. The success of this program is reflected by the large number of SSAP students choosing to begin their careers at NNSA national laboratories.

  5. Acute Myocardial Ischemia: Cellular Mechanisms Underlying ST Segment Elevation

    PubMed Central

    Di Diego, José M.; Antzelevitch, Charles

    2014-01-01

    The electrocardiogram (ECG) is an essential tool for the diagnosis of acute myocardial ischemia in the emergency department, as well as for that of an evolving acute myocardial infarction (AMI). Changes in the surface ECG in leads whose positive poles face the ischemic region are known to be related to injury currents flowing across the boundaries between the ischemic and the surrounding normal myocardium. Although experimental studies have also shown an endocardium to epicardium differential sensitivity to the effect of acute ischemia, the important contribution of this transmural heterogeneous response to the changes observed in the surface ECG is less appreciated by the clinical cardiologist. This review briefly discusses our current knowledge regarding the electrophysiology of the ischemic myocardium, focusing primarily on the electrophysiologic changes underlying the ECG alterations observed at the onset of a transmural AMI. PMID:24742586

  6. Current and anticipated uses of thermalhydraulic and neutronic codes at PSI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aksan, S.N.; Zimmermann, M.A.; Yadigaroglu, G.

    1997-07-01

    The thermalhydraulic and/or neutronic codes in use at PSI mainly provide the capability to perform deterministic safety analysis for Swiss NPPs and also serve as analysis tools for experimental facilities for LWR and ALWR simulations. In relation to these applications, physical model development and improvements, and assessment of the codes are also essential components of the activities. In this paper, a brief overview is provided of the thermalhydraulic and/or neutronic codes used for safety analysis of LWRs at PSI, and also of some experiences and applications with these codes. Based on these experiences, additional assessment needs are indicated, together with some model improvement needs. The future needs that could be used to specify both the development of a new code and also improvement of available codes are summarized.

  7. The Metabolic Core and Catalytic Switches Are Fundamental Elements in the Self-Regulation of the Systemic Metabolic Structure of Cells

    PubMed Central

    De la Fuente, Ildefonso M.; Cortes, Jesus M.; Perez-Pinilla, Martin B.; Ruiz-Rodriguez, Vicente; Veguillas, Juan

    2011-01-01

    Background: Experimental observations and numerical studies with dissipative metabolic networks have shown that cellular enzymatic activity self-organizes spontaneously, leading to the emergence of a metabolic core formed by a set of enzymatic reactions which are always active under all environmental conditions, while the rest of the catalytic processes are only intermittently active. The reactions of the metabolic core are essential for biomass formation and to assure optimal metabolic performance. The on-off catalytic reactions and the metabolic core are essential elements of a Systemic Metabolic Structure which seems to be a key feature common to all cellular organisms. Methodology/Principal Findings: In order to investigate the functional importance of the metabolic core we have studied different catalytic patterns of a dissipative metabolic network under different external conditions. The emerging biochemical data have been analysed using information-based dynamic tools, such as Pearson's correlation and Transfer Entropy (which measures effective functionality). Our results show that a functional structure of effective connectivity emerges which is dynamical and characterized by significant variations of bio-molecular information flows. Conclusions/Significance: We have quantified essential aspects of the metabolic core functionality. The always-active enzymatic reactions form a hub, with a high degree of effective connectivity, exhibiting a wide range of functional information values and able to act either as a source or as a sink of bio-molecular causal interactions. Likewise, we have found that the metabolic core is an essential part of an emergent functional structure characterized by catalytic modules and metabolic switches which allow critical transitions in enzymatic activity. Both the metabolic core and the catalytic switches, in which intermittently active enzymes are also involved, seem to be fundamental elements in the self-regulation of the Systemic Metabolic Structure. PMID:22125607
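
    Transfer Entropy, one of the information-based tools named above, can be estimated for discrete (for example, binarized on/off) activity series as in the sketch below. This is a generic history-length-1 estimator for illustration only, not the authors' analysis pipeline; the test signals are synthetic.

    ```python
    import numpy as np
    from collections import Counter

    def transfer_entropy(x, y):
        """Discrete transfer entropy TE(X->Y), history length 1, for symbol
        sequences x and y (e.g. binarized enzyme on/off activity)."""
        triples = Counter(zip(y[1:], y[:-1], x[:-1]))
        pairs_yx = Counter(zip(y[:-1], x[:-1]))
        pairs_yy = Counter(zip(y[1:], y[:-1]))
        singles_y = Counter(y[:-1])
        n = len(x) - 1
        te = 0.0
        for (y1, y0, x0), c in triples.items():
            p_joint = c / n
            p_y1_given_yx = c / pairs_yx[(y0, x0)]
            p_y1_given_y = pairs_yy[(y1, y0)] / singles_y[y0]
            te += p_joint * np.log2(p_y1_given_yx / p_y1_given_y)
        return te

    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 1000)
    y = np.roll(x, 1)                # y copies x with a one-step delay
    print(transfer_entropy(x, y))    # close to 1 bit
    print(transfer_entropy(y, x))    # close to 0 bits
    ```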

  8. OPTIMIZING BMP PLACEMENT AT WATERSHED-SCALE USING SUSTAIN

    EPA Science Inventory

    Watershed and stormwater managers need modeling tools to evaluate alternative plans for environmental quality restoration and protection needs in urban and developing areas. A watershed-scale decision-support system, based on cost optimization, provides an essential tool to suppo...

  9. U.S. EPA's Watershed Management Research Activities

    EPA Science Inventory

    Watershed and stormwater managers need modeling tools to evaluate alternative plans for environmental quality restoration and protection needs in urban and developing areas. A watershed-scale decision-support system, based on cost optimization, provides an essential tool to suppo...

  10. Primer on Condition Curves for Water Mains

    EPA Science Inventory

    The development of economical tools to prioritize pipe renewal based upon structural condition and remaining asset life is essential to effectively manage water infrastructure assets for both large and small diameter pipes. One tool that may facilitate asset management...

  11. "PowerUp"!: A Tool for Calculating Minimum Detectable Effect Sizes and Minimum Required Sample Sizes for Experimental and Quasi-Experimental Design Studies

    ERIC Educational Resources Information Center

    Dong, Nianbo; Maynard, Rebecca

    2013-01-01

    This paper and the accompanying tool are intended to complement existing supports for conducting power analysis by offering a tool based on the framework of Minimum Detectable Effect Sizes (MDES) formulae that can be used in determining sample size requirements and in estimating minimum detectable effect sizes for a range of individual- and…
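
    As a hedged illustration of the MDES framework referred to above (not the PowerUp! code itself), the sketch below evaluates the standard MDES formula for a simple individual-level randomized trial; the multiplier, degrees of freedom and example inputs are conventional assumptions rather than values taken from the tool.

    ```python
    from scipy import stats
    import numpy as np

    def mdes_individual_rct(n, p_treat=0.5, r2=0.0, n_covariates=0,
                            alpha=0.05, power=0.80, two_tailed=True):
        """Minimum detectable effect size (in standard deviation units) for a
        simple individual-level randomized trial, following the usual
        MDES = M_df * sqrt((1 - R^2) / (P * (1 - P) * n)) formula."""
        df = n - n_covariates - 2
        t_alpha = stats.t.ppf(1 - alpha / (2 if two_tailed else 1), df)
        t_power = stats.t.ppf(power, df)
        multiplier = t_alpha + t_power
        return multiplier * np.sqrt((1 - r2) / (p_treat * (1 - p_treat) * n))

    print(round(mdes_individual_rct(n=400), 3))   # roughly 0.28 sd with no covariates
    ```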

  12. Development of TIF based figuring algorithm for deterministic pitch tool polishing

    NASA Astrophysics Data System (ADS)

    Yi, Hyun-Su; Kim, Sug-Whan; Yang, Ho-Soon; Lee, Yun-Woo

    2007-12-01

    Pitch is perhaps the oldest material used for optical polishing, leaving superior surface texture, and has been used widely on the optics shop floor. However, owing to the unpredictable controllability of its removal characteristics, pitch tool polishing has rarely been analysed quantitatively, and many optics shops rely heavily on the optician's "feel" even today. In order to bring a degree of process controllability to pitch tool polishing, we added motorized tool motions to the conventional Draper type polishing machine and modelled the tool path in the absolute machine coordinate system. We then produced a number of Tool Influence Functions (TIFs) both from an analytical model and from a series of experimental polishing runs using the pitch tool. The theoretical TIFs agreed well with the experimental TIFs, to a profile accuracy of 79% in terms of shape. A surface figuring algorithm was then developed in-house utilizing both theoretical and experimental TIFs. We are currently undertaking a series of trial figuring experiments to prove the performance of the polishing algorithm, and the early results indicate that highly deterministic material removal control with the pitch tool can be achieved down to a certain level of form error. The machine renovation, TIF theory and experimental confirmation, and figuring simulation results are reported, together with implications for deterministic polishing.
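
    Figuring algorithms of this kind typically convert a measured error profile and a TIF into a dwell-time map by some form of deconvolution. The 1-D sketch below (a simplified, assumption-laden illustration, not the authors' algorithm) poses that step as a non-negative least-squares problem; the Gaussian TIF and error profile are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def dwell_times(error_profile, tif, positions):
        """Solve error ~= A @ t for non-negative dwell times t, where column j
        of A is the TIF centred at tool position j (1-D figuring sketch)."""
        n = len(error_profile)
        A = np.zeros((n, len(positions)))
        half = len(tif) // 2
        for j, p in enumerate(positions):
            lo, hi = max(0, p - half), min(n, p - half + len(tif))
            A[lo:hi, j] = tif[lo - (p - half): hi - (p - half)]
        t, residual = nnls(A, error_profile)
        return t, residual

    # Hypothetical Gaussian TIF and a bumpy error profile
    x = np.arange(200)
    tif = np.exp(-0.5 * (np.arange(-20, 21) / 6.0) ** 2)
    error = 5.0 * np.exp(-0.5 * ((x - 100) / 15.0) ** 2)
    t, res = dwell_times(error, tif, positions=np.arange(0, 200, 5))
    print(res)   # small residual -> the target correction is achievable
    ```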

  13. Study of runaway electrons in TUMAN-3M tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Shevelev, A.; Khilkevitch, E.; Tukachinsky, A.; Pandya, S.; Askinazi, L.; Belokurov, A.; Chugunov, I.; Doinikov, D.; Gin, D.; Iliasova, M.; Kiptily, V.; Kornev, V.; Lebedev, S.; Naidenov, V.; Plyusnin, V.; Polunovsky, I.; Zhubr, N.

    2018-07-01

    Studies of runaway electrons in present-day tokamaks are essential to improve theoretical models and to support possible avoidance or suppression mechanisms in future large-scale plasma devices. Some of the phenomena associated with runaway electrons take place on fast time scales, and thus it is essential to probe the runaway electrons at these scales to investigate the underlying physics. The present article reports experimental observations of runaway-electron-associated events at fast time scales, using a state-of-the-art multi-detector system developed at the Ioffe Institute and recently deployed on the TUMAN-3M tokamak. The system is based on high-performance scintillation gamma-ray spectrometers for measurements of bremsstrahlung generated during the interaction of accelerated electrons with the plasma and materials of the tokamak chamber. It includes a total of three detectors configured in spectroscopic mode with different lines of sight. Along with this hardware, dedicated algorithms were developed and validated that enable the separation of piled-up pulses, maximize the dynamic range of the detectors and provide a counting rate as high as 10(7) counts per second. The inversion code DeGaSum has been used for the reconstruction of the runaway electron energy distribution function from the measured gamma-ray spectra. Using this tool, experimental analysis of runaway electron beam generation and of the evolution of their energy distribution in representative TUMAN-3M plasma discharges is performed. The effect on the gamma-ray count rate during magnetohydrodynamic activity and possible changes in the runaway electron energy distribution function during sawtooth oscillations are discussed in detail. The possible maximum runaway electron energy in TUMAN-3M is investigated and compared with numerical analysis. In addition, the probability of runaway electron generation throughout the plasma discharge is estimated analytically and compared with experimental observations, which suggest a balance between production and loss of the runaway electrons.

  14. Experimental study on internal cooling system in hard turning of HCWCI using CBN tools

    NASA Astrophysics Data System (ADS)

    Ravi, A. M.; Murigendrappa, S. M.

    2018-04-01

    In recent times, hard turning has become one of the most promising techniques in manufacturing, especially for cutting very hard materials such as high chrome white cast iron (HCWCI). Cubic boron nitride (CBN), pCBN and carbide tools are the most appropriate for shearing such metals but are uneconomical. Since hard turning is carried out in dry conditions, lowering tool wear by minimizing tool temperature is the only practical solution. Studies reveal that no effective cooling systems are available so far to enhance the life of the cutting tools and to improve machinability characteristics. The detrimental effect of cutting parameters on cutting temperature is generally controlled by their proper selection. The objective of this paper is to develop a new cooling system to control the tool tip temperature, thereby minimizing the cutting forces and the tool wear rates. The material chosen for this work was HCWCI, and the cutting tools were CBN inserts. Intricate cavities were made on the periphery of the tool holder for easy flow of cold water. Taguchi techniques were adopted to carry out the experiments. The experimental results confirm a considerable reduction in the cutting forces and tool wear rates.

  15. Optimal experimental design for parameter estimation of a cell signaling model.

    PubMed

    Bandara, Samuel; Schlöder, Johannes P; Eils, Roland; Bock, Hans Georg; Meyer, Tobias

    2009-11-01

    Differential equation models that describe the dynamic changes of biochemical signaling states are important tools to understand cellular behavior. An essential task in building such representations is to infer the affinities, rate constants, and other parameters of a model from actual measurement data. However, intuitive measurement protocols often fail to generate data that restrict the range of possible parameter values. Here we utilized a numerical method to iteratively design optimal live-cell fluorescence microscopy experiments in order to reveal pharmacological and kinetic parameters of a phosphatidylinositol 3,4,5-trisphosphate (PIP(3)) second messenger signaling process that is deregulated in many tumors. The experimental approach included the activation of endogenous phosphoinositide 3-kinase (PI3K) by chemically induced recruitment of a regulatory peptide, reversible inhibition of PI3K using a kinase inhibitor, and monitoring of the PI3K-mediated production of PIP(3) lipids using the pleckstrin homology (PH) domain of Akt. We found that an intuitively planned and established experimental protocol did not yield data from which relevant parameters could be inferred. Starting from a set of poorly defined model parameters derived from the intuitively planned experiment, we calculated concentration-time profiles for both the inducing and the inhibitory compound that would minimize the predicted uncertainty of parameter estimates. Two cycles of optimization and experimentation were sufficient to narrowly confine the model parameters, with the mean variance of estimates dropping more than sixty-fold. Thus, optimal experimental design proved to be a powerful strategy to minimize the number of experiments needed to infer biological parameters from a cell signaling assay.
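
    Optimal experimental design of the kind described usually ranks candidate protocols by a scalar criterion built from predicted parameter sensitivities, for example D-optimality on the Fisher information. The sketch below is a generic, hedged illustration with a hypothetical one-parameter model and two candidate dose profiles; it is not the authors' numerical method.

    ```python
    import numpy as np

    def d_optimality(sensitivities, sigma=1.0):
        """Negative log-determinant of the Fisher information built from the
        sensitivity matrix S (rows: observations, cols: parameters).
        Smaller values mean tighter predicted parameter confidence regions."""
        fim = sensitivities.T @ sensitivities / sigma**2
        sign, logdet = np.linalg.slogdet(fim)
        return -logdet if sign > 0 else np.inf

    def model(t, k, dose):
        """Hypothetical single-exponential response to an input dose."""
        return dose * (1 - np.exp(-k * t))

    def sensitivity_matrix(t, k, dose, eps=1e-6):
        """Finite-difference sensitivity of the model output w.r.t. k."""
        s = (model(t, k + eps, dose) - model(t, k - eps, dose)) / (2 * eps)
        return s[:, None]

    t = np.linspace(0, 10, 20)
    candidates = {"low dose": 1.0, "high dose": 5.0}
    scores = {name: d_optimality(sensitivity_matrix(t, k=0.5, dose=d))
              for name, d in candidates.items()}
    print(min(scores, key=scores.get))   # the design with the smallest criterion wins
    ```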

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutqvist, Jonny; Blanco Martin, Laura; Mukhopadhyay, Sumit

    In this report, we present FY2014 progress by Lawrence Berkeley National Laboratory (LBNL) related to modeling of coupled thermal-hydrological-mechanical-chemical (THMC) processes in salt and their effect on brine migration at high temperatures. LBNL's work on the modeling of coupled THMC processes in salt was initiated in FY2012, focusing on exploring and demonstrating the capabilities of an existing LBNL modeling tool (TOUGH-FLAC) for simulating temperature-driven coupled flow and geomechanical processes in salt. This work includes development related to, and implementation of, essential capabilities, as well as testing the model against relevant information and published experimental data related to the fate and transport of water. We provide more details on the FY2014 work, first presenting updated tools and improvements made to the TOUGH-FLAC simulator, and the use of this updated tool in a new model simulation of long-term THM behavior within a generic repository in a salt formation. This is followed by the description of current benchmarking and validation efforts, including the TSDE experiment. We then present the current status in the development of constitutive relationships and the dual-continuum model for brine migration. We conclude with an outlook for FY2015, which will be focused largely on model validation against field experiments and on the use of the model for design studies related to a proposed heater experiment.

  17. A new digitized reverse correction method for hypoid gears based on a one-dimensional probe

    NASA Astrophysics Data System (ADS)

    Li, Tianxing; Li, Jubo; Deng, Xiaozhong; Yang, Jianjun; Li, Genggeng; Ma, Wensuo

    2017-12-01

    In order to improve the tooth surface geometric accuracy and transmission quality of hypoid gears, a new digitized reverse correction method is proposed based on the measurement data from a one-dimensional probe. The minimization of tooth surface geometrical deviations is realized from the perspective of mathematical analysis and reverse engineering. Combining the analysis of complex tooth surface generation principles and the measurement mechanism of one-dimensional probes, the mathematical relationship between the theoretical designed tooth surface, the actual machined tooth surface and the deviation tooth surface is established, the mapping relation between machine-tool settings and tooth surface deviations is derived, and the essential connection between the accurate calculation of tooth surface deviations and the reverse correction method of machine-tool settings is revealed. Furthermore, a reverse correction model of machine-tool settings is built, a reverse correction strategy is planned, and the minimization of tooth surface deviations is achieved by means of the method of numerical iterative reverse solution. On this basis, a digitized reverse correction system for hypoid gears is developed by the organic combination of numerical control generation, accurate measurement, computer numerical processing, and digitized correction. Finally, the correctness and practicability of the digitized reverse correction method are proved through a reverse correction experiment. The experimental results show that the tooth surface geometric deviations meet the engineering requirements after two trial cuts and one correction.
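
    The reverse correction loop described above, in which machine-tool settings are adjusted until the measured tooth surface deviations are minimized, can be caricatured as a damped least-squares update driven by a sensitivity matrix. The sketch below is a deliberately simplified, hypothetical illustration of that iteration, not the paper's tooth-surface model or correction system.

    ```python
    import numpy as np

    def reverse_correct(settings, measure_deviation, sensitivity, n_iter=3, damping=1e-3):
        """Iteratively update machine-tool settings so that the predicted tooth
        surface deviations shrink (damped least-squares / Gauss-Newton sketch).
        measure_deviation(settings) -> deviation vector at the probe grid points;
        sensitivity(settings)       -> Jacobian d(deviation)/d(settings)."""
        s = np.asarray(settings, dtype=float)
        for _ in range(n_iter):
            dev = measure_deviation(s)
            J = sensitivity(s)
            # Damped normal equations: (J^T J + lambda*I) ds = -J^T dev
            A = J.T @ J + damping * np.eye(J.shape[1])
            s = s + np.linalg.solve(A, -J.T @ dev)
        return s

    # Hypothetical linear test case: deviation = J0 @ (s - s_true)
    rng = np.random.default_rng(4)
    J0 = rng.normal(size=(30, 5))
    s_true = np.array([0.2, -0.1, 0.05, 0.3, -0.25])
    corrected = reverse_correct(np.zeros(5),
                                lambda s: J0 @ (s - s_true),
                                lambda s: J0)
    print(np.round(corrected - s_true, 4))   # near zero after a few iterations
    ```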

  18. Turbomachinery

    NASA Technical Reports Server (NTRS)

    Simoneau, Robert J.; Strazisar, Anthony J.; Sockol, Peter M.; Reid, Lonnie; Adamczyk, John J.

    1987-01-01

    The discipline research in turbomachinery, which is directed toward building the tools needed to understand such a complex flow phenomenon, is based on the fact that flow in turbomachinery is fundamentally unsteady, or time dependent. Success in building a reliable inventory of analytic and experimental tools will depend on how time and time-averages are treated, as well as on how space and space-averages are treated. The raw tools at our disposal (both experimental and computational) are truly powerful and their numbers are growing at a staggering pace. As a result of this power, a case can be made that information is outstripping understanding. The challenge is to develop a set of computational and experimental tools which genuinely increase understanding of the fluid flow and heat transfer in a turbomachine. Viewgraphs outline a philosophy based on working through a stairstep hierarchy of mathematical and experimental complexity to build a system of tools which enables one to aggressively design the turbomachinery of the next century. Examples of the types of computational and experimental tools under current development at Lewis, with progress to date, are examined. The examples include work in both the time-resolved and time-averaged domains. Finally, an attempt is made to identify the proper place for Lewis in this continuum of research.

  19. Essential oil of Siparuna guianensis as an alternative tool for improved lepidopteran control and resistance management practices.

    PubMed

    Lourenço, Adriano M; Haddi, Khalid; Ribeiro, Bergman M; Corrêia, Roberto F T; Tomé, Hudson V V; Santos-Amaya, Oscar; Pereira, Eliseu J G; Guedes, Raul N C; Santos, Gil R; Oliveira, Eugênio E; Aguiar, Raimundo W S

    2018-05-08

    Although the cultivation of transgenic plants expressing toxins of Bacillus thuringiensis (Bt) represents a successful pest management strategy, the rapid evolution of resistance to Bt plants in several lepidopteran pests has threatened the sustainability of this practice. By exhibiting a favorable safety profile and allowing integration with pest management initiatives, plant essential oils have become relevant pest control alternatives. Here, we assessed the potential of essential oils extracted from a Neotropical plant, Siparuna guianensis Aublet, for improving the control and resistance management of key lepidopteran pests (i.e., Spodoptera frugiperda and Anticarsia gemmatalis). The essential oil exhibited high toxicity against both lepidopteran pest species (including an S. frugiperda strain resistant to Cry1A.105 and Cry2Ab Bt toxins). This high insecticidal activity was associated with necrotic and apoptotic effects revealed by in vitro assays with lepidopteran (but not human) cell lines. Furthermore, deficits in reproduction (e.g., egg-laying deterrence and decreased egg viability), larval development (e.g., feeding inhibition) and locomotion (e.g., individual and grouped larvae walking activities) were recorded for lepidopterans sublethally exposed to the essential oil. Thus, by similarly and efficiently controlling lepidopteran strains susceptible and resistant to Bt toxins, the S. guianensis essential oil represents a promising management tool against key lepidopteran pests.

  20. In Vitro Experimental Model for the Long-Term Analysis of Cellular Dynamics During Bronchial Tree Development from Lung Epithelial Cells

    PubMed Central

    Maruta, Naomichi; Marumoto, Moegi

    2017-01-01

    Lung branching morphogenesis has been studied for decades, but the underlying developmental mechanisms are still not fully understood. Cellular movements dynamically change during the branching process, but it is difficult to observe long-term cellular dynamics by in vivo or tissue culture experiments. Therefore, developing an in vitro experimental model of the bronchial tree would provide an essential tool for developmental biology, pathology, and systems biology. In this study, we succeeded in reconstructing a bronchial tree in vitro by using primary human bronchial epithelial cells. A high concentration gradient of bronchial epithelial cells was required for branching initiation, whereas homogeneously distributed endothelial cells induced the formation of successive branches. Subsequently, the branches grew in size to the order of a millimeter. The developed model contains only two types of cells and it facilitates the analysis of lung branching morphogenesis. By taking advantage of our experimental model, we carried out long-term time-lapse observations, which revealed self-assembly, collective migration with leader cells, rotational motion, and spiral motion of epithelial cells in each developmental event. Mathematical simulation was also carried out to analyze the self-assembly process and it revealed simple rules that govern cellular dynamics. Our experimental model has provided many new insights into lung development and it has the potential to accelerate the study of developmental mechanisms, pattern formation, left–right asymmetry, and disease pathogenesis of the human lung. PMID:28471293

  1. Effects of a work-based critical reflection program for novice nurses.

    PubMed

    Kim, Yeon Hee; Min, Ja; Kim, Soon Hee; Shin, Sujin

    2018-02-27

    Critical reflection is effective in improving students' communication abilities and confidence. The aim of this study was to evaluate the effectiveness of a work-based critical reflection program to enhance novice nurses' clinical critical-thinking abilities, communication competency, and job performance. The present study used a quasi-experimental design. From October 2014 to August 2015, we collected data from 44 novice nurses working in an advanced general hospital in S city in Korea. Nurses in the experimental group participated in a critical reflection program for six months. Outcome variables were clinical critical-thinking skills, communication abilities, and job performance. A non-parametric Mann-Whitney U-test and a Wilcoxon rank sum test were selected to evaluate differences in mean ranks and to assess the null hypothesis that the medians were equal across the groups. The results showed that the clinical critical-thinking skills of those in the experimental group improved significantly (p = 0.003). The difference in mean ranks of communication ability between the two groups was statistically significant (p = 0.028). Job performance improved significantly in both the experimental and control groups, so there was no statistically significant difference between them (p = 0.294). We therefore suggest that a critical reflection program be considered an essential tool for improving critical thinking and communication abilities among novice nurses who need to adapt to the clinical environment as quickly as possible. Further, we suggest conducting research into critical reflection programs among larger and more diverse samples.
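
    For reference, the between-group comparison described above can be reproduced in outline with a standard Mann-Whitney U test; the change scores below are hypothetical, not the study's data.

    ```python
    from scipy.stats import mannwhitneyu

    # Hypothetical change scores in clinical critical-thinking ability
    experimental = [4, 6, 5, 7, 3, 6, 5, 8, 4, 6]
    control      = [2, 3, 1, 4, 2, 3, 2, 1, 3, 2]

    u_stat, p_value = mannwhitneyu(experimental, control, alternative="two-sided")
    print(f"U = {u_stat}, p = {p_value:.4f}")
    ```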

  2. A Comparison of Parameter Study Creation and Job Submission Tools

    NASA Technical Reports Server (NTRS)

    DeVivo, Adrian; Yarrow, Maurice; McCann, Karen M.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We consider the differences between the available general-purpose parameter study and job submission tools. These tools necessarily share many features, but frequently with differences in the way they are designed and implemented. For this class of features, we will only briefly outline the essential differences. However, we will focus on the unique features which distinguish the ILab parameter study and job submission tool from other packages, and which make the ILab tool easier and more suitable for use in our research and engineering environment.

  3. Randomized test of an implementation intention-based tool to reduce stress-induced eating.

    PubMed

    O'Connor, Daryl B; Armitage, Christopher J; Ferguson, Eamonn

    2015-06-01

    Stress may indirectly contribute to disease (e.g. cardiovascular disease, cancer) by producing deleterious changes to diet. The purpose of this study was to test the effectiveness of a stress management support (SMS) tool to reduce stress-related unhealthy snacking and to promote stress-related healthy snacking. Participants were randomized to complete a SMS tool with instruction to link stressful situations with healthy snack alternatives (experimental) or a SMS tool without a linking instruction (control). On-line daily reports of stressors and snacking were completed for 7 days. Daily stressors were associated with unhealthy snack consumption in the control condition but not in the experimental condition. Participants highly motivated towards healthy eating consumed a greater number of healthy snacks in the experimental condition on stressful days compared to participants in the experimental condition with low and mean levels of motivation. This tool is an effective, theory driven, intervention that helps to protect against stress-induced high-calorie snack consumption.

  4. [Blended-learning in psychosomatics and psychotherapy - Increasing the satisfaction and knowledge of students with a web-based e-learning tool].

    PubMed

    Ferber, Julia; Schneider, Gudrun; Havlik, Linda; Heuft, Gereon; Friederichs, Hendrik; Schrewe, Franz-Bernhard; Schulz-Steinel, Andrea; Burgmer, Markus

    2014-01-01

    To improve the synergy of established methods of teaching, the Department of Psychosomatics and Psychotherapy, University Hospital Münster, developed a web-based elearning tool using video clips of standardized patients. The effect of this blended-learning approach was evaluated. A multiple-choice test was performed by a naive (without the e-learning tool) and an experimental (with the tool) cohort of medical students to test the groups' expertise in psychosomatics. In addition, participants' satisfaction with the new tool was evaluated (numeric rating scale of 0-10). The experimental cohort was more satisfied with the curriculum and more interested in psychosomatics. Furthermore, the experimental cohort scored significantly better in the multiple-choice test. The new tool proved to be an important addition to the classical curriculum as a blended-learning approach which improves students' satisfaction and knowledge in psychosomatics.

  5. Energy Systems Integration News | Energy Systems Integration Facility |

    Science.gov Websites

    … answer that question by examining the technical, infrastructure, economic, and policy barriers to greater … intra-hour, inter-hour, seasonal, and inter-annual variability of solar resources … a powerful tool that provides essential information to policymakers, financiers, project developers, and …

  6. Modeling synthetic lethality

    PubMed Central

    Le Meur, Nolwenn; Gentleman, Robert

    2008-01-01

    Background: Synthetic lethality defines a genetic interaction where the combination of mutations in two or more genes leads to cell death. The implications of synthetic lethal screens have been discussed in the context of drug development as synthetic lethal pairs could be used to selectively kill cancer cells, but leave normal cells relatively unharmed. A challenge is to assess genome-wide experimental data and integrate the results to better understand the underlying biological processes. We propose statistical and computational tools that can be used to find relationships between synthetic lethality and cellular organizational units. Results: In Saccharomyces cerevisiae, we identified multi-protein complexes and pairs of multi-protein complexes that share an unusually high number of synthetic genetic interactions. As previously predicted, we found that synthetic lethality can arise from subunits of an essential multi-protein complex or between pairs of multi-protein complexes. Finally, using multi-protein complexes allowed us to take into account the pleiotropic nature of the gene products. Conclusions: Modeling synthetic lethality using current estimates of the yeast interactome is an efficient approach to disentangle some of the complex molecular interactions that drive a cell. Our model in conjunction with applied statistical methods and computational methods provides new tools to better characterize synthetic genetic interactions. PMID:18789146
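
    A common way to flag complexes (or complex pairs) that share "an unusually high number" of synthetic genetic interactions is a hypergeometric enrichment test. The sketch below illustrates that idea with made-up counts; it is not the authors' statistical pipeline.

    ```python
    from scipy.stats import hypergeom

    def sl_enrichment_p(n_possible_pairs, n_sl_total, n_pairs_in_complex, n_sl_in_complex):
        """P-value that a complex (or complex pair) contains at least
        n_sl_in_complex synthetic-lethal gene pairs, modelling its
        n_pairs_in_complex pairs as drawn from all tested pairs."""
        return hypergeom.sf(n_sl_in_complex - 1, n_possible_pairs,
                            n_sl_total, n_pairs_in_complex)

    # Hypothetical counts: 10,000 tested gene pairs, 400 synthetic lethal overall,
    # a complex contributing 45 pairs of which 12 are synthetic lethal.
    print(sl_enrichment_p(10_000, 400, 45, 12))
    ```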

  7. Know Your Enemy: Successful Bioinformatic Approaches to Predict Functional RNA Structures in Viral RNAs.

    PubMed

    Lim, Chun Shen; Brown, Chris M

    2017-01-01

    Structured RNA elements may control virus replication, transcription and translation, and their distinct features are being exploited by novel antiviral strategies. Viral RNA elements continue to be discovered using combinations of experimental and computational analyses. However, the wealth of sequence data, notably from deep viral RNA sequencing, viromes, and metagenomes, necessitates computational approaches being used as an essential discovery tool. In this review, we describe practical approaches being used to discover functional RNA elements in viral genomes. In addition to success stories in new and emerging viruses, these approaches have revealed some surprising new features of well-studied viruses e.g., human immunodeficiency virus, hepatitis C virus, influenza, and dengue viruses. Some notable discoveries were facilitated by new comparative analyses of diverse viral genome alignments. Importantly, comparative approaches for finding RNA elements embedded in coding and non-coding regions differ. With the exponential growth of computer power we have progressed from stem-loop prediction on single sequences to cutting edge 3D prediction, and from command line to user friendly web interfaces. Despite these advances, many powerful, user friendly prediction tools and resources are underutilized by the virology community.

  8. Analysis of Protein Phosphorylation and Its Functional Impact on Protein-Protein Interactions via Text Mining of the Scientific Literature.

    PubMed

    Wang, Qinghua; Ross, Karen E; Huang, Hongzhan; Ren, Jia; Li, Gang; Vijay-Shanker, K; Wu, Cathy H; Arighi, Cecilia N

    2017-01-01

    Post-translational modifications (PTMs) are one of the main contributors to the diversity of proteoforms in the proteomic landscape. In particular, protein phosphorylation represents an essential regulatory mechanism that plays a role in many biological processes. Protein kinases, the enzymes catalyzing this reaction, are key participants in metabolic and signaling pathways. Their activation or inactivation dictate downstream events: what substrates are modified and their subsequent impact (e.g., activation state, localization, protein-protein interactions (PPIs)). The biomedical literature continues to be the main source of evidence for experimental information about protein phosphorylation. Automatic methods to bring together phosphorylation events and phosphorylation-dependent PPIs can help to summarize the current knowledge and to expose hidden connections. In this chapter, we demonstrate two text mining tools, RLIMS-P and eFIP, for the retrieval and extraction of kinase-substrate-site data and phosphorylation-dependent PPIs from the literature. These tools offer several advantages over a literature search in PubMed as their results are specific for phosphorylation. RLIMS-P and eFIP results can be sorted, organized, and viewed in multiple ways to answer relevant biological questions, and the protein mentions are linked to UniProt identifiers.

  9. Know Your Enemy: Successful Bioinformatic Approaches to Predict Functional RNA Structures in Viral RNAs

    PubMed Central

    Lim, Chun Shen; Brown, Chris M.

    2018-01-01

    Structured RNA elements may control virus replication, transcription and translation, and their distinct features are being exploited by novel antiviral strategies. Viral RNA elements continue to be discovered using combinations of experimental and computational analyses. However, the wealth of sequence data, notably from deep viral RNA sequencing, viromes, and metagenomes, necessitates computational approaches being used as an essential discovery tool. In this review, we describe practical approaches being used to discover functional RNA elements in viral genomes. In addition to success stories in new and emerging viruses, these approaches have revealed some surprising new features of well-studied viruses e.g., human immunodeficiency virus, hepatitis C virus, influenza, and dengue viruses. Some notable discoveries were facilitated by new comparative analyses of diverse viral genome alignments. Importantly, comparative approaches for finding RNA elements embedded in coding and non-coding regions differ. With the exponential growth of computer power we have progressed from stem-loop prediction on single sequences to cutting edge 3D prediction, and from command line to user friendly web interfaces. Despite these advances, many powerful, user friendly prediction tools and resources are underutilized by the virology community. PMID:29354101

  10. Modelling the interactions between animal venom peptides and membrane proteins.

    PubMed

    Hung, Andrew; Kuyucak, Serdar; Schroeder, Christina I; Kaas, Quentin

    2017-12-01

    The active components of animal venoms are mostly peptide toxins, which typically target ion channels and receptors of both the central and peripheral nervous system, interfering with action potential conduction and/or synaptic transmission. The high degree of sequence conservation of their molecular targets makes a range of these toxins active at human receptors. The high selectivity and potency displayed by some of these toxins have prompted their use as pharmacological tools as well as drugs or drug leads. Molecular modelling has played an essential role in increasing our molecular-level understanding of the activity and specificity of animal toxins, as well as engineering them for biotechnological and pharmaceutical applications. This review focuses on the biological insights gained from computational and experimental studies of animal venom toxins interacting with membranes and ion channels. A host of recent X-ray crystallography and electron-microscopy structures of the toxin targets has contributed to a dramatic increase in the accuracy of the molecular models of toxin binding modes greatly advancing this exciting field of study. This article is part of the Special Issue entitled 'Venom-derived Peptides as Pharmacological Tools.' Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Mathematical modeling of physiological systems: an essential tool for discovery.

    PubMed

    Glynn, Patric; Unudurthi, Sathya D; Hund, Thomas J

    2014-08-28

    Mathematical models are invaluable tools for understanding the relationships between components of a complex system. In the biological context, mathematical models help us understand the complex web of interrelations between various components (DNA, proteins, enzymes, signaling molecules etc.) in a biological system, gain better understanding of the system as a whole, and in turn predict its behavior in an altered state (e.g. disease). Mathematical modeling has enhanced our understanding of multiple complex biological processes like enzyme kinetics, metabolic networks, signal transduction pathways, gene regulatory networks, and electrophysiology. With recent advances in high throughput data generation methods, computational techniques and mathematical modeling have become even more central to the study of biological systems. In this review, we provide a brief history and highlight some of the important applications of modeling in biological systems with an emphasis on the study of excitable cells. We conclude with a discussion about opportunities and challenges for mathematical modeling going forward. In a larger sense, the review is designed to help answer a simple but important question that theoreticians frequently face from interested but skeptical colleagues on the experimental side: "What is the value of a model?" Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Essential Features of Responsible Governance of Agricultural Biotechnology

    PubMed Central

    Hartley, Sarah; Wickson, Fern

    2016-01-01

    Agricultural biotechnology continues to generate considerable controversy. We argue that to address this controversy, serious changes to governance are needed. The new wave of genomic tools and products (e.g., CRISPR, gene drives, RNAi, synthetic biology, and genetically modified [GM] insects and fish), provide a particularly useful opportunity to reflect on and revise agricultural biotechnology governance. In response, we present five essential features to advance more socially responsible forms of governance. In presenting these, we hope to stimulate further debate and action towards improved forms of governance, particularly as these new genomic tools and products continue to emerge. PMID:27144921

  13. Essential Features of Responsible Governance of Agricultural Biotechnology.

    PubMed

    Hartley, Sarah; Gillund, Frøydis; van Hove, Lilian; Wickson, Fern

    2016-05-01

    Agricultural biotechnology continues to generate considerable controversy. We argue that to address this controversy, serious changes to governance are needed. The new wave of genomic tools and products (e.g., CRISPR, gene drives, RNAi, synthetic biology, and genetically modified [GM] insects and fish), provide a particularly useful opportunity to reflect on and revise agricultural biotechnology governance. In response, we present five essential features to advance more socially responsible forms of governance. In presenting these, we hope to stimulate further debate and action towards improved forms of governance, particularly as these new genomic tools and products continue to emerge.

  14. A Training Tool and Methodology to Allow Concurrent Multidisciplinary Experimental Projects in Engineering Education

    ERIC Educational Resources Information Center

    Maseda, F. J.; Martija, I.; Martija, I.

    2012-01-01

    This paper describes a novel Electrical Machine and Power Electronic Training Tool (EM&PE-TT), a methodology for using it, and associated experimental educational activities. The training tool is implemented by recreating a whole power electronics system, divided into modular blocks. This process is similar to that applied when…

  15. A dorsolateral prefrontal cortex semi-automatic segmenter

    NASA Astrophysics Data System (ADS)

    Al-Hakim, Ramsey; Fallon, James; Nain, Delphine; Melonakos, John; Tannenbaum, Allen

    2006-03-01

    Structural, functional, and clinical studies in schizophrenia have, for several decades, consistently implicated dysfunction of the prefrontal cortex in the etiology of the disease. Functional and structural imaging studies, combined with clinical, psychometric, and genetic analyses in schizophrenia, have confirmed the key roles played by the prefrontal cortex and closely linked "prefrontal system" structures such as the striatum, amygdala, mediodorsal thalamus, substantia nigra-ventral tegmental area, and anterior cingulate cortices. The nodal structure of the prefrontal system circuit is the dorsal lateral prefrontal cortex (DLPFC), or Brodmann area 46, which also appears to be the most commonly studied and cited brain area with respect to schizophrenia. 1, 2, 3, 4 In 1986, Weinberger et al. tied cerebral blood flow in the DLPFC to schizophrenia.1 In 2001, Perlstein et al. demonstrated that DLPFC activation is essential for working memory tasks commonly deficient in schizophrenia. 2 More recently, groups have linked morphological changes due to gene deletion and increased DLPFC glutamate concentration to schizophrenia. 3, 4 Despite the experimental and clinical focus on the DLPFC in structural and functional imaging, the variability of the location of this area, differences in opinion on exactly what constitutes DLPFC, and inherent difficulties in segmenting this highly convoluted cortical region have contributed to a lack of widely used standards for manual or semi-automated segmentation programs. Given these implications, we developed a semi-automatic tool to segment the DLPFC from brain MRI scans in a reproducible way to conduct further morphological and statistical studies. The segmenter is based on expert neuroanatomist rules (Fallon-Kindermann rules), inspired by cytoarchitectonic data and reconstructions presented by Rajkowska and Goldman-Rakic. 5 It is semi-automated to provide essential user interactivity. We present our results and provide details on our DLPFC open-source tool.

  16. Who Are the Top Contributors in a MOOC? Relating Participants' Performance and Contributions

    ERIC Educational Resources Information Center

    Alario-Hoyos, C.; Muñoz-Merino, P. J.; Pérez-Sanagustín, M.; Delgado Kloos, C.; Parada Gelvez, H. A.

    2016-01-01

    The role of social tools in massive open online courses (MOOCs) is essential as they connect participants. Of all the participants in a MOOC, top contributors are the ones who more actively contribute via social tools. This article analyses and reports empirical data from five different social tools pertaining to an actual MOOC to characterize…

  17. Experimental validation of the RATE tool for inferring HLA restrictions of T cell epitopes.

    PubMed

    Paul, Sinu; Arlehamn, Cecilia S Lindestam; Schulten, Veronique; Westernberg, Luise; Sidney, John; Peters, Bjoern; Sette, Alessandro

    2017-06-21

    The RATE tool was recently developed to computationally infer the HLA restriction of given epitopes from immune response data of HLA typed subjects without additional cumbersome experimentation. Here, RATE was validated using experimentally defined restriction data from a set of 191 tuberculosis-derived epitopes and 63 healthy individuals with MTB infection from the Western Cape Region of South Africa. Using this experimental dataset, the parameters utilized by the RATE tool to infer restriction were optimized, which included relative frequency (RF) of the subjects responding to a given epitope and expressing a given allele as compared to the general test population and the associated p-value in a Fisher's exact test. We also examined the potential for further optimization based on the predicted binding affinity of epitopes to potential restricting HLA alleles, and the absolute number of individuals expressing a given allele and responding to the specific epitope. Different statistical measures, including Matthew's correlation coefficient, accuracy, sensitivity and specificity were used to evaluate performance of RATE as a function of these criteria. Based on our results we recommend selection of HLA restrictions with cutoffs of p-value < 0.01 and RF ≥ 1.3. The usefulness of the tool was demonstrated by inferring new HLA restrictions for epitope sets where restrictions could not be experimentally determined due to lack of necessary cell lines and for an additional data set related to recognition of pollen derived epitopes from allergic patients. Experimental data sets were used to validate RATE tool and the parameters used by the RATE tool to infer restriction were optimized. New HLA restrictions were identified using the optimized RATE tool.
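
    The two quantities to which the recommended cutoffs apply, the Fisher's exact p-value and the relative frequency RF, can be computed for a single epitope/allele combination as in the sketch below. This is an illustrative reimplementation under stated assumptions (in particular the exact RF definition), not the RATE tool itself; the counts are hypothetical.

    ```python
    from scipy.stats import fisher_exact

    def rate_like_scores(responders_with_allele, responders_without_allele,
                         nonresponders_with_allele, nonresponders_without_allele):
        """Fisher's exact p-value and a relative-frequency (RF) score for one
        epitope/allele combination. RF here is the allele frequency among
        responders divided by its frequency in the whole cohort (an assumption;
        the published tool may define it slightly differently)."""
        table = [[responders_with_allele, responders_without_allele],
                 [nonresponders_with_allele, nonresponders_without_allele]]
        _, p = fisher_exact(table, alternative="greater")
        n_responders = responders_with_allele + responders_without_allele
        n_total = sum(sum(row) for row in table)
        n_allele = responders_with_allele + nonresponders_with_allele
        rf = (responders_with_allele / n_responders) / (n_allele / n_total)
        return p, rf

    p, rf = rate_like_scores(9, 6, 10, 38)    # hypothetical 2x2 counts
    print(p < 0.01 and rf >= 1.3)             # apply the recommended cutoffs
    ```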

  18. A computational approach for predicting off-target toxicity of antiviral ribonucleoside analogues to mitochondrial RNA polymerase.

    PubMed

    Freedman, Holly; Winter, Philip; Tuszynski, Jack; Tyrrell, D Lorne; Houghton, Michael

    2018-06-22

    In the development of antiviral drugs that target viral RNA-dependent RNA polymerases, off-target toxicity caused by the inhibition of the human mitochondrial RNA polymerase (POLRMT) is a major liability. Therefore, it is essential that all new ribonucleoside analogue drugs be accurately screened for POLRMT inhibition. A computational tool that can accurately predict NTP binding to POLRMT could assist in evaluating any potential toxicity and in designing possible salvaging strategies. Using the available crystal structure of POLRMT bound to an RNA transcript, here we created a model of POLRMT with an NTP molecule bound in the active site. Furthermore, we implemented a computational screening procedure that determines the relative binding free energy of an NTP analogue to POLRMT by free energy perturbation (FEP), i.e. a simulation in which the natural NTP molecule is slowly transformed into the analogue and back. In each direction, the transformation was performed over 40 ns of simulation on our IBM Blue Gene Q supercomputer. This procedure was validated across a panel of drugs for which experimental dissociation constants were available, showing that NTP relative binding free energies could be predicted to within 0.97 kcal/mol of the experimental values on average. These results demonstrate for the first time that free-energy simulation can be a useful tool for predicting binding affinities of NTP analogues to a polymerase. We expect that our model, together with similar models of viral polymerases, will be very useful in the screening and future design of NTP inhibitors of viral polymerases that have no mitochondrial toxicity. © 2018 Freedman et al.
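
    The core estimator behind one leg of such an FEP calculation is exponential averaging of sampled energy differences (the Zwanzig relation); the relative binding free energy is then the difference between the protein-bound and solvent legs. The sketch below shows that estimator with synthetic energy samples; it is a hedged illustration, not the authors' Blue Gene workflow, and the kT value and sample distributions are assumptions.

    ```python
    import numpy as np

    KT = 0.593  # kcal/mol at roughly 298 K

    def zwanzig(delta_u, kT=KT):
        """Zwanzig (exponential-averaging) free energy estimate from samples of
        the energy difference delta_u = U_target - U_reference (kcal/mol)."""
        delta_u = np.asarray(delta_u)
        # log-sum-exp for numerical stability
        return -kT * (np.logaddexp.reduce(-delta_u / kT) - np.log(len(delta_u)))

    def fep_both_directions(du_forward, du_reverse, kT=KT):
        """Average the forward estimate and the negated reverse estimate; the
        gap between them (hysteresis) is a simple consistency check."""
        dG_fwd = zwanzig(du_forward, kT)
        dG_rev = -zwanzig(du_reverse, kT)
        return 0.5 * (dG_fwd + dG_rev), abs(dG_fwd - dG_rev)

    rng = np.random.default_rng(1)
    # Hypothetical per-frame energy differences from forward/reverse runs
    dG, hysteresis = fep_both_directions(rng.normal(1.2, 0.8, 4000),
                                         rng.normal(-1.2, 0.8, 4000))
    print(round(dG, 2), round(hysteresis, 2))
    ```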

  19. Tool Removes Coil-Spring Thread Inserts

    NASA Technical Reports Server (NTRS)

    Collins, Gerald J., Jr.; Swenson, Gary J.; Mcclellan, J. Scott

    1991-01-01

    Tool removes coil-spring thread inserts from threaded holes. Threads into hole, pries insert loose, grips insert, then pulls insert to thread it out of hole. Effects essentially reverse of insertion process to ease removal and avoid further damage to threaded inner surface of hole.

  20. Heart rate variability in normal and pathological sleep.

    PubMed

    Tobaldini, Eleonora; Nobili, Lino; Strada, Silvia; Casali, Karina R; Braghiroli, Alberto; Montano, Nicola

    2013-10-16

    Sleep is a physiological process involving different biological systems, from the molecular to the organ level; its integrity is essential for maintaining health and homeostasis in human beings. Although in the past sleep has been considered a state of quiet, experimental and clinical evidence suggests a noteworthy activation of different biological systems during sleep. A key role is played by the autonomic nervous system (ANS), whose modulation regulates cardiovascular functions during sleep onset and the different sleep stages. Therefore, interest in the evaluation of autonomic cardiovascular control in health and disease is growing, by means of linear and non-linear heart rate variability (HRV) analyses. The application of classical tools for ANS analysis, such as HRV during physiological sleep, showed that the rapid eye movement (REM) stage is characterized by a likely sympathetic predominance associated with a vagal withdrawal, while the opposite trend is observed during non-REM sleep. More recently, the use of non-linear tools, such as entropy-derived indices, has provided new insight into cardiac autonomic regulation, revealing for instance changes in cardiovascular complexity during REM sleep and supporting the hypothesis of a reduced capability of the cardiovascular system to deal with stress challenges. Interestingly, different HRV tools have been applied to characterize autonomic cardiac control in different pathological conditions, from neurological sleep disorders to sleep-disordered breathing (SDB). In summary, linear and non-linear analyses of HRV are reliable approaches to assess changes in autonomic cardiac modulation during sleep, both in health and in disease. The use of these tools could provide important information of clinical and prognostic relevance.
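
    One widely used entropy-derived HRV index is sample entropy. The sketch below computes it for a synthetic RR-interval series with the conventional parameter choices m = 2 and r = 0.2 x SD; these defaults and the test data are assumptions, and the cited studies may rely on other non-linear indices.

    ```python
    import numpy as np

    def sample_entropy(rr, m=2, r_frac=0.2):
        """Sample entropy of an RR-interval series (conventional m=2,
        tolerance r = 0.2 * SD). Lower values indicate more regular dynamics."""
        rr = np.asarray(rr, dtype=float)
        r = r_frac * rr.std()
        def count_matches(m):
            templates = np.array([rr[i:i + m] for i in range(len(rr) - m)])
            # Chebyshev distance between all template pairs (i < j)
            d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
            iu = np.triu_indices(len(templates), k=1)
            return np.sum(d[iu] <= r)
        b, a = count_matches(m), count_matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    rng = np.random.default_rng(2)
    rr = 0.8 + 0.05 * rng.standard_normal(300)   # hypothetical RR intervals (s)
    print(round(sample_entropy(rr), 2))
    ```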

  1. FOCUS: Essential Elements of Quality for State-Funded Preschool Programs

    ERIC Educational Resources Information Center

    New Mexico Public Education Department, 2016

    2016-01-01

    The "FOCUS: Essential Elements of Quality, New Mexico's Tiered Quality Rating and Improvement System (TQRIS)," provides early childhood program personnel with the criteria, tools, and resources they need to improve the quality of their program. These quality improvements focus on children's growth, development, and learning--so that each…

  2. Minimum Essential Requirements and Standards in Medical Education.

    ERIC Educational Resources Information Center

    Wojtczak, Andrzej; Schwarz, M. Roy

    2000-01-01

    Reviews the definition of standards in general, and proposes a definition of standards and global minimum essential requirements for use in medical education. Aims to serve as a tool for the improvement of quality and international comparisons of basic medical programs. Explains the IIME (Institute for International Medical Education) project…

  3. Essentials for the Teacher's Toolbox

    ERIC Educational Resources Information Center

    Uhler, Jennifer

    2012-01-01

    Every profession has a set of essential tools for carrying out its work. Airplane mechanics cannot repair engines without sophisticated diagnostics, wrenches, and pliers. Surgeons cannot operate without scalpels and clamps. In contrast, teaching has often been perceived as a profession requiring only students, chalk, and a blackboard in order for…

  4. Distillation time as tool for improved antimalarial activity and differential oil composition of cumin seed oil

    USDA-ARS?s Scientific Manuscript database

    A steam distillation extraction kinetics experiment was conducted to estimate essential oil yield, composition, antimalarial, and antioxidant capacity of cumin (Cuminum cyminum L.) seed (fruits). Furthermore, regression models were developed to predict essential oil yield and composition for a given...

  5. Help Seeking: Agentic Learners Initiating Feedback

    ERIC Educational Resources Information Center

    Fletcher, Anna Katarina

    2018-01-01

    Effective feedback is an essential tool for making learning explicit and an essential feature of classroom practice that promotes learner autonomy. Yet, it remains a pressing challenge for teachers to scaffold the active involvement of students as critical, reflective and autonomous learners who use feedback constructively. This paper seeks to…

  6. Experimental evaluation of tool run-out in micro milling

    NASA Astrophysics Data System (ADS)

    Attanasio, Aldo; Ceretti, Elisabetta

    2018-05-01

    This paper deals with the micro milling cutting process, focusing attention on tool run-out measurement. In fact, among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while at the same time reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting-edge phase measurement is based on force signal analysis. The developed procedure has been tested on data from micro milling experiments performed on a Ti6Al4V sample. The results showed that the developed procedure can be successfully used for tool run-out estimation.
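
    As a deliberately simplified illustration of the estimation idea summarized above (not the paper's analytical model), the sketch below reads a first-order run-out offset from the difference between the measured channel width and the nominal tool diameter, and estimates the cutting-edge phase angle from the spacing of force peaks within one spindle revolution. The width ≈ D + 2*offset relation for a two-flute tool and all numeric values are assumptions.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def runout_offset(channel_width, tool_diameter):
        """First-order run-out offset for a two-flute tool, assuming the widest
        flute path sets the slot width (width ~= D + 2*offset). Illustrative only."""
        return max(0.0, (channel_width - tool_diameter) / 2.0)

    def edge_phase_angle(force, fs, spindle_rpm):
        """Angular spacing between the two cutting-edge force peaks in one
        spindle revolution, from the time between consecutive force peaks."""
        rev_period = 60.0 / spindle_rpm
        peaks, _ = find_peaks(force, distance=int(0.25 * rev_period * fs))
        if len(peaks) < 2:
            return np.nan
        dt = (peaks[1] - peaks[0]) / fs
        return 360.0 * dt / rev_period    # degrees; 180 deg means no run-out effect

    print(runout_offset(channel_width=0.212, tool_diameter=0.200))   # mm -> 0.006 mm
    ```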

  7. Video analysis of projectile motion using tablet computers as experimental tools

    NASA Astrophysics Data System (ADS)

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.
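
    A minimal sketch of the kind of analysis such applications perform, assuming component-wise position-time data and a quadratic model for the vertical coordinate; the sample data below are invented.

    ```python
    import numpy as np

    # Hypothetical vertical-position samples (m) at the video frame times (s).
    t = np.linspace(0.0, 0.8, 25)
    rng = np.random.default_rng(0)
    y = 1.0 + 3.0 * t - 0.5 * 9.81 * t**2 + rng.normal(0.0, 0.005, t.size)

    # y(t) = y0 + v0*t - (g/2)*t**2  ->  ordinary quadratic least-squares fit.
    a2, a1, a0 = np.polyfit(t, y, deg=2)
    g_est, v0_est, y0_est = -2.0 * a2, a1, a0
    print(f"g = {g_est:.2f} m/s^2, v0 = {v0_est:.2f} m/s, y0 = {y0_est:.2f} m")
    ```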

  8. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  9. IMG-ABC: A Knowledge Base To Fuel Discovery of Biosynthetic Gene Clusters and Novel Secondary Metabolites.

    PubMed

    Hadjithomas, Michalis; Chen, I-Min Amy; Chu, Ken; Ratner, Anna; Palaniappan, Krishna; Szeto, Ernest; Huang, Jinghua; Reddy, T B K; Cimermančič, Peter; Fischbach, Michael A; Ivanova, Natalia N; Markowitz, Victor M; Kyrpides, Nikos C; Pati, Amrita

    2015-07-14

    In the discovery of secondary metabolites, analysis of sequence data is a promising exploration path that remains largely underutilized due to the lack of computational platforms that enable such a systematic approach on a large scale. In this work, we present IMG-ABC (https://img.jgi.doe.gov/abc), an atlas of biosynthetic gene clusters within the Integrated Microbial Genomes (IMG) system, which is aimed at harnessing the power of "big" genomic data for discovering small molecules. IMG-ABC relies on IMG's comprehensive integrated structural and functional genomic data for the analysis of biosynthetic gene clusters (BCs) and associated secondary metabolites (SMs). SMs and BCs serve as the two main classes of objects in IMG-ABC, each with a rich collection of attributes. A unique feature of IMG-ABC is the incorporation of both experimentally validated and computationally predicted BCs in genomes as well as metagenomes, thus identifying BCs in uncultured populations and rare taxa. We demonstrate the strength of IMG-ABC's focused integrated analysis tools in enabling the exploration of microbial secondary metabolism on a global scale, through the discovery of phenazine-producing clusters for the first time in Alphaproteobacteria. IMG-ABC strives to fill the long-existent void of resources for computational exploration of the secondary metabolism universe; its underlying scalable framework enables traversal of uncovered phylogenetic and chemical structure space, serving as a doorway to a new era in the discovery of novel molecules. IMG-ABC is the largest publicly available database of predicted and experimental biosynthetic gene clusters and the secondary metabolites they produce. The system also includes powerful search and analysis tools that are integrated with IMG's extensive genomic/metagenomic data and analysis tool kits. As new research on biosynthetic gene clusters and secondary metabolites is published and more genomes are sequenced, IMG-ABC will continue to expand, with the goal of becoming an essential component of any bioinformatic exploration of the secondary metabolism world. Copyright © 2015 Hadjithomas et al.

  10. Treatment of experimental pythiosis with essential oils of Origanum vulgare and Mentha piperita singly, in association and in combination with immunotherapy.

    PubMed

    Fonseca, Anelise O S; Pereira, Daniela I B; Botton, Sônia A; Pötter, Luciana; Sallis, Elisa S V; Júnior, Sérgio F V; Filho, Fernando S M; Zambrano, Cristina Gomes; Maroneze, Beatriz P; Valente, Julia S S; Baptista, Cristiane T; Braga, Caroline Q; Ben, Vanessa Dal; Meireles, Mario C A

    2015-08-05

    This study investigated the in vivo antimicrobial activity of the essential oils of Origanum vulgare and Mentha piperita singly, in association, and in combination with immunotherapy to treat experimental pythiosis. The disease was reproduced in 18 rabbits divided into six groups (n=3): group 1, control; group 2, treated with essential oil of Mentha piperita; group 3, treated with essential oil of Origanum vulgare; group 4, treated with a commercial immunotherapic; group 5, treated with an association of the oils of M. piperita and O. vulgare; and group 6, treated with a combination of both oils plus immunotherapy. The essential oils were added to a topical cream base, and lesions were treated daily for 45 days. The animals in groups 4 and 6 received a dose of the immunotherapeutic agent every 14 days. The results revealed that the evolution of lesions in groups 5 and 6 did not differ from one another but did differ from the other groups. The lesions of group 5 increased 3.16-fold at each measurement, while those of group 6 increased 1.83-fold, indicating that the smallest lesion growth occurred when the combination of therapies was used. A rabbit from group 5 showed clinical cure at day 20 of treatment. This is the first study to treat experimental pythiosis with essential oils from medicinal plants and a combination of therapies. It demonstrates that essential oils can be a viable alternative treatment for cutaneous pythiosis, particularly when used in association or combination with immunotherapy. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Utilizing Technology to Enhance Learning Environments: The Net Gen Student

    ERIC Educational Resources Information Center

    Muhammad, Amanda J.; Mitova, Mariana A.; Wooldridge, Deborah G.

    2016-01-01

    It is essential for instructors to understand the importance of classroom technology so they can prepare to use it to personalize students' learning. Strategies for choosing effective electronic tools are presented, followed by specific suggestions for designing enhanced personalized learning using electronic tools.

  12. New Texts, New Tools: An Argument for Media Literacy.

    ERIC Educational Resources Information Center

    McBrien, J. Lynn

    1999-01-01

    Adults cannot adequately prevent their children from observing media messages. Students are actually safer if they are educated about analyzing and assessing unsavory messages for themselves. Appropriate media-literacy pedagogy involves five essential elements: background, tools, deconstruction of media techniques, product evaluation, and original…

  13. FPI: FM Success through Analytics

    ERIC Educational Resources Information Center

    Hickling, Duane

    2013-01-01

    The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…

  14. SEA: a super-enhancer archive.

    PubMed

    Wei, Yanjun; Zhang, Shumei; Shang, Shipeng; Zhang, Bin; Li, Song; Wang, Xinyu; Wang, Fang; Su, Jianzhong; Wu, Qiong; Liu, Hongbo; Zhang, Yan

    2016-01-04

    Super-enhancers are large clusters of transcriptional enhancers regarded as having essential roles in driving the expression of genes that control cell identity during development and tumorigenesis. The construction of a genome-wide super-enhancer database is urgently needed to better understand super-enhancer-directed gene expression regulation for a given biological process. Here, we present a specifically designed web-accessible database, Super-Enhancer Archive (SEA, http://sea.edbc.org). SEA focuses on integrating super-enhancers in multiple species and annotating their potential roles in the regulation of cell identity gene expression. The current release of SEA incorporates 83 996 super-enhancers computationally or experimentally identified in 134 cell types/tissues/diseases, including human (75 439, three of which were experimentally identified), mouse (5879, five of which were experimentally identified), Drosophila melanogaster (1774) and Caenorhabditis elegans (904). To facilitate data extraction, SEA supports multiple search options, including species, genome location, gene name, cell type/tissue and super-enhancer name. The response provides detailed (epi)genetic information, incorporating cell type specificity, nearby genes, transcription factor binding sites, CRISPR/Cas9 target sites, evolutionary conservation, SNPs, H3K27ac, DNA methylation, gene expression and TF ChIP-seq data. Moreover, analytical tools and a genome browser were developed for users to explore super-enhancers and their roles in defining cell identity and disease processes in depth. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. Characterization of Unsteady Flow Structures Near Leading-Edge Slat. Part 1; PIV Measurements

    NASA Technical Reports Server (NTRS)

    Jenkins, Luther N.; Khorrami, Mehdi R.; Choudhari, Meelan

    2004-01-01

    A comprehensive computational and experimental study has been performed at the NASA Langley Research Center as part of the Quiet Aircraft Technology (QAT) Program to investigate the unsteady flow near a leading-edge slat of a two-dimensional, high-lift system. This paper focuses on the experimental effort conducted in the NASA Langley Basic Aerodynamics Research Tunnel (BART) where Particle Image Velocimetry (PIV) data was acquired in the slat cove and at the slat trailing edge of a three-element, high-lift model at 4, 6, and 8 degrees angle of attack and a freestream Mach Number of 0.17. Instantaneous velocities obtained from PIV images are used to obtain mean and fluctuating components of velocity and vorticity. The data show the recirculation in the cove, reattachment of the shear layer on the slat lower surface, and discrete vortical structures within the shear layer emanating from the slat cusp and slat trailing edge. Detailed measurements are used to examine the shear layer formation at the slat cusp, vortex shedding at the slat trailing edge, and convection of vortical structures through the slat gap. Selected results are discussed and compared with unsteady, Reynolds-Averaged Navier-Stokes (URANS) computations for the same configuration in a companion paper by Khorrami, Choudhari, and Jenkins (2004). The experimental dataset provides essential flow-field information for the validation of near-field inputs to noise prediction tools.

  16. Genetic drift and selection in many-allele range expansions.

    PubMed

    Weinstein, Bryan T; Lavrentovich, Maxim O; Möbius, Wolfram; Murray, Andrew W; Nelson, David R

    2017-12-01

    We experimentally and numerically investigate the evolutionary dynamics of four competing strains of E. coli with differing expansion velocities in radially expanding colonies. We compare experimental measurements of the average fraction, correlation functions between strains, and the relative rates of genetic domain wall annihilations and coalescences to simulations modeling the population as a one-dimensional ring of annihilating and coalescing random walkers with deterministic biases due to selection. The simulations reveal that the evolutionary dynamics can be collapsed onto master curves governed by three essential parameters: (1) an expansion length beyond which selection dominates over genetic drift; (2) a characteristic angular correlation describing the size of genetic domains; and (3) a dimensionless constant quantifying the interplay between a colony's curvature at the frontier and its selection length scale. We measure these parameters with a new technique that precisely measures small selective differences between spatially competing strains and show that our simulations accurately predict the dynamics without additional fitting. Our results suggest that the random walk model can act as a useful predictive tool for describing the evolutionary dynamics of range expansions composed of an arbitrary number of genotypes with different fitnesses.
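
    The ring-of-walkers picture described above can be illustrated with a toy stepping-stone simulation in which each site copies a fitness-weighted random neighbour, so that domain walls between strains wander, coalesce and annihilate; this is a sketch with invented parameters, not the authors' calibrated model.

    ```python
    import numpy as np

    def simulate_ring(n_sites=400, n_strains=4, generations=300, seed=1):
        """Toy stepping-stone model on a ring: each site copies one of its two
        neighbours, weighted by that neighbour's fitness. Domain walls between
        strains then wander, coalesce and annihilate, loosely mimicking the
        dynamics at an expanding colony frontier."""
        rng = np.random.default_rng(seed)
        fitness = 1.0 + 0.02 * np.arange(n_strains)      # assumed small fitness differences
        ring = rng.integers(n_strains, size=n_sites)     # random initial strain labels
        history = [ring.copy()]
        for _ in range(generations):
            left, right = np.roll(ring, 1), np.roll(ring, -1)
            w_left, w_right = fitness[left], fitness[right]
            take_left = rng.random(n_sites) < w_left / (w_left + w_right)
            ring = np.where(take_left, left, right)
            history.append(ring.copy())
        return np.array(history)

    h = simulate_ring()
    n_walls = [(row != np.roll(row, 1)).sum() for row in h]
    print("domain walls, start vs end:", n_walls[0], n_walls[-1])  # walls decay as domains coarsen
    ```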

  17. Control of the Pore Texture in Nanoporous Silicon via Chemical Dissolution.

    PubMed

    Secret, Emilie; Wu, Chia-Chen; Chaix, Arnaud; Galarneau, Anne; Gonzalez, Philippe; Cot, Didier; Sailor, Michael J; Jestin, Jacques; Zanotti, Jean-Marc; Cunin, Frédérique; Coasne, Benoit

    2015-07-28

    The surface and textural properties of porous silicon (pSi) control many of its physical properties essential to its performance in key applications such as optoelectronics, energy storage, luminescence, sensing, and drug delivery. Here, we combine experimental and theoretical tools to demonstrate that the surface roughness at the nanometer scale of pSi can be tuned in a controlled fashion using partial thermal oxidation followed by removal of the resulting silicon oxide layer with hydrofluoric acid (HF) solution. Such a process is shown to smooth the pSi surface by means of nitrogen adsorption, electron microscopy, and small-angle X-ray and neutron scattering. Statistical mechanics Monte Carlo simulations, which are consistent with the experimental data, support the interpretation that the pore surface is initially rough and that the oxidation/oxide removal procedure diminishes the surface roughness while increasing the pore diameter. As a specific example considered in this work, the initial roughness ξ ∼ 3.2 nm of pSi pores having a diameter of 7.6 nm can be decreased to 1.0 nm following the simple procedure above. This study allows envisioning the design of pSi samples with optimal surface properties toward a specific process.

  18. Flexible structure control experiments using a real-time workstation for computer-aided control engineering

    NASA Technical Reports Server (NTRS)

    Stieber, Michael E.

    1989-01-01

    A Real-Time Workstation for Computer-Aided Control Engineering has been developed jointly by the Communications Research Centre (CRC) and Ruhr-Universitaet Bochum (RUB), West Germany. The system is presently used for the development and experimental verification of control techniques for large space systems with significant structural flexibility. The Real-Time Workstation essentially is an implementation of RUB's extensive Computer-Aided Control Engineering package KEDDC on an INTEL micro-computer running under the RMS real-time operating system. The portable system supports system identification, analysis, control design and simulation, as well as the immediate implementation and test of control systems. The Real-Time Workstation is currently being used by CRC to study control/structure interaction on a ground-based structure called DAISY, whose design was inspired by a reflector antenna. DAISY emulates the dynamics of a large flexible spacecraft with the following characteristics: rigid body modes, many clustered vibration modes with low frequencies and extremely low damping. The Real-Time Workstation was found to be a very powerful tool for experimental studies, supporting control design and simulation, and conducting and evaluating tests within one integrated environment.

  19. A simple analytical method for determining the atmospheric dispersion of upward-directed high velocity releases

    NASA Astrophysics Data System (ADS)

    Palazzi, E.

    Evaluating the atmospheric dispersion of a cloud arising from a sudden release of flammable or toxic materials is essential for properly designing flares, vents and other safety devices, and for quantifying the potential risk related to existing ones or arising from the various kinds of accidents that can occur in chemical plants. Among the methods developed to treat the important case of upward-directed jets, Hoehne's procedure for determining the behaviour and extent of the flammability zone is extensively used, particularly in petrochemical plants. In a previous study, a substantial simplification of that procedure was achieved by correlating the experimental data with an empirical formula, allowing a mathematical description of the boundaries of the flammable cloud to be obtained. Following a theoretical approach, a more general model is developed in the present work, applicable to the various kinds of design problems and/or risk evaluations regarding upward-directed releases from high-velocity sources. It is also demonstrated that the model gives conservative results if applied outside the range of Hoehne's experimental conditions. Moreover, with simple modifications, the same approach could easily be applied to the atmospheric dispersion of releases in any direction.

  20. Quasi-experimental study designs series-paper 11: supporting the production and use of health systems research syntheses that draw on quasi-experimental study designs.

    PubMed

    Lavis, John N; Bärnighausen, Till; El-Jardali, Fadi

    2017-09-01

    To describe the infrastructure available to support the production of policy-relevant health systems research syntheses, particularly those incorporating quasi-experimental evidence, and the tools available to support the use of these syntheses. Literature review. The general challenges associated with the available infrastructure include their sporadic nature or limited coverage of issues and countries, whereas the specific ones related to policy-relevant syntheses of quasi-experimental evidence include the lack of a mechanism to register synthesis titles and scoping review protocols, the limited number of groups preparing user-friendly summaries, and the difficulty of finding quasi-experimental studies for inclusion in rapid syntheses and research syntheses more generally. Although some new tools have emerged in recent years, such as guidance workbooks and citizen briefs and panels, challenges related to using available tools to support the use of policy-relevant syntheses of quasi-experimental evidence arise from such studies potentially being harder for policymakers and stakeholders to commission and understand. Policymakers, stakeholders, and researchers need to expand the coverage and institutionalize the use of the available infrastructure and tools to support the use of health system research syntheses containing quasi-experimental evidence. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Electron Production and Collective Field Generation in Intense Particle Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molvik, A W; Vay, J; Cohen, R

    Electron cloud effects (ECEs) are increasingly recognized as important, but incompletely understood, dynamical phenomena, which can severely limit the performance of present electron colliders, the next generation of high-intensity rings, such as the PEP-II upgrade, LHC, the SNS, and the SIS 100/200, or future high-intensity heavy ion accelerators such as envisioned in Heavy Ion Inertial Fusion (HIF). Deleterious effects include ion-electron instabilities, emittance growth, particle loss, increase in vacuum pressure, added heat load at the vacuum chamber walls, and interference with certain beam diagnostics. Extrapolation of present experience to significantly higher beam intensities is uncertain given the present level of understanding. With coordinated LDRD projects at LLNL and LBNL, we undertook a comprehensive R&D program including experiments, theory and simulations to better understand the phenomena, establish the essential parameters, and develop mitigating mechanisms. This LDRD project laid the essential groundwork for such a program. We developed insights into the essential processes, modeled the relevant physics, and implemented these models in computational production tools that can be used for self-consistent study of the effect on ion beams. We validated the models and tools through comparison with experimental data, including data from new diagnostics that we developed as part of this work and validated on the High-Current Experiment (HCX) at LBNL. We applied these models to High-Energy Physics (HEP) and other advanced accelerators. This project was highly successful, as evidenced by the two paragraphs above and the six paragraphs that follow, which are taken from our 2003 proposal with minor editing, mostly a change of tense. Further benchmarks of outstanding performance are: we had 13 publications, 8 of them in refereed journals; our work was recognized by the accelerator and plasma physics communities through 8 invited papers, and we have 5 additional invitations for invited papers at upcoming conferences; we attracted collaborators who had SBIR funding; we are collaborating with scientists at CERN and GSI Darmstadt on gas desorption physics for submission to Physical Review Letters; and another PRL on absolute measurements of electron cloud density and a Phys. Rev. ST-AB paper on electron emission physics are also being readied for submission.

  2. The Influence of Roughness on Gear Surface Fatigue

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy

    2005-01-01

    Gear working surfaces are subjected to repeated rolling and sliding contacts, and often designs require loads sufficient to cause eventual fatigue of the surface. This research provides experimental data and analytical tools to further the understanding of the causal relationship of gear surface roughness to surface fatigue. The research included evaluations and developments of statistical tools for gear fatigue data, experimental evaluation of the surface fatigue lives of superfinished gears with a near-mirror quality, and evaluations of the experiments by analytical methods and surface inspections. Alternative statistical methods were evaluated using Monte Carlo studies, leading to a final recommendation to describe gear fatigue data using a Weibull distribution, maximum likelihood estimates of shape and scale parameters, and a presumed zero-valued location parameter. A new method was developed for comparing two datasets by extending the current methods of likelihood-ratio based statistics. The surface fatigue lives of superfinished gears were evaluated by carefully controlled experiments, and it is shown conclusively that superfinishing of gears can provide significantly greater lives relative to ground gears. The measured life improvement was approximately a factor of five. To assist with application of this finding to products, the experimental condition was evaluated. The fatigue life results were expressed in terms of specific film thickness and shown to be consistent with bearing data. Elastohydrodynamic and stress analyses were completed to relate the stress condition to fatigue. Smooth-surface models do not adequately explain the improved fatigue lives. Based on analyses using a rough surface model, it is concluded that the improved fatigue lives of superfinished gears are due to a reduced rate of near-surface micropitting fatigue processes, not to any reduced rate of spalling (sub-surface) fatigue processes. To complete the evaluations, surface inspections were completed. The surface topographies of the ground gears changed substantially due to running, but the topographies of the superfinished gears were essentially unchanged with running.
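
    The recommended statistical treatment (two-parameter Weibull distribution, maximum-likelihood estimates, location parameter fixed at zero) can be reproduced in a few lines; the fatigue lives below are invented for illustration only.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    # Invented fatigue lives (hours) standing in for gear test data.
    lives = stats.weibull_min.rvs(c=2.5, scale=300.0, size=30, random_state=rng)

    # Maximum-likelihood fit of a two-parameter Weibull (location fixed at zero).
    shape, loc, scale = stats.weibull_min.fit(lives, floc=0.0)
    print(f"Weibull shape = {shape:.2f}, scale = {scale:.1f} h (loc fixed at {loc})")

    # 10% life (L10), a common gear/bearing reliability metric.
    print(f"L10 life = {stats.weibull_min.ppf(0.10, shape, loc=0.0, scale=scale):.1f} h")
    ```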

  3. Institutional Dashboards: Navigational Tool for Colleges and Universities. Professional File. Number 123, Winter 2012

    ERIC Educational Resources Information Center

    Terkla, Dawn Geronimo; Sharkness, Jessica; Cohen, Margaret; Roscoe, Heather S.; Wiseman, Marjorie

    2012-01-01

    In an age in which information and data are more readily available than ever, it is critical for higher education institutions to develop tools that can communicate essential information to those who make decisions in an easy-to-understand format. One of the tools available for this purpose is a dashboard, a one- to two-page document that presents…

  4. Isothermal titration calorimetry for measuring macromolecule-ligand affinity.

    PubMed

    Duff, Michael R; Grubbs, Jordan; Howell, Elizabeth E

    2011-09-07

    Isothermal titration calorimetry (ITC) is a useful tool for understanding the complete thermodynamic picture of a binding reaction. In biological sciences, macromolecular interactions are essential in understanding the machinery of the cell. Experimental conditions, such as buffer and temperature, can be tailored to the particular binding system being studied. However, careful planning is needed since certain ligand and macromolecule concentration ranges are necessary to obtain useful data. Concentrations of the macromolecule and ligand need to be accurately determined for reliable results. Care also needs to be taken when preparing the samples as impurities can significantly affect the experiment. When ITC experiments, along with controls, are performed properly, useful binding information, such as the stoichiometry, affinity and enthalpy, are obtained. By running additional experiments under different buffer or temperature conditions, more detailed information can be obtained about the system. A protocol for the basic setup of an ITC experiment is given.
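
    As an illustration of how stoichiometry, affinity and enthalpy emerge from the integrated heats, the sketch below fits the standard single-site binding model to synthetic injection heats; it neglects displaced-volume corrections, uses invented concentrations, and is a schematic rather than any instrument vendor's fitting routine.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    V0 = 1.4e-3        # cell volume (L), assumed
    M_t = 20e-6        # macromolecule concentration in the cell (M), assumed constant
    X_t = np.linspace(2e-6, 60e-6, 25)   # total ligand concentration after each injection (M)

    def cumulative_heat(X_t, n, Ka, dH):
        """Total heat evolved (J) for a single-site model (Wiseman-type isotherm)."""
        r = X_t / (n * M_t)
        c = 1.0 / (n * Ka * M_t)
        theta = 0.5 * (1 + r + c - np.sqrt((1 + r + c) ** 2 - 4 * r))  # fraction of sites bound
        return n * theta * M_t * dH * V0

    def injection_heats(X_t, n, Ka, dH):
        Q = cumulative_heat(X_t, n, Ka, dH)
        return np.diff(np.concatenate(([0.0], Q)))   # heat released per injection

    # Synthetic "experimental" data: n = 1, Ka = 1e6 1/M, dH = -40 kJ/mol, plus noise.
    rng = np.random.default_rng(7)
    q_obs = injection_heats(X_t, 1.0, 1e6, -40e3) + rng.normal(0.0, 2e-7, X_t.size)

    popt, _ = curve_fit(injection_heats, X_t, q_obs, p0=[1.0, 1e5, -20e3],
                        bounds=([0.1, 1e3, -1e6], [10.0, 1e9, 1e6]))
    print("n = %.2f, Ka = %.2e 1/M, dH = %.1f kJ/mol" % (popt[0], popt[1], popt[2] / 1e3))
    ```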

  5. Isothermal Titration Calorimetry for Measuring Macromolecule-Ligand Affinity

    PubMed Central

    Duff, Michael R.; Grubbs, Jordan; Howell, Elizabeth E.

    2011-01-01

    Isothermal titration calorimetry (ITC) is a useful tool for understanding the complete thermodynamic picture of a binding reaction. In biological sciences, macromolecular interactions are essential in understanding the machinery of the cell. Experimental conditions, such as buffer and temperature, can be tailored to the particular binding system being studied. However, careful planning is needed since certain ligand and macromolecule concentration ranges are necessary to obtain useful data. Concentrations of the macromolecule and ligand need to be accurately determined for reliable results. Care also needs to be taken when preparing the samples as impurities can significantly affect the experiment. When ITC experiments, along with controls, are performed properly, useful binding information, such as the stoichiometry, affinity and enthalpy, are obtained. By running additional experiments under different buffer or temperature conditions, more detailed information can be obtained about the system. A protocol for the basic setup of an ITC experiment is given. PMID:21931288

  6. Digital Literacy Development of Students Involved in an ICT Educational Project

    NASA Astrophysics Data System (ADS)

    Quintana, Maria Graciela Badilla; Pujol, Meritxell Cortada

    The impact of Information and Communication Technologies (ICT) has become the core of a change that involves most fields of society; consequently, technological and informational literacy are essential requirements in education. This quasi-experimental, ex-post-facto study was conducted in schools in Spain. The aim was to describe and analyze the involvement shown by 219 students who participated in an ICT development project named Ponte dos Brozos. The research asked whether students who regularly worked with ICT had better knowledge and command of computing tools and were better prepared to search for and select information. Results showed that students with greater exposure to ICT knew the technology and how to use it, had better knowledge and control of the computer and operating systems, and managed information well through the Internet, although their information literacy was still lacking.

  7. Flow measurement around a model ship with propeller and rudder

    NASA Astrophysics Data System (ADS)

    van, S. H.; Kim, W. J.; Yoon, H. S.; Lee, Y. Y.; Park, I. R.

    2006-04-01

    For the design of hull forms with better resistance and propulsive performance, it is essential to understand flow characteristics, such as wave and wake development, around a ship. Experimental data detailing the local flow characteristics are invaluable for the validation of the physical and numerical modeling of computational fluid dynamics (CFD) codes, which are recently gaining attention as efficient tools for hull form evaluation. This paper describes velocity and wave profiles measured in the towing tank for the KRISO 138,000 m3 LNG carrier model with propeller and rudder. The effects of propeller and rudder on the wake and wave profiles in the stern region are clearly identified. The results contained in this paper can provide an opportunity to explore integrated flow phenomena around a model ship in the self-propelled condition, and can be added to the International Towing Tank Conference benchmark data for CFD validation as the previous KCS and KVLCC cases.

  8. DOCKTITE-a highly versatile step-by-step workflow for covalent docking and virtual screening in the molecular operating environment.

    PubMed

    Scholz, Christoph; Knorr, Sabine; Hamacher, Kay; Schmidt, Boris

    2015-02-23

    The formation of a covalent bond with the target is essential for a number of successful drugs, yet tools for covalent docking without significant restrictions regarding warhead or receptor classes are rare and limited in use. In this work we present DOCKTITE, a highly versatile workflow for covalent docking in the Molecular Operating Environment (MOE) combining automated warhead screening, nucleophilic side chain attachment, pharmacophore-based docking, and a novel consensus scoring approach. The comprehensive validation study includes pose predictions of 35 protein/ligand complexes which resulted in a mean RMSD of 1.74 Å and a prediction rate of 71.4% with an RMSD below 2 Å, a virtual screening with an area under the curve (AUC) for the receiver operating characteristics (ROC) of 0.81, and a significant correlation between predicted and experimental binding affinities (ρ = 0.806, R(2) = 0.649, p < 0.005).
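
    The reported validation metrics (heavy-atom RMSD of poses, ROC AUC for virtual screening, and correlation of predicted with experimental affinities) are standard; the sketch below computes them on invented scores and coordinates, purely to show how such numbers are obtained, not to reproduce the DOCKTITE results.

    ```python
    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(11)

    # Virtual-screening check: 1 = known binder, 0 = decoy; higher score = better rank (invented).
    labels = rng.integers(0, 2, size=200)
    scores = labels * 1.0 + rng.normal(0, 0.8, size=200)
    print("ROC AUC:", round(roc_auc_score(labels, scores), 2))

    # Affinity check: rank correlation between predicted and experimental pKd values (invented).
    pred = rng.normal(6.0, 1.0, size=35)
    exp = 0.8 * pred + rng.normal(0, 0.5, size=35)
    rho, p = spearmanr(pred, exp)
    print(f"Spearman rho = {rho:.2f} (p = {p:.3g})")

    # Pose check: heavy-atom RMSD between a docked and a reference pose (toy coordinates).
    docked = rng.normal(size=(30, 3))
    crystal = docked + rng.normal(0, 0.5, size=(30, 3))
    print("RMSD (A):", round(float(np.sqrt(((docked - crystal) ** 2).sum(axis=1).mean())), 2))
    ```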

  9. Long-term synchronized electrophysiological and behavioral wireless monitoring of freely moving animals

    PubMed Central

    Grand, Laszlo; Ftomov, Sergiu; Timofeev, Igor

    2012-01-01

    Parallel electrophysiological recording and behavioral monitoring of freely moving animals is essential for a better understanding of the neural mechanisms underlying behavior. In this paper we describe a novel wireless recording technique, which is capable of synchronously recording in vivo multichannel electrophysiological (LFP, MUA, EOG, EMG) and activity data (accelerometer, video) from freely moving cats. The method is based on the integration of commercially available components into a simple monitoring system and is complete with accelerometers and the needed signal processing tools. LFP activities of freely moving group-housed cats were recorded from multiple intracortical areas and from the hippocampus. EMG, EOG, accelerometer and video were simultaneously acquired with LFP activities 24-h a day for 3 months. These recordings confirm the possibility of using our wireless method for 24-h long-term monitoring of neurophysiological and behavioral data of freely moving experimental animals such as cats, ferrets, rabbits and other large animals. PMID:23099345

  10. Buckling of Aluminium Sheet Components

    NASA Astrophysics Data System (ADS)

    Hegadekatte, Vishwanath; Shi, Yihai; Nardini, Dubravko

    Wrinkling is one of the major defects in sheet metal forming processes. It may become a serious obstacle to implementing the forming process and assembling the parts, and may also play a significant role in the wear of the tool. Wrinkling is essentially a local buckling phenomenon that results from compressive stresses (compressive instability), e.g., in the hoop direction for axisymmetric systems such as beverage cans. The modern beverage can is a highly engineered product with a complex geometry. Therefore, to understand wrinkling in such a complex system, we started by studying wrinkling with the Yoshida buckling test. We then studied the buckling of ideal and dented beverage cans under axial loading in laboratory tests. We modelled the laboratory tests, and also the imperfection sensitivity of the two systems, using the finite element method, and the predictions are in qualitative agreement with the experimental data.

  11. A decade of insights into grassland ecosystem responses to global environmental change

    USGS Publications Warehouse

    Borer, Elizabeth T.; Grace, James B.; Harpole, W. Stanley; MacDougall, Andrew S.; Seabloom, Eric W.

    2017-01-01

    Earth’s biodiversity and carbon uptake by plants, or primary productivity, are intricately interlinked, underlie many essential ecosystem processes, and depend on the interplay among environmental factors, many of which are being changed by human activities. While ecological theory generalizes across taxa and environments, most empirical tests of factors controlling diversity and productivity have been observational, single-site experiments, or meta-analyses, limiting our understanding of variation among site-level responses and tests of general mechanisms. A synthesis of results from ten years of a globally distributed, coordinated experiment, the Nutrient Network (NutNet), demonstrates that species diversity promotes ecosystem productivity and stability, and that nutrient supply and herbivory control diversity via changes in composition, including invasions of non-native species and extinction of native species. Distributed experimental networks are a powerful tool for tests and integration of multiple theories and for generating multivariate predictions about the effects of global changes on future ecosystems.

  12. FRODOCK 2.0: fast protein-protein docking server.

    PubMed

    Ramírez-Aportela, Erney; López-Blanco, José Ramón; Chacón, Pablo

    2016-08-01

    The prediction of protein-protein complexes from the structures of unbound components is a challenging and powerful strategy to decipher the mechanism of many essential biological processes. We present a user-friendly protein-protein docking server based on an improved version of FRODOCK that includes a complementary knowledge-based potential. The web interface provides a very effective tool to explore and select protein-protein models and interactively screen them against experimental distance constraints. The competitive success rates and efficiency achieved allow the retrieval of reliable potential protein-protein binding conformations that can be further refined with more computationally demanding strategies. The server is free and open to all users with no login requirement at http://frodock.chaconlab.org (contact: pablo@chaconlab.org). Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  13. A DIY Ultrasonic Signal Generator for Sound Experiments

    NASA Astrophysics Data System (ADS)

    Riad, Ihab F.

    2018-02-01

    Many physics departments around the world have electronic and mechanical workshops attached to them that can help build experimental setups and instruments for research and the training of undergraduate students. The workshops are usually run by experienced technicians and equipped with expensive lathing and computer numerical control (CNC) machines, electric measuring instruments, and several other essential tools. However, in developing countries such as Sudan, the lack of qualified technicians and adequately equipped workshops hampers efforts by these departments to supplement their laboratories with the equipment they need. The only other option is to buy the needed equipment from specialized manufacturers. That option is rarely feasible for departments in developing countries, where funding for education and research is scarce and equipment from these manufacturers is typically too expensive. These departments struggle significantly to equip undergraduate teaching laboratories, and here we propose one way to address this.

  14. Matching multiple rigid domain decompositions of proteins

    PubMed Central

    Flynn, Emily; Streinu, Ileana

    2017-01-01

    We describe efficient methods for consistently coloring and visualizing collections of rigid cluster decompositions obtained from variations of a protein structure, and lay the foundation for more complex setups that may involve different computational and experimental methods. The focus here is on three biological applications: the conceptually simpler problems of visualizing results of dilution and mutation analyses, and the more complex task of matching decompositions of multiple NMR models of the same protein. Implemented into the KINARI web server application, the improved visualization techniques give useful information about protein folding cores, help examining the effect of mutations on protein flexibility and function, and provide insights into the structural motions of PDB proteins solved with solution NMR. These tools have been developed with the goal of improving and validating rigidity analysis as a credible coarse-grained model capturing essential information about a protein’s slow motions near the native state. PMID:28141528

  15. A wireless soil moisture sensor powered by solar energy.

    PubMed

    Jiang, Mingliang; Lv, Mouchao; Deng, Zhong; Zhai, Guoliang

    2017-01-01

    In a variety of agricultural activities, such as irrigation scheduling and nutrient management, soil water content is regarded as an essential parameter. At field scale, however, neither mains power nor long-distance cabling is readily available. To meet the need for monitoring soil water dynamics at field scale, this study presents a wireless soil moisture sensor based on impedance transformation in the frequency domain. The sensor system is powered by solar energy, and the data are transmitted instantly by wireless communication. The sensor electrodes are embedded in the bottom of a supporting rod so that the sensor can measure soil water content at different depths. An optimized timing sequence is used to reduce energy consumption. The experimental results showed that the sensor is a promising tool for monitoring moisture in large-scale farmland using solar power and wireless communication.

  16. Accurate evaluation of sensitivity for calibration between a LiDAR and a panoramic camera used for remote sensing

    NASA Astrophysics Data System (ADS)

    García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier

    2016-04-01

    Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and the usage of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor into the calibration of a panoramic camera-LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. Sensitivity analysis for each variable involved into the calibration was computed by the Sobol method, which is based on the analysis of the variance breakdown, and the Fourier amplitude sensitivity test method, which is based on Fourier's analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
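
    A self-contained sketch of the variance-based (Sobol) indices mentioned above, using the standard pick-and-freeze Monte Carlo estimators on a toy stand-in for the calibration error model; the function, parameter count and ranges are invented, not the camera-LiDAR model.

    ```python
    import numpy as np

    def toy_calibration_error(x):
        """Stand-in for a calibration error model; columns of x are 3 parameters."""
        return np.sin(x[:, 0]) + 2.0 * x[:, 1] ** 2 + 0.3 * x[:, 0] * x[:, 2]

    def sobol_indices(model, n=20000, d=3, seed=5):
        rng = np.random.default_rng(seed)
        A = rng.uniform(-1, 1, size=(n, d))
        B = rng.uniform(-1, 1, size=(n, d))
        fA, fB = model(A), model(B)
        var = np.var(np.concatenate([fA, fB]))
        S1, ST = np.empty(d), np.empty(d)
        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                            # freeze all columns except i
            fABi = model(ABi)
            S1[i] = np.mean(fB * (fABi - fA)) / var        # first-order effect of parameter i
            ST[i] = 0.5 * np.mean((fA - fABi) ** 2) / var  # total effect (Jansen estimator)
        return S1, ST

    S1, ST = sobol_indices(toy_calibration_error)
    print("first-order:", np.round(S1, 2), " total:", np.round(ST, 2))
    ```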

  17. [Research progress of probe design software of oligonucleotide microarrays].

    PubMed

    Chen, Xi; Wu, Zaoquan; Liu, Zhengchun

    2014-02-01

    The DNA microarray has become an essential medical genetic diagnostic tool owing to its high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing high-quality gene chips. Several probe design software packages have been developed and are now available to perform this work. Each package targets different sequence types and has different advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review will help users choose appropriate probe-design software. It should also reduce the cost of microarrays, improve their application efficiency, and promote both the research and development (R&D) and the commercialization of high-performance probe design software.

  18. Finite element simulations of the head-brain responses to the top impacts of a construction helmet: Effects of the neck and body mass.

    PubMed

    Wu, John Z; Pan, Christopher S; Wimer, Bryan M; Rosen, Charles L

    2017-01-01

    Traumatic brain injuries are among the most common severely disabling injuries in the United States. Construction helmets are considered essential personal protective equipment for reducing traumatic brain injury risks at work sites. In this study, we proposed a practical finite element modeling approach that would be suitable for engineers to optimize construction helmet design. The finite element model includes all essential anatomical structures of a human head (i.e. skin, scalp, skull, cerebrospinal fluid, brain, medulla, spinal cord, cervical vertebrae, and discs) and all major engineering components of a construction helmet (i.e. shell and suspension system). The head finite element model has been calibrated using the experimental data in the literature. It is technically difficult to precisely account for the effects of the neck and body mass on the dynamic responses, because the finite element model does not include the entire human body. An approximation approach has been developed to account for the effects of the neck and body mass on the dynamic responses of the head-brain. Using the proposed model, we have calculated the responses of the head-brain during a top impact when wearing a construction helmet. The proposed modeling approach would provide a tool to improve the helmet design on a biomechanical basis.

  19. Contra-Rotating Open Rotor Tone Noise Prediction

    NASA Technical Reports Server (NTRS)

    Envia, Edmane

    2014-01-01

    Reliable prediction of contra-rotating open rotor (CROR) noise is an essential element of any strategy for the development of low-noise open rotor propulsion systems that can meet both the community noise regulations and the cabin noise limits. Since CROR noise spectra typically exhibits a preponderance of tones, significant efforts have been directed towards predicting their tone spectra. To that end, there has been an ongoing effort at NASA to assess various in-house open rotor tone noise prediction tools using a benchmark CROR blade set for which significant aerodynamic and acoustic data had been acquired in wind tunnel tests. In the work presented here, the focus is on the near-field noise of the benchmark open rotor blade set at the cruise condition. Using an analytical CROR tone noise model with input from high-fidelity aerodynamic simulations, detailed tone noise spectral predictions have been generated and compared with the experimental data. Comparisons indicate that the theoretical predictions are in good agreement with the data, especially for the dominant CROR tones and their overall sound pressure level. The results also indicate that, whereas individual rotor tones are well predicted by the linear sources (i.e., thickness and loading), for the interaction tones it is essential that the quadrupole sources be included in the analysis.

  20. Contra-Rotating Open Rotor Tone Noise Prediction

    NASA Technical Reports Server (NTRS)

    Envia, Edmane

    2014-01-01

    Reliable prediction of contra-rotating open rotor (CROR) noise is an essential element of any strategy for the development of low-noise open rotor propulsion systems that can meet both the community noise regulations and cabin noise limits. Since CROR noise spectra exhibit a preponderance of tones, significant efforts have been directed towards predicting their tone content. To that end, there has been an ongoing effort at NASA to assess various in-house open rotor tone noise prediction tools using a benchmark CROR blade set for which significant aerodynamic and acoustic data have been acquired in wind tunnel tests. In the work presented here, the focus is on the nearfield noise of the benchmark open rotor blade set at the cruise condition. Using an analytical CROR tone noise model with input from high-fidelity aerodynamic simulations, tone noise spectra have been predicted and compared with the experimental data. Comparisons indicate that the theoretical predictions are in good agreement with the data, especially for the dominant tones and for the overall sound pressure level of tones. The results also indicate that, whereas the individual rotor tones are well predicted by the combination of the thickness and loading sources, for the interaction tones it is essential that the quadrupole source is also included in the analysis.

  1. Creating a Minnesota Statewide SNAP-Ed Program Evaluation

    ERIC Educational Resources Information Center

    Gold, Abby; Barno, Trina Adler; Sherman, Shelley; Lovett, Kathleen; Hurtado, G. Ali

    2013-01-01

    Systematic evaluation is an essential tool for understanding program effectiveness. This article describes the pilot test of a statewide evaluation tool for the Supplemental Nutrition Assistance Program-Education (SNAP-Ed). A computer algorithm helped Community Nutrition Educators (CNEs) build surveys specific to their varied educational settings…

  2. A Resource Guide Identifying Technology Tools for Schools. Appendix

    ERIC Educational Resources Information Center

    Fox, Christine; Jones, Rachel

    2009-01-01

    SETDA and NASTID's "Technology Tools for Schools Resource Guide" provides definitions of key technology components and relevant examples, where appropriate as a glossary for educators. The guide also presents essential implementation and infrastructure considerations that decision makers should think about when implementing technology in schools.…

  3. Professional Development through Organizational Assessment: Using APPA's Facilities Management Evaluation Program

    ERIC Educational Resources Information Center

    Medlin, E. Lander; Judd, R. Holly

    2013-01-01

    APPA's Facilities Management Evaluation Program (FMEP) provides an integrated system to optimize organizational performance. The criteria for evaluation not only provide a tool for organizational continuous improvement, they serve as a compelling leadership development tool essential for today's facilities management professional. The senior…

  4. A Comparison of Systematic Screening Tools for Emotional and Behavioral Disorders

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Little, M. Annette; Casey, Amy M.; Lambert, Warren; Wehby, Joseph; Weisenbach, Jessica L.; Phillips, Andrea

    2009-01-01

    Early identification of students who might develop emotional and behavioral disorders (EBD) is essential in preventing negative outcomes. Systematic screening tools are available for identifying elementary-age students with EBD, including the "Systematic Screening for Behavior Disorders" (SSBD) and the "Student Risk Screening…

  5. Establishing Time for Professional Learning

    ERIC Educational Resources Information Center

    Journal of Staff Development, 2013

    2013-01-01

    Time for collaborative learning is an essential resource for educators working to implement college- and career-ready standards. The pages in this article include tools from the workbook "Establishing Time for Professional Learning." The tools support a complete process to help educators effectively find and use time. The following…

  6. Bridging the Educational Research-Teaching Practice Gap: Foundations for Assessing and Developing Biochemistry Students' Visual Literacy

    ERIC Educational Resources Information Center

    Schonborn, Konrad J.; Anderson, Trevor R.

    2010-01-01

    External representations (ERs), such as diagrams, animations, and dynamic models are vital tools for communicating and constructing knowledge in biochemistry. To build a meaningful understanding of structure, function, and process, it is essential that students become visually literate by mastering key cognitive skills that are essential for…

  7. Towards the prediction of essential genes by integration of network topology, cellular localization and biological process information

    PubMed Central

    2009-01-01

    Background The identification of essential genes is important for the understanding of the minimal requirements for cellular life and for practical purposes, such as drug design. However, the experimental techniques for essential genes discovery are labor-intensive and time-consuming. Considering these experimental constraints, a computational approach capable of accurately predicting essential genes would be of great value. We therefore present here a machine learning-based computational approach relying on network topological features, cellular localization and biological process information for prediction of essential genes. Results We constructed a decision tree-based meta-classifier and trained it on datasets with individual and grouped attributes-network topological features, cellular compartments and biological processes-to generate various predictors of essential genes. We showed that the predictors with better performances are those generated by datasets with integrated attributes. Using the predictor with all attributes, i.e., network topological features, cellular compartments and biological processes, we obtained the best predictor of essential genes that was then used to classify yeast genes with unknown essentiality status. Finally, we generated decision trees by training the J48 algorithm on datasets with all network topological features, cellular localization and biological process information to discover cellular rules for essentiality. We found that the number of protein physical interactions, the nuclear localization of proteins and the number of regulating transcription factors are the most important factors determining gene essentiality. Conclusion We were able to demonstrate that network topological features, cellular localization and biological process information are reliable predictors of essential genes. Moreover, by constructing decision trees based on these data, we could discover cellular rules governing essentiality. PMID:19758426
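
    A minimal sketch of the classification workflow described, with scikit-learn's CART decision tree standing in for the J48 learner and synthetic topology/localization features; the data-generating rule below merely mimics the abstract's qualitative findings and is not the authors' dataset.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(42)
    n = 1000
    degree = rng.poisson(8, n)       # number of protein physical interactions (synthetic)
    nuclear = rng.integers(0, 2, n)  # 1 if the protein is nuclear-localized (synthetic)
    n_tfs = rng.poisson(3, n)        # number of regulating transcription factors (synthetic)
    X = np.column_stack([degree, nuclear, n_tfs])

    # Synthetic "essentiality" labels loosely following the stated rules of thumb.
    logit = 0.15 * degree + 0.8 * nuclear + 0.2 * n_tfs - 3.0
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

    clf = DecisionTreeClassifier(max_depth=4, random_state=0)
    print("5-fold CV accuracy:", round(float(cross_val_score(clf, X, y, cv=5).mean()), 3))
    clf.fit(X, y)
    print("feature importances (degree, nuclear, #TFs):", clf.feature_importances_.round(2))
    ```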

  8. Increasing use of high-speed digital imagery as a measurement tool on test and evaluation ranges

    NASA Astrophysics Data System (ADS)

    Haddleton, Graham P.

    2001-04-01

    In military research and development or testing there are various fast and dangerous events that need to be recorded and analysed. High-speed cameras allow the capture of movement too fast to be recognised by the human eye, and provide data that is essential for the analysis and evaluation of such events. High-speed photography is often the only type of instrumentation that can be used to record the parameters demanded by our customers. I will show examples where this applied cinematography is used not only to provide a visual record of events, but also as an essential measurement tool.

  9. A Web-Based Course Assessment Tool with Direct Mapping to Student Outcomes

    ERIC Educational Resources Information Center

    Ibrahim, Walid; Atif, Yacine; Shuaib, Khaled; Sampson, Demetrios

    2015-01-01

    The assessment of curriculum outcomes is an essential element for continuous academic improvement. However, the collection, aggregation and analysis of assessment data are notoriously complex and time-consuming processes. At the same time, only few developments of supporting electronic processes and tools for continuous academic program assessment…

  10. Physical Models of Schooling, the 'Ought' Question and Educational Change.

    ERIC Educational Resources Information Center

    Bauer, Norman J.

    This paper examines the methods used in designing school and classroom environments. The tools are labeled: (1) discipline-centered schooling; (2) empirical-naturalistic schooling; and (3) great works schooling. First, the outline endeavors to reveal the essential elements of the three tools that represent images, structures, or "maps" of…

  11. Collection Development in Public Health: A Guide to Selection Tools

    ERIC Educational Resources Information Center

    Wallis, Lisa C.

    2004-01-01

    Public health librarians face many challenges in collection development because the field is multidisciplinary, the collection's users have varied needs, and many of the essential resources are grey literature materials. Further, little has been published about public health selection tools. However, librarians responsible for these areas have a…

  12. Using the Internet As an Instructional Tool.

    ERIC Educational Resources Information Center

    Hudson River Center for Program Development, Glenmont, NY.

    This manual is designed to introduce adult educators to the Internet and examine ways that it can enhance instruction. An overview of the Internet covers its evolution. These three sections focus on the three areas of the Internet essential to instructional application: communication, information access, and search tools. The section on…

  13. Development and Classroom Implementation of an Environmental Data Creation and Sharing Tool

    ERIC Educational Resources Information Center

    Brogan, Daniel S.; McDonald, Walter M.; Lohani, Vinod K.; Dymond, Randel L.; Bradner, Aaron J.

    2016-01-01

    Education is essential for solving the complex water-related challenges facing society. The Learning Enhanced Watershed Assessment System (LEWAS) and the Online Watershed Learning System (OWLS) provide data creation and data sharing infrastructures, respectively, that combine to form an environmental learning tool. This system collects, integrates…

  14. The Personal Digital Library (PDL)-based e-learning: Using the PDL as an e-learning support tool

    NASA Astrophysics Data System (ADS)

    Deng, Xiaozhao; Ruan, Jianhai

    The paper describes a support tool for learners engaged in e-learning, the Personal Digital Library (PDL). The characteristics and functionality of the PDL are presented. Suggested steps for constructing and managing a PDL are outlined and discussed briefly. The authors believe that the PDL as a support tool of e-learning will be important and essential in the future.

  15. AN EIGHT WEEK SEMINAR IN AN INTRODUCTION TO NUMERICAL CONTROL ON TWO- AND THREE-AXIS MACHINE TOOLS FOR VOCATIONAL AND TECHNICAL MACHINE TOOL INSTRUCTORS. FINAL REPORT.

    ERIC Educational Resources Information Center

    BOLDT, MILTON; POKORNY, HARRY

    THIRTY-THREE MACHINE SHOP INSTRUCTORS FROM 17 STATES PARTICIPATED IN AN 8-WEEK SEMINAR TO DEVELOP THE SKILLS AND KNOWLEDGE ESSENTIAL FOR TEACHING THE OPERATION OF NUMERICALLY CONTROLLED MACHINE TOOLS. THE SEMINAR WAS GIVEN FROM JUNE 20 TO AUGUST 12, 1966, WITH COLLEGE CREDIT AVAILABLE THROUGH STOUT STATE UNIVERSITY. THE PARTICIPANTS COMPLETED AN…

  16. Web-based automation of green building rating index and life cycle cost analysis

    NASA Astrophysics Data System (ADS)

    Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul

    2018-04-01

    The sudden decline in financial markets and the economic meltdown have slowed adoption and lowered investor interest in green-certified buildings because of their higher initial costs. It is therefore essential to attract investors to further green building development through automated tools for construction projects. However, there is a historical dearth of automation in green building rating tools, an essential gap that motivates developing an automated, computerized programming tool. This paper presents proposed research aimed at developing an integrated, web-based automated tool that applies a green building rating assessment tool, green technology and life cycle cost (LCC) analysis. It also sets out to identify the MyCrest and LCC variables to be integrated, develop them into a framework, and then transform that framework into an automated tool. A mixed methodology of qualitative and quantitative surveys is planned to carry the MyCrest-LCC integration to an automated level. In this study, the preliminary literature review provides a better understanding of the integration of Green Building Rating Tools (GBRT) with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute towards green buildings and future agendas.
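
    The MyCrest rating logic is specific to the proposed tool, but the life cycle cost side is an ordinary discounted-cash-flow calculation; the sketch below is a minimal present-value LCC function with invented figures, not part of the proposed web system.

    ```python
    def life_cycle_cost(initial_cost, annual_costs, discount_rate):
        """Present value of a building's life cycle cost: initial cost plus
        discounted annual operating/maintenance costs (illustrative only)."""
        pv_running = sum(c / (1 + discount_rate) ** t
                         for t, c in enumerate(annual_costs, start=1))
        return initial_cost + pv_running

    # Hypothetical comparison: higher initial cost but lower running costs for a green design.
    conventional = life_cycle_cost(1_000_000, [60_000] * 30, 0.05)
    green = life_cycle_cost(1_150_000, [40_000] * 30, 0.05)
    print(f"conventional: {conventional:,.0f}  green: {green:,.0f}")
    ```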

  17. MEDICI: Mining Essentiality Data to Identify Critical Interactions for Cancer Drug Target Discovery and Development | Office of Cancer Genomics

    Cancer.gov

    Protein-protein interactions (PPIs) mediate the transmission and regulation of oncogenic signals that are essential to cellular proliferation and survival, and thus represent potential targets for anti-cancer therapeutic discovery. Despite their significance, there is no method to experimentally disrupt and interrogate the essentiality of individual endogenous PPIs. The ability to computationally predict or infer PPI essentiality would help prioritize PPIs for drug discovery and help advance understanding of cancer biology.

  18. Kernel Ada Programming Support Environment (KAPSE) Interface Team: Public Report. Volume II.

    DTIC Science & Technology

    1982-10-28

    essential parameters from our work so far in this area and, using trade-offs concerning these, construct the KIT’s recommended alternative. ... environment that are also in the development stages. At this point in development it is essential for the KITEC to provide a forum and act as a focal ... standardization in this area. Moreover, this is an area with considerable divergence in proposed approaches. On the other hand, an essential tool from the point of

  19. [History of mechanical sutures in digestive system surgery].

    PubMed

    Picardi, Nicola

    2002-01-01

    Attempts to suture wounds with mechanical devices are very old, and their history is lost in the mists of time. More recently, less than two centuries ago and even before the true beginning of modern surgery that followed the birth of anaesthesiology with the "ether day" of 16 October 1846, there were many efforts to develop new methods of joining the tissues of the gut while avoiding the danger of peritoneal contamination. These primitive staplers were based on the principle of compressing the two sides of the tissue to be joined with a mechanical device. Early in the past century, well before the appearance of antibiotics, devices able to join intestinal tissue with metallic stitches, the first true staplers, were developed and perfected in the heart of old Europe. After the end of the Second World War, development accelerated dramatically, first with the progress of the Soviet Institute of Experimental Research on Surgical Tools in Moscow and then with the powerful initiatives of industry in the USA. The most important progress in this field was the standardization of tools designed to place metallic stitches in the gut, although very recently there have been new attempts to apply the older principle of compression suturing on a new basis. The results of this development, which are essential for modern surgery, are the standardization of surgical technique, the shortening of operative times, and important support for the new minimally invasive approach to digestive surgery.

  20. Detecting dark-matter waves with a network of precision-measurement tools

    NASA Astrophysics Data System (ADS)

    Derevianko, Andrei

    2018-04-01

    Virialized ultralight fields (VULFs) are viable cold dark-matter candidates and include scalar and pseudoscalar bosonic fields, such as axions and dilatons. Direct searches for VULFs rely on low-energy precision-measurement tools. While previous proposals have focused on detecting coherent oscillations of the VULF signals at the VULF Compton frequencies for individual devices, here I consider a network of such devices. Virialized ultralight fields are essentially dark-matter waves and as such they carry both temporal and spatial phase information. Thereby, the discovery reach can be improved by using networks of precision-measurement tools. To formalize this idea, I derive a spatiotemporal two-point correlation function for the ultralight dark-matter fields in the framework of the standard halo model. Due to VULFs being Gaussian random fields, the derived two-point correlation function fully determines N-point correlation functions. For a network of N_D devices within the coherence length of the field, the sensitivity compared to a single device can be improved by a factor of √N_D. Further, I derive a VULF dark-matter signal profile for an individual device. The resulting line shape is strongly asymmetric due to the parabolic dispersion relation for massive nonrelativistic bosons. I discuss the aliasing effect that extends the discovery reach to VULF frequencies higher than the experimental sampling rate. I present sensitivity estimates and develop a stochastic field signal-to-noise ratio statistic. Finally, I consider an application of the formalism developed to atomic clocks and their networks.
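    The network gain quoted above can be written compactly; the sketch below restates it in assumed notation and is illustrative only.

```latex
% For N_D devices located within one coherence length of the dark-matter field,
% cross-correlating their outputs improves the sensitivity to the common VULF
% signal relative to a single device roughly as
\begin{equation}
  \frac{\mathrm{SNR}_{\mathrm{network}}}{\mathrm{SNR}_{\mathrm{single}}} \;\sim\; \sqrt{N_D},
\end{equation}
% since the stochastic dark-matter signal is shared by all devices while their
% instrumental noise is assumed independent between devices.
```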

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.

    Dynamic fuel cycle simulation tools are intended to model holistic, transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests; model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
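    As a concrete illustration of what a unit test for an "essential function" might look like, the sketch below checks mass conservation across a toy separations facility; the simulator API shown (FakeSeparations, process_batch) is hypothetical and not taken from the paper.

```python
# Minimal sketch of a fuel-cycle-simulator unit test: mass conservation across a
# separations facility. The facility model and its interface are placeholders.
import unittest


class FakeSeparations:
    """Stand-in facility model: splits an input mass into product and waste streams."""

    def __init__(self, recovery_fraction):
        self.recovery_fraction = recovery_fraction

    def process_batch(self, feed_kg):
        product = feed_kg * self.recovery_fraction
        waste = feed_kg - product
        return {"product_kg": product, "waste_kg": waste}


class TestMassConservation(unittest.TestCase):
    def test_streams_sum_to_feed(self):
        facility = FakeSeparations(recovery_fraction=0.995)
        feed = 1000.0  # kg of used fuel in one batch
        out = facility.process_batch(feed)
        # Essential function: no mass created or destroyed by the facility model.
        self.assertAlmostEqual(out["product_kg"] + out["waste_kg"], feed, places=9)


if __name__ == "__main__":
    unittest.main()
```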

  2. Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.

    PubMed

    Fong, Stephen S

    2014-08-01

    Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.
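    One widely used example of the prospective, computation-driven approach mentioned above is flux balance analysis; the sketch below applies it to a toy two-metabolite network under stated, illustrative assumptions (the review does not prescribe this particular formulation).

```python
# Flux balance analysis (FBA) on a toy network: maximize a "biomass" flux subject to
# steady-state mass balance S v = 0 and flux bounds. Network and bounds are illustrative.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: metabolites A, B; columns: reactions v1, v2, v3)
# v1: uptake -> A,  v2: A -> B,  v3: B -> biomass/export
S = np.array([
    [1.0, -1.0,  0.0],   # metabolite A balance
    [0.0,  1.0, -1.0],   # metabolite B balance
])

bounds = [(0.0, 10.0),   # uptake limited to 10 flux units (assumed constraint)
          (0.0, None),
          (0.0, None)]

# Maximize biomass flux v3  <=>  minimize -v3, subject to S v = 0
result = linprog(c=[0.0, 0.0, -1.0], A_eq=S, b_eq=[0.0, 0.0], bounds=bounds)
print("optimal fluxes:", result.x)        # expected: [10, 10, 10]
print("max biomass flux:", -result.fun)
```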

  3. RNA-seq Data: Challenges in and Recommendations for Experimental Design and Analysis.

    PubMed

    Williams, Alexander G; Thomas, Sean; Wyman, Stacia K; Holloway, Alisha K

    2014-10-01

    RNA-seq is widely used to determine differential expression of genes or transcripts as well as identify novel transcripts, identify allele-specific expression, and precisely measure translation of transcripts. Thoughtful experimental design and choice of analysis tools are critical to ensure high-quality data and interpretable results. Important considerations for experimental design include number of replicates, whether to collect paired-end or single-end reads, sequence length, and sequencing depth. Common analysis steps in all RNA-seq experiments include quality control, read alignment, assigning reads to genes or transcripts, and estimating gene or transcript abundance. Our aims are two-fold: to make recommendations for common components of experimental design and assess tool capabilities for each of these steps. We also test tools designed to detect differential expression, since this is the most widespread application of RNA-seq. We hope that these analyses will help guide those who are new to RNA-seq and will generate discussion about remaining needs for tool improvement and development. Copyright © 2014 John Wiley & Sons, Inc.
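    To make the replicate-number consideration concrete, the rough simulation below estimates detection power for a single gene with a 2-fold change as the number of replicates increases; the negative-binomial parameters and the simple t-test on log counts are illustrative assumptions, not a recommended analysis pipeline.

```python
# Toy power simulation for one gene: how often is a 2-fold expression change detected
# at alpha = 0.05, as a function of the number of biological replicates per group?
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)


def power(n_reps, mean_a=100.0, fold_change=2.0, dispersion=0.2,
          n_sim=2000, alpha=0.05):
    """Fraction of simulations in which the expression difference is detected."""
    r = 1.0 / dispersion  # NB shape so that var = mu + dispersion * mu^2

    def nb_counts(mu, size):
        return rng.negative_binomial(r, r / (r + mu), size)

    hits = 0
    for _ in range(n_sim):
        a = np.log2(nb_counts(mean_a, n_reps) + 1)
        b = np.log2(nb_counts(mean_a * fold_change, n_reps) + 1)
        if stats.ttest_ind(a, b).pvalue < alpha:
            hits += 1
    return hits / n_sim


for n in (2, 3, 5, 8):
    print(f"{n} replicates per group: approx. power {power(n):.2f}")
```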

  4. Risk management measures for chemicals: the "COSHH essentials" approach.

    PubMed

    Garrod, A N I; Evans, P G; Davy, C W

    2007-12-01

    "COSHH essentials" was developed in Great Britain to help duty holders comply with the Control of Substances Hazardous to Health (COSHH) Regulations. It uses a similar approach to that described in the new European "REACH" Regulation (Registration, Evaluation, Authorisation and Restriction of Chemicals; EC No. 1907/2006 of the European Parliament), insofar as it identifies measures for managing the risk for specified exposure scenarios. It can therefore assist REACH duty holders with the identification and communication of appropriate risk-management measures. The technical basis for COSHH essentials is explained in the original papers published in the Annals of Occupational Hygiene. Its details will, therefore, not be described here; rather, its ability to provide a suitable means for communicating risk-management measures will be explored. COSHH essentials is a simple tool based on an empirical approach to risk assessment and risk management. The output is a "Control Guidance Sheet" that lists the "dos" and "don'ts" for control in a specific task scenario. The guidance in COSHH essentials recognises that exposure in the workplace will depend not just on mechanical controls, but also on a number of other factors, including administrative and behavioural controls, such as systems of work, supervision and training. In 2002, COSHH essentials was made freely available via the internet (http://www.coshh-essentials.org.uk/). This electronic delivery enabled links to be made between product series that share tasks, such as drum filling, and with ancillary guidance, such as setting up health surveillance for work with a respiratory sensitiser. COSHH essentials has proved to be a popular tool for communicating good control practice. It has attracted over 1 million visits to its site since its launch. It offers a common benchmark of good practice for chemical users, manufacturers, suppliers and importers, as well as regulators and health professionals.

  5. Effective Communication: An Essential Tool To Cope with the Challenge of Technological Change.

    ERIC Educational Resources Information Center

    Coing, Marga

    For a library to function effectively, it is essential that it fosters an open management style, which encourages communication of ideas and objectives both within the library itself and, by example, in other elements in the overall administration of which the library is a part. This paper describes the improvement in morale, efficiency, and…

  6. Experimental and numerical investigations on the temperature distribution in PVD AlTiN coated and uncoated Al2O3/TiCN mixed ceramic cutting tools in hard turning of AISI 52100 steel

    NASA Astrophysics Data System (ADS)

    Sateesh Kumar, Ch; Patel, Saroj Kumar; Das, Anshuman

    2018-03-01

    Temperature generation in cutting tools is one of the major causes of tool failure, especially during hard machining, where machining forces are quite high and result in elevated temperatures. The present work therefore investigates the temperature generated during hard machining of AISI 52100 steel (62 HRC hardness) with uncoated and PVD AlTiN coated Al2O3/TiCN mixed ceramic cutting tools. The experiments were performed on a heavy duty lathe with both coated and uncoated cutting tools under a dry cutting environment. The temperature of the cutting zone was measured using an infrared thermometer, and a finite element model was adopted to predict the temperature distribution in the cutting tools during machining for comparative assessment against the measured temperature. The experimental and numerical results revealed a significant reduction of the cutting zone temperature in every experimental run when machining with the PVD AlTiN coated tools compared with the uncoated tools. The main reason for this decrease is the lower coefficient of friction offered by the coating material, which allows the chips to flow freely over the rake surface. In addition, the superior wear behaviour of the AlTiN coating contributed to the reduction in cutting temperature.

  7. Introduction on Using the FastPCR Software and the Related Java Web Tools for PCR and Oligonucleotide Assembly and Analysis.

    PubMed

    Kalendar, Ruslan; Tselykh, Timofey V; Khassenov, Bekbolat; Ramanculov, Erlan M

    2017-01-01

    This chapter introduces the FastPCR software as an integrated tool environment for PCR primer and probe design, which predicts oligonucleotide properties based on experimental studies of PCR efficiency. The software provides comprehensive facilities for designing primers for most PCR applications and their combinations, including standard PCR as well as multiplex, long-distance, inverse, real-time, group-specific, unique and overlap-extension PCR for multi-fragment assembly cloning, and loop-mediated isothermal amplification (LAMP). It also contains a built-in program to design oligonucleotide sets both for long-sequence assembly by ligase chain reaction and for amplicons that tile across a region or regions of interest. The software calculates the melting temperature for standard and degenerate oligonucleotides, including those with locked nucleic acid (LNA) and other modifications. It also analyses sets of primers, predicting oligonucleotide properties, detecting dimers and G/C-quadruplexes and assessing linguistic complexity, and includes a primer dilution and resuspension calculator. The program offers various bioinformatics tools for sequence analysis, covering GC or AT skew, CG% and GA% content, purine-pyrimidine skew, linguistic sequence complexity, random DNA sequence generation and restriction endonuclease analysis. It allows the user to find or create restriction enzyme recognition sites for coding sequences and supports the clustering of sequences. It performs efficient and complete detection of various repeat types with visual display. FastPCR supports batch processing of sequence files, which is essential for automation. The program is available for download at http://primerdigital.com/fastpcr.html, and its online version is located at http://primerdigital.com/tools/pcr.html.
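    For orientation only, the sketch below computes two of the primer properties mentioned (GC content and a rough melting temperature) using the textbook Wallace rule; FastPCR itself relies on more detailed thermodynamic models.

```python
# Simple primer-property calculations, shown for illustration only.
def gc_content(seq: str) -> float:
    """Percentage of G and C bases in the primer sequence."""
    seq = seq.upper()
    return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)


def tm_wallace(seq: str) -> float:
    """Wallace rule: Tm ~ 2*(A+T) + 4*(G+C); reasonable only for short (~14-20 nt) primers."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2.0 * at + 4.0 * gc


primer = "AGCGTAGCTAGGTCATTGCA"  # hypothetical 20-mer, not from the chapter
print(f"GC% = {gc_content(primer):.1f}, Tm (Wallace) ~ {tm_wallace(primer):.1f} C")
```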

  8. A review of creative and expressive writing as a pedagogical tool in medical education.

    PubMed

    Cowen, Virginia S; Kaufman, Diane; Schoenherr, Lisa

    2016-03-01

    The act of writing offers an opportunity to foster self-expression and organisational abilities, along with observation and descriptive skills. These soft skills are relevant to clinical thinking and medical practice. Medical school curricula employ pedagogical approaches suitable for assessing medical and clinical knowledge, but teaching methods for soft skills in critical thinking, listening and verbal expression, which are important in patient communication and engagement, may be less formal. Creative and expressive writing incorporated into medical school courses or clerkships offers a vehicle for medical students to develop these soft skills. The aim of this review was to explore creative and expressive writing as a pedagogical tool in medical schools in relation to the outcomes of medical education. The project employed a scoping review approach to gather, evaluate and synthesise reports on the use of creative and expressive writing in US medical education. Ten databases were searched for scholarly articles reporting on creative or expressive writing during medical school. Limiting the results to activities associated with US medical schools produced 91 articles. A thematic analysis of the articles was conducted to identify how writing was incorporated into the curriculum. Enthusiasm for writing as a pedagogical tool was identified in 28 editorials and overviews. Quasi-experimental, mixed-methods and qualitative studies primarily described writing activities aimed at helping students cognitively or emotionally process difficult challenges in medical education, develop a personal identity or reflect on interpersonal skills. The programmes and interventions using creative or expressive writing were largely associated with elective courses or clerkships rather than required courses. Writing was identified as a potentially relevant pedagogical tool, but it is not included as an essential component of medical school curricula. © 2016 John Wiley & Sons Ltd.

  9. Detecting false positive sequence homology: a machine learning approach.

    PubMed

    Fujimoto, M Stanley; Suvorov, Anton; Jensen, Nicholas O; Clement, Mark J; Bybee, Seth M

    2016-02-24

    Accurate detection of homologous relationships among biological sequences (DNA or amino acid) across organisms is an important and often difficult task that is essential to various evolutionary studies, ranging from building phylogenies to predicting functional gene annotations. Many existing heuristic tools, most commonly based on bidirectional BLAST searches, are used to identify homologous genes and group them into two fundamentally distinct classes: orthologs and paralogs. Because these methods rely only on heuristic filtering based on significance-score cutoffs and offer no cluster post-processing tools, they can produce clusters containing unrelated (non-homologous) sequences. Sequencing data extracted from incomplete genome or transcriptome assemblies, originating from low-coverage sequencing or produced de novo without a reference genome, are therefore susceptible to high false-positive rates of homology detection. In this paper we develop biologically informative features that can be extracted from multiple sequence alignments of putative homologous genes (orthologs and paralogs) and further utilized in the context of guided experimentation to verify false-positive outcomes. We demonstrate that our machine learning method, trained on both known homology clusters obtained from OrthoDB and randomly generated sequence alignments (non-homologs), successfully identifies apparent false positives inferred by heuristic algorithms, especially among proteomes recovered from low-coverage RNA-seq data. Approximately 42% and 25% of the putative homologies predicted by InParanoid and HaMStR, respectively, were classified as false positives on the experimental data set. Our process increases the quality of output from other clustering algorithms by providing a novel post-processing method that is both fast and efficient at removing low-quality clusters of putative homologous genes recovered by heuristic-based approaches.
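    The general strategy, training a classifier on alignment-derived features to flag suspect clusters, can be sketched as below; the two features and the random training data are placeholders and do not reproduce the paper's feature set or training corpora.

```python
# Toy classifier distinguishing genuine homology clusters from likely false positives,
# using two simple alignment features. Training data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Feature 1: mean pairwise identity of the alignment; feature 2: fraction of gap columns.
# Genuine clusters are assumed (for this toy) to have higher identity and fewer gaps.
n = 400
homologs = np.column_stack([rng.normal(0.65, 0.10, n), rng.normal(0.10, 0.05, n)])
non_homologs = np.column_stack([rng.normal(0.30, 0.10, n), rng.normal(0.35, 0.10, n)])

X = np.vstack([homologs, non_homologs])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = genuine homology, 0 = false positive

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

candidate_cluster = [[0.28, 0.40]]  # low identity, many gaps: likely a false positive
print("P(genuine homology) =", clf.predict_proba(candidate_cluster)[0][1])
```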

  10. Development of Conceptual Design Support Tool Founded on Formalization of Conceptual Design Process for Regenerative Life Support Systems

    NASA Astrophysics Data System (ADS)

    Miyajima, Hiroyuki; Yuhara, Naohiro

    Regenerative Life Support Systems (RLSS), which maintain human lives by recycling substances essential for living, comprise humans, plants, and material circulation systems. The plants supply food to the humans and regenerate water and gases by photosynthesis, while the material circulation systems physicochemically recycle and circulate the substances disposed of by humans and plants. RLSS has attracted attention as manned space activities have shifted from short trips to long-term stays at bases such as a space station, a lunar base, or a Mars base. The typical present-day space base is the International Space Station (ISS), a manned experimental base for prolonged stays, where the RLSS recycles only water and air. To accommodate prolonged and extended manned activity at future space bases, RLSS that also implement food production and resource regeneration using plants are expected. The configuration of an RLSS should be designed to suit its own mission, which may give rise to design requirements for RLSS with unprecedented configurations. Accordingly, it is necessary to establish a conceptual design method for generalized RLSS. It is difficult, however, to systematize the design process by analyzing previous designs, because there are only a few ground experimental facilities, namely CEEF (Closed Ecology Experiment Facilities) in Japan, BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) in the U.S., and BIOS3 in Russia. For these reasons, a conceptual design method that does not rely on previous design examples is required for generalized RLSS. This study formalizes a conceptual design process and develops a conceptual design support tool for RLSS based on this process.

  11. Simulations of 3D bioprinting: predicting bioprintability of nanofibrillar inks.

    PubMed

    Göhl, Johan; Markstedt, Kajsa; Mark, Andreas; Håkansson, Karl; Gatenholm, Paul; Edelvik, Fredrik

    2018-06-18

    3D bioprinting with cell-containing bioinks shows great promise for the biofabrication of patient-specific tissue constructs. To fulfil the multiple requirements of a bioink, a wide range of materials and bioink compositions are being developed and evaluated with regard to cell viability, mechanical performance and printability. It is essential that printability and printing fidelity are not neglected, since failure to print the targeted architecture may be catastrophic for the survival of the cells and consequently for the function of the printed tissue. However, experimental evaluation of bioink printability is time-consuming and must be kept to a minimum, especially when 3D bioprinting with cells that are valuable and costly. This paper demonstrates how experimental evaluation can be complemented with computer-based simulations to evaluate newly developed bioinks. Here, a computational fluid dynamics simulation tool was used to study the influence of different printing parameters and to evaluate the predictability of the printing process. Based on data from oscillation frequency measurements of the evaluated bioinks, a full stress rheology model was used in which the viscoelastic behaviour of the material was captured. Simulation of the 3D bioprinting process is a powerful tool and will help to reduce the time and cost of developing and evaluating bioinks. Moreover, it offers the opportunity to isolate parameters such as printing speed, nozzle height, flow rate and printing path, and to study their influence on printing fidelity and the viscoelastic stresses within the bioink. The ability to study these features more extensively by simulating the printing process will lead to a better understanding of what influences the viability of cells in 3D bioprinted tissue constructs.

  12. Effect of different liming levels on the biomass production and essential oil extraction yield of Cunila galioides Benth.

    PubMed

    Mossi, A J; Pauletti, G F; Rota, L; Echeverrigaray, S; Barros, I B I; Oliveira, J V; Paroul, N; Cansian, R L

    2012-11-01

    Poejo is an aromatic and medicinal plant native to highland areas of southern Brazil, growing in acid soils with high Al3+ concentrations. The main objective of the present work was to evaluate the effect of liming on the essential oil extraction yield of three chemotypes of poejo (Cunila galioides Benth). The experiments were performed in a greenhouse using 8-litre pots. The treatments were four dosages of limestone (0, 3.15, 12.5, and 25 g.L(-1)) in a completely randomized experimental design with four replications and three chemotypes, set up in a 3 × 4 factorial arrangement. The parameters evaluated were dry weight of aerial parts, essential oil content and chemical composition of the essential oil. The results showed that liming affects biomass production, essential oil yield and chemical composition, with a cross interaction between chemotype and limestone dosage. At the highest dosage, lower biomass production and essential oil yield were observed, as well as the lowest contents of citral (citral chemotype) and limonene (menthene chemotype). In the ocimene chemotype, liming had no influence on essential oil yield or on the content of the major compounds. A dosage of 3.15 g.L(-1) can be considered the best limestone dosage for the production of poejo under the experimental conditions evaluated.

  13. Finding Your Voice: Talent Development Centers and the Academic Talent Search

    ERIC Educational Resources Information Center

    Rushneck, Amy S.

    2012-01-01

    Talent Development Centers are just one of many tools every family, teacher, and gifted advocate should have in their tool box. To understand the importance of Talent Development Centers, it is essential to also understand the Academic Talent Search Program. Talent Search participants who obtain scores comparable to college-bound high school…

  14. Innovative Assessment Tools for a Short, Fast-Paced, Summer Field Course

    ERIC Educational Resources Information Center

    Baustian, Melissa M.; Bentley, Samuel J.; Wandersee, James H.

    2008-01-01

    An experiential science program, such as a summer course at a field station, requires unique assessment tools. Traditional assessment via a pencil-and-paper exam cannot capture the essential skills and concepts learned at a summer field station. Therefore, the authors developed a pre- and postcourse image-based analysis to evaluate student…

  15. Sold! The Elementary Classroom Auction as Learning Tool of Communication and Economics

    ERIC Educational Resources Information Center

    Boyd, Josh; Boyd, Gina

    2014-01-01

    An auction, though an economic tool, is essentially a performance dependent on communication (Smith, 1989). The auctioneer dictates the pace, asks for bids, and acknowledges responses; the enterprise is controlled by a voice (Boyce, 2001). Bidders must listen and respond strategically to the communication of the people around them. An auction…

  16. Practice versus Politics in Danish Day-Care Centres: How to Bridge the Gap in Early Learning?

    ERIC Educational Resources Information Center

    Clasen, Line Engel; Jensen de López, Kristine

    2016-01-01

    It is essential that early educators in day-care services possess adequate pedagogical tools for supporting children's communicative development. Early literacy programmes (ELPs) are potential tools. However, studies investigating the effects of ELPs seldom address implementation processes or the programme users' perspectives. This study sheds…

  17. Small Wonders Close Encounters

    ERIC Educational Resources Information Center

    Kniseley, MacGregor; Capraro, Karen

    2013-01-01

    This article introduces students to the world of digital microscopy. Looking at small objects through a digital microscope is like traveling through a foreign country for the first time. The experience is new, engaging, and exciting. A handheld digital microscope is an essential tool in a 21st century teacher's toolkit and the perfect tool to…

  18. Effects of juniper essential oil on growth performance, some rumen protozoa, rumen fermentation and antioxidant blood enzyme parameters of growing Saanen kids.

    PubMed

    Yesilbag, D; Biricik, H; Cetin, I; Kara, C; Meral, Y; Cengiz, S S; Orman, A; Udum, D

    2017-10-01

    This study aimed to evaluate the effects of juniper essential oil on the growth performance, rumen fermentation parameters, rumen protozoa population, blood antioxidant enzyme parameters and faecal content in growing Saanen kids. Thirty-six male Saanen kids (36 ± 14 days of age) were used in the study. Each group consisted of 9 kids. The control group (G1) was fed with a diet that consisted of the above concentrated feed and oat hay, whereas the experimental groups consumed the same diet but with the concentrated feed uniformly sprayed with juniper essential oil at 0.4 ml/kg (G2), 0.8 ml/kg (G3) or 2 ml/kg (G4). There were no differences (p > 0.05) in live weight, live weight gain or feed consumption between the control and experimental groups. There was a significant improvement (p < 0.05) in feed efficiency in the G3 group. There were no differences in the rumen pH, rumen volatile fatty acid (VFA) profile or faecal pH of the control and experimental groups. The rumen NH3-N values were similar at the middle and end of the experiment, but at the start of the experiment the rumen NH3-N values differed between the control and experimental groups (p < 0.05). The faecal score value was significantly (p < 0.05) decreased in the experimental groups. The addition of juniper essential oil to the rations had significant effects on the kids' antioxidant blood parameters. Although superoxide dismutase (SOD) activity, total antioxidant capacity (TAC) and catalase values were significantly (p < 0.05) increased in the experimental groups (G2, G3 and G4), especially group G4, the blood glutathione peroxidase (GPX) value significantly decreased in the experimental groups. The results of this study suggest that supplementation with juniper oil is more effective on antioxidant parameters than on performance parameters and that it may be used as a natural antioxidant product. Journal of Animal Physiology and Animal Nutrition © 2016 Blackwell Verlag GmbH.

  19. A new heat transfer analysis in machining based on two steps of 3D finite element modelling and experimental validation

    NASA Astrophysics Data System (ADS)

    Haddag, B.; Kagnaya, T.; Nouari, M.; Cutard, T.

    2013-01-01

    Modelling machining operations allows estimating cutting parameters which are difficult to obtain experimentally, in particular quantities characterizing the tool-workpiece interface. Temperature is one of these quantities; it affects tool wear, so its estimation is important. This study presents a new modelling strategy, based on two steps of calculation, for analysing heat transfer into the cutting tool. Unlike classical methods, which consider only the cutting tool and apply an approximate heat flux at the cutting face estimated from experimental data (e.g. measured cutting force or cutting power), the proposed approach consists of two successive 3D finite element calculations that are fully independent of experimental measurements; only the definition of the behaviour of the tool-workpiece couple is necessary. The first is a 3D thermomechanical model of the chip formation process, which allows estimating cutting forces, chip morphology and chip flow direction. The second is a 3D thermal model of the heat diffusion into the cutting tool, using an adequate thermal loading (an applied uniform or non-uniform heat flux). This loading is estimated from quantities obtained in the first step, such as the contact pressure and sliding velocity distributions and the contact area. Comparisons between experimental data and the first calculation, on the one hand, and between temperatures measured with embedded thermocouples and the second calculation, on the other, show good agreement in terms of cutting forces, chip morphology and cutting temperature.
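    In essence, the second-step calculation solves transient heat conduction in the tool with a flux boundary condition on the tool-chip contact area; a schematic statement, in assumed notation, is given below.

```latex
% Schematic form of the second-step thermal problem (notation assumed for illustration):
% transient conduction in the tool volume with a heat-flux boundary condition on the
% tool-chip contact area A_c.
\begin{equation}
  \rho c_p \,\frac{\partial T}{\partial t} \;=\; \nabla\!\cdot\!\big(k\,\nabla T\big)
  \quad\text{in the tool,}
  \qquad
  -k\,\frac{\partial T}{\partial n}\Big|_{A_c} \;=\; q(x,y),
\end{equation}
% where the applied flux q is built from first-step outputs, e.g. a frictional heating
% estimate of the form q ~ (partition fraction) x mu x p x v_s, with p the local contact
% pressure and v_s the local sliding velocity.
```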

  20. Experimental and theoretical description of the optical properties of Myrcia sylvatica essential oil.

    PubMed

    Silva Prado, Andriele da; Leal, Luciano Almeida; de Brito, Patrick Pascoal; de Almeida Fonseca, Antonio Luciano; Blawid, Stefan; Ceschin, Artemis Marti; Veras Mourão, Rosa Helena; da Silva Júnior, Antônio Quaresma; Antonio da Silva Filho, Demétrio; Ribeiro Junior, Luiz Antonio; Ferreira da Cunha, Wiliam

    2017-07-01

    We present an extensive study of the optical properties of Myrcia sylvatica essential oil with the goal of investigating the suitability of this material system for use in organic photovoltaics. The methods of extraction, experimental analysis, and theoretical modeling are described in detail. The precise composition of the oil in our samples is determined via gas chromatography, mass spectrometry, and X-ray scattering techniques. The measurements indicate that the material system of Myrcia sylvatica essential oil may indeed be successfully employed in the design of organic photovoltaic devices. The optical absorption of the molecules that compose the oil is calculated using time-dependent density functional theory and used to explain the measured UV-Vis spectra of the oil. We show that it is sufficient to consider the α-bisabolol/cadalene pair, two of the main constituents of the oil, to reproduce the main features of the UV-Vis spectra. This finding is important for future work aiming to use Myrcia sylvatica essential oil as a photovoltaic material.

  1. Comparisons for Effectiveness of Aromatherapy and Acupressure Massage on Quality of Life in Career Women: A Randomized Controlled Trial.

    PubMed

    Kao, Yu-Hsiu; Huang, Yi-Ching; Chung, Ue-Lin; Hsu, Wen-Ni; Tang, Yi-Ting; Liao, Yi-Hung

    2017-06-01

    This study aimed to compare the effectiveness of aromatherapy and acupressure massage intervention strategies on sleep quality and quality of life (QOL) in career women. A randomized controlled trial design was used. One hundred and thirty-two career women (24-55 years) voluntarily participated and were randomly assigned to (1) placebo (distilled water), (2) lavender essential oil (Lavandula angustifolia), (3) blended essential oil (1:1:1 ratio of L. angustifolia, Salvia sclarea, and Origanum majorana), or (4) acupressure massage groups for a 4-week treatment. The Pittsburgh Sleep Quality Index and Short Form 36 Health Survey were used to evaluate the intervention effects pre- and postintervention. After the 4-week treatment, all experimental groups (blended essential oil, lavender essential oil, and acupressure massage) showed significant improvements in sleep quality and QOL (p < 0.05). Significantly greater improvement in QOL was observed in the participants treated with the blended essential oil compared with those treated with lavender essential oil (p < 0.05), and significantly greater improvement in sleep quality was observed in the acupressure massage and blended essential oil groups compared with the lavender essential oil group (p < 0.05). The blended essential oil exhibited greater dual benefits in improving both QOL and sleep quality compared with the lavender essential oil and acupressure massage interventions. These results suggest that aromatherapy and acupressure massage improve sleep and QOL and may serve as optimal means for career women to improve their sleep and QOL.

  2. Bioinformatic tools for inferring functional information from plant microarray data: tools for the first steps.

    PubMed

    Page, Grier P; Coulibaly, Issa

    2008-01-01

    Microarrays are a very powerful tool for quantifying the amount of RNA in samples; however, their ability to query essentially every gene in a genome, which can number in the tens of thousands, presents analytical and interpretative problems. As a result, a variety of software and web-based tools have been developed to help with these issues. This article highlights and reviews some of the tools for the first steps in the analysis of a microarray study. We have tried for a balance between free and commercial systems. We have organized the tools by topics including image processing tools (Section 2), power analysis tools (Section 3), image analysis tools (Section 4), database tools (Section 5), databases of functional information (Section 6), annotation tools (Section 7), statistical and data mining tools (Section 8), and dissemination tools (Section 9).

  3. Thermal model development and validation for rapid filling of high pressure hydrogen tanks

    DOE PAGES

    Johnson, Terry A.; Bozinoski, Radoslav; Ye, Jianjun; ...

    2015-06-30

    This paper describes the development of thermal models for the filling of high pressure hydrogen tanks, with experimental validation. Two models are presented: the first uses a one-dimensional, transient, network flow analysis code developed at Sandia National Labs, and the second uses the commercially available CFD analysis tool Fluent. These models were developed to help assess the safety of Type IV high pressure hydrogen tanks during the filling process. The primary concern for these tanks is the increased susceptibility to fatigue failure of the liner caused by the fill process. Thus, a thorough understanding of the temperature changes of the hydrogen gas and the heat transfer to the tank walls is essential. The effects of initial pressure, filling time, and fill procedure were investigated to quantify the temperature change and verify the accuracy of the models. In this paper we show that the predictions of mass-averaged gas temperature from the one- and three-dimensional models compare well with the experiment, and both can be used to predict final mass delivery. However, due to buoyancy and other three-dimensional effects, the maximum wall temperature cannot be predicted using one-dimensional tools alone, which means that a three-dimensional analysis is required for a safety assessment of the system.
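    A much-simplified, lumped-parameter version of the underlying energy balance is sketched below to show why the gas heats up during filling; all property values and the single convective loss term are illustrative assumptions, not parameters of the Sandia or Fluent models.

```python
# Lumped energy balance for constant-rate filling of a tank with ideal-gas hydrogen:
#   d(m*cv*T)/dt = mdot*cp*T_in - hA*(T - T_wall)
# which gives dT/dt = [mdot*(cp*T_in - cv*T) - hA*(T - T_wall)] / (m*cv).
import numpy as np

cp, cv = 14300.0, 10180.0       # J/(kg K), approximate ideal-gas hydrogen properties
T_in, T_wall = 300.0, 293.0     # K, inlet gas and (fixed) wall temperature (assumed)
hA = 15.0                       # W/K, lumped gas-to-wall convective conductance (assumed)
mdot = 0.01                     # kg/s, fill rate (assumed)
m, T = 0.2, 293.0               # initial gas mass (kg) and temperature (K)

dt, t_end = 0.1, 180.0          # s, explicit Euler time stepping
for _ in np.arange(0.0, t_end, dt):
    dTdt = (mdot * (cp * T_in - cv * T) - hA * (T - T_wall)) / (m * cv)
    T += dTdt * dt
    m += mdot * dt

print(f"gas mass after fill: {m:.2f} kg, final gas temperature: {T:.1f} K")
```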

  4. Protein flexibility in the light of structural alphabets

    PubMed Central

    Craveur, Pierrick; Joseph, Agnel P.; Esque, Jeremy; Narwani, Tarun J.; Noël, Floriane; Shinada, Nicolas; Goguet, Matthieu; Leonard, Sylvain; Poulain, Pierre; Bertrand, Olivier; Faure, Guilhem; Rebehmed, Joseph; Ghozlane, Amine; Swapna, Lakshmipuram S.; Bhaskara, Ramachandra M.; Barnoud, Jonathan; Téletchéa, Stéphane; Jallu, Vincent; Cerny, Jiri; Schneider, Bohdan; Etchebest, Catherine; Srinivasan, Narayanaswamy; Gelly, Jean-Christophe; de Brevern, Alexandre G.

    2015-01-01

    Protein structures are valuable tools for understanding protein function. Nonetheless, proteins are often treated as rigid macromolecules, whereas their structures exhibit specific flexibility that is essential to their functions. Analyses of protein structures and dynamics are often performed with a simplified three-state description, i.e., the classical secondary structures. A more precise and complete description of protein backbone conformation can be obtained using libraries of small protein fragments that are able to approximate every part of protein structures. These libraries, called structural alphabets (SAs), have been widely used in the structure analysis field, from the definition of ligand binding sites to the superimposition of protein structures. SAs are also well suited to analyzing the dynamics of protein structures. Here, we review innovative approaches that investigate protein flexibility based on SA descriptions. Coupled with various sources of experimental data (e.g., B-factors) and computational methodology (e.g., molecular dynamics simulation), SAs turn out to be powerful tools for analyzing protein dynamics, e.g., for examining allosteric mechanisms in large sets of structures in complexes and for identifying order/disorder transitions. SAs have also proved quite efficient for predicting protein flexibility from amino-acid sequence. Finally, in this review, we exemplify the interest of SAs for studying flexibility with several cases of proteins implicated in pathologies and diseases. PMID:26075209

  5. Guiding students to develop an understanding of scientific inquiry: a science skills approach to instruction and assessment.

    PubMed

    Stone, Elisa M

    2014-01-01

    New approaches for teaching and assessing scientific inquiry and practices are essential for guiding students to make the informed decisions required of an increasingly complex and global society. The Science Skills approach described here guides students to develop an understanding of the experimental skills required to perform a scientific investigation. An individual teacher's investigation of the strategies and tools she designed to promote scientific inquiry in her classroom is outlined. This teacher-driven action research in the high school biology classroom presents a simple study design that allowed for reciprocal testing of two simultaneous treatments, one that aimed to guide students to use vocabulary to identify and describe different scientific practices they were using in their investigations-for example, hypothesizing, data analysis, or use of controls-and another that focused on scientific collaboration. A knowledge integration (KI) rubric was designed to measure how students integrated their ideas about the skills and practices necessary for scientific inquiry. KI scores revealed that student understanding of scientific inquiry increased significantly after receiving instruction and using assessment tools aimed at promoting development of specific inquiry skills. General strategies for doing classroom-based action research in a straightforward and practical way are discussed, as are implications for teaching and evaluating introductory life sciences courses at the undergraduate level.

  6. Engineering specification and system design for CAD/CAM of custom shoes: UMC project effort

    NASA Technical Reports Server (NTRS)

    Bao, Han P.

    1990-01-01

    Further experiments were conducted to improve the design and fabrication techniques for the integrated sole. The sole design is shown to be related to the foot-position requirements and the actual shape of the foot, including the presence of neurotropic ulcers or other infections. Factors considered were heel pitch, balance line, and rigidity conditions of the foot. Machining considerations were also part of the design problem; among them, the width of each contour, tool motion, tool feed rate, depth of cut, and slope of cut at the boundary were the key elements. The essential fabrication technique evolved around the idea of machining a mold and then casting the sole through the mold using a quick-firm latex material. Two main mold materials were tried: plaster and wood. Plaster was very easy to machine and shape but could barely support the pressure in the hydraulic press required by the casting process. Wood was found to be quite effective in terms of relative cost, strength, and surface smoothness, except for the problem of cutting against the fibers, which could generate ragged surfaces. The programming effort to convert the original dBase programs into C programs so that they could be executed on the SUN computer at North Carolina State University is discussed.

  7. Design Issues and Inference in Experimental L2 Research

    ERIC Educational Resources Information Center

    Hudson, Thom; Llosa, Lorena

    2015-01-01

    Explicit attention to research design issues is essential in experimental second language (L2) research. Too often, however, such careful attention is not paid. This article examines some of the issues surrounding experimental L2 research and its relationships to causal inferences. It discusses the place of research questions and hypotheses,…

  8. 20 CFR 416.250 - Experimental, pilot, and demonstration projects in the SSI program.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... you are placed in a control group which is not subject to the alternative requirements, limitations... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false Experimental, pilot, and demonstration... Because of Essential Persons § 416.250 Experimental, pilot, and demonstration projects in the SSI program...

  9. 20 CFR 416.250 - Experimental, pilot, and demonstration projects in the SSI program.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... you are placed in a control group which is not subject to the alternative requirements, limitations... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false Experimental, pilot, and demonstration... Because of Essential Persons § 416.250 Experimental, pilot, and demonstration projects in the SSI program...

  10. 20 CFR 416.250 - Experimental, pilot, and demonstration projects in the SSI program.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... you are placed in a control group which is not subject to the alternative requirements, limitations... 20 Employees' Benefits 2 2014-04-01 2014-04-01 false Experimental, pilot, and demonstration... Because of Essential Persons § 416.250 Experimental, pilot, and demonstration projects in the SSI program...

  11. 20 CFR 416.250 - Experimental, pilot, and demonstration projects in the SSI program.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... you are placed in a control group which is not subject to the alternative requirements, limitations... 20 Employees' Benefits 2 2012-04-01 2012-04-01 false Experimental, pilot, and demonstration... Because of Essential Persons § 416.250 Experimental, pilot, and demonstration projects in the SSI program...

  12. 20 CFR 416.250 - Experimental, pilot, and demonstration projects in the SSI program.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... you are placed in a control group which is not subject to the alternative requirements, limitations... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Experimental, pilot, and demonstration... Because of Essential Persons § 416.250 Experimental, pilot, and demonstration projects in the SSI program...

  13. Counterbalancing for Serial Order Carryover Effects in Experimental Condition Orders

    ERIC Educational Resources Information Center

    Brooks, Joseph L.

    2012-01-01

    Reactions of neural, psychological, and social systems are rarely, if ever, independent of previous inputs and states. The potential for serial order carryover effects from one condition to the next in a sequence of experimental trials makes counterbalancing of condition order an essential part of experimental design. Here, a method is proposed…
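    One standard remedy for first-order carryover is a balanced Latin square, in which every condition immediately precedes every other condition equally often across participants; a minimal construction (for an even number of conditions) is sketched below with arbitrary condition labels. This is a generic textbook construction, not necessarily the method proposed in the article.

```python
# Balanced Latin square: first order is 0, 1, n-1, 2, n-2, ...; each subsequent
# participant's order shifts every condition by +1 (mod n).
def balanced_latin_square(n):
    """Return n orders over conditions 0..n-1; n should be even for full first-order balance."""
    first = [0, 1]
    lo, hi = 2, n - 1
    take_hi = True
    while len(first) < n:
        if take_hi:
            first.append(hi)
            hi -= 1
        else:
            first.append(lo)
            lo += 1
        take_hi = not take_hi
    return [[(c + i) % n for c in first] for i in range(n)]


for order in balanced_latin_square(4):
    print(order)
# Across these four orders, every condition appears once in each serial position and
# immediately precedes every other condition exactly once.
```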

  14. An experimental study of flank wear in the end milling of AISI 316 stainless steel with coated carbide inserts

    NASA Astrophysics Data System (ADS)

    Odedeyi, P. B.; Abou-El-Hossein, K.; Liman, M.

    2017-05-01

    Stainless steel 316 is a difficult-to-machine iron-based alloy containing a minimum of about 12% chromium, commonly used in the marine and aerospace industries. This paper presents an experimental study of tool wear propagation in the end milling of stainless steel 316 with coated carbide inserts. The milling tests were conducted at three different cutting speeds, while feed rate and depth of cut were varied at (0.02, 0.06 and 0.1) mm/rev and (1, 2 and 3) mm, respectively. The cutting tool used was a TiAlN PVD multi-layered coated carbide. The effects of cutting speed, cutting-tool coating top layer and workpiece material on tool life were investigated. The results showed that cutting speed significantly affected the measured flank wear values: with increasing cutting speed, the flank wear values decreased. The experimental results showed that flank wear was the major and predominant failure mode affecting tool life.

  15. SCOUT: A Fast Monte-Carlo Modeling Tool of Scintillation Camera Output

    PubMed Central

    Hunter, William C. J.; Barrett, Harrison H.; Lewellen, Thomas K.; Miyaoka, Robert S.; Muzi, John P.; Li, Xiaoli; McDougald, Wendy; MacDonald, Lawrence R.

    2011-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:22072297
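    As a toy illustration of the kind of computation such a simulator performs (not the SCOUT physics), the sketch below spreads scintillation photons over a sensor grid with a simple solid-angle weight, samples Poisson counts, and recovers the event position with an Anger-logic centroid; the geometry and light yield are assumptions.

```python
# Toy scintillation-camera readout: photon counts per sensor and centroid position estimate.
import numpy as np

rng = np.random.default_rng(0)

# 8 x 8 sensor grid, 6 mm pitch, scintillation event 10 mm above the sensor plane (assumed)
pitch, depth, n_photons = 6.0, 10.0, 5000
xs = (np.arange(8) - 3.5) * pitch
sensor_x, sensor_y = np.meshgrid(xs, xs)


def readout(event_x, event_y):
    """Poisson-sampled photon counts on each sensor for one event."""
    r2 = (sensor_x - event_x) ** 2 + (sensor_y - event_y) ** 2 + depth ** 2
    weights = depth / r2 ** 1.5          # ~ solid angle subtended by each sensor (unnormalized)
    expected = n_photons * weights / weights.sum()
    return rng.poisson(expected)


counts = readout(4.0, -2.5)
x_hat = (counts * sensor_x).sum() / counts.sum()   # Anger-logic centroid estimate
y_hat = (counts * sensor_y).sum() / counts.sum()
print(f"true (4.0, -2.5)  estimated ({x_hat:.2f}, {y_hat:.2f})")
```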

  16. SCOUT: a fast Monte-Carlo modeling tool of scintillation camera output†

    PubMed Central

    Hunter, William C J; Barrett, Harrison H.; Muzi, John P.; McDougald, Wendy; MacDonald, Lawrence R.; Miyaoka, Robert S.; Lewellen, Thomas K.

    2013-01-01

    We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout of a scintillation camera. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:23640136

  17. Class Room Seminar and Journal Club (CRSJC) as an Effective Teaching Learning Tool: Perception to Post Graduation Pharmacy Students

    ERIC Educational Resources Information Center

    Dahiya, Sunita; Dahiya, Rajiv

    2015-01-01

    Theory and practicals are two essential components of pharmacy course curriculum; but in addition to appearing and passing examination with good score grades, pharmacy post graduation (PG) pursuing students are essentially required to develop some professional skills which might not be attained solely by conventional class room programs. This…

  18. Experimental validation of predicted cancer genes using FRET

    NASA Astrophysics Data System (ADS)

    Guala, Dimitri; Bernhem, Kristoffer; Ait Blal, Hammou; Jans, Daniel; Lundberg, Emma; Brismar, Hjalmar; Sonnhammer, Erik L. L.

    2018-07-01

    Huge amounts of data are generated in genome-wide experiments designed to investigate diseases with complex genetic causes. Following up all potential leads produced by such experiments is currently cost prohibitive and time consuming. Gene prioritization tools alleviate these constraints by directing further experimental efforts towards the most promising candidate targets. Recently, a gene prioritization tool called MaxLink was shown to outperform other widely used state-of-the-art prioritization tools in a large-scale in silico benchmark. An experimental validation of predictions made by MaxLink has, however, been lacking. In this study we used Fluorescence Resonance Energy Transfer, an established experimental technique for the detection of protein-protein interactions, to validate potential cancer genes predicted by MaxLink. Our results provide confidence in the use of MaxLink for the selection of new targets in the battle against polygenic diseases.

  19. Experimental Investigation of Spectra of Dynamical Maps and their Relation to non-Markovianity

    NASA Astrophysics Data System (ADS)

    Yu, Shang; Wang, Yi-Tao; Ke, Zhi-Jin; Liu, Wei; Meng, Yu; Li, Zhi-Peng; Zhang, Wen-Hao; Chen, Geng; Tang, Jian-Shun; Li, Chuan-Feng; Guo, Guang-Can

    2018-02-01

    The spectral theorem of von Neumann has been widely applied in various areas, such as the characteristic spectral lines of atoms. It has recently been proposed that dynamical evolution also possesses spectral lines. As the most intrinsic property of evolution, the behavior of these spectra can, in principle, exhibit almost every feature of the evolution, among which the most attractive topic is non-Markovianity, i.e., memory effects during evolution. Here, we develop a method to detect these spectra and, moreover, experimentally examine the relation between the spectral behavior and non-Markovianity by engineering the environment to prepare dynamical maps with different non-Markovian properties and then detecting the dynamical behavior of the spectral values. These spectra lead to a witness for essential non-Markovianity. We also experimentally verify another, simplified witness method for essential non-Markovianity. Interestingly, in both cases we observe the sudden transition from essential non-Markovianity to something else. Our work shows the role of the spectra of evolution in studies of non-Markovianity and provides alternative methods to characterize non-Markovian behavior.

  20. Identification of widespread adenosine nucleotide binding in Mycobacterium tuberculosis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ansong, Charles; Ortega, Corrie; Payne, Samuel H.

    The annotation of protein function is almost completely performed by in silico approaches. However, computational prediction of protein function is frequently incomplete and error prone. In Mycobacterium tuberculosis (Mtb), ~25% of all genes have no predicted function and are annotated as hypothetical proteins. This lack of functional information severely limits our understanding of Mtb pathogenicity. Current tools for experimental functional annotation are limited and often do not scale to entire protein families. Here, we report a generally applicable chemical biology platform to functionally annotate bacterial proteins by combining activity-based protein profiling (ABPP) and quantitative LC-MS-based proteomics. As an example of this approach for high-throughput protein functional validation and discovery, we experimentally annotate the families of ATP-binding proteins in Mtb. Our data experimentally validate prior in silico predictions of >250 ATPases and adenosine nucleotide-binding proteins, and reveal 73 hypothetical proteins as novel ATP-binding proteins. We identify adenosine cofactor interactions with many hypothetical proteins containing a diversity of unrelated sequences, providing a new and expanded view of adenosine nucleotide binding in Mtb. Furthermore, many of these hypothetical proteins are both unique to Mycobacteria and essential for infection, suggesting specialized functions in mycobacterial physiology and pathogenicity. Thus, we provide a generally applicable approach for high-throughput protein function discovery and validation, and highlight several ways in which application of activity-based proteomics data can improve the quality of functional annotations to facilitate novel biological insights.

  1. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

    Reverse engineering tools can be used in satisfying the information needs of software maintainers. Especially in case of maintaining large-scale legacy systems tool support is essential. Reverse engineering tools provide various kinds of capabilities to provide the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  2. Building ProteomeTools based on a complete synthetic human proteome

    PubMed Central

    Zolg, Daniel P.; Wilhelm, Mathias; Schnatbaum, Karsten; Zerweck, Johannes; Knaute, Tobias; Delanghe, Bernard; Bailey, Derek J.; Gessulat, Siegfried; Ehrlich, Hans-Christian; Weininger, Maximilian; Yu, Peng; Schlegl, Judith; Kramer, Karl; Schmidt, Tobias; Kusebauch, Ulrike; Deutsch, Eric W.; Aebersold, Ruedi; Moritz, Robert L.; Wenschuh, Holger; Moehring, Thomas; Aiche, Stephan; Huhmer, Andreas; Reimer, Ulf; Kuster, Bernhard

    2018-01-01

    The ProteomeTools project builds molecular and digital tools from the human proteome to facilitate biomedical and life science research. Here, we report the generation and multimodal LC-MS/MS analysis of >330,000 synthetic tryptic peptides representing essentially all canonical human gene products and exemplify the utility of this data. The resource will be extended to >1 million peptides and all data will be shared with the community via ProteomicsDB and proteomeXchange. PMID:28135259

  3. The in vitro Antimicrobial Activity and Chemometric Modelling of 59 Commercial Essential Oils against Pathogens of Dermatological Relevance.

    PubMed

    Orchard, Ané; Sandasi, Maxleene; Kamatou, Guy; Viljoen, Alvaro; van Vuuren, Sandy

    2017-01-01

    This study reports on the inhibitory concentration of 59 commercial essential oils recommended for dermatological conditions, and identifies putative compounds responsible for antimicrobial activity. Essential oils were investigated for antimicrobial activity using minimum inhibitory concentration assays. Ten essential oils were identified as having superior antimicrobial activity. The essential oil compositions were determined using gas chromatography coupled to mass spectrometry and the data analysed with the antimicrobial activity using multivariate tools. Orthogonal projections to latent structures models were created for seven of the pathogens. Eugenol was identified as the main biomarker responsible for antimicrobial activity in the majority of the essential oils. The essential oils mostly displayed noteworthy antimicrobial activity, with five oils displaying broad-spectrum activity against the 13 tested micro-organisms. The antimicrobial efficacies of the essential oils highlight their potential in treating dermatological infections and through chemometric modelling, bioactive volatiles have been identified. © 2017 Wiley-VHCA AG, Zurich, Switzerland.

  4. Experimental anti-GBM disease as a tool for studying spontaneous lupus nephritis.

    PubMed

    Fu, Yuyang; Du, Yong; Mohan, Chandra

    2007-08-01

    Lupus nephritis is an immune-mediated disease, where antibodies and T cells both play pathogenic roles. Since spontaneous lupus nephritis in mouse models takes 6-12 months to manifest, there is an urgent need for a mouse model that can be used to delineate the pathogenic processes that lead to immune nephritis, over a quicker time frame. We propose that the experimental anti-glomerular basement membrane (GBM) disease model might be a suitable tool for uncovering some of the molecular steps underlying lupus nephritis. This article reviews the current evidence that supports the use of the experimental anti-GBM nephritis model for studying spontaneous lupus nephritis. Importantly, out of about 25 different molecules that have been specifically examined in the experimental anti-GBM model and also spontaneous lupus nephritis, all influence both diseases concordantly, suggesting that the experimental model might be a useful tool for unraveling the molecular basis of spontaneous lupus nephritis. This has important clinical implications, both from the perspective of genetic susceptibility as well as clinical therapeutics.

  5. Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools

    ERIC Educational Resources Information Center

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…

  6. ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES

    EPA Science Inventory

    Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials...

  7. Digital Pedagogies for Teachers' CPD

    ERIC Educational Resources Information Center

    Montebello, Matthew

    2017-01-01

    The continuous professional development of educators is essential not only for maintaining their expertise and keeping their knowledge up to date, but also for catching up with and adopting new pedagogical tools, skills and techniques. The advent of the Web 2.0 brought about a plethora of digital tools that teachers have not only struggled…

  8. Toward a New Approach to the Evaluation of a Digital Curriculum Using Learning Analytics

    ERIC Educational Resources Information Center

    Rangel, Virginia Snodgrass; Bell, Elizabeth R.; Monroy, Carlos; Whitaker, J. Reid

    2015-01-01

    Understanding how an educational intervention is implemented is essential to evaluating its effectiveness. With the increased use of digital tools in classrooms, however, traditional methods of measuring implementation fall short. Fortunately, there is a way to learn about the interactions that users have with digital tools that are embedded into…

  9. Assessing Teamwork in Undergraduate Education: A Measurement Tool to Evaluate Individual Teamwork Skills

    ERIC Educational Resources Information Center

    Britton, Emily; Simper, Natalie; Leger, Andrew; Stephenson, Jenn

    2017-01-01

    Effective teamwork skills are essential for success in an increasingly team-based workplace. However, research suggests that there is often confusion concerning how teamwork is measured and assessed, making it difficult to develop these skills in undergraduate curricula. The goal of the present study was to develop a sustainable tool for assessing…

  10. BAC Libraries from Wheat Chromosome 7D – Efficient Tool for Positional Cloning of Aphid Resistance Genes

    USDA-ARS?s Scientific Manuscript database

    Positional cloning in bread wheat is a tedious task due to its huge genome size (~17 Gbp) and polyploid character. BAC libraries represent an essential tool for positional cloning. However, wheat BAC libraries comprise more than a million clones, which makes their screening very laborious. Here we pres...

  11. Technical Manual for the Conceptual Learning and Development Assessment Series II: Cutting Tool. Technical Report No. 435. Reprinted December 1977.

    ERIC Educational Resources Information Center

    DiLuzio, Geneva J.; And Others

    This document accompanies Conceptual Learning and Development Assessment Series II: Cutting Tool, a test constructed to chart the conceptual development of individuals. As a technical manual, it contains information on the rationale, development, standardization, and reliability of the test, as well as essential information and statistical data…

  12. Toxicity of β-citronellol, geraniol and linalool from Pelargonium roseum essential oil against the West Nile and filariasis vector Culex pipiens (Diptera: Culicidae).

    PubMed

    Tabari, Mohaddeseh Abouhosseini; Youssefi, Mohammad Reza; Esfandiari, Aryan; Benelli, Giovanni

    2017-10-01

    Insect vectors are responsible for spreading devastating parasites and pathogens. A large number of botanicals have been suggested for eco-friendly control programs against mosquito vectors, and some of them are aromatic plants. Pelargonium roseum, a species belonging to the Geraniaceae family, may represent a suitable candidate as a mosquito repellent and/or larvicide due to its pleasant rose-like odor. In this research, we evaluated the toxicity of the essential oil from P. roseum and its major constituents against the West Nile and filariasis vector Culex pipiens. The chemical composition of P. roseum essential oil was analyzed by gas chromatography-mass spectroscopy. Major constituents were citronellol (35.9%), geraniol (18.5%), and linalool (5.72%). The bioactivity of P. roseum essential oil and its three major compounds on larvae and egg rafts of Cx. pipiens was evaluated. The essential oil had a significant toxic effect on larvae and egg rafts of Cx. pipiens, with 50% lethal concentration (LC50) values of 5.49 and 0.45 μg/mL, respectively. The major constituents geraniol, citronellol and linalool resulted in LC50 values of 6.86, 7.64 and 14.87 μg/mL on larvae, and 0.8, 0.67 and 1.27 μg/mL on egg rafts. The essential oil and two of its constituents, citronellol and geraniol, showed moderate knock-down activity on Cx. pipiens adults. Overall, the present investigation revealed that the major components of P. roseum, and especially the whole essential oil, could be helpful in developing novel and safe mosquito control tools and also offer an environmentally safe and cheap means of reducing Cx. pipiens mosquito populations. Copyright © 2017. Published by Elsevier Ltd.
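
    For readers unfamiliar with how LC50 values of this kind are obtained, the sketch below fits a log-logistic dose-mortality curve to toy data; the doses, mortality fractions, the mortality() model and the use of scipy's curve_fit are illustrative assumptions rather than the protocol used in this study.

    ```python
    # Illustrative LC50 estimation by fitting a log-logistic dose-mortality curve.
    # Doses and mortalities below are invented; the paper's actual protocol may differ.
    import numpy as np
    from scipy.optimize import curve_fit

    def mortality(dose, lc50, slope):
        """Fraction killed as a function of dose (log-logistic model)."""
        return 1.0 / (1.0 + np.exp(-slope * (np.log10(dose) - np.log10(lc50))))

    doses = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])        # ug/mL, hypothetical
    killed_fraction = np.array([0.05, 0.12, 0.33, 0.61, 0.88, 0.97])

    popt, _ = curve_fit(mortality, doses, killed_fraction, p0=[4.0, 2.0])
    lc50_est, slope_est = popt
    print(f"estimated LC50 = {lc50_est:.2f} ug/mL, slope = {slope_est:.2f}")
    ```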

  13. Salinity impact on yield, water use, mineral and essential oil content of fennel (Foeniculum vulgare Mill.)

    USDA-ARS?s Scientific Manuscript database

    The experimental study was carried out to determine the effects of salinity on water consumption, plant height, fresh and seed yields, biomass production, ion accumulation and essential oil content of fennel (Foeniculum vulgare Mill.) under greenhouse conditions. The experiment was conducted with a ...

  14. Essentials of multiangle data-processing methodology for smoke polluted atmospheres

    Treesearch

    Vladimir Kovalev; A. Petkov; Cyle Wold; Shawn Urbanski; WeiMin Hao

    2011-01-01

    Essentials for investigating smoke plume characteristics with scanning lidar are discussed. Particularly, we outline basic principles for determining dynamics, heights, and optical properties of smoke plumes and layers in wildfire-polluted atmospheres. Both simulated and experimental data obtained in vicinities of wildfires with a two-wavelength scanning lidar are...

  15. Essential amino acids interacting with flavonoids: A theoretical approach

    NASA Astrophysics Data System (ADS)

    Codorniu-Hernández, Edelsys; Mesa-Ibirico, Ariel; Hernández-Santiesteban, Richel; Montero-Cabrera, Luis A.; Martínez-Luzardo, Francisco; Santana-Romero, Jorge L.; Borrmann, Tobias; Stohrer, Wolf-D.

    The interaction of two flavonoid species (resorcinolic and fluoroglucinolic) with the 20 essential amino acids was studied by the multiple minima hypersurface (MMH) procedures, through the AM1 and PM3 semiempirical methods. Remarkable thermodynamic data related to the properties of the molecular association of these compounds were obtained, which will be of great utility for future investigations concerning the interaction of flavonoids with proteins. These results are compared with experimental and classical force field results reported in the available literature, and new evidence and criteria are presented. The hydrophilic amino acids demonstrated high affinity in the interaction with flavonoid molecules, the complexes with lysine being especially stable. An affinity order for the interaction of both flavonoid species with the essential amino acids is suggested. Our theoretical results are compared with experimental evidence on flavonoid interactions with proteins of biomedical interest.
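
    For orientation, the MMH treatment reduces to Boltzmann-weighted averages over the local minima located on the supermolecular hypersurface; a generic form (our paraphrase, not the authors' exact notation) is:

    \[
    \langle E \rangle = \frac{\sum_i E_i\, e^{-E_i/k_B T}}{\sum_i e^{-E_i/k_B T}},
    \qquad
    \Delta E_{\mathrm{assoc}} \approx \langle E \rangle_{\mathrm{flavonoid\cdot AA}}
    - \big(\langle E \rangle_{\mathrm{flavonoid}} + \langle E \rangle_{\mathrm{AA}}\big),
    \]

    where the sums run over the distinct minima found for each system.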

  16. Model Rocketry: University-Level Educational Tool

    ERIC Educational Resources Information Center

    Barrowman, James S.

    1974-01-01

    Describes how model rocketry can be a useful educational tool at the university level as a practical application of theoretical aerodynamic concepts and as a tool for students in experimental research. (BR)

  17. Modeling languages for biochemical network simulation: reaction vs equation based approaches.

    PubMed

    Wiechert, Wolfgang; Noack, Stephan; Elsheikh, Atya

    2010-01-01

    Biochemical network modeling and simulation is an essential task in any systems biology project. The systems biology markup language (SBML) was established as a standardized model exchange language for mechanistic models. A specific strength of SBML is that numerous tools for formulating, processing, simulation and analysis of models are freely available. Interestingly, in the field of multidisciplinary simulation, the problem of model exchange between different simulation tools arose much earlier. Several general modeling languages like Modelica were developed in the 1990s. Modelica enables an equation based modular specification of arbitrary hierarchical differential algebraic equation models. Moreover, libraries for special application domains can be rapidly developed. This contribution compares the reaction based approach of SBML with the equation based approach of Modelica and explains the specific strengths of both tools. Several biological examples illustrating essential SBML and Modelica concepts are given. The chosen criteria for tool comparison are flexibility for constraint specification, different modeling flavors, hierarchical, modular and multidisciplinary modeling. Additionally, support for spatially distributed systems, event handling and network analysis features is discussed. As a major result it is shown that the choice of the modeling tool has a strong impact on the expressivity of the specified models but also strongly depends on the requirements of the application context.
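
    To make the reaction-based versus equation-based contrast concrete, the sketch below expresses the same toy pathway S → P twice in plain Python: once as a reaction list from which the ODE right-hand side is generated, and once with the differential equations written out directly. It is a schematic illustration, not SBML or Modelica syntax, and the rate law and parameter values are invented.

    ```python
    # Schematic contrast between a reaction-based and an equation-based specification
    # of the toy pathway S -> P with a Michaelis-Menten rate (not SBML/Modelica syntax).
    import numpy as np
    from scipy.integrate import solve_ivp

    # --- reaction-based: list reactions + rate laws, derive the ODEs automatically ---
    reactions = [
        {"reactants": {"S": 1}, "products": {"P": 1},
         "rate": lambda c, p: p["vmax"] * c["S"] / (p["km"] + c["S"])},
    ]

    def rhs_from_reactions(t, y, species, params):
        conc = dict(zip(species, y))
        dydt = {s: 0.0 for s in species}
        for rxn in reactions:
            v = rxn["rate"](conc, params)
            for s, n in rxn["reactants"].items():
                dydt[s] -= n * v
            for s, n in rxn["products"].items():
                dydt[s] += n * v
        return [dydt[s] for s in species]

    # --- equation-based: write the differential equations explicitly ---
    def rhs_equations(t, y, params):
        S, P = y
        v = params["vmax"] * S / (params["km"] + S)
        return [-v, v]

    params = {"vmax": 1.0, "km": 0.5}
    species = ["S", "P"]
    sol_a = solve_ivp(rhs_from_reactions, (0, 10), [2.0, 0.0], args=(species, params))
    sol_b = solve_ivp(rhs_equations, (0, 10), [2.0, 0.0], args=(params,))
    print(np.allclose(sol_a.y[:, -1], sol_b.y[:, -1], atol=1e-3))  # both formulations agree
    ```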

  18. Identification of fuel cycle simulator functionalities for analysis of transition to a new fuel cycle

    DOE PAGES

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.; ...

    2016-06-09

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the "must test" functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
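
    As an illustration of the kind of unit test proposed for the essential functions, the sketch below checks mass conservation across a notional separations facility; the separate() model, stream compositions and tolerance are invented and are not taken from any particular fuel cycle simulator.

    ```python
    # Illustrative unit test for one "essential function" of a fuel cycle simulator:
    # mass conservation across a facility. The toy separations model is invented.
    import unittest

    def separate(feed_kg, product_fraction):
        """Split a feed stream into product and waste streams (toy model)."""
        product = {k: v * product_fraction for k, v in feed_kg.items()}
        waste = {k: v * (1.0 - product_fraction) for k, v in feed_kg.items()}
        return product, waste

    class MassBalanceTest(unittest.TestCase):
        def test_mass_is_conserved(self):
            feed = {"U": 950.0, "Pu": 10.0, "FP": 40.0}   # kg, hypothetical
            product, waste = separate(feed, product_fraction=0.99)
            for nuclide in feed:
                self.assertAlmostEqual(feed[nuclide],
                                       product[nuclide] + waste[nuclide],
                                       places=6)

    if __name__ == "__main__":
        unittest.main()
    ```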

  19. Generalization of experimental data on heat transfer in permeable shells made of porous reticular materials

    NASA Astrophysics Data System (ADS)

    Polyakov, A. F.; Strat'ev, V. K.; Tret'yakov, A. F.; Shekhter, Yu. L.

    2010-06-01

    Heat transfer from six samples of porous reticular material to cooling gas (air) at small Reynolds numbers is experimentally studied. The specific features pertinent to heat transfer essentially affected by longitudinal heat conductivity along gas flow are analyzed. The experimental results are generalized in the form of dimensionless empirical relations.

  20. Identification of novel plant peroxisomal targeting signals by a combination of machine learning methods and in vivo subcellular targeting analyses.

    PubMed

    Lingner, Thomas; Kataya, Amr R; Antonicelli, Gerardo E; Benichou, Aline; Nilssen, Kjersti; Chen, Xiong-Yan; Siemsen, Tanja; Morgenstern, Burkhard; Meinicke, Peter; Reumann, Sigrun

    2011-04-01

    In the postgenomic era, accurate prediction tools are essential for identification of the proteomes of cell organelles. Prediction methods have been developed for peroxisome-targeted proteins in animals and fungi but are missing specifically for plants. For development of a predictor for plant proteins carrying peroxisome targeting signals type 1 (PTS1), we assembled more than 2500 homologous plant sequences, mainly from EST databases. We applied a discriminative machine learning approach to derive two different prediction methods, both of which showed high prediction accuracy and recognized specific targeting-enhancing patterns in the regions upstream of the PTS1 tripeptides. Upon application of these methods to the Arabidopsis thaliana genome, 392 gene models were predicted to be peroxisome targeted. These predictions were extensively tested in vivo, resulting in a high experimental verification rate of Arabidopsis proteins previously not known to be peroxisomal. The prediction methods were able to correctly infer novel PTS1 tripeptides, which even included novel residues. Twenty-three newly predicted PTS1 tripeptides were experimentally confirmed, and a high variability of the plant PTS1 motif was discovered. These prediction methods will be instrumental in identifying low-abundance and stress-inducible peroxisomal proteins and defining the entire peroxisomal proteome of Arabidopsis and agronomically important crop plants.
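
    A deliberately simplified sketch of a discriminative C-terminal sequence classifier is shown below; the one-hot window encoding, the linear SVM, and the toy training sequences are assumptions made for illustration and do not reproduce the authors' prediction methods.

    ```python
    # Toy discriminative classifier for C-terminal PTS1-like signals.
    # Encoding, model choice and training sequences are illustrative only.
    import numpy as np
    from sklearn.svm import SVC

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

    def encode_cterm(seq, window=12):
        """One-hot encode the last `window` residues of a protein sequence."""
        tail = seq[-window:].rjust(window, "X")          # pad short sequences
        vec = np.zeros(window * len(AMINO_ACIDS))
        for i, aa in enumerate(tail):
            if aa in AMINO_ACIDS:
                vec[i * len(AMINO_ACIDS) + AMINO_ACIDS.index(aa)] = 1.0
        return vec

    positives = ["MASLKDLTSKL", "MTTEEALRSKI", "MGGASSSARL"]   # end in PTS1-like tripeptides
    negatives = ["MASLKDLTGGA", "MTTEEALRPPD", "MGGASSSAEE"]

    X = np.array([encode_cterm(s) for s in positives + negatives])
    y = np.array([1] * len(positives) + [0] * len(negatives))

    clf = SVC(kernel="linear").fit(X, y)
    print(clf.predict([encode_cterm("MHHHHHHSSKL")]))        # likely classified as PTS1-like
    ```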

  1. Computational analysis of stochastic heterogeneity in PCR amplification efficiency revealed by single molecule barcoding

    PubMed Central

    Best, Katharine; Oakes, Theres; Heather, James M.; Shawe-Taylor, John; Chain, Benny

    2015-01-01

    The polymerase chain reaction (PCR) is one of the most widely used techniques in molecular biology. In combination with High Throughput Sequencing (HTS), PCR is widely used to quantify transcript abundance for RNA-seq, and in the context of analysis of T and B cell receptor repertoires. In this study, we combine DNA barcoding with HTS to quantify PCR output from individual target molecules. We develop computational tools that simulate both the PCR branching process itself, and the subsequent subsampling which typically occurs during HTS sequencing. We explore the influence of different types of heterogeneity on sequencing output, and compare them to experimental results where the efficiency of amplification is measured by barcodes uniquely identifying each molecule of starting template. Our results demonstrate that the PCR process introduces substantial amplification heterogeneity, independent of primer sequence and bulk experimental conditions. This heterogeneity can be attributed both to inherited differences between different template DNA molecules, and the inherent stochasticity of the PCR process. The results demonstrate that PCR heterogeneity arises even when reaction and substrate conditions are kept as constant as possible, and therefore single molecule barcoding is essential in order to derive reproducible quantitative results from any protocol combining PCR with HTS. PMID:26459131
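
    The two simulated stages described here can be caricatured in a few lines: per-molecule PCR amplification as a branching process with inherited efficiencies, followed by subsampling of the amplified pool. The efficiency distribution, cycle number and read depth below are placeholders, not the fitted values from the study.

    ```python
    # Toy simulation of PCR as a per-molecule branching process followed by
    # HTS-style subsampling. Efficiency distribution and sizes are placeholders.
    import numpy as np

    rng = np.random.default_rng(0)

    def amplify(n_molecules=1000, n_cycles=20, mean_eff=0.85, eff_sd=0.05):
        """Each starting molecule inherits its own efficiency; copies duplicate stochastically."""
        eff = np.clip(rng.normal(mean_eff, eff_sd, n_molecules), 0.0, 1.0)
        counts = np.ones(n_molecules, dtype=np.int64)
        for _ in range(n_cycles):
            counts += rng.binomial(counts, eff)   # each copy duplicates with prob = efficiency
        return counts

    def subsample(counts, n_reads=100_000):
        """Sample sequencing reads in proportion to the amplified copy numbers."""
        probs = counts / counts.sum()
        return rng.multinomial(n_reads, probs)

    copies = amplify()
    reads = subsample(copies)
    print("CV of copies after PCR:", copies.std() / copies.mean())
    print("CV of reads per barcode:", reads.std() / reads.mean())
    ```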

  2. Induction of regulatory T cells in Th1-/Th17-driven experimental autoimmune encephalomyelitis by zinc administration.

    PubMed

    Rosenkranz, Eva; Maywald, Martina; Hilgers, Ralf-Dieter; Brieger, Anne; Clarner, Tim; Kipp, Markus; Plümäkers, Birgit; Meyer, Sören; Schwerdtle, Tanja; Rink, Lothar

    2016-03-01

    The essential trace element zinc is indispensable for proper immune function, as zinc deficiency accompanies immune defects and dysregulations such as allergies, autoimmunity and an increased incidence of transplant rejection. This points to the importance of the physiological and dietary control of zinc levels for a functioning immune system. This study investigates the capacity of zinc to induce immune tolerance. The beneficial impact of physiological zinc supplementation of 6 μg/day (0.3 mg/kg body weight) or 30 μg/day (1.5 mg/kg body weight) on murine experimental autoimmune encephalomyelitis (EAE), an animal model for multiple sclerosis with a Th1/Th17 (Th, T helper) cell-dominated immunopathogenesis, was analyzed. Zinc administration diminished EAE scores in C57BL/6 mice in vivo (P<.05), reduced Th17 RORγT(+) cells (P<.05) and significantly increased inducible iTreg cells (P<.05). While Th17 cells decreased systemically, iTreg cells accumulated in the central nervous system. Cumulatively, zinc supplementation seems capable of inducing tolerance in unwanted immune reactions by increasing iTreg cells. This makes zinc a promising future tool for treating autoimmune diseases without suppressing the immune system. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Stereological analysis of bacterial load and lung lesions in nonhuman primates (rhesus macaques) experimentally infected with Mycobacterium tuberculosis.

    PubMed

    Luciw, Paul A; Oslund, Karen L; Yang, Xiao-Wei; Adamson, Lourdes; Ravindran, Resmi; Canfield, Don R; Tarara, Ross; Hirst, Linda; Christensen, Miles; Lerche, Nicholas W; Offenstein, Heather; Lewinsohn, David; Ventimiglia, Frank; Brignolo, Laurie; Wisner, Erik R; Hyde, Dallas M

    2011-11-01

    Infection with Mycobacterium tuberculosis primarily produces a multifocal distribution of pulmonary granulomas in which the pathogen resides. Accordingly, quantitative assessment of the bacterial load and pathology is a substantial challenge in tuberculosis. Such assessments are critical for studies of the pathogenesis and for the development of vaccines and drugs in animal models of experimental M. tuberculosis infection. Stereology enables unbiased quantitation of three-dimensional objects from two-dimensional sections and thus is suited to quantify histological lesions. We have developed a protocol for stereological analysis of the lung in rhesus macaques inoculated with a pathogenic clinical strain of M. tuberculosis (Erdman strain). These animals exhibit a pattern of infection and tuberculosis similar to that of naturally infected humans. Conditions were optimized for collecting lung samples in a nonbiased, random manner. Bacterial load in these samples was assessed by a standard plating assay, and granulomas were graded and enumerated microscopically. Stereological analysis provided quantitative data that supported a significant correlation between bacterial load and lung granulomas. Thus this stereological approach enables a quantitative, statistically valid analysis of the impact of M. tuberculosis infection in the lung and will serve as an essential tool for objectively comparing the efficacy of drugs and vaccines.

  4. Noninvasive imaging of experimental lung fibrosis.

    PubMed

    Zhou, Yong; Chen, Huaping; Ambalavanan, Namasivayam; Liu, Gang; Antony, Veena B; Ding, Qiang; Nath, Hrudaya; Eary, Janet F; Thannickal, Victor J

    2015-07-01

    Small animal models of lung fibrosis are essential for unraveling the molecular mechanisms underlying human fibrotic lung diseases; additionally, they are useful for preclinical testing of candidate antifibrotic agents. The current end-point measures of experimental lung fibrosis involve labor-intensive histological and biochemical analyses. These measures fail to account for dynamic changes in the disease process in individual animals and are limited by the need for large numbers of animals for longitudinal studies. The emergence of noninvasive imaging technologies provides exciting opportunities to image lung fibrosis in live animals as often as needed and to longitudinally track the efficacy of novel antifibrotic compounds. Data obtained by noninvasive imaging provide complementary information to histological and biochemical measurements. In addition, the use of noninvasive imaging in animal studies reduces animal usage, thus satisfying animal welfare concerns. In this article, we review these new imaging modalities with the potential for evaluation of lung fibrosis in small animal models. Such techniques include micro-computed tomography (micro-CT), magnetic resonance imaging, positron emission tomography (PET), single photon emission computed tomography (SPECT), and multimodal imaging systems including PET/CT and SPECT/CT. It is anticipated that noninvasive imaging will be increasingly used in animal models of fibrosis to gain insights into disease pathogenesis and as preclinical tools to assess drug efficacy.

  5. An integrative computational approach for prioritization of genomic variants

    DOE PAGES

    Dubchak, Inna; Balasubramanian, Sandhya; Wang, Sheng; ...

    2014-12-15

    An essential step in the discovery of molecular mechanisms contributing to disease phenotypes and efficient experimental planning is the development of weighted hypotheses that estimate the functional effects of sequence variants discovered by high-throughput genomics. With the increasing specialization of the bioinformatics resources, creating analytical workflows that seamlessly integrate data and bioinformatics tools developed by multiple groups becomes inevitable. Here we present a case study of a use of the distributed analytical environment integrating four complementary specialized resources, namely the Lynx platform, VISTA RViewer, the Developmental Brain Disorders Database (DBDB), and the RaptorX server, for the identification of high-confidence candidate genes contributing to pathogenesis of spina bifida. The analysis resulted in prediction and validation of deleterious mutations in the SLC19A placental transporter in mothers of the affected children that causes narrowing of the outlet channel and therefore leads to the reduced folate permeation rate. The described approach also enabled correct identification of several genes, previously shown to contribute to pathogenesis of spina bifida, and suggestion of additional genes for experimental validations. This study demonstrates that the seamless integration of bioinformatics resources enables fast and efficient prioritization and characterization of genomic factors and molecular networks contributing to the phenotypes of interest.

  6. Pediatric personalized CT-dosimetry Monte Carlo simulations, using computational phantoms

    NASA Astrophysics Data System (ADS)

    Papadimitroulas, P.; Kagadis, G. C.; Ploussi, A.; Kordolaimi, S.; Papamichail, D.; Karavasilis, E.; Syrgiamiotis, V.; Loudos, G.

    2015-09-01

    For the last 40 years, Monte Carlo (MC) simulations have served as a “gold standard” tool for a wide range of applications in the field of medical physics and tend to be essential in daily clinical practice. Regarding diagnostic imaging applications, such as computed tomography (CT), the assessment of deposited energy is of high interest, so as to better analyze the risks and the benefits of the procedure. In recent years a major effort has been directed towards personalized dosimetry, especially in pediatric applications. In the present study the GATE toolkit was used and computational pediatric phantoms were modeled for the assessment of CT examination dosimetry. The pediatric models used come from the XCAT and IT'IS series. The X-ray spectrum of a Brightspeed CT scanner was simulated and validated with experimental data. Specifically, a DCT-10 ionization chamber was irradiated twice using 120 kVp with 100 mAs and 200 mAs, for 1 s in one central axial slice (thickness = 10 mm). The absorbed dose was measured in air, resulting in differences lower than 4% between the experimental and simulated data. The simulations were acquired using ~10^10 primaries in order to achieve low statistical uncertainties. Dose maps were also saved for quantification of the absorbed dose in several critical organs of children during CT acquisition.
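
    A trivial sketch of the two quantities mentioned in the validation step, the experiment-versus-simulation discrepancy and the roughly Poisson scaling of statistical uncertainty with the number of primaries, is given below; all numbers, including the hits-per-primary factor, are placeholders rather than measured values.

    ```python
    # Placeholder check of simulation vs measurement and of statistical-uncertainty scaling.
    import math

    def percent_difference(measured, simulated):
        return 100.0 * abs(measured - simulated) / measured

    def relative_uncertainty(n_primaries, hits_per_primary=1e-3):
        """Poisson-like estimate: relative sigma ~ 1/sqrt(scored events)."""
        return 1.0 / math.sqrt(n_primaries * hits_per_primary)

    print(percent_difference(measured=12.5, simulated=12.1))   # e.g. 3.2 %, below a 4 % criterion
    print(relative_uncertainty(1e10))                          # ~3e-4 relative statistical uncertainty
    ```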

  7. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface, and at the same time, achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was on average, over 14 test problems, 30 times faster in PopED lite compared to an already existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss proposed design, test another design, etc). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
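
    The core computation described here, scoring a candidate design through a function of the Fisher Information Matrix, can be sketched compactly; the one-compartment oral PK model, forward-difference sensitivities and D-optimality (log-determinant) criterion below are generic illustrative choices, not PopED lite's internals.

    ```python
    # Illustrative D-optimality scoring of sampling-time designs for a one-compartment
    # oral PK model. Model, sensitivities and criterion are generic, not PopED lite code.
    import numpy as np

    def conc(t, params, dose=10.0):
        ka, ke, v = params
        return dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

    def fisher_information(times, params, sigma=0.1, h=1e-5):
        times = np.asarray(times, dtype=float)
        J = np.empty((len(times), len(params)))
        for j in range(len(params)):                   # forward-difference sensitivities
            p_hi = list(params)
            p_hi[j] += h
            J[:, j] = (conc(times, p_hi) - conc(times, params)) / h
        return J.T @ J / sigma**2

    def d_criterion(times, params):
        sign, logdet = np.linalg.slogdet(fisher_information(times, params))
        return logdet if sign > 0 else -np.inf

    params = (1.0, 0.2, 5.0)                           # ka, ke, V (hypothetical values)
    print(d_criterion([0.5, 1, 2, 8], params))         # design with early and late samples
    print(d_criterion([1, 2, 3, 4], params))           # usually less informative without a late sample
    ```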

  8. Regulation of the Embryonic Cell Cycle During Mammalian Preimplantation Development.

    PubMed

    Palmer, N; Kaldis, P

    2016-01-01

    The preimplantation development stage of mammalian embryogenesis consists of a series of highly conserved, regulated, and predictable cell divisions. This process is essential to allow the rapid expansion and differentiation of a single-cell zygote into a multicellular blastocyst containing cells of multiple developmental lineages. This period of development, also known as the germinal stage, encompasses several important developmental transitions, which are accompanied by dramatic changes in cell cycle profiles and dynamics. These changes are driven primarily by differences in the establishment and enforcement of cell cycle checkpoints, which must be bypassed to facilitate the completion of essential cell cycle events. Much of the current knowledge in this area has been amassed through the study of knockout models in mice. These mouse models are powerful experimental tools, which have allowed us to dissect the relative dependence of the early embryonic cell cycles on various aspects of the cell cycle machinery and highlight the extent of functional redundancy between members of the same gene family. This chapter will explore the ways in which the cell cycle machinery, their accessory proteins, and their stimuli operate during mammalian preimplantation using mouse models as a reference and how this allows for the usually well-defined stages of the cell cycle to be shaped and transformed during this unique and critical stage of development. © 2016 Elsevier Inc. All rights reserved.

  9. Locating Sites of Negations and Denegating "Negative Essentializing": Rereading Virginia Woolf's "A Room of One's Own"

    ERIC Educational Resources Information Center

    Adhikari, Manahari

    2014-01-01

    This essay examines how Virginia Woolf uses writing as a tool to locate sites of negations, such as women's exclusion from places of power and knowledge, and to expose negative essentializing that permeates patriarchal structure in "A Room of One's Own." Whereas scholarship on the book has explored a wide range of issues including sex,…

  10. Evaluation of a task-based community oriented teaching model in family medicine for undergraduate medical students in Iraq.

    PubMed

    Al-Dabbagh, Samim A; Al-Taee, Waleed G

    2005-08-22

    The inclusion of family medicine in medical school curricula is essential for producing competent general practitioners. The aim of this study is to evaluate a task-based, community oriented teaching model of family medicine for undergraduate students in Iraqi medical schools. An innovative training model in family medicine was developed based upon tasks regularly performed by family physicians providing health care services at the Primary Health Care Centre (PHCC) in Mosul, Iraq. Participants were medical students enrolled in their final clinical year. Students were assigned to one of two groups. The implementation group (28 students) was exposed to the experimental model and the control group (56 students) received the standard teaching curriculum. The study took place at the Mosul College of Medicine and at the Al-Hadba PHCC in Mosul, Iraq, during the academic year 1999-2000. Pre- and post-exposure evaluations comparing the intervention group with the control group were conducted using a variety of assessment tools. The primary endpoints were improvement in knowledge of family medicine and development of essential performance skills. Results showed that the implementation group experienced a significant increase in knowledge and performance skills after exposure to the model and in comparison with the control group. Assessment of the model by participating students revealed a high degree of satisfaction with the planning, organization, and implementation of the intervention activities. Students also highly rated the relevancy of the intervention for future work. A model on PHCC training in family medicine is essential for all Iraqi medical schools. The model is to be implemented by various relevant departments until Departments of Family medicine are established.

  11. SYRCLE’s risk of bias tool for animal studies

    PubMed Central

    2014-01-01

    Background Systematic Reviews (SRs) of experimental animal studies are not yet common practice, but awareness of the merits of conducting such SRs is steadily increasing. As animal intervention studies differ from randomized clinical trials (RCT) in many aspects, the methodology for SRs of clinical trials needs to be adapted and optimized for animal intervention studies. The Cochrane Collaboration developed a Risk of Bias (RoB) tool to establish consistency and avoid discrepancies in assessing the methodological quality of RCTs. A similar initiative is warranted in the field of animal experimentation. Methods We provide an RoB tool for animal intervention studies (SYRCLE’s RoB tool). This tool is based on the Cochrane RoB tool and has been adjusted for aspects of bias that play a specific role in animal intervention studies. To enhance transparency and applicability, we formulated signalling questions to facilitate judgment. Results The resulting RoB tool for animal studies contains 10 entries. These entries are related to selection bias, performance bias, detection bias, attrition bias, reporting bias and other biases. Half these items are in agreement with the items in the Cochrane RoB tool. Most of the variations between the two tools are due to differences in design between RCTs and animal studies. Shortcomings in, or unfamiliarity with, specific aspects of experimental design of animal studies compared to clinical studies also play a role. Conclusions SYRCLE’s RoB tool is an adapted version of the Cochrane RoB tool. Widespread adoption and implementation of this tool will facilitate and improve critical appraisal of evidence from animal studies. This may subsequently enhance the efficiency of translating animal research into clinical practice and increase awareness of the necessity of improving the methodological quality of animal studies. PMID:24667063

  12. First-Principles Design of Novel Catalytic and Chemoresponsive Materials

    NASA Astrophysics Data System (ADS)

    Roling, Luke T.

    An emerging trend in materials design is the use of computational chemistry tools to accelerate materials discovery and implementation. In particular, the parallel nature of computational models enables high-throughput screening approaches that would be laborious and time-consuming with experiments alone, and can be useful for identifying promising candidate materials for experimental synthesis and evaluation. Additionally, atomic-scale modeling allows researchers to obtain a detailed understanding of phenomena invisible to many current experimental techniques. In this thesis, we highlight mechanistic studies and successes in catalyst design for heterogeneous electrochemical reactions, discussing both anode and cathode chemistries. In particular, we evaluate the properties of a new class of Pd-Pt core-shell and hollow nanocatalysts toward the oxygen reduction reaction. We do not limit our study to electrochemical reactivity, but also consider these catalysts in a broader context by performing in-depth studies of their stability at elevated temperatures as well as investigating the mechanisms by which they are able to form. We also present fundamental surface science studies, investigating graphene formation and H2 dissociation, which are processes of both fundamental and practical interest in many catalytic applications. Finally, we extend our materials design paradigm outside the field of catalysis to develop and apply a model for the detection of small chemical analytes by chemoresponsive liquid crystals, and offer several predictions for improving the detection of small chemicals. A close connection between computation, synthesis, and experimental evaluation is essential to the work described herein, as computations are used to gain fundamental insight into experimental observations, and experiments and synthesis are in turn used to validate predictions of material activities from computational models.

  13. X-33 Experimental Aeroheating at Mach 6 Using Phosphor Thermography

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Berry, Scott A.; Hollis, Brian R.; Liechty, Derek S.; Hamilton, H. Harris, II; Merski, N. Ronald

    1999-01-01

    The goal of the NASA Reusable Launch Vehicle (RLV) technology program is to mature and demonstrate essential, cost effective technologies for next generation launch systems. The X-33 flight vehicle presently being developed by Lockheed Martin is an experimental Single Stage to Orbit (SSTO) demonstrator that seeks to validate critical technologies and ensure applicability to a full scale RLV. As with the design of any hypersonic vehicle, the aeroheating environment is an important issue and one of the key technologies being demonstrated on X-33 is an advanced metallic Thermal Protection System (TPS). As part of the development of this TPS, the X-33 aeroheating environment is being defined through conceptual analysis, ground based testing, and computational fluid dynamics. This report provides an overview of the hypersonic aeroheating wind tunnel program conducted at the NASA Langley Research Center in support of the ground based testing activities. Global surface heat transfer images, surface streamline patterns, and shock shapes were measured on 0.013 scale (10-in.) ceramic models of the proposed X-33 configuration in Mach 6 air. The test parametrics include angles of attack from -5 to 40 deg, unit Reynolds numbers from 1×10^6 to 8×10^6/ft, and body flap deflections of 0, 10, and 20 deg. Experimental and computational results indicate the presence of shock/shock interactions that produced localized heating on the deflected flaps and boundary layer transition on the canted fins. Comparisons of the experimental data to laminar and turbulent predictions were performed. Laminar windward heating data from the wind tunnel was extrapolated to flight surface temperatures and generally compared to within 50 deg F of flight prediction along the centerline. When coupled with the phosphor technique, this rapid extrapolation method would serve as an invaluable TPS design tool.
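
    One common way to perform this kind of rapid extrapolation is to assume a radiative-equilibrium wall, balancing the predicted heat flux against re-radiation; schematically (a generic relation offered for orientation, not necessarily the exact procedure of the report):

    \[
    q_w = \varepsilon\,\sigma\,T_w^{4}
    \quad\Longrightarrow\quad
    T_w = \left(\frac{q_w}{\varepsilon\,\sigma}\right)^{1/4},
    \]

    with ε the surface emissivity and σ the Stefan-Boltzmann constant.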

  14. MicroRNAs and complex diseases: from experimental results to computational models.

    PubMed

    Chen, Xing; Xie, Di; Zhao, Qi; You, Zhu-Hong

    2017-10-17

    MicroRNAs (miRNAs) have been discovered at a rapid pace in plants, green algae, viruses and animals. As one of the most important components of the cell, miRNAs play an increasingly important role in various essential biological processes. In recent decades, numerous experimental methods and computational models have been designed and implemented to identify novel miRNA-disease associations. In this review, the functions of miRNAs, miRNA-target interactions, miRNA-disease associations and some important publicly available miRNA-related databases are discussed in detail. Specifically, considering that an increasing number of miRNA-disease associations have been experimentally confirmed, we selected five important miRNA-related human diseases and five crucial disease-related miRNAs and provided corresponding introductions. Identifying disease-related miRNAs has become an important goal of biomedical research, which will accelerate the understanding of disease pathogenesis at the molecular level and the design of molecular tools for disease diagnosis, treatment and prevention. Computational models have become an important means for novel miRNA-disease association identification; they can select the most promising miRNA-disease pairs for experimental validation and significantly reduce the time and cost of the biological experiments. Here, we reviewed 20 state-of-the-art computational models of predicting miRNA-disease associations from different perspectives. Finally, we summarized four important factors underlying the difficulties of predicting potential disease-related miRNAs, the framework of constructing powerful computational models to predict potential miRNA-disease associations including five feasible and important research schemas, and future directions for further development of computational models. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
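
    As a bare-bones illustration of the similarity-based scoring idea shared by many of the reviewed models, the sketch below scores an unobserved miRNA-disease pair from a known association matrix and a miRNA-miRNA similarity matrix; the matrices, the score() weighting and the indices are invented and no specific published model is reproduced.

    ```python
    # Bare-bones similarity-weighted scoring of candidate miRNA-disease associations.
    # The association and similarity matrices below are invented toy data.
    import numpy as np

    # rows: miRNAs, cols: diseases; 1 = experimentally confirmed association
    A = np.array([[1, 0, 1],
                  [1, 1, 0],
                  [0, 0, 1]], dtype=float)

    # symmetric miRNA-miRNA functional similarity (toy values)
    S = np.array([[1.0, 0.8, 0.2],
                  [0.8, 1.0, 0.3],
                  [0.2, 0.3, 1.0]])

    def score(mirna, disease):
        """Weighted vote of similar miRNAs that are already linked to the disease."""
        weights = S[mirna].copy()
        weights[mirna] = 0.0                  # exclude the miRNA itself
        return weights @ A[:, disease] / weights.sum()

    print(score(2, 1))   # predicted strength of association between miRNA 2 and disease 1
    ```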

  15. Tufted capuchins (Cebus apella) attribute value to foods and tools during voluntary exchanges with humans.

    PubMed

    Westergaard, Gregory C; Liv, Chanya; Rocca, Andrea M; Cleveland, Allison; Suomi, Stephen J

    2004-01-01

    This research examined exchange and value attribution in tufted capuchin monkeys ( Cebus apella). We presented subjects with opportunities to obtain various foods and a tool from an experimenter in exchange for the foods or tool in the subjects' possession. The times elapsed before the first chow biscuits were expelled and/or an exchange took place were recorded as the dependent measures. Laboratory chow biscuits, grapes, apples, and a metal bolt (a tool used to probe for syrup) were used as experimental stimuli. The subjects demonstrated the ability to recognize that exchanges could occur when an experimenter was present with a desirable food. Results indicate that subjects exhibited significant variation in their willingness to barter based upon the types of foods that were both in their possession and presented by the experimenter. Subjects more readily traded chow biscuits for fruit, and more readily traded apples for grapes than grapes for apples. During the exchange of tools and food, the subjects preferred the following in descending order when the probing apparatus was baited with sweet syrup: grapes, metal bolts, and chow biscuits. However when the apparatus was not baited, the values changed to the following in descending order: grapes, chow, and metal bolts. These results indicate that tufted capuchins recognize opportunities to exchange and engage in a simple barter system whereby low-valued foods are readily traded for more highly valued food. Furthermore, these capuchins demonstrate that their value for a tool changes depending upon its utility.

  16. Is there a "net generation" in veterinary medicine? A comparative study on the use of the Internet and Web 2.0 by students and the veterinary profession.

    PubMed

    Tenhaven, Christoph; Tipold, Andrea; Fischer, Martin R; Ehlers, Jan P

    2013-01-01

    Informal and formal lifelong learning is essential at university and in the workplace. Apart from classical learning techniques, Web 2.0 tools can be used. It is controversial whether a so-called net generation exists amongst people under 30. The aim was to test the hypothesis that a net generation exists among students and young veterinarians. An online survey of students and veterinarians was conducted in the German-speaking countries and advertised via online media and traditional print media. A total of 1780 people took part in the survey. Students and veterinarians have different usage patterns regarding social networks (91.9% vs. 69%) and instant messaging (55.9% vs. 24.5%). All tools were predominantly used passively and in private, and to a lesser extent also professionally and for studying. The use of Web 2.0 tools is useful; however, teaching information and media skills, preparing codes of conduct for the internet and verifying user-generated content are essential.

  17. Classical Experiments Revisited: Smartphones and Tablet PCs as Experimental Tools in Acoustics and Optics

    ERIC Educational Resources Information Center

    Klein, P.; Hirth, M.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Smartphones and tablets are used as experimental tools and for quantitative measurements in two traditional laboratory experiments for undergraduate physics courses. The Doppler effect is analyzed and the speed of sound is determined with an accuracy of about 5% using ultrasonic frequency and two smartphones, which serve as rotating sound emitter…

  18. Lectindb: a plant lectin database.

    PubMed

    Chandra, Nagasuma R; Kumar, Nirmal; Jeyakani, Justin; Singh, Desh Deepak; Gowda, Sharan B; Prathima, M N

    2006-10-01

    Lectins, a class of carbohydrate-binding proteins, are now widely recognized to play a range of crucial roles in many cell-cell recognition events triggering several important cellular processes. They encompass different members that are diverse in their sequences, structures, binding site architectures, quaternary structures, carbohydrate affinities, and specificities as well as their larger biological roles and potential applications. It is not surprising, therefore, that the vast amount of experimental data on lectins available in the literature is so diverse that it becomes difficult and time consuming, if not impossible, to comprehend the advances in various areas and obtain the maximum benefit. To achieve an effective use of all the data toward understanding the function and their possible applications, an organization of these seemingly independent data into a common framework is essential. An integrated knowledge base (Lectindb, http://nscdb.bic.physics.iisc.ernet.in) together with appropriate analytical tools has therefore been developed, initially for plant lectins, by collating and integrating diverse data. The database has been implemented using MySQL on a Linux platform and web-enabled using PERL-CGI and Java tools. Data for each lectin pertain to taxonomic, biochemical, domain architecture, molecular sequence, and structural details as well as carbohydrate and hence blood group specificities. Extensive links have also been provided for relevant bioinformatics resources and analytical tools. Availability of diverse data integrated into a common framework is expected to be of high value not only for basic studies in lectin biology but also for pursuing several applications in biotechnology, immunology, and clinical practice, using these molecules.

  19. IMG-ABC. A knowledge base to fuel discovery of biosynthetic gene clusters and novel secondary metabolites

    DOE PAGES

    Hadjithomas, Michalis; Chen, I-Min Amy; Chu, Ken; ...

    2015-07-14

    In the discovery of secondary metabolites, analysis of sequence data is a promising exploration path that remains largely underutilized due to the lack of computational platforms that enable such a systematic approach on a large scale. In this work, we present IMG-ABC (https://img.jgi.doe.gov/abc), an atlas of biosynthetic gene clusters within the Integrated Microbial Genomes (IMG) system, which is aimed at harnessing the power of “big” genomic data for discovering small molecules. IMG-ABC relies on IMG’s comprehensive integrated structural and functional genomic data for the analysis of biosynthetic gene clusters (BCs) and associated secondary metabolites (SMs). SMs and BCs serve as the two main classes of objects in IMG-ABC, each with a rich collection of attributes. A unique feature of IMG-ABC is the incorporation of both experimentally validated and computationally predicted BCs in genomes as well as metagenomes, thus identifying BCs in uncultured populations and rare taxa. We demonstrate the strength of IMG-ABC’s focused integrated analysis tools in enabling the exploration of microbial secondary metabolism on a global scale, through the discovery of phenazine-producing clusters for the first time in Alphaproteobacteria. IMG-ABC strives to fill the long-existent void of resources for computational exploration of the secondary metabolism universe; its underlying scalable framework enables traversal of uncovered phylogenetic and chemical structure space, serving as a doorway to a new era in the discovery of novel molecules. IMG-ABC is the largest publicly available database of predicted and experimental biosynthetic gene clusters and the secondary metabolites they produce. The system also includes powerful search and analysis tools that are integrated with IMG’s extensive genomic/metagenomic data and analysis tool kits. As new research on biosynthetic gene clusters and secondary metabolites is published and more genomes are sequenced, IMG-ABC will continue to expand, with the goal of becoming an essential component of any bioinformatic exploration of the secondary metabolism world.

  20. SS-mPMG and SS-GA: tools for finding pathways and dynamic simulation of metabolic networks.

    PubMed

    Katsuragi, Tetsuo; Ono, Naoaki; Yasumoto, Keiichi; Altaf-Ul-Amin, Md; Hirai, Masami Y; Sriyudthsak, Kansuporn; Sawada, Yuji; Yamashita, Yui; Chiba, Yukako; Onouchi, Hitoshi; Fujiwara, Toru; Naito, Satoshi; Shiraishi, Fumihide; Kanaya, Shigehiko

    2013-05-01

    Metabolomics analysis tools can provide quantitative information on the concentration of metabolites in an organism. In this paper, we propose the minimum pathway model generator tool for simulating the dynamics of metabolite concentrations (SS-mPMG) and a tool for parameter estimation by genetic algorithm (SS-GA). SS-mPMG can extract a subsystem of the metabolic network from the genome-scale pathway maps to reduce the complexity of the simulation model and automatically construct a dynamic simulator to evaluate the experimentally observed behavior of metabolites. Using this tool, we show that stochastic simulation can reproduce experimentally observed dynamics of amino acid biosynthesis in Arabidopsis thaliana. In this simulation, SS-mPMG extracts the metabolic network subsystem from published databases. The parameters needed for the simulation are determined using a genetic algorithm to fit the simulation results to the experimental data. We expect that SS-mPMG and SS-GA will help researchers to create relevant metabolic networks and carry out simulations of metabolic reactions derived from metabolomics data.
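
    The parameter-estimation step carried out by SS-GA can be pictured with the minimal genetic algorithm below, which fits a single first-order rate constant to noisy synthetic data; the decay model, population settings and mutation scheme are simplified assumptions, not the tool's actual implementation.

    ```python
    # Minimal genetic algorithm fitting a first-order decay rate to noisy data,
    # as a cartoon of GA-based parameter estimation (not the SS-GA implementation).
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0, 10, 20)
    true_k = 0.4
    data = np.exp(-true_k * t) + rng.normal(0, 0.02, t.size)   # synthetic observations

    def fitness(k):
        return -np.sum((np.exp(-k * t) - data) ** 2)           # negative SSE (higher is better)

    pop = rng.uniform(0.01, 2.0, 40)                            # initial population of k values
    for generation in range(50):
        scores = np.array([fitness(k) for k in pop])
        parents = pop[np.argsort(scores)[-10:]]                 # keep the 10 fittest individuals
        children = rng.choice(parents, 30) + rng.normal(0, 0.05, 30)   # mutate copies of parents
        pop = np.concatenate([parents, np.clip(children, 1e-3, None)])

    best = pop[np.argmax([fitness(k) for k in pop])]
    print(f"estimated k = {best:.3f} (true value {true_k})")
    ```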

  1. Providing Written Feedback on Students' Mathematical Arguments: Proof Validations of Prospective Secondary Mathematics Teachers

    ERIC Educational Resources Information Center

    Bleiler, Sarah K.; Thompson, Denisse R.; Krajcevski, Milé

    2014-01-01

    Mathematics teachers play a unique role as experts who provide opportunities for students to engage in the practices of the mathematics community. Proof is a tool essential to the practice of mathematics, and therefore, if teachers are to provide adequate opportunities for students to engage with this tool, they must be able to validate student…

  2. Synergistic Role of Newer Techniques for Forensic and Postmortem CT Examinations.

    PubMed

    Blum, Alain; Kolopp, Martin; Teixeira, Pedro Gondim; Stroud, Tyler; Noirtin, Philippe; Coudane, Henry; Martrille, Laurent

    2018-04-30

    The aim of this article is to provide an overview of newer techniques and postprocessing tools that improve the potential impact of CT in forensic situations. CT has become a standard tool in medicolegal practice. Postmortem CT is an essential aid to the pathologist during autopsies. Advances in technology and software are constantly leading to advances in its performance.

  3. Modern Languages and Specific Learning Difficulties (SpLD): Implications of Teaching Adult Learners with Dyslexia in Distance Learning

    ERIC Educational Resources Information Center

    Gallardo, Matilde; Heiser, Sarah; Arias McLaughlin, Ximena

    2015-01-01

    In modern language (ML) distance learning programmes, teachers and students use online tools to facilitate, reinforce and support independent learning. This makes it essential for teachers to develop pedagogical expertise in using online communication tools to perform their role. Teachers frequently raise questions of how best to support the needs…

  4. Antibacterial and antioxidant activities of essential oils isolated from Thymbra capitata L. (Cav.) and Origanum vulgare L.

    PubMed

    Faleiro, Leonor; Miguel, Graça; Gomes, Sónia; Costa, Ludmila; Venâncio, Florencia; Teixeira, Adriano; Figueiredo, A Cristina; Barroso, José G; Pedro, Luis G

    2005-10-19

    Antilisterial activities of Thymbra capitata and Origanum vulgare essential oils were tested against 41 strains of Listeria monocytogenes. The oil of T. capitata was mainly constituted by one component, carvacrol (79%), whereas for O. vulgare three components constituted 70% of the oil, namely, thymol (33%), gamma-terpinene (26%), and p-cymene (11%). T. capitata essential oil had a significantly higher antilisterial activity in comparison to O. vulgare oil and chloramphenicol. No significant differences in L. monocytogenes susceptibilities to the essential oils tested were registered. The minimum inhibitory concentration values of T. capitata essential oil and of carvacrol were quite similar, ranging between 0.05 and 0.2 microL/mL. Antioxidant activity was also tested, the essential oil of T. capitata showing significantly higher antioxidant activity than that of O. vulgare. Use of T. capitata and O. vulgare essential oils can constitute a powerful tool in the control of L. monocytogenes in food and other industries.

  5. Summary and Findings of the ARL Dynamic Failure Forum

    DTIC Science & Technology

    2016-09-29

    short beam shear, quasi-static indentation, depth of penetration, and V50 limit velocity. Experimental technique suggestions for improvement included ... art in experimental, theoretical, and computational studies of dynamic failure. The forum also focused on identifying technologies and approaches ... Army-specific problems. Experimental exploration of material behavior and an improved ability to parameterize material models is essential to improving

  6. Experimental investigation of a transonic potential flow around a symmetric airfoil

    NASA Technical Reports Server (NTRS)

    Hiller, W. J.; Meier, G. E. A.

    1981-01-01

    Experimental flow investigations on smooth airfoils were performed using numerical solutions for transonic airfoil flow with a shockless supersonic region. The experimental flow reproduced essential sections of the theoretically computed frictionless solution. Agreement is better in the expansion part of the flow than in the compression part. The flow was nearly stationary over the entire velocity range investigated.

  7. Prediction of physical protein protein interactions

    NASA Astrophysics Data System (ADS)

    Szilágyi, András; Grimm, Vera; Arakaki, Adrián K.; Skolnick, Jeffrey

    2005-06-01

    Many essential cellular processes such as signal transduction, transport, cellular motion and most regulatory mechanisms are mediated by protein-protein interactions. In recent years, new experimental techniques have been developed to discover the protein-protein interaction networks of several organisms. However, the accuracy and coverage of these techniques have proven to be limited, and computational approaches remain essential both to assist in the design and validation of experimental studies and for the prediction of interaction partners and detailed structures of protein complexes. Here, we provide a critical overview of existing structure-independent and structure-based computational methods. Although these techniques have significantly advanced in the past few years, we find that most of them are still in their infancy. We also provide an overview of experimental techniques for the detection of protein-protein interactions. Although the developments are promising, false positive and false negative results are common, and reliable detection is possible only by taking a consensus of different experimental approaches. The shortcomings of experimental techniques affect both the further development and the fair evaluation of computational prediction methods. For an adequate comparative evaluation of prediction and high-throughput experimental methods, an appropriately large benchmark set of biophysically characterized protein complexes would be needed, but is sorely lacking.

  8. On an algorithmic definition for the components of the minimal cell.

    PubMed

    Martínez, Octavio; Reyes-Valdés, M Humberto

    2018-01-01

    Living cells are highly complex systems comprising a multitude of elements that are engaged in the many convoluted processes observed during the cell cycle. However, not all elements and processes are essential for cell survival and reproduction under steady-state environmental conditions. To distinguish essential from expendable cell components and thus define the 'minimal cell' and the corresponding 'minimal genome', we postulate that the synthesis of all cell elements can be represented as a finite set of binary operators, and within this framework we show that cell elements that depend on their previous existence to be synthesized are those that are essential for cell survival. An algorithm to distinguish essential cell elements is presented and demonstrated within an interactome. Data and functions implementing the algorithm are given as supporting information. We expect that this algorithmic approach will lead to the determination of the complete interactome of the minimal cell, which could then be experimentally validated. The assumptions behind this hypothesis as well as its consequences for experimental and theoretical biology are discussed.
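
    The stated criterion — an element is essential if its own synthesis ultimately requires its prior existence — can be phrased as a reachability test over synthesis rules. Below is a minimal Python sketch of that idea using a hypothetical rule table; it illustrates the criterion only and is not the authors' published algorithm or data.

        # Minimal sketch: flag cell elements whose synthesis depends, directly or
        # indirectly, on their own prior existence (the essentiality criterion
        # stated in the abstract). The toy rule table below is hypothetical.

        def requires(element, rules, seen=None):
            """Transitive closure of inputs needed to synthesize `element`."""
            if seen is None:
                seen = set()
            for precursor in rules.get(element, set()):
                if precursor not in seen:
                    seen.add(precursor)
                    requires(precursor, rules, seen)
            return seen

        def essential_elements(rules):
            """Elements that appear in the closure of their own synthesis requirements."""
            return {e for e in rules if e in requires(e, rules)}

        # Hypothetical synthesis rules: element -> inputs its synthesis operator needs.
        toy_rules = {
            "ribosome": {"rRNA", "r-protein"},
            "r-protein": {"ribosome", "mRNA"},
            "rRNA": {"polymerase"},
            "mRNA": {"polymerase"},
            "polymerase": {"ribosome", "mRNA"},
        }

        print(essential_elements(toy_rules))  # all five elements are flagged in this toy set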

  9. DNA AND PROTEIN RECOVERY FROM WASHED EXPERIMENTAL STONE TOOLS

    EPA Science Inventory

    DNA residues may preserve on ancient stone tools used to process animals. We studied 24 stone tools recovered from the Bugas-Holding site in northwestern Wyoming. Nine tools that yielded DNA included five bifaces, two side scrapers, one end scraper, and one utilized flake. The...

  10. Diverse Application of Magnetic Resonance Imaging for Mouse Phenotyping

    PubMed Central

    Wu, Yijen L.; Lo, Cecilia W.

    2017-01-01

    Small animal models, particularly mouse models, of human diseases are becoming an indispensable tool for biomedical research. Studies in animal models have provided important insights into the etiology of diseases and accelerated the development of therapeutic strategies. Detailed phenotypic characterization is essential, both for the development of such animal models and mechanistic studies into disease pathogenesis and testing the efficacy of experimental therapeutics. Magnetic Resonance Imaging (MRI) is a versatile and non-invasive imaging modality with excellent penetration depth, tissue coverage, and soft tissue contrast. MRI, being a multi-modal imaging modality, together with proven imaging protocols and availability of good contrast agents, is ideally suited for phenotyping mutant mouse models. Here we describe the applications of MRI for phenotyping structural birth defects involving the brain, heart, and kidney in mice. The versatility of MRI and its ease of use are well suited to meet the rapidly increasing demands for mouse phenotyping in the coming age of functional genomics. PMID:28544650

  11. Corrections to Newton’s law of gravitation - application to hybrid Bloch brane

    NASA Astrophysics Data System (ADS)

    Almeida, C. A. S.; Veras, D. F. S.; Dantas, D. M.

    2018-02-01

    In this work we present calculations of corrections to Newton’s law of gravitation due to Kaluza-Klein gravitons in five-dimensional warped thick braneworld scenarios. We consider here a recently proposed model, namely, the hybrid Bloch brane. This model couples two scalar fields to gravity and is engendered from a domain wall-like defect. Two other models, the so-called asymmetric hybrid brane and the compact brane, are also considered. Such models are deformations of the ϕ⁴ and sine-Gordon topological defects, respectively. We therefore consider the branes engendered by such defects and compute the corrections in their cases as well. In order to obtain the mass spectrum and its corresponding eigenfunctions, which are the essential quantities for computing the correction to the Newtonian potential, we develop a suitable numerical technique. The calculation of slight deviations in the gravitational potential may be used as a selection tool for braneworld scenarios, matching them with future experimental measurements in high-energy collisions.
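
    Schematically, and only as a generic reminder rather than the specific result of this paper, the massive Kaluza-Klein modes obtained from such a spectrum add Yukawa-suppressed terms to the Newtonian potential between two masses on the brane (in LaTeX notation):

        V(r) \simeq \frac{G\, m_1 m_2}{r}\Big[\, 1 + \sum_{n>0} |\psi_n(z_b)|^2\, e^{-m_n r} \Big],

    where m_n and \psi_n are the numerically computed masses and eigenfunctions evaluated at the brane position z_b (up to mode normalization); the size of this correction is what short-distance gravity tests or collider measurements would constrain.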

  12. The Heisenberg Microscope: A Powerful Instructional Tool for Promoting Meta-Cognitive and Meta-Scientific Thinking on Quantum Mechanics and the 'Nature of Science'

    NASA Astrophysics Data System (ADS)

    Hadzidaki, Pandora

    2008-06-01

    In this paper, we present a multi-dimensional study concerning Heisenberg's 'gamma ray microscope', a thought experiment which is intimately connected with the historical development of quantum mechanics (QM), and also with the most disputed interpretations of quantum theory. In this study: (a) we investigate how philosophers of science read and explicate the function of thought experimentation in physical science; (b) in the light of relevant philosophical theories, we examine the complicated epistemological questions raised by the 'gamma-ray microscope' during the birth-process of QM and the contribution of this thought experiment to the clarification of the physical meaning of Heisenberg's indeterminacy relations; (c) on the basis of the preceding analysis, we propose an instructional intervention, which aims at leading learners not only to an essential understanding of the QM worldview, but to a deep insight into the Nature of Science as well.

  13. Optical cell separation from three-dimensional environment in photodegradable hydrogels for pure culture techniques.

    PubMed

    Tamura, Masato; Yanagawa, Fumiki; Sugiura, Shinji; Takagi, Toshiyuki; Sumaru, Kimio; Matsui, Hirofumi; Kanamori, Toshiyuki

    2014-05-07

    Cell sorting is an essential and efficient experimental tool for the isolation and characterization of target cells. A three-dimensional environment is crucial in determining cell behavior and cell fate in biological analysis. Herein, we have applied photodegradable hydrogels to optical cell separation from a 3D environment using a computer-controlled light irradiation system. The hydrogel is composed of photocleavable tetra-arm polyethylene glycol and gelatin, with the crosslinker-to-gelatin composition adjusted to optimize cytocompatibility. Local light irradiation could degrade the hydrogel according to a micropattern image designed on a laptop; the minimum resolution of photodegradation was estimated at 20 µm. Light irradiation separated an encapsulated fluorescent microbead without any contamination from neighboring beads, even at multiple targets. After selective separation of target cells from the hydrogels, the separated cells grew on another dish, resulting in a pure culture. Cell encapsulation, light irradiation and the degradation products exhibited negligible cytotoxicity throughout the overall process.

  14. Calculation of electromagnetic force in electromagnetic forming process of metal sheet

    NASA Astrophysics Data System (ADS)

    Xu, Da; Liu, Xuesong; Fang, Kun; Fang, Hongyuan

    2010-06-01

    Electromagnetic forming (EMF) is a forming process that relies on the inductive electromagnetic force to deform a metallic workpiece at high speed. Calculation of the electromagnetic force is essential to understanding the EMF process. However, accurate calculation requires a complex numerical solution, in which the coupling between the electromagnetic process and the deformation of the workpiece needs to be considered. In this paper, an appropriate formula has been developed to calculate the electromagnetic force in the metal workpiece in the sheet EMF process. The effects of the geometric size of the coil, the material properties, and the parameters of the discharge circuit on the electromagnetic force are taken into consideration. Through the formula, the electromagnetic force at different times and at different positions in the workpiece can be predicted. The calculated electromagnetic force and magnetic field are in good agreement with the numerical and experimental results. The accurate prediction of the electromagnetic force provides insight into the physical process of EMF and a powerful tool to design optimum EMF systems.

  15. Exploring Short Linear Motifs Using the ELM Database and Tools.

    PubMed

    Gouw, Marc; Sámano-Sánchez, Hugo; Van Roey, Kim; Diella, Francesca; Gibson, Toby J; Dinkel, Holger

    2017-06-27

    The Eukaryotic Linear Motif (ELM) resource is dedicated to the characterization and prediction of short linear motifs (SLiMs). SLiMs are compact, degenerate peptide segments found in many proteins and essential to almost all cellular processes. However, despite their abundance, SLiMs remain largely uncharacterized. The ELM database is a collection of manually annotated SLiM instances curated from experimental literature. In this article we illustrate how to browse and search the database for curated SLiM data, and cover the different types of data integrated in the resource. We also cover how to use this resource in order to predict SLiMs in known as well as novel proteins, and how to interpret the results generated by the ELM prediction pipeline. The ELM database is a very rich resource, and in the following protocols we give helpful examples to demonstrate how this knowledge can be used to improve your own research. © 2017 by John Wiley & Sons, Inc.
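
    Because ELM motif classes are expressed as regular expressions, a quick first-pass scan of a candidate sequence can be done locally before consulting the web resource. The sketch below is purely illustrative: the pattern and sequence are invented placeholders, not curated ELM classes or real annotations.

        # Minimal sketch: scan a protein sequence for a short-linear-motif-style
        # regular expression. Pattern and sequence are hypothetical placeholders.
        import re

        def scan_motif(sequence, pattern):
            """Yield (start, end, match) in 1-based coordinates for each motif hit."""
            for m in re.finditer(pattern, sequence):
                yield m.start() + 1, m.end(), m.group()

        toy_sequence = "MSQRRTSLDEVLSPTAQKRRNSIDGLFQELPETSYRRASVPGA"
        toy_pattern = r"RR.[ST]"  # illustrative basophilic phosphosite-like motif

        for start, end, hit in scan_motif(toy_sequence, toy_pattern):
            print(f"{hit} at {start}-{end}")  # RRTS at 4-7, RRNS at 19-22, RRAS at 36-39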

  16. Dissemination and implementation of evidence based, mental health interventions in post conflict, low resource settings

    PubMed Central

    Tol, Wietse; Jordans, Mark; Zangana, Goran Sabir; Amin, Ahmed Mohammed; Bolton, Paul; Bass, Judith; Bonilla-Escobar, Fransisco Javier; Thornicroft, Graham

    2014-01-01

    The burden of mental health problems in (post)conflict low- and middle-income countries (LMIC) is substantial. Despite growing evidence for the effectiveness of selected mental health programs in conflict-affected LMIC and growing policy support, actual uptake and implementation have been slow. A key direction for future research, and a new frontier within science and practice, is Dissemination and Implementation (DI) which directly addresses the movement of evidence-based, effective health care approaches from experimental settings into routine use. This paper outlines some key implementation challenges, and strategies to address these, while implementing evidence-based treatments in conflict-affected LMIC based on the authors’ collective experiences. Dissemination and implementation evaluation and research in conflict settings is an essential new research direction. Future DI work in LMIC should include: 1) defining concepts and developing measurement tools, 2) the measurement of DI outcomes for all programming, and 3) the systematic evaluation of specific implementation strategies. PMID:28316559

  17. Convection and chemistry effects in CVD: A 3-D analysis for silicon deposition

    NASA Technical Reports Server (NTRS)

    Gokoglu, S. A.; Kuczmarski, M. A.; Tsui, P.; Chait, A.

    1989-01-01

    The computational fluid dynamics code FLUENT has been adopted to simulate the entire rectangular-channel-like (3-D) geometry of an experimental CVD reactor designed for Si deposition. The code incorporated the effects of both homogeneous (gas phase) and heterogeneous (surface) chemistry with finite reaction rates of important species existing in silane dissociation. The experiments were designed to elucidate the effects of gravitationally-induced buoyancy-driven convection flows on the quality of the grown Si films. This goal is accomplished by contrasting the results obtained from a carrier gas mixture of H2/Ar with the ones obtained from the same molar mixture ratio of H2/He, without any accompanying change in the chemistry. Computationally, these cases are simulated in the terrestrial gravitational field and in the absence of gravity. The numerical results compare favorably with experiments. Powerful computational tools provide invaluable insights into the complex physicochemical phenomena taking place in CVD reactors. Such information is essential for the improved design and optimization of future CVD reactors.

  18. Optimization of a novel improver gel formulation for Barbari flat bread using response surface methodology.

    PubMed

    Pourfarzad, Amir; Haddad Khodaparast, Mohammad Hossein; Karimi, Mehdi; Mortazavi, Seyed Ali

    2014-10-01

    Nowadays, the use of bread improvers has become an essential part of improving the production methods and quality of bakery products. In the present study, Response Surface Methodology (RSM) was used to determine the optimum improver gel formulation giving the best quality, shelf life, sensory and image properties for Barbari flat bread. Sodium stearoyl-2-lactylate (SSL), diacetyl tartaric acid esters of monoglyceride (DATEM) and propylene glycol (PG) were the gel constituents considered in this study. A second-order polynomial model was fitted to each response and the regression coefficients were determined using the least squares method. The optimum gel formulation was found to be 0.49 % SSL, 0.36 % DATEM and 0.5 % PG when the desirability function method was applied. There was good agreement between the experimental data and their predicted counterparts. The results showed that RSM, image processing and texture analysis are useful tools to investigate, approximate and predict a large number of bread properties.
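
    As a minimal sketch of the modelling step described above, the snippet below fits a second-order (quadratic) response surface in three factors by ordinary least squares; the factor levels and response values are random placeholders, not the study's measurements, and the desirability-based optimization step is not shown.

        # Minimal sketch: fit y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)
        # by ordinary least squares. Data below are toy placeholders.
        import numpy as np

        def quadratic_design(X):
            """Expand an (n, 3) factor matrix into the 10-term quadratic design matrix."""
            x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
            return np.column_stack([
                np.ones(len(X)),        # intercept
                x1, x2, x3,             # linear terms
                x1**2, x2**2, x3**2,    # quadratic terms
                x1*x2, x1*x3, x2*x3,    # two-factor interactions
            ])

        rng = np.random.default_rng(0)
        X = rng.uniform(0.0, 0.5, size=(20, 3))                  # SSL, DATEM, PG levels (toy)
        y = 5 + 2*X[:, 0] + X[:, 1] - 3*X[:, 0]**2 + rng.normal(0, 0.05, 20)

        coef, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
        print(np.round(coef, 2))  # fitted regression coefficients

    In the study itself, several such fitted responses would then be combined through a desirability function to locate the optimum gel formulation.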

  19. Cavity-based architecture to preserve quantum coherence and entanglement

    NASA Astrophysics Data System (ADS)

    Man, Zhong-Xiao; Xia, Yun-Jie; Lo Franco, Rosario

    2015-09-01

    Quantum technology relies on the utilization of resources, like quantum coherence and entanglement, which allow quantum information and computation processing. This achievement is however jeopardized by the detrimental effects of the environment surrounding any quantum system, so that finding strategies to protect quantum resources is essential. Non-Markovian and structured environments are useful tools to this aim. Here we show how a simple environmental architecture made of two coupled lossy cavities enables a switch between Markovian and non-Markovian regimes for the dynamics of a qubit embedded in one of the cavities. Furthermore, qubit coherence can be indefinitely preserved if the cavity without the qubit is perfect. We then focus on entanglement control of two independent qubits locally subject to such an engineered environment and discuss its feasibility in the framework of circuit quantum electrodynamics. With up-to-date experimental parameters, we show that our architecture allows entanglement lifetimes orders of magnitude longer than the spontaneous lifetime without local cavity couplings. This cavity-based architecture is straightforwardly extendable to many qubits for scalability.

  20. Polarization and amplitude probes in Hanle effect EIT noise spectroscopy of a buffer gas cell

    NASA Astrophysics Data System (ADS)

    O'Leary, Shannon; Zheng, Aojie; Crescimanno, Michael

    2015-05-01

    Noise correlation spectroscopy on systems manifesting Electromagnetically Induced Transparency (EIT) holds promise as a simple, robust method for performing high-resolution spectroscopy used in applications such as EIT-based atomic magnetometry and clocks. While this noise conversion can diminish the precision of EIT applications, noise correlation techniques transform the noise into a useful spectroscopic tool that can improve the application's precision. We study intensity noise, originating from the large phase noise of a semiconductor diode laser's light, in Rb vapor EIT in the Hanle configuration. We report here on our recent experimental work on, and complementary theoretical modeling of, the effects of light polarization preparation and post-selection on the correlation spectrum and on the independent noise channel traces. We also explain our methodology and recent results for delineating the effects of residual laser amplitude fluctuations on the correlation noise resonance as compared to other contributing processes. Understanding these subtleties is essential for optimizing EIT-noise applications.
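
    A minimal sketch of the basic quantity such noise-spectroscopy measurements rely on — the normalized cross-correlation of intensity fluctuations recorded on two detection channels — is shown below, with synthetic traces standing in for detector records.

        # Minimal sketch: normalized cross-correlation of intensity-noise records
        # from two detection channels. The traces are synthetic stand-ins.
        import numpy as np

        def noise_correlation(i1, i2):
            """Pearson correlation of the fluctuations of two intensity records."""
            d1, d2 = i1 - i1.mean(), i2 - i2.mean()
            return np.sum(d1 * d2) / np.sqrt(np.sum(d1**2) * np.sum(d2**2))

        rng = np.random.default_rng(1)
        common = rng.normal(size=10_000)                 # shared laser-noise component
        ch1 = common + 0.3 * rng.normal(size=10_000)     # channel 1 plus independent noise
        ch2 = -common + 0.3 * rng.normal(size=10_000)    # anticorrelated channel 2

        print(round(noise_correlation(ch1, ch2), 3))     # strongly negative for these toy traces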

  1. Cavity-based architecture to preserve quantum coherence and entanglement.

    PubMed

    Man, Zhong-Xiao; Xia, Yun-Jie; Lo Franco, Rosario

    2015-09-09

    Quantum technology relies on the utilization of resources, like quantum coherence and entanglement, which allow quantum information and computation processing. This achievement is however jeopardized by the detrimental effects of the environment surrounding any quantum system, so that finding strategies to protect quantum resources is essential. Non-Markovian and structured environments are useful tools to this aim. Here we show how a simple environmental architecture made of two coupled lossy cavities enables a switch between Markovian and non-Markovian regimes for the dynamics of a qubit embedded in one of the cavities. Furthermore, qubit coherence can be indefinitely preserved if the cavity without the qubit is perfect. We then focus on entanglement control of two independent qubits locally subject to such an engineered environment and discuss its feasibility in the framework of circuit quantum electrodynamics. With up-to-date experimental parameters, we show that our architecture allows entanglement lifetimes orders of magnitude longer than the spontaneous lifetime without local cavity couplings. This cavity-based architecture is straightforwardly extendable to many qubits for scalability.

  2. Cavity-based architecture to preserve quantum coherence and entanglement

    PubMed Central

    Man, Zhong-Xiao; Xia, Yun-Jie; Lo Franco, Rosario

    2015-01-01

    Quantum technology relies on the utilization of resources, like quantum coherence and entanglement, which allow quantum information and computation processing. This achievement is however jeopardized by the detrimental effects of the environment surrounding any quantum system, so that finding strategies to protect quantum resources is essential. Non-Markovian and structured environments are useful tools to this aim. Here we show how a simple environmental architecture made of two coupled lossy cavities enables a switch between Markovian and non-Markovian regimes for the dynamics of a qubit embedded in one of the cavities. Furthermore, qubit coherence can be indefinitely preserved if the cavity without the qubit is perfect. We then focus on entanglement control of two independent qubits locally subject to such an engineered environment and discuss its feasibility in the framework of circuit quantum electrodynamics. With up-to-date experimental parameters, we show that our architecture allows entanglement lifetimes orders of magnitude longer than the spontaneous lifetime without local cavity couplings. This cavity-based architecture is straightforwardly extendable to many qubits for scalability. PMID:26351004

  3. The '3Is' of animal experimentation.

    PubMed

    2012-05-29

    Animal experimentation in scientific research is a good thing: important, increasing and often irreplaceable. Careful experimental design and reporting are at least as important as attention to welfare in ensuring that the knowledge we gain justifies using live animals as experimental tools.

  4. The reliability of a modified Kalamazoo Consensus Statement Checklist for assessing the communication skills of multidisciplinary clinicians in the simulated environment.

    PubMed

    Peterson, Eleanor B; Calhoun, Aaron W; Rider, Elizabeth A

    2014-09-01

    With increased recognition of the importance of sound communication skills and communication skills education, reliable assessment tools are essential. This study reports on the psychometric properties of an assessment tool based on the Kalamazoo Consensus Statement Essential Elements Communication Checklist. The Gap-Kalamazoo Communication Skills Assessment Form (GKCSAF), a modified version of an existing communication skills assessment tool, the Kalamazoo Essential Elements Communication Checklist-Adapted, was used to assess learners in a multidisciplinary, simulation-based communication skills educational program using multiple raters. 118 simulated conversations were available for analysis. Internal consistency and inter-rater reliability were determined by calculating a Cronbach's alpha score and intra-class correlation coefficients (ICC), respectively. The GKCSAF demonstrated high internal consistency with a Cronbach's alpha score of 0.844 (faculty raters) and 0.880 (peer observer raters), and high inter-rater reliability with an ICC of 0.830 (faculty raters) and 0.89 (peer observer raters). The Gap-Kalamazoo Communication Skills Assessment Form is a reliable method of assessing the communication skills of multidisciplinary learners using multi-rater methods within the learning environment. The Gap-Kalamazoo Communication Skills Assessment Form can be used by educational programs that wish to implement a reliable assessment and feedback system for a variety of learners. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
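
    For reference, the two reliability statistics reported above can be computed as sketched below; the ratings matrix is a fabricated placeholder, and the ICC shown is the one-way random-effects ICC(1,1), which is only one of several ICC variants and not necessarily the exact form used by the authors.

        # Minimal sketch: Cronbach's alpha (internal consistency across items) and
        # a one-way random-effects ICC(1,1) (agreement across raters).
        # Rows = encounters, columns = items or raters; the data are toy values.
        import numpy as np

        def cronbach_alpha(scores):
            """scores: (n_subjects, k_items) array on a common scale."""
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        def icc_1_1(ratings):
            """ratings: (n_subjects, k_raters); one-way random-effects ICC(1,1)."""
            n, k = ratings.shape
            subject_means = ratings.mean(axis=1)
            grand_mean = ratings.mean()
            msb = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)       # between subjects
            msw = np.sum((ratings - subject_means[:, None]) ** 2) / (n * (k - 1))  # within subjects
            return (msb - msw) / (msb + (k - 1) * msw)

        toy = np.array([[4, 5, 4], [3, 3, 2], [5, 5, 5], [2, 3, 2], [4, 4, 5]], float)
        print(round(cronbach_alpha(toy), 2), round(icc_1_1(toy), 2))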

  5. Genome build information is an essential part of genomic track files.

    PubMed

    Kanduri, Chakravarthi; Domanska, Diana; Hovig, Eivind; Sandve, Geir Kjetil

    2017-09-14

    Genomic locations are represented as coordinates on a specific genome build version, but the build information is frequently missing when coordinates are provided. We show that this information is essential to correctly interpret and analyse the genomic intervals contained in genomic track files. Although not a substitute for best practices, we also provide a tool to predict the genome build version of genomic track files.

  6. P-TRAP: a Panicle TRAit Phenotyping tool.

    PubMed

    A L-Tam, Faroq; Adam, Helene; Anjos, António dos; Lorieux, Mathias; Larmande, Pierre; Ghesquière, Alain; Jouannic, Stefan; Shahbazkia, Hamid Reza

    2013-08-29

    In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used with different platforms (the user-friendly Graphical User Interface (GUI) uses Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful, practical and collects much more data than human operators. P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared to field operator, expert verification and well-known academic methods.

  7. P-TRAP: a Panicle Trait Phenotyping tool

    PubMed Central

    2013-01-01

    Background In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. Results This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used with different platforms (the user-friendly Graphical User Interface (GUI) uses Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful, practical and collects much more data than human operators. Conclusions P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared to field operator, expert verification and well-known academic methods. PMID:23987653

  8. Involvement of Cot/Tpl2 in bone loss during periodontitis.

    PubMed

    Ohnishi, T; Okamoto, A; Kakimoto, K; Bandow, K; Chiba, N; Matsuguchi, T

    2010-02-01

    Periodontitis causes resorption of alveolar bone, in which RANKL induces osteoclastogenesis. The binding of lipopolysaccharide to Toll-like receptors causes phosphorylation of Cot/Tpl2 to activate the MAPK cascade. Previous in vitro studies showed that Cot/Tpl2 was essential for the induction of RANKL expression by lipopolysaccharide. In this study, we examined whether Cot/Tpl2 deficiency reduced the progression of alveolar bone loss and osteoclastogenesis during experimental periodontitis. We found that the extent of alveolar bone loss and osteoclastogenesis induced by ligature-induced periodontitis was decreased in Cot/Tpl2-deficient mice. In addition, reduction of RANKL expression was observed in periodontal tissues of Cot/Tpl2-deficient mice with experimental periodontitis. Furthermore, we found that Cot/Tpl2 was involved in the induction of TNF-alpha mRNA expression in gingiva of mice with experimental periodontitis. Our observations suggested that Cot/Tpl2 is essential for the progression of alveolar bone loss and osteoclastogenesis in periodontal tissue during experimental periodontitis mediated through increased RANKL expression.

  9. Risk scoring models for predicting peri-operative morbidity and mortality in people with fragility hip fractures: Qualitative systematic review.

    PubMed

    Marufu, Takawira C; Mannings, Alexa; Moppett, Iain K

    2015-12-01

    Accurate peri-operative risk prediction is an essential element of clinical practice. Various risk stratification tools for assessing patients' risk of mortality or morbidity have been developed and applied in clinical practice over the years. This review aims to outline essential characteristics (predictive accuracy, objectivity, clinical utility) of currently available risk scoring tools for hip fracture patients. We searched eight databases (AMED, CINAHL, ClinicalTrials.gov, Cochrane, DARE, EMBASE, MEDLINE and Web of Science) for all relevant studies published until April 2015. We included published English-language observational studies that considered the predictive accuracy of risk stratification tools for patients with fragility hip fracture. After removal of duplicates, 15,620 studies were screened. Twenty-nine papers met the inclusion criteria, evaluating 25 risk stratification tools. Risk stratification tools considered in more than two studies were the ASA, CCI, E-PASS, NHFS and O-POSSUM. All tools were moderately accurate and validated in multiple studies; however, there are some limitations to consider. The E-PASS and O-POSSUM are comprehensive but complex, and require intraoperative data, making them a challenge for use at the patient bedside. The ASA, CCI and NHFS are simple, easy and inexpensive, using routinely available preoperative data. In contrast to the ASA and CCI, which have subjective variables in addition to other limitations, the NHFS variables are all objective. In the search for a simple, inexpensive, easy to calculate, objective and accurate tool, the NHFS may be the most appropriate of the currently available scores for hip fracture patients. However, more studies need to be undertaken before it becomes a national hip fracture risk stratification or audit tool of choice. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Monitoring the ability to deliver care in low- and middle-income countries: a systematic review of health facility assessment tools

    PubMed Central

    Nickerson, Jason W; Adams, Orvill; Attaran, Amir; Hatcher-Roberts, Janet; Tugwell, Peter

    2015-01-01

    Introduction Health facilities assessments are an essential instrument for health system strengthening in low- and middle-income countries. These assessments are used to conduct health facility censuses to assess the capacity of the health system to deliver health care and to identify gaps in the coverage of health services. Despite the valuable role of these assessments, there are currently no minimum standards or frameworks for these tools. Methods We used a structured keyword search of the MEDLINE, EMBASE and HealthStar databases and searched the websites of the World Health Organization, the World Bank and the International Health Facilities Assessment Network to locate all available health facilities assessment tools intended for use in low- and middle-income countries. We parsed the various assessment tools to identify similarities between them, which we catalogued into a framework comprising 41 assessment domains. Results We identified 10 health facility assessment tools meeting our inclusion criteria, all of which were included in our analysis. We found substantial variation in the comprehensiveness of the included tools, with the assessments containing indicators in 13 to 33 (median: 25.5) of the 41 assessment domains included in our framework. None of the tools collected data on all 41 of the assessment domains we identified. Conclusions Not only do a large number of health facility assessment tools exist, but the data they collect and methods they employ are very different. This certainly limits the comparability of the data between different countries’ health systems and probably creates blind spots that impede efforts to strengthen those systems. Agreement is needed on the essential elements of health facility assessments to guide the development of specific indicators and for refining existing instruments. PMID:24895350

  11. Estimation of the influence of tool wear on force signals: A finite element approach in AISI 1045 orthogonal cutting

    NASA Astrophysics Data System (ADS)

    Equeter, Lucas; Ducobu, François; Rivière-Lorphèvre, Edouard; Abouridouane, Mustapha; Klocke, Fritz; Dehombreux, Pierre

    2018-05-01

    Industrial concerns arise regarding the significant cost of cutting tools in the machining process. In particular, an improper replacement policy can lead either to scrap, or to early tool replacements that waste still-serviceable tools. ISO 3685 provides the flank wear end-of-life criterion. Flank wear is also the nominal type of wear giving the longest tool lifetimes under optimal cutting conditions. Its consequences include poor surface roughness and dimensional discrepancies. In order to aid the replacement decision process, several tool condition monitoring techniques have been suggested. Force signals were shown in the literature to be strongly linked with tool flank wear. It can therefore be assumed that force signals are highly relevant for monitoring the condition of cutting tools and providing decision-aid information in the framework of their maintenance and replacement. The objective of this work is to correlate tool flank wear with numerically computed force signals. The present work uses a Finite Element Model with a Coupled Eulerian-Lagrangian approach. The geometry of the tool is changed for different runs of the model, in order to obtain results that are specific to a certain level of wear. The model is assessed by comparison with experimental data gathered earlier on fresh tools. Using the model at constant cutting parameters, force signals under different tool wear states are computed, providing a force signal for each studied tool geometry. These signals are qualitatively compared with relevant data from the literature. At this point, no quantitative comparison could be performed on worn tools because the reviewed literature failed to provide similar studies in this material, either numerical or experimental. Therefore, further development of this work should include experimental campaigns aimed at collecting cutting force signals and assessing the numerical results achieved through this work.

  12. Beneficial effect of Mentha suaveolens essential oil in the treatment of vaginal candidiasis assessed by real-time monitoring of infection

    PubMed Central

    2011-01-01

    Background Vaginal candidiasis is a frequent and common distressing disease affecting up to 75% of the women of fertile age; most of these women have recurrent episodes. Essential oils from aromatic plants have been shown to have antimicrobial and antifungal activities. This study was aimed at assessing the anti-fungal activity of essential oil from Mentha suaveolens (EOMS) in an experimental infection of vaginal candidiasis. Methods The in vitro and in vivo activity of EOMS was assessed. The in vitro activity was evaluated under standard CLSI methods, and the in vivo analysis was carried out by exploiting a novel, non-invasive model of vaginal candidiasis in mice based on an in vivo imaging technique. Differences between essential oil treated and saline treated mice were evaluated by the non-parametric Mann-Whitney U-test. Viable count data from a time kill assay and yeast and hyphae survival test were compared using the Student's t-test (two-tailed). Results Our main findings were: i) EOMS shows potent candidastatic and candidacidal activity in an in vitro experimental system; ii) EOMS gives a degree of protection against vaginal candidiasis in an in vivo experimental system. Conclusions This study shows for the first time that the essential oil of a Moroccan plant Mentha suaveolens is candidastatic and candidacidal in vitro, and has a degree of anticandidal activity in a model of vaginal infection, as demonstrated in an in vivo monitoring imaging system. We conclude that our findings lay the ground for further, more extensive investigations to identify the active EOMS component(s), promising in the therapeutically problematic setting of chronic vaginal candidiasis in humans. PMID:21356078

  13. The essential value of long-term experimental data for hydrology and water management

    NASA Astrophysics Data System (ADS)

    Tetzlaff, D.; Carey, S. K.; McNamara, J. P.; Laudon, H.; Soulsby, C.

    2017-12-01

    Observations and data from long-term experimental watersheds are the foundation of hydrology as a geoscience. They allow us to benchmark process understanding, observe trends and natural cycles, and are pre-requisites for testing predictive models. Long-term experimental watersheds also are places where new measurement technologies are developed. These studies offer a crucial evidence base for understanding and managing the provision of clean water supplies; predicting and mitigating the effects of floods, and protecting ecosystem services provided by rivers and wetlands. They also show how to manage land and water in an integrated, sustainable way that reduces environmental and economic costs. We present a number of compelling examples illustrating how hydrologic process understanding has been generated through comparing hypotheses to data, and how this understanding has been essential for managing water supplies, floods, and ecosystem services today.

  14. Nonpharmacologic control of essential hypertension in man: a critical review of the experimental literature.

    PubMed

    Frumkin, K; Nathan, R J; Prout, M F; Cohen, M C

    1978-06-01

    Many nonpharmacologic (behavioral) techniques are being proposed for the therapy of essential hypertension. The research in this area is reviewed and divided roughly into two categories: the biofeedback and relaxation methodologies. While feedback can be used to lower pressures during laboratory training sessions, studies designed to alter basal blood pressure levels with biofeedback have not yet been reported. The absence of evidence for such changes through biofeedback limits the usefulness of this technique in hypertension control. The various relaxation methods, such as yoga, transcendental meditation, progressive muscle relaxation, and others have shown more promise. With varying degrees of experimental rigor, many of these techniques have been associated with long-lasting changes in blood pressure. The strengths and weaknesses of the various authors' research designs, data and conclusions are discussed, and suggestions for further experimentation are offered.

  15. The use of experimental structures to model protein dynamics.

    PubMed

    Katebi, Ataur R; Sankar, Kannan; Jia, Kejue; Jernigan, Robert L

    2015-01-01

    The number of solved protein structures submitted in the Protein Data Bank (PDB) has increased dramatically in recent years. For some specific proteins, this number is very high-for example, there are over 550 solved structures for HIV-1 protease, one protein that is essential for the life cycle of human immunodeficiency virus (HIV) which causes acquired immunodeficiency syndrome (AIDS) in humans. The large number of structures for the same protein and its variants includes a sample of different conformational states of the protein. A rich set of structures solved experimentally for the same protein has information buried within the dataset that can explain the functional dynamics and structural mechanism of the protein. To extract the dynamics information and functional mechanism from the experimental structures, this chapter focuses on two methods-Principal Component Analysis (PCA) and Elastic Network Models (ENM). PCA is a widely used statistical dimensionality reduction technique to classify and visualize high-dimensional data. On the other hand, ENMs are well-established, simple biophysical methods for modeling the functionally important global motions of proteins. This chapter covers the basics of these two methods. Moreover, an improved ENM version that utilizes the variations found within a given set of structures for a protein is described. As a practical example, we have extracted the functional dynamics and mechanism of the HIV-1 protease dimeric structure by using a set of 329 PDB structures of this protein. We have described, step by step, how to select a set of protein structures, how to extract the needed information from the PDB files for PCA, how to extract the dynamics information using PCA, how to calculate ENM modes, how to measure the congruency between the dynamics computed from the principal components (PCs) and the ENM modes, and how to compute entropies using the PCs. We provide the computer programs or references to software tools to accomplish each step and show how to use these programs and tools. We also include computer programs to generate movies based on PCs and ENM modes and describe how to visualize them.
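
    A minimal sketch of the PCA step described in the chapter is given below, applied to an ensemble of already superposed coordinate sets; the random ensemble stands in for aligned Cα coordinates extracted from PDB files, and the ENM part of the workflow is not shown.

        # Minimal sketch: PCA of an ensemble of superposed structures.
        # `ensemble` has shape (n_structures, n_atoms, 3); random toy data here.
        import numpy as np

        def structure_pca(ensemble):
            """Eigenvalues and principal components of the coordinate covariance."""
            n_struct = ensemble.shape[0]
            flat = ensemble.reshape(n_struct, -1)          # (n_structures, 3*n_atoms)
            centered = flat - flat.mean(axis=0)
            cov = centered.T @ centered / (n_struct - 1)
            evals, evecs = np.linalg.eigh(cov)             # ascending eigenvalues
            order = np.argsort(evals)[::-1]                # sort descending
            return evals[order], evecs[:, order]

        rng = np.random.default_rng(2)
        ensemble = rng.normal(size=(50, 100, 3))           # 50 structures, 100 atoms (toy)
        evals, pcs = structure_pca(ensemble)
        print(np.round(evals[:3] / evals.sum(), 3))        # variance fraction of the top 3 PCs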

  16. The Use of Experimental Structures to Model Protein Dynamics

    PubMed Central

    Katebi, Ataur R.; Sankar, Kannan; Jia, Kejue; Jernigan, Robert L.

    2014-01-01

    Summary The number of solved protein structures submitted in the Protein Data Bank (PDB) has increased dramatically in recent years. For some specific proteins, this number is very high – for example, there are over 550 solved structures for HIV-1 protease, one protein that is essential for the life cycle of human immunodeficiency virus (HIV) which causes acquired immunodeficiency syndrome (AIDS) in humans. The large number of structures for the same protein and its variants include a sample of different conformational states of the protein. A rich set of structures solved experimentally for the same protein has information buried within the dataset that can explain the functional dynamics and structural mechanism of the protein. To extract the dynamics information and functional mechanism from the experimental structures, this chapter focuses on two methods – Principal Component Analysis (PCA) and Elastic Network Models (ENM). PCA is a widely used statistical dimensionality reduction technique to classify and visualize high-dimensional data. On the other hand, ENMs are well-established simple biophysical method for modeling the functionally important global motions of proteins. This chapter covers the basics of these two. Moreover, an improved ENM version that utilizes the variations found within a given set of structures for a protein is described. As a practical example, we have extracted the functional dynamics and mechanism of HIV-1 protease dimeric structure by using a set of 329 PDB structures of this protein. We have described, step by step, how to select a set of protein structures, how to extract the needed information from the PDB files for PCA, how to extract the dynamics information using PCA, how to calculate ENM modes, how to measure the congruency between the dynamics computed from the principal components (PCs) and the ENM modes, and how to compute entropies using the PCs. We provide the computer programs or references to software tools to accomplish each step and show how to use these programs and tools. We also include computer programs to generate movies based on PCs and ENM modes and describe how to visualize them. PMID:25330965

  17. Influence of speed on wear and cutting forces in end-milling nickel alloy

    NASA Astrophysics Data System (ADS)

    Estrems, M.; Sánchez, H. T.; Kurfess, T.; Bunget, C.

    2012-04-01

    The effect of speed on the flank wear of the cutting tool when a nickel alloy is milled is studied. From the analysis of the measured forces, a dynamic semi-experimental model is developed, based on the parallelism between the curve of the thrust forces of the unworn tool and the curves obtained when the flank of the tool is worn. Based on the change in the contact geometry at the worn flank face, a theory of indentation of the tool on the workpiece is formulated in such a way that, upon applying equations of contact mechanics, a good approximation of the experimental results is obtained.
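
    As a minimal sketch of the kind of indentation-type correction the abstract describes, the snippet below adds to the thrust force of a fresh tool a term proportional to the worn flank contact area under an assumed constant contact pressure; the linear form and all numbers are illustrative assumptions, not the authors' fitted model.

        # Minimal sketch: wear-land "indentation" contribution to thrust force,
        # assuming a constant contact pressure acting on the worn flank area
        # (VB x engaged width). All numbers are illustrative placeholders.

        def thrust_force_worn(f_fresh_n, vb_mm, width_mm, contact_pressure_mpa=1500.0):
            """Thrust force [N] of a worn tool = fresh-tool force + flank contact term."""
            wear_land_area_mm2 = vb_mm * width_mm
            return f_fresh_n + contact_pressure_mpa * wear_land_area_mm2  # MPa * mm^2 = N

        print(thrust_force_worn(180.0, vb_mm=0.3, width_mm=2.0))  # 180 N + 900 N ploughing term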

  18. From action to language: comparative perspectives on primate tool use, gesture and the evolution of human language

    PubMed Central

    Steele, James; Ferrari, Pier Francesco; Fogassi, Leonardo

    2012-01-01

    The papers in this Special Issue examine tool use and manual gestures in primates as a window on the evolution of the human capacity for language. Neurophysiological research has supported the hypothesis of a close association between some aspects of human action organization and of language representation, in both phonology and semantics. Tool use provides an excellent experimental context to investigate analogies between action organization and linguistic syntax. Contributors report and contextualize experimental evidence from monkeys, great apes, humans and fossil hominins, and consider the nature and the extent of overlaps between the neural representations of tool use, manual gestures and linguistic processes. PMID:22106422

  19. Seat pressure measurement technologies: considerations for their evaluation.

    PubMed

    Gyi, D E; Porter, J M; Robertson, N K

    1998-04-01

    Interface pressure measurement has generated interest in the automotive industry as a technique which could be used in the prediction of driver discomfort for various car seat designs, and provide designers and manufacturers with rapid information early on in the design process. It is therefore essential that the data obtained are of the highest quality, relevant and have some quantitative meaning. Exploratory experimental work carried out with the commercially available Talley Pressure Monitor is outlined. This led to a better understanding of the strengths and weaknesses of this system and the re-design of the sensor matrix. Such evaluation, in the context of the actual experimental environment, is considered essential.

  20. Cadmium Handling, Toxicity and Molecular Targets Involved during Pregnancy: Lessons from Experimental Models.

    PubMed

    Jacobo-Estrada, Tania; Santoyo-Sánchez, Mitzi; Thévenod, Frank; Barbier, Olivier

    2017-07-22

    Even decades after the discovery of Cadmium (Cd) toxicity, research on this heavy metal is still a hot topic in scientific literature: as we wrote this review, more than 1440 scientific articles had been published and listed by the PubMed.gov website during 2017. Cadmium is one of the most common and harmful heavy metals present in our environment. Since pregnancy is a very particular physiological condition that could impact and modify essential pathways involved in the handling of Cd, the prenatal life is a critical stage for exposure to this non-essential element. To give the reader an overview of the possible mechanisms involved in the multiple organ toxic effects in fetuses after the exposure to Cd during pregnancy, we decided to compile some of the most relevant experimental studies performed in experimental models and to summarize the advances in this field such as the Cd distribution and the factors that could alter it (diet, binding-proteins and membrane transporters), the Cd-induced toxicity in dams (preeclampsia, fertility, kidney injury, alteration in essential element homeostasis and bone mineralization), in placenta and in fetus (teratogenicity, central nervous system, liver and kidney).

  1. Cadmium Handling, Toxicity and Molecular Targets Involved during Pregnancy: Lessons from Experimental Models

    PubMed Central

    Santoyo-Sánchez, Mitzi; Thévenod, Frank; Barbier, Olivier

    2017-01-01

    Even decades after the discovery of Cadmium (Cd) toxicity, research on this heavy metal is still a hot topic in scientific literature: as we wrote this review, more than 1440 scientific articles had been published and listed by the PubMed.gov website during 2017. Cadmium is one of the most common and harmful heavy metals present in our environment. Since pregnancy is a very particular physiological condition that could impact and modify essential pathways involved in the handling of Cd, the prenatal life is a critical stage for exposure to this non-essential element. To give the reader an overview of the possible mechanisms involved in the multiple organ toxic effects in fetuses after the exposure to Cd during pregnancy, we decided to compile some of the most relevant experimental studies performed in experimental models and to summarize the advances in this field such as the Cd distribution and the factors that could alter it (diet, binding-proteins and membrane transporters), the Cd-induced toxicity in dams (preeclampsia, fertility, kidney injury, alteration in essential element homeostasis and bone mineralization), in placenta and in fetus (teratogenicity, central nervous system, liver and kidney). PMID:28737682

  2. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  3. Comparative Reliability of Structured Versus Unstructured Interviews in the Admission Process of a Residency Program

    PubMed Central

    Blouin, Danielle; Day, Andrew G.; Pavlov, Andrey

    2011-01-01

    Background Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. Methods In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Results Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. Conclusions A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains established as essential to succeed in this residency program. PMID:23205201

  4. Comparative reliability of structured versus unstructured interviews in the admission process of a residency program.

    PubMed

    Blouin, Danielle; Day, Andrew G; Pavlov, Andrey

    2011-12-01

    Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains established as essential to succeed in this residency program.

  5. Study of combustion experiments in space

    NASA Technical Reports Server (NTRS)

    Berlad, A. L.; Huggett, C.; Kaufman, F.; Markstein, G. H.; Palmer, H. B.; Yang, C. H.

    1974-01-01

    The physical bases and scientific merits of combustion experimentation in a space environment were examined. For a very broad range of fundamental combustion problems, extensive and systematic experimentation at reduced gravitational levels (0 ≤ g ≤ 1) is viewed as essential to the development of needed observations and related theoretical understanding.

  6. Learning Experimentation through Science Fairs

    ERIC Educational Resources Information Center

    Paul, Jürgen; Lederman, Norman G.; Groß, Jorge

    2016-01-01

    Experiments are essential for both doing science and learning science. The aim of the German youth science fair, "Jugend forscht," is to encourage scientific thinking and inquiry methods such as experimentation. Based on 57 interviews with participants of the competition, this study summarises students' conceptions and steps of learning…

  7. Machine Learning to Discover and Optimize Materials

    NASA Astrophysics Data System (ADS)

    Rosenbrock, Conrad Waldhar

    For centuries, scientists have dreamed of creating materials by design. Rather than discovery by accident, bespoke materials could be tailored to fulfill specific technological needs. Quantum theory and computational methods are essentially equal to the task, and computational power is the new bottleneck. Machine learning has the potential to solve that problem by approximating material behavior at multiple length scales. A full end-to-end solution must allow us to approximate the quantum mechanics, microstructure and engineering tasks well enough to be predictive in the real world. In this dissertation, I present algorithms and methodology to address some of these problems at various length scales. In the realm of enumeration, systems with many degrees of freedom such as high-entropy alloys may contain prohibitively many unique possibilities so that enumerating all of them would exhaust available compute memory. One possible way to address this problem is to know in advance how many possibilities there are so that the user can reduce their search space by restricting the occupation of certain lattice sites. Although tools to calculate this number were available, none performed well for very large systems and none could easily be integrated into low-level languages for use in existing scientific codes. I present an algorithm to solve these problems. Testing the robustness of machine-learned models is an essential component in any materials discovery or optimization application. While it is customary to perform a small number of system-specific tests to validate an approach, this may be insufficient in many cases. In particular, for Cluster Expansion models, the expansion may not converge quickly enough to be useful and reliable. Although the method has been used for decades, a rigorous investigation across many systems to determine when CE "breaks" was still lacking. This dissertation includes this investigation along with heuristics that use only a small training database to predict whether a model is worth pursuing in detail. To be useful, computational materials discovery must lead to experimental validation. However, experiments are difficult due to sample purity, environmental effects and a host of other considerations. In many cases, it is difficult to connect theory to experiment because computation is deterministic. By combining advanced group theory with machine learning, we created a new tool that bridges the gap between experiment and theory so that experimental and computed phase diagrams can be harmonized. Grain boundaries in real materials control many important material properties such as corrosion, thermal conductivity, and creep. Because of their high dimensionality, learning the underlying physics to optimizing grain boundaries is extremely complex. By leveraging a mathematically rigorous representation for local atomic environments, machine learning becomes a powerful tool to approximate properties for grain boundaries. But it also goes beyond predicting properties by highlighting those atomic environments that are most important for influencing the boundary properties. This provides an immense dimensionality reduction that empowers grain boundary scientists to know where to look for deeper physical insights.
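
    As an illustration of the enumeration-counting problem mentioned above — knowing in advance how many symmetry-distinct occupations exist — Burnside's lemma averages, over the symmetry group, the number of colourings each permutation leaves fixed. The 4-site ring and its cyclic group below are a toy stand-in for a real lattice and its full set of symmetry permutations; this is the underlying counting idea only, not the dissertation's algorithm.

        # Minimal sketch: count symmetry-distinct occupations of lattice sites with
        # k species via Burnside's lemma: average k**(number of cycles) over the
        # site permutations of the symmetry group. Toy 4-site ring with C4 symmetry.

        def cycle_count(perm):
            """Number of cycles in a permutation given as a tuple of images."""
            seen, cycles = set(), 0
            for start in range(len(perm)):
                if start not in seen:
                    cycles += 1
                    j = start
                    while j not in seen:
                        seen.add(j)
                        j = perm[j]
            return cycles

        def distinct_occupations(perms, k):
            """Burnside count of k-colourings of the sites, up to the given symmetries."""
            return sum(k ** cycle_count(p) for p in perms) // len(perms)

        c4 = [(0, 1, 2, 3), (1, 2, 3, 0), (2, 3, 0, 1), (3, 0, 1, 2)]  # rotations of a 4-site ring
        print(distinct_occupations(c4, k=2))  # 6 distinct binary occupations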

  8. Critical dosimetry measures and surrogate tools that can facilitate clinical success in PDT (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Pogue, Brian W.; Davis, Scott C.; Kanick, Stephen C.; Maytin, Edward V.; Pereira, Stephen P.; Palanisami, Akilan; Hasan, Tayyaba

    2016-03-01

    Photodynamic therapy can be a highly complex treatment with more than one parameter to control, or in some cases it is easily implemented with little control other than prescribed drug and light values. Measured dosimetry has not contributed to clinical adoption as much as it could have, and part of this may stem from the conflicting goals of advocating for as many measurements as possible for accurate control, versus companies and clinical adopters advocating for as few measurements as possible, to keep it simple. An organized approach to dosimetry selection is required, which shifts from mechanistic measurements in pre-clinical and early phase I trials, towards just those essential dose-limiting measurements and a focus on possible surrogate measures in phase II/III trials. This essential and surrogate approach to dosimetry should help the successful adoption of clinical PDT. Examples of essential dosimetry points and surrogate dosimetry tools which might be implemented in phase II and higher trials are discussed for solid tissue PDT with verteporfin and skin lesion treatment with aminolevulinic acid.

  9. Numerical Implementation of Indicators and Statistical Control Tools in Monitoring and Evaluating CACEI-ISO Indicators of Study Program in Industrial Process by Systematization

    ERIC Educational Resources Information Center

    Ayala, Gabriela Cota; Real, Francia Angélica Karlos; Ivan, Ramirez Alvarado Edqar

    2016-01-01

    The research was conducted to determine whether the industrial processes study program of the Technological University of Chihuahua, one year after it was certified by CACEI, continues to achieve the established indicators and ISO 9001:2008 requirements. Quality tools are implemented, monitoring of essential indicators is determined, flow charts are…

  10. The Use of Individual Growth and Developmental Indicators for Progress Monitoring and Intervention Decision Making in Early Education

    ERIC Educational Resources Information Center

    Walker, Dale; Carta, Judith J.; Greenwood, Charles R.; Buzhardt, Joseph F.

    2008-01-01

    Progress monitoring tools have been shown to be essential elements in current approaches to intervention problem-solving models. Such tools have been valuable not only in marking individual children's level of performance relative to peers but also in measuring change in skill level in a way that can be attributed to intervention and development.…

  11. Tools to Analyze Morphology and Spatially Mapped Molecular Data | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    This project is to develop, deploy, and disseminate a suite of open source tools and an integrated informatics platform that will facilitate multi-scale, correlative analyses of high resolution whole slide tissue image data and spatially mapped genetic and molecular data for cancer research. This platform will play an essential role in supporting studies of tumor initiation, development, heterogeneity, invasion, and metastasis.

  12. Benchmarking: contexts and details matter.

    PubMed

    Zheng, Siyuan

    2017-07-05

    Benchmarking is an essential step in the development of computational tools. We take this opportunity to pitch in our opinions on tool benchmarking, in light of two correspondence articles published in Genome Biology. Please see the related Li et al. and Newman et al. correspondence articles: www.dx.doi.org/10.1186/s13059-017-1256-5 and www.dx.doi.org/10.1186/s13059-017-1257-4.

  13. Modeling the survival of Salmonella on slice cooked ham as a function of apple skin polyphenols, acetic acid, oregano essential oil and carvacrol

    USDA-ARS?s Scientific Manuscript database

    Response surface methodology was applied to investigate the combined effect of apple skin polyphenols (ASP), acetic acid (AA), oregano essential oil (O) and carvacrol (C) on the inactivation of Salmonella on sliced cooked ham. A full factorial experimental design was employed with control variables ...

  14. Novel tool wear monitoring method in milling difficult-to-machine materials using cutting chip formation

    NASA Astrophysics Data System (ADS)

    Zhang, P. P.; Guo, Y.; Wang, B.

    2017-05-01

    The main problems in milling difficult-to-machine materials are the high cutting temperature and rapid tool wear. However, it is impossible to measure tool wear directly during machining. Tool wear and cutting chip formation are two of the most important indicators of machining efficiency and quality. The purpose of this paper is to develop a model relating tool wear to cutting chip formation (width of chip and radian of chip) for difficult-to-machine materials, so that tool wear can be monitored through chip formation. A milling experiment on a machining centre with three sets of cutting parameters was performed to obtain chip formation and tool wear. The experimental results show that tool wear increases gradually as cutting proceeds, while the width and radian of the chip decrease. The model is developed by fitting the experimental data and applying formula transformations. Most of the tool wear values monitored through chip formation have errors of less than 10%, with the smallest error being 0.2%. Overall, errors based on the radian of the chip are lower than those based on the width of the chip. This offers a new way to monitor and detect tool wear through cutting chip formation in milling difficult-to-machine materials.
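
    As a hedged illustration of the kind of empirical fit described above, the sketch below regresses tool wear on chip width and chip radian with ordinary least squares and reports the percentage monitoring error. The variable names and the synthetic measurements are placeholders, not the authors' data or their published formula.

      import numpy as np

      # Hypothetical measurements: chip width (mm), chip radian (rad), flank wear (mm).
      chip_width  = np.array([1.20, 1.10, 1.00, 0.92, 0.85, 0.80])
      chip_radian = np.array([2.60, 2.40, 2.20, 2.05, 1.90, 1.80])
      tool_wear   = np.array([0.05, 0.09, 0.14, 0.18, 0.23, 0.27])

      # Linear model wear = a*width + b*radian + c, fitted by least squares.
      X = np.column_stack([chip_width, chip_radian, np.ones_like(chip_width)])
      coeffs, *_ = np.linalg.lstsq(X, tool_wear, rcond=None)

      predicted = X @ coeffs
      errors = np.abs(predicted - tool_wear) / tool_wear * 100  # percent error
      print("coefficients:", coeffs)
      print("max monitoring error: %.1f%%" % errors.max())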

  15. NNDC Databases

    Science.gov Websites

    …radiation. It includes an interactive chart of nuclides and a level plotting tool. XUNDL (Experimental Unevaluated Nuclear Data List): experimental nuclear structure and decay data, covering more than 2,500 recent… CSISRS (alias EXFOR): nuclear reaction experimental data…

  16. Biology Needs Evolutionary Software Tools: Let’s Build Them Right

    PubMed Central

    Team, Galaxy; Goecks, Jeremy; Taylor, James

    2018-01-01

    Abstract Research in population genetics and evolutionary biology has always provided a computational backbone for the life sciences as a whole. Today, evolutionary and population biology reasoning is essential for the interpretation of the large, complex datasets that are characteristic of all domains of today’s life sciences, ranging from cancer biology to microbial ecology. This situation makes the algorithms and software tools developed by our community more important than ever before. This means that we, the developers of software tools for molecular evolutionary analyses, now have a shared responsibility to make these tools accessible using modern technological developments, as well as to provide adequate documentation and training. PMID:29688462

  17. Blockade of Neuronal α7-nAChR by α-Conotoxin ImI Explained by Computational Scanning and Energy Calculations

    PubMed Central

    Yu, Rilei; Craik, David J.; Kaas, Quentin

    2011-01-01

    α-Conotoxins potently inhibit isoforms of nicotinic acetylcholine receptors (nAChRs), which are essential for neuronal and neuromuscular transmission. They are also used as neurochemical tools to study nAChR physiology and are being evaluated as drug leads to treat various neuronal disorders. A number of experimental studies have been performed to investigate the structure-activity relationships of conotoxin/nAChR complexes. However, the structural determinants of their binding interactions are still ambiguous in the absence of experimental structures of conotoxin-receptor complexes. In this study, the binding modes of α-conotoxin ImI to the α7-nAChR, currently the best-studied system experimentally, were investigated using comparative modeling and molecular dynamics simulations. The structures of more than 30 single point mutants of either the conotoxin or the receptor were modeled and analyzed. The models were used to explain qualitatively the change of affinities measured experimentally, including some nAChR positions located outside the binding site. Mutational energies were calculated using different methods that combine a conformational refinement procedure (minimization with a distance dependent dielectric constant or explicit water, or molecular dynamics using five restraint strategies) and a binding energy function (MM-GB/SA or MM-PB/SA). The protocol using explicit water energy minimization and MM-GB/SA gave the best correlations with experimental binding affinities, with an R2 value of 0.74. The van der Waals and non-polar desolvation components were found to be the main driving force for binding of the conotoxin to the nAChR. The electrostatic component was responsible for the selectivity of the various ImI mutants. Overall, this study provides novel insights into the binding mechanism of α-conotoxins to nAChRs and the methodological developments reported here open avenues for computational scanning studies of a rapidly expanding range of wild-type and chemically modified α-conotoxins. PMID:21390272

  18. TokenPasser: A petri net specification tool. Thesis

    NASA Technical Reports Server (NTRS)

    Mittmann, Michael

    1991-01-01

    In computer program design it is essential to know the effectiveness of different design options in improving performance and dependability. This paper provides a description of a CAD tool for distributed hierarchical Petri nets. After a brief review of Petri nets, Petri net languages, and Petri net transducers, and descriptions of several current Petri net tools, the specifications and design of the TokenPasser tool are presented. TokenPasser is a tool that allows the design of distributed hierarchical systems based on Petri nets. A case study for an intelligent robotic system is conducted: a coordination structure with one dispatcher controlling three coordinators is built to model a proposed robotic assembly system. The system is implemented using TokenPasser, and the results are analyzed to allow judgment of the tool.

  19. Modeling and Tool Wear in Routing of CFRP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.

    2011-01-17

    This paper presents the prediction and evaluation of feed force in routing of carbon composite material. In order to extend tool life and improve the quality of the machined surface, a better understanding of uncoated and coated tool behaviors is required. This work describes (1) the optimization of the geometry of multiple-teeth tools to minimize tool wear and feed force, (2) the optimization of tool coating and (3) the development of a phenomenological model relating the feed force, the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple-teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve the tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on the abrasive behavior of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.

  20. Drug-Like Protein–Protein Interaction Modulators: Challenges and Opportunities for Drug Discovery and Chemical Biology

    PubMed Central

    Villoutreix, Bruno O; Kuenemann, Melaine A; Poyet, Jean-Luc; Bruzzoni-Giovanelli, Heriberto; Labbé, Céline; Lagorce, David; Sperandio, Olivier; Miteva, Maria A

    2014-01-01

    Fundamental processes in living cells are largely controlled by macromolecular interactions and among them, protein–protein interactions (PPIs) have a critical role, while their dysregulation can contribute to the pathogenesis of numerous diseases. Although PPIs were already considered attractive pharmaceutical targets some years ago, they have thus far been largely unexploited for therapeutic interventions with low molecular weight compounds. Several limiting factors, from technological hurdles to conceptual barriers, are known, which, taken together, explain why research in this area has been relatively slow. However, over the last decade, the scientific community has challenged the dogma and become more enthusiastic about the modulation of PPIs with small drug-like molecules. In fact, several success stories were reported both at the preclinical and clinical stages. In this review article, written for the 2014 International Summer School in Chemoinformatics (Strasbourg, France), we discuss in silico tools (essentially post 2012) and databases that can assist the design of low molecular weight PPI modulators (these tools can be found at www.vls3d.com). We first introduce the field of protein–protein interaction research, discuss key challenges and comment on recently reported in silico packages, protocols and databases dedicated to PPIs. Then, we illustrate how in silico methods can be used and combined with experimental work to identify PPI modulators. PMID:25254076

  1. Finite element analysis simulations for ultrasonic array NDE inspections

    NASA Astrophysics Data System (ADS)

    Dobson, Jeff; Tweedie, Andrew; Harvey, Gerald; O'Leary, Richard; Mulholland, Anthony; Tant, Katherine; Gachagan, Anthony

    2016-02-01

    Advances in manufacturing techniques and materials have led to an increase in the demand for reliable and robust inspection techniques to maintain safety critical features. The application of modelling methods to develop and evaluate inspections is becoming an essential tool for the NDE community. Current analytical methods are inadequate for simulation of arbitrary components and heterogeneous materials, such as anisotropic welds or composite structures. Finite element analysis (FEA) software, such as PZFlex, can simulate the inspection of these arrangements, providing the ability to economically prototype and evaluate improved NDE methods. FEA is often seen as computationally expensive for ultrasound problems; however, advances in computing power have made it a more viable tool. This paper aims to illustrate the capability of appropriate FEA to produce accurate simulations of ultrasonic array inspections - minimizing the requirement for expensive test-piece fabrication. Validation is afforded via corroboration of the FE derived and experimentally generated data sets for a test-block comprising 1D and 2D defects. The modelling approach is extended to consider the more troublesome aspects of heterogeneous materials where defect dimensions can be of the same length scale as the grain structure. The model is used to facilitate the implementation of new ultrasonic array inspection methods for such materials. This is exemplified by considering the simulation of ultrasonic NDE in a weld structure in order to assess new approaches to imaging such structures.

  2. Thermal modelling approaches to enable mitigation measures implementation for salmonid gravel stages in hydropeaking rivers

    NASA Astrophysics Data System (ADS)

    Casas-Mulet, R.; Alfredsen, K. T.

    2016-12-01

    The dewatering of salmon spawning redds due to hydropeaking operations can lead to early life-stage mortality, with a higher impact on the alevin stage, as alevins have a lower tolerance to dewatering than eggs. Targeted flow-related mitigation measures can reduce such mortality, but it is essential to understand how hydropeaking changes thermal regimes in rivers and may impact embryo development; only then can optimal measures be implemented at the right development stage. We present a set of experimental approaches and modelling tools for the estimation of hatch and swim-up dates based on water temperature data in the river Lundesokna (Norway). We identified critical periods for gravel-stage survival and, by comparing hydropeaking vs unregulated thermal and hydrological regimes, we established potential flow-release measures to minimise mortality. Modelling outcomes were then used to assess the cost-efficiency of each measure. The combination of modelling tools used in this study was overall satisfactory and their application can be useful especially in systems where little field data is available. Targeted measures built on well-informed modelling approaches can be pre-tested based on their efficiency in mitigating dewatering effects vs. the hydropower system's capacity to release or conserve water for power production. Overall, environmental flow releases targeting specific ecological objectives can provide more cost-effective options than conventional operational rules complying with general legislation.

  3. The Taguchi methodology as a statistical tool for biotechnological applications: a critical appraisal.

    PubMed

    Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J

    2008-04-01

    Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology has been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
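
    To make the signal-to-noise idea concrete, the sketch below computes the "larger-the-better" S/N ratio for each run of a hypothetical L9 orthogonal-array experiment and averages it per factor level. The array layout, factor labels and replicate yields are illustrative assumptions, not data from the studies reviewed above.

      import numpy as np

      # Hypothetical L9(3^4) orthogonal array: rows are runs, columns are factors A-D,
      # entries are factor levels (0, 1, 2).
      L9 = np.array([
          [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
          [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
          [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
      ])

      # Two replicate yields per run (e.g. product titre in g/L), purely illustrative.
      yields = np.array([
          [5.1, 5.3], [6.0, 6.4], [7.2, 7.0],
          [5.8, 5.6], [7.5, 7.9], [6.9, 6.6],
          [8.1, 8.4], [6.2, 6.1], [7.0, 7.3],
      ])

      # "Larger-the-better" signal-to-noise ratio: S/N = -10*log10(mean(1/y^2)).
      sn = -10.0 * np.log10(np.mean(1.0 / yields**2, axis=1))

      # Main effect of each factor = mean S/N at each of its levels.
      for factor in range(L9.shape[1]):
          means = [sn[L9[:, factor] == level].mean() for level in range(3)]
          print("factor", "ABCD"[factor], "level means:", np.round(means, 2))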

  4. The first 'molecular disease': a story of Linus Pauling, the intellectual patron.

    PubMed

    Gormley, Melinda

    2007-06-01

    In November 1949, chemist Linus Pauling and three colleagues published an article on sickle-cell anemia, a study that opened up new and exciting possibilities for research into such 'molecular diseases'. Even before this celebrated publication appeared in Science, Pauling foresaw its potential benefits and announced it as a medical breakthrough: '... our structural chemistry and understanding of molecules is getting to the point where it should be of assistance in converting medicine into a real science' [Guiles, R. (1949) Discovery of blood disease called key to cancer research. The Detroit Times 13 Sep 1949, Newspaper Clippings 1949n.18, Pauling Papers.]. Their discovery--that this debilitating disorder was caused by an abnormal form of hemoglobin--was borne out of a rich mix of expertise, from Pauling's remarkable intuition to the careful experimental chemistry of his student Harvey A. Itano. It also relied upon technological innovation: a custom-made electrophoresis machine housed at the California Institute of Technology was the perfect tool to reveal fundamental chemical differences between normal and abnormal forms of hemoglobin. Not only did this work establish a new way of looking at inherited diseases, it also stimulated the mass production of the electrophoresis machine as an essential investigative and diagnostic tool. A close inspection of this case study illustrates just how Pauling ran his laboratory and helps to explain how one man could achieve so much over his lifetime.

  5. The auxin-inducible degradation (AID) system enables versatile conditional protein depletion in C. elegans

    PubMed Central

    Zhang, Liangyu; Ward, Jordan D.; Cheng, Ze; Dernburg, Abby F.

    2015-01-01

    Experimental manipulation of protein abundance in living cells or organisms is an essential strategy for investigation of biological regulatory mechanisms. Whereas powerful techniques for protein expression have been developed in Caenorhabditis elegans, existing tools for conditional disruption of protein function are far more limited. To address this, we have adapted the auxin-inducible degradation (AID) system discovered in plants to enable conditional protein depletion in C. elegans. We report that expression of a modified Arabidopsis TIR1 F-box protein mediates robust auxin-dependent depletion of degron-tagged targets. We document the effectiveness of this system for depletion of nuclear and cytoplasmic proteins in diverse somatic and germline tissues throughout development. Target proteins were depleted in as little as 20-30 min, and their expression could be re-established upon auxin removal. We have engineered strains expressing TIR1 under the control of various promoter and 3′ UTR sequences to drive tissue-specific or temporally regulated expression. The degron tag can be efficiently introduced by CRISPR/Cas9-based genome editing. We have harnessed this system to explore the roles of dynamically expressed nuclear hormone receptors in molting, and to analyze meiosis-specific roles for proteins required for germ line proliferation. Together, our results demonstrate that the AID system provides a powerful new tool for spatiotemporal regulation and analysis of protein function in a metazoan model organism. PMID:26552885

  6. Testing linen disinfection procedures in practice with phage-charged-bioindicators.

    PubMed

    Gerhardts, Anja; Mucha, Helmut; Höfer, Dirk

    2012-01-01

    Disinfecting laundry processes are essential to avoid contamination of laundering machines and linen during commercial laundry reprocessing in the health care sector. Recently, a bacteriophage-charged bioindicator using MS2 as a surrogate virus has been developed for practice-related testing of the efficacy of low-temperature disinfecting laundry processes against viruses. This paper therefore aims to investigate the application of MS2-bioindicators in chemothermal processes under practical conditions (phase 2/step 2) and in practice (phase 3). The experimental design was developed and modified according to the German Society for Hygiene and Microbiology (DGHM) Standard Methods for Testing Chemical Disinfection Processes. Tests under practical conditions were performed at 60 degrees C and 70 degrees C. Additional tests in tunnel washers were carried out at 60 degrees C and 70 degrees C. In all experiments, validated disinfecting laundry processes recommended for bactericidal and virucidal performance (categories A and B) were applied. The results show a temperature-dependent, gradual efficacy against the test virus MS2, up to reduction values of more than 8 log10-steps. MS2-bioindicators therefore prove suitable as a tool to determine the performance of disinfection procedures against viruses in practice. Phage-charged bioindicators may provide further insights into the reliability of antiviral laundry processes for health care quality management and infection control.

  7. Adaptive predictors based on probabilistic SVM for real time disruption mitigation on JET

    NASA Astrophysics Data System (ADS)

    Murari, A.; Lungaroni, M.; Peluso, E.; Gaudio, P.; Vega, J.; Dormido-Canto, S.; Baruzzo, M.; Gelfusa, M.; Contributors, JET

    2018-05-01

    Detecting disruptions with sufficient anticipation time is essential to undertake any form of remedial strategy, mitigation or avoidance. Traditional predictors based on machine learning techniques can perform very well, if properly optimised, but they do not provide a natural estimate of the quality of their outputs and they typically age very quickly. In this paper a new set of tools, based on probabilistic extensions of support vector machines (SVM), is introduced and applied for the first time to JET data. The probabilistic output constitutes a natural qualification of the prediction quality and provides additional flexibility. An adaptive training strategy ‘from scratch’ has also been devised, which allows the performance to be preserved even when the experimental conditions change significantly. Large JET databases of disruptions, covering entire campaigns and thousands of discharges, have been analysed, both for the case of the graphite and the ITER Like Wall. Performance significantly better than any previous predictor using adaptive training has been achieved, satisfying even the requirements of the next generation of devices. The adaptive approach to the training has also provided unique information about the evolution of the operational space. The fact that the developed tools give the probability of disruption improves the interpretability of the results, provides an estimate of the predictor quality and gives new insights into the physics. Moreover, the probabilistic treatment makes it easier to insert these classifiers into general decision support and control systems.
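
    A minimal sketch of the probabilistic-SVM idea, assuming scikit-learn and purely synthetic "disruption" features; this is not the JET pipeline or its adaptive training scheme, only an illustration of how a calibrated SVM returns a disruption probability rather than a bare label.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)

      # Synthetic stand-ins for plasma diagnostics (e.g. locked-mode amplitude,
      # internal inductance): disruptive discharges drawn from a shifted distribution.
      safe = rng.normal(loc=[0.0, 0.0], scale=0.7, size=(500, 2))
      disruptive = rng.normal(loc=[1.5, 1.2], scale=0.7, size=(500, 2))
      X = np.vstack([safe, disruptive])
      y = np.array([0] * 500 + [1] * 500)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      # probability=True enables Platt-style calibration on top of the SVM margin.
      clf = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)

      proba = clf.predict_proba(X_te)[:, 1]  # estimated probability of disruption
      alarm = proba > 0.9                    # trigger mitigation only when confident
      print("mean predicted disruption probability:", proba.mean().round(2))
      print("alarms raised on", int(alarm.sum()), "of", len(alarm), "test discharges")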

  8. Guiding Students to Develop an Understanding of Scientific Inquiry: A Science Skills Approach to Instruction and Assessment

    PubMed Central

    Stone, Elisa M.

    2014-01-01

    New approaches for teaching and assessing scientific inquiry and practices are essential for guiding students to make the informed decisions required of an increasingly complex and global society. The Science Skills approach described here guides students to develop an understanding of the experimental skills required to perform a scientific investigation. An individual teacher's investigation of the strategies and tools she designed to promote scientific inquiry in her classroom is outlined. This teacher-driven action research in the high school biology classroom presents a simple study design that allowed for reciprocal testing of two simultaneous treatments, one that aimed to guide students to use vocabulary to identify and describe different scientific practices they were using in their investigations—for example, hypothesizing, data analysis, or use of controls—and another that focused on scientific collaboration. A knowledge integration (KI) rubric was designed to measure how students integrated their ideas about the skills and practices necessary for scientific inquiry. KI scores revealed that student understanding of scientific inquiry increased significantly after receiving instruction and using assessment tools aimed at promoting development of specific inquiry skills. General strategies for doing classroom-based action research in a straightforward and practical way are discussed, as are implications for teaching and evaluating introductory life sciences courses at the undergraduate level. PMID:24591508

  9. Modelling the Krebs cycle and oxidative phosphorylation.

    PubMed

    Korla, Kalyani; Mitra, Chanchal K

    2014-01-01

    The Krebs cycle and oxidative phosphorylation are the two most important sets of reactions in a eukaryotic cell that meet the major part of the total energy demands of a cell. In this paper, we present a computer simulation of the coupled reactions using open source tools for simulation. We also show that it is possible to model the Krebs cycle with a simple black box with a few inputs and outputs. However, the kinetics of the internal processes has been modelled using numerical tools. We also show that the Krebs cycle and oxidative phosphorylation together can be combined in a similar fashion - a black box with a few inputs and outputs. The Octave script is flexible and customisable for any chosen set-up for this model. In several cases, we had no explicit idea of the underlying reaction mechanism and the rate determining steps involved, and we have used the stoichiometric equations that can be easily changed as and when more detailed information is obtained. The script includes the feedback regulation of the various enzymes of the Krebs cycle. For the electron transport chain, the pH gradient across the membrane is an essential regulator of the kinetics and this has been modelled empirically but fully consistent with experimental results. The initial conditions can be very easily changed and the simulation is potentially very useful in a number of cases of clinical importance.
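
    A hedged sketch of the "black box with a few inputs and outputs" idea, using SciPy rather than the authors' Octave script: a single lumped rate converts pyruvate into ATP equivalents with simple product feedback. The stoichiometry and rate constants are illustrative assumptions only, not the parameters used in the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Lumped "black box": pyruvate -> ATP equivalents, with ATP feedback inhibition.
      K_FEEDBACK = 5.0   # ATP level at which the lumped flux is halved (assumed)
      K_RATE = 0.8       # lumped first-order rate constant (assumed)
      ATP_YIELD = 12.5   # ATP equivalents produced per pyruvate consumed (assumed)

      def black_box(t, y):
          pyruvate, atp = y
          flux = K_RATE * pyruvate / (1.0 + atp / K_FEEDBACK)  # feedback-inhibited flux
          return [-flux, ATP_YIELD * flux]

      sol = solve_ivp(black_box, t_span=(0.0, 20.0), y0=[1.0, 0.1], max_step=0.1)
      print("final pyruvate: %.3f, final ATP: %.2f" % (sol.y[0, -1], sol.y[1, -1]))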

  10. A Vision on the Status and Evolution of HEP Physics Software Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canal, P.; Elvira, D.; Hatcher, R.

    2013-07-28

    This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.

  11. Combining experimental evolution with next-generation sequencing: a powerful tool to study adaptation from standing genetic variation.

    PubMed

    Schlötterer, C; Kofler, R; Versace, E; Tobler, R; Franssen, S U

    2015-05-01

    Evolve and resequence (E&R) is a new approach to investigate the genomic responses to selection during experimental evolution. By using whole genome sequencing of pools of individuals (Pool-Seq), this method can identify selected variants in controlled and replicable experimental settings. Reviewing the current state of the field, we show that E&R can be powerful enough to identify causative genes and possibly even single-nucleotide polymorphisms. We also discuss how the experimental design and the complexity of the trait could result in a large number of false positive candidates. We suggest experimental and analytical strategies to maximize the power of E&R to uncover the genotype-phenotype link and serve as an important research tool for a broad range of evolutionary questions.

  12. Comparative Investigation on Tool Wear during End Milling of AISI H13 Steel with Different Tool Path Strategies

    NASA Astrophysics Data System (ADS)

    Adesta, Erry Yulian T.; Riza, Muhammad; Avicena

    2018-03-01

    Tool wear prediction plays a significant role in the machining industry for proper planning and control of machining parameters and optimization of cutting conditions. This paper aims to investigate the effect of two tool path strategies, contour-in and zigzag, on tool wear during the pocket milling process. The experiments were carried out on a CNC vertical machining centre using PVD-coated carbide inserts. Cutting speed, feed rate and depth of cut were set to vary. For this experiment with three factors at three levels, a Response Surface Methodology (RSM) design of experiments based on the standard Central Composite Design (CCD) was employed. The results obtained indicate that tool wear increases significantly at the higher range of feed per tooth, compared to cutting speed and depth of cut. This experimental result is then confirmed statistically by developing an empirical model. The prediction model for the tool wear response under the contour-in strategy developed in this research shows good agreement with the experimental work.
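
    For readers unfamiliar with the Central Composite Design used here, the sketch below builds a generic rotatable three-factor CCD (factorial, axial and centre points) in coded units. The construction and the number of centre points are generic illustrations, not the authors' exact design matrix.

      import numpy as np
      from itertools import product

      def central_composite_design(n_factors=3, n_center=4):
          """Rotatable CCD in coded units: 2^k factorial points, 2k axial points
          at distance alpha = (2^k)^(1/4), plus replicated centre points."""
          factorial = np.array(list(product([-1.0, 1.0], repeat=n_factors)))
          alpha = (2 ** n_factors) ** 0.25
          axial = np.zeros((2 * n_factors, n_factors))
          for i in range(n_factors):
              axial[2 * i, i] = -alpha
              axial[2 * i + 1, i] = alpha
          center = np.zeros((n_center, n_factors))
          return np.vstack([factorial, axial, center])

      # Coded runs for, e.g., cutting speed, feed per tooth and depth of cut.
      design = central_composite_design()
      print(design.shape[0], "runs")   # 8 factorial + 6 axial + 4 centre = 18 runs
      print(np.round(design, 3))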

  13. Participatory adaptive management leads to environmental learning outcomes extending beyond the sphere of science.

    PubMed

    Fujitani, Marie; McFall, Andrew; Randler, Christoph; Arlinghaus, Robert

    2017-06-01

    Resolving uncertainties in managed social-ecological systems requires adaptive experimentation at whole-ecosystem levels. However, whether participatory adaptive management fosters ecological understanding among stakeholders beyond the sphere of science is unknown. We experimentally involved members of German angling clubs ( n = 181 in workshops, n = 2483 in total) engaged in self-governance of freshwater fisheries resources in a large-scale ecological experiment of active adaptive management of fish stocking, which constitutes a controversial management practice for biodiversity and ecosystem functioning when conducted inappropriately. The collaborative ecological experiments spanned several years and manipulated fish densities in 24 lakes with two species. In parallel, we experimentally compared changes in ecological knowledge and antecedents of proenvironmental behavior in stakeholders and managers who were members of a participatory adaptive management treatment group, with those receiving only a standard lecture, relative to placebo controls. Using a within-subjects pretest-posttest control design, changes in ecological knowledge, environmental beliefs, attitudes, norms, and behavioral intentions were evaluated. Participants in adaptive management retained more knowledge of ecological topics after a period of 8 months compared to those receiving a standard lecture, both relative to controls. Involvement in adaptive management was also the only treatment that altered personal norms and beliefs related to stocking. Critically, only the stakeholders who participated in adaptive management reduced their behavioral intentions to engage in fish stocking in the future. Adaptive management is essential for robust ecological knowledge, and we show that involving stakeholders in adaptive management experiments is a powerful tool to enhance ecological literacy and build environmental capacity to move toward sustainability.

  14. Participatory adaptive management leads to environmental learning outcomes extending beyond the sphere of science

    PubMed Central

    Fujitani, Marie; McFall, Andrew; Randler, Christoph; Arlinghaus, Robert

    2017-01-01

    Resolving uncertainties in managed social-ecological systems requires adaptive experimentation at whole-ecosystem levels. However, whether participatory adaptive management fosters ecological understanding among stakeholders beyond the sphere of science is unknown. We experimentally involved members of German angling clubs (n = 181 in workshops, n = 2483 in total) engaged in self-governance of freshwater fisheries resources in a large-scale ecological experiment of active adaptive management of fish stocking, which constitutes a controversial management practice for biodiversity and ecosystem functioning when conducted inappropriately. The collaborative ecological experiments spanned several years and manipulated fish densities in 24 lakes with two species. In parallel, we experimentally compared changes in ecological knowledge and antecedents of proenvironmental behavior in stakeholders and managers who were members of a participatory adaptive management treatment group, with those receiving only a standard lecture, relative to placebo controls. Using a within-subjects pretest-posttest control design, changes in ecological knowledge, environmental beliefs, attitudes, norms, and behavioral intentions were evaluated. Participants in adaptive management retained more knowledge of ecological topics after a period of 8 months compared to those receiving a standard lecture, both relative to controls. Involvement in adaptive management was also the only treatment that altered personal norms and beliefs related to stocking. Critically, only the stakeholders who participated in adaptive management reduced their behavioral intentions to engage in fish stocking in the future. Adaptive management is essential for robust ecological knowledge, and we show that involving stakeholders in adaptive management experiments is a powerful tool to enhance ecological literacy and build environmental capacity to move toward sustainability. PMID:28630904

  15. Evaluation of the leishmanicidal and cytotoxic potential of essential oils derived from ten colombian plants.

    PubMed

    Sanchez-Suarez, Jf; Riveros, I; Delgado, G

    2013-01-01

    The leishmanicidal and cytotoxic activities of ten essential oils obtained from ten plant specimens were evaluated. Essential oils were obtained by the steam distillation of plant leaves without any prior processing. Cytotoxicity was tested on J774 macrophages and leishmanicidal activity was assessed against four species of Leishmania associated with cutaneous leishmaniasis. Seven essential oils exhibited activity against Leishmania parasites, five of which were toxic against J774 macrophages. Selectivity indices of >6 and 13 were calculated for the essential oils of Ocimum basilicum and Origanum vulgare, respectively. The essential oil of Ocimum basilicum was active against promastigotes of Leishmania and innocuous to J774 macrophages at concentrations up to 1600 µg/mL and should be further investigated for leishmanicidal activity in other in vitro and in vivo experimental models.

  16. The Peter Effect in Early Experimental Education Research.

    ERIC Educational Resources Information Center

    Little, Joseph

    2003-01-01

    Traces the ways in which educational researchers referred to Ronald A. Fisher's analysis of variance (ANOVA) between 1932 and 1944 in the "Journal of Experimental Education" (JXE). Shows how the changes in citational practices served to separate the ANOVA from its affiliation with Fisher, essentially effacing the memory of its human…

  17. Model-based high-throughput design of ion exchange protein chromatography.

    PubMed

    Khalaf, Rushd; Heymann, Julia; LeSaout, Xavier; Monard, Florence; Costioli, Matteo; Morbidelli, Massimo

    2016-08-12

    This work describes the development of a model-based high-throughput design (MHD) tool for the operating space determination of a chromatographic cation-exchange protein purification process. Based on a previously developed thermodynamic mechanistic model, the MHD tool generates a large amount of system knowledge and thereby permits minimizing the required experimental workload. In particular, each new experiment is designed to generate information needed to help refine and improve the model. Unnecessary experiments that do not increase system knowledge are avoided. Instead of aspiring to a perfectly parameterized model, the goal of this design tool is to use early model parameter estimates to find interesting experimental spaces, and to refine the model parameter estimates with each new experiment until a satisfactory set of process parameters is found. The MHD tool is split into four sections: (1) prediction, high throughput experimentation using experiments in (2) diluted conditions and (3) robotic automated liquid handling workstations (robotic workstation), and (4) operating space determination and validation. (1) Protein and resin information, in conjunction with the thermodynamic model, is used to predict protein resin capacity. (2) The predicted model parameters are refined based on gradient experiments in diluted conditions. (3) Experiments on the robotic workstation are used to further refine the model parameters. (4) The refined model is used to determine operating parameter space that allows for satisfactory purification of the protein of interest on the HPLC scale. Each section of the MHD tool is used to define the adequate experimental procedures for the next section, thus avoiding any unnecessary experimental work. We used the MHD tool to design a polishing step for two proteins, a monoclonal antibody and a fusion protein, on two chromatographic resins, in order to demonstrate it has the ability to strongly accelerate the early phases of process development. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Characterization of essential proteins based on network topology in proteins interaction networks

    NASA Astrophysics Data System (ADS)

    Bakar, Sakhinah Abu; Taheri, Javid; Zomaya, Albert Y.

    2014-06-01

    The identification of essential proteins is theoretically and practically important as (1) it is essential to understand the minimal survival requirements of cellular life, and (2) it provides a foundation for drug development. As conducting experimental studies to identify essential proteins is both time and resource consuming, here we present a computational approach to predicting them based on network topology properties from protein-protein interaction networks of Saccharomyces cerevisiae. The proposed method, namely EP3NN (Essential Proteins Prediction using Probabilistic Neural Network), employs a machine learning algorithm called the Probabilistic Neural Network as a classifier to identify essential proteins of the organism of interest; it uses the degree centrality, closeness centrality, local assortativity and local clustering coefficient of each protein in the network for such predictions. Results show that EP3NN managed to successfully predict essential proteins with an accuracy of 95% for our studied organism. Results also show that most of the essential proteins are close to other proteins, exhibit assortative behavior and form clusters/sub-graphs in the network.
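
    A hedged sketch of the feature-extraction step behind an approach like EP3NN: computing per-protein topology features from a protein-protein interaction graph with NetworkX. The toy edge list is invented, local assortativity is crudely approximated here by the mean degree of a node's neighbours, and the probabilistic neural network classifier itself is omitted.

      import networkx as nx

      # Toy interaction network (invented protein names).
      edges = [("P1", "P2"), ("P1", "P3"), ("P2", "P3"), ("P3", "P4"),
               ("P4", "P5"), ("P5", "P6"), ("P4", "P6"), ("P3", "P6")]
      ppi = nx.Graph(edges)

      degree = nx.degree_centrality(ppi)
      closeness = nx.closeness_centrality(ppi)
      clustering = nx.clustering(ppi)

      features = {}
      for node in ppi.nodes:
          # Crude stand-in for local assortativity: average degree of neighbours.
          neighbour_deg = sum(ppi.degree(n) for n in ppi.neighbors(node)) / ppi.degree(node)
          features[node] = (degree[node], closeness[node], clustering[node], neighbour_deg)

      for node, feats in features.items():
          print(node, [round(f, 3) for f in feats])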

  19. Producing Fe-W-Co-Cr-C Alloy Cutting Tool Material Through Powder Metallurgy Route

    NASA Astrophysics Data System (ADS)

    Datta Banik, Bibhas; Dutta, Debasish; Ray, Siddhartha

    2017-04-01

    High speed steel (HSS) tools can withstand high impact forces as they are tough in nature, but they cannot retain their hardness at elevated temperature, i.e. their hot hardness is low. Therefore the permissible cutting speed is low and the tools wear out easily, and the use of lubricants is essential for HSS cutting tools. On the other hand, cemented carbide tools can withstand greater compressive forces, but due to lower toughness the tool can break easily; moreover, the cost of the tool is comparatively high. To achieve a better machining economy, Fe-W-Co-Cr-C alloys are being used nowadays. Their toughness is as good as that of HSS tools and their hardness is very near that of carbide tools. Even at moderate cutting speeds, they can be safely used in old machines with vibration, and they are much cheaper than carbide tools. This paper highlights the manufacturing technology of the alloy and studies the comparative tribological properties of the alloy and tungsten monocarbide.

  20. Effect of jasmonic acid elicitation on the yield, chemical composition, and antioxidant and anti-inflammatory properties of essential oil of lettuce leaf basil (Ocimum basilicum L.).

    PubMed

    Złotek, Urszula; Michalak-Majewska, Monika; Szymanowska, Urszula

    2016-12-15

    The effect of elicitation with jasmonic acid (JA) on the plant yield and the production and composition of essential oils of lettuce leaf basil were evaluated. JA-elicitation slightly affected the yield of plants and significantly increased the amount of essential oils produced by basil - the highest oil yield (0.78±0.005 mL/100 g dw) was achieved in plants elicited with 100μM JA. The application of the tested elicitor also influenced the chemical composition of basil essential oils - 100μM JA increased the linalool, eugenol, and limonene levels, while 1μM JA caused the highest increase in the methyl eugenol content. Essential oils from JA-elicited basil (especially 1μM and 100μM) exhibited more effective antioxidant and anti-inflammatory potential; therefore, this inducer may be a very useful biochemical tool for improving the production and composition of herbal essential oils. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Development of High-Speed IV-VI Photodiodes

    DTIC Science & Technology

    1976-06-01

    …is not yet an adequate theoretical analysis. However, early experimental results indicated that collection efficiencies near unity are attainable… Contents: 1 Introduction; 2 Experimental; 3 Junction Capacitance; 4 The Pinched-Off Photodiode (4.1 General Considerations, 4.2 …). …developed by Ford Research Staff. The essential references to this previous work and to new experimental details are given in Section 2 of the…

  2. Using the Git Software Tool on the Peregrine System | High-Performance

    Science.gov Websites

    …branch workflow. Create a local branch called "experimental" based on the current master: git branch experimental. Use your branch (start working on that experimental branch): git checkout experimental, then git pull origin experimental. Work, commit, and then send the local branch to the repo: git push

  3. Guidelines for reporting and using prediction tools for genetic variation analysis.

    PubMed

    Vihinen, Mauno

    2013-02-01

    Computational prediction methods are widely used for the analysis of human genome sequence variants and their effects on gene/protein function, splice site aberration, pathogenicity, and disease risk. New methods are frequently developed. We believe that guidelines are essential for those writing articles about new prediction methods, as well as for those applying these tools in their research, so that the necessary details are reported. This will enable readers to gain the full picture of technical information, performance, and interpretation of results, and to facilitate comparisons of related methods. Here, we provide instructions on how to describe new methods, report datasets, and assess the performance of predictive tools. We also discuss what details of predictor implementation are essential for authors to understand. Similarly, these guidelines for the use of predictors provide instructions on what needs to be delineated in the text, as well as how researchers can avoid unwarranted conclusions. They are applicable to most prediction methods currently utilized. By applying these guidelines, authors will help reviewers, editors, and readers to more fully comprehend prediction methods and their use. © 2012 Wiley Periodicals, Inc.

  4. Is there a “net generation” in veterinary medicine? A comparative study on the use of the Internet and Web 2.0 by students and the veterinary profession

    PubMed Central

    Tenhaven, Christoph; Tipold, Andrea; Fischer, Martin R.; Ehlers, Jan P.

    2013-01-01

    Introduction: Informal and formal lifelong learning is essential at university and in the workplace. Apart from classical learning techniques, Web 2.0 tools can be used. It is controversial whether there is a so-called net generation amongst people under 30. Aims: To test the hypothesis that a net generation exists among students and young veterinarians. Methods: An online survey of students and veterinarians in the German-speaking countries was conducted and advertised via online media and traditional print media. Results: 1780 people took part in the survey. Students and veterinarians have different usage patterns regarding social networks (91.9% vs. 69%) and IM (55.9% vs. 24.5%). All tools were used predominantly passively and in private, and to a lesser extent also professionally and for studying. Outlook: The use of Web 2.0 tools is useful; however, teaching information and media skills, preparing codes of conduct for the internet and verifying user-generated content are essential. PMID:23467682

  5. Applications of Nuclear and Particle Physics Technology: Particles & Detection — A Brief Overview

    NASA Astrophysics Data System (ADS)

    Weisenberger, Andrew G.

    A brief overview of the technology applications with significant societal benefit that have their origins in nuclear and particle physics research is presented. It is shown through representative examples that applications of nuclear physics can be classified into two basic areas: 1) applying the results of experimental nuclear physics and 2) applying the tools of experimental nuclear physics. Examples of the application of the tools of experimental nuclear and particle physics research are provided in the fields of accelerator and detector based technologies namely synchrotron light sources, nuclear medicine, ion implantation and radiation therapy.

  6. Knowledge-Acquisition Tool For Expert System

    NASA Technical Reports Server (NTRS)

    Disbrow, James D.; Duke, Eugene L.; Regenie, Victoria A.

    1988-01-01

    Digital flight-control systems monitored by computer program that evaluates and recommends. Flight-systems engineers for advanced, high-performance aircraft use knowledge-acquisition tool for expert-system flight-status monitor supplying interpretative data. Interpretative function especially important in time-critical, high-stress situations because it facilitates problem identification and corrective strategy. Conditions evaluated and recommendations made by ground-based engineers having essential knowledge for analysis and monitoring of performances of advanced aircraft systems.

  7. The Effects of Explosive Blast as Compared to Post-traumatic Stress Disorder on Brain Function and Structure

    DTIC Science & Technology

    2013-04-01

    …Neuropsychology (AACN), Chicago, Illinois. One of the challenges in assessing the essential neural features of mild TBI in veterans is that… The tool, preliminarily called the Minnesota Blast Exposure Screening Tool (MN-BEST; see Figure 12), complements current screening… Examination of the number of post-concussive symptoms endorsed by the entire National Guard sample indicates that…

  8. Engineering With Nature Geographic Project Mapping Tool (EWN ProMap)

    DTIC Science & Technology

    2015-07-01

    The EWN ProMap database provides numerous case studies for infrastructure projects such as breakwaters, river engineering dikes, and seawalls that have… the EWN Project Mapping Tool (EWN ProMap) is to assist users in their search for case study information that can be valuable for developing EWN ideas… Essential elements of EWN include: (1) using science and engineering to produce operational efficiencies supporting sustainable delivery of…

  9. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    PubMed

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

    Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. Then, we investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized; one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone that were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
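
    A minimal sketch of how a Fuzzy Cognitive Map is typically simulated once stakeholders have drawn it: concepts are activation values in [0, 1], the weight matrix holds the signed causal links, and the state is iterated through a sigmoid until it stabilises. The three concepts and the weights below are invented placeholders, not the maps produced in the North Lebanon workshops.

      import numpy as np

      def sigmoid(x, lam=1.0):
          return 1.0 / (1.0 + np.exp(-lam * x))

      # Invented concepts: 0 = coastal governance, 1 = pollution, 2 = tourism revenue.
      # weights[i, j] is the causal influence of concept i on concept j (-1 to 1).
      weights = np.array([
          [0.0, -0.6,  0.4],   # better governance reduces pollution, helps tourism
          [0.0,  0.0, -0.7],   # pollution harms tourism revenue
          [0.3,  0.2,  0.0],   # tourism revenue funds governance, adds some pollution
      ])

      state = np.array([0.8, 0.5, 0.4])   # initial activation of each concept
      for _ in range(50):
          new_state = sigmoid(state + state @ weights)  # common FCM update rule
          if np.allclose(new_state, state, atol=1e-6):
              break
          state = new_state

      print("steady-state activations:", np.round(state, 3))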

  10. Development of method for quantifying essential tremor using a small optical device.

    PubMed

    Chen, Kai-Hsiang; Lin, Po-Chieh; Chen, Yu-Jung; Yang, Bing-Shiang; Lin, Chin-Hsien

    2016-06-15

    Clinical assessment scales are the most common means used by physicians to assess tremor severity. Some scientific tools that may be able to replace these scales to objectively assess the severity, such as accelerometers, digital tablets, electromyography (EMG) measurement devices, and motion capture cameras, are currently available. However, most of the operational modes of these tools are relatively complex or are only able to capture part of the clinical information; furthermore, using these tools is sometimes time consuming. Currently, there is no tool available for automatically quantifying tremor severity in clinical environments. We aimed to develop a rapid, objective, and quantitative system for measuring the severity of finger tremor using a small portable optical device (Leap Motion). A single test took 15s to conduct, and three algorithms were proposed to quantify the severity of finger tremor. The system was tested with four patients diagnosed with essential tremor. The proposed algorithms were able to quantify different characteristics of tremor in clinical environments, and could be used as references for future clinical assessments. A portable, easy-to-use, small-sized, and noncontact device (Leap Motion) was used to clinically detect and record finger movement, and three algorithms were proposed to describe tremor amplitudes. Copyright © 2016 Elsevier B.V. All rights reserved.
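
    One plausible way to turn a recorded fingertip trajectory into a tremor score, shown below as a hedged sketch: band-limit the signal to the typical essential-tremor range and report the dominant frequency and its amplitude. The 15 s synthetic signal, the 100 Hz sampling rate and the 4-12 Hz band are illustrative assumptions, not the three algorithms proposed in the paper.

      import numpy as np

      FS = 100.0                      # assumed sampling rate of the device (Hz)
      t = np.arange(0, 15.0, 1.0 / FS)

      # Synthetic fingertip displacement: slow voluntary drift plus a 6 Hz tremor.
      signal = 2.0 * np.sin(2 * np.pi * 0.3 * t) + 0.5 * np.sin(2 * np.pi * 6.0 * t)
      signal += 0.05 * np.random.default_rng(0).normal(size=t.size)

      # Single-sided amplitude spectrum.
      spectrum = np.abs(np.fft.rfft(signal)) * 2.0 / t.size
      freqs = np.fft.rfftfreq(t.size, d=1.0 / FS)

      # Restrict to the band where essential tremor typically lives (assumed 4-12 Hz).
      band = (freqs >= 4.0) & (freqs <= 12.0)
      peak = np.argmax(spectrum[band])
      print("dominant tremor frequency: %.1f Hz" % freqs[band][peak])
      print("tremor amplitude at peak:  %.2f (same units as input)" % spectrum[band][peak])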

  11. Assessing the Success Rate of Students Using a Learning Management System Together with a Collaborative Tool in Web-Based Teaching of Programming Languages

    ERIC Educational Resources Information Center

    Cavus, Nadire; Ibrahim, Dogan

    2007-01-01

    The development of collaborative studies in learning has led to a renewed interest in the field of Web-based education. In this experimental study a highly interactive and collaborative virtual teaching environment has been created by supporting Moodle LMS with collaborative learning tool GREWPtool. The aim of this experimental study has been to…

  12. Smartphones as Experimental Tools: Different Methods to Determine the Gravitational Acceleration in Classroom Physics by Using Everyday Devices

    ERIC Educational Resources Information Center

    Kuhn, Jochen; Vogt, Patrik

    2013-01-01

    New media technology becomes more and more important for our daily life as well as for teaching physics. Within the scope of our N.E.T. research project we develop experiments using New Media Experimental Tools (N.E.T.) in physics education and study their influence on students learning abilities. We want to present the possibilities e.g. of…

  13. Essential learning tools for continuing medical education for physicians, geneticists, nurses, allied health professionals, mental health professionals, business administration professionals, and reproductive endocrinology and infertility (REI) fellows: the Midwest Reproductive Symposium International.

    PubMed

    Collins, Gretchen G; Jeelani, Roohi; Beltsos, Angeline; Kearns, William G

    2018-04-01

    Essential learning tools for continuing medical education are a challenge in today's rapidly evolving field of reproductive medicine. The Midwest Reproductive Symposium International (MRSi) is a yearly conference held in Chicago, IL. The conference is targeted toward physicians, geneticists, nurses, allied health professionals, mental health professionals, business administration professionals, and reproductive endocrinology and infertility (REI) fellows engaged in the practice of reproductive medicine. In addition to the scientific conference agenda, there are specific sessions for nurses, mental health professionals, and REI fellows. Unique to the MRSi conference, there is also a separate "Business Minds" session to provide education on business acumen as it is an important element to running a department, division, or private clinic.

  14. Combining functional and structural genomics to sample the essential Burkholderia structome.

    PubMed

    Baugh, Loren; Gallagher, Larry A; Patrapuvich, Rapatbhorn; Clifton, Matthew C; Gardberg, Anna S; Edwards, Thomas E; Armour, Brianna; Begley, Darren W; Dieterich, Shellie H; Dranow, David M; Abendroth, Jan; Fairman, James W; Fox, David; Staker, Bart L; Phan, Isabelle; Gillespie, Angela; Choi, Ryan; Nakazawa-Hewitt, Steve; Nguyen, Mary Trang; Napuli, Alberto; Barrett, Lynn; Buchko, Garry W; Stacy, Robin; Myler, Peter J; Stewart, Lance J; Manoil, Colin; Van Voorhis, Wesley C

    2013-01-01

    The genus Burkholderia includes pathogenic gram-negative bacteria that cause melioidosis, glanders, and pulmonary infections of patients with cancer and cystic fibrosis. Drug resistance has made the development of new antimicrobials critical. Many approaches to discovering new antimicrobials, such as structure-based drug design and whole-cell phenotypic screens followed by lead refinement, require high-resolution structures of proteins essential to the pathogen. We experimentally identified 406 putative essential genes in B. thailandensis, a low-virulence species phylogenetically similar to B. pseudomallei, the causative agent of melioidosis, using saturation-level transposon mutagenesis and next-generation sequencing (Tn-seq). We selected 315 protein products of these genes based on structure-determination criteria, such as excluding very large and/or integral membrane proteins, and entered them into the Seattle Structural Genomics Center for Infectious Disease (SSGCID) structure-determination pipeline. To maximize structural coverage of these targets, we applied an "ortholog rescue" strategy for those producing insoluble or difficult-to-crystallize proteins, resulting in the addition of 387 orthologs (or paralogs) from seven other Burkholderia species into the SSGCID pipeline. This structural genomics approach yielded structures from 31 putative essential targets from B. thailandensis and 25 orthologs from other Burkholderia species, giving an overall structural coverage for 49 of the 406 essential gene families, with a total of 88 depositions into the Protein Data Bank. Of these, 25 proteins have the properties of a potential antimicrobial drug target, i.e., no close human homolog, part of an essential metabolic pathway, and a deep binding pocket. We describe the structures of several potential drug targets in detail. This collection of structures, together with solubility and experimental essentiality data, provides a resource for the development of drugs against infections and diseases caused by Burkholderia. All expression clones and proteins created in this study are freely available by request.
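
    As an illustration of the triage step described above (excluding very large and integral membrane proteins before structure determination), a hypothetical filtering sketch follows; the field names and cut-off values are assumptions for illustration, not the actual SSGCID criteria.

    ```python
    # Hypothetical triage of essential-gene products for a structure pipeline.
    # Field names and cut-offs are illustrative only.
    candidates = [
        {"gene": "geneA", "length_aa": 240, "tm_helices": 0},
        {"gene": "geneB", "length_aa": 1450, "tm_helices": 0},   # too large
        {"gene": "geneC", "length_aa": 310, "tm_helices": 7},    # membrane protein
    ]

    MAX_LENGTH = 800      # assumed upper bound on chain length
    MAX_TM_HELICES = 1    # assumed tolerance for predicted TM helices

    selected = [c["gene"] for c in candidates
                if c["length_aa"] <= MAX_LENGTH and c["tm_helices"] <= MAX_TM_HELICES]
    print(selected)  # ['geneA']
    ```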

  15. Communication: essential strategies for success.

    PubMed

    O'Connor, Mary

    2013-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches that help nurse leaders advance organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author discusses communication strategies for change processes, whether large or small. Intentional planning and development of a communication strategy alongside change initiatives, not as an afterthought, are essential.

  16. The effect of sour tea (Hibiscus sabdariffa) on essential hypertension.

    PubMed

    Haji Faraji, M; Haji Tarkhani, A

    1999-06-01

    Considering the high prevalence of hypertension, its debilitating end organ damage, and the side effects of chemical drugs used for its treatment, we conducted this experimental study to evaluate the effect of sour tea (Hibiscus sabdariffa) on essential hypertension. For this purpose, 31 and 23 patients with moderate essential hypertension were randomly assigned to an experimental and a control group, respectively. Patients with secondary hypertension or those consuming more than two drugs were excluded from the study. Systolic and diastolic blood pressures were measured before and 15 days after the intervention. In the experimental group, 45% of the patients were male and 55% were female, and the mean age was 52.6 +/- 7.9 years. In the control group, 30% of the patients were male, 70% were female, and the mean age of the patients was 51.5 +/- 10.1 years. Statistical findings showed an 11.2% lowering of the systolic blood pressure and a 10.7% decrease of the diastolic pressure in the experimental group 12 days after beginning the treatment, as compared with the first day. The difference between the systolic blood pressures of the two groups was significant, as was the difference of the diastolic pressures of the two groups. Three days after stopping the treatment, systolic blood pressure was elevated by 7.9%, and diastolic pressure was elevated by 5.6% in the experimental and control groups. This difference between the two groups was also significant. This study supports the popular belief and the results of in vitro studies concerning the effect of sour tea in lowering high blood pressure. More extensive studies on this subject are needed.

  17. Experimental Study of Tool Wear and Grinding Forces During BK-7 Glass Micro-grinding with Modified PCD Tool

    NASA Astrophysics Data System (ADS)

    Pratap, A.; Sahoo, P.; Patra, K.; Dyakonov, A. A.

    2017-09-01

    This study focuses on improving the grinding performance of BK-7 glass using a polycrystalline diamond (PCD) micro-tool. Micro-tools are modified using wire EDM, and the performance of the modified tools is compared with that of the as-received tool. The tool wear of the different tool types is observed. To quantify tool wear, a method based on the weight loss of the tool is introduced in this study. The modified tools significantly reduce tool wear in comparison to the normal tool. Grinding forces increase with machining time due to tool wear; however, the modified tools produce lower forces and can thus improve the life of the PCD micro-grinding tool.
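
    The record does not give the exact weight-loss formulation, so the following is only a minimal sketch assuming wear is expressed as the relative mass lost by the tool; the mass values are invented.

    ```python
    def wear_percent(mass_before_mg, mass_after_mg):
        """Tool wear expressed as the percentage of tool mass lost."""
        return 100.0 * (mass_before_mg - mass_after_mg) / mass_before_mg

    print(round(wear_percent(52.40, 52.13), 2))  # ~0.52 % mass loss (illustrative numbers)
    ```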

  18. Copper toxicology, oxidative stress and inflammation using zebrafish as experimental model.

    PubMed

    Pereira, Talita Carneiro Brandão; Campos, Maria Martha; Bogo, Maurício Reis

    2016-07-01

    Copper is an essential micronutrient and a key catalytic cofactor in a wide range of enzymes. As a trace element, copper levels are tightly regulated, and both its deficit and excess are deleterious to the organism. Under inflammatory conditions, serum copper levels are increased and trigger oxidative stress responses that activate inflammatory responses. Interestingly, copper dyshomeostasis, oxidative stress and inflammation are commonly present in several chronic diseases. Copper exposure can be easily modeled in zebrafish, a consolidated model in toxicology with increasing interest in immunity-related research. As a result of developmental, economical and genetic advantages, this freshwater teleost is uniquely suitable for chemical and genetic large-scale screenings, representing a powerful experimental tool for a whole-organism approach, mechanistic studies, disease modeling and beyond. Copper toxicological and, more recently, pro-inflammatory effects have been investigated in both larval and adult zebrafish with breakthrough findings. Here, we provide an overview of copper metabolism in health and disease and its effects on oxidative stress and inflammation responses in zebrafish models. Copper-induced inflammation is highlighted owing to its potential to easily mimic pro-oxidative and pro-inflammatory features that, combined with zebrafish genetic tractability, could further help in the understanding of copper metabolism, inflammatory responses and related diseases. Copyright © 2016 John Wiley & Sons, Ltd.

  19. The discovery of the neutron and its consequences (1930-1940)

    NASA Astrophysics Data System (ADS)

    Nesvizhevsky, Valery; Villain, Jacques

    2017-11-01

    In 1930, Walther Bothe and Herbert Becker performed an experiment, which was further improved by Irène and Frédéric Joliot-Curie. These authors, however, misinterpreted their results and believed they had observed γ-rays when they had in fact seen neutrons. After additional experimental verifications, James Chadwick gave the correct interpretation of these experiments in 1932. Immediately, the new particle, the neutron, became an essential actor in nuclear and elementary particle physics and completely changed the whole research landscape. Enrico Fermi and his group applied it to artificial radioactivity, substituting neutrons for the α-rays initially used by the Joliot-Curies. They also discovered that slow neutrons were more efficient than fast ones in certain nuclear reactions. A crucial discovery by Otto Hahn, Fritz Straßmann, Lise Meitner, and Otto Frisch, after several misinterpretations of complicated experimental results, was nuclear fission. When Joliot, Halban, and Kowarski demonstrated the possibility of a chain reaction by neutron multiplication due to fission, nuclear physics became a military science, at the very moment when the Second World War was beginning. Later it led to nuclear power applications and to the use of neutrons as an important tool and object of scientific research at large-scale neutron facilities. The Comptes rendus de l'Académie des sciences was a partner in a vivid international debate involving several other journals.

  20. Accurate determination of interfacial protein secondary structure by combining interfacial-sensitive amide I and amide III spectral signals.

    PubMed

    Ye, Shuji; Li, Hongchun; Yang, Weilai; Luo, Yi

    2014-01-29

    Accurate determination of protein structures at the interface is essential to understand the nature of interfacial protein interactions, but it can only be done with a few, very limited experimental methods. Here, we demonstrate for the first time that sum frequency generation vibrational spectroscopy can unambiguously differentiate the interfacial protein secondary structures by combining surface-sensitive amide I and amide III spectral signals. This combination offers a powerful tool to directly distinguish random-coil (disordered) and α-helical structures in proteins. From a systematic study on the interactions between several antimicrobial peptides (including LKα14, mastoparan X, cecropin P1, melittin, and pardaxin) and lipid bilayers, it is found that the spectral profiles of the random-coil and α-helical structures are well separated in the amide III spectra, appearing below and above 1260 cm(-1), respectively. For the peptides with a straight backbone chain, the strength ratio for the peaks of the random-coil and α-helical structures shows a distinct linear relationship with the fraction of the disordered structure deduced from independent NMR experiments reported in the literature. It is revealed that increasing the fraction of negatively charged lipids can induce a conformational change of pardaxin from random-coil to α-helical structures. This experimental protocol can be employed for determining the interfacial protein secondary structures and dynamics in situ and in real time without extraneous labels.
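
    As a minimal sketch of how such a linear relationship could serve as a calibration, assuming invented amide III peak-ratio and NMR disorder values rather than the paper's data:

    ```python
    import numpy as np

    # Illustrative data only: amide III peak-strength ratio (random-coil / alpha-helix)
    # versus the disordered fraction reported by NMR for a set of peptides.
    ratio = np.array([0.15, 0.42, 0.80, 1.10, 1.60])
    disordered_fraction = np.array([0.10, 0.25, 0.45, 0.60, 0.85])

    slope, intercept = np.polyfit(ratio, disordered_fraction, 1)   # linear calibration
    r = np.corrcoef(ratio, disordered_fraction)[0, 1]

    # Once calibrated, a measured ratio can be converted into a disorder estimate:
    estimate = slope * 0.95 + intercept
    print(f"slope={slope:.2f}, intercept={intercept:.2f}, r={r:.3f}, est={estimate:.2f}")
    ```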

  1. UV-resonance Raman spectroscopy of amino acids

    NASA Astrophysics Data System (ADS)

    Höhl, Martin; Meinhardt-Wollweber, Merve; Schmitt, Heike; Lenarz, Thomas; Morgner, Uwe

    2016-03-01

    Resonant enhancement of Raman signals is a useful method to increase sensitivity in samples with low concentrations, such as biological tissue. The investigation of resonance profiles shows the optimal excitation wavelength and yields valuable information about the molecules themselves. However, careful characterization and calibration of all experimental parameters affecting the quantum yield are required in order to achieve comparability of the individual spectra recorded. We present an experimental technique for measuring the resonance profiles of different amino acids. The absorption lines of these molecules are located in the ultraviolet (UV) wavelength range. One limitation for broadband measurement of resonance profiles is the limited availability of Raman filters in certain regions of the UV for blocking the Rayleigh-scattered light. Here, a wavelength range from 244.8 nm to 266.0 nm was chosen. The profiles reveal the optimal wavelength for recording the Raman spectra of amino acids in aqueous solutions in this range. This study provides the basis for measurements on more complex molecules such as proteins in the human perilymph. The composition of this liquid in the inner ear is essential for hearing and cannot yet be analyzed non-invasively. The long-term aim is to implement this technique as a fiber-based endoscope for non-invasive measurements during surgeries (e.g. cochlear implantation), making it available as a diagnostic tool for physicians. This project is embedded in the interdisciplinary cluster of excellence "Hearing for all" (H4A).

  2. Pseudo-shock waves and their interactions in high-speed intakes

    NASA Astrophysics Data System (ADS)

    Gnani, F.; Zare-Behtash, H.; Kontis, K.

    2016-04-01

    In an air-breathing engine the flow deceleration from supersonic to subsonic conditions takes place inside the isolator through a gradual compression consisting of a series of shock waves. The wave system, referred to as a pseudo-shock wave or shock train, establishes the combustion chamber entrance conditions and therefore influences the performance of the entire propulsion system. The characteristics of the pseudo-shock depend on a number of variables, which makes this flow phenomenon particularly challenging to analyse. Difficulties in experimentally obtaining accurate flow quantities at high speeds, and discrepancies between numerical approaches and measured data, have been widely reported. Understanding the flow physics in the presence of the interaction of numerous shock waves with the boundary layer in internal flows is essential to developing methods and control strategies. To counteract the negative effects of shock wave/boundary layer interactions, which are responsible for the engine unstart process, multiple flow control methodologies have been proposed. Improved analytical models, advanced experimental methodologies and numerical simulations have allowed a more in-depth analysis of the flow physics. The present paper aims to bring together the main results on the shock train structure and its associated phenomena inside isolators, studied using the aforementioned tools. Several promising flow control techniques that have more recently been applied to manipulate the shock wave/boundary layer interaction are also examined in this review.

  3. Determination of toxic and essential trace elements in serum of healthy and hypothyroid respondents by ICP-MS: A chemometric approach for discrimination of hypothyroidism.

    PubMed

    Stojsavljević, Aleksandar; Trifković, Jelena; Rasić-Milutinović, Zorica; Jovanović, Dragana; Bogdanović, Gradimir; Mutić, Jelena; Manojlović, Dragan

    2018-07-01

    Inductively coupled plasma-mass spectrometry (ICP-MS) was used to determine three toxic (Ni, As, Cd) and six essential trace elements (Cr, Mn, Co, Cu, Zn, Se) in the blood serum of patients with hypothyroidism (Hy group) and healthy people (control group), in order to set the experimental conditions for accurate determination of a unique profile of these elements in hypothyroidism. Method validation was performed with a standard reference material of serum, varying the sample treatment and using both standard and collision modes for the analysis of element isotopes. Quadratic curvilinear functions with good model performance and the lowest detection limits were obtained for 52Cr, 66Zn, 75As and 112Cd in collision mode, and for 55Mn, 59Co, 60Ni, 65Cu and 78Se in standard mode. Treatment of serum samples with an aqueous solution containing nitric acid, Triton X-100 and n-butanol gave the best results. Chemometric tools were applied for the discrimination of patients with hypothyroidism. All nine elements discriminated the Hy group of samples with almost the same discriminating power, as indicated by their higher values for this group of patients. A statistically significant correlation (p < 0.01) was observed for several elements. The results indicated clear differences in element profile between the Hy and control groups, which could be used as a unique profile of the hypothyroid state. Copyright © 2018 Elsevier GmbH. All rights reserved.
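
    The abstract does not name the specific chemometric method, so the sketch below shows one plausible approach, linear discriminant analysis on the nine-element serum profiles after standardization; all concentration values are invented for illustration.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    # Illustrative serum profiles (rows: subjects; columns: Cr, Mn, Co, Ni, Cu,
    # Zn, As, Se, Cd in consistent units); the values are made up.
    X = np.array([
        [0.9, 1.1, 0.4, 1.8, 950, 820, 1.2, 85, 0.3],   # hypothyroid
        [1.0, 1.3, 0.5, 2.0, 990, 860, 1.4, 88, 0.4],   # hypothyroid
        [0.6, 0.8, 0.3, 1.1, 780, 700, 0.7, 70, 0.1],   # control
        [0.7, 0.9, 0.3, 1.2, 800, 720, 0.8, 72, 0.2],   # control
    ])
    y = np.array([1, 1, 0, 0])  # 1 = hypothyroid, 0 = control

    model = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
    model.fit(X, y)
    # A new profile close to the hypothyroid group is expected to be labelled 1
    print(model.predict([[0.95, 1.2, 0.45, 1.9, 970, 840, 1.3, 86, 0.35]]))
    ```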

  4. Zebrafish in the sea of mineral (iron, zinc, and copper) metabolism

    PubMed Central

    Zhao, Lu; Xia, Zhidan; Wang, Fudi

    2014-01-01

    Iron, copper, zinc, and eight other minerals are classified as essential trace elements because they are present in minute quantities in vivo and are essential for life. Because either excess or insufficient levels of trace elements can be detrimental to life (causing human diseases such as iron-deficiency anemia, hemochromatosis, Menkes syndrome and Wilson's disease), the endogenous levels of trace minerals must be tightly regulated. Many studies have demonstrated the existence of systems that maintain trace element homeostasis, and these systems are highly conserved in multiple species ranging from yeast to mice. As a model for studying trace mineral metabolism, the zebrafish is indispensable to researchers. Several large-scale mutagenesis screens have been performed in zebrafish, and these screens led to the identification of a series of metal transporters and the generation of several mutant lines, providing in-depth functional analysis at the system level. Moreover, because of their developmental advantages, zebrafish have also been used in mineral metabolism-related chemical screens and toxicology studies. Here, we systematically review the major findings of trace element homeostasis studies using the zebrafish model, with a focus on iron, zinc, copper, selenium, manganese, and iodine. We also provide a homology analysis of trace mineral transporters in fish, mice and humans. Finally, we discuss the evidence that the zebrafish is an ideal experimental tool for uncovering novel mechanisms of trace mineral metabolism and for improving approaches to treat mineral imbalance-related diseases. PMID:24639652

  5. The VLab repository of thermodynamics and thermoelastic properties of minerals

    NASA Astrophysics Data System (ADS)

    Da Silveira, P. R.; Sarkar, K.; Wentzcovitch, R. M.; Shukla, G.; Lindemann, W.; Wu, Z.

    2015-12-01

    Thermodynamic and thermoelastic properties of minerals at planetary interior conditions are essential as input for geodynamics simulations and for the interpretation of seismic tomography models. Precise experimental determination of these properties at such extreme conditions is very challenging. Therefore, ab initio calculations play an essential role in this context, but at the cost of great computational effort and memory use. Setting up a widely accessible and versatile mineral physics database can avoid unnecessary repetition of such computationally intensive calculations. Access to such data facilitates interaction across fields and can more quickly advance insights into deep-Earth processes. Hosted by the Minnesota Supercomputing Institute, the Virtual Laboratory for Earth and Planetary Materials (VLab) was designed to develop and promote the theory of planetary materials using distributed, high-throughput quantum calculations. VLab hosts an interactive database of thermodynamic and thermoelastic properties of minerals computed ab initio. Such properties can be obtained according to the user's preference. The database is accompanied by interactive visualization tools, allowing users to repeat and build upon previously published results. Using VLab2015, we have evaluated thermoelastic properties such as the elastic coefficients (Cij); the Voigt, Reuss, and Voigt-Reuss-Hill aggregate averages for the bulk (K) and shear (G) moduli; the shear wave velocity (Vs); the longitudinal wave velocity (Vp); and the bulk sound velocity (VΦ) for several important minerals. The developed web services are general and can be used for crystals of any symmetry. Results can be tabulated, plotted, or downloaded from the VLab website according to the user's preference.
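
    As a minimal sketch of the aggregate-averaging step mentioned above, the standard Voigt, Reuss, and Hill formulas can be applied to a 6x6 stiffness matrix; the test values below are illustrative, not VLab data.

    ```python
    import numpy as np

    def voigt_reuss_hill(C, rho):
        """Aggregate moduli and sound velocities from a 6x6 stiffness matrix C
        (GPa, Voigt notation) and density rho (kg/m^3), for any crystal symmetry."""
        S = np.linalg.inv(C)                      # compliance matrix
        K_V = (C[0,0]+C[1,1]+C[2,2] + 2*(C[0,1]+C[0,2]+C[1,2])) / 9.0
        G_V = (C[0,0]+C[1,1]+C[2,2] - (C[0,1]+C[0,2]+C[1,2])
               + 3*(C[3,3]+C[4,4]+C[5,5])) / 15.0
        K_R = 1.0 / (S[0,0]+S[1,1]+S[2,2] + 2*(S[0,1]+S[0,2]+S[1,2]))
        G_R = 15.0 / (4*(S[0,0]+S[1,1]+S[2,2]) - 4*(S[0,1]+S[0,2]+S[1,2])
                      + 3*(S[3,3]+S[4,4]+S[5,5]))
        K, G = 0.5*(K_V+K_R), 0.5*(G_V+G_R)       # Hill averages
        K_Pa, G_Pa = K*1e9, G*1e9
        Vp = np.sqrt((K_Pa + 4*G_Pa/3) / rho)     # longitudinal velocity
        Vs = np.sqrt(G_Pa / rho)                  # shear velocity
        Vphi = np.sqrt(K_Pa / rho)                # bulk sound velocity
        return K, G, Vp, Vs, Vphi

    # Example: an isotropic test matrix with K = 130 GPa, G = 80 GPa, rho = 3500 kg/m^3
    K, G = 130.0, 80.0
    C = np.zeros((6, 6))
    C[:3, :3] = K - 2*G/3
    C[0,0] = C[1,1] = C[2,2] = K + 4*G/3
    C[3,3] = C[4,4] = C[5,5] = G
    print(voigt_reuss_hill(C, 3500.0))
    ```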

  6. Neutron spectrometry - An essential tool for diagnosing implosions at the National Ignition Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mackinnon, A J; Johnson, M G; Frenje, J A

    DT neutron yield (Y{sub n}), ion temperature (T{sub i}) and down-scatter ratio (dsr) determined from measured neutron spectra are essential metrics for diagnosing the performance of Inertial Confinement Fusion (ICF) implosions at the National Ignition Facility (NIF). A suite of neutron-Time-Of-Flight (nTOF) spectrometers and a Magnetic Recoil Spectrometer (MRS) have been implemented in different locations around the NIF target chamber, providing good implosion coverage and the redundancy required for reliable measurements of Y{sub n}, T{sub i} and dsr. From the measured dsr value, an areal density ({rho}R) is determined from the relationship {rho}R{sub tot} (g/cm{sup 2}) = (20.4 {+-} 0.6) x dsr{sub 10-12 MeV}. The proportionality constant is determined considering implosion geometry, neutron attenuation and energy range used for the dsr measurement. To ensure high accuracy in the measurements, a series of commissioning experiments using exploding pushers have been used for in situ calibration. The spectrometers are now performing to the required accuracy, as indicated by the good agreement between the different measurements over several commissioning shots. In addition, recent data obtained with the MRS and nTOFs indicate that the implosion performance of cryogenically layered DT implosions, characterized by the experimental Ignition Threshold Factor (ITFx) which is a function of dsr (or fuel {rho}R) and Y{sub n}, has improved almost two orders of magnitude since the first shot in September, 2010.
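
    A short worked example of the quoted relationship, with simple propagation of the calibration uncertainty; the dsr value used is illustrative, not a measured NIF result.

    ```python
    # Worked example of rhoR_tot (g/cm^2) = (20.4 +/- 0.6) * dsr from the text.
    dsr, dsr_err = 0.045, 0.003          # hypothetical down-scatter ratio (10-12 MeV)
    k, k_err = 20.4, 0.6                 # proportionality constant quoted above

    rhoR = k * dsr
    rhoR_err = rhoR * ((k_err / k) ** 2 + (dsr_err / dsr) ** 2) ** 0.5
    print(f"rhoR = {rhoR:.3f} +/- {rhoR_err:.3f} g/cm^2")   # ~0.918 +/- 0.067
    ```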

  7. Neutron spectrometry-An essential tool for diagnosing implosions at the National Ignition Facility (invited)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, M. Gatu; Frenje, J. A.; Casey, D. T.

    2012-10-15

    DT neutron yield (Y{sub n}), ion temperature (T{sub i}), and down-scatter ratio (dsr) determined from measured neutron spectra are essential metrics for diagnosing the performance of inertial confinement fusion (ICF) implosions at the National Ignition Facility (NIF). A suite of neutron-time-of-flight (nTOF) spectrometers and a magnetic recoil spectrometer (MRS) have been implemented in different locations around the NIF target chamber, providing good implosion coverage and the complementarity required for reliable measurements of Y{sub n}, T{sub i}, and dsr. From the measured dsr value, an areal density ({rho}R) is determined through the relationship {rho}R{sub tot} (g/cm{sup 2}) = (20.4 {+-} 0.6) x dsr{sub 10-12 MeV}. The proportionality constant is determined considering implosion geometry, neutron attenuation, and energy range used for the dsr measurement. To ensure high accuracy in the measurements, a series of commissioning experiments using exploding pushers have been used for in situ calibration of the as-built spectrometers, which are now performing to the required accuracy. Recent data obtained with the MRS and nTOFs indicate that the implosion performance of cryogenically layered DT implosions, characterized by the experimental ignition threshold factor (ITFx), which is a function of dsr (or fuel {rho}R) and Y{sub n}, has improved almost two orders of magnitude since the first shot in September, 2010.

  8. The tools of an evidence-based culture: implementing clinical-practice guidelines in an Israeli HMO.

    PubMed

    Kahan, Natan R; Kahan, Ernesto; Waitman, Dan-Andrei; Kitai, Eliezer; Chintz, David P

    2009-09-01

    Although clinical-practice guidelines (CPGs) are implemented on the assumption that they will improve the quality, efficiency, and consistency of health care, they generally have limited effect in changing physicians' behavior. The purpose of this study was to design and implement an effective program for formulating, promulgating, and implementing CPGs to foster the development of an evidence-based culture in an Israeli HMO. The authors implemented a four-stage program of stepwise collaborative efforts with academic institutions: developing quantitative tools to evaluate prescribing patterns, updating CPGs, collecting physicians' input via focus groups and quantitative surveys, and conducting a randomized controlled trial of a two-stage, multipronged intervention. The test case for this study was the development, dissemination, and implementation of a CPG for the treatment of acute uncomplicated cystitis in adult women. Interventions in the form of a lecture at a conference and a letter with personalized feedback were implemented, both individually and combined, to improve physicians' rates of prescribing the first-line drug, nitrofurantoin, and, in the absence of nitrofurantoin, adhering to the recommended duration of three days of treatment with ofloxacin. The tools and data-generating capabilities designed and constructed in Stage I of the project were integral components of all subsequent stages of the program. Personalized feedback alone was sufficient to improve the rate of adherence to the guidelines by 19.4% (95% CI = 16.7, 22.1). This study provides a template for introducing the component of experimentation essential for cultivating an evidence-based culture. This process, composed of collaborative efforts between academic institutions and a managed care organization, may be beneficial to other health care systems.

  9. Towards a metadata scheme for the description of materials - the description of microstructures

    NASA Astrophysics Data System (ADS)

    Schmitz, Georg J.; Böttger, Bernd; Apel, Markus; Eiken, Janin; Laschet, Gottfried; Altenfeld, Ralph; Berger, Ralf; Boussinot, Guillaume; Viardin, Alexandre

    2016-01-01

    The property of any material is essentially determined by its microstructure. Numerical models are increasingly the focus of modern engineering as helpful tools for tailoring and optimization of custom-designed microstructures by suitable processing and alloy design. A huge variety of software tools is available to predict various microstructural aspects for different materials. In the general frame of an integrated computational materials engineering (ICME) approach, these microstructure models provide the link between models operating at the atomistic or electronic scales, and models operating on the macroscopic scale of the component and its processing. In view of an improved interoperability of all these different tools it is highly desirable to establish a standardized nomenclature and methodology for the exchange of microstructure data. The scope of this article is to provide a comprehensive system of metadata descriptors for the description of a 3D microstructure. The presented descriptors are limited to a mere geometric description of a static microstructure and have to be complemented by further descriptors, e.g. for properties, numerical representations, kinetic data, and others in the future. Further attributes to each descriptor, e.g. on data origin, data uncertainty, and data validity range, are being defined in ongoing work. The proposed descriptors are intended to be independent of any specific numerical representation. The descriptors defined in this article may serve as a first basis for standardization and will simplify the data exchange between different numerical models, as well as promote the integration of experimental data into numerical models of microstructures. An HDF5 template data file for a simple, three-phase Al-Cu microstructure, based on the defined descriptors, complements this article.
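
    As a rough illustration of such an HDF5 microstructure container, the sketch below writes a minimal file with h5py; the group, dataset, and attribute names are hypothetical placeholders, not the descriptors defined in the article.

    ```python
    import h5py
    import numpy as np

    # Minimal, hypothetical HDF5 microstructure container in the spirit of the
    # template described above. Names and values are placeholders only.
    with h5py.File("al_cu_microstructure_demo.h5", "w") as f:
        micro = f.create_group("microstructure")
        micro.attrs["material_system"] = "Al-Cu"
        micro.attrs["number_of_phases"] = 3

        # Voxelised phase-index field (32^3 grid, values 0..2 for three phases)
        phase_field = np.random.randint(0, 3, size=(32, 32, 32), dtype=np.uint8)
        dset = micro.create_dataset("phase_index", data=phase_field, compression="gzip")
        dset.attrs["voxel_size_um"] = 0.5

        # Per-phase metadata
        phases = micro.create_group("phases")
        for idx, name in enumerate(["alpha_Al", "theta_Al2Cu", "liquid"]):
            g = phases.create_group(name)
            g.attrs["phase_index"] = idx
    ```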

  10. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  11. Towards a metadata scheme for the description of materials - the description of microstructures.

    PubMed

    Schmitz, Georg J; Böttger, Bernd; Apel, Markus; Eiken, Janin; Laschet, Gottfried; Altenfeld, Ralph; Berger, Ralf; Boussinot, Guillaume; Viardin, Alexandre

    2016-01-01

    The property of any material is essentially determined by its microstructure. Numerical models are increasingly the focus of modern engineering as helpful tools for tailoring and optimization of custom-designed microstructures by suitable processing and alloy design. A huge variety of software tools is available to predict various microstructural aspects for different materials. In the general frame of an integrated computational materials engineering (ICME) approach, these microstructure models provide the link between models operating at the atomistic or electronic scales, and models operating on the macroscopic scale of the component and its processing. In view of an improved interoperability of all these different tools it is highly desirable to establish a standardized nomenclature and methodology for the exchange of microstructure data. The scope of this article is to provide a comprehensive system of metadata descriptors for the description of a 3D microstructure. The presented descriptors are limited to a mere geometric description of a static microstructure and have to be complemented by further descriptors, e.g. for properties, numerical representations, kinetic data, and others in the future. Further attributes to each descriptor, e.g. on data origin, data uncertainty, and data validity range, are being defined in ongoing work. The proposed descriptors are intended to be independent of any specific numerical representation. The descriptors defined in this article may serve as a first basis for standardization and will simplify the data exchange between different numerical models, as well as promote the integration of experimental data into numerical models of microstructures. An HDF5 template data file for a simple, three-phase Al-Cu microstructure, based on the defined descriptors, complements this article.

  12. First approximations in avalanche model validations using seismic information

    NASA Astrophysics Data System (ADS)

    Roig Lafon, Pere; Suriñach, Emma; Bartelt, Perry; Pérez-Guillén, Cristina; Tapia, Mar; Sovilla, Betty

    2017-04-01

    Avalanche dynamics modelling is an essential tool for snow hazard management. Scenario-based numerical modelling provides quantitative arguments for decision-making. The software tool RAMMS (WSL Institute for Snow and Avalanche Research SLF) is one such tool, often used by government authorities and geotechnical offices. As avalanche models improve, the quality of the numerical results will depend increasingly on user experience in the specification of inputs (e.g. release and entrainment volumes, secondary releases, snow temperature and quality). New model developments must continue to be validated against data from real events to improve performance and reliability. The avalanche group from the University of Barcelona (RISKNAT - UB) has studied the seismic signals generated by avalanches since 1994. Presently, the group manages the seismic installation at SLF's Vallée de la Sionne experimental site (VDLS). At VDLS the recorded seismic signals can be correlated with other avalanche measurement techniques, including both advanced remote sensing methods (radars, videogrammetry) and obstacle-based sensors (pressure, capacitance, optical sender-reflector barriers). This comparison between different measurement techniques allows the group to address the question of whether seismic analysis can be used alone, on additional avalanche tracks, to gain insight and validate numerical avalanche dynamics models in different terrain conditions. In this study, we aim to use the seismic data as an external record of the phenomenon, able to validate RAMMS models. The seismic sensors are considerably easier and cheaper to install than other physical measuring tools, and are able to record data in all atmospheric conditions (e.g. bad weather, low light, or freezing temperatures that render photography and other kinds of sensors unusable). With seismic signals, we record the temporal evolution of the inner and denser parts of the avalanche. We are able to recognize the approximate position of the flow on the slope and make observations of the internal flow dynamics, especially flow regime transitions, which depend on the slope-perpendicular energy fluxes induced by collisions at the basal boundary. The data recorded over several experimental seasons provide a catalogue of seismic data from different types and sizes of avalanches triggered at the VDLS experimental site. These avalanches are also recorded by the SLF instrumentation (FMCW radars, photography, photogrammetry, video, videogrammetry, pressure sensors). We select the best-quality avalanche data to model and establish comparisons. All this information allows us to calibrate parameters governing the internal energy fluxes, especially parameters governing the interaction of the avalanche with the incumbent snow cover. For the comparison between the seismic signal and the RAMMS models, we focus on the temporal evolution of the flow, trying to match the arrival times of the front at the seismic sensor location in the avalanche path. We make direct quantitative comparisons between measurements and model outputs, using modelled flow height, normal stress, velocity, and pressure values, compared with the seismic signal, its envelope and its running spectrogram. In all cases, the first comparisons between the seismic signal and RAMMS outputs are very promising.
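
    As a minimal sketch of the seismic-signal side of such a comparison, assuming a single trace and an arbitrary 10% amplitude threshold for the front-arrival pick (the authors' actual processing is not reproduced here):

    ```python
    import numpy as np
    from scipy.signal import hilbert, spectrogram

    def envelope_and_arrival(trace, fs, threshold_frac=0.1):
        """Signal envelope via the Hilbert transform and a simple threshold-based
        estimate of the avalanche-front arrival time at the seismic sensor.
        The 10% threshold is an arbitrary illustrative choice."""
        env = np.abs(hilbert(trace))                     # amplitude envelope
        onset_idx = np.argmax(env > threshold_frac * env.max())
        return env, onset_idx / fs

    # Synthetic example: background noise followed by an emergent signal
    fs = 100.0
    t = np.arange(0, 120, 1 / fs)
    trace = 0.01 * np.random.randn(t.size)
    ramp = np.linspace(0, 1, t.size - int(40 * fs))
    trace[int(40 * fs):] += np.sin(2 * np.pi * 8 * t[int(40 * fs):]) * ramp

    env, t_arrival = envelope_and_arrival(trace, fs)
    f, tt, Sxx = spectrogram(trace, fs=fs, nperseg=256)  # running spectrogram
    print(f"estimated arrival: {t_arrival:.1f} s; spectrogram shape: {Sxx.shape}")
    ```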

  13. A new method to study ferroelectrics using the remanent Henkel plots

    NASA Astrophysics Data System (ADS)

    Vopson, Melvin M.

    2018-05-01

    The analysis of experimental curves constructed from dc demagnetization and isothermal remanent magnetization, known as Henkel and delta M plots, has served for over 53 years as an important tool for the characterization of interactions in ferromagnets. In this article we address the question of whether the same experimental technique can be applied to the study of ferroelectric systems. The successful measurement of the equivalent dc depolarization and isothermal remanent polarization curves and the construction of the Henkel and delta P plots for ferroelectrics is reported here. The full measurement protocol is provided, together with experimental examples for two ferroelectric ceramic samples. This new measurement technique is an invaluable experimental tool that could be used to further advance our understanding of ferroelectric materials and their applications.
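
    A minimal sketch of the standard remanence construction, assuming normalized curves and the Wohlfarth relation m_dcd(H) = 1 - 2·m_irm(H); the ferroelectric delta P analogue proposed above follows the same algebra with remanent polarization in place of magnetization. All curve values below are invented.

    ```python
    import numpy as np

    def delta_m(m_irm, m_dcd):
        """Remanence analysis: for non-interacting single-domain particles the
        Wohlfarth relation m_dcd(H) = 1 - 2*m_irm(H) holds, so deviations
        delta_m(H) = m_dcd(H) - (1 - 2*m_irm(H)) signal interactions.
        Both inputs are remanences normalized to their saturation values."""
        return np.asarray(m_dcd) - (1.0 - 2.0 * np.asarray(m_irm))

    # Illustrative normalized remanence curves on a common field grid
    h = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
    m_irm = np.array([0.05, 0.25, 0.60, 0.90, 1.00])
    m_dcd = np.array([0.85, 0.45, -0.30, -0.85, -1.00])
    print(delta_m(m_irm, m_dcd))   # negative values suggest demagnetizing-like coupling
    ```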

  14. Evaluation of the Leishmanicidal and Cytotoxic Potential of Essential Oils Derived From Ten Colombian Plants

    PubMed Central

    Sanchez-Suarez, JF; Riveros, I; Delgado, G

    2013-01-01

    Background The leishmanicidal and cytotoxic activities of ten essential oils obtained from ten plant specimens were evaluated. Methods Essential oils were obtained by the steam distillation of plant leaves without any prior processing. Cytotoxicity was tested on J774 macrophages, and leishmanicidal activity was assessed against four species of Leishmania associated with cutaneous leishmaniasis. Results Seven essential oils exhibited activity against Leishmania parasites, five of which were toxic against J774 macrophages. Selectivity indices of >6 and 13 were calculated for the essential oils of Ocimum basilicum and Origanum vulgare, respectively. Conclusion The essential oil of Ocimum basilicum was active against promastigotes of Leishmania and innocuous to J774 macrophages at concentrations up to 1600 µg/mL and should be further investigated for leishmanicidal activity in other in vitro and in vivo experimental models. PMID:23682270

  15. Personal Reflections on Observational and Experimental Research Approaches to Childhood Psychopathology

    ERIC Educational Resources Information Center

    Rapoport, Judith L.

    2009-01-01

    The past 50 years have seen dramatic changes in childhood psychopathology research. The goal of this overview is to contrast observational and experimental research approaches; both have grown more complex such that the boundary between these approaches may be blurred. Both are essential. Landmark observational studies with long-term follow-up…

  16. A Quasi-Experimental Examination: Cognitive Sequencing of Instruction Using Experiential Learning Theory for STEM Concepts in Agricultural Education

    ERIC Educational Resources Information Center

    Smith, Kasee L.; Rayfield, John

    2017-01-01

    Understanding methods for effectively instructing STEM education concepts is essential in the current climate of education (Freeman, Marginson, & Tyler 2014). Kolb's experiential learning theory (ELT) outlines four specific modes of learning, based on preferences for grasping and transforming information. This quasi-experimental study was…

  17. Lab Notebooks as Scientific Communication: Investigating Development from Undergraduate Courses to Graduate Research

    ERIC Educational Resources Information Center

    Stanley, Jacob T.; Lewandowski, H. J.

    2016-01-01

    In experimental physics, lab notebooks play an essential role in the research process. For all of the ubiquity of lab notebooks, little formal attention has been paid to addressing what is considered "best practice" for scientific documentation and how researchers come to learn these practices in experimental physics. Using interviews…

  18. The rights of man and animal experimentation.

    PubMed Central

    Martin, J

    1990-01-01

    Since emotions give contradictory signals about animal experimentation in medical science, man's relationship to animals must be based upon reason. Thomas Aquinas argues that man is essentially different from animals because man's intellectual processes show evidence of an abstract mechanism not possessed by animals. Man's rights arise in association with this essential difference. The consequence is that only man possesses true rights by Aquinas's definition; animals have them only by analogy. However, cruelty to animals is illicit and they should be protected, principally not because they have rights, but because he who is cruel to animals is more likely to be cruel to his fellowman. If there is a need for animal experimentation in science for the good of man, this approach gives philosophical justification for experimentation, since man's well-being must come before that of animals because of his unique possession of rights. However, those experiments should be carried out in the kindest way possible, to promote kindness towards man. To see man as solely part of a biological continuum in competition for rights with those beings close to him biologically, detracts from man's dignity. PMID:2135948

  19. Implementation of structure-mapping inference by event-file binding and action planning: a model of tool-improvisation analogies.

    PubMed

    Fields, Chris

    2011-03-01

    Structure-mapping inferences are generally regarded as dependent upon relational concepts that are understood and expressible in language by subjects capable of analogical reasoning. However, tool-improvisation inferences are executed by members of a variety of non-human primate and other species. Tool improvisation requires correctly inferring the motion and force-transfer affordances of an object; hence tool improvisation requires structure mapping driven by relational properties. Observational and experimental evidence can be interpreted to indicate that structure-mapping analogies in tool improvisation are implemented by multi-step manipulation of event files by binding and action-planning mechanisms that act in a language-independent manner. A functional model of language-independent event-file manipulations that implement structure mapping in the tool-improvisation domain is developed. This model provides a mechanism by which motion and force representations commonly employed in tool-improvisation structure mappings may be sufficiently reinforced to be available to inwardly directed attention and hence conceptualization. Predictions and potential experimental tests of this model are outlined.

  20. A Therapeutic Approach for Wound Healing by Using Essential Oils of Cupressus and Juniperus Species Growing in Turkey

    PubMed Central

    Tumen, Ibrahim; Süntar, Ipek; Keleş, Hikmet; Küpeli Akkol, Esra

    2012-01-01

    Juniperus and Cupressus genera are mainly used as diuretics, stimulants, and antiseptics, and for the common cold and wound healing, in Turkish folk medicine. In the present study, essential oils obtained from cones of Cupressus and berries of Juniperus were evaluated for their wound healing and anti-inflammatory effects. In vivo wound healing activity was evaluated using linear incision and circular excision experimental wound models, assessment of hydroxyproline content, and subsequent histopathological analysis. The healing potential was comparatively assessed against the reference ointment Madecassol. Additionally, the acetic-acid-induced capillary permeability test was used to assess the oils' anti-inflammatory activity. The essential oils of J. oxycedrus subsp. oxycedrus and J. phoenicea demonstrated the highest activities, while the rest of the species did not show any significant wound healing effect. The experimental study revealed that J. oxycedrus subsp. oxycedrus and J. phoenicea display remarkable wound healing and anti-inflammatory activities, which supports the folkloric use of the plants. PMID:21941588
