Sample records for the search phrase "laboratory tools widely"

  1. Measuring laboratory-based influenza surveillance capacity: development of the 'International Influenza Laboratory Capacity Review' Tool.

    PubMed

    Muir-Paulik, S A; Johnson, L E A; Kennedy, P; Aden, T; Villanueva, J; Reisdorf, E; Humes, R; Moen, A C

    2016-01-01

    The 2005 International Health Regulations (IHR 2005) emphasized the importance of laboratory capacity to detect emerging diseases including novel influenza viruses. To support IHR 2005 requirements and the need to enhance influenza laboratory surveillance capacity, the Association of Public Health Laboratories (APHL) and the Centers for Disease Control and Prevention (CDC) Influenza Division developed the International Influenza Laboratory Capacity Review (Tool). Data from 37 assessments were reviewed and analyzed to verify that the quantitative analysis results accurately depicted a laboratory's capacity and capabilities. Subject matter experts in influenza and laboratory practice used an iterative approach to develop the Tool, incorporating feedback and lessons learnt through piloting and implementation. To systematically analyze assessment data, a quantitative framework for analysis was added to the Tool. The review indicated that changes in scores consistently reflected enhanced or decreased capacity. The review process also validated the utility of adding a quantitative analysis component to the assessments and the benefit of establishing a baseline from which to compare future assessments in a standardized way. Use of the Tool has provided APHL, CDC and each assessed laboratory with a standardized analysis of the laboratory's capacity. The information generated is used to improve laboratory systems for laboratory testing and enhance influenza surveillance globally. We describe the development of the Tool and lessons learnt. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. How to Recognize Success and Failure: Practical Assessment of an Evolving, First-Semester Laboratory Program Using Simple, Outcome-Based Tools

    ERIC Educational Resources Information Center

    Gron, Liz U.; Bradley, Shelly B.; McKenzie, Jennifer R.; Shinn, Sara E.; Teague, M. Warfield

    2013-01-01

    This paper presents the use of simple, outcome-based assessment tools to design and evaluate the first semester of a new introductory laboratory program created to teach green analytical chemistry using environmental samples. This general chemistry laboratory program, like many introductory courses, has a wide array of stakeholders within and…

  3. SLIPTA e-Tool improves laboratory audit process in Vietnam and Cambodia

    PubMed Central

    Nguyen, Thuong T.; McKinney, Barbara; Pierson, Antoine; Luong, Khue N.; Hoang, Quynh T.; Meharwal, Sandeep; Carvalho, Humberto M.; Nguyen, Cuong Q.; Nguyen, Kim T.

    2014-01-01

    Background: The Stepwise Laboratory Quality Improvement Process Towards Accreditation (SLIPTA) checklist is used worldwide to drive quality improvement in laboratories in developing countries and to assess the effectiveness of interventions such as the Strengthening Laboratory Management Toward Accreditation (SLMTA) programme. However, the paper-based format of the checklist makes administration cumbersome and limits timely analysis and communication of results. Development of the e-Tool: In early 2012, the SLMTA team in Vietnam developed an electronic SLIPTA checklist tool. The e-Tool was pilot tested in Vietnam in mid-2012 and revised. It was used during SLMTA implementation in Vietnam and Cambodia in 2012 and 2013 and further revised based on auditors' feedback about usability. Outcomes: The SLIPTA e-Tool enabled rapid turn-around of audit results, reduced workload and language barriers and facilitated analysis of national results. Benefits of the e-Tool will be magnified with in-country scale-up of laboratory quality improvement efforts and potential expansion to other countries. PMID:29043190

  4. SLIPTA e-Tool improves laboratory audit process in Vietnam and Cambodia.

    PubMed

    Nguyen, Thuong T; McKinney, Barbara; Pierson, Antoine; Luong, Khue N; Hoang, Quynh T; Meharwal, Sandeep; Carvalho, Humberto M; Nguyen, Cuong Q; Nguyen, Kim T; Bond, Kyle B

    2014-01-01

    The Stepwise Laboratory Quality Improvement Process Towards Accreditation (SLIPTA) checklist is used worldwide to drive quality improvement in laboratories in developing countries and to assess the effectiveness of interventions such as the Strengthening Laboratory Management Toward Accreditation (SLMTA) programme. However, the paper-based format of the checklist makes administration cumbersome and limits timely analysis and communication of results. In early 2012, the SLMTA team in Vietnam developed an electronic SLIPTA checklist tool. The e-Tool was pilot tested in Vietnam in mid-2012 and revised. It was used during SLMTA implementation in Vietnam and Cambodia in 2012 and 2013 and further revised based on auditors' feedback about usability. The SLIPTA e-Tool enabled rapid turn-around of audit results, reduced workload and language barriers and facilitated analysis of national results. Benefits of the e-Tool will be magnified with in-country scale-up of laboratory quality improvement efforts and potential expansion to other countries.

  5. Modern dosimetric tools for 60Co irradiation at high containment laboratories

    PubMed Central

    Twardoski, Barri; Feldmann, Heinz; Bloom, Marshall E.; Ward, Joe

    2011-01-01

    Purpose: To evaluate an innovative photo-fluorescent film as a routine dosimetric tool during 60Co irradiations at a high containment biological research laboratory, and to investigate whether manufacturer-provided chamber exposure rates can be used to accurately administer a prescribed dose to biological specimens. Materials and methods: Photo-fluorescent, lithium fluoride film dosimeters and National Institute of Standards and Technology (NIST) transfer dosimeters were co-located in a self-shielded 60Co irradiator and exposed to γ-radiation with doses ranging from 5–85 kGy. Film dose-response relationships were developed for varying temperatures simulating conditions present when irradiating infectious biological specimens. Dose measurement results from NIST transfer dosimeters were compared to doses predicted using manufacturer-provided irradiator chamber exposure rates. Results: The film dosimeter exhibited a photo-fluorescent response signal that was consistent and nearly linear in relationship to γ-radiation exposure over a wide dose range. The dosimeter response also showed negligible effects from dose fractionation and humidity. Significant disparities existed between manufacturer-provided chamber exposure rates and actual doses administered. Conclusion: This study demonstrates the merit of utilizing dosimetric tools to validate the process of exposing dangerous and exotic biological agents to γ-radiation at high containment laboratories. The film dosimeter used in this study can be utilized to eliminate the potential for improperly administering γ-radiation doses. PMID:21961968

  6. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  7. Delivery of laboratory data with World Wide Web technology.

    PubMed

    Hahn, A W; Leon, M A; Klein-Leon, S; Allen, G K; Boon, G D; Patrick, T B; Klimczak, J C

    1997-01-01

    We have developed an experimental World Wide Web (WWW) based system to deliver laboratory results to clinicians in our Veterinary Medical Teaching Hospital. Laboratory results are generated by the clinical pathology section of our Veterinary Medical Diagnostic Laboratory and stored in a legacy information system. This system does not interface directly to the hospital information system, and it cannot be accessed directly by clinicians. Our "meta" system first parses routine print reports and then instantiates the data into a modern, open-architecture relational database using a data model constructed with currently accepted international standards for data representation and communication. The system does not affect either of the existing legacy systems. Location-independent delivery of patient data is via a secure WWW based system which maximizes usability and allows "value-added" graphic representations. The data can be viewed with any web browser. Future extensibility and intra- and inter-institutional compatibility served as key design criteria. The system is in the process of being evaluated using accepted methods of assessment of information technologies.
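
    The system above works by parsing fixed-format print reports into a relational store. As an illustration only, here is a minimal sketch of such a parser in Python, assuming a hypothetical result-line layout (the actual report format of the diagnostic laboratory is not published):

```python
import re

# Hypothetical fixed-format line from a legacy print report;
# the real layout would need its own pattern.
LINE = "GLU   Glucose         112 mg/dL   (74-106)  H"

PATTERN = re.compile(
    r"(?P<code>\S+)\s+(?P<name>.+?)\s+(?P<value>[\d.]+)\s+(?P<units>\S+)"
    r"\s+\((?P<low>[\d.]+)-(?P<high>[\d.]+)\)\s*(?P<flag>[HL]?)"
)

def parse_result(line):
    """Parse one report line into a record suitable for a relational table."""
    m = PATTERN.match(line)
    return {k: v.strip() for k, v in m.groupdict().items()}

print(parse_result(LINE))
# e.g. {'code': 'GLU', 'name': 'Glucose', 'value': '112', ...}
```

    A production parser would also handle malformed lines, repeated headers and multi-line comments, which is where most of the effort in such "meta" systems typically goes.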

  8. Developing Learning Tool of Control System Engineering Using Matrix Laboratory Software Oriented on Industrial Needs

    NASA Astrophysics Data System (ADS)

    Isnur Haryudo, Subuh; Imam Agung, Achmad; Firmansyah, Rifqi

    2018-04-01

    The purpose of this research is to develop learning media for control engineering using Matrix Laboratory software with an industry-requirements approach. Learning media serve as a tool for creating a better and more effective teaching and learning situation because they can accelerate the learning process and enhance the quality of learning. Control-engineering instruction using Matrix Laboratory software can increase students' interest and attention, provide real experience, and foster an independent attitude. The research design follows research and development (R & D) methods modified by a multi-disciplinary team of researchers. The study used a computer-based learning method consisting of a computer and Matrix Laboratory software integrated with props. Matrix Laboratory can visualize the theory and analysis of control systems, integrating computation, visualization and programming in an easy-to-use environment. The resulting instructional media applies mathematical equations in Matrix Laboratory software to a control-system application with a DC motor plant and PID (proportional-integral-derivative) control. This is industrially relevant because PID control is widely used in production processes built on distributed control systems (DCSs), programmable logic controllers (PLCs), and microcontrollers (MCUs).
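
    The PID loop at the heart of such a learning module is straightforward to demonstrate outside Matrix Laboratory as well. The following is a sketch, not the authors' code: a discrete PID controller driving a first-order plant used as a crude stand-in for a DC motor, with made-up gains and time constant:

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.01, steps=2000):
    """Discrete PID loop on a first-order plant dy/dt = (-y + u) / tau,
    a rough stand-in for DC motor speed dynamics."""
    tau = 0.5                 # plant time constant (illustrative)
    y, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt              # I term accumulates error
        deriv = (err - prev_err) / dt     # D term on error change
        u = kp * err + ki * integral + kd * deriv
        y += dt * (-y + u) / tau          # Euler step of the plant
        prev_err = err
    return y

print(simulate_pid(kp=2.0, ki=1.0, kd=0.05))  # converges close to setpoint 1.0
```

    The integral term is what drives the steady-state error to zero, which is the usual classroom motivation for PID over pure proportional control.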

  9. Learning motion concepts using real-time microcomputer-based laboratory tools

    NASA Astrophysics Data System (ADS)

    Thornton, Ronald K.; Sokoloff, David R.

    1990-09-01

    Microcomputer-based laboratory (MBL) tools have been developed which interface to Apple II and Macintosh computers. Students use these tools to collect physical data that are graphed in real time and then can be manipulated and analyzed. The MBL tools have made possible discovery-based laboratory curricula that embody results from educational research. These curricula allow students to take an active role in their learning and encourage them to construct physical knowledge from observation of the physical world. The curricula encourage collaborative learning by taking advantage of the fact that MBL tools present data in an immediately understandable graphical form. This article describes one of the tools—the motion detector (hardware and software)—and the kinematics curriculum. The effectiveness of this curriculum compared to traditional college and university methods for helping students learn basic kinematics concepts has been evaluated by pre- and post-testing and by observation. There is strong evidence for significantly improved learning and retention by students who used the MBL materials, compared to those taught in lecture.
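
    The core computation behind an MBL motion display is simple: differentiate the sampled position stream to get velocity. A minimal sketch (illustrative values, not the actual MBL software) using a central difference:

```python
def finite_difference(samples, dt):
    """Central-difference derivative of evenly sampled data, as an MBL
    package might compute velocity from motion-detector positions."""
    return [(samples[i + 1] - samples[i - 1]) / (2 * dt)
            for i in range(1, len(samples) - 1)]

# Position of a cart moving at a steady 0.5 m/s, sampled at 20 Hz
dt = 0.05
pos = [0.5 * i * dt for i in range(6)]
print(finite_difference(pos, dt))  # ~[0.5, 0.5, 0.5, 0.5]
```

    Applying the same function to the velocity series yields acceleration, which is how the position-velocity-acceleration graph triads in such curricula are produced.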

  10. Test Review for Preschool-Wide Evaluation Tool (PreSET) Manual: Assessing Universal Program-Wide Positive Behavior Support in Early Childhood

    ERIC Educational Resources Information Center

    Rodriguez, Billie Jo

    2013-01-01

    The Preschool-Wide Evaluation Tool (PreSET; Steed & Pomerleau, 2012) is published by Paul H. Brookes Publishing Company in Baltimore, MD. The PreSET purports to measure universal and program-wide features of early childhood programs' implementation fidelity of program-wide positive behavior intervention and support (PW-PBIS) and is,…

  11. World Wide Web Pages--Tools for Teaching and Learning.

    ERIC Educational Resources Information Center

    Beasley, Sarah; Kent, Jean

    Created to help educators incorporate World Wide Web pages into teaching and learning, this collection of Web pages presents resources, materials, and techniques for using the Web. The first page focuses on tools for teaching and learning via the Web, providing pointers to sites containing the following: (1) course materials for both distance and…

  12. Simulation of the hydraulic performance of highway filter drains through laboratory models and stormwater management tools.

    PubMed

    Sañudo-Fontaneda, Luis A; Jato-Espino, Daniel; Lashford, Craig; Coupe, Stephen J

    2017-05-23

    Road drainage is one of the most relevant assets in transport infrastructure due to its inherent influence on traffic management and road safety. Highway filter drains (HFDs), also known as "French Drains", are the main drainage system currently in use in the UK, throughout 7000 km of its strategic road network. Despite being a widespread technique across the whole country, little research has been completed on their design considerations and the subsequent impact on their hydraulic performance, representing a gap in the field. Laboratory experiments have been proven to be a reliable indicator for the simulation of the hydraulic performance of stormwater best management practices (BMPs). In addition, stormwater management tools (SMT) have been the design tools of choice for BMPs among practitioners all over the world. In this context, this research investigates the hydraulic performance of HFDs by comparing the results from laboratory simulation with two widely used SMT: the US EPA's stormwater management model (SWMM) and MicroDrainage®. Statistical analyses were applied to a series of simulated rainfall scenarios, showing a high level of agreement between the results obtained in the laboratory and with SMT, as indicated by the high Nash-Sutcliffe and R² coefficients and the low root-mean-square error (RMSE) values reached, which validates the usefulness of SMT for determining the hydraulic performance of HFDs.
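
    The goodness-of-fit statistics named in the abstract are standard and easy to reproduce. A sketch with made-up outflow series (not the study's data):

```python
import math

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
    1.0 is a perfect match; values near 1 indicate good agreement."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

def rmse(observed, simulated):
    """Root-mean-square error between paired series (0 = perfect match)."""
    n = len(observed)
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated)) / n)

# Illustrative laboratory vs simulated outflow hydrographs (L/s)
obs = [0.0, 1.2, 3.4, 5.1, 4.0, 2.2, 0.9]
sim = [0.1, 1.0, 3.6, 4.9, 4.2, 2.0, 1.0]
print(nash_sutcliffe(obs, sim))  # close to 1.0: good agreement
print(rmse(obs, sim))            # close to 0: good agreement
```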

  13. Tools for Scientific Thinking: Microcomputer-Based Laboratories for the Naive Science Learner.

    ERIC Educational Resources Information Center

    Thornton, Ronald K.

    A promising new development in science education is the use of microcomputer-based laboratory tools that allow for student-directed data acquisition, display, and analysis. Microcomputer-based laboratories (MBL) make use of inexpensive microcomputer-connected probes to measure such physical quantities as temperature, position, and various…

  14. WorldWide Web: Hypertext from CERN.

    ERIC Educational Resources Information Center

    Nickerson, Gord

    1992-01-01

    Discussion of software tools for accessing information on the Internet focuses on the WorldWideWeb (WWW) system, which was developed at the European Particle Physics Laboratory (CERN) in Switzerland to build a worldwide network of hypertext links using available networking technology. Its potential for use with multimedia documents is also…

  15. Implementation of a digital preparation validation tool in dental skills laboratory training.

    PubMed

    Kozarovska, A; Larsson, C

    2018-05-01

    To describe the implementation of a digital tool for preparation validation and evaluate it as an aid in students' self-assessment. Students in the final semester of skills laboratory training were asked to use a digital preparation validation tool (PVT) when performing two different tasks: preparation of crowns for teeth 11 and 21. The students were divided into two groups. Group A self-assessed and scanned all three attempts at 21 ("prep-and-scan"). Group B self-assessed all attempts, chose the best one and scanned it ("best-of-three"). The situation was reversed for 11. The students assessed five parameters of the preparation and marked them as approved (A) or failed (F). These marks were compared with the information from the PVT. The students also completed a questionnaire; each question was rated from 1 to 5. Teachers' opinions were collected at staff meetings throughout the project. Most students in the "prep-and-scan" groups showed an increase in agreement between their self-assessment and the information from the PVT, whereas students in the "best-of-three" groups showed lower levels of agreement. All students rated the PVT positively. Most strongly agreed that the tool was helpful in developing skills (mean 4.15), easy to use (mean 4.23) and that it added benefits in comparison to existing assessment tools (mean 4.05). They did not, however, fully agree that the tool is time-efficient (mean 2.55), and they did not consider it a substitute for verbal teacher feedback. Teachers' feedback suggested advantages of the tool in ease of use, visual aid, and increased student interest and motivation during skills laboratory training; however, teachers did not notice a reduced need for verbal feedback. Within the limitations of the study, our conclusion is that a digital PVT may be a valuable adjunct to other assessment tools in skills laboratory training. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Information systems as a quality management tool in clinical laboratories

    NASA Astrophysics Data System (ADS)

    Schmitz, Vanessa; Rosecler Bez el Boukhari, Marta

    2007-11-01

    This article describes information systems as a quality management tool in clinical laboratories. The quality of laboratory analyses is of fundamental importance for health professionals in aiding appropriate diagnosis and treatment. Information systems allow the automation of internal quality management processes, using standard sample tests, Levey-Jennings charts and Westgard multirule analysis. This simplifies evaluation and interpretation of quality tests and reduces the possibility of human error. This study proposes the development of an information system with appropriate functions and costs for the automation of internal quality control in small and medium-sized clinical laboratories. To this end, it evaluates the functions and usability of two commercial software products designed for this purpose, identifying the positive features of each, so that these can be taken into account during the development of the proposed system.
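
    The Westgard multirules mentioned above are deterministic checks on standardized control results, so they are a natural target for the automation the article proposes. A sketch of two common rules (illustrative data; real implementations cover the full rule set, e.g. R-4s, 4-1s, 10-x):

```python
def westgard_flags(results, mean, sd):
    """Evaluate two common Westgard QC rules on sequential control results.

    1-3s: a single result beyond mean ± 3 SD (rejection rule).
    2-2s: two consecutive results beyond mean ± 2 SD on the same side.
    Returns a list of (index, rule) violations.
    """
    flags = []
    z = [(r - mean) / sd for r in results]   # standardized deviations
    for i, zi in enumerate(z):
        if abs(zi) > 3:
            flags.append((i, "1-3s"))
        if i > 0 and abs(zi) > 2 and abs(z[i - 1]) > 2 and zi * z[i - 1] > 0:
            flags.append((i, "2-2s"))
    return flags

# Daily QC results for a control with target mean 100 and SD 2
qc = [100.5, 99.2, 104.3, 104.6, 100.1, 93.5]
print(westgard_flags(qc, mean=100.0, sd=2.0))  # [(3, '2-2s'), (5, '1-3s')]
```

    Plotting the same standardized values against ±1, ±2 and ±3 SD lines gives the Levey-Jennings chart the article refers to.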

  17. Duplicate laboratory test reduction using a clinical decision support tool.

    PubMed

    Procop, Gary W; Yerian, Lisa M; Wyllie, Robert; Harrison, A Marc; Kottke-Marchant, Kandice

    2014-05-01

    Duplicate laboratory tests that are unwarranted increase unnecessary phlebotomy, which contributes to iatrogenic anemia, decreased patient satisfaction, and increased health care costs. We employed a clinical decision support tool (CDST) to block unnecessary duplicate test orders during the computerized physician order entry (CPOE) process. We assessed laboratory cost savings after 2 years and searched for untoward patient events associated with this intervention. This CDST blocked 11,790 unnecessary duplicate test orders in these 2 years, which resulted in a cost savings of $183,586. There were no untoward effects reported associated with this intervention. The movement to CPOE affords real-time interaction between the laboratory and the physician through CDSTs that signal duplicate orders. These interactions save health care dollars and should also increase patient satisfaction and well-being.
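
    The blocking logic of such a CDST reduces to checking each new order against the patient's recent order history. A minimal sketch, assuming hypothetical retest intervals (the study does not publish its rule table):

```python
from datetime import datetime, timedelta

# Hypothetical minimum retest intervals per test code (illustrative only)
MIN_RETEST_INTERVAL = {
    "HBA1C": timedelta(days=90),   # hemoglobin A1c rarely useful more often
    "TSH": timedelta(days=30),
    "CBC": timedelta(hours=24),
}

def is_duplicate(test_code, order_time, prior_orders):
    """Return True if the same test was already ordered within its
    minimum retest interval. prior_orders is a list of
    (test_code, datetime) pairs for this patient; tests without a
    configured interval are never blocked."""
    window = MIN_RETEST_INTERVAL.get(test_code)
    if window is None:
        return False
    return any(code == test_code and order_time - t < window
               for code, t in prior_orders)

history = [("HBA1C", datetime(2024, 3, 1, 9, 0))]
print(is_duplicate("HBA1C", datetime(2024, 4, 15, 8, 0), history))  # True
print(is_duplicate("CBC", datetime(2024, 4, 15, 8, 0), history))    # False
```

    In a real CPOE integration the flagged order would trigger an alert with an override path rather than a silent block.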

  18. The intelligent clinical laboratory as a tool to increase cancer care management productivity.

    PubMed

    Mohammadzadeh, Niloofar; Safdari, Reza

    2014-01-01

    Studies of the causes of cancer, early detection, prevention and treatment need accurate, comprehensive, and timely cancer data. The clinical laboratory provides important cancer information needed by physicians, which influences clinical decisions regarding treatment, diagnosis and patient monitoring. Poor communication between health care providers and clinical laboratory personnel can lead to medical errors and wrong decisions in providing cancer care. Because of the key impact of laboratory information on cancer diagnosis and treatment, the quality of the tests, lab reports, and appropriate lab management are very important. A laboratory information management system (LIMS) can play an important role in diagnosis, provide fast and effective access to cancer data, decrease redundancy and costs, and facilitate the integration and collection of data from different types of instruments and systems. In spite of these significant advantages, LIMS is limited by factors such as problems in adapting to new instruments that may change existing work processes. Applying intelligent software alongside existing information systems can remove these restrictions and offers important additional benefits, including adding non-laboratory-generated information to reports, facilitating decision making, and improving the quality and productivity of cancer care services. Laboratory systems must be flexible enough to change and capable of developing with, and benefiting from, intelligent devices. Intelligent laboratory information management systems need to benefit from informatics tools and the latest technologies, such as open-source software. The aim of this commentary is to survey the application, opportunities and necessity of the intelligent clinical laboratory as a tool to increase cancer care management productivity.

  19. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools.

    PubMed

    Deshmukh, Rupesh K; Sonah, Humira; Bélanger, Richard R

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is

  20. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools

    PubMed Central

    Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is

  1. [The balanced scorecard used as a management tool in a clinical laboratory: internal business processes indicators].

    PubMed

    Salinas La Casta, Maria; Flores Pardo, Emilio; Uris Selles, Joaquín

    2009-01-01

    Objective: to propose a set of indicators as a management tool for a clinical laboratory, using the internal business processes perspective of the balanced scorecard. Methods: the proposed indicators are obtained from different sources: the external proficiency testing of the Valencia Community Government, internal surveys, and laboratory information system registers. Results of one year of testing-process proportion indicators are shown. Results: internal management indicators are proposed (process, appropriateness and proficiency testing). The process indicator results show gradual improvement since their establishment. Conclusions: after one year of using a conceptually solid set of balanced scorecard internal business processes indicators, the results obtained validate their usefulness as a laboratory management tool.

  2. Average of delta: a new quality control tool for clinical laboratories.

    PubMed

    Jones, Graham R D

    2016-01-01

    Average of normals is a tool used to control assay performance using the average of a series of results from patients' samples. Delta checking is a process of identifying errors in individual patient results by reviewing the difference from previous results of the same patient. This paper introduces a novel alternate approach, average of delta, which combines these concepts to use the average of a number of sequential delta values to identify changes in assay performance. Models for average of delta and average of normals were developed in a spreadsheet application. The model assessed the expected scatter of average of delta and average of normals functions and the effect of assay bias for different values of analytical imprecision and within- and between-subject biological variation and the number of samples included in the calculations. The final assessment was the number of patients' samples required to identify an added bias with 90% certainty. The model demonstrated that with larger numbers of delta values, the average of delta function was tighter (lower coefficient of variation). The optimal number of samples for bias detection with average of delta was likely to be between 5 and 20 for most settings and that average of delta outperformed average of normals when the within-subject biological variation was small relative to the between-subject variation. Average of delta provides a possible additional assay quality control tool which theoretical modelling predicts may be more valuable than average of normals for analytes where the group biological variation is wide compared with within-subject variation and where there is a high rate of repeat testing in the laboratory patient population. © The Author(s) 2015.
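
    The average-of-delta calculation described above can be sketched in a few lines. This is an illustration of the concept with made-up sodium results, not the paper's spreadsheet model:

```python
def deltas(history):
    """Delta values: the difference between each patient result and that
    patient's previous result for the same analyte."""
    out = []
    for results in history.values():
        out.extend(b - a for a, b in zip(results, results[1:]))
    return out

def average_of_delta(delta_values, window):
    """Moving average over the most recent `window` delta values.
    A stable assay keeps this average near zero; a persistent offset
    suggests newly introduced analytical bias."""
    recent = delta_values[-window:]
    return sum(recent) / len(recent)

# Repeat sodium results (mmol/L) per patient; the second result of each
# pair is drawn after a hypothetical +2 mmol/L assay shift
history = {"p1": [138, 140], "p2": [141, 143], "p3": [136, 139], "p4": [140, 141]}
d = deltas(history)                    # [2, 2, 3, 1]
print(average_of_delta(d, window=4))   # 2.0 -> consistent positive shift
```

    Individual deltas of 1-3 mmol/L are unremarkable on their own; it is their consistently positive average that signals the shift, which is the paper's central idea.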

  3. WebPresent: a World Wide Web-based telepresentation tool for physicians

    NASA Astrophysics Data System (ADS)

    Sampath-Kumar, Srihari; Banerjea, Anindo; Moshfeghi, Mehran

    1997-05-01

    In this paper, we present the design architecture and the implementation status of WebPresent, a World Wide Web-based tele-presentation tool. This tool allows a physician to use a conference server workstation to present patient cases to a geographically distributed audience. The audience consists of other physicians collaborating on patients' health care management and physicians participating in continuing medical education. These physicians are at several locations, connected by networks of different bandwidths and capabilities. Audience members also receive the patient case information on different computers, ranging from high-end display workstations to laptops with low-resolution displays. WebPresent is a scalable networked multimedia tool which supports the presentation of hypertext, images, audio, video, and a whiteboard to remote physicians with hospital intranet access. WebPresent allows the audience to receive customized information. The data received can differ in resolution and bandwidth, depending on the availability of resources such as display resolution and network bandwidth.

  4. Using Laboratory Experiments and Circuit Simulation IT Tools in an Undergraduate Course in Analog Electronics

    ERIC Educational Resources Information Center

    Baltzis, Konstantinos B.; Koukias, Konstantinos D.

    2009-01-01

    Laboratory-based courses play a significant role in engineering education. Given the role of electronics in engineering and technology, laboratory experiments and circuit simulation IT tools are used in their teaching in several academic institutions. This paper discusses the characteristics and benefits of both methods. The content and structure…

  5. T.I.M.S: TaqMan Information Management System, tools to organize data flow in a genotyping laboratory

    PubMed Central

    Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico

    2005-01-01

    Background: Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. The TaqMan technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a laboratory information management system (LIMS) need tools to organize the data flow. Results: We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, which makes the tools accessible to anyone without programming skills and with basic computer requirements. Conclusion: We have created useful tools focused on the management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system such as a LIMS. PMID:16221298
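
    The parsing step such tools perform can be illustrated with a toy example. The column layout below is hypothetical (real instrument exports differ and the paper's Visual Basic parsers are not reproduced here); the point is converting raw allele signals into genotype calls:

```python
import csv
import io

# Hypothetical TaqMan endpoint export: well, sample, probe fluorescence
RAW = """Well,Sample,Allele1_Rn,Allele2_Rn
A1,S001,2.91,0.35
A2,S002,0.40,3.10
A3,S003,1.80,1.75
"""

def call_genotype(a1, a2, ratio=3.0):
    """Crude genotype call from the two probes' endpoint fluorescence:
    one signal dominating by `ratio` means homozygous, else heterozygous."""
    if a1 > ratio * a2:
        return "AA"
    if a2 > ratio * a1:
        return "BB"
    return "AB"

def parse(raw):
    """Map sample IDs to genotype calls from a CSV export."""
    reader = csv.DictReader(io.StringIO(raw))
    return {row["Sample"]: call_genotype(float(row["Allele1_Rn"]),
                                         float(row["Allele2_Rn"]))
            for row in reader}

print(parse(RAW))  # {'S001': 'AA', 'S002': 'BB', 'S003': 'AB'}
```

    Real pipelines would of course use the instrument software's own clustering calls and flag ambiguous or failed wells rather than hard-thresholding.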

  6. Enabling laboratory EUV research with a compact exposure tool

    NASA Astrophysics Data System (ADS)

    Brose, Sascha; Danylyuk, Serhiy; Tempeler, Jenny; Kim, Hyun-su; Loosen, Peter; Juschkin, Larissa

    2016-03-01

    In this work we present the capabilities of the extreme ultraviolet laboratory exposure tool (EUVLET) designed and realized at RWTH Aachen University, Chair for the Technology of Optical Systems (TOS), in cooperation with the Fraunhofer Institute for Laser Technology (ILT) and Bruker ASC GmbH. The main purpose of this laboratory setup is direct application in research facilities and companies with small-batch production, where the fabrication of high-resolution periodic arrays over large areas is required. The setup can also be utilized for resist characterization and evaluation of pre- and post-exposure processing. The tool utilizes a partially coherent discharge-produced plasma (DPP) source and the Talbot lithography approach, minimizing the number of other critical components to a transmission grating, the photoresist-coated wafer, and the positioning system for wafer and grating. To identify the limits of this approach, each component is first analyzed and optimized separately, and the relations between components are identified. The EUV source has been optimized to achieve the best values for spatial and temporal coherence. Phase-shifting and amplitude transmission gratings have been fabricated and exposed. Several commercially available electron-beam resists and one EUV resist have been characterized by open-frame exposures to determine their contrast under EUV radiation. A cold-development procedure was performed to further increase the resist contrast. Analysis of the exposure results demonstrates that with amplitude masks only a 1:1 copy of the mask structure can be fully resolved, whereas the phase-shift masks offer higher first-order diffraction efficiency and allow a demagnification of the mask structure in the achromatic Talbot plane.
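The Talbot scheme mentioned above is governed by two characteristic distances: the classical Talbot self-imaging length, z_T = 2p²/λ for grating period p, and, for a broadband source, the achromatic Talbot distance Z_A ≈ 2p²/Δλ beyond which the pattern becomes stationary. A minimal sketch with illustrative grating and source values (not the paper's parameters):

```python
def talbot_length(period_m, wavelength_m):
    # Classical (integer) Talbot self-imaging distance: z_T = 2 p^2 / lambda
    return 2.0 * period_m ** 2 / wavelength_m

def achromatic_talbot_distance(period_m, bandwidth_m):
    # Beyond roughly Z_A = 2 p^2 / d_lambda the broadband interference
    # pattern becomes stationary (achromatic Talbot lithography regime).
    return 2.0 * period_m ** 2 / bandwidth_m

p = 200e-9      # grating period: 200 nm (illustrative)
lam = 13.5e-9   # EUV wavelength: 13.5 nm
dlam = 0.5e-9   # source bandwidth: 0.5 nm (illustrative)

z_t = talbot_length(p, lam)              # ~5.9 micrometers
z_a = achromatic_talbot_distance(p, dlam)  # ~160 micrometers
```

Because Z_A greatly exceeds z_T for a narrow bandwidth, the wafer can sit at a comparatively relaxed working distance in the achromatic regime.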

  7. Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.

    PubMed

    Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E

    2007-01-01

    This paper describes a set of tools for performing measurements of objects in a virtual reality based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.

  8. A Reexamination of the Psychometric Properties of the "School-Wide Evaluation Tool" (SET)

    ERIC Educational Resources Information Center

    Vincent, Claudia; Spaulding, Scott; Tobin, Tary Jeanne

    2010-01-01

    As a follow-up to Horner et al., this study focuses on the internal consistency and validity of the School-wide Evaluation Tool (SET) at all school levels. Analyzing SET data from 833 elementary, 264 middle, and 93 high schools, the authors focused on (a) describing commonalities and differences in SET data across the school levels, (b) assessing…

  9. Genome-Wide Association of the Laboratory-Based Nicotine Metabolite Ratio in Three Ancestries

    PubMed Central

    Baurley, James W.; Edlund, Christopher K.; Pardamean, Carissa I.; Conti, David V.; Krasnow, Ruth; Javitz, Harold S.; Hops, Hyman; Swan, Gary E.; Benowitz, Neal L.

    2016-01-01

    Introduction: Metabolic enzyme variation and other patient and environmental characteristics influence smoking behaviors, treatment success, and risk of related disease. Population-specific variation in metabolic genes contributes to challenges in developing and optimizing pharmacogenetic interventions. We applied a custom genome-wide genotyping array for addiction research (Smokescreen), to three laboratory-based studies of nicotine metabolism with oral or venous administration of labeled nicotine and cotinine, to model nicotine metabolism in multiple populations. The trans-3′-hydroxycotinine/cotinine ratio, the nicotine metabolite ratio (NMR), was the nicotine metabolism measure analyzed. Methods: Three hundred twelve individuals of self-identified European, African, and Asian American ancestry were genotyped and included in ancestry-specific genome-wide association scans (GWAS) and a meta-GWAS analysis of the NMR. We modeled natural-log transformed NMR with covariates: principal components of genetic ancestry, age, sex, body mass index, and smoking status. Results: African and Asian American NMRs were statistically significantly (P values ≤ 5E-5) lower than European American NMRs. Meta-GWAS analysis identified 36 genome-wide significant variants over a 43 kilobase pair region at CYP2A6 with minimum P = 2.46E-18 at rs12459249, proximal to CYP2A6. Additional minima were located in intron 4 (rs56113850, P = 6.61E-18) and in the CYP2A6-CYP2A7 intergenic region (rs34226463, P = 1.45E-12). Most (34/36) genome-wide significant variants suggested reduced CYP2A6 activity; functional mechanisms were identified and tested in knowledge-bases. Conditional analysis resulted in intergenic variants of possible interest (P values < 5E-5). Conclusions: This meta-GWAS of the NMR identifies CYP2A6 variants, replicates the top-ranked single nucleotide polymorphism from a recent Finnish meta-GWAS of the NMR, identifies functional mechanisms, and provides pan-continental population biomarkers for nicotine metabolism.

  10. Adding value to laboratory medicine: a professional responsibility.

    PubMed

    Beastall, Graham H

    2013-01-01

    Laboratory medicine is a medical specialty at the centre of healthcare. When used optimally laboratory medicine generates knowledge that can facilitate patient safety, improve patient outcomes, shorten patient journeys and lead to more cost-effective healthcare. Optimal use of laboratory medicine relies on dynamic and authoritative leadership outside as well as inside the laboratory. The first responsibility of the head of a clinical laboratory is to ensure the provision of a high quality service across a wide range of parameters culminating in laboratory accreditation against an international standard, such as ISO 15189. From that essential baseline the leadership of laboratory medicine at local, national and international level needs to 'add value' to ensure the optimal delivery, use, development and evaluation of the services provided for individuals and for groups of patients. A convenient tool to illustrate added value is use of the mnemonic 'SCIENCE'. This tool allows added value to be considered in seven domains: standardisation and harmonisation; clinical effectiveness; innovation; evidence-based practice; novel applications; cost-effectiveness; and education of others. The assessment of added value in laboratory medicine may be considered against a framework that comprises three dimensions: operational efficiency; patient management; and patient behaviours. The profession and the patient will benefit from sharing examples of adding value to laboratory medicine.

  11. Tool & Die and EDM Series. Educational Resources for the Machine Tool Industry. Course Syllabi, Instructor's Handbook, [and] Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 2-year vocational training program to prepare students for entry-level employment as tool and die makers. The program was developed through a modification of the DACUM (Developing a Curriculum) technique. The course syllabi volume begins with…

  12. Cardiac catheterization laboratory inpatient forecast tool: a prospective evaluation

    PubMed Central

    Flanagan, Eleni; Siddiqui, Sauleh; Appelbaum, Jeff; Kasper, Edward K; Levin, Scott

    2016-01-01

    Objective To develop and prospectively evaluate a web-based tool that forecasts the daily bed need for admissions from the cardiac catheterization laboratory using routinely available clinical data within electronic medical records (EMRs). Methods The forecast model was derived using a 13-month retrospective cohort of 6384 catheterization patients. Predictor variables such as demographics, scheduled procedures, and clinical indicators mined from free-text notes were input to a multivariable logistic regression model that predicted the probability of inpatient admission. The model was embedded into a web-based application connected to the local EMR system and used to support bed management decisions. After implementation, the tool was prospectively evaluated for accuracy on a 13-month test cohort of 7029 catheterization patients. Results The forecast model predicted admission with an area under the receiver operating characteristic curve of 0.722. Daily aggregate forecasts were accurate to within one bed for 70.3% of days and within three beds for 97.5% of days during the prospective evaluation period. The web-based application housing the forecast model was used by cardiology providers in practice to estimate daily admissions from the catheterization laboratory. Discussion The forecast model identified older age, male gender, invasive procedures, coronary artery bypass grafts, and a history of congestive heart failure as qualities indicating a patient was at increased risk for admission. Diagnostic procedures and less acute clinical indicators decreased patients’ risk of admission. Despite the site-specific limitations of the model, these findings were supported by the literature. Conclusion Data-driven predictive analytics may be used to accurately forecast daily demand for inpatient beds for cardiac catheterization patients. Connecting these analytics to EMR data sources has the potential to provide advanced operational decision support. PMID:26342217
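The forecast described above feeds per-patient logistic regression probabilities into a daily aggregate bed estimate. A minimal sketch of that idea, with hypothetical coefficients that merely echo the reported risk directions (invasive procedures and CHF history raising risk, diagnostic-only procedures lowering it), not the published model:

```python
import math

def predict_admission_prob(features, weights, intercept):
    # Logistic model: P(admit) = 1 / (1 + exp(-(b0 + w . x)))
    z = intercept + sum(weights[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients, for illustration only
weights = {"age_over_65": 0.8, "male": 0.3, "invasive_proc": 1.2,
           "chf_history": 0.9, "diagnostic_only": -1.1}
intercept = -1.5

patients = [
    {"age_over_65": 1, "male": 1, "invasive_proc": 1,
     "chf_history": 0, "diagnostic_only": 0},
    {"age_over_65": 0, "male": 0, "invasive_proc": 0,
     "chf_history": 0, "diagnostic_only": 1},
]

probs = [predict_admission_prob(p, weights, intercept) for p in patients]
# Daily bed forecast: expected number of admissions is the sum of probabilities
expected_beds = sum(probs)
```

Summing probabilities (rather than thresholding each patient) gives the expected bed count directly, which is why aggregate accuracy can be good even when individual predictions are uncertain.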

  13. A Model for Program-Wide Assessment of the Effectiveness of Writing Instruction in Science Laboratory Courses

    ERIC Educational Resources Information Center

    Saitta, Erin K.; Zemliansky, Pavel; Turner, Anna

    2015-01-01

    The authors present a model for program-wide assessment of the effectiveness of writing instruction in a chemistry laboratory course. This model, which involves collaboration between faculty from chemistry, the Writing Across the Curriculum (WAC) program, and the Faculty Center for Teaching and Learning, is based on several theories and…

  14. Experimental Stage Separation Tool Development in NASA Langley's Aerothermodynamics Laboratory

    NASA Technical Reports Server (NTRS)

    Murphy, Kelly J.; Scallion, William I.

    2005-01-01

    As part of the research effort at NASA in support of the stage separation and ascent aerothermodynamics research program, proximity testing of a generic bimese wing-body configuration was conducted in NASA Langley's Aerothermodynamics Laboratory in the 20-Inch Mach 6 Air Tunnel. The objective of this work is the development of experimental tools and testing methodologies to apply to hypersonic stage separation problems for future multi-stage launch vehicle systems. Aerodynamic force and moment proximity data were generated at a nominal Mach number of 6 over a small range of angles of attack. The generic bimese configuration was tested in a belly-to-belly and back-to-belly orientation at 86 relative proximity locations. Over 800 aerodynamic proximity data points were taken to serve as a database for code validation. Longitudinal aerodynamic data generated in this test program show very good agreement with viscous computational predictions. Thus a framework has been established to study separation problems in the hypersonic regime using coordinated experimental and computational tools.

  15. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    PubMed

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.

  16. The Virtual Robotics Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kress, R.L.; Love, L.J.

    The growth of the Internet has provided a unique opportunity to expand research collaborations between industry, universities, and the national laboratories. The Virtual Robotics Laboratory (VRL) is an innovative program at Oak Ridge National Laboratory (ORNL) that is focusing on the issues related to collaborative research through controlled access of laboratory equipment using the World Wide Web. The VRL will provide different levels of access to selected ORNL laboratory secondary education programs. In the past, the ORNL Robotics and Process Systems Division has developed state-of-the-art robotic systems for the Army, NASA, the Department of Energy, and the Department of Defense, as well as many other clients. After proof of concept, many of these systems sit dormant in the laboratories. This is not from exhaustion of all possible research topics, but from completion of contracts and generation of new programs. In the past, a number of visiting professors have used this equipment for their own research. However, this requires that the professor, and possibly his/her students, spend extended periods at the laboratory facility. In addition, only a very exclusive group of faculty can gain access to the laboratory and hardware. The VRL is a tool that enables extended collaborative efforts without regard to geographic limitations.

  17. World-Wide Web Tools for Locating Planetary Images

    NASA Technical Reports Server (NTRS)

    Kanefsky, Bob; Deiss, Ron (Technical Monitor)

    1995-01-01

    The explosive growth of the World-Wide Web (WWW) in the past year has made it feasible to provide interactive graphical tools to assist scientists in locating planetary images. The highest available resolution images of any site of interest can be quickly found on a map or plot, and, if online, displayed immediately on nearly any computer equipped with a color screen, an Internet connection, and any of the free WWW browsers. The same tools may also be of interest to educators, students, and the general public. Image finding tools have been implemented covering most of the solar system: Earth, Mars, and the moons and planets imaged by Voyager. The Mars image-finder, which plots the footprints of all the high-resolution Viking Orbiter images and can be used to display any that are available online, also contains a complete scrollable atlas and hypertext gazetteer to help locating areas. The Earth image-finder is linked to thousands of Shuttle images stored at NASA/JSC, and displays them as red dots on a globe. The Voyager image-finder plots images as dots, by longitude and apparent target size, linked to online images. The locator (URL) for the top-level page is http: //ic-www.arc.nasa.gov/ic/projects/bayes-group/Atlas/. Through the efforts of the Planetary Data System and other organizations, hundreds of thousands of planetary images are now available on CD-ROM, and many of these have been made available on the WWW. However, locating images of a desired site is still problematic, in practice. For example, many scientists studying Mars use digital image maps, which are one third the resolution of Viking Orbiter survey images. When they do use Viking Orbiter images, they often work with photographically printed hardcopies, which lack the flexibility of digital images: magnification, contrast stretching, and other basic image-processing techniques offered by off-the-shelf software.
From the perspective of someone working on an experimental image processing technique for

  18. Genome-Wide Association of the Laboratory-Based Nicotine Metabolite Ratio in Three Ancestries.

    PubMed

    Baurley, James W; Edlund, Christopher K; Pardamean, Carissa I; Conti, David V; Krasnow, Ruth; Javitz, Harold S; Hops, Hyman; Swan, Gary E; Benowitz, Neal L; Bergen, Andrew W

    2016-09-01

    Metabolic enzyme variation and other patient and environmental characteristics influence smoking behaviors, treatment success, and risk of related disease. Population-specific variation in metabolic genes contributes to challenges in developing and optimizing pharmacogenetic interventions. We applied a custom genome-wide genotyping array for addiction research (Smokescreen), to three laboratory-based studies of nicotine metabolism with oral or venous administration of labeled nicotine and cotinine, to model nicotine metabolism in multiple populations. The trans-3'-hydroxycotinine/cotinine ratio, the nicotine metabolite ratio (NMR), was the nicotine metabolism measure analyzed. Three hundred twelve individuals of self-identified European, African, and Asian American ancestry were genotyped and included in ancestry-specific genome-wide association scans (GWAS) and a meta-GWAS analysis of the NMR. We modeled natural-log transformed NMR with covariates: principal components of genetic ancestry, age, sex, body mass index, and smoking status. African and Asian American NMRs were statistically significantly (P values ≤ 5E-5) lower than European American NMRs. Meta-GWAS analysis identified 36 genome-wide significant variants over a 43 kilobase pair region at CYP2A6 with minimum P = 2.46E-18 at rs12459249, proximal to CYP2A6. Additional minima were located in intron 4 (rs56113850, P = 6.61E-18) and in the CYP2A6-CYP2A7 intergenic region (rs34226463, P = 1.45E-12). Most (34/36) genome-wide significant variants suggested reduced CYP2A6 activity; functional mechanisms were identified and tested in knowledge-bases. Conditional analysis resulted in intergenic variants of possible interest (P values < 5E-5). This meta-GWAS of the NMR identifies CYP2A6 variants, replicates the top-ranked single nucleotide polymorphism from a recent Finnish meta-GWAS of the NMR, identifies functional mechanisms, and provides pan-continental population biomarkers for nicotine metabolism.
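The analysis above models natural-log transformed NMR against covariates. As a toy illustration of that transform-and-regress step (a single made-up covariate and invented data; the study's actual model also includes genetic principal components, sex, BMI, and smoking status):

```python
import math

def ols_fit(x, y):
    """Ordinary least squares for y = a + b*x (single-covariate sketch)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Illustrative NMR values and ages (invented, not from the study)
nmr = [0.2, 0.35, 0.5, 0.41, 0.28, 0.6]
age = [25, 40, 55, 45, 30, 60]

# Natural-log transform of the outcome, as in the paper's model
log_nmr = [math.log(v) for v in nmr]
a, b = ols_fit(age, log_nmr)
```

The log transform makes the skewed ratio outcome closer to normal, so the additive covariate model (and GWAS test statistics built on it) behaves better.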

  19. 78 FR 63999 - Notice of Vitamin D Standardization Program (VDSP) Symposium: Tools To Improve Laboratory...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-25

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Notice of Vitamin D Standardization Program (VDSP) Symposium: Tools To Improve Laboratory Measurement SUMMARY: The National Institutes of Health, Office of Dietary Supplements (ODS), and the National Institute of Standards and...

  20. Virtual Laboratory "vs." Traditional Laboratory: Which Is More Effective for Teaching Electrochemistry?

    ERIC Educational Resources Information Center

    Hawkins, Ian; Phelps, Amy J.

    2013-01-01

    The use of virtual laboratories has become an increasing issue regarding science laboratories due to the increasing cost of hands-on laboratories, and the increase in distance education. Recent studies have looked at the use of virtual tools for laboratory to be used as supplements to the regular hands-on laboratories but many virtual tools have…

  1. Web-Based Virtual Laboratory for Food Analysis Course

    NASA Astrophysics Data System (ADS)

    Handayani, M. N.; Khoerunnisa, I.; Sugiarti, Y.

    2018-02-01

    Implementation of the food analysis course in the Agro-industrial Technology Education study program faced problems, including the availability of space and tools in the laboratory, which is not commensurate with the number of students, and a lack of interactive learning tools. On the other hand, the information technology literacy of students is quite high, and the internet network is easily accessible on campus. This is a challenge as well as an opportunity for the development of learning media that can help optimize learning in the laboratory. This study aims to develop a web-based virtual laboratory as an alternative learning medium for the food analysis course. This research is R & D (research and development) following the Borg & Gall model. The results showed that, by expert assessment, the web-based virtual laboratory developed is feasible as a learning medium in terms of software engineering; visual communication; relevance of material; usefulness; and the language used. The results of the small-scale and wide-scale tests show that students strongly agree with the development of the web-based virtual laboratory, and student response to it was positive. Suggestions from students provide further opportunities for improvement of the web-based virtual laboratory and should be considered in further research.

  2. Laboratory preparation questionnaires as a tool for the implementation of the Just in Time Teaching in the Physics I laboratories: Research training

    NASA Astrophysics Data System (ADS)

    Miranda, David A.; Sanchez, Melba J.; Forero, Oscar M.

    2017-06-01

    The implementation of the JiTT (Just-in-Time Teaching) strategy is presented as a way to increase the prior preparation of students enrolled in the Physics Laboratory I course offered at the Industrial University of Santander (UIS), Colombia. In this study, a laboratory preparation questionnaire (CPL) was applied as a tool for the implementation of JiTT, combined with elements of mediated learning. It was found that the CPL improves the students' experience regarding preparation for the laboratory and the development of the experimental session. The questionnaires were implemented in a learning management system (Moodle), and a web application (lab.ciencias.uis.edu.co) was used to publish the contents essential for student preparation before each practical session. The most significant result was that the students entered the experimental session with the basic knowledge needed to improve their learning experience.

  3. Student laboratory presentations as a learning tool in anatomy education.

    PubMed

    Chollet, Madeleine B; Teaford, Mark F; Garofalo, Evan M; DeLeon, Valerie B

    2009-01-01

    Previous studies have shown that anatomy students who complete oral laboratory presentations believe they understand the material better and retain it longer than they otherwise would if they only took examinations on the material; however, we have found no studies that empirically test such outcomes. The purpose of this study was to assess the effectiveness of oral presentations through comparisons with other methods of assessment, most notably, examination performance. Specifically, we tested whether students (n = 256) performed better on examination questions on topics covered by their oral presentations than on other topics. Each student completed two graded, 12-minute laboratory presentations on two different assigned topics during the course and took three examinations, each of which covered a third of the course material. Examination questions were characterized by type (memorization, pathway, analytical, spatial). A two-way repeated measures analysis of variance revealed that students performed better on topics covered by their presentations than on topics not covered by their presentations (P < 0.005), regardless of presentation grade (P > 0.05) and question type (P > 0.05). These results demonstrate empirically that oral presentations are an effective learning tool.

  4. Reduced Clostridium difficile Tests and Laboratory-Identified Events With a Computerized Clinical Decision Support Tool and Financial Incentive.

    PubMed

    Madden, Gregory R; German Mesner, Ian; Cox, Heather L; Mathers, Amy J; Lyman, Jason A; Sifri, Costi D; Enfield, Kyle B

    2018-06-01

    We hypothesized that a computerized clinical decision support tool for Clostridium difficile testing would reduce unnecessary inpatient tests, resulting in fewer laboratory-identified events. Census-adjusted interrupted time-series analyses demonstrated significant reductions of 41% fewer tests and 31% fewer hospital-onset C. difficile infection laboratory-identified events following this intervention. Infect Control Hosp Epidemiol 2018;39:737-740.
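The reported reductions come from census-adjusted interrupted time-series models; a far simpler census-adjusted before/after comparison conveys the arithmetic behind a relative reduction. The monthly counts below are invented for illustration, not the published data:

```python
def rate_per_1000(events, patient_days):
    # Census adjustment: express counts as a rate per 1000 patient-days
    return 1000.0 * events / patient_days

# Illustrative monthly test counts and patient-day census (invented)
pre_tests, pre_days = [120, 130, 125], [9000, 9100, 8900]
post_tests, post_days = [75, 70, 80], [9050, 9000, 9100]

pre_rate = rate_per_1000(sum(pre_tests), sum(pre_days))
post_rate = rate_per_1000(sum(post_tests), sum(post_days))

# Relative reduction in the census-adjusted testing rate
relative_reduction = 1.0 - post_rate / pre_rate
```

A full interrupted time-series analysis would additionally model the pre-intervention trend and the post-intervention slope change, so that secular drift is not misattributed to the intervention.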

  5. DECIDE: a Decision Support Tool to Facilitate Parents' Choices Regarding Genome-Wide Sequencing.

    PubMed

    Birch, Patricia; Adam, S; Bansback, N; Coe, R R; Hicklin, J; Lehman, A; Li, K C; Friedman, J M

    2016-12-01

    We describe the rationale, development, and usability testing for an integrated e-learning tool and decision aid for parents facing decisions about genome-wide sequencing (GWS) for their children with a suspected genetic condition. The online tool, DECIDE, is designed to provide decision-support and to promote high quality decisions about undergoing GWS with or without return of optional incidental finding results. DECIDE works by integrating educational material with decision aids. Users may tailor their learning by controlling both the amount of information and its format - text and diagrams and/or short videos. The decision aid guides users to weigh the importance of various relevant factors in their own lives and circumstances. After considering the pros and cons of GWS and return of incidental findings, DECIDE summarizes the user's responses and apparent preferred choices. In a usability study of 16 parents who had already chosen GWS after conventional genetic counselling, all participants found DECIDE to be helpful. Many would have been satisfied to use it alone to guide their GWS decisions, but most would prefer to have the option of consulting a health care professional as well to aid their decision. Further testing is necessary to establish the effectiveness of using DECIDE as an adjunct to or instead of conventional pre-test genetic counselling for clinical genome-wide sequencing.

  6. Sandia National Laboratories analysis code data base

    NASA Astrophysics Data System (ADS)

    Peterson, C. W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.

  7. The Wide-Field Imaging Interferometry Testbed (WIIT): Recent Progress and Results

    NASA Technical Reports Server (NTRS)

    Rinehart, Stephen A.; Frey, Bradley J.; Leisawitz, David T.; Lyon, Richard G.; Maher, Stephen F.; Martino, Anthony J.

    2008-01-01

    Continued research with the Wide-Field Imaging Interferometry Testbed (WIIT) has achieved several important milestones. We have moved WIIT into the Advanced Interferometry and Metrology (AIM) Laboratory at Goddard, and have characterized the testbed in this well-controlled environment. The system is now completely automated and we are in the process of acquiring large data sets for analysis. In this paper, we discuss these new developments and outline our future research directions. The WIIT testbed, combined with new data analysis techniques and algorithms, provides a demonstration of the technique of wide-field interferometric imaging, a powerful tool for future space-borne interferometers.

  8. Tools to manage the enterprise-wide picture archiving and communications system environment.

    PubMed

    Lannum, L M; Gumpf, S; Piraino, D

    2001-06-01

    The presentation will focus on the implementation and utilization of a central picture archiving and communications system (PACS) network-monitoring tool that allows for enterprise-wide operations management and support of the image distribution network. The MagicWatch (Siemens, Iselin, NJ) PACS/radiology information system (RIS) monitoring station from Siemens has allowed our organization to create a service support structure that has given us proactive control of our environment and has allowed us to meet the service level performance expectations of the users. The Radiology Help Desk has used the MagicWatch PACS monitoring station as an applications support tool that has allowed the group to monitor network activity and individual systems performance at each node. Fast and timely recognition of the effects of single events within the PACS/RIS environment has allowed the group to proactively recognize possible performance issues and resolve problems. The PACS/operations group performs network management control, image storage management, and software distribution management from a single, central point in the enterprise. The MagicWatch station allows for the complete automation of software distribution, installation, and configuration process across all the nodes in the system. The tool has allowed for the standardization of the workstations and provides a central configuration control for the establishment and maintenance of the system standards. This report will describe the PACS management and operation prior to the implementation of the MagicWatch PACS monitoring station and will highlight the operational benefits of a centralized network and system-monitoring tool.

  9. Evolution of a residue laboratory network and the management tools for monitoring its performance.

    PubMed

    Lins, E S; Conceição, E S; Mauricio, A De Q

    2012-01-01

    Since 2005 the National Residue & Contaminants Control Plan (NRCCP) in Brazil has been considerably enhanced, increasing the number of samples, substances and species monitored, and also the analytical detection capability. The Brazilian laboratory network was forced to improve its quality standards in order to comply with the NRCCP's own evolution. Many aspects such as the limits of quantification (LOQs), the quality management systems within the laboratories and appropriate method validation are in continuous improvement, generating new scenarios and demands. Thus, efficient management mechanisms for monitoring network performance and its adherence to the established goals and guidelines are required. Performance indicators associated with computerised information systems arise as a powerful tool to monitor the laboratories' activity, making use of different parameters to describe this activity on a day-to-day basis. One of these parameters is related to turnaround times, and this factor is highly affected by the way each laboratory organises its management system, as well as the regulatory requirements. In this paper a global view is presented of the turnaround times related to the type of analysis, laboratory, number of samples per year, type of matrix, country region and period of the year, all these data being collected from a computerised system called SISRES. This information gives a solid background to management measures aiming at the improvement of the service offered by the laboratory network.
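Turnaround time is cited above as a key performance indicator extracted from the SISRES system. A minimal sketch of computing a median turnaround-time indicator per laboratory from received/reported dates (the records and date format are invented; the real system tracks many more dimensions, such as matrix and analysis type):

```python
from collections import defaultdict
from datetime import datetime
from statistics import median

def turnaround_days(received, reported, fmt="%Y-%m-%d"):
    # Days elapsed between sample receipt and report issue
    return (datetime.strptime(reported, fmt)
            - datetime.strptime(received, fmt)).days

# Illustrative records: (laboratory, received date, reported date)
records = [
    ("LAB-A", "2012-03-01", "2012-03-08"),
    ("LAB-A", "2012-03-02", "2012-03-12"),
    ("LAB-B", "2012-03-01", "2012-03-04"),
]

by_lab = defaultdict(list)
for lab, rec, rep in records:
    by_lab[lab].append(turnaround_days(rec, rep))

# Median is preferred over mean here because turnaround
# distributions are typically right-skewed by a few slow samples
median_tat = {lab: median(days) for lab, days in by_lab.items()}
```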

  10. Laboratory diagnostics of malaria

    NASA Astrophysics Data System (ADS)

    Siahaan, L.

    2018-03-01

    Even now, malaria treatment should only be administered after laboratory confirmation. There are several principal methods for diagnosing malaria, each with its own disadvantages. Presumptive treatment of malaria is widely practiced where laboratory tests are not readily available. Microscopy of Giemsa-stained thick and thin blood films remains the gold standard for the diagnosis of malaria infection. The techniques of slide preparation, staining and reading are well known and standardized, as is the estimation of parasite density and parasite stages. Microscopy is not always available or feasible at primary health services in limited-resource settings because of cost and the lack of skilled manpower, accessories and reagents. Rapid diagnostic tests (RDTs) are potential tools for parasite-based diagnosis since they are accurate in detecting malaria infections and easy to use. The test is based on the capture of parasite antigen released from parasitized red blood cells, using monoclonal antibodies prepared against a malaria antigen target. Polymerase chain reaction (PCR) depends on DNA amplification and has higher sensitivity than microscopy, but it is not widely used due to the lack of a standardized methodology, high costs, and the need for highly trained staff.
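    The parasite density estimate mentioned above is conventionally obtained by counting parasites against white blood cells on the thick film and assuming an average WBC count when the patient's actual count is unknown. A minimal sketch of that arithmetic, with the commonly assumed 8000 WBC per microliter:

    ```python
    def parasites_per_ul(parasites_counted, wbc_counted, assumed_wbc_per_ul=8000):
        """Estimate parasite density from a thick blood film.

        Parasites are counted against white blood cells; an average count
        of 8000 WBC per microliter is assumed here when the patient's
        actual count is unavailable (the standard convention)."""
        return parasites_counted * assumed_wbc_per_ul / wbc_counted

    # e.g. 120 parasites counted against 200 WBC
    print(parasites_per_ul(120, 200))  # 4800.0 parasites per microliter
    ```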

  11. Application of modern radiative transfer tools to model laboratory quartz emissivity

    NASA Astrophysics Data System (ADS)

    Pitman, Karly M.; Wolff, Michael J.; Clayton, Geoffrey C.

    2005-08-01

    Planetary remote sensing of regolith surfaces requires use of theoretical models for interpretation of constituent grain physical properties. In this work, we review and critically evaluate past efforts to strengthen numerical radiative transfer (RT) models with comparison to a trusted set of nadir incidence laboratory quartz emissivity spectra. By first establishing a baseline statistical metric to rate successful model-laboratory emissivity spectral fits, we assess the efficacy of hybrid computational solutions (Mie theory + numerically exact RT algorithm) to calculate theoretical emissivity values for micron-sized α-quartz particles in the thermal infrared (2000-200 cm-1) wave number range. We show that Mie theory, a widely used but poor approximation to irregular grain shape, fails to produce the single scattering albedo and asymmetry parameter needed to arrive at the desired laboratory emissivity values. Through simple numerical experiments, we show that corrections to single scattering albedo and asymmetry parameter values generated via Mie theory become more necessary with increasing grain size. We directly compare the performance of diffraction subtraction and static structure factor corrections to the single scattering albedo, asymmetry parameter, and emissivity for dense packing of grains. Through these sensitivity studies, we provide evidence that, assuming RT methods work well given sufficiently well-quantified inputs, assumptions about the scatterer itself constitute the most crucial aspect of modeling emissivity values.

  12. Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases

    PubMed Central

    Amos, Christopher I.; Bafna, Vineet; Hauser, Elizabeth R.; Hernandez, Ryan D.; Li, Chun; Liberles, David A.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Papanicolaou, George J.; Peng, Bo; Ritchie, Marylyn D.; Rosenfeld, Gabriel; Witte, John S.

    2014-01-01

    Genetic simulation programs are used to model data under specified assumptions to facilitate the understanding and study of complex genetic systems. Standardized data sets generated using genetic simulation are essential for the development and application of novel analytical tools in genetic epidemiology studies. With continuing advances in high-throughput genomic technologies and generation and analysis of larger, more complex data sets, there is a need for updating current approaches in genetic simulation modeling. To provide a forum to address current and emerging challenges in this area, the National Cancer Institute (NCI) sponsored a workshop, entitled “Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases” at the National Institutes of Health (NIH) in Bethesda, Maryland on March 11-12, 2014. The goals of the workshop were to: (i) identify opportunities, challenges and resource needs for the development and application of genetic simulation models; (ii) improve the integration of tools for modeling and analysis of simulated data; and (iii) foster collaborations to facilitate development and applications of genetic simulation. During the course of the meeting the group identified challenges and opportunities for the science of simulation, software and methods development, and collaboration. This paper summarizes key discussions at the meeting, and highlights important challenges and opportunities to advance the field of genetic simulation. PMID:25371374

  13. A Framework for Laboratory Pre-Work Based on the Concepts, Tools and Techniques Questioning Method

    ERIC Educational Resources Information Center

    Huntula, J.; Sharma, M. D.; Johnston, I.; Chitaree, R.

    2011-01-01

    Learning in the laboratory is different from learning in other contexts because students have to engage with various aspects of the practice of science. They have to use many skills and knowledge in parallel--not only to understand the concepts of physics but also to use the tools and analyse the data. The question arises, how to best guide…

  14. Incorporation of Gas Chromatography-Mass Spectrometry into the Undergraduate Organic Chemistry Laboratory Curriculum

    ERIC Educational Resources Information Center

    Giarikos, Dimitrios G.; Patel, Sagir; Lister, Andrew; Razeghifard, Reza

    2013-01-01

    Gas chromatography-mass spectrometry (GC-MS) is a powerful analytical tool for detection, identification, and quantification of many volatile organic compounds. However, many colleges and universities have not fully incorporated this technique into undergraduate teaching laboratories despite its wide application and ease of use in organic…

  15. Remote sensing education and Internet/World Wide Web technology

    USGS Publications Warehouse

    Griffith, J.A.; Egbert, S.L.

    2001-01-01

    Remote sensing education is increasingly in demand across academic and professional disciplines. Meanwhile, Internet technology and the World Wide Web (WWW) are being more frequently employed as teaching tools in remote sensing and other disciplines. The current wealth of information on the Internet and World Wide Web must be distilled, nonetheless, to be useful in remote sensing education. An extensive literature base is developing on the WWW as a tool in education and in teaching remote sensing. This literature reveals benefits and limitations of the WWW, and can guide its implementation. Among the most beneficial aspects of the Web are increased access to remote sensing expertise regardless of geographic location, increased access to current material, and access to extensive archives of satellite imagery and aerial photography. As with other teaching innovations, using the WWW/Internet may well mean more work, not less, for teachers, at least at the stage of early adoption. Also, information posted on Web sites is not always accurate. Development stages of this technology range from on-line posting of syllabi and lecture notes to on-line laboratory exercises and animated landscape flyovers and on-line image processing. The advantages of WWW/Internet technology may likely outweigh the costs of implementing it as a teaching tool.

  16. Fine pattern replication on 10 x 10-mm exposure area using ETS-1 laboratory tool in HIT

    NASA Astrophysics Data System (ADS)

    Hamamoto, K.; Watanabe, Takeo; Hada, Hideo; Komano, Hiroshi; Kishimura, Shinji; Okazaki, Shinji; Kinoshita, Hiroo

    2002-07-01

    Using the ETS-1 laboratory tool at the Himeji Institute of Technology (HIT), static exposure through a Cr mask replicated, within a 10 mm by 2 mm exposure area, line-and-space patterns 60 nm wide, isolated line patterns 40 nm wide, and hole patterns 150 nm wide. Synchronous scanning of the mask and wafer with the EUVL laboratory tool, whose reduction optical system consists of three aspherical mirrors, at the NewSUBARU facility succeeded in forming 60 nm line-and-space patterns over a 10 mm by 10 mm exposure region. A comparison of exposure characteristics for positive-tone KrF and EB resists showed that the KrF chemically amplified resist has better characteristics than the EB chemically amplified resist.

  17. Statistical Analysis Tools for Learning in Engineering Laboratories.

    ERIC Educational Resources Information Center

    Maher, Carolyn A.

    1990-01-01

    Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…

  18. "Caenorhabditis Elegans" as an Undergraduate Educational Tool for Teaching RNAi

    ERIC Educational Resources Information Center

    Andersen, Janet; Krichevsky, Alexander; Leheste, Joerg R.; Moloney, Daniel J.

    2008-01-01

    Discovery of RNA-mediated interference (RNAi) is widely recognized as one of the most significant molecular biology breakthroughs in the past 10 years. There is a need for science educators to develop teaching tools and laboratory activities that demonstrate the power of this new technology and help students to better understand the RNAi process.…

  19. A Useful Laboratory Tool

    ERIC Educational Resources Information Center

    Johnson, Samuel A.; Tutt, Tye

    2008-01-01

    Recently, a high school Science Club generated a large number of questions involving temperature. Therefore, they decided to construct a thermal gradient apparatus in order to conduct a wide range of experiments beyond the standard "cookbook" labs. They felt that this apparatus could be especially useful in future ninth-grade biology classes, in…

  20. Laboratory Astrophysics: Enabling Scientific Discovery and Understanding

    NASA Technical Reports Server (NTRS)

    Kirby, K.

    2006-01-01

    NASA's Science Strategic Roadmap for Universe Exploration lays out a series of science objectives on a grand scale and discusses the various missions, over a wide range of wavelengths, which will enable discovery. Astronomical spectroscopy is arguably the most powerful tool we have for exploring the Universe. Experimental and theoretical studies in Laboratory Astrophysics convert "hard-won data into scientific understanding". However, the development of instruments with increasingly high spectroscopic resolution demands atomic and molecular data of unprecedented accuracy and completeness. How to meet these needs, in a time of severe budgetary constraints, poses a significant challenge to NASA, the astronomical observers and model-builders, and the laboratory astrophysics community. I will discuss these issues, together with some recent examples of productive astronomy/lab-astro collaborations.

  1. A NASA-wide approach toward cost-effective, high-quality software through reuse

    NASA Technical Reports Server (NTRS)

    Scheper, Charlotte O. (Editor); Smith, Kathryn A. (Editor)

    1993-01-01

    NASA Langley Research Center sponsored the second Workshop on NASA Research in Software Reuse on May 5-6, 1992 at the Research Triangle Park, North Carolina. The workshop was hosted by the Research Triangle Institute. Participants came from the three NASA centers, four NASA contractor companies, two research institutes and the Air Force's Rome Laboratory. The purpose of the workshop was to exchange information on software reuse tool development, particularly with respect to tool needs, requirements, and effectiveness. The participants presented the software reuse activities and tools being developed and used by their individual centers and programs. These programs address a wide range of reuse issues. The group also developed a mission and goals for software reuse within NASA. This publication summarizes the presentations and the issues discussed during the workshop.

  2. Sandia National Laboratories site-wide hydrogeologic characterization project calendar year 1992 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowson, D.; Gibson, J.D.; Haase, C.S.

    1993-10-01

    The Sandia National Laboratories, New Mexico (SNL/NM) Site-Wide Hydrogeologic Characterization (SWHC) project has been implemented as part of the SNL/NM Environmental Restoration (ER) Program to develop the regional hydrogeologic framework and baseline for the approximately 100 mi of Kirtland Air Force Base (KAFB) and adjacent withdrawn public lands upon which SNL/NM has performed research and development activities. Additionally, the SWHC project will investigate and characterize generic hydrogeologic issues associated with the 172 ER sites owned by SNL/NM across its facilities on KAFB. As called for in the Hazardous and Solid Waste Amendments (HSWA) to the Resource Conservation and Recovery Act (RCRA) Part B permit agreement between the U.S. Environmental Protection Agency (EPA) as the permitter and the U.S. Department of Energy (DOE) and SNL/NM as the permittees, an annual report is to be prepared by the SWHC project team. This document serves two primary purposes: (1) to identify and describe the conceptual framework for the hydrogeologic system underlying SNL/NM and (2) to describe characterization activities undertaken in the preceding year that add to our understanding (reduce our uncertainties) regarding the conceptual and quantitative hydrogeologic framework. This SWHC project annual report focuses primarily on purpose 1, providing a summary description of the current "state of knowledge" of the Sandia National Laboratories/Kirtland Air Force Base (SNL/KAFB) hydrogeologic setting.

  3. Explosively driven two-shockwave tools with application to ejecta formation at the Los Alamos National Laboratory Proton Radiography Facility

    NASA Astrophysics Data System (ADS)

    Buttler, William

    2013-06-01

    We present the development of an explosively driven physics tool to generate two mostly uniaxial shockwaves. The tool is being used to extend single-shockwave ejecta models to a subsequent shockwave event separated by a time interval on the order of a few microseconds. We explore the possibility of varying the amplitude of both the first and second shockwaves, and we apply the tool in experimental geometries on Sn with a surface roughness of Ra = 0.8 μm. We then evaluate the tool further at the Los Alamos National Laboratory Proton Radiography (pRad) Facility in an application to Sn with larger-scale perturbations of wavelength 550 μm and various amplitudes that gave wave-number-amplitude products of η0·2π/λ = {3/4, 1/2, 1/4, 1/8}, where the perturbation amplitude is η0 and the wave number is k = 2π/λ. The pRad data and velocimetry imply it should be possible to develop a second-shock ejecta model based on unstable Richtmyer-Meshkov physics. In collaboration with David Oro, Fesseha Mariam, Alexander Saunders, Malcolm Andrews, Frank Cherne, James Hammerberg, Robert Hixson, Christopher Morris, Russell Olson, Dean Preston, Joseph Stone, Dale Tupa, and Wendy Vogan-McNeil, Los Alamos National Laboratory.
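    The quoted wave-number-amplitude products fix the physical perturbation amplitudes once the 550 μm wavelength is known. A short back-of-envelope restatement of that arithmetic (values follow directly from the abstract's parameters):

    ```python
    import math

    WAVELENGTH_UM = 550.0  # perturbation wavelength quoted in the abstract

    def amplitude_from_product(k_eta0):
        """Recover the perturbation amplitude eta0 (in microns) from the
        wave-number-amplitude product k*eta0 = eta0 * 2*pi / lambda."""
        return k_eta0 * WAVELENGTH_UM / (2 * math.pi)

    for product in (3 / 4, 1 / 2, 1 / 4, 1 / 8):
        eta0 = amplitude_from_product(product)
        print(f"k*eta0 = {product:5.3f}  ->  eta0 = {eta0:6.2f} um")
    ```

    The four products thus correspond to machined amplitudes of roughly 66, 44, 22, and 11 μm.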

  4. Measurements of Size Resolved Organic Particulate Mass Using An On-line Aerosol Mass Spectrometer (ams) Laboratory Validation; Analysis Tool Development; and Interpretation of Field Data

    NASA Astrophysics Data System (ADS)

    Alfarra, M. R.; Coe, H.; Allan, J. D.; Bower, K. N.; Garforth, A. A.; Canagaratna, M.; Worsnop, D.

    The aerosol mass spectrometer (AMS) is a quantitative instrument designed to deliver real-time size resolved chemical composition of the volatile and semi-volatile aerosol fractions. The AMS response to a wide range of organic compounds has been experimentally characterized, and has been shown to compare well with standard libraries of 70 eV electron impact ionization mass spectra. These results will be presented. Due to the scanning nature of the quadrupole mass spectrometer, the AMS provides the averaged composition of an ensemble of particles rather than single-particle composition. However, the mass spectra measured by the AMS are reproducible and similar to those of standard libraries, so analysis tools can be developed on large mass spectral libraries that can provide chemical composition information about the type of organic compounds in the aerosol. One such tool is presented and compared with laboratory measurements of single-species and mixed-component organic particles by the AMS. We will then discuss the applicability of these tools to interpreting field AMS data obtained in a range of experiments at different sites in the UK and Canada. The data will be combined with other measurements to show the behaviour of the organic aerosol fraction in urban and sub-urban environments.

  5. The Los Alamos universe: Using multimedia to promote laboratory capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kindel, J.

    2000-03-01

    This project consists of a multimedia presentation that explains the technological capabilities of Los Alamos National Laboratory. It takes the form of a human-computer interface built around the metaphor of the universe. The project is intended to promote Laboratory capabilities to a wide audience. Multimedia is simply a means of communicating information through a diverse set of tools--be they text, sound, animation, video, etc. Likewise, Los Alamos National Laboratory is a collection of diverse technologies, projects, and people. Given the ample material available at the Laboratory, there are tangible benefits to be gained by communicating across media. This paper consists of three parts. The first section provides some basic information about the Laboratory, its mission, and its needs. The second section introduces this multimedia presentation and the metaphor it is based on, along with some basic concepts of color and user interaction used in the building of this project. The final section covers construction of the project, pitfalls, and future improvements.

  6. Early experiences in evolving an enterprise-wide information model for laboratory and clinical observations.

    PubMed

    Chen, Elizabeth S; Zhou, Li; Kashyap, Vipul; Schaeffer, Molly; Dykes, Patricia C; Goldberg, Howard S

    2008-11-06

    As Electronic Healthcare Records become more prevalent, there is an increasing need to ensure unambiguous data capture, interpretation, and exchange within and across heterogeneous applications. To address this need, a common, uniform, and comprehensive approach for representing clinical information is essential. At Partners HealthCare System, we are investigating the development and implementation of enterprise-wide information models to specify the representation of clinical information to support semantic interoperability. This paper summarizes our early experiences in: (1) defining a process for information model development, (2) reviewing and comparing existing healthcare information models, (3) identifying requirements for representation of laboratory and clinical observations, and (4) exploring linkages to existing terminology and data standards. These initial findings provide insight to the various challenges ahead and guidance on next steps for adoption of information models at our organization.

  7. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
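    Static analysis of a block language like Scratch typically walks a parsed script looking for patterns that indicate a misconception. The toy check below illustrates the idea only; the data model (scripts as lists of block/argument tuples) is invented for this sketch and is not Hairball's actual API.

    ```python
    def uses_undefined_variable(script, initialized=()):
        """Toy static check in the spirit of block-language analyzers:
        flag 'change variable' blocks whose variable was never set.

        A script is modeled as a list of (block_name, argument) tuples;
        this representation is hypothetical, for illustration only."""
        defined = set(initialized)
        problems = []
        for block, arg in script:
            if block == "set_variable":
                defined.add(arg)
            elif block == "change_variable" and arg not in defined:
                problems.append(arg)
        return problems

    script = [
        ("when_green_flag_clicked", None),
        ("set_variable", "score"),
        ("change_variable", "score"),
        ("change_variable", "lives"),   # never initialized: a common slip
    ]
    print(uses_undefined_variable(script))  # ['lives']
    ```

    Checks of this shape are what make rapid curriculum alterations cheap: each new assignment only needs a small plugin describing the pattern to detect.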

  8. Modeling of low-temperature plasmas generated using laser-induced breakdown spectroscopy: the ChemCam diagnostic tool on the Mars Science Laboratory Rover

    NASA Astrophysics Data System (ADS)

    Colgan, James

    2016-05-01

    We report on efforts to model the low-temperature plasmas generated using laser-induced breakdown spectroscopy (LIBS). LIBS is a minimally invasive technique that can quickly and efficiently determine the elemental composition of a target and is employed in an extremely wide range of applications due to its ease of use and fast turnaround. In particular, LIBS is the diagnostic tool used by the ChemCam instrument on the Mars Science Laboratory rover Curiosity. In this talk, we report on the use of the Los Alamos plasma modeling code ATOMIC to simulate LIBS plasmas, which are typically at temperatures of order 1 eV and electron densities of order 10^16-10^17 cm^-3. At such conditions, these plasmas are usually in local thermodynamic equilibrium (LTE) and normally contain neutral and singly ionized species only, which then requires that modeling use accurate atomic structure data for the element under investigation. Since LIBS devices are often employed in a very wide range of applications, it is therefore desirable to have accurate data for most of the elements in the periodic table, ideally including actinides. Here, we discuss some recent applications of our modeling using ATOMIC that have explored the plasma physics aspects of LIBS-generated plasmas, and in particular discuss the modeling of a plasma formed from a basalt sample used as a ChemCam standard. We also highlight some of the more general atomic physics challenges that are encountered when attempting to model low-temperature plasmas. The Los Alamos National Laboratory is operated by Los Alamos National Security, LLC for the National Nuclear Security Administration of the U.S. Department of Energy under Contract No. DE-AC5206NA25396. Work performed in conjunction with D. P. Kilcrease, H. M. Johns, E. J. Judge, J. E. Barefield, R. C. Wiens, S. M. Clegg.
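    The claim that such plasmas contain mostly neutral and singly ionized species can be checked with a back-of-envelope Saha-equation estimate. The sketch below uses hydrogen purely as an illustrative element (the abstract concerns basalt, a multi-element sample, and ATOMIC's full treatment is far more detailed); only the temperature and density regime is taken from the abstract.

    ```python
    import math

    # Physical constants (SI)
    ME = 9.109e-31   # electron mass, kg
    H = 6.626e-34    # Planck constant, J s
    EV = 1.602e-19   # 1 eV in joules

    def saha_ratio(T_eV, ne_cm3, chi_eV, g_ratio=1.0):
        """n_ion / n_neutral from the Saha equation at temperature T_eV
        and electron density ne_cm3. g_ratio = 2*U_ion/U_neutral, which
        is 1.0 for ground-state hydrogen."""
        kT = T_eV * EV
        ne = ne_cm3 * 1e6  # convert to m^-3
        n_q = (2 * math.pi * ME * kT / H**2) ** 1.5  # quantum concentration
        return g_ratio * (n_q / ne) * math.exp(-chi_eV / T_eV)

    # Conditions from the abstract: ~1 eV, ~1e16 cm^-3; chi = 13.6 eV for H
    r = saha_ratio(1.0, 1e16, 13.6)
    print(f"n_ion/n_neutral ~ {r:.2f}")
    ```

    The ratio comes out of order unity but below one, i.e. a partially, singly ionized plasma, consistent with the abstract's statement.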

  9. Optimizing the ASC WAN: evaluating network performance tools for comparing transport protocols.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lydick, Christopher L.

    2007-07-01

    The Advanced Simulation & Computing Wide Area Network (ASC WAN), which is a high delay-bandwidth network connection between US Department of Energy National Laboratories, is constantly being examined and evaluated for efficiency. One of the current transport-layer protocols in use, TCP, was developed for traffic demands different from those on the ASC WAN. The Stream Control Transmission Protocol (SCTP), on the other hand, has shown characteristics which make it more appealing for networks such as these. Most importantly, before considering a replacement for TCP on any network, a testing tool that performs well against certain criteria needs to be found. In order to find such a tool, two popular networking tools (Netperf v.2.4.3 & v.2.4.6 (OpenSS7 STREAMS), and Iperf v.2.0.6) were tested. These tools implement both TCP and SCTP and were evaluated using four metrics: (1) How effectively can the tool reach a throughput near the bandwidth? (2) How much of the CPU does the tool utilize during operation? (3) Is the tool freely and widely available? And (4) Is the tool actively developed? Following the analysis of those tools, this paper goes further into explaining some recommendations and ideas for future work.
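    On a high delay-bandwidth path, reaching "throughput near the bandwidth" (metric 1) requires the transport window to cover at least the bandwidth-delay product. The figures below are illustrative assumptions, not measured ASC WAN parameters:

    ```python
    def bandwidth_delay_product_bytes(bandwidth_bps, rtt_s):
        """Minimum amount of in-flight data (bytes) a transport window
        must cover to keep a path of the given bandwidth and round-trip
        time fully utilized."""
        return int(bandwidth_bps * rtt_s / 8)

    # Illustrative figures only: a 10 Gb/s link with a 40 ms RTT
    bdp = bandwidth_delay_product_bytes(10e9, 0.040)
    print(f"required window: {bdp / 1e6:.0f} MB")  # 50 MB
    ```

    Default TCP window sizes are far smaller than this, which is why window tuning (or an alternative transport such as SCTP) matters on networks like the ASC WAN.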

  10. SWEIS Yearbook-2012 Comparison of 2012 Data to Projections of the 2008 Site-Wide Environmental Impact Statement for Continued Operation of Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahowald, Hallie B.; Wright, Marjorie Alys

    2014-01-16

    Los Alamos National Laboratory (LANL or the Laboratory) operations data for Calendar Year (CY) 2012 mostly fell within the 2008 Site-Wide Environmental Impact Statement (SWEIS) projections. Operation levels for one LANL facility, the Radiochemistry Facility, exceeded the 2008 SWEIS capability projections; however, none of the capability increases caused exceedances in radioactive air emissions, waste generation, or National Pollutant Discharge Elimination System (NPDES) discharge. Several facilities exceeded the 2008 SWEIS levels for waste generation quantities; however, all were one-time, non-routine events that do not reflect the day-to-day operations of the Laboratory. In addition, total site-wide waste generation quantities were below SWEIS projections for all waste types, reflecting the overall levels of operations at both the Key and Non-Key Facilities. Although gas and electricity consumption remained within the 2008 SWEIS limits for utilities, water consumption exceeded the 2008 SWEIS projections by 27 million gallons in CY 2012.

  11. Making ideas at scientific fabrication laboratories

    NASA Astrophysics Data System (ADS)

    Fonda, Carlo; Canessa, Enrique

    2016-11-01

    Creativity, together with the bringing of ideas to fruition, is essential for progress. Today the evolution from an idea to its application can be facilitated by Fabrication Laboratories, or FabLabs, equipped with affordable digital tools for prototyping. FabLabs aimed at scientific research and invention are now starting to be established inside universities, research centers and schools. We review the setting up of the ICTP Scientific FabLab in Trieste, Italy, give concrete examples of its use in physics, and propose replicating this class of multi-purpose workplaces world-wide within academia as a support for physics and math education and for community development.

  12. Meneco, a Topology-Based Gap-Filling Tool Applicable to Degraded Genome-Wide Metabolic Networks

    PubMed Central

    Prigent, Sylvain; Frioux, Clémence; Dittami, Simon M.; Larhlimi, Abdelhalim; Collet, Guillaume; Gutknecht, Fabien; Got, Jeanne; Eveillard, Damien; Bourdon, Jérémie; Plewniak, Frédéric; Tonon, Thierry; Siegel, Anne

    2017-01-01

    Increasing amounts of sequence data are becoming available for a wide range of non-model organisms. Investigating and modelling the metabolic behaviour of those organisms is highly relevant to understand their biology and ecology. As sequences are often incomplete and poorly annotated, draft networks of their metabolism largely suffer from incompleteness. Appropriate gap-filling methods to identify and add missing reactions are therefore required to address this issue. However, current tools rely on phenotypic or taxonomic information, or are very sensitive to the stoichiometric balance of metabolic reactions, especially concerning the co-factors. This type of information is often not available or at least prone to errors for newly-explored organisms. Here we introduce Meneco, a tool dedicated to the topological gap-filling of genome-scale draft metabolic networks. Meneco reformulates gap-filling as a qualitative combinatorial optimization problem, omitting constraints raised by the stoichiometry of a metabolic network considered in other methods, and solves this problem using Answer Set Programming. Run on several artificial test sets gathering 10,800 degraded Escherichia coli networks Meneco was able to efficiently identify essential reactions missing in networks at high degradation rates, outperforming the stoichiometry-based tools in scalability. To demonstrate the utility of Meneco we applied it to two case studies. Its application to recent metabolic networks reconstructed for the brown algal model Ectocarpus siliculosus and an associated bacterium Candidatus Phaeomarinobacter ectocarpi revealed several candidate metabolic pathways for algal-bacterial interactions. Then Meneco was used to reconstruct, from transcriptomic and metabolomic data, the first metabolic network for the microalga Euglena mutabilis. These two case studies show that Meneco is a versatile tool to complete draft genome-scale metabolic networks produced from heterogeneous data, and to

  13. Meneco, a Topology-Based Gap-Filling Tool Applicable to Degraded Genome-Wide Metabolic Networks.

    PubMed

    Prigent, Sylvain; Frioux, Clémence; Dittami, Simon M; Thiele, Sven; Larhlimi, Abdelhalim; Collet, Guillaume; Gutknecht, Fabien; Got, Jeanne; Eveillard, Damien; Bourdon, Jérémie; Plewniak, Frédéric; Tonon, Thierry; Siegel, Anne

    2017-01-01

    Increasing amounts of sequence data are becoming available for a wide range of non-model organisms. Investigating and modelling the metabolic behaviour of those organisms is highly relevant to understand their biology and ecology. As sequences are often incomplete and poorly annotated, draft networks of their metabolism largely suffer from incompleteness. Appropriate gap-filling methods to identify and add missing reactions are therefore required to address this issue. However, current tools rely on phenotypic or taxonomic information, or are very sensitive to the stoichiometric balance of metabolic reactions, especially concerning the co-factors. This type of information is often not available or at least prone to errors for newly-explored organisms. Here we introduce Meneco, a tool dedicated to the topological gap-filling of genome-scale draft metabolic networks. Meneco reformulates gap-filling as a qualitative combinatorial optimization problem, omitting constraints raised by the stoichiometry of a metabolic network considered in other methods, and solves this problem using Answer Set Programming. Run on several artificial test sets gathering 10,800 degraded Escherichia coli networks Meneco was able to efficiently identify essential reactions missing in networks at high degradation rates, outperforming the stoichiometry-based tools in scalability. To demonstrate the utility of Meneco we applied it to two case studies. Its application to recent metabolic networks reconstructed for the brown algal model Ectocarpus siliculosus and an associated bacterium Candidatus Phaeomarinobacter ectocarpi revealed several candidate metabolic pathways for algal-bacterial interactions. Then Meneco was used to reconstruct, from transcriptomic and metabolomic data, the first metabolic network for the microalga Euglena mutabilis. These two case studies show that Meneco is a versatile tool to complete draft genome-scale metabolic networks produced from heterogeneous data, and to
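    The topological criterion Meneco optimizes over can be illustrated with a producibility ("scope") expansion: a reaction fires once all its substrates are producible, ignoring stoichiometry. The toy network below is invented for illustration and the greedy check is not Meneco's algorithm (which encodes the combinatorial minimization in Answer Set Programming); it only shows the notion of a gap-filling candidate restoring reachability of a target.

    ```python
    def scope(reactions, seeds):
        """Iteratively expand the set of producible metabolites: a
        reaction (substrates, products) fires when every substrate is
        already producible. Stoichiometry is deliberately ignored."""
        producible = set(seeds)
        changed = True
        while changed:
            changed = False
            for substrates, products in reactions:
                if substrates <= producible and not products <= producible:
                    producible |= products
                    changed = True
        return producible

    # Toy draft network: target metabolite T is unreachable from seed A...
    draft = [({"A"}, {"B"}), ({"B", "C"}, {"T"})]
    # ...until a candidate gap-filling reaction producing C is added.
    candidate = ({"A"}, {"C"})

    print("before gap-filling:", "T" in scope(draft, {"A"}))              # False
    print("after gap-filling: ", "T" in scope(draft + [candidate], {"A"}))  # True
    ```

    Gap-filling then amounts to choosing a minimal set of such candidate reactions from a reference database so that all target metabolites fall inside the scope of the seeds.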

  14. Students' Perceptions of the Effectiveness of the World Wide Web as a Research and Teaching Tool in Science Learning.

    ERIC Educational Resources Information Center

    Ng, Wan; Gunstone, Richard

    2002-01-01

    Investigates the use of the World Wide Web (WWW) as a research and teaching tool in promoting self-directed learning groups of 15-year-old students. Discusses the perceptions of students of the effectiveness of the WWW in assisting them with the construction of knowledge on photosynthesis and respiration. (Contains 33 references.) (Author/YDS)

  15. Wide Angle Movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles per pixel (about 23 kilometers)) were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and the Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  16. Using Modern Solid-State Analytical Tools for Investigations of an Advanced Carbon Capture Material: Experiments for the Inorganic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai

    2016-01-01

    A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…

  17. Implementation of a Parameterization Framework for Cybersecurity Laboratories

    DTIC Science & Technology

    2017-03-01

    The goal is to provide the designer of laboratory exercises with tools to parameterize labs for each student, and to automate some aspects of the grading of laboratory exercises. What support might assist the designer of laboratory exercises to achieve the following? 1. Verify that students performed lab exercises, with some

  18. Gene silencing by siRNAs and antisense oligonucleotides in the laboratory and the clinic

    PubMed Central

    Watts, Jonathan K.; Corey, David R.

    2014-01-01

    Synthetic nucleic acids are commonly used laboratory tools for modulating gene expression and have the potential to be widely used in the clinic. Progress towards nucleic acid drugs, however, has been slow and many challenges remain to be overcome before their full impact on patient care can be understood. Antisense oligonucleotides (ASOs) and small interfering RNAs (siRNAs) are the two most widely used strategies for silencing gene expression. We first describe these two approaches and contrast their relative strengths and weaknesses for laboratory applications. We then review the choices faced during development of clinical candidates and the current state of clinical trials. Attitudes towards clinical development of nucleic acid silencing strategies have repeatedly swung from optimism to depression during the past twenty years. Our goal is to provide the information needed to design robust studies with oligonucleotides, making use of the strengths of each oligonucleotide technology. PMID:22069063

  19. Memory management in genome-wide association studies

    PubMed Central

    2009-01-01

    Genome-wide association is a powerful tool for the identification of genes that underlie common diseases. Genome-wide association studies generate billions of genotypes and pose significant computational challenges for most users, including limited computer memory. We applied a recently developed memory management tool to two analyses of North American Rheumatoid Arthritis Consortium studies and measured the performance in terms of central processing unit and memory usage. We conclude that our memory management approach is simple, efficient, and effective for genome-wide association studies. PMID:20018047
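
    The core idea, bounding memory by streaming the genotype matrix instead of loading it whole, can be sketched as follows (the file layout, sizes, and allele-frequency computation are hypothetical illustrations, not the tool described in the study):

    ```python
    import os
    import random
    import tempfile

    # Hypothetical binary genotype file: one byte per call (0, 1, or 2),
    # row-major, n_samples rows of n_snps columns each.
    n_samples, n_snps = 200, 1000
    random.seed(0)
    path = os.path.join(tempfile.mkdtemp(), "genotypes.dat")
    with open(path, "wb") as fh:
        for _ in range(n_samples):
            fh.write(bytes(random.randrange(3) for _ in range(n_snps)))

    # Stream one sample (row) at a time: peak memory stays O(n_snps)
    # instead of O(n_samples * n_snps) for a full in-memory matrix.
    sums = [0] * n_snps
    with open(path, "rb") as fh:
        while (row := fh.read(n_snps)):
            for j, g in enumerate(row):
                sums[j] += g

    freqs = [s / (2 * n_samples) for s in sums]  # per-SNP allele frequency
    assert len(freqs) == n_snps
    ```

    Real GWAS tools use compressed binary formats and vectorized kernels, but the memory behaviour they manage is the same: accumulators sized by the number of SNPs, not by the full genotype matrix.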

  20. Fast 3D Net Expeditions: Tools for Effective Scientific Collaboration on the World Wide Web

    NASA Technical Reports Server (NTRS)

    Watson, Val; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D (three dimensional), high resolution, dynamic, interactive viewing of scientific data. The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self-controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG (Moving Picture Experts Group) movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over using video format are: (1) The visual is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video format transmitted over the network. (2) The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). (3) A rich variety of guided expeditions through the data can be included easily. (4) A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of

  1. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation' to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.

  2. Safety in the Chemical Laboratory: Laboratory Air Quality: Part I. A Concentration Model.

    ERIC Educational Resources Information Center

    Butcher, Samuel S.; And Others

    1985-01-01

    Offers a simple model for estimating vapor concentrations in instructional laboratories. Three methods are described for measuring ventilation rates, and the results of measurements in six laboratories are presented. The model should provide a simple screening tool for evaluating worst-case personal exposures. (JN)
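
    A generic well-mixed-room screening model of the kind described can be sketched as follows (the paper's exact formulation is not reproduced here, and all numbers below are hypothetical):

    ```python
    import math

    # One-box, well-mixed room: constant emission G into volume V with
    # ventilation flow Q gives dC/dt = G/V - (Q/V)*C, so
    #   C(t) = (G/Q) * (1 - exp(-Q*t/V)),  plateauing at G/Q.
    def concentration(G_mg_per_min, Q_m3_per_min, V_m3, t_min):
        """Vapor concentration (mg/m^3) after t minutes of constant emission."""
        steady = G_mg_per_min / Q_m3_per_min
        return steady * (1.0 - math.exp(-Q_m3_per_min * t_min / V_m3))

    # Example: 50 mg/min evaporation, 20 m^3/min ventilation, 300 m^3 lab.
    c_steady = 50 / 20                     # 2.5 mg/m^3 worst-case plateau
    c_30 = concentration(50, 20, 300, 30)  # concentration after 30 minutes
    assert c_30 < c_steady
    ```

    The steady-state ratio G/Q is the simple worst-case screening number: it depends only on the emission rate and the ventilation rate, which is why measuring ventilation rates is central to the model.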

  3. Dynamic optimization case studies in DYNOPT tool

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    Dynamic programming is typically applied to optimization problems. As analytical solutions are generally very difficult to obtain, software tools are widely used. These software packages are often third-party products bound to standard simulation software tools on the market. TOMLAB and DYNOPT are typical examples of tools that can be effectively applied to dynamic programming problems. DYNOPT is presented in this paper due to its licensing policy (a free product under the GPL) and its simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory from a given description of the process, the cost to be minimized, and equality and inequality constraints, using the method of orthogonal collocation on finite elements. The optimal control problem is solved by complete parameterization of both the control and state profile vectors. It is assumed that the optimized dynamic model may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capability of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT in the field of dynamic optimization problems by means of case studies on selected laboratory physical educational models.
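
    The opening point, dynamic programming applied to an optimization problem, can be illustrated with a minimal backward value iteration on a toy discretized control problem (all dynamics and costs are hypothetical; DYNOPT itself uses orthogonal collocation and nonlinear programming rather than this recursion):

    ```python
    # Toy optimal control by backward value iteration (Bellman recursion).
    # Dynamics: x_{k+1} = x_k + u_k, stage cost u_k**2, horizon N = 3,
    # drive x from 0 to the target state 3.
    N, target = 3, 3
    states = range(0, 4)
    controls = (0, 1, 2, 3)

    INF = float("inf")
    V = {x: (0.0 if x == target else INF) for x in states}  # terminal cost
    for _ in range(N):                                      # step backwards in time
        V = {x: min((u * u + V.get(x + u, INF) for u in controls), default=INF)
             for x in states}

    # Spreading the move over three steps (u = 1, 1, 1, cost 3) beats one
    # big step (u = 3, cost 9).
    assert V[0] == 3.0
    ```

    Collocation methods reach the same kind of optimum by parameterizing the continuous control and state profiles and handing the resulting finite-dimensional problem to a nonlinear programming solver.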

  4. Accuracy of Laboratory Data Communication on ICU Daily Rounds Using an Electronic Health Record.

    PubMed

    Artis, Kathryn A; Dyer, Edward; Mohan, Vishnu; Gold, Jeffrey A

    2017-02-01

    Accurately communicating patient data during daily ICU rounds is critically important since data provide the basis for clinical decision making. Despite its importance, high fidelity data communication during interprofessional ICU rounds is assumed, yet unproven. We created a robust but simple methodology to measure the prevalence of inaccurately communicated (misrepresented) data and to characterize data communication failures by type. We also assessed how commonly the rounding team detected data misrepresentation and whether data communication was impacted by environmental, human, and workflow factors. Direct observation of verbalized laboratory data during daily ICU rounds compared with data within the electronic health record and on presenters' paper prerounding notes. Twenty-six-bed academic medical ICU with a well-established electronic health record. ICU rounds presenter (medical student or resident physician), interprofessional rounding team. None. During 301 observed patient presentations including 4,945 audited laboratory results, presenters used a paper prerounding tool for 94.3% of presentations but tools contained only 78% of available electronic health record laboratory data. Ninety-six percent of patient presentations included at least one laboratory misrepresentation (mean, 6.3 per patient) and 38.9% of all audited laboratory data were inaccurately communicated. Most misrepresentation events were omissions. Only 7.8% of all laboratory misrepresentations were detected. Despite a structured interprofessional rounding script and a well-established electronic health record, clinician laboratory data retrieval and communication during ICU rounds at our institution was poor, prone to omissions and inaccuracies, yet largely unrecognized by the rounding team. This highlights an important patient safety issue that is likely widely prevalent, yet underrecognized.

  5. Atomic Oxygen Erosion Yield Predictive Tool for Spacecraft Polymers in Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Bank, Bruce A.; de Groh, Kim K.; Backus, Jane A.

    2008-01-01

    A predictive tool was developed to estimate the low Earth orbit (LEO) atomic oxygen erosion yield of polymers based on the results of the Polymer Erosion and Contamination Experiment (PEACE) Polymers experiment flown as part of the Materials International Space Station Experiment 2 (MISSE 2). The MISSE 2 PEACE experiment accurately measured the erosion yield of a wide variety of polymers and pyrolytic graphite. The 40 different materials tested were selected specifically to represent a variety of polymers used in space as well as a wide variety of polymer chemical structures. The resulting erosion yield data were used to develop a predictive tool which utilizes chemical structure and physical properties of polymers that can be measured in ground laboratory testing to predict the in-space atomic oxygen erosion yield of a polymer. The properties include chemical structure, bonding information, density and ash content. The resulting predictive tool has a correlation coefficient of 0.914 when compared with actual MISSE 2 space data for 38 polymers and pyrolytic graphite. The intent of the predictive tool is to be able to make estimates of atomic oxygen erosion yields for new polymers without requiring expensive and time-consuming in-space testing.
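
    The property-based prediction idea can be sketched with a simple least-squares fit. The density and erosion-yield numbers below are hypothetical stand-ins, not MISSE 2 data, and the real tool combines several properties rather than density alone:

    ```python
    def linfit(xs, ys):
        """Ordinary least squares for y = a + b*x; returns (a, b)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        return my - b * mx, b

    def corr(xs, ys):
        """Pearson correlation coefficient."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

    # Hypothetical ground-measurable property (density) vs. erosion yield.
    density = [0.9, 1.1, 1.2, 1.4, 1.7]   # g/cm^3
    erosion = [4.0, 3.5, 3.2, 2.6, 1.9]   # 10^-24 cm^3/atom
    a, b = linfit(density, erosion)
    predicted = [a + b * x for x in density]
    # Predicted-vs-observed correlation is the figure of merit, analogous
    # to the 0.914 coefficient reported for the actual tool.
    assert corr(predicted, erosion) > 0.99
    ```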

  6. Virtual Special Issue on Catalysis at the U.S. Department of Energy’s National Laboratories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pruski, Marek; Sadow, Aaron; Slowing, Igor

    Catalysis research at the U.S. Department of Energy's (DOE's) National Laboratories covers a wide range of research topics in heterogeneous catalysis, homogeneous/molecular catalysis, electrocatalysis, and surface science. Since much of the work at National Laboratories is funded by DOE, the research is largely focused on addressing DOE’s mission to ensure America’s security and prosperity by addressing its energy, environmental, and nuclear challenges through transformative science and technology solutions. The catalysis research carried out at the DOE National Laboratories ranges from very fundamental catalysis science, funded by DOE’s Office of Basic Energy Sciences (BES), to applied research and development (R&D) in areas such as biomass conversion to fuels and chemicals, fuel cells, and vehicle emission control with primary funding from DOE’s Office of Energy Efficiency and Renewable Energy. National Laboratories are home to many DOE Office of Science national scientific user facilities that provide researchers with the most advanced tools of modern science, including accelerators, colliders, supercomputers, light sources, and neutron sources, as well as facilities for studying the nanoworld and the terrestrial environment. National Laboratory research programs typically feature teams of researchers working closely together, often joining scientists from different disciplines to attack scientific and technical problems using a variety of tools and techniques available at the DOE national scientific user facilities. Along with collaboration between National Laboratory scientists, interactions with university colleagues are common in National Laboratory catalysis R&D. In some cases, scientists have joint appointments at a university and a National Laboratory.

  7. Virtual Special Issue on Catalysis at the U.S. Department of Energy’s National Laboratories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pruski, Marek; Sadow, Aaron D.; Slowing, Igor I.

    Catalysis research at the U.S. Department of Energy’s (DOE’s) National Laboratories covers a wide range of research topics in heterogeneous catalysis, homogeneous/molecular catalysis, biocatalysis, electrocatalysis, and surface science. Since much of the work at National Laboratories is funded by DOE, the research is largely focused on addressing DOE’s mission to ensure America’s security and prosperity by addressing its energy, environmental, and nuclear challenges through transformative science and technology solutions. The catalysis research carried out at the DOE National Laboratories ranges from very fundamental catalysis science, funded by DOE’s Office of Basic Energy Sciences (BES), to applied research and development (R&D) in areas such as biomass conversion to fuels and chemicals, fuel cells, and vehicle emission control with primary funding from DOE’s Office of Energy Efficiency and Renewable Energy. National Laboratories are home to many DOE Office of Science national scientific user facilities that provide researchers with the most advanced tools of modern science, including accelerators, colliders, supercomputers, light sources, and neutron sources, as well as facilities for studying the nanoworld and the terrestrial environment. National Laboratory research programs typically feature teams of researchers working closely together, often joining scientists from different disciplines to tackle scientific and technical problems using a variety of tools and techniques available at the DOE national scientific user facilities. Along with collaboration between National Laboratory scientists, interactions with university colleagues are common in National Laboratory catalysis R&D. In some cases, scientists have joint appointments at a university and a National Laboratory.

  8. PIMMS (Pragmatic Insertional Mutation Mapping System) Laboratory Methodology a Readily Accessible Tool for Identification of Essential Genes in Streptococcus

    PubMed Central

    Blanchard, Adam M.; Egan, Sharon A.; Emes, Richard D.; Warry, Andrew; Leigh, James A.

    2016-01-01

    The Pragmatic Insertional Mutation Mapping (PIMMS) laboratory protocol was developed alongside various bioinformatics packages (Blanchard et al., 2015) to enable detection of essential and conditionally essential genes in Streptococcus and related bacteria. This extended the methodology commonly used to locate insertional mutations in individual mutants to the analysis of mutations in populations of bacteria. In Streptococcus uberis, a pyogenic Streptococcus associated with intramammary infection and mastitis in ruminants, the mutagen pGhost9:ISS1 was shown to integrate across the entire genome. Analysis of >80,000 mutations revealed 196 coding sequences which could not be mutated and a further 67 where mutation only occurred beyond the 90th percentile of the coding sequence. These sequences showed good concordance with sequences within the database of essential genes and typically matched sequences known to be associated with basic cellular functions. Due to the broad utility of this mutagen and the simplicity of the methodology it is anticipated that PIMMS will be of value to a wide range of laboratories in functional genomic analysis of a wide range of Gram-positive bacteria (Streptococcus, Enterococcus, and Lactococcus) of medical, veterinary, and industrial significance. PMID:27826289

  9. Cosmic Rays - A World-Wide Student Laboratory

    NASA Astrophysics Data System (ADS)

    Adams, Mark

    2017-01-01

    The QuarkNet program has distributed hundreds of cosmic ray detectors for use in high schools and research facilities throughout the world over the last decade. Data collected by those students have been uploaded to a central server where web-based analysis tools enable users to characterize and to analyze everyone's cosmic ray data. Since muons rain down on everyone in the world, all students can participate in this free, high energy particle environment. Through self-directed inquiry students have designed their own experiments: exploring cosmic ray rates and air shower structure; and using muons to measure their speed, time dilation, lifetime, and effects on biological systems. We also plan to expand our annual International Muon Week project to create a large student-led collaboration where similar cosmic ray measurements are performed simultaneously throughout the world.

  10. Computer-Aided Drafting and Design Series. Educational Resources for the Machine Tool Industry, Course Syllabi, [and] Instructor's Handbook. Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 2-year vocational training program to prepare students for entry-level employment in computer-aided drafting and design in the machine tool industry. The program was developed through a modification of the DACUM (Developing a Curriculum)…

  11. User Guide for the Plotting Software for the Los Alamos National Laboratory Nuclear Weapons Analysis Tools Version 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cleland, Timothy James

    The Los Alamos National Laboratory Plotting Software for the Nuclear Weapons Analysis Tools is a Java™ application based upon the open source library JFreeChart. The software provides a capability for plotting data on graphs with a rich variety of display options while allowing the viewer interaction via graph manipulation and scaling to best view the data. The graph types include XY plots, Date XY plots, Bar plots and Histogram plots.

  12. #LancerHealth: Using Twitter and Instagram as a tool in a campus wide health promotion initiative

    PubMed Central

    Santarossa, Sara; Woodruff, Sarah J.

    2018-01-01

    The present study aimed to explore using popular technology that people already have/use as a health promotion tool, in a campus wide social media health promotion initiative, entitled #LancerHealth. During a two-week period the university community was asked to share photos on Twitter and Instagram of "What does being healthy on campus look like to you?", while tagging the image with #LancerHealth. All publicly tagged media was collected using the Netlytic software and analysed. Text analysis (N=234 records, Twitter; N=141 records, Instagram) revealed that the majority of the conversation was positive and focused on health and the university. Social network analysis, based on five network properties, showed a small network with little interaction. Lastly, photo coding analysis (N=71 unique images) indicated that the majority of the shared images were of physical activity (52%) and on campus (80%). Further research into this area is warranted. Significance for public health: As digital media continues to become a popular tool among both public health organizations and those in academia, it is important to understand how, why, and which platforms individuals are using in regards to their health. This campus wide, social media health promotion initiative found that people will use popular social networking sites like Twitter and Instagram to share their healthy behaviours. Online social networks, created through social networking sites, can play a role in social diffusion of public health information and health behaviours. In this study, however, social network analysis revealed that there needs to be influential and highly connected individuals sharing information to generate social diffusion. This study can help guide future public health research in the area of social media and its potential influence on health promotion. PMID:29780763

  13. Establishment of National Laboratory Standards in Public and Private Hospital Laboratories

    PubMed Central

    ANJARANI, Soghra; SAFADEL, Nooshafarin; DAHIM, Parisa; AMINI, Rana; MAHDAVI, Saeed; MIRAB SAMIEE, Siamak

    2013-01-01

    In September 2007, the national standard manual was finalized and officially announced as the minimal quality requirements for all medical laboratories in the country. Apart from auditing laboratories, the Reference Health Laboratory has performed benchmark audits (surveys) of the medical laboratory network in the provinces. The 12th benchmark was performed in Tehran and Alborz provinces, Iran, in 2010 in three stages. We compared different processes, their quality, and their accordance with national standard measures between public and private hospital laboratories. The assessment tool was a standardized checklist consisting of 164 questions. The analysis shows that although in most cases the standard requirements are implemented more fully in private laboratories, there is still a long way to complete fulfillment of the requirements, and it will take considerable effort. Differences between laboratories in the public and private sectors, especially in laboratory personnel and management processes, are significant. A lack of motivation probably plays a key role in the less desirable results obtained by laboratories in the public sector. PMID:23514840

  14. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory

    PubMed Central

    Kumar, B. Vinodh; Mohan, Thuthi

    2018-01-01

    OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that needs to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level and for level 2 IQCs, same four analytes of level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes <6 sigma level, the quality goal index (QGI) was <0.8 indicating the area requiring improvement to be imprecision except cholesterol whose QGI >1.2 indicated inaccuracy. CONCLUSION: This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
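
    The sigma metric and quality goal index (QGI) used in studies like this one follow standard formulas, sketched below (the total allowable error, bias, and CV values are hypothetical illustrations, not this study's data):

    ```python
    # sigma = (TEa% - |bias%|) / CV%
    # QGI   = |bias%| / (1.5 * CV%)  (<0.8 suggests imprecision,
    #                                 >1.2 suggests inaccuracy)
    def sigma_metric(tea, bias, cv):
        """Sigma level from total allowable error, bias%, and CV%."""
        return (tea - abs(bias)) / cv

    def qgi(bias, cv):
        """Quality goal index, pointing to imprecision vs. inaccuracy."""
        return abs(bias) / (1.5 * cv)

    # Hypothetical level-1 IQC results for two analytes:
    alp = sigma_metric(tea=30.0, bias=2.0, cv=4.0)   # 7.0 -> ideal (>= 6)
    urea = sigma_metric(tea=9.0, bias=3.0, cv=3.0)   # 2.0 -> poor (< 3)
    assert alp >= 6 and urea < 3
    assert qgi(bias=3.0, cv=3.0) < 0.8               # imprecision dominates
    ```

    In practice TEa comes from an external standard (e.g. a proficiency-testing scheme), bias from EQAS results, and CV from the laboratory's own IQC data, exactly the three inputs the study collects.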

  15. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that needs to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level and for level 2 IQCs, same four analytes of level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes <6 sigma level, the quality goal index (QGI) was <0.8 indicating the area requiring improvement to be imprecision except cholesterol whose QGI >1.2 indicated inaccuracy. This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.

  16. Quality Indicators in Laboratory Medicine: the status of the progress of IFCC Working Group "Laboratory Errors and Patient Safety" project.

    PubMed

    Sciacovelli, Laura; Lippi, Giuseppe; Sumarac, Zorica; West, Jamie; Garcia Del Pino Castro, Isabel; Furtado Vieira, Keila; Ivanov, Agnes; Plebani, Mario

    2017-03-01

    The knowledge of error rates is essential in all clinical laboratories as it enables them to accurately identify their risk level, and compare it with those of other laboratories in order to evaluate their performance in relation to the State-of-the-Art (i.e. benchmarking) and define priorities for improvement actions. Although no activity is risk free, it is widely accepted that the risk of error is minimized by the use of Quality Indicators (QIs) managed as a part of laboratory improvement strategy and proven to be suitable monitoring and improvement tools. The purpose of QIs is to keep the error risk at a level that minimizes the likelihood of harm to patients. However, identifying a suitable State-of-the-Art is challenging, because it calls for the knowledge of error rates measured in a variety of laboratories throughout the world that differ in their organization and management, context, and the population they serve. Moreover, it also depends on the choice of the events to keep under control and the individual procedure for measurement. Although many laboratory professionals believe that the systemic use of QIs in Laboratory Medicine may be effective in decreasing errors occurring throughout the total testing process (TTP), to improve patient safety as well as to satisfy the requirements of International Standard ISO 15189, they find it difficult to maintain standardized and systematic data collection, and to promote continued high levels of interest, commitment and dedication in the entire staff. Although many laboratories worldwide express a willingness to participate in the Model of QIs (MQI) project of the IFCC Working Group "Laboratory Errors and Patient Safety", few systematically enter/record their own results and/or use a number of QIs designed to cover all phases of the TTP. Many laboratories justify their inadequate participation in data collection of QIs by claiming that the number of QIs included in the MQI is excessive. However, an analysis of results suggests

  17. Improvement of laboratory turnaround time using lean methodology.

    PubMed

    Gupta, Shradha; Kapil, Sahil; Sharma, Monica

    2018-05-14

    Purpose The purpose of this paper is to discuss the implementation of lean methodology to reduce the turnaround time (TAT) of a clinical laboratory in a super speciality hospital. Delays in report delivery lead to delayed diagnosis, increased waiting time and decreased customer satisfaction. The reduction in TAT will lead to increased patient satisfaction, quality of care, employee satisfaction and ultimately the hospital's revenue. Design/methodology/approach The generic causes of increasing TAT in clinical laboratories were identified using lean tools and techniques such as value stream mapping (VSM), Gemba, Pareto analysis and root cause analysis. VSM was used as a tool to analyze the current state of the process and then to design the future state with suggestions for process improvements. Findings This study identified 12 major non-value-added factors for the hematology laboratory and 5 major non-value-added factors for the biochemistry laboratory which were acting as bottlenecks limiting throughput. A four-month research study by the authors together with the hospital quality department and laboratory staff led to a reduction of the average TAT from 180 to 95 minutes in the hematology laboratory and from 268 to 208 minutes in the biochemistry laboratory. Practical implications Very few improvement initiatives in Indian healthcare are based on industrial engineering tools and techniques, which might be due to a lack of interaction between healthcare and engineering. The study provides a positive outcome in terms of improving the efficiency of services in hospitals and identifies a scope for lean in the Indian healthcare sector. Social implications Applying lean in the Indian healthcare sector offers its own potential solution to the problems caused by a wide gap between lean accessibility and lean implementation. Lean helped in changing the mindset of an organization toward providing the highest quality of services with faster delivery at
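The Pareto analysis step mentioned above can be sketched in a few lines: rank candidate delay causes by their contribution to TAT and keep the "vital few" that account for roughly 80% of the delay. The cause names and minute values below are invented for illustration, not the study's data.

```python
# Illustrative Pareto analysis of hypothetical non-value-added delay causes.

def pareto_vital_few(delays: dict[str, float], cutoff: float = 0.8) -> list[str]:
    """Return causes, largest first, whose cumulative share first reaches cutoff."""
    total = sum(delays.values())
    ranked = sorted(delays.items(), key=lambda kv: kv[1], reverse=True)
    vital, cum = [], 0.0
    for cause, minutes in ranked:
        vital.append(cause)
        cum += minutes / total
        if cum >= cutoff:
            break
    return vital

delays = {  # invented contributions to TAT, in minutes
    "batching before centrifugation": 40.0,
    "manual result verification": 25.0,
    "specimen transport wait": 15.0,
    "relabelling errors": 10.0,
    "analyzer queue": 10.0,
}
print(pareto_vital_few(delays))
```

The improvement effort then targets only the returned causes, which is the essence of using Pareto analysis to prioritize TAT bottlenecks.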

  18. FSPP: A Tool for Genome-Wide Prediction of smORF-Encoded Peptides and Their Functions

    PubMed Central

    Li, Hui; Xiao, Li; Zhang, Lili; Wu, Jiarui; Wei, Bin; Sun, Ninghui; Zhao, Yi

    2018-01-01

    smORFs are small open reading frames of less than 100 codons. Recent low-throughput experiments showed that many smORF-encoded peptides (SEPs) play crucial roles in processes such as regulation of transcription or translation, transport through membranes, and antimicrobial activity. In order to gather more functional SEPs, it is necessary to have access to genome-wide prediction tools to give profound directions for low-throughput experiments. In this study, we put forward a functional smORF-encoded peptides predictor (FSPP), designed to predict authentic SEPs and their functions in a high-throughput manner. FSPP used the overlap of detected SEPs from Ribo-seq and mass spectrometry as target objects. With the expression data on transcription and translation levels, FSPP built two co-expression networks. Combining these with co-location relations, FSPP constructed a compound network and then annotated SEPs with the functions of adjacent nodes. Tested on 38 sequenced samples of 5 human cell lines, FSPP successfully predicted 856 out of 960 annotated proteins. Interestingly, FSPP also highlighted 568 functional SEPs from these samples. After comparison, the roles predicted by FSPP were consistent with known functions. These results suggest that FSPP is a reliable tool for the identification of functional small peptides. FSPP source code can be acquired at https://www.bioinfo.org/FSPP. PMID:29675032

  19. Software Engineering Laboratory (SEL) compendium of tools, revision 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of programs used to aid software product development is listed. Known as software tools, such programs include requirements analyzers, design languages, precompilers, code auditors, code analyzers, and software librarians. Abstracts, resource requirements, documentation, processing summaries, and availability are indicated for most tools.

  20. Thai clinical laboratory responsible to economic crisis.

    PubMed

    Sirisali, K; Vattanaviboon, P; Manochiopinij, S; Ananskulwat, W

    1999-01-01

    Nowadays, Thailand encounters a serious economic crisis. A clear consensus has emerged that a cost-saving system must be the important tool. Both private and government organizations are engaged in this situation. We studied cost-saving in the clinical laboratory. A questionnaire was distributed to 45 hospital laboratories located in Bangkok. Results showed that efforts to control cost are an essential policy. A variety of factors contributed to the cost-saving process. The usage of public utilities, non-recyclable materials and unnecessary utilities was reconsidered. Besides, capital costs (wages and salary) and personnel incentives were assessed. Forty-three of the 45 respondents had attempted to reduce cost by curtailing unnecessary electricity. Eliminating needless use of telephone calls, water and unnecessary materials was also an effective strategy; reductions of 86.9%, 80.0% and 80.0% respectively were reported for these factors. Attention was also focused on the inventory system for reagents, chemicals and supplies. Most of the laboratories had a policy of cost-saving by decreasing storage. Twenty-eight of the 45 laboratories considered purchasing cheaper reagents of similar quality instead, and some would purchase a bulk pack when it was the best bargain. A specific system, "contract reagent with a free rent instrument", has been used widely (33.3%). Finally, a new personnel management system was chosen: workload was rearranged and unnecessary extra-hour work was abandoned.

  1. Vision in laboratory rodents-Tools to measure it and implications for behavioral research.

    PubMed

    Leinonen, Henri; Tanila, Heikki

    2017-07-29

    Mice and rats are nocturnal mammals and their vision is specialized for detection of motion and contrast in dim light conditions. These species possess a large proportion of UV-sensitive cones in their retinas and the majority of their optic nerve axons target the superior colliculus rather than the visual cortex. Therefore, it was a widely held belief that laboratory rodents hardly utilize vision during day-time behavior. This dogma is being questioned as accumulating evidence suggests that laboratory rodents are able to perform complex visual functions, such as perceiving subjective contours, and that declined vision may affect their performance in many behavioral tasks. For instance, genetic engineering may have unexpected consequences on vision, as mouse models of Alzheimer's and Huntington's diseases show declined visual function. Rodent vision can be tested in numerous ways using operant training or reflex-based behavioral tasks, or alternatively using electrophysiological recordings. In this article, we will first provide a summary of the visual system and explain the characteristics unique to rodents. Then, we present well-established techniques to test rodent vision, with an emphasis on pattern vision: visual water test, optomotor reflex test, pattern electroretinography and pattern visual evoked potentials. Finally, we highlight the importance of visual phenotyping in rodents. As the number of genetically engineered rodent models and the volume of behavioral testing increase simultaneously, the possibility of visual dysfunctions needs to be addressed. Neglect in this matter potentially leads to crude biases in the field of neuroscience and beyond. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Web Delivery of Interactive Laboratories: Comparison of Three Authoring Tools.

    NASA Astrophysics Data System (ADS)

    Silbar, Richard R.

    2001-11-01

    It is well-known that the more the end user (e.g., a student) interacts with a subject, the better he or she will learn it. This is particularly so in technical subjects. One way to do this is to have "laboratories" in which the student manipulates objects on the screen with keyboard or mouse and then sees the outcome of those actions. An example of such a laboratory can be seen at http://www.whistlesoft.com/silbar/demo/vecadd, which deals with the addition of two vectors in the geometric approach. This laboratory was built using Macromedia's Authorware. The problem with Authorware for this purpose is that delivering the training over the Web requires the download and installation of a big plug-in. As an experiment I recently tried to build clones of the Vector Addition Laboratory using Macromedia's Director or Flash, each of which has a smaller plug-in that is often already installed in the user's browser. I was able to come up with Director and Flash versions that are similar to (but definitely not the same as) the Authorware version. This talk goes into these differences and demonstrates the techniques used.

  3. Clay Caterpillars: A Tool for Ecology & Evolution Laboratories

    ERIC Educational Resources Information Center

    Barber, Nicholas A.

    2012-01-01

    I present a framework for ecology and evolution laboratory exercises using artificial caterpillars made from modeling clay. Students generate and test hypotheses about predation rates on caterpillars that differ in appearance or "behavior" to understand how natural selection by predators shapes distribution and physical characteristics of…

  4. Teaching Contemporary Physics Topics Using Real-Time Data Obtained via the World Wide Web

    NASA Astrophysics Data System (ADS)

    Post-Zwicker, A. P.; Davis, W.; Grip, R.; McKay, M.; Pfaff, R.; Stotler, D. P.

    1999-12-01

    As a teaching tool, the World Wide Web (WWW) is unprecedented in its ability to transmit information and enhance communication between scientist and student. Just beginning to be developed are sites that actively engage the user in the learning process and provide hands-on methods of teaching contemporary topics. These topics are often not found in the classroom due to the complexity and expense of the laboratory equipment and the WWW is an ideal tool for overcoming this difficulty. This paper presents a model for using the Internet to teach high school students about plasma physics and fusion energy. Students are given access to real-time data, virtual experiments, and communication with professional scientists via email. Preliminary data indicate that student collaboration and student-led learning is encouraged when using the site in the classroom. Scientist/student mentoring is enhanced with this form of communication.

  5. Development of a Tool to Recreate the Mars Science Laboratory Aerothermal Environment

    NASA Technical Reports Server (NTRS)

    Beerman, A. F.; Lewis, M. J.; Santos, J. A.; White, T. R.

    2010-01-01

    The Mars Science Laboratory will enter the Martian atmosphere in 2012 with multiple char depth sensors and in-depth thermocouples in its heatshield. The aerothermal environment experienced by MSL may be computationally recreated using the data from the sensors and a material response program, such as the Fully Implicit Ablation and Thermal (FIAT) response program, through the matching of the char depth and thermocouple predictions of the material response program to the sensor data. A tool, CHanging Inputs from the Environment of FIAT (CHIEF), was developed to iteratively change different environmental conditions such that FIAT predictions match within certain criteria applied to an external data set. The computational environment is changed by iterating on the enthalpy, pressure, or heat transfer coefficient at certain times in the trajectory. CHIEF was initially compared against arc-jet test data from the development of the MSL heatshield and then against simulated sensor data derived from design trajectories for MSL. CHIEF was able to match char depth and in-depth thermocouple temperatures within the bounds placed upon it for these cases. Further refinement of CHIEF to compare multiple time points and assign convergence criteria may improve accuracy.
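The iteration CHIEF performs can be illustrated with a toy example: adjust one environment input (here a heat-transfer-coefficient scale factor) until a forward model's predicted char depth matches the "sensor" value within tolerance. The linear forward model, names and numbers below are invented stand-ins; FIAT's actual material response is far more complex.

```python
# Toy version of matching a prediction to sensor data by iterating on an input.

def predicted_char_depth(h_scale: float) -> float:
    """Stand-in forward model: char depth (mm) grows with heating rate."""
    return 2.0 * h_scale + 0.5   # illustrative only, not FIAT

def match_sensor(target_mm: float, lo=0.0, hi=5.0, tol=1e-6) -> float:
    """Bisect on the scale factor until the prediction matches the sensor."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if predicted_char_depth(mid) < target_mm:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

h = match_sensor(3.5)   # sensor reported 3.5 mm of char
print(round(h, 3))
```

CHIEF iterates similarly on enthalpy, pressure, or heat transfer coefficient at multiple trajectory times, against both char depth and thermocouple data, under convergence criteria rather than a simple monotone bisection.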

  6. Electronic laboratory notebooks progress and challenges in implementation.

    PubMed

    Machina, Hari K; Wild, David J

    2013-08-01

    Electronic laboratory notebooks (ELNs) are increasingly replacing paper notebooks in life science laboratories, including those in industry, academic settings, and hospitals. ELNs offer significant advantages over paper notebooks, but adopting them in a predominantly paper-based environment is usually disruptive. The benefits of ELN increase when they are integrated with other laboratory informatics tools such as laboratory information management systems, chromatography data systems, analytical instrumentation, and scientific data management systems, but there is no well-established path for effective integration of these tools. In this article, we review and evaluate some of the approaches that have been taken thus far and also some radical new methods of integration that are emerging.

  7. The Laser Level as an Optics Laboratory Tool

    ERIC Educational Resources Information Center

    Kutzner, Mickey

    2013-01-01

    For decades now, the laser has been used as a handy device for performing ray traces in geometrical optics demonstrations and laboratories. For many ray-trace applications, I have found the laser level to be even more visually compelling and easier for students to use than the laser pointer.

  8. Whole Class Laboratories: More Examples

    ERIC Educational Resources Information Center

    Kouh, Minjoon

    2016-01-01

    Typically, introductory physics courses are taught with a combination of lectures and laboratories in which students have opportunities to discover the natural laws through hands-on activities in small groups. This article reports the use of Google Drive, a free online document-sharing tool, in physics laboratories for pooling experimental data…

  9. To other worlds via the laboratory (Invited)

    NASA Astrophysics Data System (ADS)

    Lorenz, R. D.

    2009-12-01

    Planetary science is fun, largely by virtue of the wide range of disciplines and techniques it embraces. Progress relies not only on spacecraft observation and models, but also on laboratory work to provide reference data with which to interpret observations and to provide quantitative constraints on model parameters. An important distinction should be drawn between two classes of investigation. The most familiar, pursued by those who make laboratory studies the focus of their careers, is the construction of well-controlled experiments, typically to determine the functional dependence of some desired physical property upon one or two controlled parameters such as temperature, pressure or concentration. Another class of experiment is more exploratory - to 'see what happens'. This exercise often reveals that models may be based on entirely false assumptions. In some cases laboratory results also have value as persuasive tools in providing graphic support for unfamiliar properties or processes - the iconic image of 'flaming ice' makes the exotic notion of methane clathrate immediately accessible. This talk will review the role of laboratory work in planetary science and especially the outer solar system. A few of the author's personal forays into laboratory measurements will be discussed in the talk; these include the physical properties of desiccated icy loess in the US Army Permafrost tunnel in Alaska (as a Mars analog), the use of a domestic microwave oven to measure radar absorptivity (in particular of ammonia-rich water ice), and the generation of waves - and ice - on the surface of a liquid by wind with fluid and air parameters appropriate to Mars and Titan rather than Earth using the MARSWIT wind tunnel at NASA Ames.

  10. Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Dean N.

    2011-07-20

    This report summarizes work carried out by the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Team for the period of January 1, 2011 through June 30, 2011. It discusses highlights, overall progress, period goals, and collaborations and lists papers and presentations. To learn more about our project, please visit our UV-CDAT website (URL: http://uv-cdat.org). This report will be forwarded to the program manager for the Department of Energy (DOE) Office of Biological and Environmental Research (BER), national and international collaborators and stakeholders, and to researchers working on a wide range of other climate model, reanalysis, and observation evaluation activities. The UV-CDAT executive committee consists of Dean N. Williams of Lawrence Livermore National Laboratory (LLNL); Dave Bader and Galen Shipman of Oak Ridge National Laboratory (ORNL); Phil Jones and James Ahrens of Los Alamos National Laboratory (LANL); Claudio Silva of Polytechnic Institute of New York University (NYU-Poly); and Berk Geveci of Kitware, Inc. The UV-CDAT team consists of researchers and scientists with diverse domain knowledge whose home institutions also include the National Aeronautics and Space Administration (NASA) and the University of Utah. All work is accomplished under DOE open-source guidelines and in close collaboration with the project's stakeholders, domain researchers, and scientists. Working directly with BER climate science analysis projects, this consortium will develop and deploy data and computational resources useful to a wide variety of stakeholders, including scientists, policymakers, and the general public. Members of this consortium already collaborate with other institutions and universities in researching data discovery, management, visualization, workflow analysis, and provenance. The UV-CDAT team will address the following high-level visualization requirements: (1) Alternative parallel streaming statistics and analysis pipelines

  11. RealTime Physics: Active learning laboratory

    NASA Astrophysics Data System (ADS)

    Thornton, Ronald K.; Sokoloff, David R.

    1997-03-01

    Our research shows that student learning of physics concepts in introductory physics courses is enhanced by the use of special guided discovery laboratory curricula which embody the results of educational research and which are supported by the use of the Tools for Scientific Thinking microcomputer-based laboratory (MBL) tools. In this paper we first describe the general characteristics of the research-based RealTime Physics laboratory curricula developed for use in introductory physics classes in colleges, universities and high schools. We then describe RealTime Physics Mechanics in detail. Finally we examine student learning of dynamics in traditional physics courses and in courses using RealTime Physics Mechanics, primarily by the use of correlated questions on the Force and Motion Conceptual Evaluation. We present considerable evidence that students who use the new laboratory curricula demonstrate significantly improved learning and retention of dynamics concepts compared to students taught by traditional methods.

  12. Google+ as a Tool for Use in Cooperative Laboratory Activities between Universities

    ERIC Educational Resources Information Center

    Puig-Ortiz, Joan; Pàmies-Vilà, Rosa; Martinez Miralles, Jordi Ramon

    2015-01-01

    The following is a proposal for collaboration between universities with the aim to improve curricula that require laboratory activities. A methodology is suggested to implement an innovative educational project involving the exchange of laboratory activities. The exchange of laboratory activities can be carried out on different levels of…

  13. MRMPROBS: a data assessment and metabolite identification tool for large-scale multiple reaction monitoring based widely targeted metabolomics.

    PubMed

    Tsugawa, Hiroshi; Arita, Masanori; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Bamba, Takeshi; Fukusaki, Eiichiro

    2013-05-21

    We developed a new software program, MRMPROBS, for widely targeted metabolomics using the large-scale multiple reaction monitoring (MRM) mode. This strategy has become increasingly popular for the simultaneous analysis of up to several hundred metabolites with high sensitivity, selectivity, and quantitative capability. However, the traditional method of assessing measured metabolomics data without probabilistic criteria is not only time-consuming but often subjective and makeshift work. Our program overcomes these problems by detecting and identifying metabolites automatically, separating isomeric metabolites, and removing background noise using a probabilistic score defined as the odds ratio from an optimized multivariate logistic regression model. Our software program also provides a user-friendly graphical interface to curate and organize data matrices and to apply principal component analyses and statistical tests. As a demonstration, we conducted a widely targeted metabolome analysis (152 metabolites) of propagating Saccharomyces cerevisiae measured at 15 time points by gas and liquid chromatography coupled to triple quadrupole mass spectrometry. MRMPROBS is a useful and practical tool for the assessment of large-scale MRM data available to any instrument or any experimental condition.
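The odds-ratio scoring idea can be sketched as follows: a multivariate logistic regression combines peak features into a probability that a candidate peak is the target metabolite, reported as odds p/(1-p). The feature names, weights and thresholds below are invented for illustration and are not MRMPROBS's trained model.

```python
# Toy logistic-regression peak score reported as odds, as a hedged sketch.
import math

WEIGHTS = {"rt_similarity": 3.0, "ratio_similarity": 2.5, "intensity": 1.2}
BIAS = -4.0  # invented intercept

def peak_odds(features: dict[str, float]) -> float:
    """Odds that a peak is authentic, from a toy logistic model."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    p = 1.0 / (1.0 + math.exp(-z))
    return p / (1.0 - p)   # odds; algebraically equal to exp(z)

good = peak_odds({"rt_similarity": 0.98, "ratio_similarity": 0.95, "intensity": 0.8})
noise = peak_odds({"rt_similarity": 0.30, "ratio_similarity": 0.20, "intensity": 0.1})
print(good > 1.0 > noise)   # authentic peaks score above even odds
```

Thresholding such a score is what lets the tool separate isomeric metabolites and background noise automatically instead of by manual inspection.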

  14. Variability of ethics education in laboratory medicine training programs: results of an international survey.

    PubMed

    Bruns, David E; Burtis, Carl A; Gronowski, Ann M; McQueen, Matthew J; Newman, Anthony; Jonsson, Jon J

    2015-03-10

    Ethical considerations are increasingly important in medicine. We aimed to determine the mode and extent of teaching of ethics in training programs in clinical chemistry and laboratory medicine. We developed an on-line survey of teaching in areas of ethics relevant to laboratory medicine. Responses were invited from directors of training programs, who were recruited via emails to the leaders of national organizations. The survey was completed by 80 directors from 24 countries who directed 113 programs. The largest numbers of respondents directed postdoctoral training of scientists (42%) or physicians (33%), post-masters degree programs (33%), and PhD programs (29%). Most programs (82%) were 2 years or longer in duration. Formal training was offered in research ethics by 39%, medical ethics by 31%, professional ethics by 24% and business ethics by 9%. The number of reported hours of formal training varied widely, e.g., from 0 to >15 h/year for research ethics and from 0 to >15 h for medical ethics. Ethics training was required and/or tested in 75% of programs that offered training. A majority (54%) of respondents reported plans to add or enhance training in ethics; many indicated a desire for online resources related to ethics, especially resources with self-assessment tools. Formal teaching of ethics is absent from many training programs in clinical chemistry and laboratory medicine, with heterogeneity in the extent and methods of ethics training among the programs that provide the training. A perceived need exists for online training tools, especially tools with self-assessment components. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. SwarmSight: Measuring the Temporal Progression of Animal Group Activity Levels from Natural Scene and Laboratory Videos

    PubMed Central

    Birgiolas, Justas; Jernigan, Christopher M.; Smith, Brian H.; Crook, Sharon M.

    2016-01-01

    We describe SwarmSight (available at: https://github.com/justasb/SwarmSight), a novel, open-source, Microsoft Windows software tool for quantitative assessment of the temporal progression of animal group activity levels from recorded videos. The tool utilizes a background subtraction machine vision algorithm and provides an activity metric that can be used to quantitatively assess and compare animal group behavior. Here we demonstrate the tool utility by analyzing defensive bee behavior as modulated by alarm pheromones, wild bird feeding onset and interruption, and cockroach nest finding activity. While more sophisticated, commercial software packages are available, SwarmSight provides a low-cost, open-source, and easy-to-use alternative that is suitable for a wide range of users, including minimally trained research technicians and behavioral science undergraduate students in classroom laboratory settings. PMID:27130170
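The background-subtraction activity metric can be reduced to a minimal sketch: difference consecutive grayscale frames and count the pixels whose brightness changed beyond a threshold. SwarmSight's actual implementation is more elaborate; the tiny toy frames and threshold below are invented for illustration.

```python
# Minimal frame-differencing activity metric, in the spirit of the tool above.
import numpy as np

def activity(prev: np.ndarray, curr: np.ndarray, thresh: int = 20) -> int:
    """Number of pixels that changed between two grayscale frames."""
    diff = np.abs(curr.astype(int) - prev.astype(int))  # avoid uint8 wraparound
    return int((diff > thresh).sum())

frame1 = np.zeros((4, 4), dtype=np.uint8)
frame2 = frame1.copy()
frame2[1:3, 1:3] = 200          # a "bee" moves into a 2x2 patch
print(activity(frame1, frame2))  # 4 pixels changed
```

Plotting this count per frame over a whole video yields the temporal activity curve that the tool uses to compare group behavior across conditions.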

  16. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud

    PubMed Central

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Background Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. Results We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. Conclusions This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and

  17. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud.

    PubMed

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. 
We discuss scope, design considerations and technical and logistical constraints, and explore the

  18. Next generation tools for genomic data generation, distribution, and visualization

    PubMed Central

    2010-01-01

    Background With the rapidly falling cost and availability of high throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Results Here we present three open-source, platform independent, software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting edge command line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. Conclusions These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq. PMID:20828407

  19. Software engineering laboratory series: Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon

    1992-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  20. Laboratory Molecular Astrophysics as an Invaluable Tool in understanding Astronomical Observations.

    NASA Astrophysics Data System (ADS)

    Fraser, Helen Jane

    2015-08-01

    We are entering the decade of molecular astrochemistry: spectroscopic data pertaining to the interactions between baryonic matter and electromagnetic radiation are now at the forefront of astronomical observations. Elucidating such data relies on inputs from laboratory experiments, modeling, and theoretical chemistry and physics, a field that is intended to be a key focus for the proposed new commission in Laboratory Astrophysics. Here, we propose a "tour de force" review of some recent successes in molecular astrophysics since the last GA, particularly those that have been directly facilitated by laboratory data in astrochemistry. It is vital to highlight to astronomers that the absence of laboratory data from the literature would otherwise have precluded advances in our astronomical understanding, e.g.: the detection of gas-phase water deep in pre-stellar cores; the detection of water and other molecular species in gravitationally lensed galaxies at z~6; "jumps" in the appearance or disappearance of molecules, including the very recent detection of the first branched organic molecule in the ISM, iso-propyl cyanide; disentangling dense spectroscopic features in the sub-mm as measured by ALMA, Herschel and SOFIA, the so-called "weeds" and "flowers"; and the first "image" of a CO snow line in a protoplanetary disk. Looking forward, the advent of high spatial and spectral resolution telescopes, particularly ALMA, SKA, E-ELT and JWST, will continue to drive forward the needs and interests of laboratory astrochemistry in the coming decade. We look forward to five key areas where advances are expected and both observational and laboratory techniques are evolving: (a) understanding star-forming regions at very high spatial and spectral sensitivity and resolution; (b) extragalactic astrochemistry; (c) (exo-)planetary atmospheres, surfaces and Solar System sample return, linking interstellar and planetary chemistry; (d) astrobiology - linking simple molecular

  1. Digitizing and Securing Archived Laboratory Notebooks

    ERIC Educational Resources Information Center

    Caporizzo, Marilyn

    2008-01-01

    The Information Group at Millipore has been successfully using a digital rights management tool to secure the email distribution of archived laboratory notebooks. Millipore is a life science leader providing cutting-edge technologies, tools, and services for bioscience research and biopharmaceutical manufacturing. Consisting of four full-time…

  2. Integrating Reservations and Queuing in Remote Laboratory Scheduling

    ERIC Educational Resources Information Center

    Lowe, D.

    2013-01-01

    Remote laboratories (RLs) are increasingly seen as a useful tool for supporting flexible shared access to scarce laboratory resources. An important element of supporting shared access is coordinating the scheduling of laboratory usage. Optimized scheduling can significantly decrease access waiting times and improve the utilization level…

  3. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source, multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard format (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. #LancerHealth: Using Twitter and Instagram as a tool in a campus wide health promotion initiative.

    PubMed

    Santarossa, Sara; Woodruff, Sarah J

    2018-02-05

    The present study aimed to explore the use of popular technology that people already own as a health promotion tool, in a campus-wide social media health promotion initiative entitled #LancerHealth. During a two-week period, the university community was asked to share photos on Twitter and Instagram of "What does being healthy on campus look like to you?", while tagging the image with #LancerHealth. All publicly tagged media were collected using the Netlytic software and analysed. Text analysis (N=234 records, Twitter; N=141 records, Instagram) revealed that the majority of the conversation was positive and focused on health and the university. Social network analysis, based on five network properties, showed a small network with little interaction. Lastly, photo coding analysis (N=71 unique images) indicated that the majority of the shared images depicted physical activity (52%) and were taken on campus (80%). Further research into this area is warranted.

  5. Antibiogramj: A tool for analysing images from disk diffusion tests.

    PubMed

    Alonso, C A; Domínguez, C; Heras, J; Mata, E; Pascual, V; Torres, C; Zarazaga, M

    2017-05-01

    Disk diffusion testing, known as the antibiogram, is widely applied in microbiology to determine the antimicrobial susceptibility of microorganisms. The measurement of the diameter of the zone of growth inhibition of microorganisms around the antimicrobial disks in the antibiogram is frequently performed manually by specialists using a ruler. This is a time-consuming and error-prone task that might be simplified using automated or semi-automated inhibition zone readers. However, most readers are expensive instruments with embedded software that require significant changes in laboratory design and workflow. Based on the workflow employed by specialists to determine the antimicrobial susceptibility of microorganisms, we have designed a software tool that semi-automatises the process from images of disk diffusion tests. Standard computer vision techniques are employed to achieve this automatisation. We present AntibiogramJ, a user-friendly, open-source software tool to semi-automatically determine, measure and categorise inhibition zones in images from disk diffusion tests. AntibiogramJ is implemented in Java and deals with images captured with any device that incorporates a camera, including digital cameras and mobile phones. The fully automatic procedure of AntibiogramJ for measuring inhibition zones achieves an overall agreement of 87% with an expert microbiologist; moreover, AntibiogramJ includes features to easily detect when the automatic reading is not correct and to fix it manually to obtain the correct result. AntibiogramJ is a user-friendly, platform-independent, open-source and free tool that, to the best of our knowledge, is the most complete software tool for antibiogram analysis without requiring any investment in new equipment or changes in the laboratory. Copyright © 2017 Elsevier B.V. All rights reserved.
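The core measurement that such readers automate, estimating the diameter of a roughly circular inhibition zone, can be illustrated without any vision library: given a binary mask of the segmented zone, the equivalent-circle diameter follows from the pixel area. This is a simplified sketch under that assumption, not AntibiogramJ's actual algorithm:

```python
import math

def zone_diameter_px(mask):
    """Estimate the equivalent-circle diameter (in pixels) of an
    inhibition zone from a binary mask (1 = inhibited, 0 = growth)."""
    area = sum(v for row in mask for v in row)  # pixel count of the zone
    return 2.0 * math.sqrt(area / math.pi)      # d = 2 * sqrt(A / pi)

# Synthetic mask: a disk of radius 10 centred in a 25x25 grid
r, size = 10, 25
mask = [[1 if (x - 12) ** 2 + (y - 12) ** 2 <= r * r else 0
         for x in range(size)] for y in range(size)]
print(round(zone_diameter_px(mask), 1))  # close to 20, the true diameter
```

In a real pipeline the mask would come from thresholding a photograph; converting pixels to millimetres then only needs the image's known scale.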

  6. Final Report for DE-SC0002298 Agency Number: DE-PS02-09ER09-01 An Advanced Network and distributed Storage Laboratory (ANDSL) for Data Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron

    2014-08-17

    The original intent of this project was to build and operate an Advanced Network and Distributed Storage Laboratory (ANDSL) for Data Intensive Science that would prepare the Open Science Grid (OSG) community for a new generation of wide area communication capabilities operating at a 100Gb rate. Given the significant cut in our proposed budget, we changed the scope of the ANDSL to focus on the software aspects of the laboratory - workload generators and monitoring tools - and on the offering of experimental data to the ANI project. The main contributions of our work are twofold: early end-user input and experimental data for the ANI project, and software tools for conducting large-scale end-to-end data placement experiments.

  7. Engineering Laboratory Instruction in Virtual Environment--"eLIVE"

    ERIC Educational Resources Information Center

    Chaturvedi, Sushil; Prabhakaran, Ramamurthy; Yoon, Jaewan; Abdel-Salam, Tarek

    2011-01-01

    A novel application of web-based virtual laboratories to prepare students for physical experiments is explored in some detail. The pedagogy of supplementing physical laboratory with web-based virtual laboratories is implemented by developing a web-based tool, designated in this work as "eLIVE", an acronym for Engineering Laboratory…

  8. Laboratory Diagnosis of Zika Virus Infection.

    PubMed

    Landry, Marie Louise; St George, Kirsten

    2017-01-01

    The rapid and accurate diagnosis of Zika virus infection is an international priority. This review covers current recommendations, methods, limitations, and priorities for Zika virus testing. Sources include published literature, public health recommendations, laboratory procedures, and testing experience. Until recently, the laboratory diagnosis of Zika infection was confined to public health or research laboratories that prepared their own reagents, and test capacity has been limited. Furthermore, Zika cross-reacts serologically with other flaviviruses, such as dengue, West Nile, and yellow fever. Current or past infection, or even vaccination with another flavivirus, will often cause false-positive or uninterpretable Zika serology results. Detection of viral RNA during acute infection using nucleic acid amplification tests provides more specific results, and a number of commercial nucleic acid amplification tests have received emergency use authorization. In addition to serum, testing of whole blood and urine is recommended because of the higher viral loads and longer duration of shedding. However, nucleic acid amplification testing has limited utility because many patients are asymptomatic or present for testing after the brief period of Zika shedding has passed. Thus, the greatest need and most difficult challenge is the development of accurate antibody tests for the diagnosis of recent Zika infection. Research is urgently needed to identify Zika virus epitopes that do not cross-react with other flavivirus antigens. New information is emerging at a rapid pace and, with ongoing public-private and international collaborations and government support, it is hoped that rapid progress will be made in developing robust and widely applicable diagnostic tools.

  9. Powder X-ray diffraction laboratory, Reston, Virginia

    USGS Publications Warehouse

    Piatak, Nadine M.; Dulong, Frank T.; Jackson, John C.; Folger, Helen W.

    2014-01-01

    The powder x-ray diffraction (XRD) laboratory is managed jointly by the Eastern Mineral and Environmental Resources and Eastern Energy Resources Science Centers. Laboratory scientists collaborate on a wide variety of research problems involving other U.S. Geological Survey (USGS) science centers and government agencies, universities, and industry. Capabilities include identification and quantification of crystalline and amorphous phases, and crystallographic and atomic structure analysis for a wide variety of sample media. Customized laboratory procedures and analyses commonly are used to characterize non-routine samples including, but not limited to, organic and inorganic components in petroleum source rocks, ore and mine waste, clay minerals, and glassy phases. Procedures can be adapted to meet a variety of research objectives.

  10. Tools for Achieving TQE.

    ERIC Educational Resources Information Center

    Latta, Raymond F.; Downey, Carolyn J.

    This book presents a wide array of sophisticated problem-solving tools and shows how to use them in a humanizing way that involves all stakeholders in the process. Chapter 1 develops the rationale for educational stakeholders to consider quality tools. Chapter 2 highlights three quality group-process tools--brainstorming, the nominal group…

  11. The effectiveness of digital microscopy as a teaching tool in medical laboratory science curriculum.

    PubMed

    Castillo, Demetra

    2012-01-01

    A fundamental component of the practice of Medical Laboratory Science (MLS) is the microscope. While traditional microscopy (TM) is the gold standard, its high maintenance cost has led to increased demand for alternative methods such as digital microscopy (DM), in which slides containing blood specimens are converted into a digital form that can be reviewed with computer-driven software. The aim of this study was to investigate the effectiveness of digital microscopy as a teaching tool in the field of Medical Laboratory Science. Participants reviewed known study slides using both traditional and digital microscopy methods and were assessed using both methods. Participants were randomly divided into two groups: Group 1 used TM as the primary method and DM as the alternate; Group 2 used DM as the primary and TM as the alternate. Participants performed differentials with their primary method, were assessed with both methods, and then performed differentials with their alternate method. A detailed assessment rubric was created to determine the accuracy of student responses through comparison with clinical laboratory and instructor results; student scores were expressed as percentages correct. The assessment was conducted over two different classes. Within each group, independent of the primary method used, results did not differ significantly between methods. However, when comparing methods between groups, Group 1 (n = 11) (TM = 73.79% +/- 9.19, DM = 81.43% +/- 8.30; paired t10 = 0.182, p < 0.001) showed a significant difference from Group 2 (n = 14) (TM = 85.64% +/- 5.30, DM = 85.91% +/- 7.62; paired t13 = 3.647, p = 0.860). In the subsequent class, results between both groups (n = 13, n = 16, respectively) did not show any significant difference (Group 1 TM = 86.38% +/- 8.17, Group 1 DM = 88.69% +/- 3.86; paired t12 = 1.253, p = 0.234; Group 2 TM = 86.75% +/- 5.37, Group 2 DM = 86.25% +/- 7.01, paired t15 = 0.280, p
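The paired t statistics reported above compare each student's score under the two methods. The computation is simple enough to sketch with the standard library; the scores below are illustrative, not the study's data:

```python
import math
import statistics

def paired_t(before, after):
    """Paired t statistic: mean of the per-subject differences
    divided by the standard error of those differences."""
    diffs = [b - a for a, b in zip(before, after)]
    n = len(diffs)
    se = statistics.stdev(diffs) / math.sqrt(n)  # sample std dev (ddof=1)
    return statistics.mean(diffs) / se

# Hypothetical TM vs DM percent-correct scores for four students
tm = [73, 81, 78, 80]
dm = [75, 84, 79, 83]
print(round(paired_t(tm, dm), 2))  # t statistic with n-1 = 3 degrees of freedom
```

The p value would then come from the t distribution with n-1 degrees of freedom (e.g. `scipy.stats.ttest_rel` does both steps in one call).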

  12. Raising Virtual Laboratories in Australia onto global platforms

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Barker, M.; Fraser, R.; Evans, B. J. K.; Moloney, G.; Proctor, R.; Moise, A. F.; Hamish, H.

    2016-12-01

    Across the globe, Virtual Laboratories (VLs), Science Gateways (SGs), and Virtual Research Environments (VREs) are being developed that enable users who are not co-located to actively work together at various scales to share data, models, tools, software, workflows, best practices, etc. Outcomes range from enabling `long tail' researchers to more easily access specific data collections, to facilitating complex workflows on powerful supercomputers. In Australia, government funding has facilitated the development of a range of VLs through the National eResearch Collaborative Tools and Resources (NeCTAR) program. The VLs provide highly collaborative, research-domain oriented, integrated software infrastructures that meet user community needs. Twelve VLs have been funded since 2012, including the Virtual Geophysics Laboratory (VGL); Virtual Hazards, Impact and Risk Laboratory (VHIRL); Climate and Weather Science Laboratory (CWSLab); Marine Virtual Laboratory (MarVL); and Biodiversity and Climate Change Virtual Laboratory (BCCVL). These VLs share similar technical challenges, with common issues emerging on integration of tools, applications and access data collections via both cloud-based environments and other distributed resources. While each VL began with a focus on a specific research domain, communities of practice have now formed across the VLs around common issues, and facilitate identification of best practice case studies, and new standards. As a result, tools are now being shared where the VLs access data via data services using international standards such as ISO, OGC, W3C. The sharing of these approaches is starting to facilitate re-usability of infrastructure and is a step towards supporting interdisciplinary research. Whilst the focus of the VLs are Australia-centric, by using standards, these environments are able to be extended to analysis on other international datasets. Many VL datasets are subsets of global datasets and so extension to global is a

  13. Web Delivery of Interactive Laboratories: Comparison of Three Authoring Tools

    NASA Astrophysics Data System (ADS)

    Silbar, Richard R.

    2002-04-01

    It is well-known that the more a student interacts with a subject, the better he or she will learn it. This is particularly true in technical subjects. One way to do this is to have computer-based "laboratories" in which the student manipulates objects on the screen with keyboard or mouse and then sees the outcome of those actions. One example of such a laboratory we have built, using Macromedia's Authorware, deals with the addition of two vectors in the geometric approach. The problem with Authorware, however, is that delivering the training over the Web requires the download and installation of a big plug-in. Therefore, as an experiment, I built clones of the Vector Addition Laboratory using Macromedia's Director and Flash, each of which has a smaller plug-in that is often already installed in the user's browser. The Director and Flash versions are similar to (but definitely not the same as) the Authorware version. This talk goes into these differences and demonstrates the techniques used. You can view the three examples on-line at http://www.whistlesoft.com/~silbar.

  14. Use of a collaborative tool to simplify the outsourcing of preclinical safety studies: an insight into the AstraZeneca-Charles River Laboratories strategic relationship.

    PubMed

    Martin, Frederic D C; Benjamin, Amanda; MacLean, Ruth; Hollinshead, David M; Landqvist, Claire

    2017-12-01

    In 2012, AstraZeneca entered into a strategic relationship with Charles River Laboratories whereby preclinical safety packages comprising safety pharmacology, toxicology, formulation analysis, in vivo ADME, bioanalysis and pharmacokinetics studies are outsourced. New processes were put in place to ensure seamless workflows with the aim of accelerating the delivery of new medicines to patients. Here, we describe in more detail the AstraZeneca preclinical safety outsourcing model and the way in which a collaborative tool has helped to translate the processes in AstraZeneca and Charles River Laboratories into simpler integrated workflows that are efficient and visible across the two companies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. World wide matching of registration metrology tools of various generations

    NASA Astrophysics Data System (ADS)

    Laske, F.; Pudnos, A.; Mackey, L.; Tran, P.; Higuchi, M.; Enkrich, C.; Roeth, K.-D.; Schmidt, K.-H.; Adam, D.; Bender, J.

    2008-10-01

    Turnaround time/cycle time is a key success criterion in the semiconductor photomask business. Therefore, global mask suppliers typically allocate workloads based on fab capability and utilization capacity. From a logistical point of view, the manufacturing location of a photomask should be transparent to the customer (mask user). Matching capability of production equipment, and especially of metrology tools, is considered a key enabler of cross-site manufacturing flexibility. Toppan, with manufacturing sites in eight countries worldwide, has an ongoing program to match the registration metrology systems of all its production sites. This allows for manufacturing flexibility and risk mitigation. In cooperation with Vistec Semiconductor Systems, Toppan has recently completed a program to match the Vistec LMS IPRO systems at all production sites worldwide. Vistec has developed a new software feature which allows for significantly improved matching of LMS IPRO(x) registration metrology tools of various generations. We report on the results of the global matching campaign across several of the leading Toppan sites.

  16. Video Observation as a Tool to Analyze and Modify an Electronics Laboratory

    NASA Astrophysics Data System (ADS)

    Coppens, Pieter; Van den Bossche, Johan; De Cock, Mieke

    2016-12-01

    Laboratories are an important part of science and engineering education, especially in the field of electronics. Yet very little research into the benefits of such labs to student learning exists. In particular, it is not well known what students do and, even more importantly, think during electronics laboratories. We therefore conducted a study based on video observation of second-year students at three university campuses in Belgium during a traditional lab on first-order RC filters. In this laboratory, students spent the majority of their time performing measurements, while very little time was spent processing or discussing the results. This in turn resulted in hardly any time spent talking about content knowledge. Based on those observations, a new laboratory was designed that includes a preparation with a virtual oscilloscope, a black-box approach during the lab session itself, and a form of quick reporting at the end of the lab. The adjusted laboratory was evaluated using the same methodology and was more successful in the sense that students spent less time gathering measurements and more time processing and analyzing them, resulting in more content-based discussion.

  17. iScreen: Image-Based High-Content RNAi Screening Analysis Tools.

    PubMed

    Zhong, Rui; Dong, Xiaonan; Levine, Beth; Xie, Yang; Xiao, Guanghua

    2015-09-01

    High-throughput RNA interference (RNAi) screening has opened up a path to investigating functional genomics in a genome-wide pattern. However, such studies are often restricted to assays that have a single readout format. Recently, advanced image technologies have been coupled with high-throughput RNAi screening to develop high-content screening, in which one or more cell image(s), instead of a single readout, were generated from each well. This image-based high-content screening technology has led to genome-wide functional annotation in a wider spectrum of biological research studies, as well as in drug and target discovery, so that complex cellular phenotypes can be measured in a multiparametric format. Despite these advances, data analysis and visualization tools are still largely lacking for these types of experiments. Therefore, we developed iScreen (image-Based High-content RNAi Screening Analysis Tool), an R package for the statistical modeling and visualization of image-based high-content RNAi screening. Two case studies were used to demonstrate the capability and efficiency of the iScreen package. iScreen is available for download on CRAN (http://cran.cnr.berkeley.edu/web/packages/iScreen/index.html). The user manual is also available as a supplementary document. © 2014 Society for Laboratory Automation and Screening.

  18. Investigating the Validity of Two Widely Used Quantitative Text Tools

    ERIC Educational Resources Information Center

    Cunningham, James W.; Hiebert, Elfrieda H.; Mesmer, Heidi Anne

    2018-01-01

    In recent years, readability formulas have gained new prominence as a basis for selecting texts for learning and assessment. Variables that quantitative tools count (e.g., word frequency, sentence length) provide valid measures of text complexity insofar as they accurately predict representative and high-quality criteria. The longstanding…

  19. Applications of mid-infrared spectroscopy in the clinical laboratory setting.

    PubMed

    De Bruyne, Sander; Speeckaert, Marijn M; Delanghe, Joris R

    2018-01-01

    Fourier transform mid-infrared (MIR-FTIR) spectroscopy is a nondestructive, label-free, highly sensitive and specific technique that provides complete information on the chemical composition of biological samples. The technique can both offer fundamental structural information and serve as a quantitative analysis tool. Therefore, it has many potential applications in different fields of clinical laboratory science. Although considerable technological progress has been made to promote biomedical applications of this powerful analytical technique, most clinical laboratory analyses are based on spectroscopic measurements in the visible or ultraviolet (UV) spectrum, and the potential role of FTIR spectroscopy remains largely unexplored. In this review, we present some general principles of FTIR spectroscopy as a method to study molecules in specimens by MIR radiation, together with a short overview of methods to interpret spectral data. We aim to illustrate the wide range of potential applications of the technique in the clinical laboratory setting, with a focus on its advantages and limitations, and discuss future directions. The reviewed applications of MIR spectroscopy include (1) quantification of clinical parameters in body fluids, (2) diagnosis and monitoring of cancer and other diseases by analysis of body fluids, cells, and tissues, (3) classification of clinically relevant microorganisms, and (4) analysis of kidney stones, nails, and faecal fat.
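Quantification with MIR-FTIR, as with the UV/visible methods it would complement, rests on the Beer-Lambert relation: absorbance is the negative base-10 logarithm of transmittance and scales linearly with analyte concentration and path length. A minimal sketch of that conversion, with illustrative values:

```python
import math

def absorbance(transmittance):
    """Beer-Lambert: A = -log10(T), with T the transmitted fraction
    of the incident intensity (0 < T <= 1)."""
    return -math.log10(transmittance)

# 50% of the incident IR intensity transmitted -> A ~ 0.301
print(round(absorbance(0.50), 3))
# Absorbance is additive, so doubling the path length (or the analyte
# concentration) doubles A: T = 0.25 gives A ~ 0.602
print(round(absorbance(0.25), 3))
```

Spectral interpretation then works on the absorbance spectrum, where band heights or areas can be regressed against concentration.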

  20. Wiki Laboratory Notebooks: Supporting Student Learning in Collaborative Inquiry-Based Laboratory Experiments

    ERIC Educational Resources Information Center

    Lawrie, Gwendolyn Angela; Grøndahl, Lisbeth; Boman, Simon; Andrews, Trish

    2016-01-01

    Recent examples of high-impact teaching practices in the undergraduate chemistry laboratory that include course-based undergraduate research experiences and inquiry-based experiments require new approaches to assessing individual student learning outcomes. Instructors require tools and strategies that can provide them with insight into individual…

  1. The Leverage Effect on Wealth Distribution in a Controllable Laboratory Stock Market

    PubMed Central

    Zhu, Chenge; Yang, Guang; An, Kenan; Huang, Jiping

    2014-01-01

    Wealth distribution has always been an important issue in our economic and social life, since it affects the harmony and stability of society. Against the background of the widespread use of financial tools to raise leverage in recent years, we studied the leverage effect on the wealth distribution of a population in a controllable laboratory market in which we conducted several human experiments, and concluded that higher leverage leads to a higher Gini coefficient in the market. A higher Gini coefficient means the wealth distribution among a population becomes more unequal. This is a result of the ascending risk with growing leverage level in the market, combined with the diversified trading abilities and risk preferences of the participants. This work sheds light on the effects of leverage and its related regulations, especially its impact on wealth distribution. It also shows the capability of the method of controllable laboratory markets, which could be helpful in several fields of study such as economics, econophysics and sociology. PMID:24968222
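The Gini coefficient the study uses as its inequality measure can be computed directly from a list of individual wealth values. A minimal sketch using the relative-mean-difference form (not code from the paper):

```python
def gini(wealth):
    """Gini coefficient via the relative mean absolute difference.

    0.0 means perfect equality; values approaching 1.0 mean one
    agent holds nearly all the wealth.
    """
    n = len(wealth)
    mean = sum(wealth) / n
    # Sum of absolute differences over all ordered pairs
    diff_sum = sum(abs(x - y) for x in wealth for y in wealth)
    return diff_sum / (2 * n * n * mean)

print(gini([100, 100, 100, 100]))        # 0.0: perfectly equal
print(round(gini([1, 1, 1, 97]), 2))     # 0.72: highly concentrated
```

The O(n^2) pairwise sum is fine for laboratory-scale populations; for large n one would sort and use the cumulative-share (Lorenz curve) formulation instead.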

  3. Enabling Data Intensive Science through Service Oriented Science: Virtual Laboratories and Science Gateways

    NASA Astrophysics Data System (ADS)

    Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.

    2014-12-01

    We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services that support multiple data intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000 CPU core research cloud. The development, maintenance and sustainability of VLs is best accomplished through modularisation and standardisation of interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or scientific tools that are accessible across domains. The data services can be used to manipulate, visualise and transform multiple data types whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid/easy deployment of new codes or versions of codes. The goal is to build open source libraries/collections of scientific tools, scripts and modelling codes that can be combined in specially designed deployments. Additional services in development include: provenance, publication of results, monitoring, workflow tools, etc. The generic VL infrastructure will be hosted at NCI, but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology. This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory

  4. The State Public Health Laboratory System.

    PubMed

    Inhorn, Stanley L; Astles, J Rex; Gradus, Stephen; Malmberg, Veronica; Snippes, Paula M; Wilcke, Burton W; White, Vanessa A

    2010-01-01

    This article describes the development since 2000 of the State Public Health Laboratory System in the United States. These state systems collectively are related to several other recent public health laboratory (PHL) initiatives. The first is the Core Functions and Capabilities of State Public Health Laboratories, a white paper that defined the basic responsibilities of the state PHL. Another is the Centers for Disease Control and Prevention National Laboratory System (NLS) initiative, the goal of which is to promote public-private collaboration to assure quality laboratory services and public health surveillance. To enhance the realization of the NLS, the Association of Public Health Laboratories (APHL) launched in 2004 a State Public Health Laboratory System Improvement Program. In the same year, APHL developed a Comprehensive Laboratory Services Survey, a tool to measure improvement through the decade to assure that essential PHL services are provided.

  5. [Outsourcing of clinical laboratory department].

    PubMed

    Murai, T

    2000-03-01

    Recently, to ease financial difficulties at various hospitals, outsourcing of the laboratory department is becoming more widespread. At the department of clinical pathology of St. Luke's International Hospital, the so-called "branch lab" system, one form of laboratory outsourcing, was adopted in March 1999. In this report, we describe the decision procedure for accepting this arrangement and the circumstances of its operation.

  6. Tools and Metrics for Environmental Sustainability

    EPA Science Inventory

    Within the U.S. Environmental Protection Agency’s Office of Research and Development the National Risk Management Research Laboratory has been developing tools to help design and evaluate chemical processes with a life cycle perspective. These tools include the Waste Reduction (...

  7. Globular Clusters as Laboratories for Stellar Evolution

    NASA Technical Reports Server (NTRS)

    Catelan, Marcio; Valcarce, Aldo A. R.; Sweigart, Allen V.

    2010-01-01

    Globular clusters have long been considered the closest approximation to a physicist's laboratory in astrophysics, and as such a near-ideal laboratory for (low-mass) stellar evolution. However, recent observations have cast a shadow on this long-standing paradigm, suggesting the presence of multiple populations with widely different abundance patterns, and - crucially - with widely different helium abundances as well. In this review we discuss which features of the Hertzsprung-Russell diagram may be used as helium abundance indicators, and present an overview of available constraints on the helium abundance in globular clusters.

  8. Laboratory and field measurements and evaluations of vibration at the handles of riveting hammers

    PubMed Central

    McDOWELL, THOMAS W.; WARREN, CHRISTOPHER; WELCOME, DANIEL E.; DONG, REN G.

    2015-01-01

    The use of riveting hammers can expose workers to harmful levels of hand-transmitted vibration (HTV). As a part of efforts to reduce HTV exposures through tool selection, the primary objective of this study was to evaluate the applicability of a standardized laboratory-based riveting hammer assessment protocol for screening riveting hammers. The second objective was to characterize the vibration emissions of reduced vibration riveting hammers and to make approximations of the HTV exposures of workers operating these tools in actual work tasks. Eight pneumatic riveting hammers were selected for the study. They were first assessed in a laboratory using the standardized method for measuring vibration emissions at the tool handle. The tools were then further assessed under actual working conditions during three aircraft sheet metal riveting tasks. Although the average vibration magnitudes of the riveting hammers measured in the laboratory test were considerably different from those measured in the field study, the rank orders of the tools determined via these tests were fairly consistent, especially for the lower vibration tools. This study identified four tools that consistently exhibited lower frequency-weighted and unweighted accelerations in both the laboratory and workplace evaluations. These observations suggest that the standardized riveting hammer test is acceptable for identifying tools that could be expected to exhibit lower vibrations in workplace environments. However, the large differences between the accelerations measured in the laboratory and field suggest that the standardized laboratory-based tool assessment is not suitable for estimating workplace riveting hammer HTV exposures. Based on the frequency-weighted accelerations measured at the tool handles during the three work tasks, the sheet metal mechanics assigned to these tasks at the studied workplace are unlikely to exceed the daily vibration exposure action value (2.5 m s−2) using any of the
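    The action value cited above follows the energy-equivalent convention of ISO 5349-1 (also used in the EU Physical Agents Directive): a frequency-weighted vibration total value measured over the actual daily trigger time is normalized to an 8-hour equivalent, A(8) = a_hv * sqrt(T / 8 h). A minimal sketch of that normalization; the tool value and trigger time below are hypothetical, not measurements from this study:

```python
import math

ACTION_VALUE = 2.5  # daily exposure action value for hand-transmitted vibration, m/s^2

def daily_exposure_a8(ahv: float, exposure_hours: float) -> float:
    """Normalize a frequency-weighted vibration total value ahv (m/s^2),
    measured over exposure_hours of trigger time, to the 8-hour
    energy-equivalent value A(8) per the ISO 5349-1 convention."""
    return ahv * math.sqrt(exposure_hours / 8.0)

# hypothetical riveting hammer: 4.0 m/s^2 at the handle, 2 h of trigger time per day
a8 = daily_exposure_a8(4.0, 2.0)
print(f"A(8) = {a8:.2f} m/s^2, exceeds action value: {a8 > ACTION_VALUE}")
```

    With these assumed numbers, A(8) = 4.0 x sqrt(2/8) = 2.0 m s−2, below the 2.5 m s−2 action value.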

  9. Developing a Remote Laboratory for Engineering Education

    ERIC Educational Resources Information Center

    Fabregas, E.; Farias, G.; Dormido-Canto, S.; Dormido, S.; Esquembre, F.

    2011-01-01

    New information technologies provide great opportunities for education. One such opportunity is the use of remote control laboratories for teaching students about control systems. This paper describes the creation of interactive remote laboratories (RLs). Two main software tools are used: Simulink and Easy Java Simulations (EJS). The first is a…

  10. Oscillation Baselining and Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PNNL developed a new tool for oscillation analysis and baselining. This tool was developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, “Suite of open-source applications and models for advanced synchrophasor analysis”) and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
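    As a rough illustration of the mode identification the record refers to (frequency and damping of an oscillation), the logarithmic-decrement method can recover both from a single ringdown. This is a simplified stand-in, not OBAT's actual algorithm, and the 0.5 Hz, 5%-damped mode below is synthetic:

```python
import math

def ringdown_modes(signal, dt):
    """Estimate the frequency (Hz) and damping ratio of a dominant
    oscillatory mode from two successive positive peaks, using the
    logarithmic decrement."""
    # locate local maxima of the sampled signal
    peaks = [(i * dt, v) for i, v in enumerate(signal[1:-1], start=1)
             if signal[i - 1] < v > signal[i + 1]]
    (t0, a0), (t1, a1) = peaks[0], peaks[1]
    freq = 1.0 / (t1 - t0)                      # one damped period between peaks
    delta = math.log(a0 / a1)                   # logarithmic decrement
    zeta = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)
    return freq, zeta

# synthetic 0.5 Hz inter-area-style mode with 5% damping, sampled at 50 Hz
dt, f, z = 0.02, 0.5, 0.05
wd = 2 * math.pi * f                            # damped angular frequency
wn = wd / math.sqrt(1 - z ** 2)                 # natural angular frequency
sig = [math.exp(-z * wn * n * dt) * math.sin(wd * n * dt) for n in range(500)]
f_est, z_est = ringdown_modes(sig, dt)
print(f"f = {f_est:.3f} Hz, zeta = {z_est:.3f}")
```

    On the synthetic signal the estimates recover the injected 0.5 Hz frequency and 0.05 damping ratio; real PMU data would first require detrending and mode separation.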

  11. Lean management and medical laboratory: application in transfusional immuno-hematology.

    PubMed

    Thibert, Jean-Baptiste; Le Vacon, Françoise; Danic, Bruno

    2017-10-01

    Despite common use in industrial applications, only a few studies describe lean management methods in the medical laboratory. These tools were evaluated in a blood donor analysis laboratory, particularly in the immuno-hematology sector. The aim was to optimize the organization and maintain team cohesion and strong staff involvement in a restructuring context. The tools used and the results obtained are presented in this study.

  12. Evaluation of Calibration Laboratories Performance

    NASA Astrophysics Data System (ADS)

    Filipe, Eduarda

    2011-12-01

    One of the main goals of interlaboratory comparisons (ILCs) is the evaluation of laboratories' performance for the routine calibrations they perform for clients. In the framework of laboratory accreditation, the national accreditation boards (NABs), in collaboration with the national metrology institutes (NMIs), organize the ILCs needed to comply with the requirements of the international accreditation organizations. For an ILC to be a reliable tool for a laboratory to validate its best measurement capability (BMC), the NMI (reference laboratory) must provide a traveling standard that is better, in terms of accuracy class or uncertainty, than the laboratories' BMCs. Although this is the general situation, there are cases where the NABs ask the NMIs to evaluate the performance of accredited laboratories when calibrating industrial measuring instruments. The aim of this article is to discuss the existing approaches for the evaluation of ILCs and to propose a basis for the validation of laboratories' measurement capabilities. An example is drafted with the evaluation of the results of a mercury-in-glass thermometer ILC with 12 participating laboratories.

  13. Ground Data System Analysis Tools to Track Flight System State Parameters for the Mars Science Laboratory (MSL) and Beyond

    NASA Technical Reports Server (NTRS)

    Allard, Dan; Deforrest, Lloyd

    2014-01-01

    Flight software parameters give space mission operators fine-tuned control over flight system configurations, enabling rapid and dynamic changes to ongoing science activities in a much more flexible manner than can be accomplished with (otherwise broadly used) configuration-file-based approaches. The Mars Science Laboratory (MSL), Curiosity, makes extensive use of parameters to support complex, daily activities via commanded changes to said parameters in memory. However, as the loss of Mars Global Surveyor (MGS) in 2006 demonstrated, flight system management by parameters brings with it risks, including the possibility of losing track of the flight system configuration and the threat of invalid command executions. To mitigate this risk, a growing number of missions have funded efforts to implement parameter state tracking software tools and services, including MSL and the Soil Moisture Active Passive (SMAP) mission. This paper discusses the engineering challenges and resulting software architecture of MSL's onboard parameter state tracking software, and the road forward to make parameter management tools suitable for use on multiple missions.

  14. Calgary Laboratory Services

    PubMed Central

    2015-01-01

    Calgary Laboratory Services provides global hospital and community laboratory services for Calgary and surrounding areas (population 1.4 million) and global academic support for the University of Calgary Cumming School of Medicine. It developed rapidly after the Alberta Provincial Government implemented an austerity program to address rising health care costs and to address Alberta’s debt and deficit in 1994. Over roughly the next year, all hospital and community laboratory test funding within the province was put into a single budget, fee codes for fee-for-service test billing were closed, roughly 40% of the provincial laboratory budget was cut, and roughly 40% of the pathologists left the province of Alberta. In Calgary, in the face of these abrupt changes in the laboratory environment, private laboratories, publicly funded hospital laboratories and the medical school department precipitously and reluctantly merged in 1996. The origin of Calgary Laboratory Services was likened to an “unhappy shotgun marriage” by all parties. Although such a structure could save money by eliminating duplicated services and excess capacity and could provide excellent city-wide clinical service by increasing standardization, it was less clear whether it could provide strong academic support for a medical school. Over the past decade, iterations of the Calgary Laboratory Services model have been implemented or are being considered in other Canadian jurisdictions. This case study analyzes the evolution of Calgary Laboratory Services, provides a metric-based review of academic performance over time, and demonstrates that this model, essentially arising as an unplanned experiment, has merit within a Canadian health care context. PMID:28725754

  15. Exposure to hazardous substances in a standard molecular biology laboratory environment: evaluation of exposures in IARC laboratories.

    PubMed

    Chapot, Brigitte; Secretan, Béatrice; Robert, Annie; Hainaut, Pierre

    2009-07-01

    Working in a molecular biology laboratory environment implies regular exposure to a wide range of hazardous substances. Several recent studies have shown that laboratory workers may have an elevated risk of certain cancers. Data on the nature and frequency of exposures in such settings are scanty. The frequency of use of 163 agents by staff working in molecular biology laboratories was evaluated over a period of 4 years by self-administered questionnaire. Of the agents listed, ethanol was used by the largest proportion of staff (70%), followed by ethidium bromide (55%). Individual use fell into three patterns, namely (i) frequent use of a narrow range of products, (ii) occasional use of a wide range of products, and (iii) frequent and occasional use of an intermediate range of products. Among known or suspected carcinogens (International Agency for Research on Cancer Group 1 and 2A, respectively), those most frequently used included formaldehyde (17%), oncogenic viruses (4%), and acrylamide (32%). The types of exposure encountered in research laboratories are extremely diverse. Few carcinogenic agents are used frequently, but many laboratory workers may be exposed occasionally to known human carcinogens. In addition, many of the chemicals handled by staff represent a health hazard. The results enabled the staff physician to develop an individual approach to medical surveillance and to draw up a personal history of occupational exposures for laboratory staff.

  16. Improving quality management systems of laboratories in developing countries: an innovative training approach to accelerate laboratory accreditation.

    PubMed

    Yao, Katy; McKinney, Barbara; Murphy, Anna; Rotz, Phil; Wafula, Winnie; Sendagire, Hakim; Okui, Scolastica; Nkengasong, John N

    2010-09-01

    The Strengthening Laboratory Management Toward Accreditation (SLMTA) program was developed to promote immediate, measurable improvement in laboratories of developing countries. The laboratory management framework, a tool that prescribes managerial job tasks, forms the basis of the hands-on, activity-based curriculum. SLMTA is implemented through multiple workshops with intervening site visits to support improvement projects. To evaluate the effectiveness of SLMTA, the laboratory accreditation checklist was developed and subsequently adopted by the World Health Organization Regional Office for Africa (WHO AFRO). The SLMTA program and the implementation model were validated through a pilot in Uganda. SLMTA yielded observable, measurable results in the laboratories and improved patient flow and turnaround time in a laboratory simulation. The laboratory staff members were empowered to improve their own laboratories by using existing resources, communicate with clinicians and hospital administrators, and advocate for system strengthening. The SLMTA program supports laboratories by improving management and building preparedness for accreditation.

  17. The Vanderbilt University nanoscale science and engineering fabrication laboratory

    NASA Astrophysics Data System (ADS)

    Hmelo, Anthony B.; Belbusti, Edward F.; Smith, Mark L.; Brice, Sean J.; Wheaton, Robert F.

    2005-08-01

    Vanderbilt University has realized the design and construction of a 1635 sq. ft. Class 10,000 cleanroom facility to support the wide-ranging research mission associated with the Vanderbilt Institute for Nanoscale Science and Engineering (VINSE). By design we have brought together disparate technologies and researchers formerly dispersed across the campus to work together in a small contiguous space intended to foster interaction and synergy of nano-technologies not often found in close proximity. The space hosts a variety of tools for lithographic patterning of substrates, the deposition of thin films, the synthesis of diamond nanostructures and carbon nanotubes, and a variety of reactive ion etchers for the fabrication of nanostructures on silicon substrates. In addition, a separate 911 sq. ft. chemistry laboratory supports nanocrystal synthesis and the investigation of biomolecular films. The design criteria required an integrated space that would support the scientific agenda of the laboratory while satisfying all applicable code and safety concerns. This project required the renovation of pre-existing laboratory space with minimal disruption to ongoing activities in a mixed-use building, while meeting the requirements of the 2000 edition of the International Building Code for the variety of potentially hazardous processes that have been programmed for the space. In this paper we describe how architectural and engineering challenges were met in the areas of mitigating floor vibration issues, shielding our facility against EMI emanations, design of the contamination control facility itself, chemical storage and handling, toxic gas use and management, as well as mechanical, electrical, plumbing, lab security, fire and laboratory safety issues.

  18. Implementation research: a mentoring programme to improve laboratory quality in Cambodia

    PubMed Central

    Voeurng, Vireak; Sek, Sophat; Song, Sophanna; Vong, Nora; Tous, Chansamrach; Flandin, Jean-Frederic; Confer, Deborah; Costa, Alexandre; Martin, Robert

    2016-01-01

    Objective To implement a mentored laboratory quality stepwise implementation (LQSI) programme to strengthen the quality and capacity of Cambodian hospital laboratories. Methods We recruited four laboratory technicians to be mentors and trained them in mentoring skills, laboratory quality management practices and International Organization for Standardization (ISO) 15189 requirements for medical laboratories. Separately, we trained staff from 12 referral hospital laboratories in laboratory quality management systems, followed by tri-weekly in-person mentoring on quality management systems implementation using the LQSI tool, which is aligned with the ISO 15189 standard. The tool was adapted from a web-based resource into a software-based spreadsheet checklist, which includes a detailed action plan and can be used to qualitatively monitor each laboratory’s progress. The tool – translated into Khmer – included a set of quality improvement activities grouped into four phases of increasing complexity. Project staff reviewed the laboratories’ progress and challenges in weekly conference calls and bi-monthly meetings with focal points of the health ministry, participating laboratories and local partners. We present the achievements in implementation from September 2014 to March 2016. Findings As of March 2016, the 12 laboratories had completed 74–90% of the 104 activities in phase 1, 53–78% of the 178 activities in phase 2, and 18–26% of the 129 activities in phase 3. Conclusion Regular on-site mentoring of laboratories using a detailed action plan in the local language allows staff to learn concepts of a quality management system on the job, without disruption to laboratory service provision. PMID:27843164

  19. Laboratory quality improvement in Tanzania.

    PubMed

    Andiric, Linda R; Massambu, Charles G

    2015-04-01

    The article describes the implementation and improvement in the first groups of medical laboratories in Tanzania selected to participate in the training program on Strengthening Laboratory Management Toward Accreditation (SLMTA). As in many other African nations, the selected improvement plan consisted of formalized hands-on training (SLMTA) that teaches the tasks and skills of laboratory management and provides the tools for implementation of best laboratory practice. Implementation of the improvements learned during training was verified before and after SLMTA with the World Health Organization African Region Stepwise Laboratory Improvement Process Towards Accreditation checklist. During a 4-year period, the selected laboratories described in this article demonstrated improvement with a range of 2% to 203% (cohort I) and 12% to 243% (cohort II) over baseline scores. The article describes the progress made in Tanzania's first cohorts, the obstacles encountered, and the lessons learned during the pilot and subsequent implementations. Copyright© by the American Society for Clinical Pathology.

  20. National Survey of Adult and Pediatric Reference Intervals in Clinical Laboratories across Canada: A Report of the CSCC Working Group on Reference Interval Harmonization.

    PubMed

    Adeli, Khosrow; Higgins, Victoria; Seccombe, David; Collier, Christine P; Balion, Cynthia M; Cembrowski, George; Venner, Allison A; Shaw, Julie

    2017-11-01

    Reference intervals are widely used decision-making tools in laboratory medicine, serving as health-associated standards to interpret laboratory test results. Numerous studies have shown wide variation in reference intervals, even between laboratories using assays from the same manufacturer. Lack of consistency in either sample measurement or reference intervals across laboratories challenges the expectation of standardized patient care regardless of testing location. Here, we present data from a national survey conducted by the Canadian Society of Clinical Chemists (CSCC) Reference Interval Harmonization (hRI) Working Group that examines variation in laboratory reference sample measurements, as well as pediatric and adult reference intervals currently used in clinical practice across Canada. Data on reference intervals currently used by 37 laboratories were collected through a national survey to examine the variation in reference intervals for seven common laboratory tests. Additionally, 40 clinical laboratories participated in a baseline assessment by measuring six analytes in a reference sample. Of the seven analytes examined, alanine aminotransferase (ALT), alkaline phosphatase (ALP), and creatinine reference intervals were most variable. As expected, reference interval variation was more substantial in the pediatric population and varied between laboratories using the same instrumentation. Reference sample results differed between laboratories, particularly for ALT and free thyroxine (FT4). Reference interval variation was greater than test result variation for the majority of analytes. It is evident that there is a critical lack of harmonization in laboratory reference intervals, particularly for the pediatric population. Furthermore, the observed variation in reference intervals across instruments cannot be explained by the bias between the results obtained on instruments by different manufacturers. 
Copyright © 2017 The Canadian Society of Clinical Chemists.
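    For context on how such reference intervals are typically derived, the nonparametric rank method described in CLSI EP28-style guidance takes the bounds enclosing the central 95% of results from healthy reference individuals. A minimal sketch; the input values are placeholders for illustration, not survey data:

```python
def nonparametric_reference_interval(values, central=0.95):
    """Nonparametric reference interval via the rank method: the
    bounds enclosing the central fraction (default 95%) of results
    from a reference population."""
    data = sorted(values)
    n = len(data)
    lower_rank = max(1, round((1 - central) / 2 * (n + 1)))
    upper_rank = min(n, round((1 + central) / 2 * (n + 1)))
    return data[lower_rank - 1], data[upper_rank - 1]

# placeholder results (U/L) from 1000 hypothetical reference individuals
results = list(range(1, 1001))
lo, hi = nonparametric_reference_interval(results)
print(f"reference interval: {lo}-{hi} U/L")
```

    Variation between laboratories then arises from differences in the reference populations sampled, the analytical bias of the instruments, and the partitioning (e.g. pediatric age groups) applied before this calculation.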

  1. Laboratory automation: trajectory, technology, and tactics.

    PubMed

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  2. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology, and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  3. The mass storage testing laboratory at GSFC

    NASA Technical Reports Server (NTRS)

    Venkataraman, Ravi; Williams, Joel; Michaud, David; Gu, Heng; Kalluri, Atri; Hariharan, P. C.; Kobler, Ben; Behnke, Jeanne; Peavey, Bernard

    1998-01-01

    Industry-wide benchmarks exist for measuring the performance of processors (SPECmarks) and of database systems (Transaction Processing Council). Despite storage having become the dominant item in computing and IT (information technology) budgets, no such common benchmark is available in the mass storage field. Vendors and consultants provide services and tools for capacity planning and sizing, but these do not account for the complete set of metrics needed in today's archives. The availability of automated tape libraries, high-capacity RAID systems, and high-bandwidth interconnectivity between processor and peripherals has led to demands for services which traditional file systems cannot provide. File Storage and Management Systems (FSMS), which began to be marketed in the late 1980s, have helped to some extent with large tape libraries, but their use has introduced additional parameters affecting performance. The aim of the Mass Storage Test Laboratory (MSTL) at Goddard Space Flight Center is to develop a test suite that includes not only a comprehensive checklist to document a mass storage environment but also benchmark code. Benchmark code is being tested which will provide measurements both for baseline systems, i.e. applications interacting with peripherals through the operating system services, and for combinations involving an FSMS. The benchmarks are written in C and are easily portable. They are initially being aimed at the UNIX open systems world. Measurements are being made using a Sun Ultra 170 Sparc with 256 MB memory running Solaris 2.5.1 with the following configuration: 4 mm tape stacker on SCSI 2 Fast/Wide; 4 GB disk device on SCSI 2 Fast/Wide; and Sony Petaserve on Fast/Wide differential SCSI 2.

  4. Using Mobile Devices for Motor-Learning Laboratory Exercises

    ERIC Educational Resources Information Center

    Hill, Kory

    2014-01-01

    When teaching motor-learning concepts, laboratory experiments can be valuable tools for promoting learning. In certain circumstances, traditional laboratory exercises are often impractical due to facilities, time, or cost. Inexpensive or free applications (apps) that run on mobile devices can serve as useful alternatives. This article details…

  5. The widely distributed hard tick, Haemaphysalis longicornis, can retain canine parvovirus, but not be infected in laboratory condition

    PubMed Central

    MORI, Hiroyuki; TANAKA, Tetsuya; MOCHIZUKI, Masami

    2014-01-01

    Ticks are known to transmit various pathogens, radically threatening humans and animals. Despite the close contact between ticks and viruses, our understanding of their interaction and biology is still lacking. The aim of this study was to experimentally assess the interaction between canine parvovirus (CPV) and a widely distributed hard tick, Haemaphysalis longicornis, under laboratory conditions. After inoculation of CPV into the hemocoel of the ticks, polymerase chain reaction assay revealed that CPV persisted in inoculated unfed adult female ticks for 28 days. Canine parvovirus was recovered from the inoculated ticks using a cell culture, indicating that the virus remained intact in the ticks after inoculation, but a significant positive reaction indicating virus infection was not detected in the tick organs by immunofluorescence antibody test using a monoclonal antibody. In the case of ticks inoculated with feline leukemia virus, the virus persisted for a shorter time in the ticks than CPV. These findings provide important information on the characteristic interaction of ticks with a non-tick-borne virus. PMID:25650060

  6. Bootstrapping Methods Applied for Simulating Laboratory Works

    ERIC Educational Resources Information Center

    Prodan, Augustin; Campean, Remus

    2005-01-01

    Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…
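    Independent of the Java tooling the record describes, the core bootstrapping idea such e-tools simulate can be sketched in a few lines: resample the data with replacement, recompute the statistic on each resample, and read a confidence interval off the percentiles of the resampled statistics. The sample values below are hypothetical:

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, n_resamples=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a statistic:
    resample with replacement, recompute the statistic on each
    resample, and take the alpha/2 and 1-alpha/2 quantiles."""
    rng = random.Random(seed)
    n = len(sample)
    stats = sorted(stat([rng.choice(sample) for _ in range(n)])
                   for _ in range(n_resamples))
    lo = stats[int(alpha / 2 * n_resamples)]
    hi = stats[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

# hypothetical repeated measurements from a simulated laboratory exercise
measurements = [9.8, 10.1, 10.0, 9.9, 10.3, 10.2, 9.7, 10.0]
lo, hi = bootstrap_ci(measurements)
print(f"95% bootstrap CI for the mean: ({lo:.2f}, {hi:.2f})")
```

    Because the resampling is purely computational, the same routine works for statistics (medians, trimmed means) that have no closed-form sampling distribution, which is what makes the method attractive for simulated laboratory work.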

  7. Laboratory Resources Management in Manufacturing Systems Programs

    ERIC Educational Resources Information Center

    Obi, Samuel C.

    2004-01-01

    Most, if not all, industrial technology (IT) programs have laboratories or workshops. Often equipped with modern equipment, tools, materials, and measurement and test instruments, these facilities constitute a major investment for IT programs. Improper use or over use of program facilities may result in dirty lab equipment, lost or damaged tools,…

  8. Tool time: Gender and students' use of tools, control, and authority

    NASA Astrophysics Data System (ADS)

    Jones, M. Gail; Brader-Araje, Laura; Carboni, Lisa Wilson; Carter, Glenda; Rua, Melissa J.; Banilower, Eric; Hatch, Holly

    2000-10-01

    In this study, we examined how students used science equipment and tools in constructing knowledge during science instruction. Within a geographical metaphor, we focused on how students use tools when constructing new knowledge, how control of tools is actualized from pedagogical perspectives, how language and tool accessibility intersect, how gender intersects with tool use, and how competition for resources impacts access to tools. Sixteen targeted students from five elementary science classes were observed for 3 days of instruction. Results showed gender differences in students' use of exclusive language and commands, as well as in the ways students played and tinkered with tools. Girls tended to carefully follow the teacher's directions during the laboratory and did little playing or tinkering with science tools. Male students tended to use tools in inventive and exploratory ways. Results also showed that whether or not a student had access to his or her own materials became indicative of the type of verbal interactions that took place during the science investigation. Gender-related patterns in how tools are shared, how dyads relate to the materials and each other, and how materials are used to build knowledge are described.

  9. Computational simulation of laboratory-scale volcanic jets

    NASA Astrophysics Data System (ADS)

    Solovitz, S.; Van Eaton, A. R.; Mastin, L. G.; Herzog, M.

    2017-12-01

    Volcanic eruptions produce ash clouds that may travel great distances, significantly impacting aviation and communities downwind. Atmospheric hazard forecasting relies partly on numerical models of the flow physics, which incorporate data from eruption observations and analogue laboratory tests. As numerical tools continue to increase in complexity, they must be validated to fine-tune their effectiveness. Since eruptions are relatively infrequent and challenging to observe in great detail, analogue experiments can provide important insights into expected behavior over a wide range of input conditions. Unfortunately, laboratory-scale jets cannot easily attain the high Reynolds numbers (~10⁹) of natural volcanic eruption columns. Comparisons between the computational models and analogue experiments can help bridge this gap. In this study, we investigate a 3-D volcanic plume model, the Active Tracer High-resolution Atmospheric Model (ATHAM), which has been used to simulate a variety of eruptions. However, it has not been previously validated using laboratory-scale data. We conducted numerical simulations of three flows that we have studied in the laboratory: a vertical jet in a quiescent environment, a vertical jet in horizontal cross flow, and a particle-laden jet. We considered Reynolds numbers from 10,000 to 50,000, jet-to-cross flow velocity ratios of 2 to 10, and particle mass loadings of up to 25% of the exit mass flow rate. Vertical jet simulations produce Gaussian velocity profiles in the near exit region by 3 diameters downstream, matching the mean experimental profiles. Simulations of air entrainment are of the correct order of magnitude, but they show decreasing entrainment with vertical distance from the vent. Cross flow simulations reproduce experimental trajectories for the jet centerline initially, although confinement appears to impact the response later. Particle-laden simulations display minimal variation in concentration profiles between cases with…
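    The Gaussian near-exit velocity profile described above is conventionally checked by fitting the self-similar form u(r) = Uc * exp(-(r/b)^2) to profile data. The sketch below illustrates such a fit on synthetic "measurements" (not the study's data); the centerline velocity and width values are invented.

```python
# Fit a self-similar Gaussian jet profile u(r) = Uc * exp(-(r/b)^2) to
# synthetic velocity data; illustrative only, not the ATHAM/lab data.
import numpy as np
from scipy.optimize import curve_fit

def gaussian_profile(r, u_c, b):
    # u_c: centerline velocity, b: 1/e half-width of the jet
    return u_c * np.exp(-(r / b) ** 2)

r = np.linspace(-2.0, 2.0, 41)   # radial position in jet radii
u = gaussian_profile(r, 10.0, 0.8) + np.random.default_rng(4).normal(0, 0.1, r.size)

(u_c, b), _ = curve_fit(gaussian_profile, r, u, p0=(5.0, 1.0))
print(f"centerline velocity {u_c:.2f}, 1/e width {b:.2f}")
```

    A fit of this kind, applied at successive downstream stations, is one way to quantify how quickly a simulated profile approaches the experimental mean profile.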

  10. Liability of Science Educators for Laboratory Safety. NSTA Position Statement

    ERIC Educational Resources Information Center

    National Science Teachers Association (NJ1), 2007

    2007-01-01

    Laboratory investigations are essential for the effective teaching and learning of science. A school laboratory investigation ("lab") is an experience in the laboratory, classroom, or the field that provides students with opportunities to interact directly with natural phenomena or with data collected by others using tools, materials, data…

  11. [Laboratory diagnosis of toxoplasmosis].

    PubMed

    Strhársky, J; Mad'arová, L; Klement, C

    2009-04-01

    Under Central European climatic conditions, toxoplasmosis is one of the most common human parasitic diseases. A wide range of methods for both direct and indirect detection of the causative agent are currently available for the laboratory diagnosis of toxoplasmosis. The purpose of the article is to review the history of the discovery of the causative agent of toxoplasmosis and how laboratory diagnostic methods were developed and improved. The main emphasis is placed on current options in the diagnosis of Toxoplasma gondii, more precisely on the serodiagnosis and new trends in molecular biology-based techniques.

  12. Tools for Educational Data Mining: A Review

    ERIC Educational Resources Information Center

    Slater, Stefan; Joksimovic, Srecko; Kovanovic, Vitomir; Baker, Ryan S.; Gasevic, Dragan

    2017-01-01

    In recent years, a wide array of tools have emerged for the purposes of conducting educational data mining (EDM) and/or learning analytics (LA) research. In this article, we hope to highlight some of the most widely used, most accessible, and most powerful tools available for the researcher interested in conducting EDM/LA research. We will…

  13. Screen and clean: a tool for identifying interactions in genome-wide association studies.

    PubMed

    Wu, Jing; Devlin, Bernie; Ringquist, Steven; Trucco, Massimo; Roeder, Kathryn

    2010-04-01

    Epistasis could be an important source of risk for disease. How interacting loci might be discovered is an open question for genome-wide association studies (GWAS). Most researchers limit their statistical analyses to testing individual pairwise interactions (i.e., marginal tests for association). A more effective means of identifying important predictors is to fit models that include many predictors simultaneously (i.e., higher-dimensional models). We explore a procedure called screen and clean (SC) for identifying liability loci, including interactions, by using the lasso procedure, which is a model selection tool for high-dimensional regression. We approach the problem by using a varying dictionary consisting of terms to include in the model. In the first step the lasso dictionary includes only main effects. The most promising single-nucleotide polymorphisms (SNPs) are identified using a screening procedure. Next the lasso dictionary is adjusted to include these main effects and the corresponding interaction terms. Again, promising terms are identified using lasso screening. Then significant terms are identified through the cleaning process. Implementation of SC for GWAS requires algorithms to explore the complex model space induced by the many SNPs genotyped and their interactions. We propose and explore a set of algorithms and find that SC successfully controls Type I error while yielding good power to identify risk loci and their interactions. When the method is applied to data obtained from the Wellcome Trust Case Control Consortium study of Type 1 Diabetes it uncovers evidence supporting interaction within the HLA class II region as well as within Chromosome 12q24.
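    The screen-and-clean sequence described above (lasso on main effects, enlarge the dictionary with interactions of the survivors, lasso again, then keep the terms that survive) can be sketched with scikit-learn's Lasso. The data, SNP coding, and penalty value below are illustrative assumptions, not the paper's implementation; the cleaning step here simply keeps non-zero coefficients, whereas the paper uses formal significance tests.

```python
# Sketch of the screen-and-clean idea with a lasso dictionary that grows
# from main effects to main effects plus pairwise interactions.
import numpy as np
from itertools import combinations
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.integers(0, 3, size=(n, p)).astype(float)   # SNP genotypes coded 0/1/2
y = X[:, 3] + X[:, 7] + 0.8 * X[:, 3] * X[:, 7] + rng.normal(size=n)

# Step 1: screen main effects with the lasso.
main = Lasso(alpha=0.1).fit(X, y)
kept = np.flatnonzero(main.coef_ != 0)

# Step 2: enlarge the dictionary with pairwise interactions of survivors.
inter_pairs = list(combinations(kept, 2))
X2 = np.column_stack([X[:, kept]] + [X[:, i] * X[:, j] for i, j in inter_pairs])

# Step 3: lasso on the enlarged dictionary; "clean" by keeping the terms
# whose coefficients remain non-zero.
second = Lasso(alpha=0.1).fit(X2, y)
selected = np.flatnonzero(second.coef_ != 0)
print("surviving term indices:", selected)
```

    With the simulated effects at columns 3 and 7, both main effects survive the first screen, so their interaction enters the second-stage dictionary.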

  14. Project management: importance for diagnostic laboratories.

    PubMed

    Croxatto, A; Greub, G

    2017-07-01

    The need for diagnostic laboratories to improve both quality and productivity, alongside personnel shortages, incites laboratory managers to constantly optimize laboratory workflows, organization, and technology. These continuous modifications should be conducted using efficient project and change management approaches to maximize the chances of successfully completing each project. This review aims at presenting a general overview of project management with an emphasis on selected critical aspects. It draws on conventional project management tools and models described in the literature, such as HERMES, together with personal experience and educational courses on management. This review presents general guidelines of project management and highlights their importance for microbiology diagnostic laboratories. As an example, some critical aspects of project management are illustrated with a project of automation, as experienced at the laboratories of bacteriology and hygiene of the University Hospital of Lausanne. It is important to define clearly beforehand the objective of a project, its scope, its costs, and its time frame, including precise duration estimates for each step. A project management plan describing how to manage, execute, and control the project is then necessary to continuously monitor its progression towards the defined goals. Moreover, a thorough risk analysis with contingency and mitigation measures should be performed at each phase of a project to minimize the impact of project failures. The increasing complexity of modern laboratories means clinical microbiologists must use several management tools, including project and change management, to improve the outcome of major projects and activities. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  15. Easily configured real-time CPOE Pick Off Tool supporting focused clinical research and quality improvement.

    PubMed

    Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A

    2014-01-01

    Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide 'data warehouses' to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with 'isolation' orders, or to determine patients' eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time 'pick off' tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring.
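    The core mechanism described above, turning a user's order-related query specification into a database query that runs at recurring intervals, can be sketched as follows. The original tool is PHP/MySQL; this illustration uses Python with SQLite, and the orders table, column names, and query parameters are hypothetical, not those of the actual system.

```python
# Sketch of a "pick off" query generator: build a parameterized SQL query
# from a simple spec, then run it against the order-entry data.
import sqlite3

def build_query(order_type, status, since_hours):
    """Auto-generate a parameterized query for recent matching orders."""
    sql = (
        "SELECT patient_id, order_id, order_time FROM orders "
        "WHERE order_type = ? AND status = ? "
        "AND order_time >= datetime('now', ?)"
    )
    return sql, (order_type, status, f"-{since_hours} hours")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (patient_id, order_id, order_type, status, order_time)")
conn.execute("INSERT INTO orders VALUES (1, 101, 'isolation', 'active', datetime('now'))")

sql, params = build_query("isolation", "active", 24)
rows = conn.execute(sql, params).fetchall()
print(rows)   # each row is a candidate for a report or real-time alert
```

    In the real tool the generated query would be scheduled (e.g. via cron) and its results routed to the researcher or quality monitor rather than printed.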

  16. Quality indicators in laboratory medicine: a fundamental tool for quality and patient safety.

    PubMed

    Plebani, Mario; Sciacovelli, Laura; Marinova, Mariela; Marcuccitti, Jessica; Chiozza, Maria Laura

    2013-09-01

    The identification of reliable quality indicators (QIs) is a crucial step in enabling users to quantify the quality of laboratory services. The current lack of attention to extra-laboratory factors is in stark contrast with the body of evidence pointing to the multitude of errors that continue to occur in the pre- and post-analytical phases. Different QIs and terminologies are currently in use, and there is therefore a need to harmonize the proposed QIs. A model of quality indicators (MQI) has been consensually developed by a group of clinical laboratories within a project launched by a working group of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC). The model includes 57 QIs related to key processes (35 pre-, 7 intra- and 15 post-analytical) and 3 related to support processes. The developed MQI and the data collected provide evidence of the feasibility of harmonizing currently available QIs, but further efforts are needed to involve more clinical laboratories and to collect a larger, more consistent body of data. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  17. A Guided-Inquiry pH Laboratory Exercise for Introductory Biological Science Laboratories

    ERIC Educational Resources Information Center

    Snodgrass, Meagan A.; Lux, Nicholas; Metz, Anneke M.

    2011-01-01

    There is a continuing need for engaging inquiry-based laboratory experiences for advanced high school and undergraduate biology courses. The authors describe a guided-inquiry exercise investigating the pH-dependence of lactase enzyme that uses an inexpensive, wide-range buffering system, lactase dietary supplement, over-the-counter glucose test…

  18. Hairy Root as a Model System for Undergraduate Laboratory Curriculum and Research

    ERIC Educational Resources Information Center

    Keyes, Carol A.; Subramanian, Senthil; Yu, Oliver

    2009-01-01

    Hairy root transformation has been widely adapted in plant laboratories to rapidly generate transgenic roots for biochemical and molecular analysis. We present hairy root transformations as a versatile and adaptable model system for a wide variety of undergraduate laboratory courses and research. This technique is easy, efficient, and fast making…

  19. Usability testing of a monitoring and feedback tool to stimulate physical activity.

    PubMed

    van der Weegen, Sanne; Verwey, Renée; Tange, Huibert J; Spreeuwenberg, Marieke D; de Witte, Luc P

    2014-01-01

    A monitoring and feedback tool to stimulate physical activity, consisting of an activity sensor, smartphone application (app), and website for patients and their practice nurses, has been developed: the 'It's LiFe!' tool. In this study the usability of the tool was evaluated by technology experts and end users (people with chronic obstructive pulmonary disease or type 2 diabetes, aged 40-70 years), to improve the user interfaces and content of the tool. The study had four phases: 1) a heuristic evaluation with six technology experts; 2) a usability test in a laboratory by five patients; 3) a pilot in real life wherein 20 patients used the tool for 3 months; and 4) a final lab test by five patients. In both lab tests (phases 2 and 4) qualitative data were collected through a thinking-aloud procedure and video recordings, and quantitative data through questions about task complexity, text comprehensiveness, and readability. In addition, the post-study system usability questionnaire (PSSUQ) was completed for the app and the website. In the pilot test (phase 3), all patients were interviewed three times and the Software Usability Measurement Inventory (SUMI) was completed. After each phase, improvements were made, mainly to the layout and text. The main improvement was a refresh button for active data synchronization between activity sensor, app, and server, implemented after connectivity problems in the pilot test. The mean score on the PSSUQ for the website improved from 5.6 (standard deviation [SD] 1.3) to 6.5 (SD 0.5), and for the app from 5.4 (SD 1.5) to 6.2 (SD 1.1). Satisfaction in the pilot was not very high according to the SUMI. The use of laboratory versus real-life tests and expert-based versus user-based tests revealed a wide range of usability issues. The usability of the It's LiFe! tool improved considerably during the study.

  20. Internet, World Wide Web, and Creativity.

    ERIC Educational Resources Information Center

    Siau, Keng

    1999-01-01

    This article presents the services available on the Internet for creativity and discusses their applicability to electronic brainstorming. Services include bulletin boards, electronic mail and listservs, chat groups, file transfers, and remote login. Opportunities provided by the World Wide Web are discussed, along with tools available to…

  1. An Engineering Tool for the Prediction of Internal Dielectric Charging

    NASA Astrophysics Data System (ADS)

    Rodgers, D. J.; Ryden, K. A.; Wrenn, G. L.; Latham, P. M.; Sorensen, J.; Levy, L.

    1998-11-01

    A practical internal charging tool has been developed. It provides an easy-to-use means for satellite engineers to predict whether on-board dielectrics are vulnerable to electrostatic discharge in the outer radiation belt. The tool is designed to simulate irradiation of single-dielectric planar or cylindrical structures with or without shielding. Analytical equations are used to describe current deposition in the dielectric. This is fast and gives charging currents to sufficient accuracy given the uncertainties in other aspects of the problem - particularly material characteristics. Time-dependent internal electric fields are calculated, taking into account the effect on conductivity of electric field, dose rate and temperature. A worst-case model of electron fluxes in the outer belt has been created specifically for the internal charging problem and is built into the code. For output, the tool gives a YES or NO decision on the susceptibility of the structure to internal electrostatic breakdown and if necessary, calculates the required changes to bring the system below the breakdown threshold. A complementary programme of laboratory irradiations has been carried out to validate the tool. The results for Epoxy-fibreglass samples show that the code models electric field realistically for a wide variety of shields, dielectric thicknesses and electron spectra. Results for Teflon samples indicate that some further experimentation is required and the radiation-induced conductivity aspects of the code have not been validated.
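    The time-dependent field calculation described above rests on a charge balance of the form eps * dE/dt = J_dep - sigma(E) * E for the dielectric, with conductivity depending on field (and, in the real tool, dose rate and temperature). The toy sketch below integrates that balance for a planar slab; the material constants, the field-enhanced conductivity law, and the breakdown threshold are illustrative assumptions, not values from the tool.

```python
# Toy integration of the internal-charging balance eps*dE/dt = J - sigma(E)*E
# for a planar dielectric, with a crude field-enhanced conductivity.
import math

eps = 2.1 * 8.854e-12        # permittivity of a Teflon-like dielectric (F/m)
J_dep = 1e-9                 # deposited electron current density (A/m^2)
sigma0 = 1e-16               # dark conductivity (S/m)

def sigma(E):
    # hypothetical field-enhanced conductivity; a real tool also includes
    # dose-rate and temperature dependence
    return sigma0 * (1.0 + abs(E) / 1e7)

E, dt = 0.0, 10.0            # internal field (V/m), time step (s)
for _ in range(200000):      # ~23 charging time constants of simulated time
    E += dt * (J_dep - sigma(E) * E) / eps

breakdown = 1e8              # assumed breakdown threshold (V/m)
print("field %.2e V/m -> %s" % (E, "YES" if E > breakdown else "NO"))
```

    The field relaxes to the steady state where conduction balances deposition (here a few MV/m), and the YES/NO decision mirrors the tool's output style: compare the equilibrium field against the breakdown threshold.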

  2. Phlebotomy, a bridge between laboratory and patient.

    PubMed

    Ialongo, Cristiano; Bernardini, Sergio

    2016-01-01

    The evidence-based paradigm has changed and evolved medical practice. Phlebotomy, which dates back to the age of ancient Greece, has matured along with the evolution of medicine to become a fundamental diagnostic tool. Today it forms the bridge connecting the patient with the clinical laboratory. Too often, however, there is a gap between the laboratory and the phlebotomist that causes misunderstandings and burdens patient safety. The scope of this review is therefore to deliver a view of modern phlebotomy that "bridges" patient and laboratory. In this regard the paper describes devices, tools and procedures in the light of the most recent scientific findings, and discusses their impact on both the quality of blood testing and patient safety. It also addresses the medical aspects of venipuncture, such as the practical approach to the anatomy of the superficial veins and the management of the patient's compliance with the blood draw. Clinical, technical and practical issues are thereby treated with the same relevance throughout the paper.

  3. Mars Science Laboratory Rover Taking Shape

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This image taken in August 2008 in a clean room at NASA's Jet Propulsion Laboratory, Pasadena, Calif., shows NASA's next Mars rover, the Mars Science Laboratory, in the course of its assembly, before addition of its arm, mast, laboratory instruments and other equipment.

    The rover is about 9 feet wide and 10 feet long.

    Viewing progress on the assembly are, from left: NASA Associate Administrator for Science Ed Weiler, California Institute of Technology President Jean-Lou Chameau, JPL Director Charles Elachi, and JPL Associate Director for Flight Projects and Mission Success Tom Gavin.

    JPL, a division of Caltech, manages the Mars Science Laboratory project for the NASA Science Mission Directorate, Washington.

  4. Integrated Analysis Tools for the NERRS System-Wide Monitoring Program Data

    EPA Science Inventory

    Standardized monitoring programs have vastly improved the quantity and quality of data that form the basis of environmental decision-making. One example is the NOAA-funded National Estuarine Research Reserve System (NERRS) System-wide Monitoring Program (SWMP) that was implement...

  5. Vehicle Thermal Management Models and Tools | Transportation Research |

    Science.gov Websites

    The National Renewable Energy Laboratory's (NREL's) vehicle thermal management modeling tools allow researchers to assess the trade-offs and calculate the potential benefits of thermal design options.

  6. Genome-wide association studies and epigenome-wide association studies go together in cancer control

    PubMed Central

    Verma, Mukesh

    2016-01-01

    Completion of the human genome a decade ago laid the foundation for using genetic information to assess risk and identify individuals and populations likely to develop cancer, and for designing treatments based on a person's genetic profile (precision medicine). Genome-wide association studies (GWAS) completed during the past few years have identified risk-associated single nucleotide polymorphisms that can be used as screening tools in epidemiologic studies of a variety of tumor types. This has led to the conduct of epigenome-wide association studies (EWAS). This article discusses the current status, challenges and research opportunities in GWAS and EWAS. Information gained from GWAS and EWAS has potential applications in cancer control and treatment. PMID:27079684

  7. Part I: Virtual Laboratory versus Traditional Laboratory: Which Is More Effective for Teaching Electrochemistry? Part II: The Green Synthesis of Aurones Using a Deep Eutectic Solvent

    ERIC Educational Resources Information Center

    Hawkins, Ian C.

    2013-01-01

    The role of the teaching laboratory in science education has been debated over the last century. The goals and purposes of the laboratory are still debated and while most science educators consider laboratory a vital part of the education process, they differ widely on the purposes for laboratory and what methods should be used to teach…

  8. Report formatting in laboratory medicine - a call for harmony.

    PubMed

    Jones, Graham R D; Legg, Michael

    2018-04-19

    The results of medical laboratory testing are only useful if they lead to appropriate actions by medical practitioners and/or patients. An underappreciated component of the medical testing process is the transfer of the information from the laboratory report into the reader's brain. The format of laboratory reports can be determined by the testing laboratory, which may issue a formatted report, or by electronic systems receiving information from laboratories and controlling the report format. As doctors can receive information from many laboratories, interpreting information from reports in a safe and rapid manner is facilitated by having similar report layouts and formats. Using Australia as an example, there is wide variation in report formats in spite of a body of work to define standards for reporting. In addition to the standardisation of report formats, consideration needs to be given to optimising report formatting to facilitate rapid and unambiguous reading of the report and interpretation of the data. Innovative report formats have been developed by some laboratories; however, wide adoption has not followed. The need to balance uniformity of reporting with appropriate innovation is a challenge for safe reporting of laboratory results. This paper discusses the current status of, and opportunities for improvement in, the safety and efficiency of reading laboratory reports, using current practice and developments in Australia as examples.

  9. Laboratory and Workplace Assessments of Rivet Bucking Bar Vibration Emissions

    PubMed Central

    McDowell, Thomas W.; Warren, Christopher; Xu, Xueyan S.; Welcome, Daniel E.; Dong, Ren G.

    2016-01-01

    Sheet metal workers operating rivet bucking bars are at risk of developing hand and wrist musculoskeletal disorders associated with exposures to hand-transmitted vibrations and forceful exertions required to operate these hand tools. New bucking bar technologies have been introduced in efforts to reduce workplace vibration exposures to these workers. However, the efficacy of these new bucking bar designs has not been well documented. While there are standardized laboratory-based methodologies for assessing the vibration emissions of many types of powered hand tools, no such standard exists for rivet bucking bars. Therefore, this study included the development of a laboratory-based method for assessing bucking bar vibrations which utilizes a simulated riveting task. With this method, this study evaluated three traditional steel bucking bars, three similarly shaped tungsten alloy bars, and three bars featuring spring-dampeners. For comparison, the bucking bar vibrations were also assessed during three typical riveting tasks at a large aircraft maintenance facility. The bucking bars were rank-ordered in terms of unweighted and frequency-weighted acceleration measured at the hand-tool interface. The results suggest that the developed laboratory method is a reasonable technique for ranking bucking bar vibration emissions; the lab-based riveting simulations produced similar rankings to the workplace rankings. However, the laboratory-based acceleration averages were considerably lower than the workplace measurements. These observations suggest that the laboratory test results are acceptable for comparing and screening bucking bars, but the laboratory measurements should not be directly used for assessing the risk of workplace bucking bar vibration exposures. The newer bucking bar technologies exhibited significantly reduced vibrations compared to the traditional steel bars. The results of this study, together with other information such as rivet quality, productivity, tool…
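    The rank-ordering step described above reduces each tool's acceleration record to a single magnitude. Real hand-arm assessments apply the ISO 5349-1 frequency weighting before taking the r.m.s.; the sketch below skips the weighting and uses plain r.m.s. on synthetic signals, so the signal amplitudes and tool names are illustrative only.

```python
# Rank three hypothetical bucking bar designs by r.m.s. acceleration at the
# hand-tool interface (unweighted; ISO 5349-1 weighting omitted for brevity).
import numpy as np

rng = np.random.default_rng(1)
fs = 5000                                   # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)               # 1 s records

def rms(a):
    return float(np.sqrt(np.mean(a ** 2)))

# synthetic acceleration records (m/s^2), dominant riveting tone at 120 Hz
signals = {
    "steel": 40 * np.sin(2 * np.pi * 120 * t) + rng.normal(0, 5, t.size),
    "tungsten": 25 * np.sin(2 * np.pi * 120 * t) + rng.normal(0, 5, t.size),
    "spring-damped": 12 * np.sin(2 * np.pi * 120 * t) + rng.normal(0, 5, t.size),
}

ranking = sorted(signals, key=lambda k: rms(signals[k]))   # quietest first
for name in ranking:
    print(f"{name}: {rms(signals[name]):.1f} m/s^2 r.m.s.")
```

    With a weighting filter inserted before `rms`, the same ranking logic yields the frequency-weighted comparison used in the study.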

  10. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    ERIC Educational Resources Information Center

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  11. ToxPredictor: a Toxicity Estimation Software Tool

    EPA Science Inventory

    The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be ac...

  12. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon D.

    1991-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. All materials have been grouped into eight general subject areas for easy reference: The Software Engineering Laboratory; The Software Engineering Laboratory: Software Development Documents; Software Tools; Software Models; Software Measurement; Technology Evaluations; Ada Technology; and Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.

  13. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Buhler, Melanie; Valett, Jon

    1989-01-01

    An annotated bibliography is presented of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. The bibliography was updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials were grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.

  14. Grid Modernization Laboratory Consortium - Testing and Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroposki, Benjamin; Skare, Paul; Pratt, Rob

    This paper highlights some of the unique testing capabilities and projects being performed at several national laboratories as part of the U.S. Department of Energy Grid Modernization Laboratory Consortium. As part of this effort, the Grid Modernization Laboratory Consortium Testing Network is being developed to accelerate grid modernization by enabling access to a comprehensive testing infrastructure and creating a repository of validated models and simulation tools that will be publicly available. This work is key to accelerating the development, validation, standardization, adoption, and deployment of new grid technologies to help meet U.S. energy goals.

  15. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and with regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. This review compiles a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion by either having ≥ 50 citations on Web of Science (as of 08/09/16) or their use being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised by their main functionality. As future work, we suggest a direct comparison of the tools' abilities to perform specific data analysis tasks, e.g. peak picking.
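    The selection and categorisation rule above (keep a tool if it has at least 50 citations or appears in the society survey, then bucket by instrument type) can be expressed as a simple filter. The citation counts and the obscure third entry below are invented for illustration; only XCMS and MZmine are real tool names.

```python
# Sketch of the review's inclusion rule and instrument-type bucketing.
tools = [
    {"name": "XCMS", "citations": 1500, "data": "LC-MS", "survey": True},
    {"name": "MZmine", "citations": 900, "data": "LC-MS", "survey": True},
    {"name": "ObscureTool", "citations": 12, "data": "NMR", "survey": False},
]

# include if >= 50 citations OR reported in the community survey
selected = [t for t in tools if t["citations"] >= 50 or t["survey"]]

by_type = {}
for t in selected:
    by_type.setdefault(t["data"], []).append(t["name"])
print(by_type)
```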

  16. Developing a Virtual Rock Deformation Laboratory

    NASA Astrophysics Data System (ADS)

    Zhu, W.; Ougier-simonin, A.; Lisabeth, H. P.; Banker, J. S.

    2012-12-01

    Experimental rock physics plays an important role in advancing earthquake research. Despite its importance in geophysics, reservoir engineering, waste deposits and energy resources, most geology departments in U.S. universities don't have rock deformation facilities. A virtual deformation laboratory can serve as an efficient tool to help geology students nationally and internationally learn about rock deformation. Working with computer science engineers, we built a virtual deformation laboratory that aims at fostering user interaction to facilitate classroom and outreach teaching and learning. The virtual lab is built around a triaxial deformation apparatus in which laboratory measurements of mechanical and transport properties such as stress, axial and radial strains, acoustic emission activities, wave velocities, and permeability are demonstrated. A student user can create her avatar to enter the virtual lab. In the virtual lab, the avatar can browse and choose among various rock samples, determine the testing conditions (pressure, temperature, strain rate, loading paths), then operate the virtual deformation machine to observe how deformation changes physical properties of rocks. Actual experimental results on the mechanical, frictional, sonic, acoustic and transport properties of different rocks at different conditions are compiled. The data acquisition system in the virtual lab is linked to the compiled experimental data. Structural and microstructural images of deformed rocks are uploaded and linked to different deformation tests. The integration of the microstructural image and the deformation data allows the student to visualize how forces reshape the structure of the rock and change the physical properties. The virtual lab is built using the Game Engine. The geological background, outstanding questions related to the geological environment, and physical and mechanical concepts associated with the problem will be illustrated on the web portal.

  17. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above-mentioned problems. It has been tested on serum potassium which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.
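    The comparison underlying the PAS tool, flagging practices whose potassium results deviate from the hospital laboratory's controlled baseline, can be sketched as below. The potassium values, practice names, and the Mann-Whitney test with a 0.01 cut-off are illustrative assumptions; the actual tool is Excel-based and may use different statistics.

```python
# Flag satellite practices whose serum potassium distribution deviates from
# the hospital laboratory's (delayed centrifugation raises potassium).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
hospital = rng.normal(4.2, 0.4, 200)            # mmol/L, controlled pre-analytics
practices = {
    "practice_A": rng.normal(4.2, 0.4, 40),     # no transport delay expected
    "practice_B": rng.normal(5.1, 0.5, 40),     # delayed centrifugation?
}

pvals = {}
for name, values in practices.items():
    _, pvals[name] = stats.mannwhitneyu(values, hospital)
    flag = "review pre-analytical phase" if pvals[name] < 0.01 else "ok"
    print(f"{name}: median {np.median(values):.2f} mmol/L -> {flag}")
```

    In practice the flagged result would prompt a check of transport times and centrifugation delay at that practice rather than an analytical investigation.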

  18. The effect of introducing computers into an introductory physics problem-solving laboratory

    NASA Astrophysics Data System (ADS)

    McCullough, Laura Ellen

    2000-10-01

    Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool, designed for this study (VideoTool), to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted…

  19. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, B.; Penev, M.; Melaina, M.

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides quick, convenient, and in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
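
    The financial computations underlying H2FAST are extensive; the basic discounted-cash-flow arithmetic such tools build on can be sketched briefly. The station cost and revenue figures below are hypothetical, purely for illustration.

```python
# Minimal discounted-cash-flow sketch of the kind of computation a station
# financial analysis tool performs (illustrative only; H2FAST's actual model
# is far more detailed and conforms to generally accepted accounting principles).

def npv(rate, cash_flows):
    """Net present value of annual cash flows; cash_flows[0] is year 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical station: $1.5M capital cost, $250k net revenue/year for 10 years.
flows = [-1_500_000] + [250_000] * 10
print(f"NPV at 8% discount rate: ${npv(0.08, flows):,.0f}")
```

    At a 0% discount rate this hypothetical station nets $1M over its life; discounting at 8% shrinks that considerably, which is exactly the sensitivity a scenario tool lets users explore.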

  20. Copper Indium Gallium Diselenide Cluster Tool | Photovoltaic Research |

    Science.gov Websites

    …mobile unit. The figure shows where the chambers (numbered in the list above) are physically located in the laboratory space. Samples from the CIGS cluster tool can be transported to these other tools using a mobile

  1. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  2. Reliability on intra-laboratory and inter-laboratory data of hair mineral analysis comparing with blood analysis.

    PubMed

    Namkoong, Sun; Hong, Seung Phil; Kim, Myung Hwa; Park, Byung Cheol

    2013-02-01

    Nowadays, although its clinical value remains controversial, institutions utilize hair mineral analysis. Arguments about the reliability of hair mineral analysis persist, and there have been evaluations of commercial laboratories performing it. The objective of this study was to assess the reliability of intra-laboratory and inter-laboratory data at three commercial laboratories conducting hair mineral analysis, compared to serum mineral analysis. Two divided hair samples taken from near the scalp were submitted for analysis at the same time, to all laboratories, from one healthy volunteer. Each laboratory sent a report consisting of quantitative results and their interpretation of health implications. Differences among intra-laboratory and inter-laboratory data were analyzed using SPSS version 12.0 (SPSS Inc., USA). All the laboratories used identical methods for quantitative analysis, and they generated consistent numerical results according to Friedman analysis of variance. However, the normal reference ranges of each laboratory varied; as such, each laboratory interpreted the patient's health differently. On intra-laboratory data, Wilcoxon analysis suggested the laboratories generated relatively coherent data, but laboratory B failed to do so for one element, so its reliability was doubtful. In comparison with the blood test, laboratory C generated identical results, but laboratories A and B did not. Hair mineral analysis has its limitations, given the reliability of inter- and intra-laboratory analysis compared with blood analysis. As such, clinicians should be cautious when applying hair mineral analysis as an ancillary tool. Each laboratory included in this study requires continuous refinement to establish standardized normal reference ranges.

  3. Friction Mapping as a Tool for Measuring the Elastohydrodynamic Contact Running-in Process

    DTIC Science & Technology

    2015-10-01

    ARL-TR-7501 ● OCT 2015 ● US Army Research Laboratory. Friction Mapping as a Tool for Measuring the Elastohydrodynamic Contact Running-in Process, by Stephen Berkebile. Final report; dates covered: 1 January–30 June 2015.

  4. Using the arthroscopic surgery skill evaluation tool as a pass-fail examination.

    PubMed

    Koehler, Ryan J; Nicandri, Gregg T

    2013-12-04

    Examination of arthroscopic skill requires evaluation tools that are valid and reliable, with clear criteria for passing. The Arthroscopic Surgery Skill Evaluation Tool was developed as a video-based assessment of technical skill with criteria for passing established by a panel of experts. The purpose of this study was to test the validity and reliability of the Arthroscopic Surgery Skill Evaluation Tool as a pass-fail examination of arthroscopic skill. Twenty-eight residents and two sports medicine faculty members were recorded performing diagnostic knee arthroscopy on a left and right cadaveric specimen in our arthroscopic skills laboratory. Procedure videos were evaluated with use of the Arthroscopic Surgery Skill Evaluation Tool by two raters blinded to subject identity. Subjects were considered to pass the Arthroscopic Surgery Skill Evaluation Tool when they attained scores of ≥ 3 on all eight assessment domains. The raters agreed on a pass-fail rating for fifty-five of sixty videos, with an intraclass correlation coefficient of 0.83. Ten of thirty participants were assigned passing scores by both raters for both diagnostic arthroscopies performed in the laboratory. Receiver operating characteristic analysis demonstrated that logging more than eighty arthroscopic cases or performing more than thirty-five arthroscopic knee cases was predictive of attaining a passing Arthroscopic Surgery Skill Evaluation Tool score on both procedures performed in the laboratory. The Arthroscopic Surgery Skill Evaluation Tool is valid and reliable as a pass-fail examination of diagnostic arthroscopy of the knee in the simulation laboratory. This study demonstrates that the Arthroscopic Surgery Skill Evaluation Tool may be a useful tool for pass-fail examination of diagnostic arthroscopy of the knee in the simulation laboratory. Further study is necessary to determine whether the Arthroscopic Surgery Skill Evaluation Tool can be used for the assessment of multiple
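
    The pass criterion described above, a score of at least 3 in each of the eight assessment domains, reduces to a simple conjunction. The score vectors below are illustrative; the ordering does not reflect the instrument's actual domain layout.

```python
# Sketch of the ASSET pass rule: a performance passes only when every
# one of the eight domain scores is >= 3.

PASS_THRESHOLD = 3
N_DOMAINS = 8

def asset_pass(scores):
    """Return True if all eight domain scores meet the passing threshold."""
    if len(scores) != N_DOMAINS:
        raise ValueError("expected one score per domain")
    return all(s >= PASS_THRESHOLD for s in scores)

print(asset_pass([3, 4, 3, 5, 3, 3, 4, 3]))  # every domain at or above 3
print(asset_pass([3, 4, 2, 5, 3, 3, 4, 3]))  # one domain below 3
```

    Because a single sub-threshold domain fails the whole performance, the rule behaves like a checklist rather than an averaged score, which is what makes it usable as a pass-fail examination.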

  5. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  6. Monocular tool control, eye dominance, and laterality in New Caledonian crows.

    PubMed

    Martinho, Antone; Burns, Zackory T; von Bayern, Auguste M P; Kacelnik, Alex

    2014-12-15

    Tool use, though rare, is taxonomically widespread, but morphological adaptations for tool use are virtually unknown. We focus on the New Caledonian crow (NCC, Corvus moneduloides), which displays some of the most innovative tool-related behavior among nonhumans. One of their major food sources is larvae extracted from burrows with sticks held diagonally in the bill, oriented with individual, but not species-wide, laterality. Among possible behavioral and anatomical adaptations for tool use, NCCs possess unusually wide binocular visual fields (up to 60°), suggesting that extreme binocular vision may facilitate tool use. Here, we establish that during natural extractions, tool tips can only be viewed by the contralateral eye. Thus, maintaining binocular view of tool tips is unlikely to have selected for wide binocular fields; the selective factor is more likely to have been to allow each eye to see far enough across the midsagittal line to view the tool's tip monocularly. Consequently, we tested the hypothesis that tool side preference follows eye preference and found that eye dominance does predict tool laterality across individuals. This contrasts with humans' species-wide motor laterality and uncorrelated motor-visual laterality, possibly because bill-held tools are viewed monocularly and move in concert with eyes, whereas hand-held tools are visible to both eyes and allow independent combinations of eye preference and handedness. This difference may affect other models of coordination between vision and mechanical control, not necessarily involving tools. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. The upper bound of abutment scour defined by selected laboratory and field data

    USGS Publications Warehouse

    Benedict, Stephen; Caldwell, Andral W.

    2015-01-01

    The U.S. Geological Survey, in cooperation with the South Carolina Department of Transportation, conducted a field investigation of abutment scour in South Carolina and used that data to develop envelope curves defining the upper bound of abutment scour. To expand upon this previous work, an additional cooperative investigation was initiated to combine the South Carolina data with abutment-scour data from other sources and evaluate the upper bound of abutment scour with the larger data set. To facilitate this analysis, a literature review was made to identify potential sources of published abutment-scour data, and selected data, consisting of 446 laboratory and 331 field measurements, were compiled for the analysis. These data encompassed a wide range of laboratory and field conditions and represent field data from 6 states within the United States. The data set was used to evaluate the South Carolina abutment-scour envelope curves. Additionally, the data were used to evaluate a dimensionless abutment-scour envelope curve developed by Melville (1992), highlighting the distinct difference in the upper bound for laboratory and field data. The envelope curves evaluated in this investigation provide simple but useful tools for assessing the potential maximum abutment-scour depth in the field setting.
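
    An envelope curve of this kind is applied by checking each observation against the bound. The power-law form and coefficients below are hypothetical stand-ins for illustration, not the published South Carolina curves.

```python
# Illustrative upper-bound envelope check: assume maximum scour depth is
# bounded by y = a * x**b for some flow-intensity variable x (form and
# coefficients are hypothetical, not the study's fitted curves).

def envelope(x, a=2.5, b=0.5):
    """Hypothetical upper-bound scour depth for flow-intensity variable x."""
    return a * x ** b

observations = [(4.0, 3.1), (9.0, 6.8), (16.0, 12.0)]  # (x, observed scour)
for x, y in observations:
    bound = envelope(x)
    status = "within envelope" if y <= bound else "exceeds envelope"
    print(f"x={x}: observed {y} vs bound {bound:.1f} -> {status}")
```

    An observation plotting above the curve would either prompt a data review or revise the envelope upward, which is how such upper-bound curves are maintained as new measurements accumulate.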

  8. EPOS Multi-Scale Laboratory platform: a long-term reference tool for experimental Earth Sciences

    NASA Astrophysics Data System (ADS)

    Trippanera, Daniele; Tesei, Telemaco; Funiciello, Francesca; Sagnotti, Leonardo; Scarlato, Piergiorgio; Rosenau, Matthias; Elger, Kirsten; Ulbricht, Damian; Lange, Otto; Calignano, Elisa; Spiers, Chris; Drury, Martin; Willingshofer, Ernst; Winkler, Aldo

    2017-04-01

    With continuous progress in scientific research, large amounts of data have been and will be produced. Access to and sharing of these data, along with their storage and homogenization within a unique and coherent framework, pose a new challenge for the whole scientific community. This is particularly true for geo-scientific laboratories, which encompass the most diverse Earth Science disciplines and data typologies. To this aim, the "Multiscale Laboratories" Work Package (WP16), operating in the framework of the European Plate Observing System (EPOS), is developing a virtual platform of geo-scientific data and services for the worldwide community of laboratories. This long-term project aims at merging top-class multidisciplinary laboratories in Geoscience into a coherent and collaborative network, facilitating the standardization of virtual access to data, data products and software. This will help our community to evolve beyond the stage in which most data produced by the different laboratories are available only within the related scholarly publications (often as print-version only) or remain unpublished and inaccessible on local devices. The EPOS multi-scale laboratory platform will provide the possibility to easily share and discover data by means of open-access, DOI-referenced, online data publication, including long-term storage, management and curation services, and to set up a cohesive community of laboratories. WP16 is starting with three pilot laboratory communities: (1) rock physics, (2) palaeomagnetism, and (3) analogue modelling. As a proof of concept, the first analogue modelling datasets have been published via GFZ Data Services (http://doidb.wdc-terra.org/search/public/ui?&sort=updated+desc&q=epos). The datasets include rock analogue material properties (e.g. friction data, rheology data, SEM imagery), as well as supplementary figures, images and movies from experiments on tectonic processes. A metadata catalogue tailored to the specific communities

  9. Applying Behavior-Based Robotics Concepts to Telerobotic Use of Power Tooling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noakes, Mark W; Hamel, Dr. William R.

    While it has long been recognized that telerobotics has potential advantages to reduce operator fatigue, to permit lower skilled operators to function as if they had higher skill levels, and to protect tools and manipulators from excessive forces during operation, relatively little laboratory research in telerobotics has actually been implemented in fielded systems. Much of this has to do with the complexity of the implementation and its lack of ability to operate in complex unstructured remote systems environments. One possible solution is to approach the tooling task using an adaptation of behavior-based techniques to facilitate task decomposition to a simpler perspective and to provide sensor registration to the task target object in the field. An approach derived from behavior-based concepts has been implemented to provide automated tool operation for a teleoperated manipulator system. The generic approach is adaptable to a wide range of typical remote tools used in hot-cell and decontamination and dismantlement-type operations. Two tasks are used in this work to test the validity of the concept. First, a reciprocating saw is used to cut a pipe. The second task is bolt removal from mockup process equipment. This paper explains the technique, its implementation, and covers experimental data, analysis of results, and suggestions for implementation on fielded systems.

  10. Assessing the welfare of laboratory mice in their home environment using animal-based measures--a benchmarking tool.

    PubMed

    Spangenberg, Elin M F; Keeling, Linda J

    2016-02-01

    Welfare problems in laboratory mice can be a consequence of an ongoing experiment, or a characteristic of a particular genetic line, but in some cases, such as breeding animals, they are most likely to be a result of the design and management of the home cage. Assessment of the home cage environment is commonly performed using resource-based measures, like access to nesting material. However, animal-based measures (related to the health status and behaviour of the animals) can be used to assess the current welfare of animals regardless of the inputs applied (i.e. the resources or management). The aim of this study was to design a protocol for assessing the welfare of laboratory mice using only animal-based measures. The protocol, to be used as a benchmarking tool, assesses mouse welfare in the home cage and does not contain parameters related to experimental situations. It is based on parameters corresponding to the 12 welfare criteria established by the Welfare Quality® project. Selection of animal-based measures was performed by scanning existing published, web-based and informal protocols, and by choosing parameters that matched these criteria, were feasible in practice and, if possible, were already validated indicators of mouse welfare. The parameters should identify possible animal welfare problems and enable assessment directly in an animal room during cage cleaning procedures, without the need for extra equipment. Thermal comfort behaviours and positive emotional states are areas where more research is needed to find valid, reliable and feasible animal-based measures. © The Author(s) 2015.

  11. Mapping basin-wide subaquatic slope failure susceptibility as a tool to assess regional seismic and tsunami hazards

    NASA Astrophysics Data System (ADS)

    Strasser, Michael; Hilbe, Michael; Anselmetti, Flavio S.

    2010-05-01

    occurred. Comparison of reconstructed critical stability conditions with the known distribution of landslide deposits reveals minimum and maximum threshold conditions for slopes that failed or remained stable, respectively. The resulting correlations reveal good agreements and suggest that the slope stability model generally succeeds in reproducing past events. The basin-wide mapping of subaquatic slope failure susceptibility through time thus can also be considered as a promising paleoseismologic tool that allows quantification of past earthquake ground shaking intensities. Furthermore, it can be used to assess the present-day slope failure susceptibility allowing for identification of location and estimation of size of future, potentially tsunamigenic subaquatic landslides. The new approach presented in our comprehensive lake study and resulting conceptual ideas can be vital to improve our understanding of larger marine slope instabilities and related seismic and oceanic geohazards along formerly glaciated ocean margins and closed basins worldwide.
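
    As a rough analogue of the threshold calculations behind such slope stability mapping, a first-order infinite-slope factor of safety can be sketched. Both the formulation and the sediment parameters below are assumptions for illustration, not the study's pseudo-static seismic model.

```python
# Infinite-slope factor of safety for a submerged, cohesive sediment layer:
# FS = (c' + gamma_sub*z*cos^2(beta)*tan(phi)) / (gamma_sub*z*sin(beta)*cos(beta))
# where c' is effective cohesion (kPa), gamma_sub the submerged unit weight
# (kN/m^3), z the failure depth (m), beta the slope angle, phi the friction angle.

import math

def factor_of_safety(c_eff, gamma_sub, z, beta_deg, phi_deg):
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c_eff + gamma_sub * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma_sub * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Hypothetical lake-slope sediment: 2 kPa cohesion, 5 m depth, 10-degree slope.
print(round(factor_of_safety(2.0, 6.0, 5.0, 10.0, 25.0), 2))
```

    A factor of safety above 1 indicates a stable slope under static loading; seismic shaking effectively adds a driving term, which is how past earthquake intensities can be back-calculated from the distribution of slopes that failed.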

  12. A Strategy for an Enterprise-Wide Data Management Capability at the Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Fuhrman, D.

    2000-01-01

    The Jet Propulsion Laboratory (JPL) is a Federally Funded Research and Development Center (FFRDC) operated by the California Institute of Technology that is engaged in the quest for knowledge about the solar system, the universe, and the Earth.

  13. Genome wide approaches to identify protein-DNA interactions.

    PubMed

    Ma, Tao; Ye, Zhenqing; Wang, Liguo

    2018-05-29

    Transcription factors are DNA-binding proteins that play key roles in many fundamental biological processes. Unraveling their interactions with DNA is essential to identify their target genes and understand the regulatory network. Genome-wide identification of their binding sites became feasible thanks to recent progress in experimental and computational approaches. ChIP-chip, ChIP-seq, and ChIP-exo are three widely used techniques to demarcate genome-wide transcription factor binding sites. This review aims to provide an overview of these three techniques, including their experimental procedures, computational approaches, and popular analytic tools. ChIP-chip, ChIP-seq, and ChIP-exo have been the major techniques to study genome-wide in vivo protein-DNA interaction. Due to the rapid development of next-generation sequencing technology, array-based ChIP-chip is deprecated and ChIP-seq has become the most widely used technique to identify transcription factor binding sites genome-wide. The newly developed ChIP-exo further improves the spatial resolution to the single-nucleotide level. Numerous tools have been developed to analyze ChIP-chip, ChIP-seq and ChIP-exo data. However, different programs may employ different mechanisms or underlying algorithms, and thus each will inherently include its own set of statistical assumptions and biases. Choosing the most appropriate analytic program for a given experiment therefore needs careful consideration. Moreover, most programs only have a command-line interface, so their installation and usage require basic computational expertise in Unix/Linux. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
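
    The core idea shared by these peak-calling tools can be illustrated with a toy enrichment test: count read starts per fixed window and flag windows enriched over the genome-wide background rate. Real tools such as MACS2 use far more sophisticated statistics (local background models, multiple-testing control); everything below is a deliberately simplified sketch.

```python
# Toy peak caller: flag windows whose read count exceeds a fold-enrichment
# threshold over the mean reads-per-window background.

from collections import Counter

WINDOW = 100  # window size in bp

def call_peaks(read_starts, genome_len, fold=4.0):
    """Return start coordinates of windows enriched over background."""
    counts = Counter(pos // WINDOW for pos in read_starts)
    background = len(read_starts) / (genome_len / WINDOW)  # mean reads/window
    return sorted(w * WINDOW for w, c in counts.items() if c >= fold * background)

# Seven reads clustered in one window, two scattered elsewhere.
reads = [105, 110, 130, 150, 160, 170, 180, 300, 700]
print(call_peaks(reads, genome_len=1000))  # -> [100]
```

    The window starting at position 100 holds seven of the nine reads, far above the 0.9 reads-per-window background, so it is the only call; real peak callers replace this crude fold threshold with a statistical test against a local background model.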

  14. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, Marc

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  15. Wide-Field Plate Database

    NASA Astrophysics Data System (ADS)

    Tsvetkov, M. K.; Stavrev, K. Y.; Tsvetkova, K. P.; Semkov, E. H.; Mutatov, A. S.

    The Wide-Field Plate Database (WFPDB) and the possibilities for its application as a research tool in observational astronomy are presented. Currently the WFPDB comprises descriptive data for 400 000 archival wide-field photographic plates obtained with 77 instruments, out of a total of 1 850 000 photographs stored in 269 astronomical archives all over the world since the end of the last century. The WFPDB is already accessible to the astronomical community, currently only in batch mode through user requests sent by e-mail. We are working on on-line interactive access to the data via the Internet from Sofia and, in parallel, from the Centre de Donnees Astronomiques de Strasbourg. (Initial information can be found on the World Wide Web homepage at URL http://www.wfpa.acad.bg.) The WFPDB may be useful in studies of a variety of astronomical objects and phenomena, and especially for long-term investigations of variable objects and for multi-wavelength research. We have analysed the data in the WFPDB in order to derive the overall characteristics of the totality of wide-field observations, such as the sky coverage and the distributions by observation time and date, by spectral band, and by object type. We have also examined the totality of wide-field observations from the point of view of their quality, availability and digitisation. The usefulness of the WFPDB is demonstrated by the results of identification and investigation of the photometric behaviour of optical analogues of gamma-ray bursts.

  16. Phlebotomy, a bridge between laboratory and patient

    PubMed Central

    Ialongo, Cristiano; Bernardini, Sergio

    2016-01-01

    The evidence-based paradigm has changed and evolved medical practice. Phlebotomy, which dates back to the age of ancient Greece, has matured through the evolution of medicine, becoming a fundamental diagnostic tool. Nowadays it forms a bridge connecting the patient with the clinical laboratory. However, all too often there is a gap between laboratory and phlebotomist that causes misunderstandings and burdens patient safety. Therefore, the scope of this review is to deliver a view of modern phlebotomy that "bridges" patient and laboratory. In this regard the paper describes devices, tools and procedures in the light of the most recent scientific findings, also discussing their impact on both the quality of blood testing and patient safety. It also addresses issues concerning the medical aspects of venipuncture, such as the practical approach to superficial vein anatomy, as well as the management of the patient's compliance with the blood draw. Thereby, the clinical, technical and practical issues are treated with the same relevance throughout the entire paper. PMID:26981016

  17. A review of laboratory and numerical modelling in volcanology

    NASA Astrophysics Data System (ADS)

    Kavanagh, Janine L.; Engwell, Samantha L.; Martin, Simon A.

    2018-04-01

    Modelling has been used in the study of volcanic systems for more than 100 years, building upon the approach first applied by Sir James Hall in 1815. Informed by observations of volcanological phenomena in nature, including eye-witness accounts of eruptions, geophysical or geodetic monitoring of active volcanoes, and geological analysis of ancient deposits, laboratory and numerical models have been used to describe and quantify volcanic and magmatic processes that span orders of magnitudes of time and space. We review the use of laboratory and numerical modelling in volcanological research, focussing on sub-surface and eruptive processes including the accretion and evolution of magma chambers, the propagation of sheet intrusions, the development of volcanic flows (lava flows, pyroclastic density currents, and lahars), volcanic plume formation, and ash dispersal. When first introduced into volcanology, laboratory experiments and numerical simulations marked a transition in approach from broadly qualitative to increasingly quantitative research. These methods are now widely used in volcanology to describe the physical and chemical behaviours that govern volcanic and magmatic systems. Creating simplified models of highly dynamical systems enables volcanologists to simulate and potentially predict the nature and impact of future eruptions. These tools have provided significant insights into many aspects of the volcanic plumbing system and eruptive processes. The largest scientific advances in volcanology have come from a multidisciplinary approach, applying developments in diverse fields such as engineering and computer science to study magmatic and volcanic phenomena. A global effort in the integration of laboratory and numerical volcano modelling is now required to tackle key problems in volcanology and points towards the importance of benchmarking exercises and the need for protocols to be developed so that models are routinely tested against real world data.

  18. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1992-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  19. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1993-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  20. The laboratory domestication of Caenorhabditis elegans.

    PubMed

    Sterken, Mark G; Snoek, L Basten; Kammenga, Jan E; Andersen, Erik C

    2015-05-01

    Model organisms are of great importance to our understanding of basic biology and to making advances in biomedical research. However, the influence of laboratory cultivation on these organisms is underappreciated, and especially how that environment can affect research outcomes. Recent experiments led to insights into how the widely used laboratory reference strain of the nematode Caenorhabditis elegans compares with natural strains. Here we describe potential selective pressures that led to the fixation of laboratory-derived alleles for the genes npr-1, glb-5, and nath-10. These alleles influence a large number of traits, resulting in behaviors that affect experimental interpretations. Furthermore, strong phenotypic effects caused by these laboratory-derived alleles hinder the discovery of natural alleles. We highlight strategies to reduce the influence of laboratory-derived alleles and to harness the full power of C. elegans. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Laboratory investigation of hypercoagulability.

    PubMed

    Francis, J L

    1998-01-01

    For many years, the laboratory investigation of patients with thrombophilia has lagged behind that of patients with bleeding diatheses. Improved understanding of the mechanisms that control and regulate coagulation, and the resultant recognition of new defects, have greatly stimulated clinical laboratory interest in this area. Assays to detect resistance to activated protein C; deficiencies of antithrombin, protein C, and protein S; and the presence of antiphospholipid antibodies are widely available and should form part of the investigation of patients who present with idiopathic thrombosis. Such a work-up will likely provide an explanation for thrombosis in 40 to 60% of patients. Abnormalities of fibrinogen and fibrinolysis may explain still more, although such defects are currently considered rare. In addition, presently unrecognized defects almost certainly exist, and the identification of such individuals will undoubtedly improve our understanding of the hemostatic mechanism. Laboratory tests to define the hypercoagulable state are continually being developed. They include whole blood coagulation and platelet function tests and novel activation markers. However, acceptance of these approaches by clinical laboratories has been slow.

  2. An Introduction to Solar Decision-Making Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mow, Benjamin

    2017-09-12

    The National Renewable Energy Laboratory (NREL) offers a variety of models and analysis tools to help decision makers evaluate and make informed decisions about solar projects, policies, and programs. This fact sheet aims to help decision makers determine which NREL tool to use for a given solar project or policy question, depending on its scope.

  3. A Modeling Tool for Household Biogas Burner Flame Port Design

    NASA Astrophysics Data System (ADS)

    Decker, Thomas J.

    Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost respectively. To support the ongoing scaling effort of biogas in rural communities, this study has developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created through guidance from previous studies found in the literature. The three highest performing designs identified by the tool were manufactured and tested experimentally to validate tool output and to compare against the original port geometry. The experimental results aligned with the tool's prediction for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.
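
    The port hydraulic diameter targeted by the design tool follows the standard definition D_h = 4A/P (four times cross-sectional area over wetted perimeter). The port dimensions below are illustrative, not the Lotus stove's actual geometry.

```python
# Hydraulic diameter D_h = 4*A/P for the two port shapes studied:
# circular ports and rectangular (slot) ports. Dimensions in mm.

def hydraulic_diameter_circle(d):
    """For a circle, 4*(pi*d^2/4)/(pi*d) simplifies to the diameter itself."""
    return d

def hydraulic_diameter_rect(w, h):
    """Rectangular port of width w and height h: 4*w*h / (2*(w + h))."""
    return 4 * (w * h) / (2 * (w + h))

print(hydraulic_diameter_circle(2.0))               # circular port, 2 mm diameter
print(round(hydraulic_diameter_rect(4.0, 1.0), 2))  # 4 mm x 1 mm slot port
```

    A narrow slot has a much smaller hydraulic diameter than its width suggests (1.6 mm for the 4 mm x 1 mm slot above), which is why targeting a range of D_h, velocity, and mixture density constrains port shape and not just total port area.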

  4. Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8213 ● NOV 2017 ● US Army Research Laboratory. Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures.

  5. Validation of a risk stratification tool for fall-related injury in a state-wide cohort.

    PubMed

    McCoy, Thomas H; Castro, Victor M; Cagan, Andrew; Roberson, Ashlee M; Perlis, Roy H

    2017-02-06

    A major preventable contributor to healthcare costs among older individuals is fall-related injury. We sought to validate a tool to stratify such risk based on readily available clinical data, including projected medication adverse effects, using state-wide medical claims data. Sociodemographic and clinical features were drawn from health claims paid in the state of Massachusetts for individuals aged 35-65 with a hospital admission for a period spanning January-December 2012. Previously developed logistic regression models of hospital readmission for fall-related injury were refit in a training set including a randomly selected 70% of individuals, and examined in a test set comprising the remaining 30%. Medications at admission were summarised based on reported adverse effect frequencies in published medication labelling. The Massachusetts health system. A total of 68 764 hospitalised individuals aged 35-65 years. Hospital readmission for fall-related injury defined by claims code. A total of 2052 individuals (3.0%) were hospitalised for fall-related injury within 90 days of discharge, and 3391 (4.9%) within 180 days. After recalibrating the model in a training data set comprised of 48 136 individuals (70%), model discrimination in the remaining 30% test set yielded an area under the receiver operating characteristic curve (AUC) of 0.74 (95% CI 0.72 to 0.76). AUCs were similar across age decades (0.71 to 0.78) and sex (0.72 male, 0.76 female), and across most common diagnostic categories other than psychiatry. For individuals in the highest risk quartile, 11.4% experienced a fall within 180 days versus 1.2% in the lowest risk quartile; 57.6% of falls occurred in the highest risk quartile. This analysis of state-wide claims data demonstrates the feasibility of predicting fall-related injury requiring hospitalisation using readily available sociodemographic and clinical details. This translatable approach to stratification allows for identification of
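The validation design (a 70/30 random split with discrimination measured by AUC) can be sketched generically. This is not the study's model, only the evaluation mechanics, with the AUC computed via the rank-based Mann-Whitney identity:

```python
import random

def train_test_split(records, train_frac=0.7, seed=0):
    # Randomly partition records into training and test sets
    rng = random.Random(seed)
    shuffled = records[:]
    rng.shuffle(shuffled)
    k = int(len(shuffled) * train_frac)
    return shuffled[:k], shuffled[k:]

def auc(labels, scores):
    # AUC equals the probability that a randomly chosen positive case
    # outranks a randomly chosen negative one (ties count half) --
    # the Mann-Whitney U identity.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

With risk scores from any refit model, `auc(outcomes, scores)` on the held-out 30% reproduces the kind of discrimination estimate (e.g., 0.74) reported above.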

  6. Laboratory and workplace assessments of rivet bucking bar vibration emissions.

    PubMed

    McDowell, Thomas W; Warren, Christopher; Xu, Xueyan S; Welcome, Daniel E; Dong, Ren G

    2015-04-01

    Sheet metal workers operating rivet bucking bars are at risk of developing hand and wrist musculoskeletal disorders associated with exposures to hand-transmitted vibrations and the forceful exertions required to operate these hand tools. New bucking bar technologies have been introduced in efforts to reduce workplace vibration exposures to these workers. However, the efficacy of these new bucking bar designs has not been well documented. While there are standardized laboratory-based methodologies for assessing the vibration emissions of many types of powered hand tools, no such standard exists for rivet bucking bars. Therefore, this study included the development of a laboratory-based method for assessing bucking bar vibrations which utilizes a simulated riveting task. With this method, this study evaluated three traditional steel bucking bars, three similarly shaped tungsten alloy bars, and three bars featuring spring-dampeners. For comparison, the bucking bar vibrations were also assessed during three typical riveting tasks at a large aircraft maintenance facility. The bucking bars were rank-ordered in terms of unweighted and frequency-weighted acceleration measured at the hand-tool interface. The results suggest that the developed laboratory method is a reasonable technique for ranking bucking bar vibration emissions; the lab-based riveting simulations produced similar rankings to the workplace rankings. However, the laboratory-based acceleration averages were considerably lower than the workplace measurements. These observations suggest that the laboratory test results are acceptable for comparing and screening bucking bars, but the laboratory measurements should not be directly used for assessing the risk of workplace bucking bar vibration exposures. The newer bucking bar technologies exhibited significantly reduced vibrations compared to the traditional steel bars. The results of this study, together with other information such as rivet quality, productivity, tool

  7. Quality Indicators in Laboratory Medicine: from theory to practice. Preliminary data from the IFCC Working Group Project "Laboratory Errors and Patient Safety".

    PubMed

    Sciacovelli, Laura; O'Kane, Maurice; Skaik, Younis Abdelwahab; Caciagli, Patrizio; Pellegrini, Cristina; Da Rin, Giorgio; Ivanov, Agnes; Ghys, Timothy; Plebani, Mario

    2011-05-01

    The adoption of Quality Indicators (QIs) has prompted the development of tools to measure and evaluate the quality and effectiveness of laboratory testing, first in the hospital setting and subsequently in ambulatory and other care settings. While Laboratory Medicine has an important role in the delivery of high-quality care, no consensus exists as yet on the use of QIs focussing on all steps of the laboratory total testing process (TTP), and further research in this area is required. In order to reduce errors in laboratory testing, the IFCC Working Group on "Laboratory Errors and Patient Safety" (WG-LEPS) developed a series of Quality Indicators specifically designed for clinical laboratories. In the first phase of the project, specific QIs for key processes of the TTP were identified, including all the pre-, intra- and post-analytical steps. The overall aim of the project is to create a common reporting system for clinical laboratories based on standardized data collection, and to define state-of-the-art and Quality Specifications (QSs) for each QI independent of: a) the size of the organization and type of activities; b) the complexity of the processes undertaken; and c) the different degrees of knowledge and ability of the staff. The aim of the present paper is to report the results collected from participating laboratories from February 2008 to December 2009 and to identify preliminary QSs. The results demonstrate that a model of Quality Indicators managed as an External Quality Assurance Program can serve as a tool to monitor and control pre-, intra- and post-analytical activities. It might also allow clinical laboratories to identify risks that lead to errors resulting in patient harm, enabling: the identification and design of practices that eliminate medical errors; the sharing of information and education of clinical and laboratory teams on practices that reduce or prevent errors; and the monitoring and evaluation of improvement activities.

  8. A review of genome-wide approaches to study the genetic basis for spermatogenic defects.

    PubMed

    Aston, Kenneth I; Conrad, Donald F

    2013-01-01

    Rapidly advancing tools for genetic analysis on a genome-wide scale have been instrumental in identifying the genetic bases for many complex diseases. About half of male infertility cases are of unknown etiology in spite of tremendous efforts to characterize the genetic basis for the disorder. Advancing our understanding of the genetic basis for male infertility will require the application of established and emerging genomic tools. This chapter introduces many of the tools available for genetic studies on a genome-wide scale along with principles of study design and data analysis.

  9. Wide-bandwidth, wide-beamwidth, high-resolution, millimeter-wave imaging for concealed weapon detection

    NASA Astrophysics Data System (ADS)

    Sheen, David M.; Fernandes, Justin L.; Tedeschi, Jonathan R.; McMakin, Douglas L.; Jones, A. Mark; Lechelt, Wayne M.; Severtsen, Ronald H.

    2013-05-01

    Active millimeter-wave imaging is currently being used for personnel screening at airports and other high-security facilities. The cylindrical imaging techniques used in the deployed systems are based on licensed technology developed at the Pacific Northwest National Laboratory. The cylindrical and a related planar imaging technique form three-dimensional images by scanning a diverging-beam, swept-frequency transceiver over a two-dimensional aperture and mathematically focusing, or reconstructing, the data into three-dimensional images of the person being screened. The resolution, clothing penetration, and image illumination quality obtained with these techniques can be significantly enhanced through the selection of the aperture size, antenna beamwidth, center frequency, and bandwidth. The lateral resolution can be improved by increasing the center frequency or by using a larger antenna beamwidth. The wide-beamwidth approach can significantly improve illumination quality relative to a higher-frequency system. Additionally, a wide antenna beamwidth allows for operation at a lower center frequency, resulting in less scattering and attenuation from the clothing. The depth resolution of the system can be improved by increasing the bandwidth. Utilization of extremely wide bandwidths of up to 30 GHz can result in depth resolution as fine as 5 mm. This wider bandwidth operation may allow for improved detection techniques based on high range resolution. In this paper, the results of an extensive imaging study that explored the advantages of using extremely wide beamwidth and bandwidth are presented, primarily for the 10-40 GHz frequency band.
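The quoted 5 mm depth resolution follows directly from the swept bandwidth: for a swept-frequency imager, range resolution is approximately c/(2B). The lateral-resolution expression below is the standard wide-beam holography approximation, included here as an assumption rather than a figure taken from this paper:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_resolution(bandwidth_hz):
    # Depth (range) resolution of a swept-frequency imager: c / (2B)
    return C / (2 * bandwidth_hz)

def lateral_resolution(center_freq_hz, full_beamwidth_rad):
    # Common holographic-imaging approximation:
    # lambda_c / (4 * sin(theta_b / 2)); wider beams focus more finely
    wavelength = C / center_freq_hz
    return wavelength / (4 * math.sin(full_beamwidth_rad / 2))

depth = range_resolution(30e9)  # ~5 mm for a 30 GHz sweep, as stated above
```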

  10. Maude: A Wide Spectrum Language for Secure Active Networks

    DTIC Science & Technology

    2002-08-01

    AFRL-IF-RS-TR-2002-197, Final Technical Report, August 2002. Maude: A Wide Spectrum Formal Language for Secure Active Networks. Authors: Jose Meseguer and Carolyn Talcott (SRI). ...specifications to address this challenge. We also show how, using the Maude rewriting logic language and tools, active network systems, languages, and

  11. [Development of laboratory sequence analysis software based on WWW and UNIX].

    PubMed

    Huang, Y; Gu, J R

    2001-01-01

    Sequence analysis tools based on WWW and UNIX were developed to meet the needs of molecular genetics research in our laboratory. General principles of computer analysis of DNA and protein sequences were also briefly discussed in this paper.

  12. Introducing New Learning Tools into a Standard Classroom: A Multi-Tool Approach to Integrating Fuel-Cell Concepts into Introductory College Chemistry

    ERIC Educational Resources Information Center

    D'Amato, Matthew J.; Lux, Kenneth W.; Walz, Kenneth A.; Kerby, Holly Walter; Anderegg, Barbara

    2007-01-01

    A multi-tool approach incorporating traditional lectures, multimedia learning objects, and a laboratory activity was introduced to convey the concepts surrounding hydrogen fuel-cell technology in college chemistry courses. The new tools are adaptable, facilitating use in different educational environments, and address a variety of learning styles to…

  13. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Groves, Paula; Valett, Jon

    1990-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory-software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. Subject and author indexes further classify these documents by specific topic and individual author.

  14. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon

    1993-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory: software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. This document contains an index of these publications classified by individual author.

  15. Accurate typing of short tandem repeats from genome-wide sequencing data and its applications.

    PubMed

    Fungtammasan, Arkarachai; Ananda, Guruprasad; Hile, Suzanne E; Su, Marcia Shu-Wei; Sun, Chen; Harris, Robert; Medvedev, Paul; Eckert, Kristin; Makova, Kateryna D

    2015-05-01

    Short tandem repeats (STRs) are implicated in dozens of human genetic diseases and contribute significantly to genome variation and instability. Yet profiling STRs from short-read sequencing data is challenging because of their high sequencing error rates. Here, we developed STR-FM, short tandem repeat profiling using flank-based mapping, a computational pipeline that can detect the full spectrum of STR alleles from short-read data, can adapt to emerging read-mapping algorithms, and can be applied to heterogeneous genetic samples (e.g., tumors, viruses, and genomes of organelles). We used STR-FM to study STR error rates and patterns in publicly available human and in-house generated ultradeep plasmid sequencing data sets. We discovered that STRs sequenced with a PCR-free protocol have up to ninefold fewer errors than those sequenced with a PCR-containing protocol. We constructed an error correction model for genotyping STRs that can distinguish heterozygous alleles containing STRs with consecutive repeat numbers. Applying our model and pipeline to Illumina sequencing data with 100-bp reads, we could confidently genotype several disease-related long trinucleotide STRs. Utilizing this pipeline, for the first time we determined the genome-wide STR germline mutation rate from a deeply sequenced human pedigree. Additionally, we built a tool that recommends minimal sequencing depth for accurate STR genotyping, depending on repeat length and sequencing read length. The required read depth increases with STR length and is lower for a PCR-free protocol. This suite of tools addresses the pressing challenges surrounding STR genotyping, and thus is of wide interest to researchers investigating disease-related STRs and STR evolution. © 2015 Fungtammasan et al.; Published by Cold Spring Harbor Laboratory Press.
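The core idea of flank-based mapping, anchoring on unique sequence flanking the repeat and counting motif copies in between, can be reduced to a toy function. This is a drastically simplified illustration, not STR-FM's actual pipeline (which handles read mapping, sequencing error models, and genotyping):

```python
from collections import Counter

def str_length_between_flanks(read, left_flank, right_flank, motif):
    # Locate the unique flanks, then count whole copies of the repeat
    # motif in the intervening sequence; return None if flanks are absent.
    i = read.find(left_flank)
    j = read.find(right_flank, i + len(left_flank)) if i != -1 else -1
    if i == -1 or j == -1:
        return None
    region = read[i + len(left_flank):j]
    n, k = 0, 0
    while region[k:k + len(motif)] == motif:
        n += 1
        k += len(motif)
    return n

def naive_genotype(read_lengths):
    # Naive consensus across reads: the most frequent repeat length
    counts = Counter(l for l in read_lengths if l is not None)
    return counts.most_common(1)[0][0]
```

In the real pipeline, the per-read length calls feed an error-correction model rather than a simple majority vote, which is what lets STR-FM distinguish heterozygous alleles with consecutive repeat numbers.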

  16. Circular Dichroism Spectroscopy: Enhancing a Traditional Undergraduate Biochemistry Laboratory Experience

    ERIC Educational Resources Information Center

    Lewis, Russell L.; Seal, Erin L.; Lorts, Aimee R.; Stewart, Amanda L.

    2017-01-01

    The undergraduate biochemistry laboratory curriculum is designed to provide students with experience in protein isolation and purification protocols as well as various data analysis techniques, which enhance the biochemistry lecture course and give students a broad range of tools upon which to build in graduate level laboratories or once they…

  17. Space Laboratory on a Table Top: A Next Generative ECLSS design and diagnostic tool

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically, it aims to build a high-fidelity tabletop model that can be used for risk mitigation, failure mode analysis, contamination tracking, and reliability testing. We envision a comprehensive approach that couples experimental work with numerical simulation to develop this diagnostic tool. It envisions a 10% scale transparent model of a space platform such as the International Space Station that operates with water, or a specific matched-index-of-refraction liquid, as the working fluid. This allows the scaling of a 10 ft x 10 ft x 10 ft room with air flow to a 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude for this length scale dictates model velocities of 67% of full scale, and thereby a model time scale of 15% of the full-scale system, meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index-matching fluid (a fluid that matches the refractive index of cast acrylic, the model material) makes the entire model, with its complex internal geometry, transparent and hence conducive to non-intrusive optical diagnostics. Using such a system, one can test environment control parameters such as core flows (axial flows) and cross flows (from registers and diffusers); probe potential problem areas such as flow short circuits, inadequate oxygen content, and build-up of other gases beyond desirable levels; test mixing processes within the system at local nodes or compartments; and assess overall system performance. The system allows quantitative measurements of contaminants introduced into the system and allows testing and optimizing of the tracking process and removal of contaminants. The envisaged system will be modular and hence flexible for quick configuration changes and subsequent testing. The data
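The 67% velocity and 15% time figures follow from Reynolds-number matching between air at full scale and water in the 10% model. A quick check, using approximate kinematic viscosities (assumed textbook values, not figures from the paper):

```python
NU_AIR = 1.5e-5    # kinematic viscosity of air, m^2/s (approximate)
NU_WATER = 1.0e-6  # kinematic viscosity of water, m^2/s (approximate)

def model_velocity_ratio(length_scale, nu_model=NU_WATER, nu_full=NU_AIR):
    # Matching Re = V * L / nu gives V_m / V_f = (L_f / L_m) * (nu_m / nu_f)
    return (1.0 / length_scale) * (nu_model / nu_full)

def model_time_scale(length_scale, velocity_ratio):
    # Time scales as L / V, so t_m / t_f = (L_m / L_f) / (V_m / V_f)
    return length_scale / velocity_ratio

vr = model_velocity_ratio(0.1)      # ~0.67: model runs at 67% of full-scale speed
ts = model_time_scale(0.1, vr)      # ~0.15: model events finish in 15% of the time
```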

  18. Marketing and Distribution: What About Training Plans in the DE Project Laboratory?

    ERIC Educational Resources Information Center

    Snyder, Ruth

    1977-01-01

    Managing a distributive education (DE) laboratory is a challenge. The laboratory is the simulated training station, with the instructor taking on the role of employer, managing student activities and learning. One tool to be utilized in managing a DE laboratory is a training plan. This article discusses the need for student training plans and the…

  19. Learn to Teach Chemistry Using Visual Media Tools

    ERIC Educational Resources Information Center

    Turkoguz, Suat

    2012-01-01

    The aim of this study was to investigate undergraduate students' attitudes to using visual media tools in the chemistry laboratory. One hundred and fifteen undergraduates studying science education at Dokuz Eylul University, Turkey, participated in the study. They video-recorded chemistry experiments with visual media tools and assessed them on a…

  20. Laboratory Education in New Zealand

    ERIC Educational Resources Information Center

    Borrmann, Thomas

    2008-01-01

    Laboratory work is one of the main forms of teaching used in chemistry, physics, biology and medicine. For many years researchers and teachers have argued in favor of or against this form of education. Student opinion could be a valuable tool for teachers to demonstrate the validity of such expensive and work intensive forms of education as…

  1. A surgical skills laboratory improves residents' knowledge and performance of episiotomy repair.

    PubMed

    Banks, Erika; Pardanani, Setul; King, Mary; Chudnoff, Scott; Damus, Karla; Freda, Margaret Comerford

    2006-11-01

    This study was undertaken to assess whether a surgical skills laboratory improves residents' knowledge and performance of episiotomy repair. Twenty-four first- and second-year residents were randomly assigned to either a surgical skills laboratory on episiotomy repair or traditional teaching alone. Pre- and posttests assessed basic knowledge. Blinded attending physicians assessed performance, evaluating residents on second-degree laceration/episiotomy repairs in the clinical setting with 3 validated tools: a task-specific checklist, global rating scale, and a pass-fail grade. Postgraduate year 1 (PGY-1) residents participating in the laboratory scored significantly better on all 3 surgical assessment tools: the checklist, the global score, and the pass/fail analysis. All the residents who had the teaching laboratory demonstrated significant improvements on knowledge and the skills checklist. PGY-2 residents did not benefit as much as PGY-1 residents. A surgical skills laboratory improved residents' knowledge and performance in the clinical setting. Improvement was greatest for PGY-1 residents.

  2. Laboratory Diagnosis of Congenital Toxoplasmosis.

    PubMed

    Pomares, Christelle; Montoya, Jose G

    2016-10-01

    Recent studies have demonstrated that screening and treatment for toxoplasmosis during gestation result in a decrease of vertical transmission and clinical sequelae. Early treatment was associated with improved outcomes. Thus, laboratory methods should aim for early identification of infants with congenital toxoplasmosis (CT). Diagnostic approaches should include, at least, detection of Toxoplasma IgG, IgM, and IgA and a comprehensive review of maternal history, including the gestational age at which the mother was infected and treatment. Here, we review laboratory methods for the diagnosis of CT, with emphasis on serological tools. A diagnostic algorithm that takes into account maternal history is presented. Copyright © 2016 Pomares and Montoya.

  3. Laboratory Diagnosis of Congenital Toxoplasmosis

    PubMed Central

    Pomares, Christelle

    2016-01-01

    Recent studies have demonstrated that screening and treatment for toxoplasmosis during gestation result in a decrease of vertical transmission and clinical sequelae. Early treatment was associated with improved outcomes. Thus, laboratory methods should aim for early identification of infants with congenital toxoplasmosis (CT). Diagnostic approaches should include, at least, detection of Toxoplasma IgG, IgM, and IgA and a comprehensive review of maternal history, including the gestational age at which the mother was infected and treatment. Here, we review laboratory methods for the diagnosis of CT, with emphasis on serological tools. A diagnostic algorithm that takes into account maternal history is presented. PMID:27147724

  4. A Systematic Approach to Capacity Strengthening of Laboratory Systems for Control of Neglected Tropical Diseases in Ghana, Kenya, Malawi and Sri Lanka

    PubMed Central

    Njelesani, Janet; Dacombe, Russell; Palmer, Tanith; Smith, Helen; Koudou, Benjamin; Bockarie, Moses; Bates, Imelda

    2014-01-01

    Background The lack of capacity in laboratory systems is a major barrier to achieving the aims of the London Declaration (2012) on neglected tropical diseases (NTDs). To counter this, capacity strengthening initiatives have been carried out in NTD laboratories worldwide. Many of these initiatives focus on individuals' skills or institutional processes and structures ignoring the crucial interactions between the laboratory and the wider national and international context. Furthermore, rigorous methods to assess these initiatives once they have been implemented are scarce. To address these gaps we developed a set of assessment and monitoring tools that can be used to determine the capacities required and achieved by laboratory systems at the individual, organizational, and national/international levels to support the control of NTDs. Methodology and principal findings We developed a set of qualitative and quantitative assessment and monitoring tools based on published evidence on optimal laboratory capacity. We implemented the tools with laboratory managers in Ghana, Malawi, Kenya, and Sri Lanka. Using the tools enabled us to identify strengths and gaps in the laboratory systems from the following perspectives: laboratory quality benchmarked against ISO 15189 standards, the potential for the laboratories to provide support to national and regional NTD control programmes, and the laboratory's position within relevant national and international networks and collaborations. Conclusion We have developed a set of mixed methods assessment and monitoring tools based on evidence derived from the components needed to strengthen the capacity of laboratory systems to control NTDs. Our tools help to systematically assess and monitor individual, organizational, and wider system level capacity of laboratory systems for NTD control and can be applied in different country contexts. PMID:24603407

  5. Wiki Laboratory Notebooks: Supporting Student Learning in Collaborative Inquiry-Based Laboratory Experiments

    NASA Astrophysics Data System (ADS)

    Lawrie, Gwendolyn Angela; Grøndahl, Lisbeth; Boman, Simon; Andrews, Trish

    2016-06-01

    Recent examples of high-impact teaching practices in the undergraduate chemistry laboratory that include course-based undergraduate research experiences and inquiry-based experiments require new approaches to assessing individual student learning outcomes. Instructors require tools and strategies that can provide them with insight into individual student contributions to collaborative group/teamwork throughout the processes of experimental design, data analysis, display and communication of their outcomes in relation to their research question(s). Traditional assessments in the form of laboratory notebooks or experimental reports provide limited insight into the processes of collaborative inquiry-based activities. A wiki environment offers a collaborative domain that can potentially support collaborative laboratory processes and scientific record keeping. In this study, the effectiveness of the wiki in supporting laboratory learning and assessment has been evaluated through analysis of the content and histories for three consenting, participating groups of students. The conversational framework has been applied to map the relationships between the instructor, tutor, students and laboratory activities. Analytics that have been applied to the wiki platform include: character counts, page views, edits, timelines and the extent and nature of the contribution by each student to the wiki. Student perceptions of both the role and the impact of the wiki on their experiences and processes have also been collected. Evidence has emerged from this study that the wiki environment has enhanced co-construction of understanding of both the experimental process and subsequent communication of outcomes and data. A number of features are identified to support success in the use of the wiki platform for laboratory notebooks.

  6. WaveAR: A software tool for calculating parameters for water waves with incident and reflected components

    NASA Astrophysics Data System (ADS)

    Landry, Blake J.; Hancock, Matthew J.; Mei, Chiang C.; García, Marcelo H.

    2012-09-01

    The ability to determine wave heights and phases along a spatial domain is vital to understanding a wide range of littoral processes. The software tool presented here employs established Stokes wave theory and sampling methods to calculate parameters for the incident and reflected components of a field of weakly nonlinear waves, monochromatic at first order in wave slope and propagating in one horizontal dimension. The software calculates wave parameters over an entire wave tank and accounts for reflection, weak nonlinearity, and a free second harmonic. Currently, no publicly available program has such functionality. The included MATLAB®-based open source code has also been compiled for Windows®, Mac® and Linux® operating systems. An additional companion program, VirtualWave, is included to generate virtual wave fields for WaveAR. Together, the programs serve as ideal analysis and teaching tools for laboratory water wave systems.
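WaveAR itself fits incident and reflected components over an entire tank using Stokes theory; the underlying separation principle, however, is visible in the classic two-gauge method, sketched here under linear theory as an illustration rather than a description of WaveAR's code. With complex amplitudes A1 and A2 measured at two gauges a distance dx apart, the incident and reflected amplitudes solve a 2x2 linear system:

```python
import cmath

def separate_incident_reflected(A1, A2, k, dx):
    # Two-gauge separation (linear theory): the surface amplitude is
    # A(x) = a_i * exp(-i k x) + a_r * exp(+i k x), sampled at x and x + dx.
    # Singular when k * dx is a multiple of pi (gauges half a wavelength apart).
    e_plus = cmath.exp(1j * k * dx)
    e_minus = cmath.exp(-1j * k * dx)
    denom = e_plus - e_minus
    a_i = (A1 * e_plus - A2) / denom
    a_r = (A2 - A1 * e_minus) / denom
    return a_i, a_r
```

The reflection coefficient then follows as `abs(a_r) / abs(a_i)`; fitting over many gauge positions, as WaveAR does, makes the estimate robust to noise and weak nonlinearity.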

  7. [Assessment of a supervision grid being used in the laboratories of cutaneous leishmaniasis in Morocco].

    PubMed

    El Mansouri, Bouchra; Amarir, Fatima; Hajli, Yamina; Fellah, Hajiba; Sebti, Faiza; Delouane, Bouchra; Sadak, Abderrahim; Adlaoui, El Bachir; Rhajaoui, Mohammed

    2017-01-01

    The aim of our study was to assess a standardized supervisory grid as a new supervision tool for the laboratories diagnosing leishmaniasis. We conducted a pilot trial to evaluate the performance of seven provincial laboratories in four provinces in Morocco over the period from 2006 to 2014. This study detailed the situation in the provincial laboratories before and after the implementation of the supervisory grid. A total of twenty-one grids were analyzed. In 2006, the results clearly showed poor laboratory performance: need for training (41.6%), staff performing skin biopsy (25%), shortage of materials and reagents (65%), and non-compliant documentation and local management (85%). Several corrective actions were conducted by the National Reference Laboratory (LNRL) of Leishmaniasis during the study period. In 2014, the LNRL recorded a clear improvement in laboratory performance. The needs for training, biopsy quality, and the supply of tools and reagents were met, and effective coordination was established between the LNRL and the provincial laboratories. This trial shows the effectiveness of the grid as a high-quality supervisory tool and as a cornerstone for progress in control programmes against leishmaniasis.

  8. Artificial intelligence within the chemical laboratory.

    PubMed

    Winkel, P

    1994-01-01

    Various techniques within the area of artificial intelligence, such as expert systems and neural networks, may play a role in the problem-solving processes within the clinical biochemical laboratory. Neural network analysis provides a non-algorithmic approach to information processing, which results in the ability of the computer to form associations and to recognize patterns or classes among data. It belongs to the machine learning techniques, which also include probabilistic techniques such as discriminant function analysis and logistic regression, and information-theoretical techniques. These techniques may be used to extract knowledge from example patients to optimize decision limits and identify clinically important laboratory quantities. An expert system may be defined as a computer program that can give advice in a well-defined area of expertise and is able to explain its reasoning. Declarative knowledge consists of statements about logical or empirical relationships between things. Expert systems typically separate declarative knowledge, residing in a knowledge base, from the inference engine: an algorithm that dynamically directs and controls the system when it searches its knowledge base. A tool is an expert system without a knowledge base; the developer of an expert system uses a tool by entering knowledge into the system. Many, if not the majority, of the problems encountered at the laboratory level are procedural. A problem is procedural if it is possible to write up a step-by-step description of the expert's work or if it can be represented by a decision tree. To solve problems of this type, only small expert system tools and/or conventional programming are required.(ABSTRACT TRUNCATED AT 250 WORDS)
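The separation of a declarative knowledge base from an inference engine described in this abstract can be made concrete with a minimal forward-chaining loop. The rules and facts below are invented illustrations, not drawn from any real laboratory system:

```python
def forward_chain(facts, rules):
    # Minimal forward-chaining inference engine. Rules are
    # (set_of_premises, conclusion) pairs; fire rules until no new
    # conclusion can be added (a fixed point is reached).
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical declarative knowledge base for a lab decision
rules = [
    ({"high_glucose", "fasting"}, "suspect_diabetes"),
    ({"suspect_diabetes"}, "order_hba1c"),
]
derived = forward_chain({"high_glucose", "fasting"}, rules)
```

Swapping in a different rule list changes the system's expertise without touching the engine, which is exactly the knowledge-base/inference-engine split the abstract describes.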

  9. Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling

    NASA Technical Reports Server (NTRS)

    Glaab, Patricia; Madden, Michael

    2014-01-01

    The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof-of-concept, the LaSRS++ NAS-wide simulation is maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and to promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulations are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.

  10. Facilitating quality control for spectra assignments of small organic molecules: nmrshiftdb2--a free in-house NMR database with integrated LIMS for academic service laboratories.

    PubMed

    Kuhn, Stefan; Schlörer, Nils E

    2015-08-01

    nmrshiftdb2 supports with its laboratory information management system the integration of an electronic lab administration and management into academic NMR facilities. Also, it offers the setup of a local database, while full access to nmrshiftdb2's World Wide Web database is granted. This freely available system allows on the one hand the submission of orders for measurement, transfers recorded data automatically or manually, and enables download of spectra via web interface, as well as the integrated access to prediction, search, and assignment tools of the NMR database for lab users. On the other hand, for the staff and lab administration, flow of all orders can be supervised; administrative tools also include user and hardware management, a statistic functionality for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. Laboratory information management system and database are based on a web interface as front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.

  11. Lessons learned in deploying software estimation technology and tools

    NASA Technical Reports Server (NTRS)

    Panlilio-Yap, Nikki; Ho, Danny

    1994-01-01

    Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.

  12. Tool bending in New Caledonian crows.

    PubMed

    Rutz, Christian; Sugasawa, Shoko; van der Wal, Jessica E M; Klump, Barbara C; St Clair, James J H

    2016-08-01

    'Betty' the New Caledonian crow astonished the world when she 'spontaneously' bent straight pieces of garden wire into hooked foraging tools. Recent field experiments have revealed that tool bending is part of the species' natural behavioural repertoire, providing important context for interpreting Betty's iconic wire-bending feat. More generally, this discovery provides a compelling illustration of how natural history observations can inform laboratory-based research into the cognitive capacities of non-human animals.

  13. Concept and development of a discharge alert filter for abnormal laboratory values coupled with computerized provider order entry: a tool for quality improvement and hospital risk management.

    PubMed

    Mathew, George; Kho, Abel; Dexter, Paul; Bloodworth, Nathaniel; Fantz, Corinne; Spell, Nathan; LaBorde, David V

    2012-06-01

    To develop a clinical decision support system activated at the time of discharge to reduce potentially inappropriate discharges from unidentified or unaddressed abnormal laboratory values. We identified 106 laboratory tests for possible inclusion in the discharge alert filter. We selected 7 tests as widely available, commonly obtained, and associated with high risk for potential morbidity or mortality within abnormal ranges. We identified trigger thresholds at levels that would capture significant laboratory abnormalities while avoiding excessive flag generation from laboratory results that deviate only minimally outside the normal reference range. We selected sodium (>155 or <125 mmol/L), potassium (<2.5 or >6 mEq/L), phosphorus (<1.6 mg/dL), magnesium (<1.2 mg/dL), creatinine (>1.1 with a rise of 20% or more between the 2 most recent results), white blood cell count (>11,000 cells/mm³ with a rise of 20% or more between the 2 most recent results), and international normalized ratio (>4). A discharge alert filter that reliably and effectively identifies patients who may be discharged in unsafe situations because of unaddressed critical laboratory values can improve patient safety at discharge and potentially reduce the incidence of costly litigation. Further research is needed to validate whether the proposed discharge alert filter is effective at improving patient safety at discharge.
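
    The threshold logic described above is simple enough to sketch directly. The following is a hypothetical illustration (the function and field names are assumptions, not the authors' implementation) of how the seven trigger rules, including the two rise-of-20%-or-more delta rules, could be encoded:

```python
def discharge_flags(results):
    """Hypothetical sketch of the abstract's discharge alert thresholds.
    results: dict mapping test name to a list of values, most recent last.
    Returns the names of tests that would trigger a discharge alert."""
    flags = []

    def latest(name):
        vals = results.get(name, [])
        return vals[-1] if vals else None

    def rose_20pct(name):
        # Delta rule: latest value rose >= 20% over the previous result.
        vals = results.get(name, [])
        return len(vals) >= 2 and vals[-2] > 0 and (vals[-1] - vals[-2]) / vals[-2] >= 0.20

    na = latest("sodium")                     # mmol/L
    if na is not None and (na > 155 or na < 125):
        flags.append("sodium")
    k = latest("potassium")                   # mEq/L
    if k is not None and (k < 2.5 or k > 6):
        flags.append("potassium")
    phos = latest("phosphorus")               # mg/dL
    if phos is not None and phos < 1.6:
        flags.append("phosphorus")
    mg = latest("magnesium")                  # mg/dL
    if mg is not None and mg < 1.2:
        flags.append("magnesium")
    cr = latest("creatinine")
    if cr is not None and cr > 1.1 and rose_20pct("creatinine"):
        flags.append("creatinine")
    wbc = latest("wbc")                       # cells/mm^3
    if wbc is not None and wbc > 11000 and rose_20pct("wbc"):
        flags.append("wbc")
    inr = latest("inr")
    if inr is not None and inr > 4:
        flags.append("inr")
    return flags
```

    For instance, a creatinine series of [1.0, 1.3] would trigger a flag in this sketch, because the latest value exceeds 1.1 and rose 30% over the previous result.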

  14. Real-Time Internet Mediated Laboratory Experiments for Distance Education Students.

    ERIC Educational Resources Information Center

    Lemckert, Charles; Florance, John

    2002-01-01

    Discusses the demand for distance education opportunities in engineering and science and considers delivery methods for theoretical content and for laboratory work. Explains the Real-Time Internet Mediated Laboratory Experiments (RTIMLE) that use the World Wide Web, and suggests that RTIMLE may be most appropriate for students who already have…

  15. Clean Cities Tools: Tools to Help You Save Money, Use Less Petroleum, and Reduce Emissions (Brochure)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-01-01

    The Clean Cities Alternative Fuels and Advanced Vehicles Data Center (AFDC) features a wide range of Web-based tools to help vehicle fleets and individual consumers reduce their petroleum use. This brochure lists and describes Clean Cities online tools related to vehicles, alternative fueling stations, electric vehicle charging stations, fuel conservation, emissions reduction, fuel economy, and more.

  16. Practical marketing for dentistry. 9. Marketing communication tools.

    PubMed

    Ball, R

    1996-09-21

    There are many communication tools available at a wide range of costs. This article discusses a selection of the tools available and also examines the most important first step: identifying your target audience.

  17. Reading the Writing on the Graffiti Wall: The World Wide Web and Training.

    ERIC Educational Resources Information Center

    Jones, Charles M.

    This paper examines the benefits to be derived from networked computer-based instruction (CBI) and discusses the potential of the World Wide Web (WWW) as an effective tool in employee training. Methods of utilizing the WWW as a training tool and communication tool are explored. The discussion is divided into the following sections: (1) "WWW and…

  18. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Kistler, David; Bristow, John; Smith, Don

    1994-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  19. Wide-Field Raman Imaging of Dental Lesions

    PubMed Central

    Yang, Shan; Li, Bolan; Akkus, Anna; Akkus, Ozan; Lang, Lisa

    2014-01-01

    Detection of dental caries at the onset remains a great challenge in dentistry. Raman spectroscopy could be applied successfully to detect caries, since it is sensitive to the amount of Raman-active mineral crystals, the most abundant component of enamel. Effective diagnosis requires full examination of a tooth surface via Raman mapping. Point-scan Raman mapping is not clinically feasible due to lengthy data acquisition times. In this work, a wide-field Raman imaging system was assembled based on a high-sensitivity 2D CCD camera for imaging the mineralization status of teeth with lesions. Wide-field images indicated some lesions to be hypomineralized and others to be hypermineralized. The observations of wide-field Raman imaging were in agreement with point-scan Raman mapping. Therefore, sound enamel and lesions can be discriminated by Raman imaging of the mineral content. In conclusion, wide-field Raman imaging is a potentially useful tool for visualization of dental lesions in the clinic. PMID:24781363

  20. The Protein Information Management System (PiMS): a generic tool for any structural biology research laboratory

    PubMed Central

    Morris, Chris; Pajon, Anne; Griffiths, Susanne L.; Daniel, Ed; Savitsky, Marc; Lin, Bill; Diprose, Jonathan M.; Wilter da Silva, Alan; Pilicheva, Katya; Troshin, Peter; van Niekerk, Johannes; Isaacs, Neil; Naismith, James; Nave, Colin; Blake, Richard; Wilson, Keith S.; Stuart, David I.; Henrick, Kim; Esnouf, Robert M.

    2011-01-01

    The techniques used in protein production and structural biology have been developing rapidly, but techniques for recording the laboratory information produced have not kept pace. One approach is the development of laboratory information-management systems (LIMS), which typically use a relational database schema to model and store results from a laboratory workflow. The underlying philosophy and implementation of the Protein Information Management System (PiMS), a LIMS development specifically targeted at the flexible and unpredictable workflows of protein-production research laboratories of all scales, is described. PiMS is a web-based Java application that uses either Postgres or Oracle as the underlying relational database-management system. PiMS is available under a free licence to all academic laboratories either for local installation or for use as a managed service. PMID:21460443

  1. The Protein Information Management System (PiMS): a generic tool for any structural biology research laboratory.

    PubMed

    Morris, Chris; Pajon, Anne; Griffiths, Susanne L; Daniel, Ed; Savitsky, Marc; Lin, Bill; Diprose, Jonathan M; da Silva, Alan Wilter; Pilicheva, Katya; Troshin, Peter; van Niekerk, Johannes; Isaacs, Neil; Naismith, James; Nave, Colin; Blake, Richard; Wilson, Keith S; Stuart, David I; Henrick, Kim; Esnouf, Robert M

    2011-04-01

    The techniques used in protein production and structural biology have been developing rapidly, but techniques for recording the laboratory information produced have not kept pace. One approach is the development of laboratory information-management systems (LIMS), which typically use a relational database schema to model and store results from a laboratory workflow. The underlying philosophy and implementation of the Protein Information Management System (PiMS), a LIMS development specifically targeted at the flexible and unpredictable workflows of protein-production research laboratories of all scales, is described. PiMS is a web-based Java application that uses either Postgres or Oracle as the underlying relational database-management system. PiMS is available under a free licence to all academic laboratories either for local installation or for use as a managed service.

  2. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
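
    Of the three models the tool ranks, simple linear regression is the easiest to illustrate. Below is a minimal, hypothetical pure-Python sketch (not the tool's own code) of least-squares trend fitting over historic test volumes, extrapolated a few periods ahead:

```python
def linear_forecast(volumes, horizon):
    """Fit y = a + b*t by ordinary least squares to historic test volumes
    (t = 0, 1, 2, ...) and extrapolate `horizon` future periods."""
    n = len(volumes)
    t_mean = (n - 1) / 2
    y_mean = sum(volumes) / n
    # Slope b = covariance(t, y) / variance(t); intercept a follows.
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(volumes))
    den = sum((t - t_mean) ** 2 for t in range(n))
    b = num / den
    a = y_mean - b * t_mean
    return [a + b * (n + h) for h in range(horizon)]
```

    A Holt-Winters variant would additionally smooth level, trend and a seasonal component, which is why the tool ranks the three fits against the data before highlighting the best one.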

  3. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis

    PubMed Central

    Mohammed, Emad A.; Naugler, Christopher

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996

  4. Tool bending in New Caledonian crows

    PubMed Central

    Sugasawa, Shoko; van der Wal, Jessica E. M.; Klump, Barbara C.; St Clair, James J. H.

    2016-01-01

    ‘Betty’ the New Caledonian crow astonished the world when she ‘spontaneously’ bent straight pieces of garden wire into hooked foraging tools. Recent field experiments have revealed that tool bending is part of the species' natural behavioural repertoire, providing important context for interpreting Betty's iconic wire-bending feat. More generally, this discovery provides a compelling illustration of how natural history observations can inform laboratory-based research into the cognitive capacities of non-human animals. PMID:27853622

  5. Developing and Demonstrating an Augmented Reality Colorimetric Titration Tool

    ERIC Educational Resources Information Center

    Tee, Nicholas Yee Kwang; Gan, Hong Seng; Li, Jonathan; Cheong, Brandon Huey-Ping; Tan, Han Yen; Liew, Oi Wah; Ng, Tuck Wah

    2018-01-01

    The handling of chemicals in the laboratory presents a challenge in instructing large class sizes and when students are relatively new to the laboratory environment. In this work, we describe and demonstrate an augmented reality colorimetric titration tool that operates out of the smartphone or tablet of students. It allows multiple students to…

  6. Doing laboratory ethnography: reflections on method in scientific workplaces.

    PubMed

    Stephens, Neil; Lewis, Jamie

    2017-04-01

    Laboratory ethnography extended the social scientist's gaze into the day-to-day accomplishment of scientific practice. Here we reflect upon our own ethnographies of biomedical scientific workspaces to provoke methodological discussion on the doing of laboratory ethnography. What we provide is less a 'how to' guide and more a commentary on what to look for and what to look at. We draw upon our empirical research with stem cell laboratories and animal houses, teams producing robotic surgical tools, musicians sonifying data science, a psychiatric genetics laboratory, and scientists developing laboratory-grown meat. We use these cases to exemplify a set of potential ethnographic themes worthy of pursuit: science epistemics and the extended laboratory, the interaction order of scientific work, sensory realms and the rendering of science as sensible, conferences as performative sites, and the spaces, places and temporalities of scientific work.

  7. Educational websites--Bioinformatics Tools II.

    PubMed

    Lomberk, Gwen

    2009-01-01

    In this issue, the highlighted websites continue a series of educational websites, in particular Bioinformatics Tools [Pancreatology 2005;5:314-315], published a couple of years ago. These include sites that are valuable resources for many research needs in genomics and proteomics. Bioinformatics has become a laboratory tool used to map sequences to databases, develop models of molecular interactions, evaluate structural compatibilities, describe differences between normal and disease-associated DNA, identify conserved motifs within proteins, and chart extensive signaling networks, all in silico. Copyright 2008 S. Karger AG, Basel and IAP.

  8. An automated genotyping tool for enteroviruses and noroviruses.

    PubMed

    Kroneman, A; Vennema, H; Deforche, K; v d Avoort, H; Peñaranda, S; Oberste, M S; Vinjé, J; Koopmans, M

    2011-06-01

    Molecular techniques are established as routine in virological laboratories, and virus typing through (partial) sequence analysis is increasingly common. Quality assurance for the use of typing data requires harmonization of genotype nomenclature, agreement on target genes (depending on the level of resolution required), and robustness of methods. To develop and validate web-based open-access typing tools for enteroviruses and noroviruses. An automated web-based typing algorithm was developed, starting with BLAST analysis of the query sequence against a reference set of sequences from viruses in the family Picornaviridae or Caliciviridae. The second step is phylogenetic analysis of the query sequence and a subset of the reference sequences, to assign the enterovirus type or norovirus genotype and/or variant, with profile alignment, construction of phylogenetic trees and bootstrap validation. Typing is performed on VP1 sequences of Human enterovirus A to D, and ORF1 and ORF2 sequences of genogroup I and II noroviruses. For validation, we used the tools to automatically type sequences in the RIVM and CDC enterovirus databases and the FBVE norovirus database. Using the typing tools, 785 (99%) of 795 enterovirus VP1 sequences and 8154 (98.5%) of 8342 norovirus sequences were typed in accordance with previously used methods. Subtyping into variants was achieved for 4439 (78.4%) of 5838 NoV GII.4 sequences. The online typing tools reliably assign genotypes for enteroviruses and noroviruses. The use of phylogenetic methods makes these tools robust to ongoing evolution. This should facilitate standardized genotyping and nomenclature in clinical and public health laboratories, thus supporting inter-laboratory comparisons. Copyright © 2011 Elsevier B.V. All rights reserved.
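
    The first of the two steps, similarity search against a reference set, can be caricatured with a toy nearest-reference classifier. This sketch uses k-mer Jaccard similarity as a crude stand-in for BLAST scoring (the genotype labels and sequences are invented for illustration; the real tool follows up with profile alignment and bootstrapped phylogenetic placement):

```python
def kmer_set(seq, k=6):
    """All overlapping substrings of length k in a nucleotide sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def assign_genotype(query, references, k=6):
    """Assign the query to the reference genotype with the highest
    k-mer Jaccard similarity. references: dict genotype -> sequence.
    A crude stand-in for the BLAST step of an automated typing tool."""
    q = kmer_set(query, k)

    def score(ref_seq):
        r = kmer_set(ref_seq, k)
        return len(q & r) / len(q | r) if q | r else 0.0

    return max(references, key=lambda g: score(references[g]))
```

    In practice the BLAST hit only narrows the candidate set; the phylogenetic second step is what keeps assignments robust as the viruses evolve away from any fixed reference.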

  9. Multi-modal virtual environment research at Armstrong Laboratory

    NASA Technical Reports Server (NTRS)

    Eggleston, Robert G.

    1995-01-01

    One mission of the Paul M. Fitts Human Engineering Division of Armstrong Laboratory is to improve the user interface for complex systems through user-centered exploratory development and research activities. In support of this goal, many current projects attempt to advance and exploit user-interface concepts made possible by virtual reality (VR) technologies. Virtual environments may be used as a general purpose interface medium, an alternative display/control method, a data visualization and analysis tool, or a graphically based performance assessment tool. An overview is given of research projects within the division on prototype interface hardware/software development, integrated interface concept development, interface design and evaluation tool development, and user and mission performance evaluation tool development.

  10. HEMODOSE: A Set of Multi-parameter Biodosimetry Tools

    NASA Technical Reports Server (NTRS)

    Hu, Shaowen; Blakely, William F.; Cucinotta, Francis A.

    2014-01-01

    After the events of September 11, 2001 and recent events at the Fukushima reactors in Japan, there is increasing concern that nuclear and radiological terrorism or accidents may result in large numbers of casualties in densely populated areas. To guide medical personnel in their clinical decisions for effective medical management and treatment of exposed individuals, biological markers are usually applied to examine radiation-induced changes at different biological levels. Among these, the peripheral blood cell counts are widely used to assess the extent of radiation-induced injury, since the hematopoietic system is the most vulnerable part of the human body to radiation damage. In particular, the lymphocyte, granulocyte, and platelet cells are the most radiosensitive of the blood elements, and monitoring their changes after exposure is regarded as the most practical laboratory test to estimate radiation dose. The HEMODOSE web tools are built upon solid physiological and pathophysiological understanding of mammalian hematopoietic systems, and rigorous coarse-grained biomathematical modeling and validation. Using single or serial granulocyte, lymphocyte, leukocyte, or platelet counts after exposure, these tools can estimate absorbed doses of adult victims very rapidly and accurately. Some patient data from historical accidents are utilized as examples to demonstrate the capabilities of these tools as a rapid point-of-care diagnostic or centralized high-throughput assay system in a large-scale radiological disaster scenario. Unlike previous dose prediction algorithms, the HEMODOSE web tools establish robust correlations between the absorbed doses and a victim's various types of blood cell counts not only in the early time window (1 or 2 days) but also in the very late phase (up to 4 weeks) after exposure.

  11. HEMODOSE: A Set of Multi-parameter Biodosimetry Tools

    NASA Technical Reports Server (NTRS)

    Hu, Shaowen; Blakely, William F.; Cucinotta, Francis A.

    2014-01-01

    There continues to be important concerns of the possibility of the occurrence of acute radiation syndromes following nuclear and radiological terrorism or accidents that may result in mass casualties in densely populated areas. To guide medical personnel in their clinical decisions for effective medical management and treatment of the exposed individuals, biological markers are usually applied to examine radiation induced biological changes to assess the severity of radiation injury to sensitive organ systems. Among these the peripheral blood cell counts are widely used to assess the extent of radiation induced bone marrow (BM) injury. This is due to the fact that hematopoietic system is a vulnerable part of the human body to radiation damage. Particularly, the lymphocyte, granulocyte, and platelet cells are the most radiosensitive of the blood elements, and monitoring their changes after exposure is regarded as a practical and recommended laboratory test to estimate radiation dose and injury. In this work we describe the HEMODOSE web tools, which are built upon solid physiological and pathophysiological understanding of mammalian hematopoietic systems, and rigorous coarse-grained biomathematical modeling and validation. Using single or serial granulocyte, lymphocyte, leukocyte, or platelet counts after exposure, these tools can estimate absorbed doses of adult victims very rapidly and accurately to assess the severity of BM radiation injury. Some patient data from historical accidents are utilized as examples to demonstrate the capabilities of these tools as a rapid point-of-care diagnostic or centralized high-throughput assay system in a large-scale radiological disaster scenario. HEMODOSE web tools establish robust correlations between the absorbed doses and victim's various types of blood cell counts not only in the early time window (1 or 2 days), but also in very late phase (up to 4 weeks) after exposure.

  12. Software Engineering Laboratory (SEL) data and information policy

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank

    1991-01-01

    The policies and overall procedures that are used in distributing and in making available products of the Software Engineering Laboratory (SEL) are discussed. The products include project data and measures, project source code, reports, and software tools.

  13. World Wide Web Indexes and Hierarchical Lists: Finding Tools for the Internet.

    ERIC Educational Resources Information Center

    Munson, Kurt I.

    1996-01-01

    In World Wide Web indexing: (1) the creation process is automated; (2) the indexes are merely descriptive, not analytical of document content; (3) results may be sorted differently depending on the search engine; and (4) indexes link directly to the resources. This article compares the indexing methods and querying options of the search engines…

  14. International standards for tuberculosis care: relevance and implications for laboratory professionals.

    PubMed

    Pai, M; Daley, P; Hopewell, P C

    2007-04-01

    On World Tuberculosis (TB) Day 2006, the International Standards for Tuberculosis Care (ISTC) was officially released and widely endorsed by several agencies and organizations. The ISTC release was the culmination of a year-long global effort to develop and set internationally acceptable, evidence-based standards for tuberculosis care. The ISTC describes a widely endorsed level of care that all practitioners, public and private, should seek to achieve in managing individuals who have, or are suspected of having, TB, and is intended to facilitate the effective engagement of all healthcare providers in delivering high-quality care for patients of all ages, including those with smear-positive, smear-negative and extra-pulmonary TB, TB caused by drug-resistant Mycobacterium tuberculosis, and TB/HIV coinfection. In this article, we present the ISTC, with a special focus on the diagnostic standards, and describe their implications and relevance for laboratory professionals in India and worldwide. Laboratory professionals play a critical role in ensuring that all the standards are actually met, by providing high-quality laboratory services for smear microscopy, culture and drug susceptibility testing, and other services such as testing for HIV infection. In fact, if the ISTC is widely followed, it can be expected that there will be greater need and demand for quality-assured laboratory services, with obvious implications for all laboratories in terms of workload, requirements for resources and trained personnel, and organization of quality assurance systems.

  15. CLIPS: An expert system building tool

    NASA Technical Reports Server (NTRS)

    Riley, Gary

    1991-01-01

    The C Language Integrated Production System (CLIPS) is an expert system building tool, which provides a complete environment for the development and delivery of rule and/or object based expert systems. CLIPS was specifically designed to provide a low cost option for developing and deploying expert system applications across a wide range of hardware platforms. The commercial potential of CLIPS is vast. Currently, CLIPS is being used by over 3,300 individuals throughout the public and private sector. Because the CLIPS source code is readily available, numerous groups have used CLIPS as a basis for their own expert system tools. To date, three commercially available tools have been derived from CLIPS. In general, the development of CLIPS has helped to improve the ability to deliver expert system technology throughout the public and private sectors for a wide range of applications and diverse computing environments.

  16. Automating testbed documentation and database access using World Wide Web (WWW) tools

    NASA Technical Reports Server (NTRS)

    Ames, Charles; Auernheimer, Brent; Lee, Young H.

    1994-01-01

    A method for providing uniform transparent access to disparate distributed information systems was demonstrated. A prototype testing interface was developed to access documentation and information using publicly available hypermedia tools. The prototype gives testers a uniform, platform-independent user interface to on-line documentation, user manuals, and mission-specific test and operations data. Mosaic was the common user interface, and HTML (Hypertext Markup Language) provided hypertext capability.

  17. Chronic myelogenous leukemia: laboratory diagnosis and monitoring.

    PubMed

    Wang, Y L; Bagg, A; Pear, W; Nowell, P C; Hess, J L

    2001-10-01

    Rapid developments have occurred both in laboratory medicine and in therapeutic interventions for the management of patients with chronic myelogenous leukemia (CML). With a wide array of laboratory tests available, selecting the appropriate test for a specific diagnostic or therapeutic setting has become increasingly difficult. In this review, we first discuss, from the point of view of laboratory medicine, the advantages and disadvantages of several commonly used laboratory assays, including cytogenetics, fluorescence in situ hybridization (FISH), and qualitative and quantitative reverse transcriptase-polymerase chain reaction (RT-PCR). We then discuss, from the point of view of clinical care, the test(s) of choice for the most common clinical scenarios, including diagnosis and monitoring of the therapeutic response and minimal residual disease in patients treated with different therapies. The purpose of this review is to help clinicians and laboratory physicians select appropriate tests for the diagnosis and monitoring of CML, with the ultimate goal of improving the cost-effective usage of clinical laboratories and improving patient care. Copyright 2001 Wiley-Liss, Inc.

  18. GMOseek: a user friendly tool for optimized GMO testing.

    PubMed

    Morisset, Dany; Novak, Petra Kralj; Zupanič, Darko; Gruden, Kristina; Lavrač, Nada; Žel, Jana

    2014-08-01

    With the increasing pace of new Genetically Modified Organisms (GMOs) authorized or in the pipeline for commercialization worldwide, the task of the laboratories charged with testing the compliance of food, feed or seed samples with the relevant regulations has become difficult and costly. Many laboratories have already adopted the so-called "matrix approach" to rationalize resources and increase their efficiency within a limited budget. Most of the time, the "matrix approach" is implemented using limited information and, at best, a proprietary computational tool to make efficient use of the available data. The GMOseek software was developed to support decision making in all phases of routine GMO laboratory testing, including the interpretation of wet-lab results. The tool makes use of a tabulated matrix of GM events and their genetic elements, the laboratory's analysis history and the available information about the sample at hand. It uses an optimization approach to suggest the screening assays best suited to the given sample. The practical GMOseek user interface allows the user to customize the search for a cost-efficient combination of screening assays to be employed on a given sample. It further guides the user in selecting appropriate analyses to determine the presence of individual GM events in the analyzed sample, and it helps in taking a final decision regarding the GMO composition of the sample. GMOseek can also be used to evaluate new, previously unused GMO screening targets and to estimate the profitability of developing new GMO screening methods. The presented freely available software tool offers GMO testing laboratories the possibility to select combinations of assays (e.g. quantitative real-time PCR tests) needed for their task, by allowing the expert to express his or her preferences in terms of multiplexing and cost. The utility of GMOseek is exemplified by analyzing selected food, feed and seed samples from a national reference
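    The "matrix approach" described above can be sketched as a set-cover problem: pick a small set of screening assays (genetic-element targets) that together cover every GM event of interest. The event/element matrix and event names below are invented for illustration, and this greedy sketch minimizes only assay count, whereas GMOseek's actual optimization also weighs multiplexing and cost.

```python
# Toy event/element matrix: which screening target detects which GM event.
# Entries are illustrative, not taken from any validated assay database.
EVENT_MATRIX = {
    "P-35S":     {"MON810", "Bt176", "GT73"},
    "T-NOS":     {"MON810", "GTS40-3-2"},
    "bar":       {"Bt176", "T25"},
    "CP4-EPSPS": {"GT73", "GTS40-3-2"},
}

def select_assays(matrix, events):
    """Greedily pick assays until every event is covered by at least one."""
    uncovered = set(events)
    chosen = []
    while uncovered:
        # Pick the assay covering the most still-uncovered events.
        best = max(matrix, key=lambda a: len(matrix[a] & uncovered))
        if not matrix[best] & uncovered:
            raise ValueError(f"No assay covers: {sorted(uncovered)}")
        chosen.append(best)
        uncovered -= matrix[best]
    return chosen

if __name__ == "__main__":
    print(select_assays(EVENT_MATRIX,
                        ["MON810", "Bt176", "GT73", "T25", "GTS40-3-2"]))
```

Greedy set cover is not guaranteed optimal, but it illustrates why a tabulated matrix lets a laboratory screen many events with far fewer assays than one per event.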

  19. The Johns Hopkins Hunterian Laboratory Philosophy: Mentoring Students in a Scientific Neurosurgical Research Laboratory.

    PubMed

    Tyler, Betty M; Liu, Ann; Sankey, Eric W; Mangraviti, Antonella; Barone, Michael A; Brem, Henry

    2016-06-01

    After over 50 years of scientific contribution under the leadership of Harvey Cushing and later Walter Dandy, the Johns Hopkins Hunterian Laboratory entered a period of dormancy between the 1960s and early 1980s. In 1984, Henry Brem reinstituted the Hunterian Neurosurgical Laboratory, with a new focus on localized delivery of therapies for brain tumors, leading to several discoveries such as new antiangiogenic agents and Gliadel chemotherapy wafers for the treatment of malignant gliomas. Since that time, it has been the training ground for 310 trainees who have dedicated their time to scientific exploration in the lab, resulting in numerous discoveries in the area of neurosurgical research. The Hunterian Neurosurgical Laboratory has been a unique example of successful mentoring in a translational research environment. The laboratory's philosophy emphasizes mentorship, independence, self-directed learning, creativity, and people-centered collaboration, while maintaining productivity with a focus on improving clinical outcomes. This focus has been served by the diverse educational and cultural backgrounds of its trainees. Through this philosophy and a strong legacy of scientific contribution, the Hunterian Laboratory has maintained a positive and productive research environment that supports highly motivated students and trainees. In this article, the authors discuss the laboratory's training philosophy, linked to the principles of adult learning (andragogy); the successes and limitations of including students with a wide range of educational backgrounds in a neurosurgical translational laboratory; and the phenomenon of combining clinical expertise with rigorous scientific training.

  20. Engineered nanomaterials: toward effective safety management in research laboratories.

    PubMed

    Groso, Amela; Petri-Fink, Alke; Rothen-Rutishauser, Barbara; Hofmann, Heinrich; Meyer, Thierry

    2016-03-15

    It is still unknown which types of nanomaterials and which doses represent an actual danger to humans and the environment. Meanwhile, there is consensus on applying the precautionary principle to these novel materials until more information is available. To deal with the rapid evolution of research, including the fast turnover of collaborators, a user-friendly and easy-to-apply risk assessment tool offering adequate preventive and protective measures has to be provided. Based on new information concerning the hazards of engineered nanomaterials, we improved a previously developed risk assessment tool by following a simple scheme to gain efficiency. In the first step, using a logical decision tree, one of three hazard levels, H1 to H3, is assigned to the nanomaterial. Using a combination of decision trees and matrices, the second step links the hazard with the emission and exposure potential to assign one of three nanorisk levels (Nano 3, highest risk; Nano 1, lowest risk) to the activity. These operations are repeated at each process step, leading to the laboratory classification. The third step provides detailed preventive and protective measures for the determined level of nanorisk. We developed an adapted, simple and intuitive method for nanomaterial risk management in research laboratories. It classifies nanoactivities into three levels and additionally proposes concrete preventive and protective measures and associated actions. This method is a valuable tool for all participants in nanomaterial safety. Users gain an essential learning opportunity and increase their safety awareness. Laboratory managers have a reliable tool to obtain an overview of the operations involving nanomaterials in their laboratories; this is essential, as they are responsible for employee safety but are sometimes unaware of the work performed. Bringing this risk to a three-band scale (like other types of risks such as biological, radiation
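    The two-step banding scheme described above can be sketched as a lookup: a hazard level (H1-H3) combined with an emission/exposure potential yields a nanorisk band (Nano 1 lowest to Nano 3 highest), repeated per process step. The published tool's actual decision trees and matrix entries are not reproduced here; the matrix below is an illustrative assumption only.

```python
# Illustrative hazard x exposure matrix -> nanorisk band (1 = lowest risk).
RISK_MATRIX = {
    ("H1", "low"): 1, ("H1", "medium"): 1, ("H1", "high"): 2,
    ("H2", "low"): 1, ("H2", "medium"): 2, ("H2", "high"): 3,
    ("H3", "low"): 2, ("H3", "medium"): 3, ("H3", "high"): 3,
}

def classify_step(hazard, exposure):
    """Nanorisk band for one process step."""
    return RISK_MATRIX[(hazard, exposure)]

def classify_laboratory(process_steps):
    """The laboratory takes the highest band over all its process steps."""
    return max(classify_step(h, e) for h, e in process_steps)

if __name__ == "__main__":
    # Two steps: low-exposure handling of an H2 material, then a
    # medium-exposure operation on an H3 material.
    print(f"Nano {classify_laboratory([('H2', 'low'), ('H3', 'medium')])}")
```

Classifying the laboratory by its worst step mirrors the precautionary principle: protective measures are chosen for the highest risk present, not the average.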

  1. A Single-Display Groupware Collaborative Language Laboratory

    ERIC Educational Resources Information Center

    Calderón, Juan Felipe; Nussbaum, Miguel; Carmach, Ignacio; Díaz, Juan Jaime; Villalta, Marco

    2016-01-01

    Language learning tools have evolved to take into consideration new teaching models of collaboration and communication. While second language acquisition tasks have been taken online, the traditional language laboratory has remained unchanged. By continuing to follow its original configuration based on individual work, the language laboratory…

  2. Capitalizing on App Development Tools and Technologies

    ERIC Educational Resources Information Center

    Luterbach, Kenneth J.; Hubbell, Kenneth R.

    2015-01-01

    Instructional developers and others creating apps must choose from a wide variety of app development tools and technologies. Some app development tools have incorporated visual programming features, which enable some drag and drop coding and contextual programming. While those features help novices begin programming with greater ease, questions…

  3. Assessing the quality of infertility resources on the World Wide Web: tools to guide clients through the maze of fact and fiction.

    PubMed

    Okamura, Kyoko; Bernstein, Judith; Fidler, Anne T

    2002-01-01

    The Internet has become a major source of health information for women, but information placed on the World Wide Web does not routinely undergo a peer review process before dissemination. In this study, we present an analysis of 197 infertility-related Web sites for quality and accountability, using JAMA's minimal core standards for responsible print. Only 2% of the web sites analyzed met all four recommended standards, and 50.8% failed to report any of the four. Commercial web sites were more likely to fail to meet minimum standards (71.2%) than those with educational (46.8%) or supportive (29.8%) elements. Web sites with educational and informational components were most common (70.6%), followed by commercial sites (52.8%) and sites that offered a forum for infertility support and activism (28.9%). Internet resources available to infertile patients are at best variable. The current state of infertility-related materials on the World Wide Web offers unprecedented opportunities to improve services to a growing number of e-health users. Because of variations in quality of site content, women's health clinicians must assume responsibility for a new role as information monitor. This study provides assessment tools clinicians can apply and share with clients.
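    The assessment described above tallies each site against JAMA's four core standards. A minimal sketch of that tallying follows; the site records are invented for illustration (the study's real dataset covered 197 infertility-related sites), and the standard names are the commonly cited JAMA benchmarks of authorship, attribution, disclosure and currency.

```python
# The four JAMA core standards for responsible print, used as a checklist.
STANDARDS = ("authorship", "attribution", "disclosure", "currency")

def met_count(site):
    """Number of the four core standards a site meets."""
    return sum(1 for s in STANDARDS if site.get(s, False))

def summarize(sites):
    """Fraction of sites meeting all four standards, and meeting none."""
    n = len(sites)
    all_four = sum(1 for s in sites if met_count(s) == 4) / n
    none = sum(1 for s in sites if met_count(s) == 0) / n
    return all_four, none

if __name__ == "__main__":
    sites = [  # invented example records, one dict per assessed web site
        {"authorship": True, "attribution": True,
         "disclosure": True, "currency": True},
        {"authorship": True, "disclosure": True},
        {},
        {},
    ]
    print(summarize(sites))
```

In the study's terms, the two summary fractions correspond to the reported 2% of sites meeting all four standards and 50.8% meeting none.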

  4. Clinical laboratory analytics: Challenges and promise for an emerging discipline.

    PubMed

    Shirts, Brian H; Jackson, Brian R; Baird, Geoffrey S; Baron, Jason M; Clements, Bryan; Grisson, Ricky; Hauser, Ronald George; Taylor, Julie R; Terrazas, Enrique; Brimhall, Brad

    2015-01-01

    The clinical laboratory is a major source of health care data. Increasingly these data are being integrated with other data to inform health system-wide actions meant to improve diagnostic test utilization, service efficiency, and "meaningful use." The Academy of Clinical Laboratory Physicians and Scientists hosted a satellite meeting on clinical laboratory analytics in conjunction with their annual meeting on May 29, 2014 in San Francisco. There were 80 registrants for the clinical laboratory analytics meeting. The meeting featured short presentations on current trends in clinical laboratory analytics and several panel discussions on data science in laboratory medicine, laboratory data and its role in the larger healthcare system, integrating laboratory analytics, and data sharing for collaborative analytics. One main goal of meeting was to have an open forum of leaders that work with the "big data" clinical laboratories produce. This article summarizes the proceedings of the meeting and content discussed.

  5. Genome-wide specificity of DNA binding, gene regulation, and chromatin remodeling by TALE- and CRISPR/Cas9-based transcriptional activators.

    PubMed

    Polstein, Lauren R; Perez-Pinera, Pablo; Kocak, D Dewran; Vockley, Christopher M; Bledsoe, Peggy; Song, Lingyun; Safi, Alexias; Crawford, Gregory E; Reddy, Timothy E; Gersbach, Charles A

    2015-08-01

    Genome engineering technologies based on the CRISPR/Cas9 and TALE systems are enabling new approaches in science and biotechnology. However, the specificity of these tools in complex genomes and the role of chromatin structure in determining DNA binding are not well understood. We analyzed the genome-wide effects of TALE- and CRISPR-based transcriptional activators in human cells using ChIP-seq to assess DNA-binding specificity and RNA-seq to measure the specificity of perturbing the transcriptome. Additionally, DNase-seq was used to assess genome-wide chromatin remodeling that occurs as a result of their action. Our results show that these transcription factors are highly specific in both DNA binding and gene regulation and are able to open targeted regions of closed chromatin independent of gene activation. Collectively, these results underscore the potential for these technologies to make precise changes to gene expression for gene and cell therapies or fundamental studies of gene function. © 2015 Polstein et al.; Published by Cold Spring Harbor Laboratory Press.

  6. MOD Tool (Microwave Optics Design Tool)

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Borgioli, Andrea; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.

    1999-01-01

    The Jet Propulsion Laboratory (JPL) is currently designing and building a number of instruments that operate in the microwave and millimeter-wave bands. These include MIRO (Microwave Instrument for the Rosetta Orbiter), MLS (Microwave Limb Sounder), and IMAS (Integrated Multispectral Atmospheric Sounder). These instruments must be designed and built to meet key design criteria (e.g., beamwidth, gain, pointing) obtained from the scientific goals for the instrument. These criteria are frequently functions of the operating environment (both thermal and mechanical). To design and build instruments which meet these criteria, it is essential to be able to model the instrument in its environments. Currently, a number of modeling tools exist. Commonly used tools at JPL include: FEMAP (meshing), NASTRAN (structural modeling), TRASYS and SINDA (thermal modeling), MACOS/IMOS (optical modeling), and POPO (physical optics modeling). Each of these tools is used by an analyst, who models the instrument in one discipline. The analyst then provides the results of this modeling to another analyst, who continues the overall modeling in another discipline. There is a large reengineering task in place at JPL to automate and speed-up the structural and thermal modeling disciplines, which does not include MOD Tool. The focus of MOD Tool (and of this paper) is in the fields unique to microwave and millimeter-wave instrument design. These include initial design and analysis of the instrument without thermal or structural loads, the automation of the transfer of this design to a high-end CAD tool, and the analysis of the structurally deformed instrument (due to structural and/or thermal loads). MOD Tool is a distributed tool, with a database of design information residing on a server, physical optics analysis being performed on a variety of supercomputer platforms, and a graphical user interface (GUI) residing on the user's desktop computer. 
The MOD Tool client is being developed using Tcl

  7. What is WorldWide Telescope, and Why Should Researchers Care?

    NASA Astrophysics Data System (ADS)

    Goodman, Alyssa A.

    2016-01-01

    As of 2015, about 20 million people have downloaded the computer program called "WorldWide Telescope," and even more have accessed it via the web, at http://worldwidetelescope.org. But, the vast majority of these millions are not professional astronomers. This talk will explain why WorldWide Telescope (WWT) is also a powerful tool for research astronomers. I will focus on how WWT can be, and is, being built-in to Journals, and into day-to-day research environments. By way of example, I will show how WWT already: allows users to display images, including those in Journals, in the context of multi-wavelength full-sky imagery; allows for the display of which parts of the Sky have been studied, when, how, and for what reason (see http://adsass.org); allows, via right-click, immediate access to ADS, SIMBAD, and other professional research tools. I will also highlight new work, currently in development, that is using WWT as a tool for observation planning, and as a display mode for advanced high-dimensional data visualization tools, like glue (see http://glueviz.org). WWT is now well-known in the education community (see http://wwtambassadors.org), so the explicit goal of this talk will be to make researchers more aware of its full power. I will explain how WWT transitioned, over 8 years, from a Microsoft Research project to its current open-source state (see https://github.com/WorldWideTelescope), and I will conclude with comments on the future of WWT, and its relationship to how research should be carried out in the future (see http://tinyurl.com/aas-potf).

  8. The use of reference change values in clinical laboratories.

    PubMed

    Bugdayci, Guler; Oguzman, Hamdi; Arattan, Havva Yasemin; Sasmaz, Guler

    2015-01-01

    The use of reference change values (RCV) has been advocated as very useful for monitoring individuals, both in acute situations and when following the improvement or deterioration of chronic diseases. In our study, we aimed to evaluate the RCV calculation for 24 clinical chemistry analytes widely used in clinical laboratories and the utilization of these data. Twenty-four serum samples were analyzed with Abbott kits (Abbott Laboratories, Abbott Park, IL, USA), manufactured for use with the Architect c8000 (Abbott Laboratories, Abbott Park, IL, USA) auto-analyzer. We calculated RCV using the following formula: RCV = Z × 2^(1/2) × (CV_A^2 + CV_W^2)^(1/2). Four reference change values were calculated for each analyte using four statistical probabilities (0.95 and 0.99, unidirectional and bidirectional). Moreover, to provide an interval with upper and lower limits based on the reference change factor (RCF), serially measured tests were evaluated using two formulas: exp(Z × 2^(1/2) × (CV_A^2 + CV_W^2)^(1/2)/100) for RCF_up and 1/RCF_up for RCF_down. RCVs of these analytes were calculated as 14.63% for glucose, 29.88% for urea, 17.75% for ALP, 53.39% for CK, 46.98% for CK-MB, 21.00% for amylase, 8.00% for total protein, 8.70% for albumin, 51.08% for total bilirubin, 86.34% for direct bilirubin, 6.40% for calcium, 15.03% for creatinine, 21.47% for urate, 14.19% for total cholesterol, 46.62% for triglyceride, 20.51% for HDL-cholesterol, 29.59% for AST, 46.31% for ALT, 31.54% for GGT, 20.92% for LDH, 19.75% for inorganic phosphate, 3.05% for sodium, 11.75% for potassium, 4.44% for chloride (RCV, p < 0.05, unidirectional). We suggest using RCV alongside population-based reference intervals in clinical laboratories. RCV can serve as a tool for making clinical decisions, especially when monitoring individuals.
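    The RCV and RCF formulas quoted above translate directly into code. The sketch below uses illustrative CV values (not the study's own data): CV_A is the analytical variation, CV_W the within-subject biological variation, and Z the chosen probability multiplier.

```python
import math

def rcv(cv_a, cv_w, z=1.96):
    """RCV = Z * 2^(1/2) * (CV_A^2 + CV_W^2)^(1/2), in percent.
    Default Z = 1.96 (bidirectional 95%); use z=1.65 for unidirectional
    95%, z=2.58 / z=2.33 for bidirectional / unidirectional 99%."""
    return z * math.sqrt(2) * math.sqrt(cv_a**2 + cv_w**2)

def rcf_limits(cv_a, cv_w, z=1.96):
    """Reference change factors: multiply (RCF_up) or divide (1/RCF_up =
    RCF_down) the previous result to bound a significant change."""
    up = math.exp(z * math.sqrt(2) * math.sqrt(cv_a**2 + cv_w**2) / 100)
    return up, 1 / up

if __name__ == "__main__":
    # Illustrative CVs, e.g. a glucose-like analyte with CV_A = 1.8%
    # and CV_W = 5.6% (assumed values, not from the study).
    print(round(rcv(1.8, 5.6), 2))
    print(rcf_limits(1.8, 5.6))
```

A measured change exceeding the RCV (or falling outside the RCF interval around the previous result) is unlikely to be explained by analytical plus within-subject variation alone.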

  9. Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) Independent Analysis

    NASA Technical Reports Server (NTRS)

    Davis, Mitchell L.; Aguilar, Michael L.; Mora, Victor D.; Regenie, Victoria A.; Ritz, William F.

    2009-01-01

    Two approaches were compared to the Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) approach: the Flat-Sat and the Shuttle Avionics Integration Laboratory (SAIL). The Flat-Sat and CAIL/SAIL approaches are two different tools designed to mitigate different risks. The Flat-Sat approach is designed to develop a mission concept into a flight avionics system and its associated ground controller. The SAIL approach is designed to aid in the flight readiness verification of the flight avionics system. The approaches are complementary, addressing both system development risks and mission verification risks. The following NESC team findings were identified: the CAIL assumption is that the flight subsystems will be matured for system-level verification; the Flat-Sat and SAIL approaches are two different tools designed to mitigate different risks. The following NESC team recommendation was provided: define, document, and manage a detailed interface between design and development (EDL and other integration labs) and the verification laboratory (CAIL).

  10. Open Simulation Laboratories [Guest editors' introduction

    DOE PAGES

    Alexander, Francis J.; Meneveau, Charles

    2015-09-01

    In this introduction to the special issue on open simulation laboratories (OSLs), the guest editors describe how OSLs will become more common as their potential is better understood and as they begin providing access to valuable datasets to much larger segments of the scientific community. Moreover, new analysis tools and ways of doing science will inevitably develop as a result.

  11. A Novel Passive Robotic Tool Interface

    NASA Astrophysics Data System (ADS)

    Roberts, Paul

    2013-09-01

    The increased capability of space robotics has seen their uses increase from simple sample gathering and mechanical adjuncts to humans, to sophisticated multi- purpose investigative and maintenance tools that substitute for humans for many external space tasks. As with all space missions, reducing mass and system complexity is critical. A key component of robotic systems mass and complexity is the number of motors and actuators needed. MDA has developed a passive tool interface that, like a household power drill, permits a single tool actuator to be interfaced with many Tool Tips without requiring additional actuators to manage the changing and storage of these tools. MDA's Multifunction Tool interface permits a wide range of Tool Tips to be designed to a single interface that can be pre-qualified to torque and strength limits such that additional Tool Tips can be added to a mission's "tool kit" simply and quickly.

  12. Research on the tool holder mode in high speed machining

    NASA Astrophysics Data System (ADS)

    Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao

    2018-03-01

    High speed machining technology can improve processing efficiency and precision while reducing processing cost, and is therefore widely valued in industry. With the extensive application of high-speed machining technology, high-speed tool systems place ever higher requirements on the tool chuck. At present, several new kinds of tool holders are used in high speed precision machining, including heat-shrink tool holders, high-precision spring chucks, hydraulic tool holders, and three-rib deformation chucks. Among them, the heat-shrink tool holder has the advantages of high precision, high clamping force, high bending rigidity and good dynamic balance, and is widely used. It is therefore of great significance to study the new requirements placed on the machining tool system. To meet the requirements of high speed precision machining, this paper describes the common tool holder technologies for high precision machining, proposes how to correctly select a tool clamping system in practice, and analyzes the characteristics and existing problems of tool clamping systems.

  13. Virtual laboratories: new opportunities for collaborative water science

    NASA Astrophysics Data System (ADS)

    Ceola, Serena; Arheimer, Berit; Bloeschl, Guenter; Baratti, Emanuele; Capell, Rene; Castellarin, Attilio; Freer, Jim; Han, Dawei; Hrachowitz, Markus; Hundecha, Yeshewatesfa; Hutton, Christopher; Lindström, Goran; Montanari, Alberto; Nijzink, Remko; Parajka, Juraj; Toth, Elena; Viglione, Alberto; Wagener, Thorsten

    2015-04-01

    Reproducibility and repeatability of experiments are the fundamental prerequisites that allow researchers to validate results and share hydrological knowledge, experience and expertise in the light of global water management problems. Virtual laboratories offer new opportunities to enable these prerequisites, since they allow experimenters to share data, tools and pre-defined experimental procedures (i.e. protocols). Here we present the outcomes of a first collaborative numerical experiment undertaken by five different international research groups in a virtual laboratory to address the key issues of reproducibility and repeatability. Starting from the definition of accurate and detailed experimental protocols, the research groups independently applied a rainfall-runoff model to 15 European catchments, and model results were collectively examined through a web-based discussion. We found that a detailed modelling protocol was crucial to ensure the comparability and reproducibility of the proposed experiment across groups. Our results suggest that sharing comprehensive and precise protocols and running the experiments within a controlled environment (e.g. a virtual laboratory) is as fundamental as sharing data and tools for ensuring experiment repeatability and reproducibility across the broad scientific community, and thus for advancing hydrology in a more coherent way.

  14. Doing laboratory ethnography: reflections on method in scientific workplaces

    PubMed Central

    Stephens, Neil; Lewis, Jamie

    2017-01-01

    Laboratory ethnography extended the social scientist’s gaze into the day-to-day accomplishment of scientific practice. Here we reflect upon our own ethnographies of biomedical scientific workspaces to provoke methodological discussion on the doing of laboratory ethnography. What we provide is less a ‘how to’ guide and more a commentary on what to look for and what to look at. We draw upon our empirical research with stem cell laboratories and animal houses, teams producing robotic surgical tools, musicians sonifying data science, a psychiatric genetics laboratory, and scientists developing laboratory grown meat. We use these cases to exemplify a set of potential ethnographic themes worthy of pursuit: science epistemics and the extended laboratory, the interaction order of scientific work, sensory realms and the rendering of science as sensible, conferences as performative sites, and the spaces, places and temporalities of scientific work. PMID:28546784

  15. Developing a customised approach for strengthening tuberculosis laboratory quality management systems toward accreditation

    PubMed Central

    Trollip, Andre; Erni, Donatelle; Kao, Kekeletso

    2017-01-01

    Background Quality-assured tuberculosis laboratory services are critical to achieve global and national goals for tuberculosis prevention and care. Implementation of a quality management system (QMS) in laboratories leads to improved quality of diagnostic tests and better patient care. The Strengthening Laboratory Management Toward Accreditation (SLMTA) programme has led to measurable improvements in the QMS of clinical laboratories. However, progress in tuberculosis laboratories has been slower, which may be attributed to the need for a structured tuberculosis-specific approach to implementing QMS. We describe the development and early implementation of the Strengthening Tuberculosis Laboratory Management Toward Accreditation (TB SLMTA) programme. Development The TB SLMTA curriculum was developed by customizing the SLMTA curriculum to include specific tools, job aids and supplementary materials specific to the tuberculosis laboratory. The TB SLMTA Harmonized Checklist was developed from the World Health Organisation Regional Office for Africa Stepwise Laboratory Quality Improvement Process Towards Accreditation checklist, and incorporated tuberculosis-specific requirements from the Global Laboratory Initiative Stepwise Process Towards Tuberculosis Laboratory Accreditation online tool. Implementation Four regional training-of-trainers workshops have been conducted since 2013. The TB SLMTA programme has been rolled out in 37 tuberculosis laboratories in 10 countries using the Workshop approach in 32 laboratories in five countries and the Facility-based approach in five tuberculosis laboratories in five countries. Conclusion Lessons learnt from early implementation of TB SLMTA suggest that a structured training and mentoring programme can build a foundation towards further quality improvement in tuberculosis laboratories. Structured mentoring, and institutionalisation of QMS into country programmes, is needed to support tuberculosis laboratories to achieve

  16. Increasing efficiency of information dissemination and collection through the World Wide Web

    Treesearch

    Daniel P. Huebner; Malchus B. Baker; Peter F. Ffolliott

    2000-01-01

    Researchers, managers, and educators have access to revolutionary technology for information transfer through the World Wide Web (Web). Using the Web to effectively gather and distribute information is addressed in this paper. Tools, tips, and strategies are discussed. Companion Web sites are provided to guide users in selecting the most appropriate tool for searching...

  17. Translating a National Laboratory Strategic Plan into action through SLMTA in a district hospital laboratory in Botswana.

    PubMed

    Ntshambiwa, Keoratile; Ntabe-Jagwer, Winnie; Kefilwe, Chandapiwa; Samuel, Fredrick; Moyo, Sikhulile

    2014-01-01

    The Ministry of Health (MOH) of Botswana adopted Strengthening Laboratory Management Toward Accreditation (SLMTA), a structured quality improvement programme, as a key tool for the implementation of quality management systems in its public health laboratories. Coupled with focused mentorship, this programme aimed to help MOH achieve the goals of the National Laboratory Strategic Plan to provide quality and timely clinical diagnoses. This article describes the impact of implementing SLMTA in Sekgoma Memorial Hospital Laboratory (SMHL) in Serowe, Botswana. SLMTA implementation in SMHL included trainings, improvement projects, site visits and focused mentorship. To measure progress, audits using the Stepwise Laboratory Quality Improvement Process Towards Accreditation (SLIPTA) checklist were conducted at baseline and exit of the programme, with scores corresponding to a zero- to five-star scale. Turnaround times, customer satisfaction, and several other health service indicators were tracked. The laboratory scored 53% (zero stars) at the baseline audit and 80% (three stars) at exit. Nearly three years later, the laboratory scored 85% (four stars) in an official audit conducted by the African Society for Laboratory Medicine. Turnaround times became shorter after SLMTA implementation, with reductions ranging from 19% to 52%; overall patient satisfaction increased from 56% to 73%; and clinician satisfaction increased from 41% to 72%. Improvements in inventory management led to decreases in discarded reagents, reducing losses from US $18 000 in 2011 to $40 in 2013. The SLMTA programme contributed to enhanced performance of the laboratory, which in turn yielded potential positive impacts for patient care at the hospital.

  18. Integration of digital gross pathology images for enterprise-wide access.

    PubMed

    Amin, Milon; Sharma, Gaurav; Parwani, Anil V; Anderson, Ralph; Kolowitz, Brian J; Piccoli, Anthony; Shrestha, Rasu B; Lauro, Gonzalo Romero; Pantanowitz, Liron

    2012-01-01

    Sharing digital pathology images for enterprise-wide use into a picture archiving and communication system (PACS) is not yet widely adopted. We share our solution and 3-year experience of transmitting such images to an enterprise image server (EIS). Gross pathology images acquired by prosectors were integrated with clinical cases into the laboratory information system's image management module, and stored in JPEG2000 format on a networked image server. Automated daily searches for cases with gross images were used to compile an ASCII text file that was forwarded to a separate institutional Enterprise Digital Imaging and Communications in Medicine (DICOM) Wrapper (EDW) server. Concurrently, an HL7-based image order for these cases was generated, containing the locations of images and patient data, and forwarded to the EDW, which combined data in these locations to generate images with patient data, as required by DICOM standards. The image and data were then "wrapped" according to DICOM standards, transferred to the PACS servers, and made accessible on an institution-wide basis. In total, 26,966 gross images from 9,733 cases were transmitted over the 3-year period from the laboratory information system to the EIS. The average process time for cases with successful automatic uploads (n=9,688) to the EIS was 98 seconds. Only 45 cases (0.5%) failed, requiring manual intervention. Uploaded images were immediately available to institution-wide PACS users. Since inception, user feedback has been positive. Enterprise-wide PACS-based sharing of pathology images is feasible, provides useful services to clinical staff, and utilizes existing information system and telecommunications infrastructure. PACS-shared pathology images, however, require a "DICOM wrapper" for multisystem compatibility.

  19. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  20. A Survey of Security Tools for the Industrial Control System Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurd, Carl M.; McCarty, Michael V.

    This report details the results of a survey conducted by Idaho National Laboratory (INL) to identify existing tools which could be used to prevent, detect, mitigate, or investigate a cyber-attack in an industrial control system (ICS) environment. This report compiles a list of potentially applicable tools and shows the coverage of the tools in an ICS architecture.

  1. A Data-Driven Framework for Incorporating New Tools for ...

    EPA Pesticide Factsheets

    This talk was given during the “Exposure-Based Toxicity Testing” session at the annual meeting of the International Society for Exposure Science. It provided an update on the state of the science and tools that may be employed in risk-based prioritization efforts. It outlined knowledge gained from the data provided using these high-throughput tools to assess chemical bioactivity and to predict chemical exposures and also identified future needs. It provided an opportunity to showcase ongoing research efforts within the National Exposure Research Laboratory and the National Center for Computational Toxicology within the Office of Research and Development to an international audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  2. The upper bound of Pier Scour defined by selected laboratory and field data

    USGS Publications Warehouse

    Benedict, Stephen; Caldwell, Andral W.

    2015-01-01

    The U.S. Geological Survey, in cooperation with the South Carolina Department of Transportation, conducted several field investigations of pier scour in South Carolina (Benedict and Caldwell, 2006; Benedict and Caldwell, 2009) and used those data to develop envelope curves defining the upper bound of pier scour. To expand upon this previous work, an additional cooperative investigation was initiated to combine the South Carolina data with pier-scour data from other sources and evaluate the upper bound of pier scour with this larger data set. To facilitate this analysis, a literature review was made to identify potential sources of published pier-scour data, and selected data were compiled into a digital spreadsheet consisting of approximately 570 laboratory and 1,880 field measurements. These data encompass a wide range of laboratory and field conditions and represent field data from 24 states within the United States and six other countries. This extensive database was used to define the upper bound of pier-scour depth with respect to pier width encompassing the laboratory and field data. Pier width is a primary variable that influences pier-scour depth (Laursen and Toch, 1956; Melville and Coleman, 2000; Mueller and Wagner, 2005; Ettema et al., 2011; Arneson et al., 2012) and therefore was used as the primary explanatory variable in developing the upper-bound envelope curve. The envelope curve provides a simple but useful tool for assessing the potential maximum pier-scour depth for pier widths of about 30 feet or less.
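    The idea of an upper-bound envelope with pier width as the explanatory variable can be illustrated with a short sketch: bin the measurements by pier width and keep the maximum observed scour depth per bin. The bin width and data values below are invented for illustration, not taken from the authors' database.

```python
# Illustrative envelope derivation: maximum scour depth observed in
# each pier-width bin defines the upper bound. Data values invented.

def envelope(measurements, bin_width=5.0):
    """measurements: (pier_width_ft, scour_depth_ft) pairs."""
    bins = {}
    for width, depth in measurements:
        key = int(width // bin_width)
        bins[key] = max(bins.get(key, 0.0), depth)
    # return (bin upper edge, max depth) sorted by pier width
    return sorted(((k + 1) * bin_width, d) for k, d in bins.items())

data = [(3.0, 6.1), (4.5, 9.8), (12.0, 18.2), (14.0, 15.5), (28.0, 25.0)]
print(envelope(data))  # [(5.0, 9.8), (15.0, 18.2), (30.0, 25.0)]
```

    A practical curve would then be drawn at or slightly above these bin maxima, which is the "simple but useful tool" role the abstract describes.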

  3. A tracking system for laboratory mice to support medical researchers in behavioral analysis.

    PubMed

    Macrì, S; Mainetti, L; Patrono, L; Pieretti, S; Secco, A; Sergi, I

    2015-08-01

    The behavioral analysis of laboratory mice plays a key role in several medical and scientific research areas, such as biology, toxicology, pharmacology, and so on. Important information on mouse behavior and reactions to particular stimuli is deduced from a careful analysis of their movements. Moreover, behavioral analysis of genetically modified mice allows obtaining important information about particular genes, phenotypes or drug effects. The techniques commonly adopted to support such analysis have many limitations, which make the related systems particularly ineffective. Currently, the engineering community is exploring innovative identification and sensing technologies to develop new tracking systems able to benefit the analysis of animal behavior. This work presents a tracking solution based on passive Radio Frequency Identification (RFID) technology in the Ultra High Frequency (UHF) band. Much emphasis is given to the software component of the system, based on a Web-oriented solution, able to process the raw tracking data coming from the hardware and to offer 2D and 3D tracking information as well as reports and dashboards about mouse behavior. The system has been widely tested using laboratory mice and compared with automated video-tracking software (i.e., EthoVision). The obtained results have demonstrated the effectiveness and reliability of the proposed solution, which is able to correctly detect the events occurring in the animals' cage, and to offer a complete and user-friendly tool to support researchers in the behavioral analysis of laboratory mice.
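    The event-detection idea can be sketched simply: each raw read pairs a tag with an antenna, each antenna maps to a cage zone, and a zone change for a tag is recorded as an event. Antenna-to-zone mapping, names, and sample reads below are invented for illustration and are not the authors' design.

```python
# Hedged sketch of turning raw UHF RFID reads into behavioral events:
# a read is (timestamp_s, tag_id, antenna_id); an antenna maps to a
# cage zone; a zone change is logged as an event. Names illustrative.

ANTENNA_ZONE = {1: "feeder", 2: "nest", 3: "wheel"}

def zone_events(reads):
    """Return (timestamp, tag, new_zone) whenever a mouse changes zone."""
    last_zone = {}
    events = []
    for ts, tag, antenna in sorted(reads):
        zone = ANTENNA_ZONE[antenna]
        if last_zone.get(tag) != zone:
            events.append((ts, tag, zone))
            last_zone[tag] = zone
    return events

reads = [(0.0, "m1", 2), (1.5, "m1", 2), (3.0, "m1", 1), (4.0, "m2", 3)]
print(zone_events(reads))
# [(0.0, 'm1', 'nest'), (3.0, 'm1', 'feeder'), (4.0, 'm2', 'wheel')]
```

    Repeated reads at the same antenna are suppressed, so the event stream stays compact enough to drive the reports and dashboards the abstract mentions.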

  4. The AALAS Learning Library and its effectiveness as a tool for technician training.

    PubMed

    Parker-Thornburg, Jan; Grabeel, Pam; Butler, Mark

    2009-06-01

    Computer-based training is potentially a useful means of gaining proficiency in various aspects of laboratory animal science. The authors present an overview of the AALAS Learning Library (ALL), an internet-based training system that was established in 2003 and is widely used for technician certification training and for IACUC-mandated training. To evaluate the effectiveness of the ALL as a tool for general training and for achieving certification, the AALAS Online Learning Committee initiated a review of the online courses. The authors analyzed the numbers of users who accessed different types of courses and completed exams in those courses. They also correlated ALL usage with pass rates in technician certification exams. Results suggest that the ALL is a highly effective method of training, particularly for technician certification.

  5. Interactive target tracking for persistent wide-area surveillance

    NASA Astrophysics Data System (ADS)

    Ersoy, Ilker; Palaniappan, Kannappan; Seetharaman, Guna S.; Rao, Raghuveer M.

    2012-06-01

    Persistent aerial surveillance is an emerging technology that can provide continuous, wide-area coverage from an aircraft-based multiple-camera system. Tracking targets in these data sets is challenging for vision algorithms due to large data (several terabytes), very low frame rate, changing viewpoint, strong parallax and other imperfections due to registration and projection. Providing an interactive system for automated target tracking also has additional challenges that require online algorithms that are seamlessly integrated with interactive visualization tools to assist the user. We developed an algorithm that overcomes these challenges and demonstrated it on data obtained from a wide-area imaging platform.

  6. Simulation Tool for Dielectric Barrier Discharge Plasma Actuators

    NASA Technical Reports Server (NTRS)

    Likhanskii, Alexander

    2014-01-01

    Traditional approaches for active flow separation control using dielectric barrier discharge (DBD) plasma actuators are limited to relatively low speed flows and atmospheric conditions. This results in low feasibility of the DBDs for aerospace applications. For active flow control at turbine blades, fixed wings, and rotary wings and on hypersonic vehicles, DBD plasma actuators must perform at a wide range of conditions, including rarified flows and combustion mixtures. An efficient, comprehensive, physically based DBD simulation tool can optimize DBD plasma actuators for different operation conditions. Researchers are developing a DBD plasma actuator simulation tool for a wide range of ambient gas pressures. The tool will treat DBD using either kinetic, fluid, or hybrid models, depending on the DBD operational condition.

  7. Critical Value Reporting at Egyptian Laboratories.

    PubMed

    Mosallam, Rasha; Ibrahim, Samaa Zenhom

    2015-06-12

    To examine critical value reporting policies and practices, and to identify critical value ranges for selected common laboratory assays, at the inpatient laboratory divisions of Alexandria hospitals. A cross-sectional descriptive study design was used. Subjects were the inpatient divisions of all laboratories of Alexandria hospitals (40 laboratories). Data were collected using a questionnaire composed of 4 sections. The first section explored hospital and laboratory characteristics. The second section assessed policies and procedures of critical value reporting. The third section explored the reporting process. The fourth section explored critical value ranges for selected common laboratory assays. A written procedure for reporting of critical values was present in 77.5% of laboratories and a comprehensive list of critical values in 72.55%. For laboratories having a critical value list, the number of tests in the list ranged from 7 to 40. Three-fifths of laboratories had a policy for assessing the timeliness of reporting and three-quarters stated that the laboratory policy requires feedback (60.0% and 75.0%, respectively). The hospital laboratory physician was responsible for critical value reporting, followed by the laboratory technician (75.0% and 50.0%, respectively). The call is received mainly by nurses and the physicians ordering the test (67.5% and 55.0%, respectively), and the channel of reporting is mainly the telephone or sending the test report to the ward (67.5% and 50.0%, respectively). Wireless technologies are used in reporting in only 10.0% of hospitals. The cutoff limits for reporting different assays showed considerable interlaboratory variation. Critical value policies and practices showed interinstitutional variation, with deficiencies in some reporting practices. Selection of critical assays for notification and setting the limits of notification exhibited wide variation as well.
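    The core check these policies govern is simple to state: a result is "critical" when it falls outside the laboratory's notification limits and must be communicated immediately. The limits below are invented for illustration; as the survey found, actual cutoffs vary considerably between laboratories.

```python
# Illustrative critical-value check. Limits are hypothetical examples,
# not recommendations; each laboratory sets its own notification list.

CRITICAL_LIMITS = {"potassium": (2.8, 6.2),   # mmol/L, illustrative
                   "glucose": (40.0, 450.0)}  # mg/dL, illustrative

def is_critical(assay, value):
    """True when the result must be phoned to the ordering clinician."""
    low, high = CRITICAL_LIMITS[assay]
    return value < low or value > high

print(is_critical("potassium", 6.8))  # True -> notify immediately
print(is_critical("glucose", 100.0))  # False
```

    Because each laboratory in the survey kept its own list (7 to 40 tests) and its own limits, both the table and the notification workflow would differ site to site.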

  8. Contingency diagrams as teaching tools.

    PubMed

    Mattaini, M A

    1995-01-01

    Contingency diagrams are particularly effective teaching tools, because they provide a means for students to view the complexities of contingency networks present in natural and laboratory settings while displaying the elementary processes that constitute those networks. This paper sketches recent developments in this visualization technology and illustrates approaches for using contingency diagrams in teaching.

  9. Clinical laboratory analytics: Challenges and promise for an emerging discipline

    PubMed Central

    Shirts, Brian H.; Jackson, Brian R.; Baird, Geoffrey S.; Baron, Jason M.; Clements, Bryan; Grisson, Ricky; Hauser, Ronald George; Taylor, Julie R.; Terrazas, Enrique; Brimhall, Brad

    2015-01-01

    The clinical laboratory is a major source of health care data. Increasingly these data are being integrated with other data to inform health system-wide actions meant to improve diagnostic test utilization, service efficiency, and “meaningful use.” The Academy of Clinical Laboratory Physicians and Scientists hosted a satellite meeting on clinical laboratory analytics in conjunction with their annual meeting on May 29, 2014 in San Francisco. There were 80 registrants for the clinical laboratory analytics meeting. The meeting featured short presentations on current trends in clinical laboratory analytics and several panel discussions on data science in laboratory medicine, laboratory data and its role in the larger healthcare system, integrating laboratory analytics, and data sharing for collaborative analytics. One main goal of the meeting was to have an open forum of leaders who work with the “big data” that clinical laboratories produce. This article summarizes the proceedings of the meeting and content discussed. PMID:25774320

  10. Clinical laboratory: bigger is not always better.

    PubMed

    Plebani, Mario

    2018-06-27

    Laboratory services around the world are undergoing substantial consolidation and changes through mechanisms ranging from mergers, acquisitions and outsourcing, primarily based on expectations of improving efficiency, increasing volumes and reducing the cost per test. However, the relationship between volume and costs is not linear, and numerous variables influence the end cost per test. In particular, the relationship between volumes and costs does not hold across the entire spectrum of clinical laboratories: high costs are associated with low volumes up to a threshold of 1 million tests per year. Over this threshold, there is no linear association between volumes and costs, as laboratory organization rather than test volume more significantly affects the final costs. Currently, data on laboratory errors and associated diagnostic errors and risk for patient harm emphasize the need for a paradigmatic shift: from a focus on volumes and efficiency to a patient-centered vision restoring the nature of laboratory services as an integral part of the diagnostic and therapeutic process. Process and outcome quality indicators are effective tools for measuring and improving laboratory services, stimulating competition based on intra- and extra-analytical performance specifications, intermediate outcomes and customer satisfaction. Rather than competing on economic value alone, clinical laboratories should adopt a strategy based on a set of harmonized quality indicators and performance specifications, active laboratory stewardship, and improved patient safety.

  11. US Naval Research Laboratory's Current Space Photovoltaic Experiments

    NASA Astrophysics Data System (ADS)

    Jenkins, Phillip; Walters, Robert; Messenger, Scott; Krasowski, Michael

    2008-09-01

    The US Naval Research Laboratory (NRL) has a rich history conducting space photovoltaic (PV) experiments starting with Vanguard I, the first solar-powered satellite, in 1958. Today, NRL, in collaboration with the NASA Glenn Research Center, is engaged in three flight experiments demonstrating a wide range of PV technologies in both LEO and HEO orbits. The Forward Technology Solar Cell Experiment (FTSCE)[1], part of the 5th Materials on the International Space Station Experiment (MISSE-5), flew for 13 months on the International Space Station in 2005-2006. The FTSCE provided in-situ I-V monitoring of advanced III-V multi-junction cells and laboratory prototypes of thin film and other next generation technologies. Two experiments under development will provide more opportunities to demonstrate advanced solar cells and characterization electronics that are easily integrated on a wide variety of spacecraft bus architectures.

  12. Simulation studies of a wide area health care network.

    PubMed Central

    McDaniel, J. G.

    1994-01-01

    There is an increasing number of efforts to install wide area health care networks. Some of these networks are being built to support several applications over a wide user base consisting primarily of medical practices, hospitals, pharmacies, medical laboratories, payors, and suppliers. Although on-line, multi-media telecommunication is desirable for some purposes such as cardiac monitoring, store-and-forward messaging is adequate for many common, high-volume applications. Laboratory test results and payment claims, for example, can be distributed using electronic messaging networks. Several network prototypes have been constructed to determine the technical problems and to assess the effectiveness of electronic messaging in wide area health care networks. Our project, Health Link, developed prototype software that was able to use the public switched telephone network to exchange messages automatically, reliably and securely. The network could be configured to accommodate the many different traffic patterns and cost constraints of its users. Discrete event simulations were performed on several network models. Canonical star and mesh networks, that were composed of nodes operating at steady state under equal loads, were modeled. Both topologies were found to support the throughput of a generic wide area health care network. The mean message delivery time of the mesh network was found to be less than that of the star network. Further simulations were conducted for a realistic large-scale health care network consisting of 1,553 doctors, 26 hospitals, four medical labs, one provincial lab and one insurer. Two network topologies were investigated: one using predominantly peer-to-peer communication, the other using client-server communication.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:7949966
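    The store-and-forward comparison in this record can be sketched with a toy discrete-event model: in a star topology every message is relayed through a hub (two store-and-forward hops), while in a full mesh it travels direct (one hop). The fixed per-hop latency below is purely illustrative; the Health Link simulations drew delays from realistic traffic patterns.

```python
# Toy discrete-event sketch comparing star (2 hops) vs mesh (1 hop)
# store-and-forward delivery. HOP_DELAY is an invented constant.

import heapq

HOP_DELAY = 1.0  # hours per store-and-forward hop (illustrative)

def simulate(send_times, hops_per_message):
    """Return mean delivery time for messages sent at the given times."""
    events = []  # (delivery_time, send_time) min-heap of deliveries
    for sent in send_times:
        heapq.heappush(events, (sent + hops_per_message * HOP_DELAY, sent))
    total = 0.0
    while events:
        delivered, sent = heapq.heappop(events)
        total += delivered - sent
    return total / len(send_times)

sends = [0.0, 0.5, 2.0]
print(simulate(sends, hops_per_message=2))  # star: 2.0
print(simulate(sends, hops_per_message=1))  # mesh: 1.0
```

    Even this toy model reproduces the qualitative finding reported above: mean delivery time in the mesh is lower than in the star, because the hub adds a relay hop to every message.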

  13. Teacher Leadership: Teacher Self-Assessment Tool

    ERIC Educational Resources Information Center

    American Institutes for Research, 2017

    2017-01-01

    As interest in teacher leadership has grown, many leading organizations have developed tools and guidance to support schools, districts, and teacher leaders themselves. In collaboration and consultation with the Regional Educational Laboratory (REL) Midwest Educator Effectiveness Research Alliance, REL Midwest and the Center on Great Teachers and…

  14. Development and implications of technology in reform-based physics laboratories

    NASA Astrophysics Data System (ADS)

    Chen, Sufen; Lo, Hao-Chang; Lin, Jing-Wen; Liang, Jyh-Chong; Chang, Hsin-Yi; Hwang, Fu-Kwun; Chiou, Guo-Li; Wu, Ying-Tien; Lee, Silvia Wen-Yu; Wu, Hsin-Kai; Wang, Chia-Yu; Tsai, Chin-Chung

    2012-12-01

    Technology has been widely involved in science research. Researchers are now applying it to science education in an attempt to bring students’ science activities closer to authentic science activities. The present study synthesizes the research to discuss the development of technology-enhanced laboratories and how technology may contribute to fulfilling the instructional objectives of laboratories in physics. To be more specific, this paper discusses the engagement of technology to innovate physics laboratories and the potential of technology to promote inquiry, instructor and peer interaction, and learning outcomes. We then construct a framework for teachers, scientists, and programmers to guide and evaluate technology-integrated laboratories. The framework includes inquiry learning and openness supported by technology, ways of conducting laboratories, and the diverse learning objectives on which a technology-integrated laboratory may be focused.

  15. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    1985-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is presented. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials are grouped into five general subject areas for easy reference: (1) the software engineering laboratory; (2) software tools; (3) models and measures; (4) technology evaluations; and (5) data collection. An index further classifies these documents by specific topic.

  16. Parallel and serial computing tools for testing single-locus and epistatic SNP effects of quantitative traits in genome-wide association studies

    PubMed Central

    Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang

    2008-01-01

    Background Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large scale GWAS. Results The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large scale GWAS and achieved excellent scalability for large scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large scale GWAS, and the epiSNP serial computing programs are convenient tools for epistasis analysis in small scale GWAS using commonly available computer hardware. PMID:18644146
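    The model terms named in this abstract can be illustrated with a short sketch of the standard genotype codings: an additive code counts minor alleles, a dominance code marks heterozygotes, and an additive-by-additive epistasis term is the product of two additive codes. This shows the regressors only, not the extended Kempthorne tests the programs implement.

```python
# Minimal sketch of single-locus and epistasis model terms.
# genotype = minor-allele count (0, 1, or 2). Codings are the
# conventional ones; this is not the EPISNP implementation.

def additive(genotype):
    return genotype - 1          # centered: -1, 0, 1

def dominance(genotype):
    return 1 if genotype == 1 else 0   # heterozygote indicator

def add_x_add(g1, g2):
    """Additive-by-additive epistasis term for a SNP pair."""
    return additive(g1) * additive(g2)

pairs = [(0, 0), (1, 2), (2, 2), (2, 0)]
print([add_x_add(a, b) for a, b in pairs])  # [1, 0, 1, -1]
```

    The other epistasis terms (additive × dominance, dominance × additive, dominance × dominance) are formed the same way from the corresponding codes, which is why a pair of SNPs contributes the five tests listed above.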

  17. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities, will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  18. Customizing Laboratory Information Systems: Closing the Functionality Gap.

    PubMed

    Gershkovich, Peter; Sinard, John H

    2015-09-01

    Highly customizable laboratory information systems help to address great variations in laboratory workflows, typical in Pathology. Often, however, built-in customization tools are not sufficient to add all of the desired functionality and improve systems interoperability. Emerging technologies and advances in medicine often create a void in functionality that we call a functionality gap. These gaps have distinct characteristics—a persuasive need to change the way a pathology group operates, the general availability of technology to address the missing functionality, the absence of this technology from your laboratory information system, and inability of built-in customization tools to address it. We emphasize the pervasive nature of these gaps, the role of pathology informatics in closing them, and suggest methods on how to achieve that. We found that a large number of the papers in the Journal of Pathology Informatics are concerned with these functionality gaps, and an even larger proportion of electronic posters and abstracts presented at the Pathology Informatics Summit conference each year deal directly with these unmet needs in pathology practice. A rapid, continuous, and sustainable approach to closing these gaps is critical for Pathology to provide the highest quality of care, adopt new technologies, and meet regulatory and financial challenges. The key element of successfully addressing functionality gaps is gap ownership—the ability to control the entire pathology information infrastructure with access to complementary systems and components. In addition, software developers with detailed domain expertise, equipped with right tools and methodology can effectively address these needs as they emerge.

  19. Biosafety Practices and Emergency Response at the Idaho National Laboratory and Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frank F. Roberto; Dina M. Matz

    2008-03-01

    Strict federal regulations govern the possession, use, and transfer of pathogens and toxins with potential to cause harm to the public, either through accidental or deliberate means. Laboratories registered through either the Centers for Disease Control and Prevention (CDC), the U.S. Dept. of Agriculture (USDA), or both, must prepare biosafety, security, and incident response plans, conduct drills or exercises on an annual basis, and update plans accordingly. At the Idaho National Laboratory (INL), biosafety, laboratory, and emergency management staff have been working together for 2 years to satisfy federal and DOE/NNSA requirements. This has been done through the establishment of plans, training, tabletop and walk-through exercises and drills, and coordination with local and regional emergency response personnel. Responding to the release of infectious agents or toxins is challenging, but through familiarization with the nature of the hazardous biological substances or organisms, and integration with laboratory-wide emergency response procedures, credible scenarios are being used to evaluate our ability to protect workers, the public, and the environment from agents we must work with to provide for national biodefense.

  20. Implementation of a School-wide Clinical Intervention Documentation System

    PubMed Central

    Stevenson, T. Lynn; Fox, Brent I.; Andrus, Miranda; Carroll, Dana

    2011-01-01

    Objective. To evaluate the effectiveness and impact of a customized Web-based software program implemented in 2006 for school-wide documentation of clinical interventions by pharmacy practice faculty members, pharmacy residents, and student pharmacists. Methods. The implementation process, directed by a committee of faculty members and school administrators, included preparation and refinement of the software, user training, development of forms and reports, and integration of the documentation process within the curriculum. Results. Use of the documentation tool consistently increased from May 2007 to December 2010. Over 187,000 interventions were documented with over $6.2 million in associated cost avoidance. Conclusions. Successful implementation of a school-wide documentation tool required considerable time from the oversight committee and a comprehensive training program for all users, with ongoing monitoring of data collection practices. Data collected proved to be useful to show the impact of faculty members, residents, and student pharmacists at affiliated training sites. PMID:21829264

  1. The "hospital central laboratory": automation, integration and clinical usefulness.

    PubMed

    Zaninotto, Martina; Plebani, Mario

    2010-07-01

    Recent technological developments in laboratory medicine have led to a major challenge: maintaining a close connection between the search for efficiency through automation and consolidation and the assurance of effectiveness. The adoption of systems that automate most of the manual tasks characterizing routine activities has significantly improved the quality of laboratory performance; total laboratory automation is the paradigm of the idea that "human-less" robotic laboratories may allow for better operation and fewer human errors. Furthermore, even if ongoing technological developments have considerably improved the productivity of clinical laboratories as well as reducing the turnaround time of the entire process, the value of qualified personnel remains a significant issue. Recent evidence confirms that automation allows clinical laboratories to improve analytical performance only if trained staff operate in accordance with well-defined standard operating procedures, thus assuring continuous monitoring of analytical quality. In addition, laboratory automation may improve the appropriateness of test requests through the use of algorithms and reflex testing. This should allow the adoption of clinical and biochemical guidelines. In conclusion, in laboratory medicine, technology represents a tool for improving clinical effectiveness and patient outcomes, but it has to be managed by qualified laboratory professionals.
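    Reflex testing of the kind mentioned above can be illustrated with a small rule: a follow-up test is added automatically when a screening result falls outside its reference interval. The thresholds and test names below are common textbook values used purely for illustration, not this laboratory's algorithm.

```python
# Illustrative reflex-testing rule: append free T4 when a screening
# TSH is outside its reference interval. Limits are illustrative.

TSH_LOW, TSH_HIGH = 0.4, 4.0  # mIU/L, illustrative reference interval

def reflex_orders(ordered_tests, results):
    """Return extra tests to add per a simple reflex algorithm."""
    extra = []
    if "TSH" in ordered_tests:
        tsh = results.get("TSH")
        if tsh is not None and not (TSH_LOW <= tsh <= TSH_HIGH):
            extra.append("FT4")  # reflex free T4 on abnormal TSH
    return extra

print(reflex_orders(["TSH"], {"TSH": 7.2}))  # ['FT4']
print(reflex_orders(["TSH"], {"TSH": 2.1}))  # []
```

    Encoding such rules in the automation layer is one concrete way guidelines become enforceable at the point of ordering, which is the appropriateness gain the abstract points to.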

  2. Introduction to Nucleonics: A Laboratory Course.

    ERIC Educational Resources Information Center

    Phelps, William; And Others

    This student text and laboratory manual is designed primarily for the non-college bound high school student. It can be adapted, however, to a wide range of abilities. It begins with an examination of the properties of nuclear radiation, develops an understanding of the fundamentals of nucleonics, and ends with an investigation of careers in areas…

  3. Towards an evaluation framework for Laboratory Information Systems.

    PubMed

    Yusof, Maryati M; Arifin, Azila

    Laboratory testing and reporting are error-prone and redundant due to repeated, unnecessary requests and delayed or missed reactions to laboratory reports. Occurring errors may negatively affect the patient treatment process and clinical decision making. Evaluation of laboratory testing and the Laboratory Information System (LIS) may explain the root causes, improve the testing process, and enhance the LIS in supporting the process. This paper discusses a new evaluation framework for LIS that encompasses the laboratory testing cycle and the socio-technical part of LIS. Literature review on discourses, dimensions and evaluation methods of laboratory testing and LIS. A critical appraisal of the Total Testing Process (TTP) and the human, organization, technology-fit (HOT-fit) evaluation frameworks was undertaken in order to identify error incidents, their contributing factors, and preventive actions pertinent to the laboratory testing process and LIS. A new evaluation framework for LIS using a comprehensive and socio-technical approach is outlined. A positive relationship between laboratory and clinical staff resulted in a smooth laboratory testing process, reduced errors and increased process efficiency, whilst effective use of LIS streamlined the testing processes. The TTP-LIS framework could serve as an assessment as well as a problem-solving tool for the laboratory testing process and system. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  4. Introductory Physics Laboratories for Life Scientists - Hands on Physics of Complex Systems

    NASA Astrophysics Data System (ADS)

    Losert, Wolfgang; Moore, Kim

    2015-03-01

    We have developed a set of laboratories and hands on activities to accompany a new two-semester interdisciplinary physics course that has been successfully implemented as the required physics course for premeds at the University of Maryland. The laboratories include significant content on physics relevant to cellular scales, from chemical interactions to random motion and charge screening in fluids. We also introduce the students to research-grade equipment and modern physics analysis tools in contexts relevant to biology, while maintaining the pedagogically valuable open-ended laboratory structure of reformed laboratories.

  5. Electrophysiology Tool Construction

    PubMed Central

    Ide, David

    2016-01-01

    This protocol documents the construction of a custom microscope stage system currently in widespread use by a wide variety of investigators. The current design and construction of this stage is the result of multiple iterations, integrating input from a number of electrophysiologists working with a variety of preparations. Thus, this tool is a generally applicable solution, suitable for a wide array of end-user requirements; its flexible design facilitates rapid and easy configuration, making it useful for multi-user microscopes, as individual researchers can reconfigure the stage system or have their own readily replaceable stage plates. Furthermore, the stage can be manufactured using equipment typically found in small research machine shops, and by keeping the various parts on hand, machinists can quickly satisfy new requests and/or modifications for a wide variety of applications. PMID:23315946

  6. Open access tools for quality-assured and efficient data entry in a large, state-wide tobacco survey in India

    PubMed Central

    Shewade, Hemant Deepak; Vidhubala, E; Subramani, Divyaraj Prabhakar; Lal, Pranay; Bhatt, Neelam; Sundaramoorthi, C.; Singh, Rana J.; Kumar, Ajay M. V.

    2017-01-01

    ABSTRACT Background: A large state-wide tobacco survey was conducted using a modified version of the pretested, globally validated Global Adult Tobacco Survey (GATS) questionnaire in 2015–2016 in Tamil Nadu, India. Due to resource constraints, data collection was carried out using paper-based questionnaires (unlike GATS-India, 2009–2010, which used hand-held computer devices), while data entry was done using open access tools. The objective of this paper is to describe the process of data entry and assess its quality assurance and efficiency. Methods: In EpiData terminology, a variable is referred to as a ‘field’ and a questionnaire (a set of fields) as a ‘record’. EpiData software was used for double data entry with adequate checks, followed by validation. TeamViewer was used for remote training and troubleshooting. The EpiData databases (one for each district and for each zone in Chennai city) were housed in shared Dropbox folders, which enabled secure file sharing and automatic back-up. Each district/zone database had a separate file for data entry of the household-level and individual-level questionnaires. Results: Across 32,945 households, there were 111,363 individuals aged ≥15 years. The average proportion of records with data entry errors for a district/zone in the household-level and individual-level files was 4% and 24%, respectively. These are errors that would have gone unnoticed had single entry been used. The median (inter-quartile range) time taken for double data entry of a single household-level and individual-level questionnaire was 30 (24, 40) s and 86 (64, 126) s, respectively. Conclusion: Efficient and quality-assured near-real-time data entry in a large sub-national tobacco survey was performed through innovative, resource-efficient use of open access tools. PMID:29092673

  7. Open access tools for quality-assured and efficient data entry in a large, state-wide tobacco survey in India.

    PubMed

    Shewade, Hemant Deepak; Vidhubala, E; Subramani, Divyaraj Prabhakar; Lal, Pranay; Bhatt, Neelam; Sundaramoorthi, C; Singh, Rana J; Kumar, Ajay M V

    2017-01-01

    A large state-wide tobacco survey was conducted using a modified version of the pretested, globally validated Global Adult Tobacco Survey (GATS) questionnaire in 2015-2016 in Tamil Nadu, India. Due to resource constraints, data collection was carried out using paper-based questionnaires (unlike GATS-India, 2009-2010, which used hand-held computer devices), while data entry was done using open access tools. The objective of this paper is to describe the process of data entry and assess its quality assurance and efficiency. In EpiData terminology, a variable is referred to as a 'field' and a questionnaire (a set of fields) as a 'record'. EpiData software was used for double data entry with adequate checks, followed by validation. TeamViewer was used for remote training and troubleshooting. The EpiData databases (one for each district and for each zone in Chennai city) were housed in shared Dropbox folders, which enabled secure file sharing and automatic back-up. Each district/zone database had a separate file for data entry of the household-level and individual-level questionnaires. Across 32,945 households, there were 111,363 individuals aged ≥15 years. The average proportion of records with data entry errors for a district/zone in the household-level and individual-level files was 4% and 24%, respectively. These are errors that would have gone unnoticed had single entry been used. The median (inter-quartile range) time taken for double data entry of a single household-level and individual-level questionnaire was 30 (24, 40) s and 86 (64, 126) s, respectively. Efficient and quality-assured near-real-time data entry in a large sub-national tobacco survey was performed through innovative, resource-efficient use of open access tools.
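    The double-data-entry check described in these two records can be sketched as follows: two independent entries of the same records are compared field by field, and any disagreement is flagged for validation against the paper questionnaire. This is a minimal illustration of the principle, not EpiData's actual implementation; the record and field names are invented.

    ```python
    def compare_entries(first_entry, second_entry):
        """Return a list of (record_id, field, value1, value2) discrepancies.

        Each argument maps record id -> dict of field -> entered value.
        """
        discrepancies = []
        for record_id, fields in first_entry.items():
            other = second_entry.get(record_id, {})
            for field, value in fields.items():
                if other.get(field) != value:
                    discrepancies.append((record_id, field, value, other.get(field)))
        return discrepancies

    def error_rate(first_entry, second_entry):
        """Proportion of records with at least one mismatched field."""
        mismatched = {rec for rec, _, _, _ in compare_entries(first_entry, second_entry)}
        return len(mismatched) / len(first_entry) if first_entry else 0.0

    if __name__ == "__main__":
        # Hypothetical household records; the second operator mistyped one age.
        entry_a = {"HH001": {"age": "34", "tobacco_use": "yes"},
                   "HH002": {"age": "52", "tobacco_use": "no"}}
        entry_b = {"HH001": {"age": "34", "tobacco_use": "yes"},
                   "HH002": {"age": "25", "tobacco_use": "no"}}
        print(compare_entries(entry_a, entry_b))  # [('HH002', 'age', '52', '25')]
        print(error_rate(entry_a, entry_b))       # 0.5
    ```

    The "errors that would have gone unnoticed" figure in the abstract corresponds to records caught by exactly this kind of comparison.
    
    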

  8. Systems engineering and integration: Advanced avionics laboratories

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In order to develop the new generation of avionics which will be necessary for upcoming programs such as the Lunar/Mars Initiative, Advanced Launch System, and the National Aerospace Plane, new Advanced Avionics Laboratories are required. To minimize costs and maximize benefits, these laboratories should be capable of supporting multiple avionics development efforts at a single location, and should be of a common design to support and encourage data sharing. Recent technological advances provide the capability of letting the designer or analyst perform simulations and testing in an environment similar to his engineering environment and these features should be incorporated into the new laboratories. Existing and emerging hardware and software standards must be incorporated wherever possible to provide additional cost savings and compatibility. Special care must be taken to design the laboratories such that real-time hardware-in-the-loop performance is not sacrificed in the pursuit of these goals. A special program-independent funding source should be identified for the development of Advanced Avionics Laboratories as resources supporting a wide range of upcoming NASA programs.

  9. The Litho-Density tool calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellis, D.; Flaum, C.; Marienbach, E.

    1983-10-01

    The Litho-Density tool (LDT) uses a gamma ray source and two NaI scintillator detectors for borehole measurement of electron density, ρe, and a quantity, Pe, which is related to the photoelectric cross section at 60 keV and therefore to the lithology of the formation. An active stabilization system controls the gains of the two detectors, which permits selective gamma-ray detection. Spectral analysis is performed in the near detector (2 energy windows) and in the detector farther from the source (3 energy windows). This paper describes the results of laboratory measurements undertaken to define the basic tool response. The tool is shown to provide reliable measurements of formation density and lithology under a variety of environmental conditions.
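    The density measurement behind this record rests on gamma-ray attenuation: to first order, a detector's count rate falls off exponentially with formation electron density, N = N0·exp(-k·ρe), so density can be recovered by inverting the measured rate. The sketch below illustrates only that inversion; the calibration constants (N0, k) are invented for illustration, whereas a real tool is calibrated against laboratory formations as the paper describes.

    ```python
    import math

    def electron_density(count_rate, n0=100_000.0, k=1.5):
        """Invert N = n0 * exp(-k * rho_e) for the electron density rho_e.

        n0 is the count rate at zero density and k the effective attenuation
        constant; both are hypothetical calibration values here.
        """
        return math.log(n0 / count_rate) / k

    if __name__ == "__main__":
        # A formation of rho_e = 2.0 would produce this count rate under the
        # assumed calibration; inverting it recovers the density.
        n = 100_000.0 * math.exp(-1.5 * 2.0)
        print(round(electron_density(n), 3))  # 2.0
    ```

    The real tool combines near- and far-detector spectra to correct for mudcake and borehole effects, which this single-detector sketch omits.
    
    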

  10. Harmonization of European laboratory response networks by implementing CWA 15793: use of a gap analysis and an "insider" exercise as tools.

    PubMed

    Sundqvist, Bo; Bengtsson, Ulrika Allard; Wisselink, Henk J; Peeters, Ben P H; van Rotterdam, Bart; Kampert, Evelien; Bereczky, Sándor; Johan Olsson, N G; Szekely Björndal, Asa; Zini, Sylvie; Allix, Sébastien; Knutsson, Rickard

    2013-09-01

    Laboratory response networks (LRNs) have been established for security reasons in several countries including the Netherlands, France, and Sweden. LRNs function in these countries as a preparedness measure for a coordinated diagnostic response capability in case of a bioterrorism incident or other biocrimes. Generally, these LRNs are organized on a national level. The EU project AniBioThreat has identified the need for an integrated European LRN to strengthen preparedness against animal bioterrorism. One task of the AniBioThreat project is to suggest a plan to implement laboratory biorisk management CWA 15793:2011 (CWA 15793), a management system built on the principle of continual improvement through the Plan-Do-Check-Act (PDCA) cycle. The implementation of CWA 15793 can facilitate trust and credibility in a future European LRN and is an assurance that the work done at the laboratories is performed in a structured way with continuous improvements. As a first step, a gap analysis was performed to establish the current compliance status of biosafety and laboratory biosecurity management with CWA 15793 in 5 AniBioThreat partner institutes in France (ANSES), the Netherlands (CVI and RIVM), and Sweden (SMI and SVA). All 5 partners are national and/or international laboratory reference institutes in the field of public or animal health and possess high-containment laboratories and animal facilities. The gap analysis showed that the participating institutes already have robust biorisk management programs in place, but several gaps were identified that need to be addressed. Despite differences between the participating institutes in their compliance status, these variations are not significant. Biorisk management exercises also have been identified as a useful tool to control compliance status and thereby implementation of CWA 15793. An exercise concerning an insider threat and loss of a biological agent was performed at SVA in the AniBioThreat project to evaluate

  11. Mouse Genome Informatics (MGI) Resource: Genetic, Genomic, and Biological Knowledgebase for the Laboratory Mouse.

    PubMed

    Eppig, Janan T

    2017-07-01

    The Mouse Genome Informatics (MGI) Resource supports basic, translational, and computational research by providing high-quality, integrated data on the genetics, genomics, and biology of the laboratory mouse. MGI serves a strategic role for the scientific community in facilitating biomedical, experimental, and computational studies investigating the genetics and processes of diseases and enabling the development and testing of new disease models and therapeutic interventions. This review describes the nexus of the body of growing genetic and biological data and the advances in computer technology in the late 1980s, including the World Wide Web, that together launched the beginnings of MGI. MGI develops and maintains a gold-standard resource that reflects the current state of knowledge, provides semantic and contextual data integration that fosters hypothesis testing, continually develops new and improved tools for searching and analysis, and partners with the scientific community to assure research data needs are met. Here we describe one slice of MGI relating to the development of community-wide large-scale mutagenesis and phenotyping projects and introduce ways to access and use these MGI data. References and links to additional MGI aspects are provided. © The Author 2017. Published by Oxford University Press.

  12. Mouse Genome Informatics (MGI) Resource: Genetic, Genomic, and Biological Knowledgebase for the Laboratory Mouse

    PubMed Central

    Eppig, Janan T.

    2017-01-01

    Abstract The Mouse Genome Informatics (MGI) Resource supports basic, translational, and computational research by providing high-quality, integrated data on the genetics, genomics, and biology of the laboratory mouse. MGI serves a strategic role for the scientific community in facilitating biomedical, experimental, and computational studies investigating the genetics and processes of diseases and enabling the development and testing of new disease models and therapeutic interventions. This review describes the nexus of the body of growing genetic and biological data and the advances in computer technology in the late 1980s, including the World Wide Web, that together launched the beginnings of MGI. MGI develops and maintains a gold-standard resource that reflects the current state of knowledge, provides semantic and contextual data integration that fosters hypothesis testing, continually develops new and improved tools for searching and analysis, and partners with the scientific community to assure research data needs are met. Here we describe one slice of MGI relating to the development of community-wide large-scale mutagenesis and phenotyping projects and introduce ways to access and use these MGI data. References and links to additional MGI aspects are provided. PMID:28838066

  13. Tool for Torquing Circular Electrical-Connector Collars

    NASA Technical Reports Server (NTRS)

    Gaulke, Kathryn; Werneth, Russell; Grunsfeld, John; O'Neill, Patrick; Snyder, Russ

    2006-01-01

    An improved tool has been devised for applying torque to lock and unlock knurled collars on circular electrical connectors. The tool was originally designed for, and used by, astronauts working in outer space on the Hubble Space Telescope (HST). The tool is readily adaptable to terrestrial use in installing and removing the same or similar circular electrical connectors as well as a wide variety of other cylindrical objects, the tightening and loosening of which entail considerable amounts of torque.

  14. Green Fluorescent Protein-Focused Bioinformatics Laboratory Experiment Suitable for Undergraduates in Biochemistry Courses

    ERIC Educational Resources Information Center

    Rowe, Laura

    2017-01-01

    An introductory bioinformatics laboratory experiment focused on protein analysis has been developed that is suitable for undergraduate students in introductory biochemistry courses. The laboratory experiment is designed to be potentially used as a "stand-alone" activity in which students are introduced to basic bioinformatics tools and…

  15. ARTS: a web-based tool for the set-up of high-throughput genome-wide mapping panels for the SNP genotyping of mouse mutants.

    PubMed

    Klaften, Matthias; Hrabé de Angelis, Martin

    2005-07-01

    Genome-wide mapping in the identification of novel candidate genes has always been the standard method in genetics and genomics to correlate a clinically interesting phenotypic trait with a genotype. However, performing a mapping experiment using classical microsatellite approaches can be very time-consuming. The high-throughput analysis of single-nucleotide polymorphisms (SNPs) has the potential to succeed the microsatellite analysis routinely used for these mapping approaches, where one of the major obstacles is the design of the appropriate SNP marker set itself. Here we report on ARTS, an advanced retrieval tool for SNPs, which allows researchers to freely comb the public mouse dbSNP database for multiple reference and test strains. Several filters can be applied in order to improve the sensitivity and specificity of the search results. By employing the panel generator function of this program, the extraction of reliable sequence data for a large marker panel covering several different mouse strains can be cut from days to minutes. The concept of ARTS is easily adaptable to other species for which SNP databases are available, making it a versatile tool for the use of SNPs as markers for genotyping. The web interface is accessible at http://andromeda.gsf.de/arts.
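    The core filtering step that a tool like ARTS automates can be sketched as follows: from a table of SNP genotypes, keep only markers that are polymorphic (and therefore informative) between a chosen reference strain and test strain. This is an illustrative reconstruction of the idea, not ARTS's actual code; the SNP ids and strain calls are invented.

    ```python
    def informative_markers(genotypes, reference, test):
        """Return SNP ids whose alleles differ between the two strains.

        genotypes: dict mapping SNP id -> dict of strain name -> allele.
        SNPs with a missing call in either strain are skipped, mirroring the
        kind of reliability filter the abstract alludes to.
        """
        panel = []
        for snp_id, calls in genotypes.items():
            ref_allele = calls.get(reference)
            test_allele = calls.get(test)
            if ref_allele and test_allele and ref_allele != test_allele:
                panel.append(snp_id)
        return panel

    if __name__ == "__main__":
        data = {
            "rs0001": {"C57BL/6J": "A", "C3H/HeJ": "G"},  # informative
            "rs0002": {"C57BL/6J": "T", "C3H/HeJ": "T"},  # monomorphic: dropped
            "rs0003": {"C57BL/6J": "C"},                  # missing call: dropped
        }
        print(informative_markers(data, "C57BL/6J", "C3H/HeJ"))  # ['rs0001']
    ```

    A genome-wide mapping panel is then assembled by picking such informative SNPs at roughly even spacing along each chromosome.
    
    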

  16. Building Cross-Country Networks for Laboratory Capacity and Improvement.

    PubMed

    Schneidman, Miriam; Matu, Martin; Nkengasong, John; Githui, Willie; Kalyesubula-Kibuuka, Simeon; Silva, Kelly Araujo

    2018-03-01

    Laboratory networks are vital to well-functioning public health systems and disease control efforts. Cross-country laboratory networks play a critical role in supporting epidemiologic surveillance, accelerating disease outbreak response, and tracking drug resistance. The East Africa Public Health Laboratory Network was established to bolster diagnostic and disease surveillance capacity. The network supports the introduction of regional quality standards; facilitates the rollout and evaluation of new diagnostic tools; and serves as a platform for training, research, and knowledge sharing. Participating facilities benefitted from state-of-the-art investments, capacity building, and mentorship; conducted multicountry research studies; and contributed to disease outbreak response. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Laboratory directed research and development program, FY 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-02-01

    The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab) Laboratory Directed Research and Development Program FY 1996 report is compiled from annual reports submitted by principal investigators following the close of the fiscal year. This report describes the projects supported and summarizes their accomplishments. It constitutes a part of the Laboratory Directed Research and Development (LDRD) program planning and documentation process that includes an annual planning cycle, project selection, implementation, and review. The Berkeley Lab LDRD program is a critical tool for directing the Laboratory's forefront scientific research capabilities toward vital, excellent, and emerging scientific challenges. The program provides the resources for Berkeley Lab scientists to make rapid and significant contributions to critical national science and technology problems. The LDRD program also advances the Laboratory's core competencies, foundations, and scientific capability, and permits exploration of exciting new opportunities. Areas eligible for support include: (1) Work in forefront areas of science and technology that enrich Laboratory research and development capability; (2) Advanced study of new hypotheses, new experiments, and innovative approaches to develop new concepts or knowledge; (3) Experiments directed toward proof of principle for initial hypothesis testing or verification; and (4) Conception and preliminary technical analysis to explore possible instrumentation, experimental facilities, or new devices.

  18. LabRS: A Rosetta stone for retrospective standardization of clinical laboratory test results.

    PubMed

    Hauser, Ronald George; Quine, Douglas B; Ryder, Alex

    2018-02-01

    Clinical laboratories in the United States do not have an explicit result standard to report the 7 billion laboratory tests results they produce each year. The absence of standardized test results creates inefficiencies and ambiguities for secondary data users. We developed and tested a tool to standardize the results of laboratory tests in a large, multicenter clinical data warehouse. Laboratory records, each of which consisted of a laboratory result and a test identifier, from 27 diverse facilities were captured from 2000 through 2015. Each record underwent a standardization process to convert the original result into a format amenable to secondary data analysis. The standardization process included the correction of typos, normalization of categorical results, separation of inequalities from numbers, and conversion of numbers represented by words (eg, "million") to numerals. Quality control included expert review. We obtained 1.266 × 10⁹ laboratory records and standardized 1.252 × 10⁹ records (98.9%). Of the unique unstandardized records (78.887 × 10³), most appeared <5 times (96%, eg, typos), did not have a test identifier (47%), or belonged to an esoteric test with <100 results (2%). Overall, these 3 reasons accounted for nearly all unstandardized results (98%). Current results suggest that the tool is both scalable and generalizable among diverse clinical laboratories. Based on observed trends, the tool will require ongoing maintenance to stay current with new tests and result formats. Future work to develop and implement an explicit standard for test results would reduce the need to retrospectively standardize test results. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
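    The transformations the abstract names (categorical normalization, separating inequalities from numbers, converting word-numbers to numerals) can be sketched in a few lines. The rule set below follows the paper's description but the specific mappings and the output format are illustrative assumptions, not the LabRS implementation.

    ```python
    import re

    # Hypothetical lookup tables; a production tool would carry far larger ones.
    WORD_NUMBERS = {"million": 1_000_000, "thousand": 1_000}
    CATEGORICAL = {"neg": "negative", "negative": "negative",
                   "pos": "positive", "positive": "positive"}

    def standardize(raw):
        """Return a normalized category, or a (comparator, value) pair.

        Results that match no rule return None, analogous to the ~1%
        residue of unstandardized records reported in the paper.
        """
        text = raw.strip().lower()
        if text in CATEGORICAL:
            return CATEGORICAL[text]
        match = re.match(r"([<>]=?)?\s*([\d.]+)\s*(million|thousand)?$", text)
        if match:
            comparator, number, word = match.groups()
            value = float(number) * WORD_NUMBERS.get(word, 1)
            return (comparator or "=", value)
        return None

    if __name__ == "__main__":
        print(standardize("Neg"))          # 'negative'
        print(standardize("<5"))           # ('<', 5.0)
        print(standardize("1.2 million"))  # ('=', 1200000.0)
    ```

    Separating the comparator from the number is what makes values such as "<5" usable in downstream numeric analysis.
    
    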

  19. OPTIMIZING USABILITY OF AN ECONOMIC DECISION SUPPORT TOOL: PROTOTYPE OF THE EQUIPT TOOL.

    PubMed

    Cheung, Kei Long; Hiligsmann, Mickaël; Präger, Maximilian; Jones, Teresa; Józwiak-Hagymásy, Judit; Muñoz, Celia; Lester-George, Adam; Pokhrel, Subhash; López-Nicolás, Ángel; Trapero-Bertran, Marta; Evers, Silvia M A A; de Vries, Hein

    2018-01-01

    Economic decision-support tools can provide valuable information for tobacco control stakeholders, but their usability may impact the adoption of such tools. This study aims to illustrate a mixed-method usability evaluation of an economic decision-support tool for tobacco control, using the EQUIPT ROI tool prototype as a case study. A cross-sectional mixed methods design was used, including a heuristic evaluation, a thinking-aloud approach, and a questionnaire testing and exploring the usability of the Return on Investment tool. A total of sixty-six users evaluated the tool (thinking aloud) and completed the questionnaire. For the heuristic evaluation, four experts evaluated the interface. In total, twenty-one percent of the respondents perceived good usability. A total of 118 usability problems were identified, of which twenty-six were categorized as most severe, indicating high priority to fix them before implementation. Combining user-based and expert-based evaluation methods is recommended, as these were shown to identify unique usability problems. The evaluation provides input to optimize usability of a decision-support tool, and may serve as a vantage point for other developers to conduct usability evaluations to refine similar tools before wide-scale implementation. Such studies could reduce implementation gaps by optimizing usability, enhancing in turn the research impact of such interventions.

  20. Improvements in diagnostic tools for early detection of psoriatic arthritis.

    PubMed

    D'Angelo, Salvatore; Palazzi, Carlo; Gilio, Michele; Leccese, Pietro; Padula, Angela; Olivieri, Ignazio

    2016-11-01

    Psoriatic arthritis (PsA) is a heterogeneous chronic inflammatory disease characterized by a wide clinical spectrum. The early diagnosis of PsA is currently a challenging topic. Areas covered: The literature was extensively reviewed for studies addressing the topic area "diagnosis of psoriatic arthritis". This review will summarize improvements in diagnostic tools, especially referral to the rheumatologist, the role of patient history and clinical examination, laboratory tests, and imaging techniques in getting an early and correct diagnosis of PsA. Expert commentary: Due to the heterogeneity of its expression, PsA may be easily either overdiagnosed or underdiagnosed. A diagnosis of PsA should be taken into account every time a patient with psoriasis or a family history of psoriasis shows peripheral arthritis, especially if oligoarticular or involving the distal interphalangeal joints, enthesitis or dactylitis. Magnetic resonance imaging and ultrasonography are useful for diagnosing PsA early, particularly when isolated enthesitis or inflammatory spinal pain occur.

  1. [Development of novel laboratory technology--Chairmen's introductory remarks].

    PubMed

    Maekawa, Masato; Ando, Yukio

    2012-07-01

    The theme of the 58th annual meeting is, "Mission and Challenge of Laboratory Medicine". This symposium is named, "Development of Novel Laboratory Technology" and is held under the joint sponsorship of the Japanese Society of Clinical Chemistry and the Japanese Electrophoresis Society. Both societies have superior skills at developing methodology and technology. The tools used in the lectures are a carbon nanotube sensor, immunochromatography, direct measurement using polyanions and detergents, epigenomic analysis and fluorescent two-dimensional electrophoresis. All of the lectures will be very helpful and interesting.

  2. Laboratory medicine and sports: between Scylla and Charybdis.

    PubMed

    Lippi, Giuseppe; Banfi, Giuseppe; Botrè, Francesco; de la Torre, Xavier; De Vita, Francesco; Gomez-Cabrera, Mari Carmen; Maffulli, Nicola; Marchioro, Lucio; Pacifici, Roberta; Sanchis-Gomar, Fabian; Schena, Federico; Plebani, Mario

    2012-02-28

    Laboratory medicine is complex and contributes to the diagnosis, therapeutic monitoring and follow-up of acquired and inherited human disorders. The regular practice of physical exercise provides important benefits in health and disease, and sports medicine is thereby receiving growing focus from almost every clinical discipline, including laboratory medicine. Sport-laboratory medicine is a relatively innovative branch of laboratory science, which can provide valuable contributions to the diagnosis and follow-up of athletic injuries, and which is acquiring a growing clinical significance to support biomechanics and identify novel genomics and "exercisenomics" patterns that can help identify a specific athlete's tendency towards certain types of sport traumas and injuries. Laboratory medicine can also provide sport physicians and coaches with valuable clues about personal inclination towards a certain sport, health status, fitness and nutritional deficiencies of professional, elite and recreational athletes in order to enable a better and earlier prediction of sport injuries, overreaching and overtraining. Finally, the wide armamentarium of laboratory tests represents the milestone for identifying cheating athletes in the strenuous fight against doping in sports.

  3. A study of social interaction and teamwork in reformed physics laboratories

    NASA Astrophysics Data System (ADS)

    Gresser, Paul W.

    It is widely accepted that, for many students, learning can be accomplished most effectively through social interaction with peers, and there have been many successes in using the group environment to improve learning in a variety of classroom settings. What is not well understood, however, are the dynamics of student groups, specifically how the students collectively apprehend the subject matter and share the mental workload. This research examines recent developments of theoretical tools for describing the cognitive states of individual students: associational patterns such as epistemic games and cultural structures such as epistemological framing. Observing small group interaction in authentic classroom situations (labs, tutorials, problem solving) suggests that these tools could be effective in describing these interactions. Though conventional wisdom tells us that groups may succeed where individuals fail, there are many reasons why group work may also run into difficulties, such as a lack or imbalance of knowledge, an inappropriate mix of learning styles, or a destructive power arrangement. This research explores whether or not inconsistent epistemological framing among group members can also be a cause of group failure. Case studies of group interaction in the laboratory reveal evidence of successful groups employing common framing, and unsuccessful groups failing from lack of a shared frame. This study was conducted in a large introductory algebra-based physics course at the University of Maryland, College Park, in a laboratory designed specifically to foster increased student interaction and cooperation. Videotape studies of this environment reveal that productive lab groups coordinate their efforts through a number of locally coherent knowledge-building activities, which are described through the framework of epistemic games. The existence of these epistemic games makes it possible for many students to participate in cognitive activities without a

  4. Radiation and Health Technology Laboratory Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihl, Donald E.; Lynch, Timothy P.; Murphy, Mark K.

    2005-07-09

    The Radiological Standards and Calibrations Laboratory, a part of Pacific Northwest National Laboratory (PNNL), performs calibrations and upholds reference standards necessary to maintain traceability to national standards. The facility supports U.S. Department of Energy (DOE) programs at the Hanford Site, programs sponsored by DOE Headquarters and other federal agencies, radiological protection programs at other DOE and commercial nuclear sites, and research and characterization programs sponsored through the commercial sector. The laboratory is located in the 318 Building of the Hanford Site's 300 Area. The facility contains five major exposure rooms and several laboratories used for exposure work preparation, low-activity instrument calibrations, instrument performance evaluations, instrument maintenance, instrument design and fabrication work, thermoluminescent and radiochromic dosimetry, and calibration of measurement and test equipment (M&TE). The major exposure facilities are a low-scatter room used for neutron and photon exposures, a source well room used for high-volume instrument calibration work, an x-ray facility used for energy response studies, a high-exposure facility used for high-rate photon calibration work, a beta standards laboratory used for beta energy response studies and beta reference calibrations, and M&TE laboratories. Calibrations are routinely performed for personnel dosimeters, health physics instrumentation, photon and neutron transfer standards, alpha, beta, and gamma field sources used throughout the Hanford Site, and a wide variety of M&TE. This report describes the standards and calibrations laboratory.

  5. Integration of digital gross pathology images for enterprise-wide access

    PubMed Central

    Amin, Milon; Sharma, Gaurav; Parwani, Anil V.; Anderson, Ralph; Kolowitz, Brian J; Piccoli, Anthony; Shrestha, Rasu B.; Lauro, Gonzalo Romero; Pantanowitz, Liron

    2012-01-01

    Background: Sharing digital pathology images for enterprise-wide use into a picture archiving and communication system (PACS) is not yet widely adopted. We share our solution and 3-year experience of transmitting such images to an enterprise image server (EIS). Methods: Gross pathology images acquired by prosectors were integrated with clinical cases into the laboratory information system's image management module, and stored in JPEG2000 format on a networked image server. Automated daily searches for cases with gross images were used to compile an ASCII text file that was forwarded to a separate institutional Enterprise Digital Imaging and Communications in Medicine (DICOM) Wrapper (EDW) server. Concurrently, an HL7-based image order for these cases was generated, containing the locations of images and patient data, and forwarded to the EDW, which combined data in these locations to generate images with patient data, as required by DICOM standards. The image and data were then "wrapped" according to DICOM standards, transferred to the PACS servers, and made accessible on an institution-wide basis. Results: In total, 26,966 gross images from 9,733 cases were transmitted over the 3-year period from the laboratory information system to the EIS. The average process time for cases with successful automatic uploads (n=9,688) to the EIS was 98 seconds. Only 45 cases (0.5%) failed, requiring manual intervention. Uploaded images were immediately available to institution-wide PACS users. Since inception, user feedback has been positive. Conclusions: Enterprise-wide PACS-based sharing of pathology images is feasible, provides useful services to clinical staff, and utilizes existing information system and telecommunications infrastructure. PACS-shared pathology images, however, require a "DICOM wrapper" for multisystem compatibility. PMID:22530178

  6. Laboratory systems integration: robotics and automation.

    PubMed

    Felder, R A

    1991-01-01

    Robotic technology is going to have a profound impact on the clinical laboratory of the future. Faced with increased pressure to reduce health care spending yet increase services to patients, many laboratories are looking for alternatives to the inflexible or "fixed" automation found in many clinical analyzers. Robots are being examined by many clinical pathologists as an attractive technology that can adapt to the constant changes in laboratory testing. Already, laboratory designs are being altered to accommodate robotics and automated specimen processors. However, the use of robotics and computer intelligence in the clinical laboratory is still in its infancy. Successful examples of robotic automation exist in several laboratories. Investigators have used robots to automate endocrine testing, high performance liquid chromatography, and specimen transportation. Large commercial laboratories are investigating the use of specimen processors which combine fixed automation and robotics. Robotics have also reduced the exposure of medical technologists to specimens infected with viral pathogens. The successful examples of clinical robotics applications were a result of the cooperation of clinical chemists, engineers, and medical technologists. At the University of Virginia we have designed and implemented a robotic critical care laboratory. Initial clinical experience suggests that robotic performance is reliable; however, staff acceptance and utilization require continuing education. We are also developing a robotic cyclosporine assay which promises to greatly reduce the labor costs of this analysis. The future will bring lab-wide automation that will fully integrate computer artificial intelligence and robotics. Specimens will be transported by mobile robots. Specimen processing, aliquotting, and scheduling will be automated.(ABSTRACT TRUNCATED AT 250 WORDS)

  7. Ranking protective coatings: Laboratory vs. field experience

    NASA Astrophysics Data System (ADS)

    Conner, Jeffrey A.; Connor, William B.

    1994-12-01

    Environmentally protective coatings are used on a wide range of gas turbine components for survival in the harsh operating conditions of engines. A host of coatings are commercially available to protect hot-section components, ranging from simple aluminides to designer metallic overlays and ceramic thermal barrier coatings. A variety of coating-application processes are available, and they range from simple pack cementation processing to complex physical vapor deposition, which requires multimillion dollar facilities. Detailed databases are available for most coatings and coating/process combinations for a range of laboratory tests. Still, the analysis of components actually used in engines often yields surprises when compared against predicted coating behavior from laboratory testing. This paper highlights recent work to develop new laboratory tests that better simulate engine environments. Comparison of in-flight coating performance as well as industrial and factory engine testing on a range of hardware is presented along with laboratory predictions from standard testing and from recently developed cyclic burner-rig testing.

  8. Use of proficiency test performance to determine clinical laboratory director qualifications.

    PubMed

    Howanitz, P J

    1988-04-01

    Many activities and policies influence laboratory test quality. Proficiency test results are one measure of laboratory quality, and during the past 25 years, five studies have examined the relationship of laboratory director educational requirements to proficiency test results. Data from three studies support the association between director qualifications and quality as measured by proficiency test performance, whereas no relationship was found in the other two studies. Possible reasons for conflicting results include differences in database size and demographics; in addition, proficiency test results may be inappropriate, although widely used, as the sole measure of laboratory director performance.

  9. Challenges to laboratory hematology practice: Egypt perspective.

    PubMed

    Rizk, S H

    2018-05-01

    Laboratory hematology is an integral part of all clinical laboratories across the extensive healthcare facilities in Egypt. The aim of this review is to portray laboratory hematology practice in Egypt, including its unique socioeconomic background, blood disease pattern, education and training, regulatory oversight, and the related challenges. Current practice varies widely between different parts of the healthcare system in terms of the range of tests, applied techniques, workforce experience, and quality of service. The national blood transfusion service (NBTS) in Egypt has recently been upgraded and standardized according to World Health Organization (WHO) guidelines. Formal postgraduate education roughly follows the British system: laboratory hematology specialization is achieved through a 2-3 year master's degree followed by a 2-4 year doctorate in clinical pathology, with training and research in hematology. Laboratory hematology education is currently undergoing reform as part of the modernization of higher education policy, following the standards developed by the National Quality Assurance and Accreditation Agency (NQAAA). Accreditation of medical laboratories has recently progressed with the establishment of the "Egyptian Accreditation Council" (EGAC) as the sole accreditation body and with the training of assessors. The current laboratory system faces many challenges; some are related to inadequate system performance, and others are unique to laboratory hematology. The rapid technological advances and therapeutic innovations in hematology practice call for an adaptive laboratory system with continuous upgrading. © 2018 John Wiley & Sons Ltd.

  10. Contributions of academic laboratories to the discovery and development of chemical biology tools.

    PubMed

    Huryn, Donna M; Resnick, Lynn O; Wipf, Peter

    2013-09-26

    The academic setting provides an environment that may foster success in the discovery of certain types of small molecule tools while proving less suitable in others. For example, small molecule probes for poorly understood systems, those that exploit a specific resident expertise, and those whose commercial return is not apparent are ideally suited to be pursued in a university setting. In this review, we highlight five projects that emanated from academic research groups and generated valuable tool compounds that have been used to interrogate biological phenomena: reactive oxygen species (ROS) sensors, GPR30 agonists and antagonists, selective CB2 agonists, Hsp70 modulators, and β-amyloid PET imaging agents. By taking advantage of the unique expertise resident in university settings and the ability to pursue novel projects that may have great scientific value but with limited or no immediate commercial value, probes from academic research groups continue to provide useful tools and generate a long-term resource for biomedical researchers.

  11. The recording of student performance in the microbiology laboratory as a training, tutorial, and motivational tool.

    PubMed

    Lipson, Steven M; Gair, Marina

    2011-01-01

    The laboratory component of a microbiology course consists of exercises which mandate a level of proficiency and manual dexterity equal to and often beyond that recognized among other biology courses. Bacterial growth, maintenance, identification (e.g., Gram stain, biochemical tests, genomics), as well as the continuous need to maintain laboratory safety and sterile technique, are only a few skills/responsibilities critical to the discipline of microbiology. Performance of the Gram stain remains one of the most basic and pivotal skills that must be mastered in the microbiology laboratory. However, a number of students continually have difficulty executing the Gram stain and preparative procedures associated with the test. In order to address this issue, we incorporated real-time digital recording as a supplemental teaching aid in the microbiology laboratory. Our use of the digital movie camera in the teaching setting served to enhance interest, motivate students, and in general, improve student performance.

  12. Laboratory Plasma Source as an MHD Model for Astrophysical Jets

    NASA Technical Reports Server (NTRS)

    Mayo, Robert M.

    1997-01-01

    The significance of the work described herein lies in the demonstration that Magnetized Coaxial Plasma Gun (MCG) devices like CPS-1 can produce energetic laboratory magneto-flows with embedded magnetic fields that can be used as a simulation tool to study the flow interaction dynamics of jet flows, to demonstrate the magnetic acceleration and collimation of flows with primarily toroidal fields, and to study cross-field transport in turbulent accreting flows. Since plasmas produced in MCG devices have magnetic topology and MHD flow regime similarity to stellar and extragalactic jets, we expect that careful investigation of these flows in the laboratory will reveal fundamental physical mechanisms influencing astrophysical flows. Discussion in the next section (sec. 2) focuses on recent results describing collimation, leading flow surface interaction layers, and turbulent accretion. The primary objectives for a new three-year effort would involve the development and deployment of novel electrostatic, magnetic, and visible plasma diagnostic techniques to measure plasma and flow parameters of the CPS-1 device in the flow chamber downstream of the plasma source to study (1) the mass ejection, morphology, collimation, and stability of energetic outflows; (2) the effects of external magnetization on collimation and stability; (3) the interaction of such flows with background neutral gas, the generation of visible emission in such interactions, and the effect of neutral clouds on jet flow dynamics; and (4) the cross-magnetic-field transport of turbulent accreting flows. The applicability of existing laboratory plasma facilities to the study of stellar and extragalactic plasmas should be exploited to elucidate underlying physical mechanisms that cannot be ascertained through astrophysical observation, and to provide a baseline for a wide variety of proposed models, MHD and otherwise. The work proposed herein represents a continued effort on a novel approach in relating laboratory experiments to

  13. The pediatric hematology/oncology educational laboratory in-training examination (PHOELIX): A formative evaluation of laboratory skills for Canadian pediatric hematology/oncology trainees.

    PubMed

    Leung, Elaine; Dix, David; Ford, Jason; Barnard, Dorothy; McBride, Eileen

    2015-11-01

    Pediatric hematologists/oncologists need to be skilled clinicians, and must also be adept and knowledgeable in relevant areas of laboratory medicine. Canadian training programs in this subspecialty have a minimum requirement of 6 months of training in acquiring "relevant laboratory diagnostic skills." The Canadian pediatric hematology/oncology (PHO) national specialty society, C17, recognized the need for an assessment method in laboratory skills for fellows graduating from PHO training programs. Canadian pediatric hematologists/oncologists were surveyed regarding the essential laboratory-related knowledge and skills deemed necessary for graduating PHO trainees. The PHOELIX (Pediatric hematology/oncology educational laboratory in-training examination) was then developed to provide an annual formative evaluation of laboratory skills in Canadian PHO trainees. The majority of PHO respondents (89%) felt that laboratory skills are important in clinical practice. An annual formative examination including review of glass slides was implemented starting in 2010; this provides feedback regarding knowledge of laboratory medicine to both trainees and program directors (PDs). We have successfully created a formative examination that can be used to evaluate and educate trainees, as well as provide PDs with a tool to gauge the effectiveness of their laboratory training curriculum. Feedback has been positive from both trainees and PDs. © 2015 Wiley Periodicals, Inc.

  14. Develop virtual joint laboratory for education like distance engineering system for robotic applications

    NASA Astrophysics Data System (ADS)

    Latinovic, T. S.; Deaconu, S. I.; Latinović, M. T.; Malešević, N.; Barz, C.

    2015-06-01

    This paper presents a new system that provides distance learning and online training for engineers. The purpose of this paper is to develop and provide a web-based system for the handling and control of remote devices via the Internet. The remote devices are currently industrial or mobile robots [13]; in the future, production machines in the factory will be included in the system. This article also discusses the current use of virtual reality tools in the fields of science and engineering education. One programming tool in particular, the Virtual Reality Modeling Language (VRML), is presented in the light of its applications and capabilities in the development of computer visualization tools for education. One contribution of this paper is to present the software tools and examples that can encourage educators to develop virtual reality models to improve teaching in their discipline [12]. This paper aims to introduce a software platform, called VALIP, where users can build, share, and manipulate 3D content in cooperation with the interaction processes in a 3D context, while the participating hardware and software devices can be physically and/or logically distributed and connected together via the Internet. VALIP integrates the virtual laboratories of the appropriate partners, thereby allowing access to all laboratories of any partner in the project. VALIP provides an advanced laboratory for training and research within robotics and production engineering, and thus offers extensive laboratory facilities while requiring only a limited investment of resources at each local partner site.

  15. Design and development of a solar powered mobile laboratory

    NASA Astrophysics Data System (ADS)

    Jiao, L.; Simon, A.; Barrera, H.; Acharya, V.; Repke, W.

    2016-08-01

    This paper describes the design and development of a solar powered mobile laboratory (SPML) system. The SPML provides a mobile platform that schools, universities, and communities can use to give students and staff access to laboratory environments where dedicated laboratories are not available. The lab includes equipment such as 3D printers, computers, and soldering stations. The primary power source of the system is solar PV, which allows the laboratory to be operated in places where grid power is not readily available or not sufficient to power all the equipment. The main system components include PV panels, a junction box, a battery, a charge controller, and an inverter. Not only is the system used to teach students and staff how to use the lab equipment, but it is also a great tool for educating the public about solar PV technologies.
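
    A back-of-the-envelope sizing calculation shows how the listed components (PV panels, battery, charge controller, inverter) relate to the laboratory's load. All numbers below (equipment wattages, operating hours, sun hours, bus voltage) are illustrative assumptions, not figures from the paper:

```python
# Rough solar PV sizing for a mobile lab; all figures are illustrative.
loads_w = {"3D printer": 300, "computers": 400, "soldering stations": 200}
hours_per_day = 6           # assumed daily operating hours
peak_sun_hours = 5          # assumed location-dependent insolation
system_losses = 0.8         # combined inverter/wiring/charge-controller efficiency
battery_voltage = 24        # assumed DC bus voltage (V)
depth_of_discharge = 0.5    # keep lead-acid batteries above 50% charge
autonomy_days = 1           # reserve for one cloudy day

daily_wh = sum(loads_w.values()) * hours_per_day           # energy demand per day
pv_watts = daily_wh / (peak_sun_hours * system_losses)     # required array size
battery_ah = daily_wh * autonomy_days / (battery_voltage * depth_of_discharge)

print(f"Daily load:   {daily_wh:.0f} Wh")
print(f"PV array:     {pv_watts:.0f} W")
print(f"Battery bank: {battery_ah:.0f} Ah at {battery_voltage} V")
```

    With these assumed numbers the 900 W of equipment translates into roughly a 1.4 kW array and a 450 Ah battery bank; a real design would also derate for temperature and panel orientation.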

  16. Strengths of the Northwell Health Laboratory Service Line

    PubMed Central

    Balfour, Erika; Stallone, Robert; Castagnaro, Joseph; Poczter, Hannah; Schron, Deborah; Martone, James; Breining, Dwayne; Simpkins, Henry; Neglia, Tom; Kalish, Paul

    2016-01-01

    From 2009 to 2015, the laboratories of the 19-hospital North Shore-LIJ Health System experienced 5 threatened interruptions in service and supported 2 regional health-care providers with threatened interruptions in their laboratory service. We report our strategies to maintain laboratory performance during these events, drawing upon the strengths of our integrated laboratory service line. Established in 2009, the laboratory service line has unified medical and administrative leadership and system-wide divisional structure, quality management, and standardization of operations and procedures. Among many benefits, this governance structure enabled the laboratories to respond to a series of unexpected events. Specifically, at our various service sites, the laboratories dealt with a pandemic (2009), 2 floods (2010, 2012), 2 fires (2010, 2015), and laboratory floor subsidence (2013). We were also asked to provide support for a regional physician network facing abrupt loss of testing services from closure of another regional clinical laboratory (2010) and to intervene for a non-health system hospital threatened with closure owing to noncompliance of laboratory operations (2012). In all but a single instance, patient care was maintained without interruption in service. In the last instance, fire interrupted laboratory services for 30 minutes. We conclude that in a large integrated health system, threats to continuous laboratory operations are not infrequent when measured on an annual basis. While most threats are from external physical circumstances, some emanate from unexpected administrative events. A strong laboratory governance mechanism that includes unified medical and administrative leadership across the entirety of the laboratory service line enables successful responses to these threats. PMID:28725768

  17. Idaho National Laboratory Cultural Resource Management Annual Report FY 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clayton F. Marler; Julie Braun; Hollie Gilbert

    2007-04-01

    The Idaho National Laboratory Site is home to vast numbers and a wide variety of important cultural resources representing at least a 13,500-year span of human occupation in the region. As a federal agency, the Department of Energy Idaho Operations Office has legal responsibility for the management and protection of those resources and has delegated these responsibilities to its primary contractor, Battelle Energy Alliance (BEA). The INL Cultural Resource Management Office, staffed by BEA professionals, is committed to maintaining a cultural resource management program that accepts these challenges in a manner reflecting the resources’ importance in local, regional, and national history. This annual report summarizes activities performed by the INL Cultural Resource Management Office staff during Fiscal Year 2006. This work is diverse, far-reaching and though generally confined to INL cultural resource compliance, also includes a myriad of professional and voluntary community activities. This document is intended to be both informative to internal and external stakeholders, and to serve as a planning tool for future cultural resource management work to be conducted on the INL.

  18. Idaho National Laboratory Cultural Resource Management Annual Report FY 2007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Julie Braun; Hollie Gilbert; Dino Lowrey

    2008-03-01

    The Idaho National Laboratory (INL) Site is home to vast numbers and a wide variety of important cultural resources representing at least a 13,500-year span of human land use in the region. As a federal agency, the Department of Energy Idaho Operations Office has legal responsibility for the management and protection of those resources and has delegated these responsibilities to its primary contractor, Battelle Energy Alliance (BEA). The BEA professional staff is committed to maintaining a cultural resource management program that accepts these challenges in a manner reflecting the resources’ importance in local, regional, and national history. This annual report summarizes activities performed by the INL Cultural Resource Management Office (CRMO) staff during fiscal year 2007. This work is diverse, far-reaching and though generally confined to INL cultural resource compliance, also includes a myriad of professional and voluntary community activities. This document is intended to be both informative to internal and external stakeholders, and to serve as a planning tool for future cultural resource management work to be conducted on the INL.

  19. A laboratory system for element specific hyperspectral X-ray imaging.

    PubMed

    Jacques, Simon D M; Egan, Christopher K; Wilson, Matthew D; Veale, Matthew C; Seller, Paul; Cernik, Robert J

    2013-02-21

    X-ray tomography is a ubiquitous tool used, for example, in medical diagnosis, explosives detection or to check structural integrity of complex engineered components. Conventional tomographic images are formed by measuring many transmitted X-rays and later mathematically reconstructing the object; however, the structural and chemical information carried by scattered X-rays of different wavelengths is not utilised in any way. We show how a very simple, laboratory-based, high-energy X-ray system can capture these scattered X-rays to deliver 3D images with structural or chemical information in each voxel. This type of imaging can be used to separate and identify chemical species in bulk objects with no special sample preparation. We demonstrate the capability of hyperspectral imaging by examining an electronic device where we can clearly distinguish the atomic composition of the circuit board components in both fluorescence and transmission geometries. We are not only able to obtain attenuation contrast but also to image chemical variations in the object, potentially opening up a very wide range of applications from security to medical diagnostics.

  20. Range-wide success of red-cockaded woodpecker translocations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, John W; Costa, Ralph

    2004-12-31

    Edwards, John W.; Costa, Ralph. 2004. Range-wide success of red-cockaded woodpecker translocations. In: Red-cockaded woodpecker; Road to Recovery. Proceedings of the 4th Red-cockaded woodpecker Symposium. Ralph Costa and Susan J. Daniels, eds. Savannah, Georgia. January, 2003. Chapter 6. Translocation. Pp 307-311. Abstract: Red-cockaded woodpeckers (Picoides borealis) have declined range-wide during the past century, suffering from habitat loss and the effects of fire exclusion in older southern pine forests. Red-cockaded woodpecker translocations are a potentially important tool in conservation efforts to reestablish red-cockaded woodpeckers in areas from which they have been extirpated. Currently, translocations are critical in ongoing efforts to save and restore the many existing small populations. We examined the effects of demographic and environmental factors on the range-wide success of translocations between 1989 and 1995.

  1. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7798 ● SEP 2016 ● US Army Research Laboratory. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server, by Christian D Schlesiger, Computational and Information Sciences Directorate, ARL.

  2. Creating a Classroom Kaleidoscope with the World Wide Web.

    ERIC Educational Resources Information Center

    Quinlan, Laurie A.

    1997-01-01

    Discusses the elements of classroom Web presentations: planning; construction, including design tips; classroom use; and assessment. Lists 14 World Wide Web resources for K-12 teachers; Internet search tools (directories, search engines and meta-search engines); a Web glossary; and an example of HTML for a simple Web page. (PEN)

  3. The white matter structural network underlying human tool use and tool understanding.

    PubMed

    Bi, Yanchao; Han, Zaizhu; Zhong, Suyu; Ma, Yujun; Gong, Gaolang; Huang, Ruiwang; Song, Luping; Fang, Yuxing; He, Yong; Caramazza, Alfonso

    2015-04-29

    The ability to recognize, create, and use complex tools is a milestone in human evolution. Widely distributed brain regions in parietal, frontal, and temporal cortices have been implicated in using and understanding tools, but the roles of their anatomical connections in supporting tool use and tool conceptual behaviors are unclear. Using deterministic fiber tracking in healthy participants, we first examined how 14 cortical regions that are consistently activated by tool processing are connected by white matter (WM) tracts. The relationship between the integrity of each of the 33 obtained tracts and tool processing deficits across 86 brain-damaged patients was investigated. WM tract integrity was measured with both lesion percentage (structural imaging) and mean fractional anisotropy (FA) values (diffusion imaging). Behavioral abilities were assessed by a tool use task, a range of conceptual tasks, and control tasks. We found that three left hemisphere tracts connecting frontoparietal and intrafrontal areas overlapping with left superior longitudinal fasciculus are crucial for tool use such that larger lesion and lower mean FA values on these tracts were associated with more severe tool use deficits. These tracts and five additional left hemisphere tracts connecting frontal and temporal/parietal regions, mainly overlapping with left superior longitudinal fasciculus, inferior frontooccipital fasciculus, uncinate fasciculus, and anterior thalamic radiation, are crucial for tool concept processing. Largely consistent results were also obtained using voxel-based symptom mapping analyses. Our results revealed the WM structural networks that support the use and conceptual understanding of tools, providing evidence for the anatomical skeleton of the tool knowledge network. Copyright © 2015 the authors 0270-6474/15/356822-14$15.00/0.

  4. Supporting Positive Behaviour in Alberta Schools: A School-Wide Approach

    ERIC Educational Resources Information Center

    Mackenzie, Nancy

    2008-01-01

    Drawing on current research and best practices, this three-part resource, "Supporting Positive Behaviour in Alberta Schools," provides information, strategies, stories from schools and sample tools for systematically teaching, supporting and reinforcing positive behaviour. This integrated system of school-wide, classroom management, and…

  5. Adoption of Lean Principles in a High-Volume Molecular Diagnostic Microbiology Laboratory

    PubMed Central

    Mitchell, P. Shawn; Mandrekar, Jayawant N.

    2014-01-01

    Clinical laboratories are constantly facing challenges to do more with less, enhance quality, improve test turnaround time, and reduce operational expenses. Experience with adopting and applying lean concepts and tools used extensively in the manufacturing industry is described for a high-volume clinical molecular microbiology laboratory, illustrating how operational success and benefits can be achieved. PMID:24829247

  6. Contingency diagrams as teaching tools

    PubMed Central

    Mattaini, Mark A.

    1995-01-01

    Contingency diagrams are particularly effective teaching tools, because they provide a means for students to view the complexities of contingency networks present in natural and laboratory settings while displaying the elementary processes that constitute those networks. This paper sketches recent developments in this visualization technology and illustrates approaches for using contingency diagrams in teaching. PMID:22478208

  7. Time Series Data Visualization in World Wide Telescope

    NASA Astrophysics Data System (ADS)

    Fay, J.

    WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of both interactive desktop tools for immersive visualization and HTML5 web-based controls that can be utilized in customized web pages. WWT supports a range of display options including full dome, power walls, stereo, and virtual reality headsets.

  8. [Human resource capacity building on TB laboratory work for TB control program--through the experience of international TB laboratory training course for TB control at the Research Institute of Tuberculosis, JATA, Japan].

    PubMed

    Fujiki, Akiko; Kato, Seiya

    2008-06-01

    The international training course on TB laboratory work for national tuberculosis programs (NTPs) has been conducted at the Research Institute of Tuberculosis since 1975, funded by the Japan International Cooperation Agency in collaboration with the WHO Western Pacific Regional Office. The aim of the course is to train key personnel in the TB laboratory field for NTPs in resource-limited countries. The course has trained 265 national key personnel in TB laboratory service from 57 resource-limited countries in the last 33 years. The number of participants trained may sound too small in the fight against the large TB problem in resource-limited countries. However, every participant plays an important role as a core and catalyst for the TB control program in his/her own country on returning home. The curriculum covers technical aspects of TB examination, mainly sputum microscopy; in addition, since microscopy services are provided at many centers deployed over a widely spread area, the managerial aspect of maintaining quality TB laboratory work at the field laboratory is another component of the curriculum. Effective teaching methods using materials such as artificial sputum, which is useful for panel slide preparation, and technical manuals with illustrations and pictures of training procedures have been developed through the experience of the course. These manuals are highly appreciated and widely used by front-line TB workers. The course has also contributed to the expansion of the EQA (External Quality Assessment) system for AFB microscopy to improve the quality of the TB laboratory service of NTPs. The course is well known not only for its long history, but also for its unique learning method emphasizing "Participatory Training", particularly in practicum sessions to master the skills of AFB microscopy. The method of learning AFB microscopy developed by the course was published as a training manual by IUATLD, RIT and USAID.

  9. Going Google: Powerful Tools for 21st Century Learning

    ERIC Educational Resources Information Center

    Covili, Jared

    2012-01-01

    Google is more than a search engine--it offers many tools that give people the opportunity to work virtually from anywhere, with anyone, at any time they choose. And these tools are available to teachers and students for free. This book for K-12 educators explores the wide array of Google tools and shows how to use them in the classroom to foster…

  10. In-house access to PACS images and related data through World Wide Web

    NASA Astrophysics Data System (ADS)

    Mascarini, Christian; Ratib, Osman M.; Trayser, Gerhard; Ligier, Yves; Appel, R. D.

    1996-05-01

    The development of a hospital-wide PACS is in progress at the University Hospital of Geneva, and several archive modules have been operational since 1992. This PACS is intended for wide distribution of images to clinical wards. As the PACS project and the number of archived images grew rapidly in the hospital, it became necessary to provide easier, more widely accessible, and more convenient access to the PACS database for the clinicians in the different wards and clinical units of the hospital. An innovative solution has been developed using tools such as the Netscape navigator and the NCSA World Wide Web server as an alternative to conventional database query and retrieval software. These tools present the advantage of providing a user interface that is the same regardless of the platform being used (Mac, Windows, UNIX, ...), and an easy integration of different types of documents (text, images, ...). A strict access control has been added to this interface. It allows user identification and checking of access rights, as defined by the in-house hospital information system, before allowing navigation through patient data records.

  11. Microarrays (DNA Chips) for the Classroom Laboratory

    ERIC Educational Resources Information Center

    Barnard, Betsy; Sussman, Michael; BonDurant, Sandra Splinter; Nienhuis, James; Krysan, Patrick

    2006-01-01

    We have developed and optimized the necessary laboratory materials to make DNA microarray technology accessible to all high school students at a fraction of both cost and data size. The primary component is a DNA chip/array that students "print" by hand and then analyze using research tools that have been adapted for classroom use. The…

  12. Changing Educational Traditions with the Change Laboratory

    ERIC Educational Resources Information Center

    Botha, Louis Royce

    2017-01-01

    This paper outlines the use of a form of research intervention known as the Change Laboratory to illustrate how the processes of organisational change initiated at a secondary school can be applied to develop tools and practices to analyse and potentially re-make educational traditions in a bottom-up manner. In this regard it is shown how a…

  13. Cost of gentamicin assays carried out by microbiology laboratories.

    PubMed Central

    Vacani, P F; Malek, M M; Davey, P G

    1993-01-01

    AIMS--To assess the current range of prices charged for gentamicin assays in United Kingdom laboratories; and to examine the laboratories' likely response to increases or decreases in the demand for the service. METHODS--A postal survey of the 420 members of the Association of Medical Microbiologists was used to establish the range of prices charged for aminoglycoside assays. Additionally, eight private institutions were contacted to determine what the private sector was charging for aminoglycoside assays. Reagent costs in the NHS laboratories were calculated by dividing the total cost of all aminoglycoside assay kits by the number of samples analysed. RESULTS--The NHS and the private institutions both showed a wide price variation. Prices charged to an in-hospital requester for a peak and trough assay ranged from 5.00 pounds to 68.20 pounds (n = 44), and to an external private hospital, under a bulk service contract, from 5.00 pounds to 96.00 pounds (n = 47). Prices charged by private laboratories ranged from 49.00 pounds to 84.00 pounds (n = 8). There was a log-linear correlation in the NHS laboratories between the reagent costs per assay and the number of assays performed per year, and most laboratories thought that their price per assay would be sensitive to increases or decreases in demand. Laboratories which had purchased their assay machines had lower reagent costs per assay but higher repair and maintenance costs. Overall, the number of assays performed and the method of payment for assay machinery accounted for only 44.8% of the observed variation in assay kit costs. CONCLUSIONS--The price range for gentamicin assays in the United Kingdom is wide and is only partially explained by the number of assays performed. Most laboratories believe that they would experience a reduction in unit cost as output increases. The currently offered range of prices is, in part, due to variation in the laboratories' approach to costing the service provided and some laboratories
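The unit-cost calculation described in the methods (total kit cost divided by samples analysed) and the reported log-linear relation between reagent cost and annual assay volume can be sketched as follows. All figures and coefficients here are invented for illustration; they are not from the study.

```python
import math

def reagent_cost_per_assay(total_kit_cost, samples_analysed):
    """Unit reagent cost: total cost of all assay kits / number of samples analysed."""
    return total_kit_cost / samples_analysed

def predicted_unit_cost(annual_volume, a=4.0, b=-0.35):
    """Hypothetical log-linear economy of scale: log(unit cost) falls
    linearly with log(annual volume). Coefficients a, b are illustrative."""
    return math.exp(a + b * math.log(annual_volume))

# A lab spending 6000 pounds on kits for 1200 samples pays 5 pounds per assay.
print(reagent_cost_per_assay(6000.0, 1200))  # → 5.0
```

Under such a log-linear relation, doubling annual volume lowers the predicted unit cost by a constant factor, which is consistent with most laboratories expecting unit costs to fall as output rises.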

  14. [Reducing the use of laboratory animals].

    PubMed

    Claude, Nancy

    2009-11-01

    Since 1959, when Russel and Burch formulated the 3Rs principle (Reduce, Replace, Refine), the scientific community has been attempting to reduce the use of laboratory animals for research purposes. Current regulatory guidelines take this principle into account. Thanks to scientific and technical progress, and advances in bioinformatics, new tools are now available that reduce the need for laboratory animals, albeit without totally replacing them. Implementation of the International Conference on Harmonization recommendations in 1990 represented a major step forward, notably by helping to avoid duplication of studies using laboratory animals. The use of animals for cosmetics testing is now forbidden in the European Union. Although new in vitro and in silico models remain to be validated, they are proving particularly useful during the early stages of product development, by avoiding experimental studies of chemicals that are ineffective or excessively toxic. The success of these measures is reflected in the results of a European study showing a fall, between 1996 and 2005, in the number of laboratory animals used for research and development, despite a large increase in overall research activities. The challenge for the next decade is to amplify this trend.

  15. Laboratory for Atmospheres 2008 Technical Highlights

    NASA Technical Reports Server (NTRS)

    Cote, Charles E.

    2009-01-01

    out in collaboration with other laboratories and research groups within the Earth Sciences Division, across the Sciences and Exploration Directorate, and with partners in universities and other Government agencies. The Laboratory for Atmospheres is a vital participant in NASA's research agenda. Our Laboratory often has relatively large programs, sizable satellite missions, and observational campaigns that require the cooperative and collaborative efforts of many scientists. We ensure an appropriate balance between our scientists' responsibility for these large collaborative projects and their need for an active individual research agenda. This balance allows members of the Laboratory to continuously improve their scientific credentials. Members of the Laboratory interact with the general public to support a wide range of interests in the atmospheric sciences. Among other activities, the Laboratory raises the public's awareness of atmospheric science by presenting public lectures and demonstrations, by making scientific data available to wide audiences, by teaching, and by mentoring students and teachers. The Laboratory makes substantial efforts to attract new scientists to the various areas of atmospheric research. We strongly encourage the establishment of partnerships with Federal and state agencies that have operational responsibilities to promote the societal application of our science products. This report describes our role in NASA's mission, gives a broad description of our research, and summarizes our scientists' major accomplishments during calendar year 2008. The report also contains useful information on human resources, scientific interactions, and outreach activities.

  16. Adding tools to the open source toolbox: The Internet

    NASA Technical Reports Server (NTRS)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  17. UV-Vis Spectrophotometric Analysis and Quantification of Glyphosate for an Interdisciplinary Undergraduate Laboratory

    ERIC Educational Resources Information Center

    Felton, Daniel E.; Ederer, Martina; Steffens, Timothy; Hartzell, Patricia L.; Waynant, Kristopher V.

    2018-01-01

    Glyphosate (N-(phosphonomethyl)glycine) is the most widely used herbicide on earth. A simple assay to quantify glyphosate concentrations in environmental samples was developed as part of an interdisciplinary effort linking introductory laboratory courses in chemistry, biology, and microbiology. In this 3 h laboratory experiment, students used…

  18. Supplement Analysis for the Site-Wide Environmental Impact Statement for Continued Operation of Los Alamos National Laboratory -- Recovery and Storage of Strontium-90 Fueled Radioisotope Thermal Electric Generators at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N /A

    2004-01-22

    This Supplement Analysis (SA) has been prepared to determine if the Site-Wide Environmental Impact Statement for Continued Operations of Los Alamos National Laboratory (SWEIS) (DOE/EIS-0238) adequately addresses the environmental effects of recovery and storage for disposal of six strontium-90 (Sr-90) fueled radioisotope thermal electric generators (RTGs) at the Los Alamos National Laboratory (LANL) Technical Area (TA)-54, Area G, or if the SWEIS needs to be supplemented. DOE's National Nuclear Security Administration (NNSA) proposed to recover and store six Sr-90 RTGs from the commercial sector as part of its Offsite-Source Recovery Project (OSRP). The OSRP focuses on the proactive recovery and storage of unwanted radioactive sealed sources exceeding the US Nuclear Regulatory Commission (NRC) limits for Class C low-level waste (also known as Greater than Class C waste, or GTCC). In response to the events of September 11, 2001, NRC conducted a risk-based evaluation of potential vulnerabilities to terrorist threats involving NRC-licensed nuclear facilities and materials. NRC's evaluation concluded that possession of unwanted radioactive sealed sources with no disposal outlet presents a potential vulnerability (NRC 2002). In a November 25, 2003 letter to the manager of NNSA's Los Alamos Site Office, the NRC Office of Nuclear Security and Incident Response identified recovery of several Sr-90 RTGs as the highest priority and requested that DOE take whatever actions necessary to recover these sources as soon as possible. This SA specifically compares key impact assessment parameters of this proposal to the offsite source recovery program evaluated in the SWEIS and a subsequent SA that evaluated a change to the approach of a portion of the recovery program. It also provides an explanation of any differences between the Proposed Action and activities described in the previous SWEIS and SA analyses.

  19. Public health laboratory workforce outreach in Hawai'i: CLIA-focused student internship pilot program at the state laboratories.

    PubMed

    Whelen, A Christian; Kitagawa, Kent

    2013-01-01

    Chronically understaffed public health laboratories depend on a decreasing number of employees who must assume broader responsibilities in order to sustain essential functions for the many clients the laboratories support. Prospective scientists considering a career in public health are often not aware of the requirements associated with working in a laboratory regulated by the Clinical Laboratory Improvement Amendments (CLIA). The purpose of this pilot internship was twofold: to introduce students to operations in a regulated laboratory early enough in their academic careers that they could make good career decisions, and to evaluate internship methodology as one possible solution to workforce shortages. Four interns were recruited from three different local universities, and each was paired with an experienced State Laboratories Division (SLD) staff mentor. Students performed tasks that demonstrated the importance of CLIA regulations for 10-15 hours per week over a 14-week period. Students also attended several directed group sessions on regulatory lab practice and quality systems. Both interns and mentors were surveyed periodically during the semester. Surveys of mentors and interns indicated overall positive experiences. One-on-one pairing of experienced public health professionals and students seems to be a mutually beneficial arrangement. Interns reported that they would participate even if the internship were lower paid, unpaid, or for credit only. The internship appeared to be an effective tool to expose students to employment in CLIA-regulated laboratories, and potentially help address public health laboratory staffing shortfalls. Longer term follow-up with multiple classes of interns may provide a more informed assessment.

  20. genipe: an automated genome-wide imputation pipeline with automatic reporting and statistical tools.

    PubMed

    Lemieux Perreault, Louis-Philippe; Legault, Marc-André; Asselin, Géraldine; Dubé, Marie-Pierre

    2016-12-01

    Genotype imputation is now commonly performed following genome-wide genotyping experiments. Imputation increases the density of analyzed genotypes in the dataset, enabling fine-mapping across the genome. However, the process of imputation using the most recent publicly available reference datasets can require considerable computation power and the management of hundreds of large intermediate files. We have developed genipe, a complete genome-wide imputation pipeline which includes automatic reporting, imputed data indexing and management, and a suite of statistical tests for imputed data commonly used in genetic epidemiology (Sequence Kernel Association Test, Cox proportional hazards for survival analysis, and linear mixed models for repeated measurements in longitudinal studies). The genipe package is an open source Python software and is freely available for non-commercial use (CC BY-NC 4.0) at https://github.com/pgxcentre/genipe. Documentation and tutorials are available at http://pgxcentre.github.io/genipe. CONTACT: louis-philippe.lemieux.perreault@statgen.org or marie-pierre.dube@statgen.org. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  1. Implementation of 5S Method for Ergonomic Laboratory

    NASA Astrophysics Data System (ADS)

    Dila Sari, Amarria; Ilma Rahmillah, Fety; Prabowo Aji, Bagus

    2017-06-01

    This article discusses 5S implementation in the Work System Design and Ergonomics Laboratory, Department of Industrial Engineering, Islamic University of Indonesia. The laboratory has several problems with equipment arrangement for student activities, such as files accumulated from previous years' practicums and time wasted because items are not stored where they belong. This study therefore applies the 5S method in the DSK&E laboratory to facilitate work processes and reduce waste. The project is performed by laboratory management using 5S methods in the spirit of continuous improvement (kaizen), and strategies and suggestions are proposed for implementing the 5S system within the laboratory. As a result, tidiness and cleanliness are achieved, leading to better performance by laboratory users. The assessment score before implementing 5S in the DSK&E laboratory was 64 (2.56), while the score after implementation was 32 (1.28), an improvement of 50%. This has implications for better use of the laboratory area, time saved when looking for tools and materials thanks to fixed locations and good visual control, and an improved '5S' culture and spirit among staff regarding a better working environment.

  2. A senior manufacturing laboratory for determining injection molding process capability

    NASA Technical Reports Server (NTRS)

    Wickman, Jerry L.; Plocinski, David

    1992-01-01

    The following is a laboratory experiment designed to further understanding of materials science. This subject material is directed at an upper level undergraduate/graduate student in an Engineering or Engineering Technology program. It is assumed that the student has a thorough understanding of the process and quality control. The format of this laboratory does not follow that which is normally recommended because of the nature of process capability and that of the injection molding equipment and tooling. This laboratory is instead developed to be used as a point of departure for determining process capability for any process in either a quality control laboratory or a manufacturing environment where control charts, process capability, and experimental or product design are considered important topics.

  3. Genetics and molecular biology in laboratory medicine, 1963-2013.

    PubMed

    Whitfield, John B

    2013-01-01

    The past 50 years have seen many changes in laboratory medicine, either as causes or consequences of increases in productivity and expansion of the range of information which can be provided. The drivers and facilitators of change in relation to clinical applications of molecular biology included the need for diagnostic tools for genetic diseases and technical advances such as PCR and sequencing. However, molecular biology techniques have proved to have far wider applications, from detection of infectious agents to molecular characterization of tumors. Journals such as Clinical Chemistry and Laboratory Medicine play an important role in communication of these advances to the laboratory medicine community and in publishing evaluations of their practical value.

  4. Explosively driven two-shockwave tools with applications

    NASA Astrophysics Data System (ADS)

    Buttler, W. T.; Oró, D. M.; Mariam, F. G.; Saunders, A.; Andrews, M. J.; Cherne, F. J.; Hammerberg, J. E.; Hixson, R. S.; Monfared, S. K.; Morris, C.; Olson, R. T.; Preston, D. L.; Stone, J. B.; Terrones, G.; Tupa, D.; Vogan-McNeil, W.

    2014-05-01

    We present the development of an explosively driven physics tool to generate two mostly uniaxial shockwaves. The tool is being used to extend single-shockwave ejecta models to account for a second shockwave a few microseconds later. We explore techniques to vary the amplitude of both the first and second shockwaves, and we apply the tool experimentally at the Los Alamos National Laboratory Proton Radiography (pRad) facility. The tools have been applied to Sn with perturbations of wavelength λ = 550 μm and various amplitudes that give wavenumber-amplitude products kh in {3/4, 1/2, 1/4, 1/8}, where h is the perturbation amplitude and k = 2π/λ is the wavenumber. The pRad data suggest the development of a second-shock ejecta model based on unstable Richtmyer-Meshkov physics.
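The perturbation amplitudes implied by the quoted kh products follow directly from the definitions in the abstract (k = 2π/λ, so h = kh/k); a quick check for λ = 550 μm:

```python
import math

wavelength_um = 550.0                 # perturbation wavelength λ (μm)
k = 2 * math.pi / wavelength_um       # wavenumber k = 2π/λ (1/μm)

# Amplitudes h implied by the quoted wavenumber-amplitude products kh
for kh in (3/4, 1/2, 1/4, 1/8):
    h = kh / k                        # amplitude in μm
    print(f"kh = {kh:5.3f} -> h = {h:6.2f} um")
```

For kh = 3/4 this gives h ≈ 65.7 μm, i.e. the machined perturbations span amplitudes from roughly 11 to 66 μm.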

  5. Effect of Virtual Analytical Chemistry Laboratory on Enhancing Student Research Skills and Practices

    ERIC Educational Resources Information Center

    Bortnik, Boris; Stozhko, Natalia; Pervukhina, Irina; Tchernysheva, Albina; Belysheva, Galina

    2017-01-01

    This article aims to determine the effect of a virtual chemistry laboratory on university student achievement. The article describes a model of a laboratory course that includes a virtual component. This virtual component is viewed as a tool of student pre-lab autonomous learning. It presents electronic resources designed for a virtual laboratory…

  6. ASFinder: a tool for genome-wide identification of alternatively splicing transcripts from EST-derived sequences.

    PubMed

    Min, Xiang Jia

    2013-01-01

    Expressed Sequence Tags (ESTs) are a rich resource for identifying Alternatively Splicing (AS) genes. The ASFinder webserver is designed to identify AS isoforms from EST-derived sequences. Two approaches are implemented in ASFinder. If no genomic sequences are provided, the server performs a local BLASTN to identify AS isoforms from ESTs having both ends aligned but an internal segment unaligned. Otherwise, ASFinder uses SIM4 to map ESTs to the genome, then the overlapping ESTs that are mapped to the same genomic locus and have internal variable exon/intron boundaries are identified as AS isoforms. The tool is available at http://proteomics.ysu.edu/tools/ASFinder.html.
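The first ASFinder approach, flagging EST pairs in which both ends align but an internal segment does not, can be sketched in simplified form. This is an illustrative reimplementation of the stated rule, not ASFinder's actual code; the thresholds `min_gap` and `end_tol` are hypothetical parameters.

```python
def has_internal_unaligned_segment(est_len, aligned_blocks, min_gap=30, end_tol=10):
    """Flag an EST as an AS candidate when both of its ends align to the
    other sequence but an internal stretch does not. aligned_blocks is a
    list of (start, end) query coordinates of aligned segments."""
    blocks = sorted(aligned_blocks)
    if not blocks:
        return False
    # Both ends of the EST must be covered by the alignment.
    if blocks[0][0] > end_tol or blocks[-1][1] < est_len - end_tol:
        return False
    # An AS candidate has a sufficiently large internal unaligned gap.
    return any(nxt[0] - cur[1] >= min_gap
               for cur, nxt in zip(blocks, blocks[1:]))

# A 500 bp EST aligning over 0-200 and 320-500 leaves a 120 bp internal
# unaligned segment, suggestive of a skipped exon or retained intron.
print(has_internal_unaligned_segment(500, [(0, 200), (320, 500)]))  # → True
```

The genome-guided second approach replaces this pairwise test with SIM4 exon/intron boundaries, but the underlying idea of comparing overlapping alignments at the same locus is the same.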

  7. The Alcohol Dehydrogenase Kinetics Laboratory: Enhanced Data Analysis and Student-Designed Mini-Projects

    ERIC Educational Resources Information Center

    Silverstein, Todd P.

    2016-01-01

    A highly instructive, wide-ranging laboratory project in which students study the effects of various parameters on the enzymatic activity of alcohol dehydrogenase has been adapted for the upper-division biochemistry and physical biochemistry laboratory. Our two main goals were to provide enhanced data analysis, featuring nonlinear regression, and…

  8. Computational Tools for Metabolic Engineering

    PubMed Central

    Copeland, Wilbert B.; Bartley, Bryan A.; Chandran, Deepak; Galdzicki, Michal; Kim, Kyung H.; Sleight, Sean C.; Maranas, Costas D.; Sauro, Herbert M.

    2012-01-01

    A great variety of software applications are now employed in the metabolic engineering field. These applications have been created to support a wide range of experimental and analysis techniques. Computational tools are utilized throughout the metabolic engineering workflow to extract and interpret relevant information from large data sets, to present complex models in a more manageable form, and to propose efficient network design strategies. In this review, we present a number of tools that can assist in modifying and understanding cellular metabolic networks. The review covers seven areas of relevance to metabolic engineers. These include metabolic reconstruction efforts, network visualization, nucleic acid and protein engineering, metabolic flux analysis, pathway prospecting, post-structural network analysis and culture optimization. The list of available tools is extensive and we can only highlight a small, representative portion of the tools from each area. PMID:22629572

  9. The Evolving Role of Field and Laboratory Seismic Measurements in Geotechnical Engineering

    NASA Astrophysics Data System (ADS)

    Stokoe, K. H.

    2017-12-01

    Geotechnical engineering has faced the problem of characterizing geological materials for site-specific design in the built environment since the profession began. When one of the design requirements included determining the dynamic response of important and critical facilities to earthquake shaking or other types of dynamic loads, seismically-based measurements in the field and laboratory became important tools for direct characterization of the stiffnesses and energy dissipation (material damping) of these materials. In the 1960s, field seismic measurements using small-strain body waves were adapted from exploration geophysics. At the same time, laboratory measurements began using dynamic, torsional, resonant-column devices to measure shear stiffness and material damping in shear. The laboratory measurements also allowed parameters such as material type, confinement state, and nonlinear straining to be evaluated. Today, seismic measurements are widely used and evolving because: (1) the measurements have a strong theoretical basis, (2) they can be performed in the field and laboratory, thus forming an important link between these measurements, and (3) recent field-testing developments involving surface waves are noninvasive, which makes them cost effective in comparison to other methods. Active field seismic measurements are used today over depths ranging from about 5 to 1000 m. Examples of shear-wave velocity (VS) profiles evaluated using boreholes, penetrometers, suspension logging, and Rayleigh-type surface waves are presented. The VS measurements were performed in materials ranging from uncemented soil to unweathered rock. The coefficients of variation (COVs) in the VS profiles are generally less than 0.15 over sites with surface areas of 50 km2 or more as long as material types are not laterally mixed. Interestingly, the largest COVs often occur around layer boundaries which vary vertically.
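The coefficient of variation used above to summarize VS scatter is the sample standard deviation divided by the mean. A minimal sketch, with invented shear-wave velocities standing in for repeated soundings at one depth:

```python
from statistics import mean, stdev

def coefficient_of_variation(vs_values):
    """COV = sample standard deviation / mean, as applied to VS profiles."""
    return stdev(vs_values) / mean(vs_values)

# Illustrative shear-wave velocities (m/s) at a single depth interval
vs = [210.0, 195.0, 220.0, 205.0, 215.0]
print(round(coefficient_of_variation(vs), 3))  # → 0.046
```

A COV of about 0.05, as here, is comfortably below the 0.15 bound the abstract reports for laterally uniform sites.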

  10. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, Marc; Bush, Brian; Penev, Michael

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  11. The World Wide Web--a new tool for biomedical engineering education.

    PubMed

    Blanchard, S M

    1997-01-01

    An ever-increasing variety of materials (text, images, videos, and sound) are available through the World Wide Web (WWW). While textbooks, which are often outdated by the time they are published, are usually limited to black and white text and images, many supplemental materials can be found on the WWW. The WWW also provides many resources for student projects. In BAE 465: Biomedical Engineering Applications, student teams developed WWW-based term projects on biomedical topics, e.g. biomaterials, MRI, and medical ultrasound. After the projects were completed and edited by the instructor, they were placed on-line for world-wide access if permission for this had been granted by the student authors. Projects from three classes have been used to form the basis for an electronic textbook which is available at http://www.eos.ncsu.edu/bae/research/blanchard/www/465/textbook/. This electronic textbook also includes instructional objectives and sample tests for specific topic areas. Student projects have been linked to the appropriate topic areas within the electronic textbook. Links to relevant sites have been included within the electronic textbook as well as within the individual projects. Students were required to link to images and other materials they wanted to include in their project in order to avoid copyright issues. The drawback to this approach to copyright protection is that addresses can change, making links unavailable. In BAE 465 and in BAE 235: Engineering Biology, the WWW has also been used to distribute instructional objectives, the syllabi and class policies, homework problems, and abbreviated lecture notes. This has made maintaining course-related material easier and has reduced the amount of paper used by both the students and the instructor. Goals for the electronic textbook include the addition of instructional simulation programs that can be run from remote sites. In the future, biomedical engineering may be taught in a virtual classroom with

  12. [Patient satisfaction in a laboratory test collection unit].

    PubMed

    de Moura, Gisela Maria Schebella Souto; Hilleshein, Eunice Fabiani; Schardosim, Juliana Machado; Delgado, Kátia Simone

    2008-06-01

    This exploratory descriptive study aimed at identifying customer satisfaction attributes in the field of laboratory tests. Data were collected in 2006 through 104 interviews in a laboratory unit inside a teaching hospital, using the critical incident technique, and submitted to content analysis. Three attribute categories were identified: time spent waiting for care, interpersonal contact, and technical skills. These results inform the assessment of the current satisfaction survey tool and point to its reformulation. They also allow the identification of improvement needs in customer attention and provide elements to be taken into account in personnel selection, training programs, and personnel performance assessment.

  13. Dissemination of watershed management information through the World Wide Web

    Treesearch

    Malchus B. Baker; Deborah J. Young

    2000-01-01

    Information and related literature on watershed management practices is sometimes not widely known nor readily accessible. New electronic technologies provide unique tools for disseminating research findings to scientists, educators, land management professionals, and the public. This paper illustrates how the usefulness and accessibility of research information from...

  14. Radar Images of the Earth and the World Wide Web

    NASA Technical Reports Server (NTRS)

    Chapman, B.; Freeman, A.

    1995-01-01

    A perspective of NASA's Jet Propulsion Laboratory as a center of planetary exploration, and its involvement in studying the earth from space is given. Remote sensing, radar maps, land topography, snow cover properties, vegetation type, biomass content, moisture levels, and ocean data are items discussed related to earth orbiting satellite imaging radar. World Wide Web viewing of this content is discussed.

  15. Biomedical engineering at Sandia National Laboratories

    NASA Astrophysics Data System (ADS)

    Zanner, Mary Ann

    1994-12-01

    The potential exists to reduce or control some aspects of the U.S. health care expenditure without compromising health care delivery by developing carefully selected technologies which impact favorably on the health care system. A focused effort to develop such technologies is underway at Sandia National Laboratories. As a DOE National Laboratory, Sandia possesses a wealth of engineering and scientific expertise that can be readily applied to this critical national need. Appropriate mechanisms currently exist to allow transfer of technology from the laboratory to the private sector. Sandia's Biomedical Engineering Initiative addresses the development of properly evaluated, cost-effective medical technologies through team collaborations with the medical community. Technology development is subjected to certain criteria including wide applicability, earlier diagnoses, increased efficiency, cost-effectiveness and dual-use. Examples of Sandia's medical technologies include a noninvasive blood glucose sensor, computer aided mammographic screening, noninvasive fetal oximetry and blood gas measurement, burn diagnostics and laser debridement, telerobotics and ultrasonic scanning for prosthetic devices. Sandia National Laboratories has the potential to aid in directing medical technology development efforts which emphasize health care needs, earlier diagnosis, cost containment and improvement of the quality of life.

  16. Near-net-shape manufacturing: Spray-formed metal matrix composites and tooling

    NASA Technical Reports Server (NTRS)

    Mchugh, Kevin M.

    1994-01-01

    Spray forming is a materials processing technology in which a bulk liquid metal is converted to a spray of fine droplets and deposited onto a substrate or pattern to form a near-net-shape solid. The technology offers unique opportunities for simplifying materials processing without sacrificing, and oftentimes substantially improving, product quality. Spray forming can be performed with a wide range of metals and nonmetals, and offers property improvements resulting from rapid solidification (e.g. refined microstructures, extended solid solubilities and reduced segregation). Economic benefits result from process simplification and the elimination of unit operations. The Idaho National Engineering Laboratory is developing a unique spray-forming method, the Controlled Aspiration Process (CAP), to produce near-net-shape solids and coatings of metals, polymers, and composite materials. Results from two spray-forming programs illustrate the accompanying technical and economic benefits. These programs involved spray forming aluminum strip reinforced with SiC particulate, and the production of tooling, such as injection molds and dies, using low-melting-point metals.

  17. Cost analysis in the toxicology laboratory.

    PubMed

    Travers, E M

    1990-09-01

    The process of determining laboratory sectional and departmental costs and test costs for instrument-generated and manually generated reportable results for toxicology laboratories has been outlined in this article. It is hoped that the basic principles outlined in the preceding text will clarify and elucidate one of the most important areas needed for laboratory fiscal integrity and its survival in these difficult times for health care providers. The following general principles derived from this article are helpful aids for managers of toxicology laboratories. 1. To manage a cost-effective, efficient toxicology laboratory, several factors must be considered: the laboratory's instrument configuration, test turnaround time needs, the test menu offered, the analytic methods used, the cost of labor based on time expended and the experience and educational level of the staff, and logistics that determine specimen delivery time and costs. 2. There is a wide variation in costs for toxicologic methods, which requires that an analysis of capital (equipment) purchase and operational (test performance) costs be performed to avoid waste, purchase wisely, and determine which tests consume the majority of the laboratory's resources. 3. Toxicologic analysis is composed of many complex steps. Each step must be individually cost-accounted. Screening test results must be confirmed, and the cost for both steps must be included in the cost per reportable result. 4. Total costs will vary in the same laboratory and between laboratories based on differences in salaries paid to technical staff, differences in reagent/supply costs, the number of technical staff needed to operate the analyzer or perform the method, and the inefficient use of highly paid staff to operate the analyzer or perform the method. 5. Since direct test costs vary directly with the type and number of analyzers or methods and are dependent on the operational mode designed by the manufacturer, laboratory managers
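Principle 3 above, that a screening result requiring confirmation carries both costs in the cost per reportable result, is simple arithmetic and can be sketched as follows. The function name, figures, and the `confirm_rate` parameter are all hypothetical, for illustration only.

```python
def cost_per_reportable_result(screen_cost, confirm_cost, confirm_rate=1.0):
    """Cost per reportable result when a fraction confirm_rate of
    screening tests must also be confirmed (both steps are charged)."""
    return screen_cost + confirm_rate * confirm_cost

# A $4.50 screen with a $22.00 confirmation applied to 25% of specimens
print(cost_per_reportable_result(4.50, 22.00, confirm_rate=0.25))  # → 10.0
```

Folding the confirmation step into the unit cost this way keeps the screening price from understating what the laboratory actually spends per reported result.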

  18. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2012-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  19. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2011-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  20. The laboratory astrophysics facility at University College

    NASA Astrophysics Data System (ADS)

    Hyland, A. R.; Smith, R. G.; Robinson, G.

    A laboratory astrophysics facility for the study of the terrestrial analogues of interstellar dust grains is being developed in the Physics Department, University College, Australian Defence Force Academy. The facility consists of a gas handling system for the preparation of samples, a closed-cycle cooler and specimen chamber, and a Fourier Transform Infrared (FTIR) Spectrometer capable of high-resolution (0.3 cm⁻¹) and high-sensitivity measurements, currently from 1 to 25 microns. The layout and construction of the laboratory are described, and the proposed initial experimental program, aimed at determining the optical constants of ices over a wide wavelength range for comparison with astronomical observations, is discussed.

  1. Improved reliability of serological tools for the diagnosis of West Nile fever in horses within Europe

    PubMed Central

    Lowenski, Steeve; Durand, Benoit; Bahuon, Céline; Zientara, Stéphan; Lecollinet, Sylvie

    2017-01-01

    West Nile Fever is a zoonotic disease caused by a mosquito-borne flavivirus, WNV. Because of its clinical sensitivity to the disease, the horse is a useful sentinel of infection. Because of the virus’ low-level, short-term viraemia in horses, the primary tools used to diagnose WNV are serological tests. Inter-laboratory proficiency tests (ILPTs) were held in 2010 and 2013 to evaluate WNV serological diagnostic tools suited to the European network of National Reference Laboratories (NRLs) for equine diseases. These ILPTs were designed to evaluate the performance of the laboratories and methods in detecting WNV infection in horses through serology. The detection of WNV immunoglobulin G (IgG) antibodies by ELISA is widely used in Europe, with 17 NRLs in 2010 and 20 NRLs in 2013 using IgG WNV assays. Thanks to the development of new commercial IgM capture kits, WNV IgM capture ELISAs were rapidly implemented in NRLs between 2010 (4 NRLs) and 2013 (13 NRLs). The use of kits allowed the quick standardisation of WNV IgG and IgM detection assays in NRLs, with more than 95% (20/21) and 100% (13/13) of satisfactory results, respectively, in 2013. Conversely, virus neutralisation tests (VNTs) were implemented in 33% (7/21) of NRLs in 2013, and their low sensitivity was evidenced in 29% (2/7) of NRLs during this ILPT. A comparison of serological diagnostic methods highlighted the higher sensitivity of IgG ELISAs compared to WNV VNTs. It also revealed that the low specificity of IgG ELISA kits means they can detect animals infected with other flaviviruses. In contrast, VNT and IgM ELISA assays were highly specific and did not detect antibodies against related flaviviruses. These results argue in favour of the development of new, specific serological diagnostic assays that could be easily transferred to partner laboratories. PMID:28915240

  2. Tool use disorders after left brain damage.

    PubMed

    Baumard, Josselin; Osiurak, François; Lesourd, Mathieu; Le Gall, Didier

    2014-01-01

    In this paper we review studies that investigated tool use disorders in left-brain damaged (LBD) patients over the last 30 years. Four tasks are classically used in the field of apraxia: Pantomime of tool use, single tool use, real tool use and mechanical problem solving. Our aim was to address two issues, namely, (1) the role of mechanical knowledge in real tool use and (2) the cognitive mechanisms underlying pantomime of tool use, a task widely employed by clinicians and researchers. To do so, we extracted data from 36 papers and computed the difference between healthy subjects and LBD patients. On the whole, pantomime of tool use is the most difficult task and real tool use is the easiest one. Moreover, associations seem to appear between pantomime of tool use, real tool use and mechanical problem solving. These results suggest that the loss of mechanical knowledge is critical in LBD patients, even if all of those tasks (and particularly pantomime of tool use) might put differential demands on semantic memory and working memory.

  3. Tool use disorders after left brain damage

    PubMed Central

    Baumard, Josselin; Osiurak, François; Lesourd, Mathieu; Le Gall, Didier

    2014-01-01

    In this paper we review studies that investigated tool use disorders in left-brain damaged (LBD) patients over the last 30 years. Four tasks are classically used in the field of apraxia: Pantomime of tool use, single tool use, real tool use and mechanical problem solving. Our aim was to address two issues, namely, (1) the role of mechanical knowledge in real tool use and (2) the cognitive mechanisms underlying pantomime of tool use, a task widely employed by clinicians and researchers. To do so, we extracted data from 36 papers and computed the difference between healthy subjects and LBD patients. On the whole, pantomime of tool use is the most difficult task and real tool use is the easiest one. Moreover, associations seem to appear between pantomime of tool use, real tool use and mechanical problem solving. These results suggest that the loss of mechanical knowledge is critical in LBD patients, even if all of those tasks (and particularly pantomime of tool use) might put differential demands on semantic memory and working memory. PMID:24904487

  4. Managing Science: Management for R&D Laboratories

    NASA Astrophysics Data System (ADS)

    Gelès, Claude; Lindecker, Gilles; Month, Mel; Roche, Christian

    1999-10-01

    A unique "how-to" manual for the management of scientific laboratories. This book presents a complete set of tools for the management of research and development laboratories and projects. With an emphasis on knowledge rather than profit as a measure of output and performance, the authors apply standard management principles and techniques to the needs of high-flux, open-ended, separately funded science and technology enterprises. They also propose the novel idea that failure, and incipient failure, is an important measure of an organization's potential. From the management of complex, round-the-clock, high-tech operations to strategies for long-term planning, Managing Science: Management for R&D Laboratories discusses how to build projects with the proper research and development, obtain and account for funding, and deal with rapidly changing technologies, facilities, and trends. The entire second part of the book is devoted to personnel issues and the impact of workplace behavior on the various functions of a knowledge-based organization. Drawing on four decades of involvement with the management of scientific laboratories, the authors thoroughly illustrate their philosophy with real-world examples from the physics field and provide tables and charts. Managers of scientific laboratories as well as scientists and engineers expecting to move into management will find Managing Science: Management for R&D Laboratories an invaluable practical guide.

  5. A tool for NDVI time series extraction from wide-swath remotely sensed images

    NASA Astrophysics Data System (ADS)

    Li, Zhishan; Shi, Runhe; Zhou, Cong

    2015-09-01

    Normalized Difference Vegetation Index (NDVI) is one of the most widely used indicators for monitoring vegetation coverage on the land surface. The time-series features of NDVI are capable of reflecting dynamic changes in various ecosystems. Calculating NDVI from Moderate Resolution Imaging Spectroradiometer (MODIS) and other wide-swath remotely sensed images provides an important way to monitor the spatial and temporal characteristics of large-scale NDVI. However, ecologists still find it difficult to extract such information correctly and efficiently because the original remote sensing images require several specialized processing steps, including radiometric calibration, geometric correction, multi-date composition and curve smoothing. In this study, we developed an efficient and convenient online toolbox with a friendly graphical user interface for non-remote-sensing professionals who want to extract NDVI time series. Technically, it is built on Java Web and Web GIS, with the Struts, Spring and Hibernate (SSH) frameworks integrated into the system for easy maintenance and expansion. Latitude, longitude and time period are the key inputs that users need to provide, and the NDVI time series are calculated automatically.
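The index itself, and the multi-date composition step mentioned above, can be sketched briefly. NDVI = (NIR − Red) / (NIR + Red), and one common compositing scheme is the maximum-value composite (MVC), which keeps the highest NDVI in each window to suppress cloud-contaminated observations. The reflectance values below are invented for illustration, not real MODIS data.

```python
# NDVI from red/near-infrared reflectance, plus a maximum-value
# composite (MVC) over a multi-date window.

def ndvi(red, nir):
    """Normalized Difference Vegetation Index for one observation."""
    if red + nir == 0:
        return 0.0
    return (nir - red) / (nir + red)

def max_value_composite(series):
    """Pick the highest NDVI in a compositing window (e.g. 16 days)."""
    return max(ndvi(r, n) for r, n in series)

# Three observations of one pixel in one window: (red, nir);
# the middle observation mimics a cloudy (low-NDVI) acquisition.
window = [(0.08, 0.40), (0.20, 0.25), (0.05, 0.45)]
print(round(max_value_composite(window), 3))
```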

  6. The Impact on Education of the World Wide Web.

    ERIC Educational Resources Information Center

    Hobbs, D. J.; Taylor, R. J.

    This paper describes a project which created a set of World Wide Web (WWW) pages documenting the state of the art in educational multimedia design; a prototype WWW-based multimedia teaching tool--a podiatry test using HTML forms, 24-bit color images and MPEG video--was also designed, developed, and evaluated. The project was conducted between…

  7. Prevention of infections in an ART laboratory: a reflection on simplistic methods.

    PubMed

    Huyser, C

    2014-01-01

    Preventative measures combined with reactive remedial actions are generic management tools to optimize and protect an entity's core businesses. Differences between assisted reproduction technology (ART) laboratories in developing versus developed countries include restricted access to, or availability of resources, and the prevalence of pathological conditions that are endemic or common in non-industrialized regions. The aim of this paper is to discuss the prevention of infections in an ART laboratory in a low to middle-income country, with reference to simplistic risk reduction applications to avoid the introduction and transmission of pathogens. Diagnostic and procedural phases will be examined, i.e. (i) screening for microbes during patient evaluation, and (ii-iii) prevention of environmental and procedural contamination. Preventative action is enabled by knowledge of threats and the degree of risk involved. Awareness and understanding of the vulnerabilities in an ART system, wherein laboratory personnel operate, are invaluable assets when unforeseen equipment failure occurs or instant decisions have to be made to safeguard procedures. An inter-connective team approach to patient treatment, biosafety training and utilization of practical procedures such as semen decontamination, are fundamental tools in a laboratory's risk-reduction armoury to prevent and eliminate infectious elements.

  8. CAD tools for detector design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Womersley, J.; DiGiacomo, N.; Killian, K.

    1990-04-01

    Detailed detector design has traditionally been divided between engineering optimization for structural integrity and subsequent physicist evaluation. The availability of CAD systems for engineering design enables the tasks to be integrated by providing tools for particle simulation within the CAD system. We believe this will speed up detector design and avoid problems due to the late discovery of shortcomings in the detector. This could occur because of the slowness of traditional verification techniques (such as detailed simulation with GEANT). One such new particle simulation tool is described. It is being used with the I-DEAS CAD package for SSC detector design at Martin-Marietta Astronautics and is to be released through the SSC Laboratory.

  9. Cooperative problem solving with personal mobile information tools in hospitals.

    PubMed

    Buchauer, A; Werner, R; Haux, R

    1998-01-01

    Health-care professionals have a broad range of needs for information and cooperation while working at different points of care (e.g., outpatient departments, wards, and functional units such as operating theaters). Patient-related data and medical knowledge have to be widely available to support high-quality patient care. Furthermore, due to the increased specialization of health-care professionals, efficient collaboration is required. Personal mobile information tools have considerable potential to realize almost ubiquitous information and collaborative support. They make it possible to unite the functionality of conventional tools such as paper forms, dictating machines, and pagers in a single device. Moreover, they can extend the support already provided by clinical workstations. An approach is described for the integration of mobile information tools with heterogeneous hospital information systems. This approach includes identification of the functions that should be provided on mobile tools. Major functions are the presentation of medical records and reports, electronic mailing to support interpersonal communication, and the provision of editors for structured clinical documentation. To realize these functions on mobile tools, we propose a document-based client-server architecture that enables mobile information tools to interoperate with existing computer-based application systems. Open application systems and powerful, partially wireless, hospital-wide networks are the prerequisites for the introduction of mobile information tools.

  10. Optimized molecular resolution of cross-contamination alerts in clinical mycobacteriology laboratories.

    PubMed

    Martín, Ana; Herranz, Marta; Lirola, Miguel Martínez; Fernández, Rosa Fernández; Bouza, Emilio; García de Viedma, Darío

    2008-02-14

    The phenomenon of misdiagnosing tuberculosis (TB) through laboratory cross-contamination when culturing Mycobacterium tuberculosis (MTB) has been widely reported, and it has an obvious clinical, therapeutic and social impact. The final confirmation of a cross-contamination event requires the molecular identification of the same MTB strain cultured from both the potential source of the contamination and the false-positive candidate. The molecular tool usually applied in this context is IS6110-RFLP, which takes a long time to provide an answer, usually longer than is acceptable for microbiologists and clinicians to make decisions. Our purpose in this study is to evaluate a novel PCR-based method, MIRU-VNTR, as an alternative to ensure rapid and optimized analysis of cross-contamination alerts. MIRU-VNTR was prospectively compared with IS6110-RFLP for clarifying 19 alerts of false positivity from other laboratories. MIRU-VNTR correlated highly with IS6110-RFLP, reduced the response time by 27 days and clarified six alerts unresolved by RFLP. Additionally, MIRU-VNTR revealed complex situations such as contamination events involving polyclonal isolates and a false-positive case due to simultaneous cross-contamination from two independent sources. Unlike standard RFLP-based genotyping, MIRU-VNTR i) could help reduce the impact of a false-positive diagnosis of TB, ii) increase the number of events that can be solved and iii) reveal the complexity of some cross-contamination events that cannot be dissected by IS6110-RFLP.
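The comparison at the heart of such an investigation is simple to state: two isolates are compatible with a single source only if their genotyping profiles (repeat counts at each typed locus) are identical. The sketch below uses invented 12-locus MIRU-VNTR profiles purely to illustrate that comparison; real investigations involve additional epidemiological evidence.

```python
# Cross-contamination check: identical MIRU-VNTR profiles support a
# single-source (contamination) hypothesis; any mismatch refutes it.
# All profiles below are invented examples, not real data.

def same_strain(profile_a, profile_b):
    """True if repeat counts match at every typed locus."""
    return len(profile_a) == len(profile_b) and profile_a == profile_b

suspected_source  = (2, 3, 5, 3, 1, 4, 2, 2, 6, 3, 4, 3)
false_positive    = (2, 3, 5, 3, 1, 4, 2, 2, 6, 3, 4, 3)
unrelated_patient = (2, 1, 5, 4, 1, 4, 3, 2, 6, 2, 4, 3)

print(same_strain(suspected_source, false_positive))    # contamination plausible
print(same_strain(suspected_source, unrelated_patient)) # independent infection
```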

  11. LABORATORY EVALUATION OF A MICROFLUIDIC ELECTROCHEMICAL SENSOR FOR AEROSOL OXIDATIVE LOAD.

    PubMed

    Koehler, Kirsten; Shapiro, Jeffrey; Sameenoi, Yupaporn; Henry, Charles; Volckens, John

    2014-05-01

    Human exposure to particulate matter (PM) air pollution is associated with human morbidity and mortality. The mechanisms by which PM impacts human health are unresolved, but evidence suggests that PM intake leads to cellular oxidative stress through the generation of reactive oxygen species (ROS). Therefore, reliable tools are needed for estimating the oxidant-generating capacity, or oxidative load, of PM at high temporal resolution (minutes to hours). One of the most widely reported methods for assessing PM oxidative load is the dithiothreitol (DTT) assay. The traditional DTT assay utilizes filter-based PM collection in conjunction with chemical analysis to determine the oxidation rate of reduced DTT in solution with PM. However, the traditional DTT assay suffers from poor time resolution, loss of reactive species during sampling, and a high limit of detection. Recently, a new DTT assay was developed that couples a Particle-Into-Liquid-Sampler with microfluidic-electrochemical detection. This 'on-line' system allows high temporal resolution monitoring of PM reactivity with improved detection limits. This study reports on a laboratory comparison of the traditional and on-line DTT approaches. An urban dust sample was aerosolized in a laboratory test chamber at three atmospherically relevant concentrations. The on-line system gave a stronger correlation between DTT consumption rate and PM mass (R² = 0.69) than the traditional method (R² = 0.40) and increased precision at high temporal resolution compared to the traditional method.
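The R² figures quoted above are squared Pearson correlations between PM mass and DTT consumption rate. A minimal version of that calculation is shown below; the data points are synthetic, chosen only to demonstrate the arithmetic, not drawn from the study.

```python
# Squared Pearson correlation (R^2) between PM mass and DTT
# consumption rate, computed from first principles.

def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy ** 2 / (sxx * syy)

pm_mass  = [10, 25, 50, 75, 100]       # ug/m^3, hypothetical chamber levels
dtt_rate = [0.8, 1.9, 4.2, 5.5, 8.1]   # nmol DTT/min, hypothetical

print(round(r_squared(pm_mass, dtt_rate), 3))
```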

  12. MGAS: a powerful tool for multivariate gene-based genome-wide association analysis.

    PubMed

    Van der Sluis, Sophie; Dolan, Conor V; Li, Jiang; Song, Youqiang; Sham, Pak; Posthuma, Danielle; Li, Miao-Xin

    2015-04-01

    Standard genome-wide association studies, testing the association between one phenotype and a large number of single nucleotide polymorphisms (SNPs), are limited in two ways: (i) traits are often multivariate, and analysis of composite scores entails loss in statistical power and (ii) gene-based analyses may be preferred, e.g. to decrease the multiple testing problem. Here we present a new method, multivariate gene-based association test by extended Simes procedure (MGAS), that allows gene-based testing of multivariate phenotypes in unrelated individuals. Through extensive simulation, we show that under most trait-generating genotype-phenotype models, MGAS has superior statistical power to detect associated genes compared with gene-based analyses of univariate phenotypic composite scores (i.e. GATES, multiple regression), and multivariate analysis of variance (MANOVA). Re-analysis of metabolic data revealed 32 False Discovery Rate controlled genome-wide significant genes, and 12 regions harboring multiple genes; of these 44 regions, 30 were not reported in the original analysis. MGAS allows researchers to conduct their multivariate gene-based analyses efficiently, and without the loss of power that is often associated with an incorrectly specified genotype-phenotype model. MGAS is freely available in KGG v3.0 (http://statgenpro.psychiatry.hku.hk/limx/kgg/download.php). Access to the metabolic dataset can be requested at dbGaP (https://dbgap.ncbi.nlm.nih.gov/). The R-simulation code is available from http://ctglab.nl/people/sophie_van_der_sluis. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
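The Simes procedure at the core of the method combines per-SNP p-values into a single gene-level p-value. The sketch below shows the plain Simes combination only; the extended Simes used by GATES/MGAS additionally replaces the raw test counts with LD-adjusted "effective numbers" of tests, a correction omitted here for clarity. The p-values are invented.

```python
# Plain Simes combination of per-SNP p-values into a gene-level
# p-value: min over ranks j of m * p_(j) / j.

def simes(p_values):
    m = len(p_values)
    ordered = sorted(p_values)
    return min(m * p / (j + 1) for j, p in enumerate(ordered))

snp_pvalues = [0.002, 0.04, 0.31, 0.55, 0.73]  # five SNPs in one gene
print(round(simes(snp_pvalues), 4))
```

For these values the minimum is attained at the smallest p-value (5 × 0.002 / 1), so the gene-level p-value is 0.01.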

  13. RSAT: regulatory sequence analysis tools.

    PubMed

    Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques

    2008-07-01

    The regulatory sequence analysis tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundred researchers from all over the world. Several predictions made with RSAT were validated experimentally and published.
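The Bernoulli background model mentioned above is the simplest of the random controls: each position is drawn independently from the genome-wide base composition, so a word's expected count follows directly from the product of its base frequencies. The base frequencies and example word below are invented for illustration.

```python
# Expected occurrences of an oligonucleotide under a Bernoulli
# background model (independent positions, fixed base composition).

def word_probability(word, base_freq):
    """Probability of the word at one fixed position."""
    p = 1.0
    for base in word:
        p *= base_freq[base]
    return p

def expected_occurrences(word, base_freq, seq_length):
    """Expected count over all possible start positions."""
    positions = seq_length - len(word) + 1
    return positions * word_probability(word, base_freq)

freqs = {"A": 0.3, "C": 0.2, "G": 0.2, "T": 0.3}  # hypothetical genome
print(round(expected_occurrences("GATTA", freqs, 10000), 2))
```

Comparing the observed count of a word against this expectation (e.g. with a binomial test) is the basic idea behind word-based pattern discovery such as oligo-analysis.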

  14. Idaho National Laboratory Cultural Resource Management Office FY 2010 Activity Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollie K. Gilbert; Clayton F. Marler; Christina L. Olson

    2011-09-01

    The Idaho National Laboratory (INL) Site is home to vast numbers and a wide variety of important cultural resources representing at least a 13,500 year span of human land use in the region. As a federal agency, the Department of Energy, Idaho Operations Office (DOE-ID) has legal responsibility for the management and protection of the resources and has contracted these responsibilities to Battelle Energy Alliance (BEA). The BEA professional staff is committed to maintaining a cultural resource management program that accepts the challenge of preserving INL cultural resources in a manner reflecting their importance in local, regional, and national history. This report summarizes activities performed by the INL Cultural Resource Management Office (CRMO) staff during fiscal year 2010. This work is diverse, far-reaching and, though generally confined to INL cultural resource compliance, also includes a myriad of professional and voluntary community activities. This document is intended to be informative to both internal and external stakeholders and to serve as a planning tool for future INL cultural resource management work.

  15. MRM as a discovery tool?

    PubMed

    Rudnick, Paul A

    2015-04-01

    Multiple-reaction monitoring (MRM) of peptides has been recognized as a promising technology because it is sensitive and robust. Borrowed from stable-isotope dilution (SID) methodologies in the field of small molecules, MRM is now routinely used in proteomics laboratories. While its usefulness for validating candidate targets is widely accepted, it has not been established as a discovery tool. Traditional thinking has been that MRM workflows cannot be multiplexed highly enough for efficient profiling, owing to slower instrument scan rates and the complexity of developing increasingly large scheduling methods. In this issue, Colangelo et al. (Proteomics 2015, 15, 1202-1214) describe a pipeline (xMRM) for discovery-style MRM using label-free methods (i.e. relative quantitation). Label-free quantitation brings cost benefits, as does MRM, whose data are easier to analyze than full-scan data. Their paper offers numerous improvements in method design and data analysis. The robustness of their pipeline was tested on rodent postsynaptic density fractions. There, they were able to accurately quantify 112 proteins with a CV of 11.4%, with only 2.5% of the 1697 transitions requiring user intervention. Colangelo et al. aim to extend the reach of MRM deeper into the realm of discovery proteomics, an area that is currently dominated by data-dependent and data-independent workflows. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Clinical laboratory technician to clinical laboratory scientist articulation and distance learning.

    PubMed

    Crowley, J R; Laurich, G A; Mobley, R C; Arnette, A H; Shaikh, A H; Martin, S M

    1999-01-01

    Laboratory workers and educators alike are challenged to support access to education that is current and provides opportunities for career advancement in the work place. The clinical laboratory science (CLS) program at the Medical College of Georgia in Augusta developed a clinical laboratory technician (CLT) to CLS articulation option, expanded it through distance learning, and integrated computer based learning technology into the educational process over a four year period to address technician needs for access to education. Both positive and negative outcomes were realized through these efforts. Twenty-seven students entered the pilot articulation program, graduated, and took a CLS certification examination. Measured in terms of CLS certification, promotions, pay raises, and career advancement, the program described was a success. However, major problems were encountered related to the use of unfamiliar communication technology; administration of the program at distance sites; communication between educational institutions, students, and employers; and competition with CLT programs for internship sites. These problems must be addressed in future efforts to provide a successful distance learning program. Effective methods for meeting educational needs and career ladder expectations of CLTs and their employers are important to the overall quality and appeal of the profession. Educational technology that includes computer-aided instruction, multimedia, and telecommunications can provide powerful tools for education in general and CLT articulation in particular. Careful preparation and vigilant attention to reliable delivery methods as well as students' progress and outcomes are critical for an efficient, economically feasible, and educationally sound program.

  17. Web Environment for Programming and Control of a Mobile Robot in a Remote Laboratory

    ERIC Educational Resources Information Center

    dos Santos Lopes, Maísa Soares; Gomes, Iago Pacheco; Trindade, Roque M. P.; da Silva, Alzira F.; de C. Lima, Antonio C.

    2017-01-01

    Remote robotics laboratories have been successfully used for engineering education. However, few of them use mobile robots to teach computer science. This article describes a mobile robot Control and Programming Environment (CPE) and its pedagogical applications. The system comprises a remote laboratory for robotics, an online programming tool,…

  18. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    PubMed

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and especially useful for samples that demand in vitro labeling. Due to diversity in choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, mzXML-based isobaric labeling quantitator, a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant makes it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Beverage-Agarose Gel Electrophoresis: An Inquiry-Based Laboratory Exercise with Virtual Adaptation

    ERIC Educational Resources Information Center

    Cunningham, Steven C.; McNear, Brad; Pearlman, Rebecca S.; Kern, Scott E.

    2006-01-01

    A wide range of literature and experience has shown that teaching methods that promote active learning, such as inquiry-based approaches, are more effective than those that rely on passive learning. Gel electrophoresis, one of the most common laboratory techniques in molecular biology, has a wide range of applications in the life sciences. As…

  20. Myoglobin structure and function: A multiweek biochemistry laboratory project.

    PubMed

    Silverstein, Todd P; Kirk, Sarah R; Meyer, Scott C; Holman, Karen L McFarlane

    2015-01-01

    We have developed a multiweek laboratory project in which students isolate myoglobin and characterize its structure, function, and redox state. The important laboratory techniques covered in this project include size-exclusion chromatography, electrophoresis, spectrophotometric titration, and FTIR spectroscopy. Regarding protein structure, students work with computer modeling and visualization of myoglobin and its homologues, after which they spectroscopically characterize its thermal denaturation. Students also study protein function (ligand binding equilibrium) and are instructed on topics in data analysis (calibration curves, nonlinear vs. linear regression). This upper division biochemistry laboratory project is a challenging and rewarding one that not only exposes students to a wide variety of important biochemical laboratory techniques but also ties those techniques together to work with a single readily available and easily characterized protein, myoglobin. © 2015 International Union of Biochemistry and Molecular Biology.
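The ligand-binding and regression topics in this project can be sketched concisely. The standard single-site saturation model is Y = [L] / (Kd + [L]); its double-reciprocal linearization, 1/Y = 1 + Kd/[L], lets students recover Kd by ordinary least squares and contrast linear with nonlinear fitting. All numeric values below are synthetic and noise-free, chosen only to illustrate the calculation.

```python
# Single-site ligand binding: fractional saturation and recovery of Kd
# from the double-reciprocal linearization by ordinary least squares.

def fractional_saturation(ligand, kd):
    """Y = [L] / (Kd + [L]) for the single-site model."""
    return ligand / (kd + ligand)

def fit_kd_double_reciprocal(ligands, ys):
    """Regress 1/Y on 1/[L]; the slope equals Kd (intercept = 1)."""
    xs = [1.0 / l for l in ligands]
    inv = [1.0 / y for y in ys]
    n = len(xs)
    mx, my = sum(xs) / n, sum(inv) / n
    slope = (sum(x * v for x, v in zip(xs, inv)) - n * mx * my) / \
            (sum(x * x for x in xs) - n * mx * mx)
    return slope

kd_true = 2.5  # uM, hypothetical dissociation constant
ligands = [0.5, 1.0, 2.0, 5.0, 10.0]
ys = [fractional_saturation(l, kd_true) for l in ligands]
print(round(fit_kd_double_reciprocal(ligands, ys), 3))
```

With noise-free data the linearization recovers Kd exactly; with real titration data the reciprocal transform amplifies error at low [L], which is precisely why the project teaches nonlinear regression as the preferred approach.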

  1. Recent Advances in Algal Genetic Tool Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dahlin, Lukas R.; Guarnieri, Michael T.

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  2. Recent Advances in Algal Genetic Tool Development

    DOE PAGES

    Dahlin, Lukas R.; Guarnieri, Michael T.

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  3. Assessment of Proficiency and Competency in Laboratory Animal Biomethodologies

    PubMed Central

    Clifford, Paula; Melfi, Natasha; Bogdanske, John; Johnson, Elizabeth J; Kehler, James; Baran, Szczepan W

    2013-01-01

    Personnel working with laboratory animals are required by laws and guidelines to be trained and qualified to perform biomethodologic procedures. The assessment of competency and proficiency is a vital component of a laboratory animal training program, because this process confirms that the trainees have met the learning objectives for a particular procedure. The approach toward qualification assessment differs between organizations because laws and guidelines do not outline how the assessment should be performed or which methods and tools should be used. Assessment of clinical and surgical medicine has received considerable attention over the last few decades and has progressed from simple subjective methods to well-defined and objective methods of assessing competency. Although biomethodology competency and proficiency assessment is discussed in the literature, a standard and objective assessment method has not yet been developed. The development and implementation of an objective and standardized biomethodologic assessment program can serve as a tool to improve standards, ensure consistent training, and decrease research variables while ensuring animal welfare. Here we review the definition and goals of training and assessment, review assessment methods, and propose a method to develop a standard and objective assessment program for the laboratory animal science field, particularly for training departments and IACUCs. PMID:24351758

  4. Adoption of lean principles in a high-volume molecular diagnostic microbiology laboratory.

    PubMed

    Mitchell, P Shawn; Mandrekar, Jayawant N; Yao, Joseph D C

    2014-07-01

    Clinical laboratories are constantly facing challenges to do more with less, enhance quality, improve test turnaround time, and reduce operational expenses. Experience with adopting and applying lean concepts and tools used extensively in the manufacturing industry is described for a high-volume clinical molecular microbiology laboratory, illustrating how operational success and benefits can be achieved. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  5. Spreadsheet Assessment Tool v. 2.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, David J.; Martinez, Ruben

    2016-03-03

    The Spreadsheet Assessment Tool (SAT) is an easy-to-use blast assessment tool intended to estimate the potential risk due to an explosive attack on a blood irradiator. The estimation of risk is based on the methodology, assumptions, and results of a detailed blast effects assessment study that is summarized in Sandia National Laboratories Technical Report SAND2015-6166. Risk, as defined in the report and as used in the SAT, is "the potential risk of creating an air blast-induced vent opening at a building's envelope surface". Vent openings can be created at a building's envelope through the failure of an exterior building component, such as a wall, window, or door, due to explosive sabotage of an irradiator within the building. To estimate risk, the tool requires that users obtain and input information pertaining to the building's characteristics and the irradiator location. The tool also suggests several prescriptive mitigation strategies that can be considered to reduce risk. Given the variability in civilian building construction practices, the input parameters used by this tool may not apply to all buildings being assessed. The tool should not be used as a substitute for engineering judgment. The tool is intended for assessment purposes only.

  6. Spray-formed tooling

    NASA Astrophysics Data System (ADS)

    McHugh, K. M.; Key, J. F.

    The United States Council for Automotive Research (USCAR) has formed a partnership with the Idaho National Engineering Laboratory (INEL) to develop a process for the rapid production of low-cost tooling based on spray forming technology developed at the INEL. Phase 1 of the program will involve bench-scale system development, materials characterization, and process optimization. In Phase 2, prototype systems will be designed, constructed, evaluated, and optimized. Process control and other issues that influence commercialization will be addressed during this phase of the project. Technology transfer to USCAR, or a tooling vendor selected by USCAR, will be accomplished during Phase 3. The approach INEL is using to produce tooling, such as plastic injection molds and stamping dies, combines rapid solidification processing and net-shape materials processing into a single step. A bulk liquid metal is pressure-fed into a de Laval spray nozzle transporting a high velocity, high temperature inert gas. The gas jet disintegrates the metal into fine droplets and deposits them onto a tool pattern made from materials such as plastic, wax, clay, ceramics, and metals. The approach is compatible with solid freeform fabrication techniques such as stereolithography, selective laser sintering, and laminated object manufacturing. Heat is extracted rapidly, in-flight, by convection as the spray jet entrains cool inert gas to produce undercooled and semi-solid droplets. At the pattern, the droplets weld together while replicating the shape and surface features of the pattern. Tool formation is rapid; deposition rates in excess of 1 ton/h have been demonstrated for bench-scale nozzles.

  7. International Scavenging for First Responder Guidance and Tools: IAEA Products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stern, W.; Berthelot, L.; Bachner, K.

    In fiscal years (FY) 2016 and 2017, with support from the U.S. Department of Homeland Security (DHS), Brookhaven National Laboratory (BNL) examined the International Atomic Energy Agency (IAEA) radiological emergency response and preparedness products (guidance and tools) to determine which of these products could be useful to U.S. first responders. The IAEA Incident and Emergency Centre (IEC), which is responsible for emergency preparedness and response, offers a range of tools and guidance documents for responders in recognizing, responding to, and recovering from radiation emergencies and incidents. In order to implement this project, BNL obtained all potentially relevant tools and products produced by the IAEA IEC and analyzed these materials to determine their relevance to first responders in the U.S. Subsequently, BNL organized and hosted a workshop at the DHS National Urban Security Technology Laboratory (NUSTL) for U.S. first responders to examine and evaluate IAEA products to consider their applicability to the United States. This report documents and describes the First Responder Product Evaluation Workshop, and provides recommendations on potential steps the U.S. federal government could take to make IAEA guidance and tools useful to U.S. responders.

  8. The Role of the Clinical Laboratory in the Future of Health Care: Lean Microbiology

    PubMed Central

    Samuel, Linoj

    2014-01-01

    This commentary will introduce lean concepts into the clinical microbiology laboratory. The practice of lean in the clinical microbiology laboratory can remove waste, increase efficiency, and reduce costs. Lean, Six Sigma, and other such management initiatives are useful tools and can provide dividends but must be accompanied by organizational leadership commitment to sustaining the lean culture in the laboratory setting and providing resources and time to work through the process. PMID:24574289

  9. Comparison of Performance Predictions for New Low-Thrust Trajectory Tools

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Kos, Larry; Hopkins, Randall; Crane, Tracie

    2006-01-01

    Several low-thrust trajectory optimization tools have been developed over the last 3½ years by the Low Thrust Trajectory Tools development team. This toolset includes both low- to medium-fidelity and high-fidelity tools, which allow the analyst to quickly survey a wide mission trade space and perform advanced mission design. These tools were tested using a set of reference trajectories that exercised each tool's unique capabilities. This paper compares the performance predictions of the various tools against several of the reference trajectories. The intent is to verify agreement between the high-fidelity tools and to quantify the performance prediction differences between tools of different fidelity levels.
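    The kind of cross-tool check described above can be sketched as a percent-difference comparison of each tool's prediction against a reference trajectory result. The tool names and mass values below are invented for illustration, not values from the paper.

```python
def percent_diff(value, reference):
    # signed percent difference of a tool's prediction from the reference
    return 100.0 * (value - reference) / reference

reference = 1500.0                                                   # invented reference result (kg)
predictions = {"ToolA": 1510.0, "ToolB": 1498.0, "ToolC": 1475.0}    # invented tool outputs (kg)

report = {name: round(percent_diff(v, reference), 2) for name, v in predictions.items()}
print(report)   # small percent differences indicate agreement between tools
```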

  10. MIT Lincoln Laboratory Annual Report 2012

    DTIC Science & Technology

    2012-01-01

    Massachusetts Institute of Technology, Lincoln Laboratory, 244 Wood Street, Lexington, MA 02420-9108. The Laboratory's work spans a wide range of research areas, including high-performance detectors and focal planes, 3D integrated circuits, and microelectromechanical devices.

  11. Visual illusion of tool use recalibrates tactile perception

    PubMed Central

    Miller, Luke E.; Longo, Matthew R.; Saygin, Ayse P.

    2018-01-01

    Brief use of a tool recalibrates multisensory representations of the user’s body, a phenomenon called tool embodiment. Despite two decades of research, little is known about its boundary conditions. It has been widely argued that embodiment requires active tool use, suggesting a critical role for somatosensory and motor feedback. The present study used a visual illusion to cast doubt on this view. We used a mirror-based setup to induce a visual experience of tool use with an arm that was in fact stationary. Following illusory tool use, tactile perception was recalibrated on this stationary arm, and with equal magnitude as physical use. Recalibration was not found following illusory passive tool holding, and could not be accounted for by sensory conflict or general interhemispheric plasticity. These results suggest visual tool-use signals play a critical role in driving tool embodiment. PMID:28196765

  12. Novel assessment tools to evaluate clinical and laboratory responses in a subset of patients enrolled in the Rituximab in Myositis trial.

    PubMed

    Rider, Lisa G; Yip, Adrienne L; Horkayne-Szakaly, Iren; Volochayev, Rita; Shrader, Joseph A; Turner, Maria L; Kong, Heidi H; Jain, Minal S; Jansen, Anna V; Oddis, Chester V; Fleisher, Thomas A; Miller, Frederick W

    2014-01-01

    We aimed to assess changes in myositis core set measures and ancillary clinical and laboratory data from the National Institutes of Health's subset of patients enrolled in the Rituximab in Myositis trial. Eighteen patients (5 dermatomyositis, 8 polymyositis, 5 juvenile dermatomyositis) completed more in-depth testing of muscle strength and cutaneous assessments, patient-reported outcomes, and laboratory tests before and after administration of rituximab. Percentage change in individual measures and in the definitions of improvement (DOIs) and standardized response means were examined over 44 weeks. Core set activity measures improved by 18-70% from weeks 0-44 and were sensitive to change. Fifteen patients met the DOI at week 44, 9 patients met a DOI 50% response, and 4 met a DOI 70% response. Muscle strength and function measures were more sensitive to change than cutaneous assessments. Constitutional, gastrointestinal, and pulmonary systems improved 44-70%. Patient-reported outcomes improved up to 28%. CD20+ B cells were depleted in the periphery, but B cell depletion was not associated with clinical improvement at week 16. This subset of patients had high rates of clinical response to rituximab, similar to patients in the overall trial. Most measures were responsive, and muscle strength had a greater degree of change than cutaneous assessments. Several novel assessment tools, including measures of strength and function, extra-muscular organ activity, fatigue, and health-related quality of life, are promising for use in future myositis trials. Further study of B cell-depleting therapies in myositis, particularly in treatment-naïve patients, is warranted.
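    The standardized response means reported above follow the standard definition: SRM = mean of the paired changes divided by the standard deviation of those changes. A minimal sketch with invented scores, not trial data:

```python
def standardized_response_mean(before, after):
    # SRM = mean of paired changes / sample SD (n-1) of those changes
    changes = [a - b for b, a in zip(before, after)]
    n = len(changes)
    mean = sum(changes) / n
    var = sum((c - mean) ** 2 for c in changes) / (n - 1)
    return mean / var ** 0.5

# invented 0-10 disease-activity scores for six patients, week 0 vs. week 44
week0 = [6.0, 7.0, 5.0, 8.0, 6.5, 7.5]
week44 = [3.0, 4.5, 3.0, 5.0, 4.0, 5.5]
print(round(standardized_response_mean(week0, week44), 2))
```

    A large-magnitude SRM indicates a measure is sensitive to change; the sign simply reflects the direction in which the scale improves.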

  13. The application of genome-wide 5-hydroxymethylcytosine studies in cancer research.

    PubMed

    Thomson, John P; Meehan, Richard R

    2017-01-01

    Early detection and characterization of molecular events associated with tumorigenesis remain high priorities. Genome-wide epigenetic assays are promising diagnostic tools, as aberrant epigenetic events are frequent and often cancer specific. The deposition and analysis of multiple patient-derived cancer epigenomic profiles contributes to our appreciation of the underlying biology, aiding the detection of novel identifiers for cancer subtypes. Modifying enzymes and co-factors regulating these epigenetic marks are frequently mutated in cancers, and as epigenetic modifications themselves are reversible, this makes their study very attractive with respect to pharmaceutical intervention. Here we focus on the novel modified base, 5-hydroxymethylcytosine, and discuss how genome-wide 5-hydroxymethylcytosine profiling expedites our molecular understanding of cancer, serves as a lineage tracer, classifies the mode of action of potentially carcinogenic agents, and clarifies the roles of potential novel cancer drug targets, thus assisting the development of new diagnostic/prognostic tools.

  14. Implementation of Quality Management in Core Service Laboratories

    PubMed Central

    Creavalle, T.; Haque, K.; Raley, C.; Subleski, M.; Smith, M.W.; Hicks, B.

    2010-01-01

    CF-28 The Genetics and Genomics group of the Advanced Technology Program of SAIC-Frederick exists to bring innovative genomic expertise, tools and analysis to NCI and the scientific community. The Sequencing Facility (SF) provides next generation short read (Illumina) sequencing capacity to investigators using a streamlined production approach. The Laboratory of Molecular Technology (LMT) offers a wide range of genomics core services including microarray expression analysis, miRNA analysis, array comparative genome hybridization, long read (Roche) next generation sequencing, quantitative real time PCR, transgenic genotyping, Sanger sequencing, and clinical mutation detection services to investigators from across the NIH. As the technology supporting this genomic research becomes more complex, the need for basic quality processes within all aspects of the core service groups becomes critical. The Quality Management group works alongside members of these labs to establish or improve processes supporting operations control (equipment, reagent and materials management), process improvement (reengineering/optimization, automation, acceptance criteria for new technologies and tech transfer), and quality assurance and customer support (controlled documentation/SOPs, training, service deficiencies and continual improvement efforts). Implementation and expansion of quality programs within unregulated environments demonstrates SAIC-Frederick's dedication to providing the highest quality products and services to the NIH community.

  15. THE NATIONAL EXPOSURE RESEARCH LABORATORY'S COMPREHENSIVE HUMAN ACTIVITY DATABASE

    EPA Science Inventory

    EPA's National Exposure Research Laboratory (NERL) has combined data from nine U.S. studies related to human activities into one comprehensive data system that can be accessed via the world-wide web. The data system is called CHAD-Consolidated Human Activity Database-and it is ...

  16. The latest progress in sugarcane molecular genetics research at the USDA-ARS, Sugarcane Research Laboratory

    USDA-ARS?s Scientific Manuscript database

    In 2005, two sugar molecular genetics tools were developed in the USDA-ARS, Southeast Area, Sugarcane Research Laboratory at Houma, LA. One is the high throughput fluorescence- and capillary electrophoregrams (CE)-based SSR genotyping tool and the other is single pollen collection and SSR genotyping...

  17. The laboratory report: A pedagogical tool in college science courses

    NASA Astrophysics Data System (ADS)

    Ferzli, Miriam

    When viewed as a product rather than a process that aids in student learning, the lab report may become rote, busywork for both students and instructors. Students fail to see the purpose of the lab report, and instructors see them as a heavy grading load. If lab reports are taught as part of a process rather than a product that aims to "get the right answer," they may serve as pedagogical tools in college science courses. In response to these issues, an in-depth, web-based tutorial named LabWrite (www.ncsu.edu/labwrite) was developed to help students and instructors (www.ncsu.edu/labwrite/instructors) understand the purpose of the lab report as grounded in the written discourse and processes of science. The objective of this post-test only quasi-experimental study was to examine the role that in-depth instruction such as LabWrite plays in helping students to develop skills characteristic of scientifically literate individuals. Student lab reports from an introductory-level biology course at NC State University were scored for overall understanding of scientific concepts and scientific ways of thinking. The study also looked at students' attitudes toward science and lab report writing, as well as students' perceptions of lab reports in general. Significant statistical findings from this study show that students using LabWrite were able to write lab reports that showed a greater understanding of scientific investigations (p < .003) and scientific ways of thinking (p < .0001) than students receiving traditional lab report writing instruction. LabWrite also helped students develop positive attitudes toward lab reports as compared to non-LabWrite users (p < .01). Students using LabWrite seemed to perceive the lab report as a valuable tool for determining learning objectives, understanding science concepts, revisiting the lab experience, and documenting their learning.

  18. Using Self-Reflection to Increase Science Process Skills in the General Chemistry Laboratory

    ERIC Educational Resources Information Center

    Veal, William R.; Taylor, Dawne; Rogers, Amy L.

    2009-01-01

    Self-reflection is a tool of instruction that has been used in the science classroom. Research has shown great promise in using video as a learning tool in the classroom. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills has not been done. Immediate video…

  19. The EnzymeTracker: an open-source laboratory information management system for sample tracking.

    PubMed

    Triplet, Thomas; Butler, Gregory

    2012-01-26

    In many laboratories, researchers store experimental data on their own workstation using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, which laboratories need to pay expensive license fees for. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is freely available online at http
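    A minimal illustration (not the EnzymeTracker codebase) of what a sample-tracking LIMS adds over a plain spreadsheet: stable sample identifiers usable as barcode payloads, and validated status fields that reject bad entries. All names and statuses below are invented.

```python
class SampleTracker:
    VALID_STATUSES = {"received", "in_assay", "characterized", "archived"}

    def __init__(self):
        self._samples = {}
        self._next_id = 1

    def add(self, name, owner):
        sample_id = f"S{self._next_id:05d}"   # stable ID, usable as a QR/barcode payload
        self._next_id += 1
        self._samples[sample_id] = {"name": name, "owner": owner, "status": "received"}
        return sample_id

    def set_status(self, sample_id, status):
        if status not in self.VALID_STATUSES:  # validation a spreadsheet would not enforce
            raise ValueError(f"unknown status: {status}")
        self._samples[sample_id]["status"] = status

    def by_owner(self, owner):
        return [sid for sid, s in self._samples.items() if s["owner"] == owner]

tracker = SampleTracker()
sid = tracker.add("cellulase candidate 12", "alice")
tracker.set_status(sid, "in_assay")
print(sid, tracker.by_owner("alice"))
```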

  20. The EnzymeTracker: an open-source laboratory information management system for sample tracking

    PubMed Central

    2012-01-01

    Background: In many laboratories, researchers store experimental data on their own workstation using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, which laboratories need to pay expensive license fees for. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. Results: In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. Conclusions: The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is

  1. Automated MeSH indexing of the World-Wide Web.

    PubMed Central

    Fowler, J.; Kouramajian, V.; Maram, S.; Devadhar, V.

    1995-01-01

    To facilitate networked discovery and information retrieval in the biomedical domain, we have designed a system for automatic assignment of Medical Subject Headings to documents retrieved from the World-Wide Web. Our prototype implementations show significant promise. We describe our methods and discuss the further development of a completely automated indexing tool called the "Web-MeSH Medibot." PMID:8563421
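    A toy sketch of the heading-assignment idea described above: match a document's words against a term-to-heading dictionary. The real Web-MeSH Medibot is far more sophisticated; the mini-vocabulary and example document here are invented.

```python
# invented mini-vocabulary mapping free-text terms to subject headings
MINI_MESH = {
    "influenza": "Influenza, Human",
    "myoglobin": "Myoglobin",
    "proteomics": "Proteomics",
    "microbiology": "Microbiology",
}

def assign_headings(text):
    # assign every heading whose trigger term appears in the document
    words = set(text.lower().split())
    return sorted({heading for term, heading in MINI_MESH.items() if term in words})

doc = "A proteomics study of myoglobin oxidation"
print(assign_headings(doc))
```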

  2. DNA-Based Methods in the Immunohematology Reference Laboratory

    PubMed Central

    Denomme, Gregory A

    2010-01-01

    Although hemagglutination serves the immunohematology reference laboratory well, when used alone, it has limited capability to resolve complex problems. This overview discusses how molecular approaches can be used in the immunohematology reference laboratory. In order to apply molecular approaches to immunohematology, knowledge of genes, DNA-based methods, and the molecular bases of blood groups are required. When applied correctly, DNA-based methods can predict blood groups to resolve ABO/Rh discrepancies, identify variant alleles, and screen donors for antigen-negative units. DNA-based testing in immunohematology is a valuable tool used to resolve blood group incompatibilities and to support patients in their transfusion needs. PMID:21257350

  3. An evaluation of the accuracy and speed of metagenome analysis tools

    PubMed Central

    Lindgreen, Stinus; Adair, Karen L.; Gardner, Paul P.

    2016-01-01

    Metagenome studies are becoming increasingly widespread, yielding important insights into microbial communities covering diverse environments from terrestrial and aquatic ecosystems to human skin and gut. With the advent of high-throughput sequencing platforms, the use of large scale shotgun sequencing approaches is now commonplace. However, a thorough independent benchmark comparing state-of-the-art metagenome analysis tools is lacking. Here, we present a benchmark where the most widely used tools are tested on complex, realistic data sets. Our results clearly show that the most widely used tools are not necessarily the most accurate, that the most accurate tool is not necessarily the most time consuming, and that there is a high degree of variability between available tools. These findings are important as the conclusions of any metagenomics study are affected by errors in the predicted community composition and functional capacity. Data sets and results are freely available from http://www.ucbioinformatics.org/metabenchmark.html PMID:26778510
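    The benchmark's core comparison can be sketched as scoring each tool's predicted community profile against the known truth while recording runtime. The tool names, abundance profiles, and timings below are invented, not results from the study.

```python
def l1_error(predicted, truth):
    # total absolute deviation between relative-abundance profiles
    taxa = set(predicted) | set(truth)
    return sum(abs(predicted.get(t, 0.0) - truth.get(t, 0.0)) for t in taxa)

truth = {"Bacteroides": 0.5, "E.coli": 0.3, "Prevotella": 0.2}   # known composition
results = {
    "toolX": ({"Bacteroides": 0.45, "E.coli": 0.35, "Prevotella": 0.2}, 120),  # (profile, seconds)
    "toolY": ({"Bacteroides": 0.6, "E.coli": 0.1, "Archaea": 0.3}, 15),
}
# rank tools by accuracy; runtime is carried along for the speed comparison
ranked = sorted((l1_error(p, truth), secs, name) for name, (p, secs) in results.items())
print([(name, round(err, 2), secs) for err, secs, name in ranked])
```

    As the abstract notes, the most accurate tool need not be the slowest; here the invented "toolX" is both more accurate and slower, so the trade-off is explicit in the ranking.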

  4. Energy efficiency in California laboratory-type facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, E.; Bell, G.; Sartor, D.

    The central aim of this project is to provide knowledge and tools for increasing the energy efficiency and performance of new and existing laboratory-type facilities in California. We approach the task along three avenues: (1) identification of current energy use and savings potential, (2) development of a Design Guide for Energy-Efficient Research Laboratories, and (3) development of a research agenda for focused technology development and improving our understanding of the market. Laboratory-type facilities use a considerable amount of energy resources. They are also important to the local and state economy, and energy costs are a factor in the overall competitiveness of industries utilizing laboratory-type facilities. Although the potential for energy savings is considerable, improving energy efficiency in laboratory-type facilities is no easy task, and there are many formidable barriers to improving energy efficiency in these specialized facilities. Insufficient motivation for individual stakeholders to invest in improving energy efficiency using existing technologies, as well as in conducting related R&D, is indicative of the "public goods" nature of the opportunity to achieve energy savings in this sector. Due to demanding environmental control requirements and specialized processes, laboratory-type facilities epitomize the important intersection between energy demands in the buildings sector and the industrial sector. Moreover, given the high importance and value of the activities conducted in laboratory-type facilities, they represent one of the most powerful contexts in which energy efficiency improvements stand to yield abundant non-energy benefits if properly applied.

  5. Inexpensive optical tweezers for undergraduate laboratories

    NASA Astrophysics Data System (ADS)

    Smith, Stephen P.; Bhalotra, Sameer R.; Brody, Anne L.; Brown, Benjamin L.; Boyda, Edward K.; Prentiss, Mara

    1999-01-01

    Single-beam gradient force optical traps, or tweezers, are a powerful tool for a wide variety of experiments in physics, chemistry, and biology. We describe how to build an optical tweezer with a total cost of ≈$6500 using only commercially available optics and mounts. We also suggest measurements that could be made using the apparatus.

  6. Development and testing of a European Union-wide farm-level carbon calculator

    PubMed Central

    Tuomisto, Hanna L; De Camillis, Camillo; Leip, Adrian; Nisini, Luigi; Pelletier, Nathan; Haastrup, Palle

    2015-01-01

    Direct greenhouse gas (GHG) emissions from agriculture accounted for approximately 10% of total European Union (EU) emissions in 2010. To reduce farming-related GHG emissions, appropriate policy measures and supporting tools for promoting low-C farming practices may be efficacious. This article presents the methodology and testing results of a new EU-wide, farm-level C footprint calculator. The Carbon Calculator quantifies GHG emissions based on international standards and technical specifications on Life Cycle Assessment (LCA) and C footprinting. The tool delivers its results both at the farm level and as allocated to up to 5 main products of the farm. In addition to the quantification of GHG emissions, the calculator proposes mitigation options and sequestration actions that may be suitable for individual farms. The results obtained during a survey made on 54 farms from 8 EU Member States are presented. These farms were selected in view of representing the diversity of farm types across different environmental zones in the EU. The results of the C footprint of products in the data set show a wide range of variation between minimum and maximum values. The results of the mitigation actions showed that the tool can help identify practices that can lead to substantial emission reductions. To avoid burden-shifting from climate change to other environmental issues, future improvements of the tool should include incorporation of other environmental impact categories in place of solely focusing on GHG emissions. Integr Environ Assess Manag 2015;11:404–416. © 2015 The Authors. Published by Wiley Periodicals, Inc. on behalf of SETAC. Key Points: The methodology and testing results of a new European Union-wide, farm-level carbon calculator are presented. The Carbon Calculator reports life cycle assessment-based greenhouse gas emissions at farm and product levels and recommends farm-specific mitigation actions. Based on the results obtained from testing the tool in 54
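    Schematically, such a calculator multiplies activity data by emission factors, sums to a farm total, and allocates that total across the farm's products. The factors, activity data, and allocation shares below are placeholders for illustration, not the tool's actual coefficients.

```python
EMISSION_FACTORS = {              # kg CO2e per unit of activity (placeholder values)
    "diesel_l": 2.7,
    "n_fertilizer_kg": 5.6,
    "dairy_cow_head": 3000.0,
}

def farm_footprint(activities):
    # activity data x emission factor, summed over the farm
    return sum(EMISSION_FACTORS[k] * v for k, v in activities.items())

def allocate(total, product_shares):
    # economic allocation: split the farm total by revenue share
    return {p: round(total * share, 1) for p, share in product_shares.items()}

activities = {"diesel_l": 4000, "n_fertilizer_kg": 1200, "dairy_cow_head": 50}
total = farm_footprint(activities)
print(round(total), allocate(total, {"milk": 0.8, "beef": 0.2}))
```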

  7. PACOM: A Versatile Tool for Integrating, Filtering, Visualizing, and Comparing Multiple Large Mass Spectrometry Proteomics Data Sets.

    PubMed

    Martínez-Bartolomé, Salvador; Medina-Aunon, J Alberto; López-García, Miguel Ángel; González-Tejedo, Carmen; Prieto, Gorka; Navajas, Rosana; Salazar-Donate, Emilio; Fernández-Costa, Carolina; Yates, John R; Albar, Juan Pablo

    2018-04-06

    Mass-spectrometry-based proteomics has evolved into a high-throughput technology in which numerous large-scale data sets are generated from diverse analytical platforms. Furthermore, several scientific journals and funding agencies have emphasized the storage of proteomics data in public repositories to facilitate its evaluation, inspection, and reanalysis. As a consequence, public proteomics data repositories are growing rapidly. However, tools are needed to integrate multiple proteomics data sets to compare different experimental features or to perform quality control analysis. Here, we present a new Java stand-alone tool, Proteomics Assay COMparator (PACOM), that is able to import, combine, and simultaneously compare numerous proteomics experiments to check the integrity of the proteomic data as well as verify data quality. With PACOM, the user can detect sources of error that may have been introduced in any step of a proteomics workflow and that influence the final results. Data sets can be easily compared and integrated, and data quality and reproducibility can be visually assessed through a rich set of graphical representations of proteomics data features as well as a wide variety of data filters. Its flexibility and easy-to-use interface make PACOM a unique tool for daily use in a proteomics laboratory. PACOM is available at https://github.com/smdb21/pacom.

  8. Service Learning and Building Community with the World Wide Web

    ERIC Educational Resources Information Center

    Longan, Michael W.

    2007-01-01

    The geography education literature touts the World Wide Web (Web) as a revolutionary educational tool, yet most accounts ignore its uses for public communication and creative expression. This article argues that students can be producers of content that is of service to local audiences. Drawing inspiration from the community networking movement,…

  9. Overview of theory and simulations in the Heavy Ion Fusion Science Virtual National Laboratory

    NASA Astrophysics Data System (ADS)

    Friedman, Alex

    2007-07-01

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL) is a collaboration of Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Princeton Plasma Physics Laboratory. These laboratories, in cooperation with researchers at other institutions, are carrying out a coordinated effort to apply intense ion beams as drivers for studies of the physics of matter at extreme conditions, and ultimately for inertial fusion energy. Progress on this endeavor depends upon coordinated application of experiments, theory, and simulations. This paper describes the state of the art, with an emphasis on the coordination of modeling and experiment; developments in the simulation tools, and in the methods that underlie them, are also treated.

  10. Automated cell analysis tool for a genome-wide RNAi screen with support vector machine based supervised learning

    NASA Astrophysics Data System (ADS)

    Remmele, Steffen; Ritzerfeld, Julia; Nickel, Walter; Hesser, Jürgen

    2011-03-01

    RNAi-based high-throughput microscopy screens have become an important tool in the biological sciences for decrypting the mostly unknown biological functions of human genes. However, manual analysis is impossible for such screens, since the number of image data sets can often be in the hundred thousands. Reliable automated tools are thus required to analyse fluorescence microscopy image data sets that usually contain two or more reaction channels. The image analysis tool presented here is designed to analyse an RNAi screen investigating the intracellular trafficking and targeting of acylated Src kinases. In this specific screen, a data set consists of three reaction channels, and the investigated cells can appear in different phenotypes. The main challenges of the image processing task are an automatic cell segmentation that has to be robust and accurate for all phenotypes, and a subsequent phenotype classification. The cell segmentation is done in two steps: the cell nuclei are segmented first, and a classifier-enhanced region growing seeded on the nuclei then segments the cells. The classification of the cells is realized by a support vector machine that has to be trained manually using supervised learning. Furthermore, the tool is brightness invariant, allowing different staining quality, and it provides a quality control that copes with typical defects during preparation and acquisition. A first version of the tool has already been successfully applied to an RNAi screen comprising three hundred thousand image data sets, and the SVM-extended version is designed for additional screens.
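    The two-step pipeline described above (nuclei segmentation, then supervised phenotype classification) can be sketched in miniature as follows. The toy image, the random feature vectors, and the two phenotype labels are synthetic stand-ins for real channel measurements; only the overall structure mirrors the abstract.

```python
import numpy as np
from scipy import ndimage
from sklearn.svm import SVC

# Step 1: "nuclei segmentation" on a toy thresholded image via connected components.
nuclei_mask = np.zeros((10, 10), dtype=bool)
nuclei_mask[1:3, 1:3] = True   # nucleus 1
nuclei_mask[6:9, 6:9] = True   # nucleus 2
labels, n_nuclei = ndimage.label(nuclei_mask)

# Step 2: supervised phenotype classification. Each segmented cell is summarised
# by a feature vector (e.g. per-channel intensity statistics); random features
# drawn around two class means stand in for real measurements here.
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0.0, 0.3, (20, 3)),
                     rng.normal(2.0, 0.3, (20, 3))])
y_train = np.array([0] * 20 + [1] * 20)  # 0 = normal, 1 = trafficking defect

clf = SVC(kernel="rbf").fit(X_train, y_train)
pred = clf.predict([[0.1, -0.1, 0.0], [1.9, 2.1, 2.0]])
```

    In the real tool the training labels come from manual annotation of example cells, which is the "supervised learning" step the abstract refers to.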

  11. Design and implementation of a hospital-based usability laboratory: insights from a Department of Veterans Affairs laboratory for health information technology.

    PubMed

    Russ, Alissa L; Weiner, Michael; Russell, Scott A; Baker, Darrell A; Fahner, W Jeffrey; Saleem, Jason J

    2012-12-01

    Although the potential benefits of more usable health information technologies (HIT) are substantial (reduced HIT support costs, increased work efficiency, and improved patient safety), human factors methods to improve usability are rarely employed. The US Department of Veterans Affairs (VA) has emerged as an early leader in establishing usability laboratories to inform the design of HIT, including its electronic health record. Experience with a usability laboratory at a VA Medical Center provides insights on how to design, implement, and leverage usability laboratories in the health care setting. The VA Health Services Research and Development Service Human-Computer Interaction & Simulation Laboratory emerged as one of the first VA usability laboratories and was intended to provide research-based findings about HIT designs. This laboratory supports rapid prototyping, formal usability testing, and analysis tools to assess existing technologies, alternative designs, and potential future technologies. Although the laboratory has maintained a research focus, it has become increasingly integrated with VA operations, both within the medical center and at the national VA level. With this resource, data-driven recommendations have been provided for the design of HIT applications before and after implementation. The demand for usability testing of HIT is increasing, and information on how to develop usability laboratories for the health care setting is often needed. This article may assist other health care organizations that want to invest in usability resources to improve HIT. The establishment and utilization of usability laboratories in the health care setting may improve HIT designs and promote safe, high-quality care for patients.

  12. Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Veal, William R.; Taylor, Dawne; Rogers, Amy L.

    2009-03-01

    Self-reflection is a tool of instruction that has been used in the science classroom. Research has shown great promise in using video as a learning tool in the classroom. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills had not previously been done. Immediate video feedback and direct instruction were employed in a general chemistry laboratory course to improve students' mastery and understanding of basic and advanced process skills. Qualitative results and statistical analysis of quantitative data showed that self-reflection significantly helped students develop basic and advanced process skills, yet did not seem to influence their general understanding of the science content.

  13. Teaching chemistry and other sciences to blind and low-vision students through hands-on learning experiences in high school science laboratories

    NASA Astrophysics Data System (ADS)

    Supalo, Cary Alan

    2010-11-01

    Students with blindness and low vision (BLV) have traditionally been underrepresented in the sciences as a result of technological and attitudinal barriers to equal access in science laboratory classrooms. The Independent Laboratory Access for the Blind (ILAB) project developed and evaluated a suite of talking and audible hardware/software tools to empower students with BLV to have multisensory, hands-on laboratory learning experiences. This dissertation focuses on the first year of ILAB tool testing in mainstream science laboratory classrooms, and comprises a detailed multi-case study of four students with BLV who were enrolled in high school science classes during 2007--08 alongside sighted students. Participants attended different schools; curricula included chemistry, AP chemistry, and AP physics. The ILAB tools were designed to provide multisensory means for students with BLV to make observations and collect data during standard laboratory lessons on an equivalent basis with their sighted peers. Various qualitative and quantitative data collection instruments were used to determine whether the hands-on experiences facilitated by the ILAB tools had led to increased involvement in laboratory-goal-directed actions, greater peer acceptance in the students' lab groups, improved attitudes toward science, and increased interest in science. Premier among the ILAB tools was the JAWS/Logger Pro software interface, which made audible all information gathered through standard Vernier laboratory probes and visually displayed through Logger Pro. ILAB tools also included a talking balance, a submersible audible light sensor, a scientific talking stopwatch, and a variety of other high-tech and low-tech devices and techniques. While results were mixed, all four participating BLV students seemed to have experienced at least some benefit, with the effect being stronger for some than for others. 
Not all of the data collection instruments were found to reveal improvements for all

  14. Recent advances in managing vascular occlusions in the cardiac catheterization laboratory

    PubMed Central

    Qureshi, Athar M.; Mullins, Charles E.; Latson, Larry A.

    2018-01-01

    Vascular occlusions continue to be a significant cause of morbidity and mortality. The management of vascular occlusions in patients is complex, requiring specialized expertise in the cardiac catheterization laboratory and from other disciplines. Knowledge of currently available tools at the operator’s disposal is important to optimize the success of these procedures. In this review, we discuss some of the recent advances in recanalization procedures of vascular occlusions and thrombotic lesions in the cardiac catheterization laboratory. PMID:29770200

  15. [The analytical reliability of clinical laboratory information and role of the standards in its support].

    PubMed

    Men'shikov, V V

    2012-12-01

    The article deals with the factors impacting the reliability of clinical laboratory information. The differences in quality among laboratory analysis tools produced by various manufacturers are discussed. These differences are the causes of discrepancies between the results of laboratory analyses of the same analyte. The role of the reference system in supporting the comparability of laboratory analysis results is demonstrated. A draft national standard is presented to regulate the requirements for standards and calibrators used in the analysis of qualitative and non-metrical characteristics of components of biomaterials.

  16. Laboratory security and emergency response guidance for laboratories working with select agents. Centers for Disease Control and Prevention.

    PubMed

    Richmond, Jonathan Y; Nesby-O'Dell, Shanna L

    2002-12-06

    In recent years, concern has increased regarding the use of biologic materials as agents of terrorism, but these same agents are often necessary tools in clinical and research microbiology laboratories. Traditional biosafety guidelines for laboratories have emphasized use of optimal work practices, appropriate containment equipment, well-designed facilities, and administrative controls to minimize the risk of worker injury and to ensure safeguards against laboratory contamination. The guidelines discussed in this report were first published in 1999 (U.S. Department of Health and Human Services/CDC and National Institutes of Health. Biosafety in microbiological and biomedical laboratories [BMBL]. Richmond JY, McKinney RW, eds. 4th ed. Washington, DC: US Department of Health and Human Services, 1999 [Appendix F]). In that report, physical security concerns were addressed, and efforts were focused on preventing unauthorized entry to laboratory areas and preventing unauthorized removal of dangerous biologic agents from the laboratory. Appendix F of BMBL is now being revised to include additional information regarding personnel risk assessments and inventory controls. The guidelines contained in this report are intended for laboratories working with select agents under biosafety-level 2, 3, or 4 conditions as described in Sections II and III of BMBL. These recommendations include conducting facility risk assessments and developing comprehensive security plans to minimize the probability of misuse of select agents. Risk assessments should include systematic, site-specific reviews of 1) physical security; 2) security of data and electronic technology systems; 3) employee security; 4) access controls to laboratory and animal areas; 5) procedures for agent inventory and accountability; 6) shipping/transfer and receiving of select agents; 7) unintentional incident and injury policies; 8) emergency response plans; and 9) policies that address breaches in security.
The security plan

  17. Online Analysis of Wind and Solar Part I: Ramping Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, given the concerns arising from the limited predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

  18. Dynamic principle for ensemble control tools.

    PubMed

    Samoletov, A; Vasiev, B

    2017-11-28

    Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
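    As a concrete point of reference for the thermostat schemes discussed above, the sketch below integrates a one-dimensional harmonic oscillator with a standard Langevin thermostat (a splitting scheme with a friction-plus-noise velocity update). This is one conventional stochastic thermostat, not the authors' proposed construction; if the dynamics preserve the canonical measure, the sampled mean of x² should approach kT/k_spring.

```python
import math, random

def langevin_step(x, v, dt, gamma, kT, force, mass=1.0):
    """One step of Langevin dynamics: kick, drift, thermostat, drift, kick."""
    c1 = math.exp(-gamma * dt)
    c2 = math.sqrt((1.0 - c1 * c1) * kT / mass)
    v = v + 0.5 * dt * force(x) / mass
    x = x + 0.5 * dt * v
    v = c1 * v + c2 * random.gauss(0.0, 1.0)   # friction + canonical noise
    x = x + 0.5 * dt * v
    v = v + 0.5 * dt * force(x) / mass
    return x, v

random.seed(1)
kT, k_spring = 1.0, 1.0
force = lambda x: -k_spring * x
x, v, samples = 0.0, 0.0, []
for i in range(200000):
    x, v = langevin_step(x, v, dt=0.05, gamma=1.0, kT=kT, force=force)
    if i > 10000:                     # discard equilibration
        samples.append(x * x)
mean_x2 = sum(samples) / len(samples)  # should be close to kT/k_spring = 1.0
```

    Checking such ensemble averages against their canonical values is the basic numerical test of any thermostat, deterministic or stochastic.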

  19. Comparison of a gross anatomy laboratory to online anatomy software for teaching anatomy.

    PubMed

    Mathiowetz, Virgil; Yu, Chih-Huang; Quake-Rapp, Cindee

    2016-01-01

    This study was designed to assess the grades, self-perceived learning, and satisfaction between occupational therapy students who used a gross anatomy laboratory versus online anatomy software (AnatomyTV) as tools to learn anatomy at a large public university and a satellite campus in the mid-western United States. The goal was to determine if equivalent learning outcomes could be achieved regardless of learning tool used. In addition, it was important to determine why students chose the gross anatomy laboratory over online AnatomyTV. A two group, post-test only design was used with data gathered at the end of the course. Primary outcomes were students' grades, self-perceived learning, and satisfaction. In addition, a survey was used to collect descriptive data. One cadaver prosection was available for every four students in the gross anatomy laboratory. AnatomyTV was available online through the university library. At the conclusion of the course, the gross anatomy laboratory group had a significantly higher grade percentage, self-perceived learning, and satisfaction than the AnatomyTV group. However, the practical significance of the difference is debatable. The significantly greater time spent in the gross anatomy laboratory during the laboratory portion of the course may have affected the study outcomes. In addition, some students may not consider the difference between a B+ and an A- grade practically significant. Further research needs to be conducted to identify what specific anatomy teaching resources are most effective beyond prosection for students without access to a gross anatomy laboratory. © 2015 American Association of Anatomists.

  20. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    NASA Astrophysics Data System (ADS)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
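    The class of problems RT1D addresses can be illustrated with a compact example: one-dimensional advection-dispersion transport with a first-order decay reaction, stepped with an explicit finite-difference scheme. Parameters, grid, and boundary conditions below are illustrative choices, not taken from RT1D itself (which runs in EXCEL VBA rather than Python).

```python
import numpy as np

def step(c, v, D, k, dx, dt):
    """One explicit time step of dc/dt = -v dc/dx + D d2c/dx2 - k c."""
    adv = -v * (c[1:-1] - c[:-2]) / dx                 # first-order upwind advection
    disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2  # central-difference dispersion
    new = c.copy()
    new[1:-1] = c[1:-1] + dt * (adv + disp - k * c[1:-1])
    new[0] = 1.0        # constant-concentration inlet boundary
    new[-1] = new[-2]   # zero-gradient outlet boundary
    return new

nx, dx, dt = 100, 0.01, 0.001        # grid spacing and (stability-limited) time step
v, D, k = 0.5, 1e-3, 0.1             # velocity, dispersion, decay coefficients
c = np.zeros(nx); c[0] = 1.0
for _ in range(1000):                # advance one time unit
    c = step(c, v, D, k, dx, dt)
```

    The explicit scheme keeps the sketch short; for stiff geochemical reaction networks of the kind RT1D benchmarks, operator splitting with an implicit reaction solver is the more typical design choice.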

  1. [Tasks and duties of veterinary reference laboratories for food borne zoonoses].

    PubMed

    Ellerbroek, Lüppo; Alter, T; Johne, R; Nöckler, K; Beutin, L; Helmuth, R

    2009-02-01

    Reference laboratories are of central importance for consumer protection. Field expertise and high scientific competence are basic requirements for the nomination of a national reference laboratory. To ensure a common approach in the analysis of zoonotic hazards, standards have been developed by the reference laboratories together with national official laboratories on the basis of Art. 33 of Regulation (EC) No. 882/2004. Reference laboratories serve as arbiters in cases of ambivalent or disputed results. New methods for detection of zoonotic agents are developed and validated to provide tools for analysis, e.g., in legal cases, if results from different parties are disputed. Besides these tasks, national reference laboratories offer capacity building and advanced training courses and conduct ring trials to ensure consistent quality of analyses in official laboratories. All reference laboratories work according to the ISO standard 17025, which defines the grounds for strict laboratory quality rules, and in cooperation with the respective Community Reference Laboratories (CRL). From the group of veterinary reference laboratories for food-borne zoonoses, the national reference laboratories are responsible for Listeria monocytogenes, for Campylobacter, for the surveillance and control of viral and bacterial contamination of bivalve molluscs, for E. coli, for the performance of analysis and tests on zoonoses (Salmonella), and, from the group of parasitological zoonotic agents, the national reference laboratory for Trichinella.

  2. Laboratory Spectroscopy of Astrophysically-Relevant Materials: Developing Dust as a Diagnostic

    NASA Technical Reports Server (NTRS)

    Rinehart, Stephen A.

    2010-01-01

    Over forty years ago, observations in the new field of infrared astronomy showed a broad spectral feature at 10 microns; the feature was quickly associated with the presence of silicate-rich dust. Since that time, improvements in infrared astronomy have led to the discovery of a plethora of additional spectral features attributable to dust. By combining these observations with spectroscopic data acquired in the laboratory, astronomers have a diagnostic tool that can be used to explore underlying astronomical phenomena. As the laboratory data improves, so does our ability to interpret the astronomical observations. Here, we discuss some recent progress in laboratory spectroscopy and attempt to identify future research directions.

  3. Introduction to Nucleonics: A Laboratory Course. Teacher's Guide.

    ERIC Educational Resources Information Center

    Phelps, William; And Others

    This collection of laboratory lessons is designed primarily for the non-college bound high school student. It can be adapted, however, to a wide range of abilities. It begins with an examination of the properties of nuclear radiation, develops an understanding of the fundamentals of nucleonics, and ends with an investigation of careers in areas…

  4. Understanding variations in secondary findings reporting practices across U.S. genome sequencing laboratories.

    PubMed

    Ackerman, Sara L; Koenig, Barbara A

    2018-01-01

    Increasingly used for clinical purposes, genome and exome sequencing can generate clinically relevant information that is not directly related to the reason for testing (incidental or secondary findings). Debates about the ethical implications of secondary findings were sparked by the American College of Medical Genetics (ACMG) 2013 policy statement, which recommended that laboratories report pathogenic alterations in 56 genes. Although wide variation in laboratories' secondary findings policies has been reported, little is known about its causes. We interviewed 18 laboratory directors and genetic counselors at 10 U.S. laboratories to investigate the motivations and interests shaping secondary findings reporting policies for clinical exome sequencing. Analysis of interview transcripts and laboratory documents was informed by sociological theories of standardization. Laboratories varied widely in terms of the types of secondary findings reported, consent-form language, and choices offered to patients. In explaining their adaptation of the ACMG report, our participants weighed genetic information's clinical, moral, professional, and commercial value in an attempt to maximize benefits for patients and families, minimize the costs of sequencing and analysis, adhere to professional norms, attract customers, and contend with the uncertain clinical implications of much of the genetic information generated. Nearly all laboratories in our study voluntarily adopted ACMG's recommendations, but their actual practices varied considerably and were informed by laboratory-specific judgments about clinical utility and patient benefit. Our findings offer a compelling example of standardization as a complex process that rarely leads simply to uniformity of practice. 
As laboratories take on a more prominent role in decisions about the return of genetic information, strategies are needed to inform patients, families, and clinicians about the differences between laboratories' practices

  5. Laboratory prototype flash evaporator

    NASA Technical Reports Server (NTRS)

    Gaddis, J. L.

    1972-01-01

    A laboratory prototype flash evaporator that is being developed as a candidate for the space shuttle environmental control system expendable heat sink is described. The single evaporator configuration uses water as an evaporant to accommodate reentry and on-orbit peak heat loads, and Freon 22 for terrestrial flight phases below 120,000 feet altitude. The design features, fabrication techniques used for the prototype unit, redundancy considerations, and the fluid temperature control arrangement are reported in detail. The results of an extensive test program to determine the evaporator operational characteristics under a wide variety of conditions are presented.

  6. A wide-band high-resolution spectrum analyzer

    NASA Technical Reports Server (NTRS)

    Quirk, Maureen P.; Garyantes, Michael F.; Wilck, Helmut C.; Grimm, Michael J.

    1988-01-01

    A two-million-channel, 40 MHz bandwidth, digital spectrum analyzer under development at the Jet Propulsion Laboratory is described. The analyzer system will serve as a prototype processor for the sky survey portion of NASA's Search for Extraterrestrial Intelligence program and for other applications in the Deep Space Network. The analyzer digitizes an analog input, performs a 2^21-point discrete Fourier transform, accumulates the output power, normalizes the output to remove frequency-dependent gain, and automates simple detection algorithms. Due to its built-in frequency-domain processing functions and configuration flexibility, the analyzer is a very powerful tool for real-time signal analysis.
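    The analyzer's processing chain (digitize, transform, accumulate power, normalize gain) can be reproduced in miniature on synthetic data. The sketch below uses a much smaller DFT than the real system's 2^21 points, a flat stand-in for the measured passband gain, and a 5 MHz test tone; all of these are illustrative assumptions.

```python
import numpy as np

N = 2048                      # DFT length (the real system uses 2^21 points)
fs = 40e6                     # 40 MHz sample rate
tone = 5e6                    # synthetic test-tone frequency
rng = np.random.default_rng(0)

acc = np.zeros(N // 2)
n_blocks = 32
for b in range(n_blocks):
    t = (np.arange(N) + b * N) / fs
    x = np.sin(2 * np.pi * tone * t) + 0.1 * rng.standard_normal(N)
    X = np.fft.rfft(x * np.hanning(N))[: N // 2]
    acc += np.abs(X) ** 2     # accumulate output power across blocks

gain = np.ones(N // 2)        # stand-in for measured frequency-dependent gain
spectrum = acc / (n_blocks * gain)   # normalized average power spectrum
peak_bin = int(np.argmax(spectrum))
peak_freq = peak_bin * fs / N        # should land near the 5 MHz tone
```

    A threshold test on `spectrum` against its noise floor is the simplest form of the automated detection step the abstract mentions.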

  7. A wide-band high-resolution spectrum analyzer.

    PubMed

    Quirk, M P; Garyantes, M F; Wilck, H C; Grimm, M J

    1988-12-01

    This paper describes a two-million-channel, 40-MHz-bandwidth, digital spectrum analyzer under development at the Jet Propulsion Laboratory. The analyzer system will serve as a prototype processor for the sky survey portion of NASA's Search for Extraterrestrial Intelligence program and for other applications in the Deep Space Network. The analyzer digitizes an analog input, performs a 2^21-point discrete Fourier transform, accumulates the output power, normalizes the output to remove frequency-dependent gain, and automates simple signal detection algorithms. Due to its built-in frequency-domain processing functions and configuration flexibility, the analyzer is a very powerful tool for real-time signal analysis and detection.

  8. Greenhouse gases from wastewater treatment - A review of modelling tools.

    PubMed

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state of the art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge of the processes related to N2O formation, especially by autotrophic biomass, is still incomplete. It also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a wide view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models have been demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs due to the considerable difficulties related to data availability and model complexity. For further improvement in plant-wide GHG modelling and to favour its use at full scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition, must be enhanced. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Facilitating Improvements in Laboratory Report Writing Skills with Less Grading: A Laboratory Report Peer-Review Process†

    PubMed Central

    Brigati, Jennifer R.; Swann, Jerilyn M.

    2015-01-01

    Incorporating peer-review steps in the laboratory report writing process provides benefits to students, but it also can create additional work for laboratory instructors. The laboratory report writing process described here allows the instructor to grade only one lab report for every two to four students, while giving the students the benefits of peer review and prompt feedback on their laboratory reports. Here we present the application of this process to a sophomore level genetics course and a freshman level cellular biology course, including information regarding class time spent on student preparation activities, instructor preparation, prerequisite student knowledge, suggested learning outcomes, procedure, materials, student instructions, faculty instructions, assessment tools, and sample data. T-tests comparing individual and group grading of the introductory cell biology lab reports yielded average scores that were not significantly different from each other (p = 0.13, n = 23 for individual grading, n = 6 for group grading). T-tests also demonstrated that average laboratory report grades of students using the peer-review process were not significantly different from those of students working alone (p = 0.98, n = 9 for individual grading, n = 6 for pair grading). While the grading process described here does not lead to statistically significant gains (or reductions) in student learning, it allows student learning to be maintained while decreasing instructor workload. This reduction in workload could allow the instructor time to pursue other high-impact practices that have been shown to increase student learning. Finally, we suggest possible modifications to the procedure for application in a variety of settings. PMID:25949758

  10. Facilitating improvements in laboratory report writing skills with less grading: a laboratory report peer-review process.

    PubMed

    Brigati, Jennifer R; Swann, Jerilyn M

    2015-05-01

    Incorporating peer-review steps in the laboratory report writing process provides benefits to students, but it also can create additional work for laboratory instructors. The laboratory report writing process described here allows the instructor to grade only one lab report for every two to four students, while giving the students the benefits of peer review and prompt feedback on their laboratory reports. Here we present the application of this process to a sophomore level genetics course and a freshman level cellular biology course, including information regarding class time spent on student preparation activities, instructor preparation, prerequisite student knowledge, suggested learning outcomes, procedure, materials, student instructions, faculty instructions, assessment tools, and sample data. T-tests comparing individual and group grading of the introductory cell biology lab reports yielded average scores that were not significantly different from each other (p = 0.13, n = 23 for individual grading, n = 6 for group grading). T-tests also demonstrated that average laboratory report grades of students using the peer-review process were not significantly different from those of students working alone (p = 0.98, n = 9 for individual grading, n = 6 for pair grading). While the grading process described here does not lead to statistically significant gains (or reductions) in student learning, it allows student learning to be maintained while decreasing instructor workload. This reduction in workload could allow the instructor time to pursue other high-impact practices that have been shown to increase student learning. Finally, we suggest possible modifications to the procedure for application in a variety of settings.
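    The statistical comparison reported above (independent two-sample t-tests with unequal group sizes) can be reproduced with standard tools. The score arrays below are fabricated placeholders matching only the group sizes from the abstract, not the study's actual data.

```python
import numpy as np
from scipy import stats

individual = np.array([82, 85, 78, 90, 88, 76, 84, 91, 79])  # n = 9 (hypothetical scores)
group = np.array([83, 86, 80, 89, 85, 84])                   # n = 6 (hypothetical scores)

# Welch's t-test does not assume equal variances, a reasonable default
# when comparing small groups of different sizes.
t, p = stats.ttest_ind(individual, group, equal_var=False)
# A p-value above 0.05 fails to reject equal means, mirroring the
# "not significantly different" result the study reports.
```

    With similar means and spreads, as here, the test supports the paper's conclusion that group grading maintains student outcomes while reducing instructor workload.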

  11. [Mechanism Causing Abnormal Laboratory Data--Significance of Electrophoresis and Information Transmission--Chairmen's Introductory Remarks].

    PubMed

    Maekawa, Masato; Fujita, Kiyotaka

    2014-11-01

    Abnormal laboratory data can arise from various forms of analyte modification as well as from patients' pathological conditions. Elucidating the causal mechanism is very important for clinical laboratories. This symposium was planned to highlight the significance of electrophoresis, which is one of the most important tools for providing clinicians with information for medical diagnosis and care.

  12. Battery Storage Evaluation Tool, version 1.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-02

    The battery storage evaluation tool developed at Pacific Northwest National Laboratory is used to run a one-year simulation to evaluate the benefits of battery storage for multiple grid applications, including energy arbitrage, balancing service, capacity value, distribution system equipment deferral, and outage mitigation. This tool is based on optimal control strategies that capture multiple services from a single energy storage device. In this control strategy, at each hour a look-ahead optimization is formulated and solved to determine the battery's base operating point. A minute-by-minute simulation is then performed to simulate the actual battery operation.
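
The hour-by-hour control loop described above can be sketched in miniature. The real tool solves a multi-service look-ahead optimization; here it is replaced by a simple price-arbitrage rule for illustration, and all prices and battery parameters are hypothetical:

```python
def hourly_base_point(prices, soc, capacity, rate):
    """Look ahead over the remaining price forecast and pick a base
    operating point: discharge when the current price is in the most
    expensive third of the horizon, charge when it is in the cheapest
    third, otherwise idle. Returns MW (+ charge, - discharge)."""
    ranked = sorted(prices)
    lo = ranked[len(prices) // 3]
    hi = ranked[2 * len(prices) // 3]
    p = prices[0]
    if p >= hi and soc > 0:
        return -min(rate, soc)            # discharge at high price
    if p <= lo and soc < capacity:
        return min(rate, capacity - soc)  # charge at low price
    return 0.0

def simulate(prices, capacity=4.0, rate=1.0):
    """Each hour, re-solve the look-ahead on the remaining forecast,
    then apply the base point to the state of charge (MWh)."""
    soc, revenue = capacity / 2, 0.0
    for t in range(len(prices)):
        mw = hourly_base_point(prices[t:], soc, capacity, rate)
        soc += mw
        revenue -= mw * prices[t]  # buying energy costs, selling earns
    return soc, revenue
```

The real control strategy co-optimizes several services at once; the one-service arbitrage rule above is only meant to show the two-level structure (hourly look-ahead, then step-by-step simulation).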

  13. Gene editing tools: state-of-the-art and the road ahead for the model and non-model fishes.

    PubMed

    Barman, Hirak Kumar; Rasal, Kiran Dashrath; Chakrapani, Vemulawada; Ninawe, A S; Vengayil, Doyil T; Asrafuzzaman, Syed; Sundaray, Jitendra K; Jayasankar, Pallipuram

    2017-10-01

    Advancements in DNA sequencing technologies and computational biology have revolutionized genome/transcriptome sequencing of non-model fishes at an affordable cost. This has led to a paradigm shift, deepening our understanding of the structure-function relationships of genes at a global level, from model animals/fishes to non-model large animals/fishes. Whole genome/transcriptome sequencing technologies were supplemented by a series of discoveries in gene editing tools, which are being used to modify genes at pre-determined positions using programmable nucleases to explore their respective in vivo functions. Although targeted gene disruption experiments were for a long time mostly restricted to embryonic stem cells, advances in gene editing technologies such as zinc finger nucleases, transcription activator-like effector nucleases and CRISPR (clustered regularly interspaced short palindromic repeats)/CRISPR-associated nucleases have facilitated targeted genetic modifications beyond stem cells to a wide range of somatic cell lines across species, from laboratory animals to farmed animals/fishes. In this review, we discuss the use of different gene editing tools and the strategic implications in fish species for basic and applied biology research.

  14. Chemometric tool for identification of iron-gall inks by use of visible-near infrared fibre optic reflection spectroscopy.

    PubMed

    Gál, Lukáš; Čeppan, Michal; Reháková, Milena; Dvonka, Vladimír; Tarajčáková, Jarmila; Hanus, Jozef

    2013-11-01

    A method has been developed for identification of corrosive iron-gall inks in historical drawings and documents. The method is based on target-factor analysis of visible-near infrared fibre optic reflection spectra (VIS-NIR FORS). A set of reference spectra was obtained from model samples of laboratory-prepared inks covering a wide range of mixing ratios of basic ink components deposited on substrates and artificially aged. As criteria for correspondence of a studied spectrum with a reference spectrum, the apparent error in target (AET) and the empirical function SPOIL according to Malinowski were used. The capability of the proposed tool to distinguish corrosive iron-gall inks from bistre and sepia inks was evaluated by use of a set of control samples of bistre, sepia, and iron-gall inks. Examples are presented of analysis of historical drawings from the 15th and 16th centuries and written documents from the 19th century. The results of analysis based on the tool were confirmed by XRF analysis and colorimetric spot analysis.
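
Target-factor testing of this kind can be illustrated schematically: project a candidate spectrum onto the leading abstract factors of the reference set and use the residual as the match criterion. This is a simplified stand-in, not Malinowski's exact AET or SPOIL formulation, and the spectra below are synthetic:

```python
import numpy as np

def apparent_target_error(reference_spectra, test_spectrum, n_factors=2):
    """Residual norm of a test spectrum after projection onto the
    leading principal factors (right singular vectors) of the
    reference spectra matrix. Small values indicate the test spectrum
    lies in the subspace spanned by the reference ink spectra."""
    refs = np.asarray(reference_spectra, dtype=float)
    _, _, Vt = np.linalg.svd(refs, full_matrices=False)
    basis = Vt[:n_factors]                      # abstract factors (rows)
    projected = basis.T @ (basis @ test_spectrum)
    return float(np.linalg.norm(test_spectrum - projected))
```

A spectrum that is a mixture of the reference components yields a near-zero error, while a spectrum with a component outside the reference subspace does not; the published method additionally weights this residual against the noise level of the factor model.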

  15. A user-friendly tool to transform large scale administrative data into wide table format using a MapReduce program with a Pig Latin based script.

    PubMed

    Horiguchi, Hiromasa; Yasunaga, Hideo; Hashimoto, Hideki; Ohe, Kazuhiko

    2012-12-22

    Secondary use of large scale administrative data is increasingly popular in health services and clinical research, where a user-friendly tool for data management is in great demand. MapReduce technology such as Hadoop is a promising tool for this purpose, though its use has been limited by the lack of user-friendly functions for transforming large scale data into wide table format, where each subject is represented by one row, for use in health services and clinical research. Since the original specification of Pig provides very few functions for column field management, we have developed a novel system called GroupFilterFormat to handle the definition of field and data content based on a Pig Latin script. We have also developed, as an open-source project, several user-defined functions to transform the table format using GroupFilterFormat and to deal with processing that considers date conditions. Having prepared dummy discharge summary data for 2.3 million inpatients and medical activity log data for 950 million events, we used the Elastic Compute Cloud environment provided by Amazon Inc. to execute processing speed and scaling benchmarks. In the speed benchmark test, the response time was significantly reduced and a linear relationship was observed between the quantity of data and processing time in both a small and a very large dataset. The scaling benchmark test showed clear scalability. In our system, doubling the number of nodes resulted in a 47% decrease in processing time. Our newly developed system is widely accessible as an open resource. This system is very simple and easy to use for researchers who are accustomed to using declarative command syntax for commercial statistical software and Structured Query Language. Although our system needs further sophistication to allow more flexibility in scripts and to improve efficiency in data processing, it shows promise in facilitating the application of MapReduce technology to efficient data processing with
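
The core long-to-wide transformation (one row per subject) can be illustrated in a few lines. This pure-Python sketch is only a conceptual stand-in for the Pig Latin / MapReduce implementation, and the field names are invented:

```python
def to_wide(long_rows):
    """Pivot (subject, field, value) triples — one event attribute per
    row, as in administrative claims data — into one record per subject,
    the wide-table format used for analysis."""
    wide = {}
    for subject, field, value in long_rows:
        wide.setdefault(subject, {})[field] = value
    return [{"subject_id": s, **fields} for s, fields in sorted(wide.items())]
```

In the actual system this grouping runs as a distributed MapReduce job (subjects are the shuffle keys), which is what makes the same operation scale to hundreds of millions of event rows.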

  16. ECOSYSTEM RESTORATION RESEARCH THROUGH THE NATIONAL RISK MANAGEMENT RESEARCH LABORATORY (NRMRL)

    EPA Science Inventory

    The Ecosystem Restoration Research Program underway through ORD's National Risk Management Research Laboratory (NRMRL) has the long-term goal of providing watershed managers with "..state-of-the-science field-evaluated tools, technical guidance, and decision-support systems for s...

  17. Multiple Solutions of Real-time Tsunami Forecasting Using Short-term Inundation Forecasting for Tsunamis Tool

    NASA Astrophysics Data System (ADS)

    Gica, E.

    2016-12-01

    The Short-term Inundation Forecasting for Tsunamis (SIFT) tool, developed by NOAA Center for Tsunami Research (NCTR) at the Pacific Marine Environmental Laboratory (PMEL), is used in forecast operations at the Tsunami Warning Centers in Alaska and Hawaii. The SIFT tool relies on a pre-computed tsunami propagation database, real-time DART buoy data, and an inversion algorithm to define the tsunami source. The tsunami propagation database is composed of 50 × 100 km unit sources, simulated basin-wide for at least 24 hours. Different combinations of unit sources, DART buoys, and length of real-time DART buoy data can generate a wide range of results within the defined tsunami source. For an inexperienced SIFT user, the primary challenge is to determine which solution, among multiple solutions for a single tsunami event, would provide the best forecast in real time. This study investigates how the use of different tsunami sources affects simulated tsunamis at tide gauge locations. Using the tide gauge at Hilo, Hawaii, a total of 50 possible solutions for the 2011 Tohoku tsunami are considered. Maximum tsunami wave amplitude and root mean square error results are used to compare tide gauge data and the simulated tsunami time series. Results of this study will facilitate SIFT users' efforts to determine if the simulated tide gauge tsunami time series from a specific tsunami source solution would be within the range of possible solutions. This study will serve as the basis for investigating more historical tsunami events and tide gauge locations.
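
Defining a tsunami source from buoy data is, at its core, a linear inversion: the observed series is fit as a weighted sum of the pre-computed unit-source waveforms. A minimal least-squares sketch (synthetic waveforms, not the SIFT inversion algorithm itself), together with the root-mean-square-error comparison used in the study:

```python
import numpy as np

def invert_source(unit_waveforms, observed):
    """Solve for coefficients a_k so that sum_k a_k * unit_k best
    matches the observed buoy time series in the least-squares sense."""
    G = np.column_stack(unit_waveforms)          # one column per unit source
    coeffs, *_ = np.linalg.lstsq(G, observed, rcond=None)
    return coeffs

def rmse(predicted, observed):
    """Root mean square error between two time series."""
    diff = np.asarray(predicted) - np.asarray(observed)
    return float(np.sqrt(np.mean(diff ** 2)))
```

The operational inversion additionally constrains which unit sources may participate and how much real-time data is used, which is exactly why a single event admits many candidate solutions.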

  18. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...

  19. Cryogenic Pressure Calibrator for Wide Temperature Electronically Scanned (ESP) Pressure Modules

    NASA Technical Reports Server (NTRS)

    Faulcon, Nettie D.

    2001-01-01

    Electronically scanned pressure (ESP) modules have been developed that can operate in ambient and in cryogenic environments, particularly Langley's National Transonic Facility (NTF). Because they can operate directly in a cryogenic environment, their use eliminates many of the operational problems associated with using conventional modules at low temperatures. To ensure the accuracy of these new instruments, calibration was conducted in a laboratory simulating the environmental conditions of NTF. This paper discusses the calibration process by means of the simulation laboratory, the system inputs and outputs and the analysis of the calibration data. Calibration results of module M4, a wide temperature ESP module with 16 ports and a pressure range of +/- 4 psid are given.

  20. Laboratory information management system: an example of international cooperation in Namibia.

    PubMed

    Colangeli, Patrizia; Ferrilli, Monica; Quaranta, Fabrizio; Malizia, Elio; Mbulu, Rosa-Stella; Mukete, Esther; Iipumbu, Lukas; Kamhulu, Anna; Tjipura-Zaire, Georgina; Di Francesco, Cesare; Lelli, Rossella; Scacchia, Massimo

    2012-01-01

    The authors describe the project undertaken by the Istituto G. Caporale to provide a laboratory information management system (LIMS) to the Central Veterinary Laboratory (CVL) in Windhoek, Namibia. This robust laboratory management tool satisfies Namibia's information obligations under international quality standard ISO 17025:2005. The Laboratory Information Management System (LIMS) for Africa was designed to collect and manage all necessary information on samples, tests and test results. The system involves the entry of sample data on arrival, as required by Namibian sampling plans, the tracking of samples through the various sections of the CVL, the collection of test results, generation of test reports and monitoring of outbreaks through data interrogation functions, eliminating multiple registrations of the same data on paper records. It is a fundamental component of the Namibian veterinary information system.

  1. The Small Body Mapping Tool (SBMT) for Accessing, Visualizing, and Analyzing Spacecraft Data in Three Dimensions

    NASA Astrophysics Data System (ADS)

    Barnouin, O. S.; Ernst, C. M.; Daly, R. T.

    2018-04-01

    The free, publicly available Small Body Mapping Tool (SBMT) developed at the Johns Hopkins University Applied Physics Laboratory is a powerful, easy-to-use tool for accessing and analyzing data from small bodies.

  2. SkICAT: A cataloging and analysis tool for wide field imaging surveys

    NASA Technical Reports Server (NTRS)

    Weir, N.; Fayyad, U. M.; Djorgovski, S. G.; Roden, J.

    1992-01-01

    We describe an integrated system, SkICAT (Sky Image Cataloging and Analysis Tool), for the automated reduction and analysis of the Palomar Observatory-ST ScI Digitized Sky Survey. The Survey will consist of the complete digitization of the photographic Second Palomar Observatory Sky Survey (POSS-II) in three bands, comprising nearly three Terabytes of pixel data. SkICAT applies a combination of existing packages, including FOCAS for basic image detection and measurement and SAS for database management, as well as custom software, to the task of managing this wealth of data. One of the most novel aspects of the system is its method of object classification. Using state-of-the-art machine learning classification techniques (GID3* and O-BTree), we have developed a powerful method for automatically distinguishing point sources from non-point sources and artifacts, achieving comparably accurate discrimination a full magnitude fainter than in previous Schmidt plate surveys. The learning algorithms produce decision trees for classification by examining instances of objects classified by eye on both plate and higher quality CCD data. The same techniques will be applied to perform higher-level object classification (e.g., of galaxy morphology) in the near future. Another key feature of the system is the facility to integrate the catalogs from multiple plates (and portions thereof) to construct a single catalog of uniform calibration and quality down to the faintest limits of the survey. SkICAT also provides a variety of data analysis and exploration tools for the scientific utilization of the resulting catalogs. We include initial results of applying this system to measure the counts and distribution of galaxies in two bands down to Bj is approximately 21 mag over an approximate 70 square degree multi-plate field from POSS-II. SkICAT is constructed in a modular and general fashion and should be readily adaptable to other large-scale imaging surveys.
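
The decision-tree induction idea behind GID3* and O-BTree can be shown in its simplest one-node form: exhaustively choose the feature and threshold that best separate labeled training instances. The feature values below are invented stand-ins for image measurements such as a concentration index:

```python
def best_stump(X, y):
    """Train a one-node decision tree (a stump): scan every feature and
    every candidate threshold, keeping the split with the fewest
    misclassifications. Labels: 1 = point source, 0 = non-point source."""
    best = None
    for f in range(len(X[0])):
        for thresh in sorted({row[f] for row in X}):
            pred = [1 if row[f] > thresh else 0 for row in X]
            err = sum(p != t for p, t in zip(pred, y))
            if best is None or err < best[0]:
                best = (err, f, thresh)
    return best  # (training error, feature index, threshold)

def classify(stump, row):
    """Apply a trained stump to a new feature vector."""
    _, f, thresh = stump
    return 1 if row[f] > thresh else 0
```

GID3* and O-BTree grow full multi-level trees with information-theoretic split criteria; the stump above is just the base case of that recursion, shown to make the training-by-examples idea concrete.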

  3. New Caledonian crows attend to multiple functional properties of complex tools

    PubMed Central

    St Clair, James J. H.; Rutz, Christian

    2013-01-01

    The ability to attend to the functional properties of foraging tools should affect energy-intake rates, fitness components and ultimately the evolutionary dynamics of tool-related behaviour. New Caledonian crows Corvus moneduloides use three distinct tool types for extractive foraging: non-hooked stick tools, hooked stick tools and tools cut from the barbed edges of Pandanus spp. leaves. The latter two types exhibit clear functional polarity, because of (respectively) a single terminal, crow-manufactured hook and natural barbs running along one edge of the leaf strip; in each case, the ‘hooks’ can only aid prey capture if the tool is oriented correctly by the crow during deployment. A previous experimental study of New Caledonian crows found that subjects paid little attention to the barbs of supplied (wide) pandanus tools, resulting in non-functional tool orientation during foraging. This result is puzzling, given the presumed fitness benefits of consistently orienting tools functionally in the wild. We investigated whether the lack of discrimination with respect to (wide) pandanus tool orientation also applies to hooked stick tools. We experimentally provided subjects with naturalistic replica tools in a range of orientations and found that all subjects used these tools correctly, regardless of how they had been presented. In a companion experiment, we explored the extent to which normally co-occurring tool features (terminal hook, curvature of the tool shaft and stripped bark at the hooked end) inform tool-orientation decisions, by forcing birds to deploy ‘unnatural’ tools, which exhibited these traits at opposite ends. Our subjects attended to at least two of the three tool features, although, as expected, the location of the hook was of paramount importance. We discuss these results in the context of earlier research and propose avenues for future work. PMID:24101625

  4. New Caledonian crows attend to multiple functional properties of complex tools.

    PubMed

    St Clair, James J H; Rutz, Christian

    2013-11-19

    The ability to attend to the functional properties of foraging tools should affect energy-intake rates, fitness components and ultimately the evolutionary dynamics of tool-related behaviour. New Caledonian crows Corvus moneduloides use three distinct tool types for extractive foraging: non-hooked stick tools, hooked stick tools and tools cut from the barbed edges of Pandanus spp. leaves. The latter two types exhibit clear functional polarity, because of (respectively) a single terminal, crow-manufactured hook and natural barbs running along one edge of the leaf strip; in each case, the 'hooks' can only aid prey capture if the tool is oriented correctly by the crow during deployment. A previous experimental study of New Caledonian crows found that subjects paid little attention to the barbs of supplied (wide) pandanus tools, resulting in non-functional tool orientation during foraging. This result is puzzling, given the presumed fitness benefits of consistently orienting tools functionally in the wild. We investigated whether the lack of discrimination with respect to (wide) pandanus tool orientation also applies to hooked stick tools. We experimentally provided subjects with naturalistic replica tools in a range of orientations and found that all subjects used these tools correctly, regardless of how they had been presented. In a companion experiment, we explored the extent to which normally co-occurring tool features (terminal hook, curvature of the tool shaft and stripped bark at the hooked end) inform tool-orientation decisions, by forcing birds to deploy 'unnatural' tools, which exhibited these traits at opposite ends. Our subjects attended to at least two of the three tool features, although, as expected, the location of the hook was of paramount importance. We discuss these results in the context of earlier research and propose avenues for future work.

  5. Dust control effectiveness of drywall sanding tools.

    PubMed

    Young-Corbett, Deborah E; Nussbaum, Maury A

    2009-07-01

    In this laboratory study, four drywall sanding tools were evaluated in terms of dust generation rates in the respirable and thoracic size classes. In a repeated measures study design, 16 participants performed simulated drywall finishing tasks with each of four tools: (1) ventilated sander, (2) pole sander, (3) block sander, and (4) wet sponge. Dependent variables of interest were thoracic and respirable breathing zone dust concentrations. Analysis by Friedman's Test revealed that the ventilated drywall sanding tool produced significantly less dust, of both size classes, than did the other three tools. The pole and wet sanders produced significantly less dust of both size classes than did the block sander. The block sander, the most commonly used tool in drywall finishing operations, produced significantly more dust of both size classes than did the other three tools. When compared with the block sander, the other tools offer substantial dust reduction. The ventilated tool reduced respirable concentrations by 88% and thoracic concentrations by 85%. The pole sander reduced respirable concentrations by 58% and thoracic by 50%. The wet sander produced reductions of 60% and 47% in the respirable and thoracic classes, respectively. Wet sponge sanders and pole sanders are effective at reducing breathing-zone dust concentrations; however, based on its superior dust control effectiveness, the ventilated sander is the recommended tool for drywall finishing operations.

  6. [The role of reference laboratories in animal health programmes in South America].

    PubMed

    Bergmann, I E

    2003-08-01

    The contribution of the Panamerican Foot and Mouth Disease (FMD) Centre (PANAFTOSA), as an OIE (World organisation for animal health) regional reference laboratory for the diagnosis of FMD and vesicular stomatitis, and for the control of the FMD vaccine, has been of fundamental importance to the development, implementation and harmonisation of modern laboratory procedures in South America. The significance of the work conducted by PANAFTOSA is particularly obvious when one considers the two pillars on which eradication programmes are based, namely: a well-structured regional laboratory network, and the creation of a system which allows technology and new developments to be transferred to Member Countries as quickly and efficiently as possible. Over the past decade, PANAFTOSA has kept pace with the changing epidemiological situation on the continent, and with developments in the international political and economical situation. This has involved the strengthening of quality policies, and the elaboration and implementation of diagnostic tools that make for more thorough epidemiological analyses. The integration of PANAFTOSA into the network of national laboratories and its cooperation with technical and scientific institutes, universities and the private sector means that local needs can be met, thanks to the design and rapid implementation of methodological tools which are validated using internationally accepted criteria. This collaboration, which ensures harmonisation of laboratory tests and enhances the quality of national Veterinary Services, serves to promote greater equity, a prerequisite for regional eradication strategies and this in turn, helps to increase competitiveness in the region.

  7. Improving patient safety via automated laboratory-based adverse event grading.

    PubMed

    Niland, Joyce C; Stiller, Tracey; Neat, Jennifer; Londrc, Adina; Johnson, Dina; Pannoni, Susan

    2012-01-01

    The identification and grading of adverse events (AEs) during the conduct of clinical trials is a labor-intensive and error-prone process. This paper describes and evaluates a software tool developed by City of Hope to automate complex algorithms to assess laboratory results and identify and grade AEs. We compared AEs identified by the automated system with those previously assessed manually, to evaluate missed/misgraded AEs. We also conducted a prospective paired time assessment of automated versus manual AE assessment. We found a substantial improvement in accuracy/completeness with the automated grading tool, which identified an additional 17% of severe grade 3-4 AEs that had been missed/misgraded manually. The automated system also provided an average time saving of 5.5 min per treatment course. With 400 ongoing treatment trials at City of Hope and an average of 1800 laboratory results requiring assessment per study, the implications of these findings for patient safety are enormous.
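
Algorithmic grading of a laboratory result reduces to mapping the value into severity bands. A sketch for neutropenia with CTCAE-style cut-points (the lower limit of normal is an assumed placeholder, and this is not City of Hope's actual algorithm, which also handles baselines, units, and protocol-specific rules):

```python
def grade_neutropenia(anc, lln=2000):
    """Map an absolute neutrophil count (cells/microliter) to a
    CTCAE-style adverse event grade. Grade 0 means within normal
    limits; lln (lower limit of normal) is site-specific and the
    default of 2000 is an assumption for illustration."""
    if anc < 500:
        return 4
    if anc < 1000:
        return 3
    if anc < 1500:
        return 2
    if anc < lln:
        return 1
    return 0
```

Encoding bands like these once, and applying them uniformly to every incoming result, is what removes the per-course manual review that the paper found to be both slow and error-prone.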

  8. Flotation of Mineral and Dyes: A Laboratory Experiment for Separation Method Molecular Hitchhikers

    ERIC Educational Resources Information Center

    Rappon, Tim; Sylvestre, Jarrett A.; Rappon, Manit

    2016-01-01

    Flotation as a method of separation is widely researched and is applied in many industries. It has been used to address a wide range of environmental issues including treatment of wastewater, recovery of heavy metals for recycling, extraction of minerals in mining, and so forth. This laboratory attempts to show how such a simple method can be used…

  9. Laboratory, Field, and Analytical Procedures for Using ...

    EPA Pesticide Factsheets

    Regardless of the remedial technology invoked to address contaminated sediments in the environment, there is a critical need to have tools for assessing the effectiveness of the remedy. In the past, these tools have included chemical and biomonitoring of the water column and sediments, toxicity testing and bioaccumulation studies performed on site sediments, and application of partitioning, transport and fate modeling. All of these tools served as lines of evidence for making informed environmental management decisions at contaminated sediment sites. In the last ten years, a new tool for assessing remedial effectiveness has gained a great deal of attention. Passive sampling offers a tool capable of measuring the freely dissolved concentration (Cfree) of legacy contaminants in water and sediments. In addition to assessing the effectiveness of the remedy, passive sampling can be applied for a variety of other contaminated sediments site purposes involved with performing the preliminary assessment and site inspection, conducting the remedial investigation and feasibility study, preparing the remedial design, and assessing the potential for contaminant bioaccumulation. While there is a distinct need for using passive sampling at contaminated sediments sites and several previous documents and research articles have discussed various aspects of passive sampling, there has not been definitive guidance on the laboratory, field and analytical procedures for using pas

  10. Producing Production Level Tooling in Prototype Timing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mc Hugh, Kevin Matthew; Knirsch, J.

    A new rapid solidification process machine will be able to produce eight-inch diameter by six-inch thick finished cavities at the rate of one per hour - a rate that will change the tooling industry dramatically. Global Metal Technologies, Inc. (GMTI) (Solon, OH) has signed an exclusive license with the Idaho National Engineering and Environmental Laboratory (INEEL) (Idaho Falls, ID) for the development and commercialization of the rapid solidification process (RSP tooling). The first production machine is scheduled for delivery in July 2001. The RSP tooling process is a method of producing production level tooling in prototype timing. The process' inventor, Kevin McHugh, describes it as a rapid solidification method, which differentiates it from the standard spray forming methods. RSP itself is relatively straightforward. Molten metal is sprayed against the ceramic pattern, replicating the pattern's contours, surface texture and details. After spraying, the molten tool steel is cooled at room temperature and separated from the pattern. The irregular periphery of the freshly sprayed insert is squared off, either by machining or, in the case of harder tool steels, by wire EDM.

  11. External quality assurance performance of clinical research laboratories in sub-saharan Africa.

    PubMed

    Amukele, Timothy K; Michael, Kurt; Hanes, Mary; Miller, Robert E; Jackson, J Brooks

    2012-11-01

    Patient Safety Monitoring in International Laboratories (JHU-SMILE) is a resource at Johns Hopkins University that supports and monitors laboratories in National Institutes of Health-funded international clinical trials. To determine the impact of the JHU-SMILE quality assurance scheme in sub-Saharan African laboratories, we reviewed 40 to 60 months of College of American Pathologists (CAP) external quality assurance (EQA) results from these laboratories. We reviewed the performance of 8 analytes: albumin, alanine aminotransferase, creatinine, sodium, WBC, hemoglobin, hematocrit, and the human immunodeficiency virus antibody rapid test. Over the 40- to 60-month observation period, the sub-Saharan laboratories had a 1.63% failure rate, which was 40% lower than the 2011 CAP-wide rate of 2.8%. Seventy-six percent of the observed EQA failures occurred in 4 of the 21 laboratories. These results demonstrate that a system of remote monitoring, feedback, and audits can support quality in low-resource settings, even in places without strong regulatory support for laboratory quality.

  12. Trajectory Design Tools for Libration and Cis-Lunar Environments

    NASA Technical Reports Server (NTRS)

    Folta, David C.; Webster, Cassandra M.; Bosanac, Natasha; Cox, Andrew; Guzzetti, Davide; Howell, Kathleen C.

    2016-01-01

    Innovative trajectory design tools are required to support challenging multi-body regimes with complex dynamics, uncertain perturbations, and the integration of propulsion influences. Two distinctive tools, Adaptive Trajectory Design and the General Mission Analysis Tool have been developed and certified to provide the astrodynamics community with the ability to design multi-body trajectories. In this paper we discuss the multi-body design process and the capabilities of both tools. Demonstrable applications to confirmed missions, the Lunar IceCube Cubesat lunar mission and the Wide-Field Infrared Survey Telescope (WFIRST) Sun-Earth L2 mission, are presented.

  13. Developing Decontamination Tools and Approaches to ...

    EPA Pesticide Factsheets

    Developing Decontamination Tools and Approaches to Address Indoor Pesticide Contamination from Improper Bed Bug Treatments The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA's mission to protect human health and the environment. The HEASD research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between and characterize processes that link source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for EPA.

  14. Technical Capabilities of the National Vehicle and Fuel Emissions Laboratory (NVFEL)

    EPA Pesticide Factsheets

    National Vehicle and Fuel Emissions Laboratory (NVFEL) is a state-of-the-art test facility that conducts a wide range of emissions testing and analysis for EPA’s motor vehicle, heavy-duty engine, and nonroad engine programs.

  15. RATT: Rapid Annotation Transfer Tool

    PubMed Central

    Otto, Thomas D.; Dillon, Gary P.; Degrave, Wim S.; Berriman, Matthew

    2011-01-01

    Second-generation sequencing technologies have made large-scale sequencing projects commonplace. However, making use of these datasets often requires gene function to be ascribed genome wide. Although tool development has kept pace with the changes in sequence production, for tasks such as mapping, de novo assembly or visualization, genome annotation remains a challenge. We have developed a method to rapidly provide accurate annotation for new genomes using previously annotated genomes as a reference. The method, implemented in a tool called RATT (Rapid Annotation Transfer Tool), transfers annotations from a high-quality reference to a new genome on the basis of conserved synteny. We demonstrate that a Mycobacterium tuberculosis genome or a single 2.5 Mb chromosome from a malaria parasite can be annotated in less than five minutes with only modest computational resources. RATT is available at http://ratt.sourceforge.net. PMID:21306991
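
Synteny-based transfer ultimately comes down to shifting feature coordinates through colinear alignment blocks. A toy version of that coordinate arithmetic (no strand flips or within-block indels, unlike RATT itself; names and coordinates are invented):

```python
def transfer_annotations(features, blocks):
    """Map reference features (name, start, end) onto a new genome via
    conserved-synteny blocks (ref_start, new_start, length). A feature
    is transferred only if it falls entirely inside one block; features
    spanning block boundaries are dropped, as they would need the
    correction step a real transfer tool performs."""
    out = []
    for name, start, end in features:
        for ref_start, new_start, length in blocks:
            if ref_start <= start and end <= ref_start + length:
                shift = new_start - ref_start
                out.append((name, start + shift, end + shift))
                break
    return out
```

RATT derives its blocks from whole-genome alignment and then repairs transferred models (start/stop codons, frame), but the block-wise coordinate shift above is the essential first step.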

  16. Wide-area situation awareness in electric power grid

    NASA Astrophysics Data System (ADS)

    Greitzer, Frank L.

    2010-04-01

    Two primary elements of the US energy policy are demand management and efficiency and renewable sources. Major objectives are clean energy transmission and integration, reliable energy transmission, and grid cyber security. Development of the Smart Grid seeks to achieve these goals by lowering energy costs for consumers, achieving energy independence and reducing greenhouse gas emissions. The Smart Grid is expected to enable real time wide-area situation awareness (SA) for operators. Requirements for wide-area SA have been identified among interoperability standards proposed by the Federal Energy Regulatory Commission and the National Institute of Standards and Technology to ensure smart-grid functionality. Wide-area SA and enhanced decision support and visualization tools are key elements in the transformation to the Smart Grid. This paper discusses human factors research to promote SA in the electric power grid and the Smart Grid. Topics that will be discussed include the role of human factors in meeting US energy policy goals, the impact and challenges for Smart Grid development, and cyber security challenges.

  17. National laboratory-based surveillance system for antimicrobial resistance: a successful tool to support the control of antimicrobial resistance in the Netherlands.

    PubMed

    Altorf-van der Kuil, Wieke; Schoffelen, Annelot F; de Greeff, Sabine C; Thijsen, Steven FT; Alblas, H Jeroen; Notermans, Daan W; Vlek, Anne LM; van der Sande, Marianne AB; Leenstra, Tjalling

    2017-11-01

    An important cornerstone in the control of antimicrobial resistance (AMR) is a well-designed quantitative system for the surveillance of spread and temporal trends in AMR. Since 2008, the Dutch national AMR surveillance system, based on routine data from medical microbiological laboratories (MMLs), has developed into a successful tool to support the control of AMR in the Netherlands. It provides background information for policy making in public health and healthcare services, supports development of empirical antibiotic therapy guidelines and facilitates in-depth research. In addition, participation of the MMLs in the national AMR surveillance network has contributed to sharing of knowledge and quality improvement. A future improvement will be the implementation of a new semantic standard together with standardised data transfer, which will reduce errors in data handling and enable more real-time surveillance. Furthermore, the scientific impact and the possibility of detecting outbreaks may be amplified by merging the AMR surveillance database with databases from selected pathogen-based surveillance programmes containing patient data and genotypic typing data.
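    The quantitative core of such a surveillance system is the aggregation of routine susceptibility results into resistance proportions over time. The following is a minimal illustrative sketch of that computation; the function, field names and example data are hypothetical and do not represent the actual Dutch surveillance system or its data model.

    ```python
    # Illustrative sketch (not the actual national system): aggregating routine
    # susceptibility test results into annual resistance proportions, the basic
    # quantity behind temporal-trend AMR surveillance.
    from collections import defaultdict

    def resistance_trend(isolates):
        """isolates: iterable of (year, organism, antibiotic, result) tuples,
        where result is 'R' (resistant) or 'S' (susceptible).
        Returns {(year, organism, antibiotic): fraction of isolates resistant}."""
        counts = defaultdict(lambda: [0, 0])  # key -> [resistant, total]
        for year, organism, antibiotic, result in isolates:
            key = (year, organism, antibiotic)
            counts[key][1] += 1
            if result == "R":
                counts[key][0] += 1
        return {k: resistant / total for k, (resistant, total) in counts.items()}

    data = [
        (2016, "E. coli", "ciprofloxacin", "R"),
        (2016, "E. coli", "ciprofloxacin", "S"),
        (2017, "E. coli", "ciprofloxacin", "R"),
        (2017, "E. coli", "ciprofloxacin", "R"),
        (2017, "E. coli", "ciprofloxacin", "S"),
    ]
    print(resistance_trend(data))
    ```

    A real system must additionally deduplicate repeat isolates from the same patient and harmonise result coding across laboratories, which is exactly what the semantic standard and standardised data transfer mentioned in the abstract aim to support.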

  18. National laboratory-based surveillance system for antimicrobial resistance: a successful tool to support the control of antimicrobial resistance in the Netherlands

    PubMed Central

    Altorf-van der Kuil, Wieke; Schoffelen, Annelot F; de Greeff, Sabine C; Thijsen, Steven FT; Alblas, H Jeroen; Notermans, Daan W; Vlek, Anne LM; van der Sande, Marianne AB; Leenstra, Tjalling

    2017-01-01

    An important cornerstone in the control of antimicrobial resistance (AMR) is a well-designed quantitative system for the surveillance of spread and temporal trends in AMR. Since 2008, the Dutch national AMR surveillance system, based on routine data from medical microbiological laboratories (MMLs), has developed into a successful tool to support the control of AMR in the Netherlands. It provides background information for policy making in public health and healthcare services, supports development of empirical antibiotic therapy guidelines and facilitates in-depth research. In addition, participation of the MMLs in the national AMR surveillance network has contributed to sharing of knowledge and quality improvement. A future improvement will be the implementation of a new semantic standard together with standardised data transfer, which will reduce errors in data handling and enable more real-time surveillance. Furthermore, the scientific impact and the possibility of detecting outbreaks may be amplified by merging the AMR surveillance database with databases from selected pathogen-based surveillance programmes containing patient data and genotypic typing data. PMID:29162208

  19. Watershed Management Optimization Support Tool (WMOST) Webinar

    EPA Pesticide Factsheets

    EPA’s WMOST is a publicly available tool that can be used by state and local managers to screen a wide range of options for cost-effective management of water resources, and it supports a broader integrated watershed management approach.

  20. Lab Plays Central Role in Groundbreaking National Clinical Trial in Precision Medicine | Frederick National Laboratory for Cancer Research

    Cancer.gov

    The Molecular Characterization Laboratory at the Frederick National Laboratory for Cancer Research lies at the heart of an ambitious new approach for testing cancer drugs that will use the newest tools of precision medicine to select the best treatment.