Sample records for processing laboratory users

  1. A software for managing chemical processes in a multi-user laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camino, Fernando E.

    Here, we report software for logging chemical processes in a multi-user laboratory, which implements a workflow designed to reduce hazardous situations associated with the disposal of chemicals in incompatible waste containers. The software allows users to perform only those processes displayed in their list of authorized chemical processes and provides the location and label code of waste containers, among other useful information. The software has been used for six years in the cleanroom of the Center for Functional Nanomaterials at Brookhaven National Laboratory and has been an important factor in the excellent safety record of the Center.
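
    As a concrete illustration of the workflow this abstract describes, here is a minimal Python sketch of the two lookups involved: authorized processes per user, and waste container per process. All names, structures and values are hypothetical; the paper does not publish its code.

```python
# Minimal sketch of the workflow described above: users may run only
# authorized processes, and each process maps to a designated waste
# container. All identifiers and values here are hypothetical.

AUTHORIZED = {
    "fcamino": {"piranha_clean", "hf_etch"},   # user -> allowed process IDs
}
WASTE_CONTAINERS = {
    "piranha_clean": ("Bay 2, cabinet A", "W-ACID-07"),  # process -> (location, label)
    "hf_etch":       ("Bay 2, cabinet B", "W-HF-03"),
}

def start_process(user: str, process: str) -> str:
    """Refuse unauthorized processes; otherwise log and point to the right waste container."""
    if process not in AUTHORIZED.get(user, set()):
        raise PermissionError(f"{user} is not authorized for {process}")
    location, label = WASTE_CONTAINERS[process]
    return f"{process} logged for {user}; dispose of waste in container {label} at {location}"

print(start_process("fcamino", "hf_etch"))
```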

  2. A software for managing chemical processes in a multi-user laboratory

    DOE PAGES

    Camino, Fernando E.

    2016-10-26

    Here, we report software for logging chemical processes in a multi-user laboratory, which implements a workflow designed to reduce hazardous situations associated with the disposal of chemicals in incompatible waste containers. The software allows users to perform only those processes displayed in their list of authorized chemical processes and provides the location and label code of waste containers, among other useful information. The software has been used for six years in the cleanroom of the Center for Functional Nanomaterials at Brookhaven National Laboratory and has been an important factor in the excellent safety record of the Center.

  3. Multiple-User, Multitasking, Virtual-Memory Computer System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1993-01-01

    Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and analysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.

  4. The Virtual Geophysics Laboratory (VGL): Scientific Workflows Operating Across Organizations and Across Infrastructures

    NASA Astrophysics Data System (ADS)

    Cox, S. J.; Wyborn, L. A.; Fraser, R.; Rankine, T.; Woodcock, R.; Vote, J.; Evans, B.

    2012-12-01

    The Virtual Geophysics Laboratory (VGL) is a web portal that provides geoscientists with an integrated online environment that: seamlessly accesses geophysical and geoscience data services from the AuScope national geoscience information infrastructure; loosely couples these data to a variety of geoscience software tools; and provides large-scale processing facilities via cloud computing. VGL is a collaboration between CSIRO, Geoscience Australia, National Computational Infrastructure, Monash University, Australian National University and the University of Queensland. The VGL provides a distributed system whereby a user can enter an online virtual laboratory to seamlessly connect to OGC web services for geoscience data. The data is supplied in open-standard formats using international standards like GeoSciML. A VGL user uses a web mapping interface to discover the data sources and define a subset with spatial and attribute filters. Once the data is selected, the user is not required to download it. VGL collates the service query information for use later in the processing workflow, when it is staged directly to the computing facilities. The combination of deferred data download and access to cloud computing enables VGL users to access their data at higher resolutions and to undertake larger-scale inversions and more complex models and simulations than their own local computing facilities might allow. Inside the Virtual Geophysics Laboratory, the user has access to a library of existing models, complete with exemplar workflows for specific scientific problems based on those models. For example, the user can load a geological model published by Geoscience Australia, apply a basic deformation workflow provided by a CSIRO scientist, and have it run in a scientific code from Monash. Finally, the user can publish these results to share with a colleague or cite in a paper. This opens new opportunities for access and collaboration, as all the resources (models, code, data, processing) are shared in the one virtual laboratory. VGL provides end users with an intuitive, user-centered interface that leverages cloud storage and cloud and cluster processing from both research communities and commercial suppliers (e.g., Amazon). As the underlying data and information services are agnostic of the scientific domain, they can support many other data types. This fundamental characteristic results in a highly reusable virtual laboratory infrastructure that could also serve, for example, natural hazards, satellite processing, soil geochemistry, climate modeling, and agricultural crop modeling.
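
    The "deferred download" pattern described above can be sketched in a few lines: the portal collates an OGC service query instead of fetching the data, and ships the query with the job specification to the compute node. A minimal Python sketch, with illustrative URLs and parameter names (not VGL's actual job format):

```python
# Sketch of deferred data download: record the OGC service query rather
# than fetching the data; the query is resolved later on the compute node.
# URLs, layer names and the job-spec layout are illustrative only.
from urllib.parse import urlencode

def collate_wfs_query(base_url: str, layer: str, bbox: tuple) -> str:
    """Build a WFS GetFeature request URL to be staged with the cloud job."""
    params = {
        "service": "WFS", "version": "1.1.0", "request": "GetFeature",
        "typeName": layer, "bbox": ",".join(map(str, bbox)),
    }
    return f"{base_url}?{urlencode(params)}"

job_spec = {
    "inputs": [collate_wfs_query("https://example.org/wfs", "gsml:GeologicUnit",
                                 (130.0, -25.0, 135.0, -20.0))],
    "code": "escript_inversion",   # hypothetical solver name
}
# The job spec, not the data, is shipped to the cloud; the compute node
# performs the actual download at full resolution.
print(job_spec)
```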

  5. Integration Head Mounted Display Device and Hand Motion Gesture Device for Virtual Reality Laboratory

    NASA Astrophysics Data System (ADS)

    Rengganis, Y. A.; Safrodin, M.; Sukaridhoto, S.

    2018-01-01

    The Virtual Reality Laboratory (VR Lab) is an innovation on conventional learning media that presents the entire laboratory learning process. Users need many tools and materials for practical work, and through this innovation they can experience a new learning atmosphere. Technologies are now more sophisticated than before, and bringing them into education can make it more effective and efficient. Building the VR Lab requires supporting technologies such as a head-mounted display device and a hand-motion gesture device, and this research integrates the two. The head-mounted display device presents the 3D environment of the virtual reality laboratory, while the hand-motion gesture device captures the user's real hands and visualizes them in the virtual reality laboratory. The results suggest that using these new technologies in the learning process can make it more interesting and easier to understand.

  6. Materials and Nondestructive Evaluation Laboratories: User Test Planning Guide

    NASA Technical Reports Server (NTRS)

    Schaschl, Leslie

    2011-01-01

    The Materials and Nondestructive Evaluation Laboratory process, milestones and inputs are unknowns to first-time users. The Materials and Nondestructive Evaluation Laboratory Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware developers. It is intended to assist their project engineering personnel in materials analysis planning and execution. Material covered includes a roadmap of the analysis process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, products, and inputs necessary to define scope of analysis, cost, and schedule are included as an appendix to the guide.

  7. Certification & validation of biosafety level-2 & biosafety level-3 laboratories in Indian settings & common issues

    PubMed Central

    Mourya, Devendra T.; Yadav, Pragya D.; Khare, Ajay; Khan, Anwar H.

    2017-01-01

    With increasing awareness regarding biorisk management worldwide, many biosafety laboratories are being set up in India. It is important for the facility users, project managers and the executing agencies to understand the process of validation and certification of such biosafety laboratories. There are some international guidelines available, but there are no national guidelines or reference standards available in India on certification and validation of biosafety laboratories. There is no accredited government/private agency available in India to undertake validation and certification of biosafety laboratories. Therefore, the reliance is mostly on indigenous experience, talent and expertise available, which is in short supply. This article elucidates the process of certification and validation of biosafety laboratories in a concise manner for the understanding of the concerned users and suggests the important parameters and criteria that should be considered and addressed during the laboratory certification and validation process. PMID:29434059

  8. Certification & validation of biosafety level-2 & biosafety level-3 laboratories in Indian settings & common issues.

    PubMed

    Mourya, Devendra T; Yadav, Pragya D; Khare, Ajay; Khan, Anwar H

    2017-10-01

    With increasing awareness regarding biorisk management worldwide, many biosafety laboratories are being set up in India. It is important for the facility users, project managers and the executing agencies to understand the process of validation and certification of such biosafety laboratories. There are some international guidelines available, but there are no national guidelines or reference standards available in India on certification and validation of biosafety laboratories. There is no accredited government/private agency available in India to undertake validation and certification of biosafety laboratories. Therefore, the reliance is mostly on indigenous experience, talent and expertise available, which is in short supply. This article elucidates the process of certification and validation of biosafety laboratories in a concise manner for the understanding of the concerned users and suggests the important parameters and criteria that should be considered and addressed during the laboratory certification and validation process.

  9. Advanced Materials Laboratory User Test Planning Guide

    NASA Technical Reports Server (NTRS)

    Orndoff, Evelyne

    2012-01-01

    Test process, milestones and inputs are unknowns to first-time users of the Advanced Materials Laboratory. The User Test Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their test engineering personnel in test planning and execution. Material covered includes a roadmap of the test process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, test article interfaces, and inputs necessary to define test scope, cost, and schedule are included as an appendix to the guide.

  10. Planning for Space Station Freedom laboratory payload integration

    NASA Technical Reports Server (NTRS)

    Willenberg, Harvey J.; Torre, Larry P.

    1989-01-01

    Space Station Freedom is being developed to support extensive missions involving microgravity research and applications. Requirements for on-orbit payload integration and the simultaneous payload integration of multiple mission increments will provide the stimulus to develop new streamlined integration procedures in order to take advantage of the increased capabilities offered by Freedom. The United States Laboratory and its user accommodations are described. The process of integrating users' experiments and equipment into the United States Laboratory and the Pressurized Logistics Modules is described. This process includes the strategic and tactical phases of Space Station utilization planning. The support that the Work Package 01 Utilization office will provide to the users and hardware developers, in the form of Experiment Integration Engineers, early accommodation assessments, and physical integration of experiment equipment, is described. Plans for integrated payload analytical integration are also described.

  11. Flexible software architecture for user-interface and machine control in laboratory automation.

    PubMed

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.
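
    A minimal Python sketch of the layered client-server split this abstract describes: a machine-controller layer exposes named commands to a thin user-interface client over a request/response channel. Command names and subsystem classes are hypothetical, not the Acapella's actual interfaces.

```python
# Sketch of the layered architecture: a UI client sends requests to a
# machine controller, which validates them and routes them to hardware
# subsystems. All names are illustrative.
import json

class PumpSubsystem:
    def aspirate(self, volume_ul: float) -> str:
        return f"aspirated {volume_ul} uL"

class MachineController:
    """Middle layer: validates requests and routes them to hardware subsystems."""
    def __init__(self):
        self.pump = PumpSubsystem()
        self.commands = {"aspirate": lambda a: self.pump.aspirate(a["volume_ul"])}

    def handle(self, request_json: str) -> str:
        req = json.loads(request_json)
        result = self.commands[req["command"]](req["args"])
        return json.dumps({"status": "ok", "result": result})

# A UI client (the applet layer in the paper) would send something like:
print(MachineController().handle('{"command": "aspirate", "args": {"volume_ul": 50}}'))
```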

  12. A user's guide for the signal processing software for image and speech compression developed in the Communications and Signal Processing Laboratory (CSPL), version 1

    NASA Technical Reports Server (NTRS)

    Kumar, P.; Lin, F. Y.; Vaishampayan, V.; Farvardin, N.

    1986-01-01

    Complete documentation of the software developed in the Communication and Signal Processing Laboratory (CSPL) during the period July 1985 to March 1986 is provided. Utility programs and subroutines that were developed for a user-friendly image and speech processing environment are described. Additional programs for data compression of image- and speech-type signals are included. Also described are programs for zero-memory and block transform quantization in the presence of channel noise. Finally, several routines for simulating the performance of image compression algorithms are included.
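
    As a reminder of what zero-memory (scalar) quantization means, here is a generic textbook sketch of a uniform mid-rise quantizer in Python; it is not the CSPL implementation, whose details the guide documents separately.

```python
# Generic uniform mid-rise quantizer: "zero-memory" because each sample
# is quantized independently of all others. Parameters are illustrative.
def uniform_quantize(x: float, step: float, levels: int) -> float:
    """Quantize x to the nearest of `levels` uniformly spaced reconstruction values."""
    half_range = step * levels / 2
    x = max(-half_range, min(half_range - step, x))  # clip to quantizer range
    index = int((x + half_range) / step)             # per-sample decision, no memory
    return -half_range + (index + 0.5) * step        # mid-rise reconstruction level

print([round(uniform_quantize(v, step=0.25, levels=8), 3) for v in (-1.2, 0.1, 0.9)])
# -> [-0.875, 0.125, 0.875]
```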

  13. misr_view

    Atmospheric Science Data Center

    2018-03-21

    ... data files, misr_view, was developed by NASA's Jet Propulsion Laboratory. misr_view, which includes a User's Guide, is available ... Processing Applications and Development Section at the Jet Propulsion Laboratory. ...

  14. PMU Data Event Detection: A User Guide for Power Engineers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, A.; Singh, M.; Muljadi, E.

    2014-10-01

    This user guide is intended to accompany a software package containing a Matrix Laboratory (MATLAB) script and related functions for processing phasor measurement unit (PMU) data. This package and guide have been developed by the National Renewable Energy Laboratory and the University of Texas at Austin. The objective of this data processing exercise is to discover events in the vast quantities of data collected by PMUs. This document attempts to cover some of the theory behind processing the data to isolate events as well as the functioning of the MATLAB scripts. The report describes (1) the algorithms and mathematical background that the accompanying MATLAB codes use to detect events in PMU data and (2) the inputs required from the user and the outputs generated by the scripts.
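
    One common PMU event-detection approach is to flag samples where the rate of change of frequency (ROCOF) exceeds a threshold. A minimal Python sketch under that assumption; the guide's actual MATLAB algorithms and thresholds may differ.

```python
# Flag samples where the rate of change of frequency exceeds a limit.
# The 0.1 Hz/s threshold is illustrative, not the guide's value.
def detect_events(freq_hz: list[float], dt_s: float, rocof_limit: float = 0.1):
    """Return sample indices where |df/dt| exceeds rocof_limit (Hz/s)."""
    events = []
    for i in range(1, len(freq_hz)):
        rocof = (freq_hz[i] - freq_hz[i - 1]) / dt_s
        if abs(rocof) > rocof_limit:
            events.append(i)
    return events

# 30 samples/s PMU stream with a sudden frequency dip at sample 3:
print(detect_events([60.00, 60.00, 60.00, 59.95, 59.96], dt_s=1 / 30))  # -> [3, 4]
```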

  15. Controlling Laboratory Processes From A Personal Computer

    NASA Technical Reports Server (NTRS)

    Will, H.; Mackin, M. A.

    1991-01-01

    Computer program provides natural-language process control from IBM PC or compatible computer. Sets up process-control system that either runs without operator or is run by workers who have limited programming skills. Includes three smaller programs. Two of them, written in FORTRAN 77, record data and control research processes. Third program, written in Pascal, generates FORTRAN subroutines used by other two programs to identify user commands with device-driving routines written by user. Also includes set of input data allowing user to define user commands to be executed by computer. Requires personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. Also requires FORTRAN 77 compiler and device drivers written by user.

  16. A Multi-User Remote Academic Laboratory System

    ERIC Educational Resources Information Center

    Barrios, Arquimedes; Panche, Stifen; Duque, Mauricio; Grisales, Victor H.; Prieto, Flavio; Villa, Jose L.; Chevrel, Philippe; Canu, Michael

    2013-01-01

    This article describes the development, implementation and preliminary operation assessment of Multiuser Network Architecture to integrate a number of Remote Academic Laboratories for educational purposes on automatic control. Through the Internet, real processes or physical experiments conducted at the control engineering laboratories of four…

  17. Laboratory Astrophysics White Paper

    NASA Technical Reports Server (NTRS)

    Brickhouse, Nancy; Federman, Steve; Kwong, Victor; Salama, Farid; Savin, Daniel; Stancil, Phillip; Weingartner, Joe; Ziurys, Lucy

    2006-01-01

    Laboratory astrophysics and complementary theoretical calculations are the foundations of astronomical and planetary research and will remain so for many generations to come. From the level of scientific conception to that of the scientific return, it is our understanding of the underlying processes that allows us to address fundamental questions regarding the origins and evolution of galaxies, stars, planetary systems, and life in the cosmos. In this regard, laboratory astrophysics is much like detector and instrument development at NASA and NSF; these efforts are necessary for the astronomical research being funded by the agencies. The NASA Laboratory Astrophysics Workshop met at the University of Nevada, Las Vegas (UNLV) from 14-16 February 2006 to identify the current laboratory data needed to support existing and future NASA missions and programs in the Astrophysics Division of the Science Mission Directorate (SMD). Here we refer to both laboratory and theoretical work as laboratory astrophysics unless a distinction is necessary. The format for the Workshop involved invited talks by users of laboratory data, shorter contributed talks and poster presentations by both users and providers that highlighted exciting developments in laboratory astrophysics, and breakout sessions where users and providers discussed each other's needs and limitations. We also note that the members of the Scientific Organizing Committee are users as well as providers of laboratory data. As in previous workshops, the focus was on atomic, molecular, and solid state physics.

  18. Selecting clinical quality indicators for laboratory medicine.

    PubMed

    Barth, Julian H

    2012-05-01

    Quality in laboratory medicine is often described as doing the right test at the right time for the right person. Laboratory processes currently operate under the oversight of an accreditation body, which gives confidence that the process is good. However, there are aspects of quality that are not measured by these processes. These are largely focused on ensuring that the most clinically appropriate test is performed and interpreted correctly. Clinical quality indicators were selected through a two-phase process. First, a series of focus groups of clinical scientists was held with the aim of developing a list of quality indicators. These were subsequently ranked in order by an expert panel of primary and secondary care physicians. The 10 top indicators included the communication of critical results, comprehensive education for all users and adequate quality assurance for point-of-care testing. Laboratories should ensure their tests are used to national standards, that they have clinical utility, are calibrated to national standards and have long-term stability for chronic disease management. Laboratories should keep error logs and demonstrate evidence of measures introduced to reduce the chance of similar future errors. Laboratories should make a formal scientific evaluation of analytical quality. This paper describes the process of selection of quality indicators for laboratory medicine that have been validated sequentially by deliverers and users of the service. They now need to be converted into measurable variables related to outcome and validated in practice.

  19. Open-source LIMS in Vietnam: The path toward sustainability and host country ownership.

    PubMed

    Landgraf, Kenneth M; Kakkar, Reshma; Meigs, Michelle; Jankauskas, Paul T; Phan, Thi Thu Huong; Nguyen, Viet Nga; Nguyen, Duy Thai; Duong, Thanh Tung; Nguyen, Thi Hoa; Bond, Kyle B

    2016-09-01

    The objectives of this case report are as follows: to describe the process of establishing a national laboratory information management system (LIMS) program for clinical and public health laboratories in Vietnam; to evaluate the outcomes and lessons learned; and to present a model for sustainability based on the program outcomes that could be applied to diverse laboratory programs. This case report comprises a review of program documentation and records, including planning and budgetary records of the donor, monthly reports from the implementer, direct observation, and ad-hoc field reports from technical advisors and governmental agencies. Additional data on program efficacy and user acceptance were collected from routine monitoring of laboratory policies and operational practices. LIMS software was implemented at 38 hospital, public health and HIV testing laboratories in Vietnam. This LIMS was accepted by users and program managers as a useful tool to support laboratory processes. Implementation cost per laboratory and average duration of deployment decreased over time, and project stakeholders initiated transition of financing (from the donor to local institutions) and of system maintenance functions (from the implementer to governmental and site-level staff). Collaboration between the implementer in Vietnam and the global LIMS user community was strongly established, and knowledge was successfully transferred to staff within Vietnam. Implementing an open-source LIMS with local development and support was a feasible approach towards establishing a sustainable laboratory informatics program that met the needs of health laboratories in Vietnam. Further effort to institutionalize IT support capacity within key government agencies is ongoing.

  20. The Subsurface Flow and Transport Laboratory: A New Department of Energy User's Facility for Intermediate-Scale Experimentation

    NASA Astrophysics Data System (ADS)

    Wietsma, T. W.; Oostrom, M.; Foster, N. S.

    2003-12-01

    Intermediate-scale experiments (ISEs) for flow and transport are a valuable tool for simulating subsurface features and conditions encountered in the field at government and private sites. ISEs offer the ability to study, under controlled laboratory conditions, complicated processes characteristic of mixed wastes and heterogeneous subsurface environments, in multiple dimensions and at different scales. ISEs may, therefore, result in major cost savings if employed prior to field studies. A distinct advantage of ISEs is that researchers can design physical and/or chemical heterogeneities in the porous media matrix that better approximate natural field conditions and therefore address research questions that contain the additional complexity of processes often encountered in the natural environment. A new Subsurface Flow and Transport Laboratory (SFTL) has been developed for ISE users in the Environmental Spectroscopy & Biogeochemistry Facility in the Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). The SFTL offers a variety of columns and flow cells, a new state-of-the-art dual-energy gamma system, a fully automated saturation-pressure apparatus, and analytical equipment for sample processing. The new facility, including qualified staff, is available for scientists interested in collaboration on conducting high-quality flow and transport experiments, including contaminant remediation. Close linkages exist between the SFTL and numerical modelers to aid in experimental design and interpretation. This presentation will discuss the facility and outline the procedures required to submit a proposal to use this unique facility for research purposes. The W. R. Wiley Environmental Molecular Sciences Laboratory, a national scientific user facility, is sponsored by the U.S. Department of Energy's Office of Biological and Environmental Research and located at Pacific Northwest National Laboratory.

  1. Environmental Response Laboratory Network (ERLN) WebEDR Quick Reference Guide

    EPA Pesticide Factsheets

    The Web Electronic Data Review (WebEDR) is a web-based system that performs automated data processing on laboratory-submitted Electronic Data Deliverables (EDDs). It enables users to perform technical audits on the data and to check it against Measurement Quality Objectives (MQOs).
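
    A minimal Python sketch of the kind of automated MQO screening described above, using a duplicate-precision check as an example. Field names and limits are hypothetical; WebEDR's actual EDD format and checks are not specified here.

```python
# Screen one result from a lab-submitted EDD against a Measurement
# Quality Objective. Analyte names and limits are illustrative.
MQOS = {"lead_ug_l": {"max_rpd_pct": 20.0}}  # analyte -> quality objectives

def check_duplicate_precision(analyte: str, result: float, duplicate: float) -> bool:
    """Pass if the relative percent difference of lab duplicates is within the MQO."""
    rpd = abs(result - duplicate) / ((result + duplicate) / 2) * 100
    return rpd <= MQOS[analyte]["max_rpd_pct"]

print(check_duplicate_precision("lead_ug_l", 10.0, 11.5))  # RPD ~14% -> True
```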

  2. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Khayat, Michael A.

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  3. LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER

    NASA Technical Reports Server (NTRS)

    Will, H.

    1994-01-01

    The complex environment of the typical research laboratory requires flexible process control. This program provides natural-language process control from an IBM PC or compatible machine. Process control schedules sometimes require frequent changes, even several times per day. These changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set which allows the user to define the user commands which are to be executed by the computer. To set the system up, the operator writes device-driver routines for all of the controlled devices. Once set up, this system requires only an input file containing natural-language command lines which tell the system what to do and when to do it. Custom commands for operating external research equipment and taking data from it can be run at any time of the day or night without an operator in attendance. This process control system requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. This program was developed in 1989 and has a memory requirement of about 62 Kbytes.
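
    The command-to-driver mapping this abstract describes can be sketched as a dispatch table. The original generates FORTRAN glue code from the user's command definitions; the following Python table is only an analogy, with hypothetical commands and drivers.

```python
# Map natural-language command lines to user-written device drivers.
# Commands and driver functions are illustrative stand-ins.
def set_furnace(temp_c: float) -> str:   # stand-in for a user-written driver
    return f"furnace set to {temp_c} C"

def read_gauge() -> str:
    return "pressure = 1.2e-6 torr"

COMMANDS = {"SET FURNACE": set_furnace, "READ GAUGE": read_gauge}

def run_line(line: str) -> str:
    """Match a command phrase at the start of the line; pass the rest as a numeric argument."""
    for phrase, driver in COMMANDS.items():
        if line.upper().startswith(phrase):
            arg = line[len(phrase):].strip()
            return driver(float(arg)) if arg else driver()
    raise ValueError(f"unknown command: {line}")

for cmd in ("SET FURNACE 450", "READ GAUGE"):
    print(run_line(cmd))
```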

  4. adLIMS: a customized open source software that allows bridging clinical and basic molecular research studies.

    PubMed

    Calabria, Andrea; Spinozzi, Giulio; Benedicenti, Fabrizio; Tenderini, Erika; Montini, Eugenio

    2015-01-01

    Many biological laboratories that deal with genomic samples are facing the problem of sample tracking, both for pure laboratory management and for efficiency. Our laboratory exploits PCR techniques and Next Generation Sequencing (NGS) methods to perform high-throughput integration site monitoring in different clinical trials and scientific projects. Because of the huge number of samples that we process every year, which result in hundreds of millions of sequencing reads, we need to standardize data management and tracking systems, building up a scalable and flexible structure with web-based interfaces, which is usually called a Laboratory Information Management System (LIMS). We started by collecting end-users' requirements, composed of desired functionalities of the system and Graphical User Interfaces (GUI), and then we evaluated available tools that could address our requirements, spanning from pure LIMS to Content Management Systems (CMS) up to enterprise information systems. Our analysis identified ADempiere ERP, an open source Enterprise Resource Planning system written in Java J2EE, as the best software, which also natively implements some highly desirable technological advances, such as the high usability and modularity that grant high use-case flexibility and software scalability for custom solutions. We extended and customized ADempiere ERP to fulfil LIMS requirements and we developed adLIMS. It has been validated by our end-users, verifying functionalities and GUIs through test cases for PCR samples and pre-sequencing data, and it is currently in use in our laboratories. adLIMS implements authorization and authentication policies, allowing multiple-user management and role definitions that enable specific permissions, operations and data views for each user. For example, adLIMS allows creating sample sheets from stored data using available export operations. This simplicity and process standardization may avoid manual errors and information backtracking, features that are not granted when tracking records in files or spreadsheets. adLIMS aims to combine sample tracking and data reporting features with higher accessibility and usability of GUIs, thus allowing time to be saved on repetitive laboratory tasks and reducing errors with respect to manual data collection methods. Moreover, adLIMS implements automated data entry, exploiting sample data multiplexing and parallel/transactional processing. adLIMS is natively extensible to cope with laboratory automation through platform-dependent API interfaces, and could be extended to genomic facilities due to the ERP functionalities.
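
    A minimal Python sketch of two of the adLIMS features described above: role-based permissions and exporting a sample sheet from stored data. Role names, operations and fields are hypothetical, not adLIMS's actual configuration.

```python
# Role-based access plus sample-sheet export from stored records.
# Roles, operations and sample fields are illustrative.
ROLES = {
    "technician": {"register_sample", "export_sample_sheet"},
    "guest":      {"view_samples"},
}

def authorize(user_role: str, operation: str) -> None:
    if operation not in ROLES.get(user_role, set()):
        raise PermissionError(f"role '{user_role}' may not perform '{operation}'")

def export_sample_sheet(samples: list[dict]) -> str:
    """Build a CSV sample sheet from stored records, avoiding manual re-entry."""
    header = "sample_id,barcode"
    rows = [f"{s['sample_id']},{s['barcode']}" for s in samples]
    return "\n".join([header] + rows)

authorize("technician", "export_sample_sheet")
print(export_sample_sheet([{"sample_id": "S001", "barcode": "BC123"}]))
```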

  5. MASTR-MS: a web-based collaborative laboratory information management system (LIMS) for metabolomics.

    PubMed

    Hunter, Adam; Dayalan, Saravanan; De Souza, David; Power, Brad; Lorrimar, Rodney; Szabo, Tamas; Nguyen, Thu; O'Callaghan, Sean; Hack, Jeremy; Pyke, James; Nahid, Amsha; Barrero, Roberto; Roessner, Ute; Likic, Vladimir; Tull, Dedreia; Bacic, Antony; McConville, Malcolm; Bellgard, Matthew

    2017-01-01

    An increasing number of research laboratories and core analytical facilities around the world are developing high throughput metabolomic analytical and data processing pipelines that are capable of handling hundreds to thousands of individual samples per year, often over multiple projects, collaborations and sample types. At present, there are no Laboratory Information Management Systems (LIMS) that are specifically tailored for metabolomics laboratories that are capable of tracking samples and associated metadata from the beginning to the end of an experiment, including data processing and archiving, and which are also suitable for use in large institutional core facilities or multi-laboratory consortia as well as single laboratory environments. Here we present MASTR-MS, a downloadable and installable LIMS solution that can be deployed either within a single laboratory or used to link workflows across a multisite network. It comprises a Node Management System that can be used to link and manage projects across one or multiple collaborating laboratories; a User Management System which defines different user groups and privileges of users; a Quote Management System where client quotes are managed; a Project Management System in which metadata is stored and all aspects of project management, including experimental setup, sample tracking and instrument analysis, are defined, and a Data Management System that allows the automatic capture and storage of raw and processed data from the analytical instruments to the LIMS. MASTR-MS is a comprehensive LIMS solution specifically designed for metabolomics. It captures the entire lifecycle of a sample starting from project and experiment design to sample analysis, data capture and storage. It acts as an electronic notebook, facilitating project management within a single laboratory or a multi-node collaborative environment. This software is being developed in close consultation with members of the metabolomics research community. It is freely available under the GNU GPL v3 licence and can be accessed from https://muccg.github.io/mastr-ms/.

  6. Virtual and remote robotic laboratory using EJS, MATLAB and LabVIEW.

    PubMed

    Chaos, Dictino; Chacón, Jesús; Lopez-Orozco, Jose Antonio; Dormido, Sebastián

    2013-02-21

    This paper describes the design and implementation of a virtual and remote laboratory based on Easy Java Simulations (EJS) and LabVIEW. The main application of this laboratory is to improve the study of sensors in Mobile Robotics, dealing with the problems that arise in real-world experiments. This laboratory allows users to work from home, tele-operating a real robot that takes measurements from its sensors in order to obtain a map of its environment. In addition, the application allows interaction with a robot simulation (virtual laboratory) or with a real robot (remote laboratory), through the same simple and intuitive graphical user interface in EJS. Thus, students can develop signal processing and control algorithms for the robot in simulation and then deploy them on the real robot for testing purposes. Practical examples of the application of the laboratory in the inter-University Master of Systems Engineering and Automatic Control are presented.
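
    The design choice that lets students move code from simulation to the real robot unchanged is a shared interface. A minimal Python sketch of that idea; class and method names are illustrative, not the EJS/LabVIEW API.

```python
# One controller, two back ends: simulation and real hardware expose the
# same interface, so student code runs unchanged on both. All names are
# illustrative.
from abc import ABC, abstractmethod

class Robot(ABC):
    @abstractmethod
    def read_sonar(self) -> list[float]: ...
    @abstractmethod
    def drive(self, v: float, w: float) -> None: ...

class SimulatedRobot(Robot):
    def read_sonar(self): return [2.0, 1.5, 0.8]   # canned readings
    def drive(self, v, w): print(f"sim: v={v} m/s, w={w} rad/s")

def avoid_obstacles(robot: Robot) -> None:
    """Controller written once; deployable against sim or real hardware."""
    if min(robot.read_sonar()) < 1.0:
        robot.drive(0.0, 0.5)   # stop and turn away from the obstacle
    else:
        robot.drive(0.3, 0.0)   # cruise straight

avoid_obstacles(SimulatedRobot())   # swap in a remote-robot class for the real lab
```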

  7. Virtual and Remote Robotic Laboratory Using EJS, MATLAB and LabVIEW

    PubMed Central

    Chaos, Dictino; Chacón, Jesús; Lopez-Orozco, Jose Antonio; Dormido, Sebastián

    2013-01-01

    This paper describes the design and implementation of a virtual and remote laboratory based on Easy Java Simulations (EJS) and LabVIEW. The main application of this laboratory is to improve the study of sensors in Mobile Robotics, dealing with the problems that arise in real-world experiments. This laboratory allows users to work from home, tele-operating a real robot that takes measurements from its sensors in order to obtain a map of its environment. In addition, the application allows interaction with a robot simulation (virtual laboratory) or with a real robot (remote laboratory), through the same simple and intuitive graphical user interface in EJS. Thus, students can develop signal processing and control algorithms for the robot in simulation and then deploy them on the real robot for testing purposes. Practical examples of the application of the laboratory in the inter-University Master of Systems Engineering and Automatic Control are presented. PMID:23429578

  8. User's Manual for BEST-Dairy: Benchmarking and Energy/water-Saving Tool (BEST) for the Dairy Processing Industry (Version 1.2)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, T.; Ke, J.; Sathaye, J.

    2011-04-20

    This User's Manual summarizes the background information of the Benchmarking and Energy/water-Saving Tool (BEST) for the Dairy Processing Industry (Version 1.2, 2011), including the 'Read Me' portion of the tool, the Introduction section, and the instructions for the BEST-Dairy tool that is developed and distributed by Lawrence Berkeley National Laboratory (LBNL).

  9. Reducing Missed Laboratory Results: Defining Temporal Responsibility, Generating User Interfaces for Test Process Tracking, and Retrospective Analyses to Identify Problems

    PubMed Central

    Tarkan, Sureyya; Plaisant, Catherine; Shneiderman, Ben; Hettinger, A. Zachary

    2011-01-01

    Researchers have conducted numerous case studies reporting the details on how laboratory test results of patients were missed by the ordering medical providers. Given the importance of timely test results in an outpatient setting, there is limited discussion of electronic versions of test result management tools to help clinicians and medical staff with this complex process. This paper presents three ideas to reduce missed results with a system that facilitates tracking laboratory tests from order to completion as well as during follow-up: (1) define a workflow management model that clarifies responsible agents and associated time frame, (2) generate a user interface for tracking that could eventually be integrated into current electronic health record (EHR) systems, (3) help identify common problems in past orders through retrospective analyses. PMID:22195201
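
    A minimal Python sketch of idea (1) above: each step of a laboratory-test order carries a responsible agent and a time frame, so overdue steps can be surfaced. The steps and durations are hypothetical.

```python
# Workflow model with a responsible agent and deadline per step, plus a
# query for overdue, uncompleted steps. Steps and durations are illustrative.
from datetime import datetime, timedelta

WORKFLOW = [  # (step, responsible agent, allowed time from ordering)
    ("order placed",     "provider",   timedelta(days=0)),
    ("specimen drawn",   "staff",      timedelta(days=2)),
    ("result returned",  "laboratory", timedelta(days=5)),
    ("patient notified", "provider",   timedelta(days=7)),
]

def overdue_steps(ordered_at: datetime, completed: set[str], now: datetime):
    """List steps past their deadline that no one has completed yet."""
    return [(step, agent) for step, agent, limit in WORKFLOW
            if step not in completed and now > ordered_at + limit]

placed = datetime(2024, 3, 1)
print(overdue_steps(placed, {"order placed", "specimen drawn"},
                    now=datetime(2024, 3, 10)))
# -> [('result returned', 'laboratory'), ('patient notified', 'provider')]
```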

  10. Pilot users in agile development processes: motivational factors.

    PubMed

    Johannessen, Liv Karen; Gammon, Deede

    2010-01-01

    Despite a wealth of research on user participation, few studies offer insights into how to involve multi-organizational users in agile development methods. This paper is a case study of user involvement in developing a system for electronic laboratory requisitions using agile methodologies in a multi-organizational context. Building on an interpretive approach, we illuminate questions such as: How does collaboration between users and developers evolve and how might it be improved? What key motivational aspects are at play when users volunteer and continue contributing in the face of considerable added burdens? The study highlights how agile methods in themselves appear to facilitate mutually motivating collaboration between user groups and developers. Lessons learned for leveraging the advantages of agile development processes include acknowledging the substantial and ongoing contributions of users and their roles as co-designers of the system.

  11. Audio Development Laboratory (ADL) User Test Planning Guide

    NASA Technical Reports Server (NTRS)

    Romero, Andy

    2012-01-01

    Test process, milestones and inputs are unknowns to first-time users of the ADL. The User Test Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their test engineering personnel in test planning and execution. Material covered includes a roadmap of the test process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, test article interfaces, and inputs necessary to define test scope, cost, and schedule are included as an appendix to the guide.

  12. Electronic Systems Test Laboratory (ESTL) User Test Planning Guide

    NASA Technical Reports Server (NTRS)

    Robinson, Neil

    2011-01-01

    Test process, milestones and inputs are unknowns to first-time users of the ESTL. The User Test Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their test engineering personnel in test planning and execution. Material covered includes a roadmap of the test process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, test article interfaces, and inputs necessary to define test scope, cost, and schedule are included as an appendix to the guide.

  13. Structures Test Laboratory (STL). User Test Planning Guide

    NASA Technical Reports Server (NTRS)

    Zipay, John J.

    2011-01-01

    Test process, milestones and inputs are unknowns to first-time users of the STL. The User Test Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their test engineering personnel in test planning and execution. Material covered includes a roadmap of the test process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, test article interfaces, and inputs necessary to define test scope, cost, and schedule are included as an appendix to the guide.

  14. From the Research Laboratory to the Operating Company: How Information Travels.

    ERIC Educational Resources Information Center

    Coppin, Ann S.; Palmer, Linda L.

    1980-01-01

    Reviews transmission processes of Chevron Oil Field Research Company (COFRC) research results from laboratories to end-user operating companies worldwide. Information dissemination methods described included informal communication, intercompany meetings, visits by COFRC personnel to operating company offices, distribution of written reports,…

  15. Virtual Earth System Laboratory (VESL): Effective Visualization of Earth System Data and Process Simulations

    NASA Astrophysics Data System (ADS)

    Quinn, J. D.; Larour, E. Y.; Cheng, D. L. C.; Halkides, D. J.

    2016-12-01

    The Virtual Earth System Laboratory (VESL) is a Web-based tool, under development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. It contains features geared toward a range of applications, spanning research and outreach. It offers an intuitive user interface, in which model inputs are changed using sliders and other interactive components. Current capabilities include simulation of polar ice sheet responses to climate forcing, based on NASA's Ice Sheet System Model (ISSM). We believe that the visualization of data is most effective when tailored to the target audience, and that many of the best practices for modern Web design/development can be applied directly to the visualization of data: use of negative space, color schemes, typography, accessibility standards, tooltips, et cetera. We present our prototype website, and invite input from potential users, including researchers, educators, and students.

  16. Preparation for microgravity: The role of the microgravity materials science laboratory

    NASA Technical Reports Server (NTRS)

    Johnston, J. Christopher; Rosenthal, Bruce N.; Meyer, Maryjo B.; Glasgow, Thomas K.

    1988-01-01

    A laboratory dedicated to ground-based materials processing in preparation for space flight was established at the NASA Lewis Research Center. Experiments are performed to delineate the effects of gravity on processes of both scientific and commercial interest. Processes are modeled physically and mathematically. Transport model systems are used where possible to visually track convection, settling, crystal growth, phase separation, agglomeration, vapor transport, diffusive flow, and polymer reactions. The laboratory contains apparatus which functionally duplicates apparatus available for flight experiments and other pieces instrumented specifically to allow process characterization. Materials addressed include metals, alloys, salts, glasses, ceramics, and polymers. The Microgravity Materials Science Laboratory is staffed by engineers and technicians from a variety of disciplines and is open to users from industry and academia as well as the government. Examples are given of the laboratory apparatus, typical experiments, and results.

  17. DHM and serious games: a case-study oil and gas laboratories.

    PubMed

    Santos, V; Zamberlan, M; Streit, P; Oliveira, J; Guimarães, C; Pastura, F; Cid, G

    2012-01-01

    The aim of this paper is to present research on the application of serious games to the design of laboratories in the oil and gas industries. The focus is on virtual human representation acquired from 3D scanning, human interaction, workspace layout, and equipment designed to ergonomics standards. The laboratory studies were simulated in the Unity3D platform, which allows users to control the DHM (digital human model) in the dynamic virtual scenario in order to simulate work activities. This methodology can change the design process by improving the level of interaction between final users, managers and human factors teams. That helps to better visualize future work settings and improves the level of participation of all stakeholders.

  18. MOtoNMS: A MATLAB toolbox to process motion data for neuromusculoskeletal modeling and simulation.

    PubMed

    Mantoan, Alice; Pizzolato, Claudio; Sartori, Massimo; Sawacha, Zimi; Cobelli, Claudio; Reggiani, Monica

    2015-01-01

    Neuromusculoskeletal modeling and simulation enable investigation of the neuromusculoskeletal system and its role in human movement dynamics. These methods are progressively being introduced into daily clinical practice. However, a major factor limiting this translation is the lack of robust tools for the pre-processing of experimental movement data for use in neuromusculoskeletal modeling software. This paper presents MOtoNMS (matlab MOtion data elaboration TOolbox for NeuroMusculoSkeletal applications), a toolbox freely available to the community that aims to fill this gap. MOtoNMS processes experimental data from different motion analysis devices and generates input data for neuromusculoskeletal modeling and simulation software, such as OpenSim and CEINMS (Calibrated EMG-Informed NMS Modelling Toolbox). MOtoNMS implements commonly required processing steps, and its generic architecture simplifies the integration of new user-defined processing components. MOtoNMS allows users to set up their laboratory configurations and processing procedures through user-friendly graphical interfaces, without requiring advanced computer skills. Finally, configuration choices can be stored, enabling full reproduction of the processing steps. MOtoNMS is released under the GNU General Public License and is available at the SimTK website and from the GitHub repository. Motion data collected at four institutions demonstrate that, despite differences in laboratory instrumentation and procedures, MOtoNMS succeeds in processing data and producing consistent inputs for OpenSim and CEINMS. MOtoNMS fills the gap between motion analysis and neuromusculoskeletal modeling and simulation. Its support for several devices, its complete implementation of the pre-processing procedures, its simple extensibility, its available user interfaces, and its free availability can boost the translation of neuromusculoskeletal methods into daily clinical practice.
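
    A minimal Python sketch of the configuration-driven idea described above: a stored laboratory configuration drives the processing and makes it reproducible across sessions. The keys and filter settings are illustrative, not MOtoNMS's actual schema.

```python
# A stored lab configuration drives the per-trial processing; persisting
# it is what makes the pipeline reproducible. Keys and values are
# illustrative placeholders.
import json

config = {
    "lab": "GaitLabA",
    "marker_filter": {"type": "lowpass_butterworth", "cutoff_hz": 6, "order": 2},
    "output_format": "opensim_trc",
}

def process_trial(trial_file: str, cfg: dict) -> str:
    """Apply the stored settings to one motion-capture trial (placeholder steps)."""
    steps = [
        f"read {trial_file}",
        f"filter markers: {cfg['marker_filter']['type']} @ {cfg['marker_filter']['cutoff_hz']} Hz",
        f"write {cfg['output_format']}",
    ]
    return "\n".join(steps)

with open("lab_config.json", "w") as f:   # persist the configuration for reuse
    json.dump(config, f, indent=2)
print(process_trial("walking01.c3d", config))
```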

  19. Energy Systems Test Area (ESTA) Electrical Power Systems Test Operations: User Test Planning Guide

    NASA Technical Reports Server (NTRS)

    Salinas, Michael J.

    2012-01-01

    Test process, milestones and inputs are unknowns to first-time users of the ESTA Electrical Power Systems Test Laboratory. The User Test Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their test engineering personnel in test planning and execution. Material covered includes a roadmap of the test process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, test article interfaces, and inputs necessary to define test scope, cost, and schedule are included as an appendix to the guide.

  20. Students' perceptions of constructivist Internet learning environments by a physics virtual laboratory: the gap between ideal and reality and gender differences.

    PubMed

    Chuang, Shih-Chyueh; Hwang, Fu-Kwun; Tsai, Chin-Chung

    2008-04-01

    The purpose of this study was to investigate the perceptions of Internet users of a physics virtual laboratory, Demolab, in Taiwan. Learners' perceptions of Internet-based learning environments were explored and the role of gender was examined by using preferred and actual forms of a revised Constructivist Internet-based Learning Environment Survey (CILES). The students expressed a clear gap between ideal and reality, and they showed higher preferences for many features of constructivist Internet-based learning environments than for features they had actually learned in Demolab. The results further suggested that male users prefer to be involved in the process of discussion and to show critical judgments. In addition, male users indicated they enjoyed the process of negotiation and discussion with others and were able to engage in reflective thoughts while learning in Demolab. In light of these findings, male users seemed to demonstrate better adaptability to the constructivist Internet-based learning approach than female users did. Although this study indicated certain differences between males and females in their responses to Internet-based learning environments, they also shared numerous similarities. A well-established constructivist Internet-based learning environment may encourage more female learners to participate in the science community.

  1. A Brave New Animal for a Brave New World

    PubMed Central

    Kirk, Robert G. W.

    2012-01-01

    In 1947 the Medical Research Council of Britain established the Laboratory Animals Bureau in order to develop national standards of animal production that would enable commercial producers better to provide for the needs of laboratory animal users. Under the directorship of William Lane-Petter, the bureau expanded well beyond this remit, pioneering a new discipline of “laboratory animal science” and becoming internationally known as a producer of pathogenically and genetically standardized laboratory animals. The work of this organization, later renamed the Laboratory Animals Centre, and of Lane-Petter did much to systematize worldwide standards for laboratory animal production and provision—for example, by prompting the formation of the International Committee on Laboratory Animals. This essay reconstructs how the bureau became an internationally recognized center of expertise and argues that standardization discourses within science are inherently internationalizing. It traces the dynamic co-constitution of standard laboratory animals alongside that of the identities of the users, producers, and regulators of laboratory animals. This process is shown to have brought into being a transnational community with shared conceptual understandings and material practices grounded in the materiality of the laboratory animal, conceived as an instrumental technology. PMID:20575490

  2. Adapting Raman Spectra from Laboratory Spectrometers to Portable Detection Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weatherall, James; Barber, Jeffrey B.; Brauer, Carolyn S.

    2013-02-01

    Raman spectral data collected with high-resolution laboratory spectrometers are processed into a format suitable for importing as a user library on a 1064 nm DeltaNu first-generation, field-deployable spectrometer prototype. The two laboratory systems used are a 1064 nm Bruker spectrometer and a 785 nm Kaiser spectrometer. The steps taken to compensate for device-dependent spectral resolution, wavenumber shifts between instruments, and wavenumber sensitivity variation are described.
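
    The three corrections named in the abstract can be sketched as resampling onto the portable unit's coarser grid, shifting the wavenumber axis, and dividing out a relative sensitivity curve. A minimal Python sketch with synthetic data; the report's actual procedure may differ.

```python
# Adapt a high-resolution lab spectrum for a coarser portable library:
# shift the wavenumber axis, resample, and remove relative sensitivity.
# All numbers below are synthetic illustrations.
import numpy as np

def adapt_spectrum(wn, intensity, target_wn, shift_cm1, sensitivity):
    """wn/intensity: lab spectrum; target_wn: portable grid; sensitivity: on target grid."""
    corrected_wn = wn + shift_cm1                               # instrument-to-instrument shift
    resampled = np.interp(target_wn, corrected_wn, intensity)   # match coarser resolution
    return resampled / sensitivity                              # remove wavenumber sensitivity

lab_wn = np.linspace(200, 2000, 3600)               # ~0.5 cm^-1 lab grid
lab_int = np.exp(-((lab_wn - 1001) ** 2) / 18)      # synthetic band near 1001 cm^-1
portable_wn = np.linspace(200, 2000, 450)           # ~4 cm^-1 portable grid
library_entry = adapt_spectrum(lab_wn, lab_int, portable_wn,
                               shift_cm1=-1.0, sensitivity=np.ones_like(portable_wn))
print(portable_wn[np.argmax(library_entry)])        # band lands near 1000 cm^-1
```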

  3. Factory-Calibrated Continuous Glucose Sensors: The Science Behind the Technology.

    PubMed

    Hoss, Udo; Budiman, Erwin Satrya

    2017-05-01

    The use of commercially available continuous glucose monitors for diabetes management requires sensor calibrations, which until recently were performed exclusively by the patient. A new development is the implementation of factory calibration for subcutaneous glucose sensors, which eliminates the need for user calibrations and the associated blood glucose tests. Factory calibration means that the calibration process is part of the sensor manufacturing process and is performed under controlled laboratory conditions. The ability to move from user calibration to factory calibration is based on several technical requirements related to sensor stability and the robustness of the sensor manufacturing process. The main advantages of factory calibration over conventional user calibration are: (a) more convenience for the user, since no more fingersticks are required for calibration, and (b) elimination of use errors related to the execution of the calibration process, which can lead to sensor inaccuracies. The FreeStyle Libre™ and FreeStyle Libre Pro™ flash continuous glucose monitoring systems are the first commercially available sensor systems using factory-calibrated sensors. For these sensor systems, no user calibrations are required throughout the sensor wear duration.
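
    A minimal Python sketch of what factory calibration means in practice: lot-specific constants determined during manufacturing are stored with the sensor and applied at run time, with no fitting against the wearer's fingerstick values. The linear model and all numbers are illustrative only.

```python
# Apply factory-determined calibration constants at run time instead of
# fitting them from user fingersticks. Model and values are illustrative.
FACTORY_CAL = {"lot": "A123", "slope_mgdl_per_na": 12.5, "intercept_mgdl": 4.0}

def current_to_glucose(raw_current_na: float, cal: dict = FACTORY_CAL) -> float:
    """Convert raw sensor current to glucose using lot-specific factory constants."""
    return cal["slope_mgdl_per_na"] * raw_current_na + cal["intercept_mgdl"]

print(current_to_glucose(8.2))  # -> 106.5 mg/dL, no user fingerstick required
```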

  4. Factory-Calibrated Continuous Glucose Sensors: The Science Behind the Technology

    PubMed Central

    Budiman, Erwin Satrya

    2017-01-01

    The use of commercially available continuous glucose monitors for diabetes management requires sensor calibrations, which until recently were performed exclusively by the patient. A new development is the implementation of factory calibration for subcutaneous glucose sensors, which eliminates the need for user calibrations and the associated blood glucose tests. Factory calibration means that the calibration process is part of the sensor manufacturing process and is performed under controlled laboratory conditions. The ability to move from user calibration to factory calibration is based on several technical requirements related to sensor stability and the robustness of the sensor manufacturing process. The main advantages of factory calibration over conventional user calibration are: (a) more convenience for the user, since no more fingersticks are required for calibration, and (b) elimination of use errors related to the execution of the calibration process, which can lead to sensor inaccuracies. The FreeStyle Libre™ and FreeStyle Libre Pro™ flash continuous glucose monitoring systems are the first commercially available sensor systems using factory-calibrated sensors. For these sensor systems, no user calibrations are required throughout the sensor wear duration. PMID:28541139

  5. A Prototype Lisp-Based Soft Real-Time Object-Oriented Graphical User Interface for Control System Development

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan; Wong, Edmond; Simon, Donald L.

    1994-01-01

    A prototype Lisp-based soft real-time object-oriented Graphical User Interface for control system development is presented. The Graphical User Interface executes alongside a test system in laboratory conditions to permit observation of the closed loop operation through animation, graphics, and text. Since it must perform interactive graphics while updating the screen in real time, techniques are discussed which allow quick, efficient data processing and animation. Examples from an implementation are included to demonstrate some typical functionalities which allow the user to follow the control system's operation.

  6. NG6: Integrated next generation sequencing storage and processing environment.

    PubMed

    Mariette, Jérôme; Escudié, Frédéric; Allias, Nicolas; Salin, Gérald; Noirot, Céline; Thomas, Sylvain; Klopp, Christophe

    2012-09-09

    Next generation sequencing platforms are now well established in sequencing centres and some laboratories. Upcoming smaller-scale machines such as the 454 Junior from Roche or the MiSeq from Illumina will increase the number of laboratories hosting a sequencer. In such a context, it is important to provide these teams with an easily manageable environment to store and process the produced reads. We describe a user-friendly information system able to manage large sets of sequencing data. It includes, on one hand, a workflow environment already containing pipelines adapted to different input formats (sff, fasta, fastq and qseq), different sequencers (Roche 454, Illumina HiSeq) and various analyses (quality control, assembly, alignment, diversity studies,…) and, on the other hand, a secured web site giving access to the results. The connected user will be able to download raw and processed data and browse through the analysis result statistics. The provided workflows can easily be modified or extended, and new ones can be added. Ergatis is used as the workflow building, running and monitoring system. The analyses can be run locally or in a cluster environment using Sun Grid Engine. NG6 is a complete information system designed to answer the needs of a sequencing platform. It provides a user-friendly interface to process, store and download high-throughput sequencing data.
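
    A minimal Python sketch of the format-based routing implied above: incoming run files are dispatched to a pipeline matching their format. Pipeline names are placeholders, not NG6's actual workflow identifiers.

```python
# Route a sequencing run to a pipeline based on its file format.
# Pipeline identifiers are hypothetical placeholders.
PIPELINES = {
    ".sff":   "roche454_qc_pipeline",
    ".fastq": "illumina_qc_pipeline",
    ".qseq":  "illumina_qseq_pipeline",
    ".fasta": "generic_qc_pipeline",
}

def route_run(filename: str) -> str:
    """Pick a processing pipeline from the file extension of a sequencing run."""
    for ext, pipeline in PIPELINES.items():
        if filename.endswith(ext):
            return pipeline
    raise ValueError(f"unsupported format: {filename}")

print(route_run("run42_R1.fastq"))  # -> illumina_qc_pipeline
```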

  7. End-User Evaluations of Semantic Web Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCool, Rob; Cowell, Andrew J.; Thurman, David A.

    Stanford University's Knowledge Systems Laboratory (KSL) is working in partnership with Battelle Memorial Institute and IBM Watson Research Center to develop a suite of technologies for information extraction, knowledge representation & reasoning, and human-information interaction, collectively entitled 'Knowledge Associates for Novel Intelligence' (KANI). We have developed an integrated analytic environment composed of a collection of analyst associates, software components that aid the user at different stages of the information analysis process. An important part of our participatory design process has been to ensure that our technologies and designs are tightly integrated with the needs and requirements of our end users. To this end, we perform a sequence of evaluations towards the end of the development process to ensure that the technologies are both functional and usable. This paper reports on that process.

  8. Pre-examination factors affecting molecular diagnostic test results and interpretation: A case-based approach.

    PubMed

    Payne, Deborah A; Baluchova, Katarina; Peoc'h, Katell H; van Schaik, Ron H N; Chan, K C Allen; Maekawa, Masato; Mamotte, Cyril; Russomando, Graciela; Rousseau, François; Ahmad-Nejad, Parviz

    2017-04-01

    Multiple organizations produce guidance documents that provide opportunities to harmonize quality practices for diagnostic testing. The International Organization for Standardization ISO 15189 standard addresses requirements for quality in management and technical aspects of the clinical laboratory. One technical aspect addresses the complexities of the pre-examination phase prior to diagnostic testing. The Committee for Molecular Diagnostics of the International Federation for Clinical Chemistry and Laboratory Medicine (also known as IFCC C-MD) conducted a survey of international molecular laboratories and determined ISO 15189 to be the most referenced guidance document. In this review, the IFCC C-MD provides case-based examples illustrating the value of select pre-examination processes as these processes relate to molecular diagnostic testing. Case-based examples in infectious disease, oncology, inherited disease and pharmacogenomics address the utility of: 1) providing information to patients and users, 2) designing requisition forms, 3) obtaining informed consent and 4) maintaining sample integrity prior to testing. The pre-examination phase requires extensive and consistent communication between the laboratory, the healthcare provider and the end user. The clinical vignettes presented in this paper illustrate the value of applying select ISO 15189 recommendations for the general laboratory to the more specialized area of Molecular Diagnostics. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Achieving continuous improvement in laboratory organization through performance measurements: a seven-year experience.

    PubMed

    Salinas, Maria; López-Garrigós, Maite; Gutiérrez, Mercedes; Lugo, Javier; Sirvent, Jose Vicente; Uris, Joaquin

    2010-01-01

    Laboratory performance can be measured using a set of model key performance indicators (KPIs). The design and implementation of KPIs are important issues. KPI results from 7 years are reported, and their implementation, monitoring, objectives, interventions, result reporting and delivery are analyzed. The KPIs of the entire laboratory process were obtained using Laboratory Information System (LIS) registers. These were collected automatically using a data warehouse application, spreadsheets and external quality program reports. Customer satisfaction was assessed using surveys. Nine model laboratory KPIs were proposed and measured. The results of some examples of KPIs used in our laboratory are reported. Corrective measures or the implementation of objectives led to improvements in the associated KPI results. Measurement of laboratory performance using KPIs and a data warehouse application that continuously collects registers and calculates KPIs confirmed the reliability of indicators, indicator acceptability and usability for users, and continuous process improvement.
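
    As an illustration of how such a KPI can be computed from LIS registers, here is a minimal sketch. The 60-minute turnaround-time target and the record layout are invented for the example, not taken from the paper.

    ```python
    # Minimal sketch of one model KPI of the kind described: laboratory
    # turnaround time (TAT) from LIS timestamps. Fields and the 60-minute
    # target are illustrative assumptions, not the paper's definitions.
    from datetime import datetime

    # LIS register rows: (sample received, result reported). Timestamps invented.
    records = [
        (datetime(2010, 3, 1, 8, 0),  datetime(2010, 3, 1, 8, 45)),
        (datetime(2010, 3, 1, 8, 10), datetime(2010, 3, 1, 9, 40)),
        (datetime(2010, 3, 1, 8, 20), datetime(2010, 3, 1, 9, 5)),
    ]

    tat_minutes = [(t_out - t_in).total_seconds() / 60 for t_in, t_out in records]
    within_target = sum(t <= 60 for t in tat_minutes) / len(tat_minutes)
    print(f"mean TAT: {sum(tat_minutes) / len(tat_minutes):.1f} min")
    print(f"KPI: {100 * within_target:.0f}% of results reported within 60 min")
    ```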

  10. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds †

    PubMed Central

    Frank, Jared A.; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464

  11. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds.

    PubMed

    Frank, Jared A; Brill, Anthony; Kapila, Vikram

    2016-08-20

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.
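
    The closed-loop control described in the two records above can be sketched schematically. The snippet below is not the authors' code: it shows a generic discrete PID controller of the kind a mounted smartphone could run against a toy first-order motor model; all gains and plant constants are illustrative.

    ```python
    # Schematic sketch of the closed-loop idea: a discrete PID controller
    # reading a sensed motor state each period and commanding an actuator.
    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Toy plant: first-order motor model, x' = -a*x + b*u (illustrative values).
    pid, x, dt = PID(kp=2.0, ki=0.5, kd=0.05, dt=0.01), 0.0, 0.01
    for _ in range(500):
        u = pid.update(setpoint=1.0, measurement=x)  # e.g. smartphone camera/IMU reading
        x += (-1.0 * x + 1.0 * u) * dt               # actuator applied to the test-bed
    print(f"state after 5 s: {x:.3f} (target 1.0)")
    ```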

  12. User-driven product data manager system design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-03-01

    With the infusion of information technologies into product development and production processes, effective management of product data is becoming essential to modern production enterprises. When an enterprise-wide Product Data Manager (PDM) is implemented, PDM designers must satisfy the requirements of individual users with different job functions and requirements, as well as the requirements of the enterprise as a whole. Concern must also be shown for the interrelationships between information, methods for retrieving archival information and integration of the PDM into the product development process. This paper describes a user-driven approach applied to PDM design for an agile manufacturing pilot project at Sandia National Laboratories that has been successful in achieving a much faster design-to-production process for a precision electromechanical surety device.

  13. Design and development of sustainable remediation process for mitigation of fluoride contamination in ground water and field application for domestic use.

    PubMed

    Gwala, Poonam; Andey, Subhash; Nagarnaik, Pranav; Ghosh, Sarika Pimpalkar; Pal, Prashant; Deshmukh, Prashant; Labhasetwar, Pawan

    2014-08-01

    A decentralised household chemo-defluoridation unit (CDU) was developed and designed based on a combination of coagulation and sorption processes. The chemo-defluoridation process was optimised to reduce the use of chemicals and increase acceptability among beneficiaries without affecting the palatability of the water. Chemical dose optimisation undertaken in the laboratory using the jar test revealed an optimum calcium salt to initial fluoride ratio of 60 for fluoride removal. Performance of the CDU was evaluated in the laboratory for removal efficiency, water quality parameters, filter bed cleaning cycle and desorption of fluoride. CDU evaluation in the laboratory with spiked water (5 mg/L) and field water (~4.2 mg/L) revealed a treated water fluoride concentration of less than 1 mg/L. Seventy-five CDUs were installed in households at Sakhara Village, Yavatmal District, in the Maharashtra State of India. Monthly monitoring of the CDUs for one year indicated reduction of the raw water fluoride concentration from around 4 mg/L to less than 1 mg/L. A post-implementation survey after regular consumption of the treated drinking water by the users for one year indicated user satisfaction and technological sustainability. Copyright © 2014 Elsevier B.V. All rights reserved.
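
    The reported optimum ratio supports a simple worked dose calculation. The sketch below assumes the ratio is a mass ratio of calcium salt to initial fluoride, which is our reading of the abstract; the batch volume and fluoride level are example inputs, and salt purity is ignored.

    ```python
    # Worked dose arithmetic based on the reported optimum: calcium salt to
    # initial fluoride mass ratio of 60. Inputs are illustrative only.
    def calcium_dose_mg(initial_fluoride_mg_per_l, batch_volume_l, ratio=60):
        """Mass of calcium salt (mg) for one batch of raw water."""
        return ratio * initial_fluoride_mg_per_l * batch_volume_l

    dose = calcium_dose_mg(initial_fluoride_mg_per_l=4.2, batch_volume_l=10)
    print(f"calcium salt required: {dose:.0f} mg for a 10 L batch")  # 2520 mg
    ```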

  14. Image processing and pattern recognition with CVIPtools MATLAB toolbox: automatic creation of masks for veterinary thermographic images

    NASA Astrophysics Data System (ADS)

    Mishra, Deependra K.; Umbaugh, Scott E.; Lama, Norsang; Dahal, Rohini; Marino, Dominic J.; Sackman, Joseph

    2016-09-01

    CVIPtools is a software package for the exploration of computer vision and image processing developed in the Computer Vision and Image Processing Laboratory at Southern Illinois University Edwardsville. CVIPtools is available in three variants - a) the CVIPtools Graphical User Interface, b) the CVIPtools C library and c) the CVIPtools MATLAB toolbox - which makes it accessible to a variety of different users. It offers students, faculty, researchers and any other user a free and easy way to explore computer vision and image processing techniques. Many functions have been implemented and are updated on a regular basis, and the library has reached a level of sophistication that makes it suitable for both educational and research purposes. In this paper, a detailed list of the functions available in the CVIPtools MATLAB toolbox is presented, along with how these functions can be used in image analysis and computer vision applications. The CVIPtools MATLAB toolbox allows the user to gain practical experience to better understand underlying theoretical problems in image processing and pattern recognition. As an example application, an algorithm for the automatic creation of masks for veterinary thermographic images is presented.
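
    As a generic illustration of the mask-creation step (not a call into the CVIPtools API), the sketch below segments a thermographic frame with an iterative global threshold (Ridler-Calvard style), the simplest building block such an algorithm could start from.

    ```python
    # Generic illustration (not CVIPtools API calls): automatic mask creation
    # for a thermographic image by iterative global thresholding.
    import numpy as np

    def make_mask(image: np.ndarray, n_iter: int = 20) -> np.ndarray:
        """Ridler-Calvard style threshold selection -> binary foreground mask."""
        t = image.mean()
        for _ in range(n_iter):
            fg, bg = image[image > t], image[image <= t]
            if fg.size == 0 or bg.size == 0:
                break
            t = 0.5 * (fg.mean() + bg.mean())  # midpoint of class means
        return image > t

    thermal = np.random.rand(240, 320) ** 2  # stand-in for a thermal frame
    mask = make_mask(thermal)
    print(mask.shape, mask.mean())           # fraction of pixels kept in the mask
    ```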

  15. User Guide: How to Use and Operate Virtual Reality Equipment in the Systems Assessment and Usability Laboratory (SAUL) for Conducting Demonstrations

    DTIC Science & Technology

    2017-08-01

    US Army Research Laboratory technical note ARL-TN-0839 (AUG 2017): User Guide: How to Use and Operate Virtual Reality Equipment in the Systems Assessment and Usability Laboratory (SAUL) for Conducting Demonstrations.

  16. Interfacing LabVIEW With Instrumentation for Electronic Failure Analysis and Beyond

    NASA Technical Reports Server (NTRS)

    Buchanan, Randy K.; Bryan, Coleman; Ludwig, Larry

    1996-01-01

    The Laboratory Virtual Instrumentation Engineering Workstation (LabVIEW) software is designed such that equipment and processes related to control systems can be operationally linked and controlled by the use of a computer. Various processes within the failure analysis laboratories of NASA's Kennedy Space Center (KSC) demonstrate the need for modernization and, in some cases, automation, using LabVIEW. An examination of procedures and practices within the Failure Analysis Laboratory resulted in the conclusion that some device was necessary to elevate potential users of LabVIEW to an operational level in minimum time. This paper outlines the process involved in creating a tutorial application to enable personnel to apply LabVIEW to their specific projects. Suggestions for furthering the extent to which LabVIEW is used are provided in the areas of data acquisition and process control.

  17. Remote Access Multi-Mission Processing and Analysis Ground Environment (RAMPAGE)

    NASA Technical Reports Server (NTRS)

    Lee, Y.; Specht, T.

    2000-01-01

    At the Jet Propulsion Laboratory (JPL), the goal of providing a wide variety of users with easy, simple access to mission engineering data through web-based standards has become achievable through the development of RAMPAGE.

  18. [Impact of Lean methodology to improve care processes and levels of satisfaction in patient care in a clinical laboratory].

    PubMed

    Morón-Castañeda, L H; Useche-Bernal, A; Morales-Reyes, O L; Mojica-Figueroa, I L; Palacios-Carlos, A; Ardila-Gómez, C E; Parra-Ardila, M V; Martínez-Nieto, O; Sarmiento-Echeverri, N; Rodríguez, C A; Alvarado-Heine, C; Isaza-Ruget, M A

    2015-01-01

    The application of the Lean methodology in health institutions is an effective tool to improve capacity and workflow, as well as to increase the level of satisfaction of patients and employees. The aim was to optimise the time of outpatient care in a clinical laboratory by implementing a methodology based on the organisation of operational procedures, in order to improve user satisfaction and reduce the number of complaints about delays in care. A quasi-experimental before-and-after study was conducted between October 2011 and September 2012. X-bar and S charts were used to observe the mean service times and standard deviation. User satisfaction was assessed using service questionnaires. A reduction of 17 minutes was observed in the time of patient care from arrival to leaving the laboratory, and a decrease of 60% in complaints about delays in care. Despite high staff turnover and a 38% increase in the number of patients seen, a culture of empowerment and continuous improvement was acquired, as well as greater efficiency and productivity in the care process, which was reflected in standards being maintained 12 months after implementation. Lean is a viable methodology for clinical laboratory procedures, improving their efficiency and effectiveness. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.

  19. Touchstone for success

    NASA Astrophysics Data System (ADS)

    Longdon, Norman; Dauphin, J.; Dunn, B. D.; Judd, M. D.; Levadou, F. G.; Zwaal, A.

    1992-04-01

    This booklet is addressed to the users of the Materials and Processes Laboratories of the European Space Research and Technology Centre (ESTEC). The revised edition updates the July 1988 edition, featuring the enhancement of existing laboratories and the establishment of a ceramics laboratory. Information on three ESTEC laboratories is presented, as well as a look into the future. The three laboratories are the Environmental Effects Laboratory, the Metallic Materials Laboratory, and the Non-metallic Laboratory. The booklet reports on space-environment effects: radiation (UV and particles), outgassing and contamination, charging-up and discharges, particulate contaminants, atomic oxygen, and debris/impacts. Applications of metallic materials to space hardware are covered in the areas of mechanical properties, corrosion/stress corrosion, fracture testing and interpretation, metallurgical processes and failure analysis. Particular applications of non-metallic materials to space hardware that are covered are advanced and reinforced polymers, advanced ceramics, thermal properties, manned ambiance, polymer processing, non-destructive tests (NDT), and failure analysis. Future emphasis will be on the measurement of thermo-optical properties for the Infrared Space Observatory (ISO) and other infrared telescopes, support of the Columbus program, Hermes-related problems such as 'warm' composites and 'hot' reinforced ceramics for thermal insulation, materials for extravehicular activity (EVA), and NDT.

  20. Shared-resource computing for small research labs.

    PubMed

    Ackerman, M J

    1982-04-01

    A real-time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off-the-shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11M multi-user real-time operating system. The cost effectiveness of the shared-resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.

  1. SHynergie: Development of a virtual project laboratory for monitoring hydraulic stimulations

    NASA Astrophysics Data System (ADS)

    Renner, Jörg; Friederich, Wolfgang; Meschke, Günther; Müller, Thomas; Steeb, Holger

    2016-04-01

    Hydraulic stimulations are the primary means of developing subsurface reservoirs with regard to the extent of fluid transport in them. The associated creation or conditioning of a system of hydraulic conduits involves a range of hydraulic and mechanical processes, but chemical reactions, such as dissolution and precipitation, may also affect the stimulation result on time scales as short as hours. In the light of the extent and complexity of these processes, the steering potential for the operator of a stimulation critically depends on the ability to integrate the maximum amount of site-specific information with profound process understanding and a large spectrum of experience. We report on the development of a virtual project laboratory for monitoring hydraulic stimulations within the project SHynergie (http://www.ruhr-uni-bochum.de/shynergie/). The laboratory is conceived as a preparatory and accompanying instrument rather than a post-processing one, ultimately accessible over a web repository to the persons responsible for a project. The virtual laboratory consists of a database, a toolbox, and a model-building environment. Entries in the database are of two categories. On the one hand, selected mineral and rock properties are provided from the literature. On the other hand, project-specific entries of any format can be made, which are assigned attributes regarding their use in the stimulation problem at hand. The toolbox is interactive and allows the user to perform calculations of effective properties and simulations of different types (e.g., wave propagation in a reservoir, hydraulic test). The model component is also hybrid. The laboratory provides a library of models reflecting a range of scenarios, but also allows the user to develop a site-specific model constituting the basis for simulations. The laboratory offers the option to use its components following the typical workflow of a stimulation project. The toolbox incorporates simulation instruments developed in the course of the SHynergie project that account for the experimental and modeling results of the various sub-projects.

  2. Are Experienced Hearing Aid Users Faster at Grasping the Meaning of a Sentence Than Inexperienced Users? An Eye-Tracking Study

    PubMed Central

    Kollmeier, Birger; Neher, Tobias

    2016-01-01

    This study assessed the effects of hearing aid (HA) experience on how quickly a participant can grasp the meaning of an acoustic sentence-in-noise stimulus presented together with two similar pictures that either correctly (target) or incorrectly (competitor) depict the meaning conveyed by the sentence. Using an eye tracker, the time taken by the participant to start fixating the target (the processing time) was measured for two levels of linguistic complexity (low vs. high) and three HA conditions: clinical linear amplification (National Acoustic Laboratories-Revised), single-microphone noise reduction with National Acoustic Laboratories-Revised, and linear amplification ensuring a sensation level of ≥ 15 dB up to at least 4 kHz for the speech material used here. Timed button presses to the target stimuli after the end of the sentences (offline reaction times) were also collected. Groups of experienced (eHA) and inexperienced (iHA) HA users matched in terms of age, hearing loss, and working memory capacity took part (N = 15 each). For the offline reaction times, no effects were found. In contrast, processing times increased with linguistic complexity. Furthermore, for all HA conditions, processing times were longer (poorer) for the iHA group than for the eHA group, despite comparable speech recognition performance. Taken together, these results indicate that processing times are more sensitive to speech processing-related factors than offline reaction times. Furthermore, they support the idea that HA experience positively impacts the ability to process noisy speech quickly, irrespective of the precise gain characteristics. PMID:27595793

  3. Resource allocation planning with international components

    NASA Technical Reports Server (NTRS)

    Burke, Gene; Durham, Ralph; Leppla, Frank; Porter, David

    1993-01-01

    Dumas, Briggs, Reid and Smith (1989) describe the need for identifying mutually acceptable methodologies for developing standard agreements for the exchange of tracking time or facility use among international components. One possible starting point is the current process used at the Jet Propulsion Laboratory (JPL) in planning the use of tracking resources. While there is a significant promise of better resource utilization by international cooperative agreements, there is a serious challenge to provide convenient user participation given the separate project and network locations. Coordination among users and facility providers will require a more decentralized communication process and a wider variety of automated planning tools to help users find potential exchanges. This paper provides a framework in which international cooperation in the utilization of ground based space communication systems can be facilitated.

  4. 9 CFR 130.17 - User fees for other veterinary diagnostic laboratory tests performed at NVSL (excluding FADDL) or...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 9 (Animals and Animal Products), User Fees, § 130.17: User fees for other veterinary diagnostic laboratory tests performed at NVSL (excluding FADDL) or at authorized sites. (a) User fees for veterinary diagnostic tests performed at the...

  5. 9 CFR 130.17 - User fees for other veterinary diagnostic laboratory tests performed at NVSL (excluding FADDL) or...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 9 (Animals and Animal Products), User Fees, § 130.17: User fees for other veterinary diagnostic laboratory tests performed at NVSL (excluding FADDL) or at authorized sites. (a) User fees for veterinary diagnostic tests performed at the...

  6. DB4US: A Decision Support System for Laboratory Information Management.

    PubMed

    Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-11-14

    Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of information related to laboratory quality indicators. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. The objective was to develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources.
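
    The precalculation idea, indicators computed ahead of time so the dashboard only reads ready-to-use values, can be sketched with the standard library. The schema below is invented for illustration and is not the DB4US data warehouse.

    ```python
    # Sketch of indicator precalculation (schema invented for illustration):
    # a scheduled process fills a summary table so the dashboard only ever
    # reads precomputed, ready-to-use values.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE results(dept TEXT, tat_min REAL);
    INSERT INTO results VALUES ('core', 42), ('core', 75), ('micro', 90);
    CREATE TABLE kpi_summary(dept TEXT, mean_tat REAL, n INTEGER);
    """)

    # A batch/parallel process would run this on a schedule:
    db.execute("""
    INSERT INTO kpi_summary
    SELECT dept, AVG(tat_min), COUNT(*) FROM results GROUP BY dept
    """)

    for row in db.execute("SELECT * FROM kpi_summary"):
        print(row)  # the dashboard reads precomputed rows only
    ```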

  7. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems.

    PubMed

    Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E

    2011-12-22

    Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end-user interfaces.

  8. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems

    PubMed Central

    2011-01-01

    Background Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. Conclusions This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end-user interfaces. PMID:22369688
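
    The process-algebra angle in the two records above can be illustrated with a toy interpreter. This is not the CEGH implementation: it models ACP-style terms with atomic actions, sequential composition, and parallel composition, and enumerates the step orderings a term permits; the step names are invented.

    ```python
    # Toy ACP-flavoured illustration (not CEGH code): enumerate the orderings
    # permitted by a term built from atomic steps, seq, and par composition.
    def orderings(term):
        kind = term[0]
        if kind == "act":
            return [[term[1]]]
        if kind == "seq":  # every left ordering followed by every right one
            return [a + b for a in orderings(term[1]) for b in orderings(term[2])]
        if kind == "par":  # all interleavings preserving each side's order
            result = []
            for a in orderings(term[1]):
                for b in orderings(term[2]):
                    result.extend(interleave(a, b))
            return result
        raise ValueError(kind)

    def interleave(a, b):
        if not a: return [b]
        if not b: return [a]
        return [[a[0]] + rest for rest in interleave(a[1:], b)] + \
               [[b[0]] + rest for rest in interleave(a, b[1:])]

    test = ("seq", ("act", "extract_dna"),
                   ("par", ("act", "pcr_exon1"), ("act", "pcr_exon2")))
    for o in orderings(test):
        print(" -> ".join(o))  # two valid orderings of the parallel PCRs
    ```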

  9. A Nonlinear, Multiinput, Multioutput Process Control Laboratory Experiment

    ERIC Educational Resources Information Center

    Young, Brent R.; van der Lee, James H.; Svrcek, William Y.

    2006-01-01

    Experience in using a user-friendly software, Mathcad, in the undergraduate chemical reaction engineering course is discussed. Example problems considered for illustration deal with simultaneous solution of linear algebraic equations (kinetic parameter estimation), nonlinear algebraic equations (equilibrium calculations for multiple reactions and…

  10. CONCEPTS AND APPROACHES FOR THE BIOASSESSMENT OF NON-WADEABLE STREAMS AND RIVERS

    EPA Science Inventory

    This document is intended to assist users in establishing or refining protocols, including the specific methods related to field sampling, laboratory sample processing, taxonomy, data entry, management and analysis, and final assessment and reporting. It also reviews and provide...

  11. Secure Display of Space-Exploration Images

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia; Thornhill, Gillian; McAuley, Michael

    2006-01-01

    Java EDR Display Interface (JEDI) is software for either local display or secure Internet distribution, to authorized clients, of image data acquired from cameras aboard spacecraft engaged in exploration of remote planets. (EDR signifies experimental data record, which, in effect, signifies image data.) Processed at NASA's Multimission Image Processing Laboratory (MIPL), the data can be from either near-real-time processing streams or stored files. JEDI uses the Java Advanced Imaging application program interface, plus input/output packages that are parts of the Video Image Communication and Retrieval software of the MIPL, to display images. JEDI can be run as either a standalone application program or within a Web browser as a servlet with an applet front end. In either operating mode, JEDI communicates using the HTTP(s) protocol(s). In the Web-browser case, the user must provide a password to gain access. For each user and/or image data type, there is a configuration file, called a "personality file," containing parameters that control the layout of the displays and the information to be included in them. Once JEDI has accepted the user's password, it processes the requested EDR (provided that the user is authorized to receive the specific EDR) to create a display according to the user's personality file.

  12. On-line interactive virtual experiments on nanoscience

    NASA Astrophysics Data System (ADS)

    Kadar, Manuella; Ileana, Ioan; Hutanu, Constantin

    2009-01-01

    This paper gives an overview of a next-generation web environment which allows students to carry out virtual experiments on nanoscience, physics devices, processes and processing equipment. Virtual reality is used to support a real university lab in which a student can take part in real lab sessions. The web material is presented in an intuitive and highly visual 3D form that is accessible to a diverse group of students. This type of laboratory provides opportunities for professional and practical education for a wide range of users. The expensive equipment and apparatus that make up the experimental stage in a standard laboratory are used to create virtual educational research laboratories. Students learn how to prepare the apparatus and facilities for the experiment. The online-experiments metadata schema is the format for describing online experiments, much like the schema behind a library catalogue used to describe the books in a library. As an online experiment is a special kind of learning object, its schema is specified as an extension to an established metadata schema for learning objects. The content of the courses, meta-information, as well as readings and user data, are saved on the server in a database as XML objects.
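
    A minimal sketch of such an extended metadata record follows, serialized with the Python standard library. Every element name here is invented for illustration; the actual schema extension is not specified in the record above.

    ```python
    # Hypothetical sketch of an online-experiment metadata record expressed as
    # an extension of a generic learning-object schema (element names invented).
    import xml.etree.ElementTree as ET

    lo = ET.Element("learningObject")
    ET.SubElement(lo, "title").text = "I-V characteristics of a nanowire device"
    ET.SubElement(lo, "description").text = "Remote lab session with 3D front end"

    exp = ET.SubElement(lo, "onlineExperiment")  # the schema extension
    ET.SubElement(exp, "apparatus").text = "probe station"
    ET.SubElement(exp, "bookingUrl").text = "https://example.edu/lab/book"
    ET.SubElement(exp, "maxConcurrentUsers").text = "1"

    print(ET.tostring(lo, encoding="unicode"))
    ```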

  13. Virtual Earth System Laboratory (VESL): A Virtual Research Environment for The Visualization of Earth System Data and Process Simulations

    NASA Astrophysics Data System (ADS)

    Cheng, D. L. C.; Quinn, J. D.; Larour, E. Y.; Halkides, D. J.

    2017-12-01

    The Virtual Earth System Laboratory (VESL) is a Web application, under continued development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. As with any project of its size, we have encountered both successes and challenges during the course of development. Our principal point of success is the fact that VESL users can interact seamlessly with our earth science simulations within their own Web browser. Some of the challenges we have faced include retrofitting the VESL Web application to respond to touch gestures, reducing page load time (especially as the application has grown), and accounting for the differences between the various Web browsers and computing platforms.

  14. Effects of Nutrients and Physical Forcing on Satellite-Derived Optical Properties Near the Mississippi River Delta

    DTIC Science & Technology

    2007-07-17

    The data were processed with the receiving system and NRL's Automated Processing System (APS) (Martinolich 2005). APS Version 3.4 applied the atmospheric correction algorithms prescribed by the Automated Processing System User's Guide, Version 3.4 (Naval Research Laboratory).

  15. Communication Systems Simulation Laboratory (CSSL): Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Schlesinger, Adam

    2012-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CSSL. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  16. Agile data management for curation of genomes to watershed datasets

    NASA Astrophysics Data System (ADS)

    Varadharajan, C.; Agarwal, D.; Faybishenko, B.; Versteeg, R.

    2015-12-01

    A software platform is being developed for data management and assimilation [DMA] as part of the U.S. Department of Energy's Genomes to Watershed Sustainable Systems Science Focus Area 2.0. The DMA components and capabilities are driven by the project science priorities and the development is based on agile development techniques. The goal of the DMA software platform is to enable users to integrate and synthesize diverse and disparate field, laboratory, and simulation datasets, including geological, geochemical, geophysical, microbiological, hydrological, and meteorological data across a range of spatial and temporal scales. The DMA objectives are (a) developing an integrated interface to the datasets, (b) storing field monitoring data and laboratory analytical results of water and sediment samples in a database, (c) providing automated QA/QC analysis of data, and (d) working with data providers to modify high-priority field and laboratory data collection and reporting procedures as needed. The first three objectives are driven by user needs, while the last objective is driven by data management needs. The project needs and priorities are reassessed regularly with the users. After each user session we identify development priorities to match the identified user priorities. For instance, data QA/QC and collection activities have focused on the data and products needed for on-going scientific analyses (e.g. water level and geochemistry). We have also developed, tested and released a broker and portal that integrates diverse datasets from two different databases used for curation of project data. The development of the user interface was based on a user-centered design process involving several user interviews and constant interaction with data providers. The initial version focuses on the most requested feature - i.e. finding the data needed for analyses through an intuitive interface. Once the data is found, the user can immediately plot and download data through the portal. The resulting product has an interface that is more intuitive and presents the highest priority datasets that are needed by the users. Our agile approach has enabled us to build a system that is keeping pace with the science needs while utilizing limited resources.
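
    The automated QA/QC objective lends itself to a small sketch. The check below, with invented thresholds, flags out-of-range values and abrupt jumps in a water-level series before acceptance into the database; it illustrates the idea and is not the project's code.

    ```python
    # Minimal sketch of automated QA/QC (thresholds invented): flag values
    # outside a plausible range and abrupt step changes between samples.
    def qaqc_flags(series, lo=0.0, hi=15.0, max_step=0.5):
        flags = []
        for i, v in enumerate(series):
            if not (lo <= v <= hi):
                flags.append((i, v, "out-of-range"))
            elif i and abs(v - series[i - 1]) > max_step:
                flags.append((i, v, "spike"))
        return flags

    water_level_m = [3.2, 3.25, 3.21, 9.8, 3.22, -1.0]  # example series
    for idx, val, reason in qaqc_flags(water_level_m):
        print(f"sample {idx}: {val} m flagged ({reason})")
    ```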

  17. Quality Assurance in Clinical Chemistry: A Touch of Statistics and A Lot of Common Sense

    PubMed Central

    2016-01-01

    Summary Working in laboratories of clinical chemistry, we risk feeling that our personal contribution to quality is small and that statistical models and manufacturers play the major roles. It is seldom sufficiently acknowledged that personal knowledge, skills and common sense are crucial for quality assurance in the interest of patients. The employees, environment and procedures inherent to the laboratory including its interactions with the clients are crucial for the overall result of the total testing chain. As the measurement systems, reagents and procedures are gradually improved, work on the preanalytical, postanalytical and clinical phases is likely to pay the most substantial dividends in accomplishing further quality improvements. This means changing attitudes and behaviour, especially of the users of the laboratory. It requires understanding people and how to engage them in joint improvement processes. We need to use our knowledge and common sense expanded with new skills e.g. from the humanities, management, business and change sciences in order to bring this about together with the users of the laboratory. PMID:28356868

  18. The National Solar Permitting Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunderson, Renic

    "The soft costs of solar — costs not associated with hardware — remain stubbornly high. Among the biggest soft costs are those associated with inefficiencies in local permitting and inspection. A study by the National Renewable Energy Laboratory and Lawrence Berkeley National Laboratory estimates that these costs add an average of $0.22/W per residential installation. This project helps reduce non-hardware/balance of system (BOS) costs by creating and maintaining a free and available site of permitting requirements and solar system verification software that installers can use to reduce time, capital, and resource investments in tracking permitting requirements. Software tools to identifymore » best permitting practices can enable government stakeholders to optimize their permitting process and remove superfluous costs and requirements. Like ""a Wikipedia for solar permitting"", users can add, edit, delete, and update information for a given jurisdiction. We incentivize this crowdsourcing approach by recognizing users for their contributions in the form of SEO benefits to their company or organization by linking back to users' websites."« less

  19. Improving the Plasticity of LIMS Implementation: LIMS Extension through Microsoft Excel

    NASA Technical Reports Server (NTRS)

    Culver, Mark

    2017-01-01

    A Laboratory Information Management System (LIMS) is database software with many built-in tools ideal for handling and documenting most laboratory processes in an accurate and consistent manner, making it an indispensable tool for the modern laboratory. However, many LIMS end users will find that, when performing analyses with unique considerations such as standard curves, multiple-stage incubations, or conditional logic, a base LIMS distribution may not ideally suit their needs. These considerations bring about the need for extension languages, which can extend the functionality of a LIMS. While these languages do provide the implementation team with the functionality required to accommodate these special laboratory analyses, they are usually too complex for the end user to modify to compensate for natural changes in laboratory operations. The LIMS utilized by our laboratory offers a unique and easy-to-use choice for an extension language, one that is already heavily relied upon not only in science but also in most academic and business pursuits: Microsoft Excel. The validity of Microsoft Excel as a pseudo-programming language and its usability and versatility as a LIMS extension language are discussed. The NELAC implications and overall drawbacks of this LIMS configuration are also discussed.

  20. MDMA, cortisol, and heightened stress in recreational ecstasy users.

    PubMed

    Parrott, Andrew C; Montgomery, Cathy; Wetherell, Mark A; Downey, Luke A; Stough, Con; Scholey, Andrew B

    2014-09-01

    Stress develops when an organism requires additional metabolic resources to cope with demanding situations. This review will debate how recreational 3,4-methylenedioxymethamphetamine (MDMA, 'ecstasy') can increase some aspects of acute and chronic stress in humans. Laboratory studies on the acute effects of MDMA on cortisol release and neurohormone levels in drug-free regular ecstasy/MDMA users have been reviewed, and the role of the hypothalamic-pituitary-adrenal (HPA) axis in chronic changes in anxiety, stress, and cognitive coping is debated. In the laboratory, acute ecstasy/MDMA use can increase cortisol levels by 100-200%, whereas ecstasy/MDMA-using dance clubbers experience an 800% increase in cortisol levels, because of the combined effects of the stimulant drug and dancing. Three-month hair samples of abstinent users revealed cortisol levels 400% higher than those in controls. Chronic users show heightened cortisol release in stressful environments and deficits in complex neurocognitive tasks. Event-related evoked response potential studies show altered patterns of brain activation, suggestive of increased mental effort, during basic information processing. Chronic mood deficits include more daily stress and higher depression in susceptible individuals. We conclude that ecstasy/MDMA increases cortisol levels acutely and subchronically and that changes in the HPA axis may explain why recreational ecstasy/MDMA users show various aspects of neuropsychobiological stress.

  1. The gallium melting-point standard: its role in manufacture and quality control of electronic thermometers for the clinical laboratory.

    PubMed

    Sostman, H E

    1977-01-01

    I discuss the traceability of calibration of electronic thermometers to thermometric constants of nature or to the National Bureau of Standards, from a manufacturer's basic standards through the manufacturing process to the user's laboratory. Useful electrical temperature sensors, their advantages, and means for resolving their disadvantages are described. I summarize our development of a cell for realizing the melting phase equilibrium of pure gallium (at 29.770 degrees C) as a thermometer calibration fixed point, and enumerate its advantages for routine calibration verification of electrical thermometers in the clinical chemistry laboratory.

  2. Power User Interface

    NASA Technical Reports Server (NTRS)

    Pfister, Robin; McMahon, Joe

    2006-01-01

    Power User Interface 5.0 (PUI) is a system of middleware written for expert users in the Earth-science community. PUI enables expedited ordering of data granules on the basis of specific granule-identifying information that the users already know or can assemble. PUI also enables expert users to perform quick searches for orderable-granule information for use in preparing orders. PUI 5.0 is available in two versions (note: PUI 6.0 has a command-line mode only): a Web-based application program and a UNIX command-line-mode client program. Both versions include modules that perform data-granule-ordering functions in conjunction with external systems. The Web-based version works with the Earth Observing System Clearing House (ECHO) metadata catalog and order-entry services and with an open-source order-service broker server component, called the Mercury Shopping Cart, that is provided separately by Oak Ridge National Laboratory through the Department of Energy. The command-line version works with the ECHO metadata and order-entry process service. Both versions of PUI ultimately use ECHO to process an order to be sent to a data provider. Ordered data are provided through means outside the PUI software system.

  3. Practical experience with graphical user interfaces and object-oriented design in the clinical laboratory.

    PubMed

    Wells, I G; Cartwright, R Y; Farnan, L P

    1993-12-15

    The computing strategy in our laboratories evolved from research in Artificial Intelligence, and is based on powerful software tools running on high performance desktop computers with a graphical user interface. This allows most tasks to be regarded as design problems rather than implementation projects, and both rapid prototyping and an object-oriented approach to be employed during the in-house development and enhancement of the laboratory information systems. The practical application of this strategy is discussed, with particular reference to the system designer, the laboratory user and the laboratory customer. Routine operation covers five departments, and the systems are stable, flexible and well accepted by the users. Client-server computing, currently undergoing final trials, is seen as the key to further development, and this approach to Pathology computing has considerable potential for the future.

  4. Laboratory automation of high-quality and efficient ligand-binding assays for biotherapeutic drug development.

    PubMed

    Wang, Jin; Patel, Vimal; Burns, Daniel; Laycock, John; Pandya, Kinnari; Tsoi, Jennifer; DeSilva, Binodh; Ma, Mark; Lee, Jean

    2013-07-01

    Regulated bioanalytical laboratories that run ligand-binding assays in support of biotherapeutics development face ever-increasing demand to support more projects with increased efficiency. Laboratory automation is a tool that has the potential to improve both quality and efficiency in a bioanalytical laboratory. The success of laboratory automation requires thoughtful evaluation of program needs and fit-for-purpose strategies, followed by pragmatic implementation plans and continuous user support. In this article, we present the development of fit-for-purpose automation of total walk-away and flexible modular modes. We shared the sustaining experience of vendor collaboration and team work to educate, promote and track the use of automation. The implementation of laboratory automation improves assay performance, data quality, process efficiency and method transfer to CRO in a regulated bioanalytical laboratory environment.

  5. Laboratory Information Management System (LIMS): A case study

    NASA Technical Reports Server (NTRS)

    Crandall, Karen S.; Auping, Judith V.; Megargle, Robert G.

    1987-01-01

    In the late 1970s, a refurbishment of the analytical laboratories serving the Materials Division at NASA Lewis Research Center was undertaken. As part of the modernization efforts, a Laboratory Information Management System (LIMS) was to be included. Preliminary studies indicated a custom-designed system as the best choice in order to satisfy all of the requirements. A scaled-down version of the original design has been in operation since 1984. The LIMS, a combination of computer hardware and software, provides the chemical characterization laboratory with an information database, a report generator, a user interface, and networking capabilities. This paper is an account of the processes involved in designing and implementing that LIMS.

  6. Safety | Argonne National Laboratory

    Science.gov Websites

    Describes the laboratory's ongoing effort to provide a safe and productive environment for employees, users, and others on site.

  7. Francis Bitter National Magnet Laboratory annual report, July 1988 through June 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1989-01-01

    Contents include: reports on laboratory research programs--magneto-optics and semiconductor physics, magnetism, superconductivity, solid-state nuclear magnetic resonance, condensed-matter chemistry, biomagnetism, magnet technology, instrumentation for high-magnetic-field research, molecular biophysics; reports of visiting scientists--reports of users of the High Magnetic Field Facility, reports of users of the Pulsed Field Facility, reports of users of the SQUID Magnetometer and Moessbauer Facility, reports of users of the High-Field NMR Facility; Appendices--publications and meeting speeches, organization, summary of High-Field Magnet Facility use January 1, 1981 through December 31, 1988; geographic distribution of High-Field Magnet users (excluding laboratory staff); and summary of educational activities.

  8. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-11-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in the C# and C++ programming languages respectively, using the Microsoft Visual Studio 2010 Professional IDE, which is freely available to students. Finally, the data analysis is performed using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments’ academic staff.
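
    The host-side acquisition loop can be re-sketched as follows. Note this is not the SolarInsight C# implementation: it is a Python illustration (assuming the pyserial package) of sweeping an electronic-load setpoint over the Arduino link and logging the photovoltaic cell's I-V pairs; the one-byte command protocol is invented.

    ```python
    # Illustrative host-side sketch (invented protocol, not SolarInsight code):
    # step a load setpoint over the serial link and record I-V pairs.
    import serial  # pip install pyserial

    with serial.Serial("/dev/ttyACM0", 9600, timeout=2) as link:
        iv_points = []
        for step in range(0, 256, 8):                 # sweep the electronic load
            link.write(bytes([step]))                 # hypothetical set-load command
            reply = link.readline().decode().strip()  # e.g. "0.512,0.034"
            volts, amps = map(float, reply.split(","))
            iv_points.append((volts, amps))
        p_max = max(v * i for v, i in iv_points)
        print(f"maximum power point ~ {p_max:.3f} W over {len(iv_points)} samples")
    ```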

  9. User needs, benefits and integration of robotic systems in a space station laboratory

    NASA Technical Reports Server (NTRS)

    Farnell, K. E.; Richard, J. A.; Ploge, E.; Badgley, M. B.; Konkel, C. R.; Dodd, W. R.

    1989-01-01

    The methodology, results and conclusions of the User Needs, Benefits, and Integration Study (UNBIS) of Robotic Systems in the Space Station Microgravity and Materials Processing Facility are summarized. Study goals include the determination of user requirements for robotics within the Space Station United States Laboratory. Three experiments were selected to determine user needs and to allow detailed investigation of microgravity requirements. A NASTRAN analysis of Space Station response to robotic disturbances, and acceleration measurements of a standard industrial robot (Intelledex Model 660), resulted in the selection of two ranges of low-gravity manipulation: Level 1 (10⁻³ to 10⁻⁵ g at greater than 1 Hz) and Level 2 (≤ 10⁻⁶ g at 0.1 Hz). This included an evaluation of microstepping methods for controlling stepper motors and concluded that an industrial robot actuator can perform milli-g motion without modification. The relative merits of end-effectors and manipulators were studied in order to determine their ability to perform a range of tasks related to the three low-gravity experiments. An Effectivity Rating was established for evaluating these robotic system capabilities. Preliminary interface requirements were determined such that the definition of requirements for an orbital flight demonstration experiment may be established.

  10. The Geohazards Exploitation Platform

    NASA Astrophysics Data System (ADS)

    Laur, Henri; Casu, Francesco; Bally, Philippe; Caumont, Hervé; Pinto, Salvatore

    2016-04-01

    The Geohazards Exploitation Platform, or Geohazards TEP (GEP), is an ESA-originated R&D activity of the EO ground segment to demonstrate the benefit of new technologies for large-scale processing of EO data. This encompasses on-demand processing for specific user needs, systematic processing to address common information needs of the geohazards community, and integration of newly developed processors for scientists and other expert users. The platform supports the geohazards community's objectives as defined in the context of the International Forum on Satellite EO and Geohazards organised by ESA and GEO in Santorini in 2012. The GEP is a follow-on to the Supersites Exploitation Platform (SSEP), an ESA initiative to support the Geohazards Supersites & Natural Laboratories initiative (GSNL). Today the GEP allows users to exploit more than 70 terabytes of ERS and ENVISAT archive data, as well as the Copernicus Sentinel-1 data, available online. The platform has already engaged 22 European early adopters in a validation activity initiated in March 2015. Since September 2015, this validation has grown to 29 single-user projects. Each project is concerned with either integrating an application, running on-demand processing or systematically generating a product collection using an application available in the platform. The users primarily include 15 geoscience centres and universities based in Europe: British Geological Survey (UK), University of Leeds (UK), University College London (UK), ETH University of Zurich (CH), INGV (IT), CNR-IREA and CNR-IRPI (IT), University of L'Aquila (IT), NOA (GR), Univ. Blaise Pascal & CNRS (FR), Ecole Normale Supérieure (FR), ISTERRE / University of Grenoble-Alpes (FR). In addition, there are users from Africa and North America with the University of Rabat (MA) and the University of Miami (US). Furthermore, two space agencies and four private companies are involved: the German Space Research Centre DLR (DE), the European Space Agency (ESA), Altamira Information (ES), DEIMOS Space (ES), eGEOS (IT) and SATIM (PL). The GEP is now pursuing these projects with early adopters integrating additional conventional and advanced EO processors. It will also expand its user base to gradually reach a total of 60 separate users in pre-operations in 2017, with six new pilot projects being taken on board: photogrammetric processing using optical EO data with the University of Strasbourg (FR); an optical processing method for volcanic hazard monitoring with INGV (IT); systematic generation of interferometric displacement time series based on Sentinel-1 data with CNR-IREA (IT); systematic processing of Sentinel-1 interferometric browse imagery with DLR (DE); precise terrain motion mapping with the SPN Persistent Scatterers Interferometric chain of Altamira Information (ES); and a campaign to test and exploit GEP applications with the Corinth Rift Laboratory, in which Greek and French experts in seismic hazards are engaged. Following the pre-operations phase starting in 2017, the Geohazards platform is intended to support a broad user community and has already established partnerships with large user networks, a particular example being the EPOS research infrastructure. Within EPOS, the GEP is intended to act as the main interface for accessing, processing, analysing and sharing products related to the Satellite Data Thematic Service.

  11. DB4US: A Decision Support System for Laboratory Information Management

    PubMed Central

    Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-01-01

    Background: Until recently, laboratory automation has focused primarily on improving hardware. Future advances will concentrate on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of laboratory quality indicator information. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Objective: To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. Methods: We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of parallel processes that precalculate the indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. Results: DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. Our evaluation shows the positive impact of this methodology for laboratory professionals: the use of the application has reduced the time needed to prepare the different statistical indicators and has provided information that has been used to optimize the usage of laboratory resources through the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. Conclusions: The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources. PMID:23608745
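
    The paper describes precalculated indicators rather than publishing code; as a minimal sketch of one such indicator, the snippet below computes a median turn-around time per test from consolidated request records. All field names and values are hypothetical, and Python stands in for whatever stack DB4US actually uses.

      # Illustrative sketch, not DB4US code: precalculating a turn-around-time
      # (TAT) indicator of the kind the dashboard displays.
      from datetime import datetime
      from statistics import median

      records = [  # hypothetical consolidated rows from the data warehouse
          {"test": "glucose", "received": "2008-03-01 08:05", "reported": "2008-03-01 09:40"},
          {"test": "glucose", "received": "2008-03-01 08:30", "reported": "2008-03-01 11:10"},
          {"test": "CBC",     "received": "2008-03-01 08:12", "reported": "2008-03-01 08:55"},
      ]

      def tat_minutes(rec):
          """Minutes elapsed between specimen receipt and result reporting."""
          fmt = "%Y-%m-%d %H:%M"
          delta = datetime.strptime(rec["reported"], fmt) - datetime.strptime(rec["received"], fmt)
          return delta.total_seconds() / 60

      def median_tat_by_test(rows):
          """Group rows by test and return the median turn-around time per test."""
          by_test = {}
          for rec in rows:
              by_test.setdefault(rec["test"], []).append(tat_minutes(rec))
          return {test: median(times) for test, times in by_test.items()}

      print(median_tat_by_test(records))  # {'glucose': 127.5, 'CBC': 43.0}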

  12. MEMS-Based Waste Vibrational Energy Harvesters

    DTIC Science & Technology

    2013-06-01

    Only fragments of the report's front matter survive in this record: table-of-contents entries for lead zirconium titanate (PZT) and aluminum…; an abbreviation list including PiezoMUMPS (Piezoelectric Multi-User MEMS Processes), PZT (lead zirconate titanate), SEM (scanning electron microscopy), and SiO2 (silicon dioxide); and a passage noting that lead zirconate titanate (PZT) possesses high coupling between the electrical and mechanical domains, the output voltage V being related to the z-component…

  13. DISCUS Interactive System Users' Manual. Final Report.

    ERIC Educational Resources Information Center

    Silver, Steven S.; Meredith, Joseph C.

    The results of the second 18 months (December 15, 1968-June 30, 1970) of effort toward developing an Information Processing Laboratory for research and education in library science are reported in six volumes. This volume contains: the basic on-line interchange, DISCUS operations, programming in DISCUS, concise DISCUS specifications, system author…

  14. The telerobot workstation testbed for the shuttle aft flight deck: A project plan for integrating human factors into system design

    NASA Technical Reports Server (NTRS)

    Sauerwein, Timothy

    1989-01-01

    The human factors design process used in developing a shuttle orbiter aft flight deck workstation testbed is described. In developing an operator workstation to control various laboratory telerobots, strong elements of human factors engineering and ergonomics are integrated into the design process. The integration of human factors is performed by incorporating user feedback at key stages in the project life cycle. An operator-centered design approach helps ensure that the system users are working with the system designer in the design and operation of the system. The design methodology is presented along with the results of the design and the solutions regarding human factors design principles.

  15. A User Assessment of Workspaces in Selected Music Education Computer Laboratories.

    ERIC Educational Resources Information Center

    Badolato, Michael Jeremy

    A study of 120 students selected from the user populations of four music education computer laboratories was conducted to determine the applicability of current ergonomic and environmental design guidelines in satisfying the needs of users of educational computing workspaces. Eleven categories of workspace factors were organized into a…

  16. Friction Stir Welding Development at NASA-Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Bhat, Biliyar N.; Carter, Robert W.; Ding, Robert J.; Lawless, Kirby G.; Nunes, Arthur C., Jr.; Russell, Carolyn K.; Shah, Sandeep R.

    2001-01-01

    This paper presents an overview of friction stir welding (FSW) process development and applications at Marshall Space Flight Center (MSFC). FSW process development started as a laboratory curiosity but soon found support from many users. The FSW process advanced very quickly and has found many applications both within and outside the aerospace industry. It is currently being adapted for joining key elements of the Space Shuttle External Tank for improved producibility and reliability. FSW process modeling is done to better understand and improve the process. Special tools have been developed to weld variable thickness materials including thin and thick materials. FSW is now being applied to higher temperature materials such as copper and to advanced materials such as metal matrix composites. FSW technology is being successfully transferred from MSFC laboratory to shop floors of many commercial companies.

  17. The use of Graphic User Interface for development of a user-friendly CRS-Stack software

    NASA Astrophysics Data System (ADS)

    Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah

    2017-04-01

    The development of a user-friendly Common Reflection Surface (CRS) Stack software package built with a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the unix/linux environment and is not user-friendly: a user must write the commands and parameters manually in a script file. Because of this limitation the CRS-Stack has remained an unpopular method, although applying it is a promising way to obtain better seismic sections, with better reflector continuity and S/N ratio. After obtaining successful results in tests on several seismic data sets belonging to oil companies in Indonesia, we decided to develop a user-friendly version of the software in our own laboratory. A GUI is a type of user interface that allows people to interact with computer programs in a better way: rather than typing commands and module parameters, users can operate the programs simply and easily, because the GUI transforms the text-based interface into graphical icons and visual indicators. The use of complicated seismic unix shell scripts can thus be avoided. The Java Swing GUI library is used to develop this CRS-Stack GUI; every shell script that represents a seismic processing step is invoked from the Java environment. Besides providing an interactive GUI for CRS-Stack processing, the software is designed to help geophysicists manage projects with complex seismic processing procedures. The CRS-Stack GUI software is organized around input directories, operators, and output directories, which together define a seismic data processing workflow. The CRS-Stack processing workflow involves four steps: automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack supergather. These operations are visualized in an informative flowchart with a self-explanatory system that guides the user in entering the parameter values for each operation. The knowledge of the CRS-Stack processing procedure is preserved in the software, which makes it easy and efficient to learn. The software will continue to be developed, and any new innovative seismic processing workflow will also be added to this GUI software.
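
    The GUI itself is Java Swing, but the core pattern the abstract describes, invoking one shell script per processing step in a fixed workflow and stopping on failure, can be sketched compactly. The Python below illustrates that pattern only; the script and parameter-file names are hypothetical.

      # Illustrative sketch of the invoke-a-script-per-step pattern (the actual
      # CRS-Stack GUI is Java Swing); script names here are hypothetical.
      import subprocess

      WORKFLOW = [  # the four CRS-Stack steps named in the abstract
          "01_automatic_cmp_stack.sh",
          "02_initial_crs_stack.sh",
          "03_optimized_crs_stack.sh",
          "04_crs_stack_supergather.sh",
      ]

      def run_workflow(project_dir, param_file="crs_params.par"):
          """Run each processing step in order, aborting on the first failure."""
          for script in WORKFLOW:
              print(f"Running {script} ...")
              result = subprocess.run(["sh", script, param_file], cwd=project_dir)
              if result.returncode != 0:
                  raise RuntimeError(f"{script} failed with exit code {result.returncode}")
          print("CRS-Stack workflow finished.")

      # run_workflow("/data/seismic/line01")  # example invocation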

  18. Integration of Dakota into the NEAMS Workbench

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Lefebvre, Robert A.; Langley, Brandon R.

    2017-07-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on integrating Dakota into the NEAMS Workbench. The NEAMS Workbench, developed at Oak Ridge National Laboratory, is a new software framework that provides a graphical user interface, input file creation, parsing, validation, job execution, workflow management, and output processing for a variety of nuclear codes. Dakota is a tool developed at Sandia National Laboratories that provides a suite of uncertainty quantification and optimization algorithms. Providing Dakota within the NEAMS Workbench allows users of nuclear simulation codes to perform uncertainty and optimization studies on their nuclear codes from within a common, integrated environment. Details of the integration and parsing are provided, along with an example of Dakota running a sampling study on the fuels performance code, BISON, from within the NEAMS Workbench.

  19. Documentation of the Goddard Laboratory for atmospheres fourth-order two-layer shallow water model

    NASA Technical Reports Server (NTRS)

    Takacs, L. L. (Compiler)

    1986-01-01

    The theory and numerical treatment used in the 2-level GLA fourth-order shallow water model are described. This model was designed to emulate the horizontal finite differences used by the GLA Fourth-Order General Circulation Model (Kalnay et al., 1983) in addition to its grid structure, form of high-latitude and global filtering, and time-integration schemes. A user's guide is also provided instructing the user on how to create initial conditions, execute the model, and post-process the data history.

  20. The Use of a UNIX-Based Workstation in the Information Systems Laboratory

    DTIC Science & Technology

    1989-03-01

    Only fragments of the report survive in this record: the conclusions of the research and the resulting recommendations are presented in Chapter III, including recommendations on how to manage the system; the changes required to run the program on a new system should not be significant; and the UNIX processing environment is described as interactive with multi-tasking and multi-user capabilities, multi-tasking referring to the fact that many programs can be run concurrently.

  1. Francis Bitter National Magnet Laboratory annual report, July 1990 through June 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-06-01

    The contents include: reports on laboratory research programs--magneto-optics and semiconductor physics, magnetism, superconductivity, solid state nuclear magnetic resonance, condensed matter chemistry, biomagnetism, magnet technology, instrumentation, molecular biophysics; reports of visiting scientists--reports of users of the high magnetic field facility, reports of users of the pulsed field facility, reports of users of the SQUID magnetometer and Mossbauer facility, reports of users of the high field NMR facility; appendices--publications and meeting speeches, organization, summary of high magnetic field facility use, user tables, geographic distribution of high magnetic field facility users, summary of educational activities.

  2. Francis Bitter National Magnet Laboratory annual report, July 1989 through June 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1990-01-01

    Contents: Reports on laboratory research programs: Magneto-optics and semiconductor physics, Magnetism, Superconductivity, Solid state nuclear magnetic resonance, Condensed matter chemistry, Biomagnetism, Magnet technology, Molecular biophysics; Reports of visiting scientists: Reports of users of the High Magnetic Field Facility, Reports of users of the pulsed field facility, Reports of users of the squid magnetometer and Mossbauer facility, Reports of users of the high field NMR facility; Appendices: Publications and meeting speeches, Organization, Summary of high magnetic field facility use, User tables, Geographic distribution of high magnetic field facility users, Summary of educational activities.

  3. User's Manual for the National Water-Quality Assessment Program Invertebrate Data Analysis System (IDAS) Software: Version 3

    USGS Publications Warehouse

    Cuffney, Thomas F.

    2003-01-01

    The Invertebrate Data Analysis System (IDAS) software provides an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the National Water-Quality Assessment Program and stored in the Biological Transactional Database (Bio-TDB). The IDAS software is a stand-alone program for personal computers that run Microsoft (MS) Windows®. It allows users to read data downloaded from Bio-TDB and stored either as MS Excel® or MS Access® files. The program consists of five modules. The Edit Data module allows the user to subset, combine, delete, and summarize community data. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa based on laboratory processing notes, combine life stages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa, and resolve taxonomic ambiguities. The Calculate Community Metrics module allows the user to calculate over 130 community metrics, including metrics based on organism tolerances and functional feeding groups. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages and produce tables of community data that can be imported into spreadsheet and word-processing programs. Though the IDAS program was developed to process invertebrate data downloaded from USGS databases, it will work with other data sets that are converted to the USGS (Bio-TDB) format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used by anyone involved in using benthic macroinvertebrates in applied or basic research.
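
    The record does not reproduce the index formulas; as a hedged illustration of what the Calculate Diversities and Similarities module computes, the sketch below implements two standard community indices (Shannon diversity and Jaccard similarity) over taxon-count dictionaries. The Python, taxon names, and counts are illustrative, not IDAS source.

      # Illustrative sketch, not IDAS source: two standard community indices
      # of the kind the Calculate Diversities and Similarities module reports.
      import math

      def shannon_diversity(counts):
          """Shannon diversity H' = -sum(p_i * ln p_i) over taxon proportions."""
          total = sum(counts.values())
          return -sum((n / total) * math.log(n / total)
                      for n in counts.values() if n > 0)

      def jaccard_similarity(a, b):
          """Jaccard similarity: shared taxa / total distinct taxa in two samples."""
          taxa_a = {t for t, n in a.items() if n > 0}
          taxa_b = {t for t, n in b.items() if n > 0}
          return len(taxa_a & taxa_b) / len(taxa_a | taxa_b)

      site1 = {"Baetis": 34, "Chironomus": 12, "Hydropsyche": 7}
      site2 = {"Baetis": 20, "Chironomus": 3, "Simulium": 15}
      print(shannon_diversity(site1))          # ~0.89 for these counts
      print(jaccard_similarity(site1, site2))  # 2 shared / 4 distinct = 0.5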

  4. Cardio-PACs: a new opportunity

    NASA Astrophysics Data System (ADS)

    Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary

    2000-05-01

    It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below: Image display: (1) Image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based with color monitor, and physician-friendly imaging software; (2) Performance specifications include: acquire 30 frames/sec; replay 15 frames/sec; access to file server 5 seconds, and to archive 5 minutes; (3) Compatibility of image file, transmission, and processing formats; (4) Image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) User-friendly control of image review. Image distribution: (1) Standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) Non-proprietary formats; (3) Bidirectional distribution. Image storage: (1) CD-ROM vs disk vs tape; (2) Verification of data integrity; (3) User-designated storage capacity for catheterization laboratory, file server, long-term archive. Costs: (1) Image acquisition equipment, file server, long-term archive; (2) Network infrastructure; (3) Review stations and software; (4) Maintenance and administration; (5) Future upgrades and expansion; (6) Personnel.

  5. Human System Simulation in Support of Human Performance Technical Basis at NPPs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Gertman; Katya Le Blanc; Alan Mecham

    2010-06-01

    This paper focuses on strategies and progress toward establishing the Idaho National Laboratory's (INL's) Human Systems Simulator Laboratory at the Center for Advanced Energy Studies (CAES), a consortium of Idaho universities. The INL is one of the national laboratories of the US Department of Energy. One of the first planned applications for the Human Systems Simulator Laboratory is implementation of a dynamic nuclear power plant (NPP) simulation, where studies of operator workload, situation awareness, performance and preference will be carried out in simulated control rooms, including nuclear power plant control rooms. Simulation offers a means by which to review operational concepts, improve design practices and provide a technical basis for licensing decisions. In preparation for the next-generation power plant and current government and industry efforts in support of light water reactor sustainability, human operators will be attached to a suite of physiological measurement instruments and, in combination with traditional human factors measurement techniques, carry out control room tasks in simulated advanced digital and hybrid analog/digital control rooms. The current focus of the Human Systems Simulator Laboratory is building core competence in quantitative and qualitative measurement of situation awareness and workload. Of particular interest is whether the introduction of digital systems, including automated procedures, has the potential to reduce workload and enhance safety while improving situation awareness, or whether workload is merely shifted and situation awareness is modified in yet-to-be-determined ways. Data analysis is carried out by engineers and scientists and includes measures of the physical and neurological correlates of human performance. The current approach supports a user-centered design philosophy (see ISO 13407, "Human-Centred Design Processes for Interactive Systems," 1999), wherein the context for task performance and the requirements of the end-user are taken into account during the design process, and the validity of the design is determined through testing with real end users.

  6. The EU-project United4Health: User-centred design of an information system for a Norwegian telemedicine service.

    PubMed

    Smaradottir, Berglind; Gerdes, Martin; Martinez, Santiago; Fensli, Rune

    2016-10-01

    Organizational changes of health care services in Norway brought to light a need for new clinical pathways. This study presents the design and evaluation of an information system for a new telemedicine service for chronic obstructive pulmonary disease patients after hospital discharge. A user-centred design approach was employed, composed of a workshop with end-users, two user tests and a field trial. For data collection, qualitative methods such as observations, semi-structured interviews and a questionnaire were used. The user workshop's outcome informed the implementation of the system's initial prototype, which was evaluated by end-users in a usability laboratory. Several usability and functionality issues were identified and solved, such as the conflict between the initial colour scheme and the triage colours. Iterative refinements were made, and a second user evaluation showed that the main issues were solved. The responses to a questionnaire showed a high level of user satisfaction. In the final phase, a field trial showed satisfactory use of the system. This study showed how the target end-user groups were actively involved in identifying needs, suggestions and preferences. These aspects were addressed in the development of an information system through a user-centred design process. The process efficiently enabled users to give feedback about design and functionality. Continuous refinement of the system was the key to full development and suitability for the telemedicine service. This research was a result of international cooperation between partners within the project United4Health, a part of the Seventh Framework Programme for Research of the European Union. © The Author(s) 2015.

  7. The Contribution of a Virtual Biology Laboratory to College Students' Learning

    ERIC Educational Resources Information Center

    Swan, Aubrie E.; O'Donnell, Angela M.

    2009-01-01

    The virtual laboratories developed by a life sciences department at a public university in the US were designed for use by college students enrolled in an introductory biology course. The results of analyses conducted to examine their effectiveness indicated that self-selected users of the virtual laboratories outperformed non-users on laboratory…

  8. United States Air Force High School Apprenticeship Program. 1990 Program Management Report. Volume 3

    DTIC Science & Technology

    1991-04-18

    Only fragments of the report survive in this record: a contents listing of apprentice reports (User Guide, Shelly Knupp; Computer-Aided Design (CAD) Area, Christopher O'Dell; Electron Beam Lithography, Suzette Yu; Flight Dynamics Laboratory); acknowledgements for background knowledge of device processes and information on electron beam lithography; and a passage noting that a researcher may write gates onto the wafer by a process called lithography, the most crucial and complex part of the fabrication process.

  9. Implementation of a configurable laboratory information management system for use in cellular process development and manufacturing.

    PubMed

    Russom, Diana; Ahmed, Amira; Gonzalez, Nancy; Alvarnas, Joseph; DiGiusto, David

    2012-01-01

    Regulatory requirements for the manufacturing of cell products for clinical investigation require a significant level of record-keeping, starting early in process development and continuing through to the execution and requisite follow-up of patients on clinical trials. Central to record-keeping is the management of documentation related to patients, raw materials, processes, assays and facilities. To support these requirements, we evaluated several laboratory information management systems (LIMS), including their cost, flexibility, regulatory compliance, ongoing programming requirements and ability to integrate with laboratory equipment. After selecting a system, we performed a pilot study to develop a user-configurable LIMS for our laboratory in support of our pre-clinical and clinical cell-production activities. We report here on the design and utilization of this system to manage accrual with a healthy blood-donor protocol, as well as manufacturing operations for the production of a master cell bank and several patient-specific stem cell products. The system was used successfully to manage blood donor eligibility, recruiting, appointments, billing and serology, and to provide annual accrual reports. Quality management reporting features of the system were used to capture, report and investigate process and equipment deviations that occurred during the production of a master cell bank and patient products. Overall the system has served to support the compliance requirements of process development and phase I/II clinical trial activities for our laboratory and can be easily modified to meet the needs of similar laboratories.

  10. e-Stars Template Builder

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    2003-01-01

    e-Stars Template Builder is a computer program that implements a concept of enabling users to rapidly gain access to information on projects of NASA's Jet Propulsion Laboratory. The information about a given project is not stored in a database but, rather, in a network that follows the project as it develops. e-Stars Template Builder resides on a server computer, using Practical Extraction and Reporting Language (PERL) scripts to create what are called "e-STARS node templates," which are software constructs that allow for project-specific configurations. The user's computer needs no special software other than an Internet browser, and e-Stars Template Builder is compatible with Windows, Macintosh, and UNIX operating systems. A user invokes e-Stars Template Builder from a browser window. Operations that can be performed by the user include the creation of child processes and the addition of links and descriptions of documentation to existing pages or nodes. Through this addition of "child processes" of nodes, a network that reflects the development of a project is generated.
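
    The abstract describes the data model (a growing network of project nodes carrying documentation links) rather than the PERL implementation; the sketch below illustrates that structure only. Python is used for illustration and all names are hypothetical.

      # Illustrative sketch of the node-network idea behind e-Stars (the actual
      # Template Builder is server-side PERL); names here are hypothetical.
      class ProjectNode:
          def __init__(self, title):
              self.title = title
              self.links = []     # (description, URL) pairs for documentation
              self.children = []  # child processes added as the project develops

          def add_child(self, title):
              """Create a child process node and attach it to this node."""
              child = ProjectNode(title)
              self.children.append(child)
              return child

          def add_link(self, description, url):
              self.links.append((description, url))

          def outline(self, depth=0):
              """Print the project network as an indented outline."""
              print("  " * depth + self.title)
              for child in self.children:
                  child.outline(depth + 1)

      root = ProjectNode("Mission X")
      design = root.add_child("Instrument design")
      design.add_link("Design review notes", "https://example.org/review.html")
      root.add_child("Flight software")
      root.outline()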

  11. A user-friendly software package to ease the use of VIC hydrologic model for practitioners

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P.; Brown, C.

    2016-12-01

    The VIC (Variable Infiltration Capacity) hydrologic and river routing model simulates the water and energy fluxes that occur near the land surface and provides users with useful information regarding the quantity and timing of available water at points of interest within the basin. However, despite its popularity (evidenced by numerous applications in the literature), its wider adoption is hampered by the considerable effort required to prepare model inputs, e.g., input files storing spatial information related to watershed topography, soil properties, and land cover. This study presents a user-friendly software package (named VIC Setup Toolkit) developed within the MATLAB (matrix laboratory) framework and accessible through an intuitive graphical user interface. The VIC Setup Toolkit enables users to navigate the model-building process confidently through prompts and automation, with the intention of promoting the use of the model for both practical and academic purposes. The automated processes include watershed delineation, climate and geographical input set-up, model parameter calibration, graph generation and output evaluation. We demonstrate the package's usefulness in case studies of the American River, Oklahoma River, Feather River and Zambezi River basins.
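
    The toolkit itself is MATLAB-based and the abstract does not name its evaluation metrics; as one hedged example of the kind of output evaluation a calibration workflow performs, the sketch below computes the Nash-Sutcliffe efficiency, a standard goodness-of-fit measure for simulated streamflow. The flow values are invented.

      # Illustrative sketch (the VIC Setup Toolkit is MATLAB-based): a standard
      # output-evaluation metric used when calibrating hydrologic models.
      def nash_sutcliffe(observed, simulated):
          """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
          mean_obs = sum(observed) / len(observed)
          residual = sum((o - s) ** 2 for o, s in zip(observed, simulated))
          variance = sum((o - mean_obs) ** 2 for o in observed)
          return 1.0 - residual / variance

      obs = [12.0, 15.5, 30.2, 22.8, 18.1]  # hypothetical daily flows (m^3/s)
      sim = [11.2, 16.0, 27.9, 24.1, 17.5]
      print(nash_sutcliffe(obs, sim))       # ~0.96 for these values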

  12. Friction Stir Welding Development at National Aeronautics and Space Administration-Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Bhat, Biliyar N.; Carter, Robert W.; Ding, Robert J.; Lawless, Kirby G.; Nunes, Arthur C., Jr.; Russell, Carolyn K.; Shah, Sandeep R.; Munafo, Paul M. (Technical Monitor)

    2001-01-01

    This paper presents an over-view of friction stir welding (FSW) process development and applications at Marshall Space Flight Center (MSFC). FSW process development started as a laboratory curiosity but soon found support from many users. The FSW process advanced very quickly and has found many applications both within and outside the aerospace industry. It is currently being adapted for joining key elements of the Space Shuttle External Tank for improved producibility and reliability. FSW process modeling is done to better understand and improve the process. Special tools have been developed to weld variable thickness materials including very thin and very thick materials. FSW is now being applied to higher temperature materials such as copper and to advanced materials such as metal matrix composites. FSW technology is being successfully transferred from MSFC laboratory to shop floors of many commercial companies.

  13. Bottom-up laboratory testing of the DKIST Visible Broadband Imager (VBI)

    NASA Astrophysics Data System (ADS)

    Ferayorni, Andrew; Beard, Andrew; Cole, Wes; Gregory, Scott; Wöeger, Friedrich

    2016-08-01

    The Daniel K. Inouye Solar Telescope (DKIST) is a 4-meter solar observatory under construction at Haleakala, Hawaii [1]. The Visible Broadband Imager (VBI) is a first-light instrument that will record images at the highest possible spatial and temporal resolution of the DKIST at a number of scientifically important wavelengths [2]. The VBI is a pathfinder for DKIST instrumentation and a test bed for developing processes and procedures in the areas of unit, systems integration, and user acceptance testing. These test procedures have been developed and repeatedly executed during VBI construction in the lab as part of a "test early and test often" philosophy aimed at identifying and resolving issues early, thus saving cost during integration, test and commissioning on the summit. The VBI team recently completed a bottom-up end-to-end system test of the instrument in the lab that allowed the instrument's functionality, performance, and usability to be validated against documented system requirements. The bottom-up testing approach includes four levels of testing, each introducing another layer in the control hierarchy that is tested before moving to the next level. First, the instrument mechanisms are tested for positioning accuracy and repeatability using a laboratory position-sensing detector (PSD). Second, the real-time motion controls are used to drive the mechanisms to verify that speed and timing synchronization requirements are being met. Next, the high-level software is introduced and the instrument is driven through a series of end-to-end tests that exercise the mechanisms, cameras, and simulated data processing. Finally, user acceptance testing is performed on operational and engineering use cases through the instrument engineering graphical user interface (GUI). In this paper we present the VBI bottom-up test plan, procedures, example test cases and tools used, as well as results from test execution in the laboratory. We also discuss the benefits realized through completion of this testing and share lessons learned from the bottom-up testing process.

  14. Chemical Hygiene Program

    NASA Technical Reports Server (NTRS)

    Mayor, Antoinette C.

    1999-01-01

    The Chemical Management Team is responsible for ensuring compliance with the OSHA Laboratory Standard. The program at Lewis Research Center (LeRC) evolved over many years to include training, developing Standard Operating Procedures (SOPs) for each laboratory process, coordinating with other safety and health organizations and teams at the Center, and issuing an SOP binder. The Chemical Hygiene Policy was first established for the Center. The Chemical Hygiene Plan was then established and reviewed by technical, laboratory, and management staff for viability and applicability to the Center. A risk assessment was conducted for each laboratory, and the laboratories were prioritized by order of risk, higher risk taking priority. A Chemical Management Team staff member interviewed the lead researcher for each laboratory process to gather the information needed to develop the SOP for the process. A binder was issued containing the Chemical Hygiene Plan, the SOP, a map of the laboratory identifying the personal protective equipment and best egress, glove guides, and other safety and health guides. Each laboratory process has been captured in the form of an SOP. The chemicals used in each procedure have been identified, and the information is used to reduce the number of chemicals in the lab. The Chemical Hygiene Plan binder is used as a training tool for new employees. LeRC is in compliance with the OSHA Standard, as the program was designed to comply with it. In the process, we have been able to assess the usage of chemicals in the laboratories, as well as reduce or relocate the chemicals being stored in the laboratory. Our researchers are trained on the hazards of the materials they work with and have a better understanding of the hazards of each process and what is needed to prevent any incident. From the SOP process, we have been able to reduce our chemical inventory, determine and implement better hygiene procedures and equipment in the laboratories, and provide specific training to our employees. As a result of this program, we are adding labeling to the laboratories for emergency responders and initiating a certified chemical user program.

  15. Development and implementation of the Caribbean Laboratory Quality Management Systems Stepwise Improvement Process (LQMS-SIP) Towards Accreditation.

    PubMed

    Alemnji, George; Edghill, Lisa; Guevara, Giselle; Wallace-Sankarsingh, Sacha; Albalak, Rachel; Cognat, Sebastien; Nkengasong, John; Gabastou, Jean-Marc

    2017-01-01

    Implementing quality management systems and accrediting laboratories in the Caribbean has been a challenge. We report the development of a stepwise process for quality systems improvement in the Caribbean Region. The Caribbean Laboratory Stakeholders met under a joint Pan American Health Organization/US Centers for Disease Control and Prevention initiative and developed a user-friendly framework called 'Laboratory Quality Management System - Stepwise Improvement Process (LQMS-SIP) Towards Accreditation' to support countries in strengthening laboratory services through a stepwise approach toward fulfilling the ISO 15189: 2012 requirements. This approach consists of a three-tiered framework. Tier 1 represents the minimum requirements corresponding to the mandatory criteria for obtaining a licence from the Ministry of Health of the participating country. The next two tiers are quality improvement milestones that are achieved through the implementation of specific quality management system requirements. Laboratories that meet the requirements of the three tiers will be encouraged to apply for accreditation. The Caribbean Regional Organisation for Standards and Quality hosts the LQMS-SIP Secretariat and will work with countries, including the Ministry of Health and stakeholders, including laboratory staff, to coordinate and implement LQMS-SIP activities. The Caribbean Public Health Agency will coordinate and advocate for the LQMS-SIP implementation. This article presents the Caribbean LQMS-SIP framework and describes how it will be implemented among various countries in the region to achieve quality improvement.

  16. A Computerized Data-Capture System for Animal Biosafety Level 4 Laboratories

    PubMed Central

    Bente, Dennis A; Friesen, Jeremy; White, Kyle; Koll, Jordan; Kobinger, Gary P

    2011-01-01

    The restrictive nature of an Animal Biosafety Level 4 (ABSL4) laboratory complicates even simple clinical evaluation including data capture. Typically, clinical data are recorded on paper during procedures, faxed out of the ABSL4, and subsequently manually entered into a computer. This system has many disadvantages including transcriptional errors. Here, we describe the development of a highly customizable, tablet-PC-based computerized data-capture system, allowing reliable collection of observational and clinical data from experimental animals in a restrictive biocontainment setting. A multidisciplinary team with skills in containment laboratory animal science, database design, and software engineering collaborated on the development of this system. The goals were to design an easy-to-use and flexible user interface on a touch-screen tablet PC with user-supportable processes for recovery, full auditing capabilities, and cost effectiveness. The system simplifies data capture, reduces the necessary time in an ABSL4 environment, offers timely reporting and review of data, facilitates statistical analysis, reduces potential of erroneous data entry, improves quality assurance of animal care, and advances the use and refinement of humane endpoints. PMID:22330712

  17. Materials sciences programs: Fiscal year 1994

    NASA Astrophysics Data System (ADS)

    1995-04-01

    The Division of Materials Sciences is located within the DOE in the Office of Basic Energy Sciences. The Division of Materials Sciences is responsible for basic research and research facilities in strategic materials science topics of critical importance to the mission of the Department and its Strategic Plan. Materials Science is an enabling technology. The performance parameters, economics, environmental acceptability and safety of all energy generation, conversion, transmission and conservation technologies are limited by the properties and behavior of materials. The Materials Sciences programs develop scientific understanding of the synergistic relationship amongst the synthesis, processing, structure, properties, behavior, performance and other characteristics of materials. Emphasis is placed on the development of the capability to discover technologically, economically, and environmentally desirable new materials and processes, and the instruments and national user facilities necessary for achieving such progress. Materials Sciences sub-fields include physical metallurgy, ceramics, polymers, solid state and condensed matter physics, materials chemistry, surface science and related disciplines where the emphasis is on the science of materials. This report includes program descriptions for 458 research programs including 216 at 14 DOE National Laboratories, 242 research grants (233 for universities), and 9 Small Business Innovation Research (SBIR) Grants. The report is divided into eight sections. Section A contains all Laboratory projects, Section B has all contract research projects, Section C has projects funded under the SBIR Program, Section D describes the Center of Excellence for the Synthesis and Processing of Advanced Materials and E has information on major user facilities. F contains descriptions of other user facilities; G, a summary of funding levels; and H, indices characterizing research projects.

  18. Materials sciences programs, fiscal year 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-04-01

    The Division of Materials Sciences is located within the DOE in the Office of Basic Energy Sciences. The Division of Materials Sciences is responsible for basic research and research facilities in strategic materials science topics of critical importance to the mission of the Department and its Strategic Plan. Materials Science is an enabling technology. The performance parameters, economics, environmental acceptability and safety of all energy generation, conversion, transmission and conservation technologies are limited by the properties and behavior of materials. The Materials Sciences programs develop scientific understanding of the synergistic relationship amongst the synthesis, processing, structure, properties, behavior, performance and other characteristics of materials. Emphasis is placed on the development of the capability to discover technologically, economically, and environmentally desirable new materials and processes, and the instruments and national user facilities necessary for achieving such progress. Materials Sciences sub-fields include physical metallurgy, ceramics, polymers, solid state and condensed matter physics, materials chemistry, surface science and related disciplines where the emphasis is on the science of materials. This report includes program descriptions for 458 research programs including 216 at 14 DOE National Laboratories, 242 research grants (233 for universities), and 9 Small Business Innovation Research (SBIR) Grants. The report is divided into eight sections. Section A contains all Laboratory projects, Section B has all contract research projects, Section C has projects funded under the SBIR Program, Section D describes the Center of Excellence for the Synthesis and Processing of Advanced Materials and E has information on major user facilities. F contains descriptions of other user facilities; G, a summary of funding levels; and H, indices characterizing research projects.

  19. How to achieve benefit from mission-oriented research: lessons from the U.S. Department of Agriculture and the Naval Research Laboratory

    NASA Astrophysics Data System (ADS)

    Logar, N. J.

    2006-12-01

    Does the research performed by government mission agencies contribute to improved decision-making? Climate research within the U.S. Department of Agriculture (USDA) has the stated goal of providing "optimal benefit" to decision makers at all levels, and the meteorology division of the Department of Defense's Naval Research Laboratory promises research directed towards application. Assuming that research can lead to benefit for decision makers with minimal guidance can lead to irrelevance, wasted effort, and missed opportunities. Moving beyond that assumption leads to critical consideration of the processes creating climate and meteorological science. I report the results of contextual mapping, of research on decision processes, and of interviews with agency scientists and users of science to evaluate their science regimes. In the case of the USDA, scientists do target stakeholders through formal and informal mechanisms, but much of the science does not find use due to institutional constraints, political considerations, and disciplinary inertia. The research results will provide options for closing these policy gaps, such as higher-level stakeholder interaction and better representation of diverse interests. I apply the economic concept of supply and demand to describe where the supply of science provides decision support that matches user demand, and where science policies might miss opportunities or mischaracterize research as useful to a specific user. This analysis leads to increased understanding of how factors such as the definition of scientific problems, hierarchies in science decision-making structures, quality control mechanisms beyond peer review, the distribution of participants in the knowledge production enterprise, and social accountability guide the process of producing useful information.

  20. Integrating NASA's Land Analysis System (LAS) image processing software with an appropriate Geographic Information System (GIS): A review of candidates in the public domain

    NASA Technical Reports Server (NTRS)

    Rochon, Gilbert L.

    1989-01-01

    A user requirements analysis (URA) was undertaken to determine an appropriate public domain Geographic Information System (GIS) software package for potential integration with NASA's LAS (Land Analysis System) 5.0 image processing system. The necessity for a public domain system was underscored by the perceived need for source code access and flexibility in tailoring the GIS to the needs of a heterogeneous group of end-users, and by specific constraints imposed by LAS and its user interface, the Transportable Applications Executive (TAE). Subsequently, a review was conducted of a variety of public domain GIS candidates, including GRASS 3.0, MOSS, IEMIS, and two university-based packages, IDRISI and KBGIS. The review method was a modified version of the GIS evaluation process developed by the Federal Interagency Coordinating Committee on Digital Cartography. One IEMIS-derivative product, the ALBE (AirLand Battlefield Environment) GIS, emerged as the most promising candidate for integration with LAS. IEMIS (Integrated Emergency Management Information System) was developed by the Federal Emergency Management Agency (FEMA). ALBE GIS is currently under development at the Pacific Northwest Laboratory under contract with the U.S. Army Corps of Engineers' Engineering Topographic Laboratory (ETL). Accordingly, recommendations are offered with respect to a potential LAS/ALBE GIS linkage and with respect to further system enhancements, including coordination with the development of the Spatial Analysis and Modeling System (SAMS) GIS and the IDM (Intelligent Data Management) developments in Goddard's National Space Science Data Center.

  1. Operation plan for the data 100/LARS terminal system

    NASA Technical Reports Server (NTRS)

    Bowen, A. J., Jr.

    1980-01-01

    The Data 100/LARS terminal system provides an interface for processing on the IBM 3031 computer system at Purdue University's Laboratory for Applications of Remote Sensing. The environment in which the system is operated and supported is discussed. The general support responsibilities, procedural mechanisms, and training established for the benefit of the system users are defined.

  2. Design, Fabrication, and Characterization of a Microelectromechanical Directional Microphone

    DTIC Science & Technology

    2011-06-01

    Only front-matter fragments survive in this record: figure captions (Figure 5.2, SOIC packaging; Figure 5.3, Laboratory setup) and an abbreviation list including RMS (Root Mean Squared), SOC (System-On-Chip), SOIC (Small Outline Integrated Circuit), SOIMUMPS (Silicon-On-Insulator Multi-User MEMS Process), and SPL (Sound Pressure Level).

  3. Advanced user support programme—TEMPUS IML-2

    NASA Astrophysics Data System (ADS)

    Diefenbach, A.; Kratz, M.; Uffelmann, D.; Willnecker, R.

    1995-05-01

    The DLR Microgravity User Support Centre (MUSC) in Cologne has supported microgravity experiments in the field of materials and life sciences since 1979. At the beginning of user support activities, MUSC tasks comprised basic ground and mission support, whereas present programmes have been expanded to include, for example, powerful telescience and advanced real-time data acquisition capabilities for efficient experiment operation and monitoring. In view of the Space Station era, user support functions will increase further, and additional tasks and growing responsibilities must be covered, e.g. extended science support as well as experiment and facility operations. The user support for TEMPUS IML-2, under contract to the German Space Agency DARA, represents a further step towards the required new generation of ground programmes. TEMPUS is a highly sophisticated new Spacelab multi-user facility for containerless processing of metallic samples: an electromagnetic levitation technique is applied and various experiment diagnosis tools are offered. Experiments from eight U.S. and German investigator groups have been selected for flight on the second International Microgravity Laboratory mission (IML-2) in 1994. Based on the experience gained in the research programme of the DLR Institute for Space Simulation since 1984, MUSC is performing a comprehensive experiment preparation programme in close collaboration with the investigator teams. Complex laboratory equipment has been built up for technology development and experiment preparation, and new experiment techniques have been developed for experiment verification tests. The MUSC programme includes thorough analysis and testing of the scientific requirements of every proposed experiment with respect to the facility hardware and software capabilities. In addition, studies of the experiment-specific operation requirements have been performed and suitable telescience scenarios were analysed. The present paper gives a survey of the TEMPUS user support tasks, emphasizing the advanced science support activities that are considered significant for future ground programmes.

  4. Quality indicators in laboratory medicine: a fundamental tool for quality and patient safety.

    PubMed

    Plebani, Mario; Sciacovelli, Laura; Marinova, Mariela; Marcuccitti, Jessica; Chiozza, Maria Laura

    2013-09-01

    The identification of reliable quality indicators (QIs) is a crucial step in enabling users to quantify the quality of laboratory services. The current lack of attention to extra-laboratory factors is in stark contrast with the body of evidence pointing to the multitude of errors that continue to occur in the pre- and post-analytical phases. Different QIs and terminologies are currently in use and, therefore, there is a need to harmonize the proposed QIs. A model of quality indicators (MQI) has been consensually developed by a group of clinical laboratories according to a project launched by a working group of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC). The model includes 57 QIs related to key processes (35 pre-, 7 intra- and 15 post-analytical phases) and 3 related to support processes. The developed MQI and the data collected provide evidence of the feasibility of the project to harmonize currently available QIs, but further efforts should be made to involve more clinical laboratories and to collect a more consistent body of data. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  5. Assessment of Application Technology of Natural User Interfaces in the Creation of a Virtual Chemical Laboratory

    ERIC Educational Resources Information Center

    Jagodzinski, Piotr; Wolski, Robert

    2015-01-01

    Natural User Interfaces (NUI) are now widely used in electronic devices such as smartphones, tablets and gaming consoles. We have tried to apply this technology in the teaching of chemistry in middle school and high school. A virtual chemical laboratory was developed in which students can simulate the performance of laboratory activities similar…

  6. Development and implementation of a custom integrated database with dashboards to assist with hematopathology specimen triage and traffic

    PubMed Central

    Azzato, Elizabeth M.; Morrissette, Jennifer J. D.; Halbiger, Regina D.; Bagg, Adam; Daber, Robert D.

    2014-01-01

    Background: At some institutions, including ours, bone marrow aspirate specimen triage is complex, with hematopathology triage decisions that need to be communicated to downstream ancillary testing laboratories and many specimen aliquot transfers that are handled outside of the laboratory information system (LIS). We developed a custom integrated database with dashboards to facilitate and streamline this workflow. Methods: We developed user-specific dashboards that allow entry of specimen information by technologists in the hematology laboratory, have custom scripting to present relevant information for the hematopathology service and ancillary laboratories and allow communication of triage decisions from the hematopathology service to other laboratories. These dashboards are web-accessible on the local intranet and accessible from behind the hospital firewall on a computer or tablet. Secure user access and group rights ensure that relevant users can edit or access appropriate records. Results: After database and dashboard design, two-stage beta-testing and user education was performed, with the first focusing on technologist specimen entry and the second on downstream users. Commonly encountered issues and user functionality requests were resolved with database and dashboard redesign. Final implementation occurred within 6 months of initial design; users report improved triage efficiency and reduced need for interlaboratory communications. Conclusions: We successfully developed and implemented a custom database with dashboards that facilitates and streamlines our hematopathology bone marrow aspirate triage. This provides an example of a possible solution to specimen communications and traffic that are outside the purview of a standard LIS. PMID:25250187

  7. Canadian macromolecular crystallography facility: a suite of fully automated beamlines.

    PubMed

    Grochulski, Pawel; Fodje, Michel; Labiuk, Shaunivan; Gorin, James; Janzen, Kathryn; Berg, Russ

    2012-06-01

    The Canadian Light Source is a 2.9 GeV national synchrotron radiation facility located on the University of Saskatchewan campus in Saskatoon. The small-gap in-vacuum undulator illuminated beamline, 08ID-1, together with the bending magnet beamline, 08B1-1, constitute the Canadian Macromolecular Crystallography Facility (CMCF). The CMCF provides service to more than 50 Principal Investigators in Canada and the United States. Up to 25% of the beam time is devoted to commercial users and the general user program is guaranteed up to 55% of the useful beam time through a peer-review process. CMCF staff provides "Mail-In" crystallography service to users with the highest scored proposals. Both beamlines are equipped with very robust end-stations including on-axis visualization systems, Rayonix 300 CCD series detectors and Stanford-type robotic sample auto-mounters. MxDC, an in-house developed beamline control system, is integrated with a data processing module, AutoProcess, allowing full automation of data collection and data processing with minimal human intervention. Sample management and remote monitoring of experiments are enabled through interaction with a Laboratory Information Management System developed at the facility.

  8. The Technology Information Environment with Industry{trademark} system description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detry, R.; Machin, G.

    The Technology Information Environment with Industry (TIE-In{trademark}) provides users with controlled access to distributed laboratory resources that are packaged in intelligent user interfaces. These interfaces help users access resources without requiring the user to have technical or computer expertise. TIE-In utilizes existing, proven technologies such as the Kerberos authentication system, X-Windows, and UNIX sockets. A Front End System (FES) authenticates users and allows them to register for resources and subsequently access them. The FES also stores status and accounting information, and provides an automated method for the resource owners to recover costs from users. The resources available through TIE-In are typically laboratory-developed applications that are used to help design, analyze, and test components in the nation's nuclear stockpile. Many of these applications can also be used by US companies for non-weapons-related work. TIE-In allows these industry partners to obtain laboratory-developed technical solutions without requiring them to duplicate the technical resources (people, hardware, and software) at Sandia.

  9. Laboratory process control using natural language commands from a personal computer

    NASA Technical Reports Server (NTRS)

    Will, Herbert A.; Mackin, Michael A.

    1989-01-01

    PC software is described which provides flexible natural language process control capability with an IBM PC or compatible machine. Hardware requirements include the PC, and suitable hardware interfaces to all controlled devices. Software required includes the Microsoft Disk Operating System (MS-DOS) operating system, a PC-based FORTRAN-77 compiler, and user-written device drivers. Instructions for use of the software are given as well as a description of an application of the system.
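
    The original system ran on MS-DOS with FORTRAN-77 device drivers, so no code from it is reproduced here; the following minimal Python sketch only illustrates the general idea of dispatching keyword-matched natural language commands to device-driver routines. All device names and commands are invented.

        def set_furnace_temperature(value):
            """Stand-in for a real device-driver call."""
            print(f"furnace setpoint -> {value} C")

        def open_valve(value):
            """Stand-in for a real device-driver call."""
            print(f"valve {int(value)} opened")

        # Map pairs of command keywords to driver routines.
        COMMANDS = {
            ("set", "furnace"): set_furnace_temperature,
            ("open", "valve"): open_valve,
        }

        def dispatch(command: str) -> None:
            """Match known keyword pairs anywhere in the command; pass along any number found."""
            words = command.lower().split()
            number = next((float(w) for w in words if w.replace(".", "", 1).isdigit()), None)
            for keywords, handler in COMMANDS.items():
                if all(k in words for k in keywords):
                    handler(number)
                    return
            print(f"unrecognized command: {command!r}")

        dispatch("please set the furnace to 450")
        dispatch("open valve 3")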

  10. Four principles for user interface design of computerised clinical decision support systems.

    PubMed

    Kanstrup, Anne Marie; Christiansen, Marion Berg; Nøhr, Christian

    2011-01-01

    The paper presents results from a design research project of a user interface (UI) for a Computerised Clinical Decision Support System (CDSS). The ambition has been to design Human-Computer Interaction (HCI) that can minimise medication errors. Through an iterative design process, a digital prototype for the prescription of medicine has been developed. This paper presents results from the formative evaluation of the prototype, conducted in a simulation laboratory with ten participating physicians. Data from the simulation are analysed using theory on how users perceive information. The conclusion is a model which sums up four principles of interaction for the design of CDSS. The four principles for the design of user interfaces for CDSS are summarised as four A's: All in one, At a glance, At hand and Attention. The model emphasises integration of all four interaction principles in the design of user interfaces for CDSS, i.e. it is an integrated model which we suggest as a guide for interaction design when working to prevent medication errors.

  11. USGS Blind Sample Project: monitoring and evaluating laboratory analytical quality

    USGS Publications Warehouse

    Ludtke, Amy S.; Woodworth, Mark T.

    1997-01-01

    The U.S. Geological Survey (USGS) collects and disseminates information about the Nation's water resources. Surface- and ground-water samples are collected and sent to USGS laboratories for chemical analyses. The laboratories identify and quantify the constituents in the water samples. Random and systematic errors occur during sample handling, chemical analysis, and data processing. Although all errors cannot be eliminated from measurements, the magnitude of their uncertainty can be estimated and tracked over time. Since 1981, the USGS has operated an independent, external, quality-assurance project called the Blind Sample Project (BSP). The purpose of the BSP is to monitor and evaluate the quality of laboratory analytical results through the use of double-blind quality-control (QC) samples. The information provided by the BSP assists the laboratories in detecting and correcting problems in the analytical procedures. The information also can aid laboratory users in estimating the extent that laboratory errors contribute to the overall errors in their environmental data.

  12. Efficiency in pathology laboratories: a survey of operations management in NHS bacteriology.

    PubMed

    Szczepura, A K

    1991-01-01

    In recent years pathology laboratory services in the U.K. have experienced large increases in demand. But the extent to which U.K. laboratories have introduced controls to limit unnecessary procedures within the laboratory was previously unclear. This paper presents the results of a survey of all 343 NHS bacteriology laboratories which records the extent to which such operations management controls are now in place. The survey shows large differences between laboratories. Quality controls over inputs, the use of screening tests as a culture substitute, the use of direct susceptibility testing, controls over routine antibiotic susceptibility testing, and controls over reporting of results all vary widely. The survey also records the prevalence of hospital antibiotic policies, the extent to which laboratories produce antibiograms for user clinicians, the degree of computerisation in data handling, and the degree of automation in processing specimens. Finally, the survey uncovers a large variation between NHS labs in the percentage of bacteriology samples which prove positive and lead to antibiotic susceptibility tests being carried out.

  13. V-Sipal - a Virtual Laboratory for Satellite Image Processing and Analysis

    NASA Astrophysics Data System (ADS)

    Buddhiraju, K. M.; Eeti, L.; Tiwari, K. K.

    2011-09-01

    In this paper a virtual laboratory for Satellite Image Processing and Analysis (v-SIPAL), being developed at the Indian Institute of Technology Bombay, is described. v-SIPAL comprises a set of experiments that are normally carried out by students learning digital processing and analysis of satellite images using commercial software. Currently, the experiments available on the server include Image Viewer, Image Contrast Enhancement, Image Smoothing, Edge Enhancement, Principal Component Transform, Texture Analysis by the Co-occurrence Matrix method, Image Indices, Color Coordinate Transforms, Fourier Analysis, Mathematical Morphology, Unsupervised Image Classification, Supervised Image Classification and Accuracy Assessment. The virtual laboratory includes a theory module for each option of every experiment, a description of the procedure to perform each experiment, the menu to choose and perform the experiment, a module on interpretation of results when performed with a given image and pre-specified options, a bibliography, links to useful internet resources and user feedback. The user can upload his/her own images for performing the experiments and can also reuse outputs of one experiment in another experiment where applicable. Some of the other experiments currently under development include georeferencing of images, data fusion, feature evaluation by divergence and J-M distance, image compression, wavelet image analysis and change detection. Additions to the theory module include self-assessment quizzes, audio-video clips on selected concepts, and a discussion of elements of visual image interpretation. v-SIPAL is at the stage of internal evaluation within IIT Bombay and will soon be open to selected educational institutions in India for evaluation.
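
    As a flavor of the kind of exercise listed above, here is a minimal sketch of linear contrast stretching in Python/NumPy; the percentile bounds and the synthetic input band are illustrative assumptions, not part of the published v-SIPAL system.

        import numpy as np

        def contrast_stretch(img: np.ndarray, lo_pct: float = 2, hi_pct: float = 98) -> np.ndarray:
            """Rescale intensities so the lo_pct..hi_pct percentile range spans 0..255."""
            lo, hi = np.percentile(img, [lo_pct, hi_pct])
            stretched = (img.astype(float) - lo) / max(hi - lo, 1e-9) * 255.0
            return np.clip(stretched, 0, 255).astype(np.uint8)

        # Synthetic low-contrast "band" standing in for an uploaded satellite image.
        band = (np.random.rand(256, 256) * 120 + 60).astype(np.uint8)
        print(band.min(), band.max(), contrast_stretch(band).min(), contrast_stretch(band).max())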

  14. Resolving Complex Research Data Management Issues in Biomedical Laboratories: Qualitative Study of an Industry-Academia Collaboration

    PubMed Central

    Myneni, Sahiti; Patel, Vimla L.; Bova, G. Steven; Wang, Jian; Ackerman, Christopher F.; Berlinicke, Cynthia A.; Chen, Steve H.; Lindvall, Mikael; Zack, Donald J.

    2016-01-01

    This paper describes a distributed collaborative effort between industry and academia to systematize data management in an academic biomedical laboratory. The heterogeneous and voluminous nature of research data created in biomedical laboratories makes information management difficult and research unproductive. One such collaborative effort was evaluated over a period of four years using data collection methods including ethnographic observations, semi-structured interviews, web-based surveys, progress reports, conference call summaries, and face-to-face group discussions. Data were analyzed using qualitative methods of data analysis to 1) characterize specific problems faced by biomedical researchers with traditional information management practices, 2) identify intervention areas to introduce a new research information management system called Labmatrix, and finally to 3) evaluate and delineate important general collaboration (intervention) characteristics that can optimize outcomes of an implementation process in biomedical laboratories. Results emphasize the importance of end user perseverance, human-centric interoperability evaluation, and demonstration of return on investment of effort and time of laboratory members and industry personnel for success of implementation process. In addition, there is an intrinsic learning component associated with the implementation process of an information management system. Technology transfer experience in a complex environment such as the biomedical laboratory can be eased with use of information systems that support human and cognitive interoperability. Such informatics features can also contribute to successful collaboration and hopefully to scientific productivity. PMID:26652980

  15. ATLAS with CARIBU: A laboratory portrait

    DOE PAGES

    Pardo, Richard C.; Savard, Guy; Janssens, Robert V. F.

    2016-03-21

    The Argonne Tandem Linac Accelerator System (ATLAS) is the world's first superconducting accelerator for projectiles heavier than the electron. This unique system is a U.S. Department of Energy (DOE) national user research facility open to scientists from all over the world. It is located within the Physics Division at Argonne National Laboratory and is one of five large scientific user facilities at the laboratory.

  16. DESIGN AND EVALUATION OF INDIVIDUAL ELEMENTS OF THE INTERFACE FOR AN AGRICULTURAL MACHINE.

    PubMed

    Rakhra, Aadesh K; Mann, Danny D

    2018-01-29

    If a user-centered approach is not used to design information displays, the quantity and quality of information presented to the user may not match the needs of the user, or it may exceed the capability of the human operator for processing and using that information. The result may be an excessive mental workload and reduced situation awareness of the operator, which can negatively affect machine performance and operational outcomes. The increasing use of technology in agricultural machines may expose the human operator to excessive and undesirable information if the operator's information needs and information processing capabilities are ignored. In this study, a user-centered approach was used to design specific interface elements for an agricultural air seeder. Designs of the interface elements were evaluated in a laboratory environment by developing high-fidelity prototypes. Evaluations of the user interface elements yielded significant improvement in situation awareness (up to 11%; overall mean difference = 5.0 (4.8%), 95% CI (6.4728, 3.5939), p < 0.0001). Mental workload was reduced by up to 19.7% (overall mean difference = -5.2 (-7.9%), n = 30, α = 0.05). Study participants rated the overall performance of the newly designed user-centered interface elements higher in comparison to the previous designs (overall mean difference = 27.3 (189.8%), 99% CI (35.150, 19.384), p < 0.0001). Copyright © by the American Society of Agricultural Engineers.

  17. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet the wide-ranging needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.

  18. A LabVIEW based template for user created experiment automation.

    PubMed

    Kim, D J; Fisk, Z

    2012-12-01

    We have developed an expandable software template to automate user-created experiments. The LabVIEW based template is easily modifiable to combine user-created measurements, controls, and data logging with virtually any type of laboratory equipment. We use reentrant sequential selection to implement a sequence script, making it possible to wrap a long series of user-created experiments and execute them in sequence. Details of the software structure and application examples for a scanning probe microscope and automated transport experiments using custom-built laboratory electronics and a cryostat are described.
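
    The template itself is LabVIEW and is not reproduced here; the following Python sketch only illustrates the underlying idea of wrapping user-created measurement steps into a script that executes them in sequence. All step names are hypothetical.

        from dataclasses import dataclass, field
        from typing import Callable, Dict, List

        @dataclass
        class ExperimentSequence:
            steps: List[Callable[[], Dict]] = field(default_factory=list)

            def add(self, step: Callable[[], Dict]) -> "ExperimentSequence":
                self.steps.append(step)   # chainable, loosely like wiring steps in sequence
                return self

            def run(self) -> List[Dict]:
                return [step() for step in self.steps]   # execute and log each step in order

        def ramp_temperature() -> Dict:
            return {"step": "ramp", "kelvin": 4.2}       # stand-in for instrument I/O

        def measure_resistance() -> Dict:
            return {"step": "resistance", "ohms": 42.0}  # stand-in for instrument I/O

        log = ExperimentSequence().add(ramp_temperature).add(measure_resistance).run()
        print(log)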

  19. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil,Benny Manuel; Ballance, Robert; Haskell, Karen

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  20. IMAGES: An interactive image processing system

    NASA Technical Reports Server (NTRS)

    Jensen, J. R.

    1981-01-01

    The IMAGES interactive image processing system was created specifically for undergraduate remote sensing education in geography. The system is interactive, relatively inexpensive to operate, almost hardware independent, and responsive to numerous users at one time in a time-sharing mode. Most important, it provides a medium whereby theoretical remote sensing principles discussed in lecture may be reinforced in laboratory as students perform computer-assisted image processing. In addition to its use in academic and short course environments, the system has also been used extensively to conduct basic image processing research. The flow of information through the system is discussed including an overview of the programs.

  1. User Interactive Software for Analysis of Human Physiological Data

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.; Toscano, William; Taylor, Bruce C.; Acharya, Soumydipta

    2006-01-01

    Ambulatory physiological monitoring has been used to study human health and performance in space and in a variety of Earth-based environments (e.g., military aircraft, armored vehicles, small groups in isolation, and patients). Large, multi-channel data files are typically recorded in these environments, and these files often require the removal of contaminated data prior to processing and analyses. Physiological data processing can now be performed with user-friendly, interactive software developed by the Ames Psychophysiology Research Laboratory. This software, which runs on a Windows platform, contains various signal-processing routines for both time- and frequency-domain data analyses (e.g., peak detection, differentiation and integration, digital filtering, adaptive thresholds, Fast Fourier Transform power spectrum, auto-correlation, etc.). Data acquired with any ambulatory monitoring system that provides text or binary file format are easily imported to the processing software. The application provides a graphical user interface where one can manually select and correct data artifacts utilizing linear and zero interpolation and adding trigger points for missed peaks. Block and moving average routines are also provided for data reduction. Processed data in numeric and graphic format can be exported to Excel. This software, PostProc (for post-processing), requires the Dadisp engineering spreadsheet (DSP Development Corp), or equivalent, for implementation. Specific processing routines were written for electrocardiography, electroencephalography, electromyography, blood pressure, skin conductance level, impedance cardiography (cardiac output, stroke volume, thoracic fluid volume), temperature, and respiration.
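
    As a rough illustration of two routines of the kind listed above (digital filtering and peak detection), here is a minimal sketch using SciPy rather than the Dadisp-based PostProc tool itself; the sampling rate, cutoff, and thresholds are illustrative assumptions.

        import numpy as np
        from scipy.signal import butter, filtfilt, find_peaks

        fs = 250.0                                    # assumed sampling rate, Hz
        t = np.arange(0, 10, 1 / fs)
        ecg_like = np.sin(2 * np.pi * 1.2 * t) ** 15 + 0.1 * np.random.randn(t.size)

        b, a = butter(4, 40, btype="low", fs=fs)      # 4th-order low-pass at 40 Hz
        clean = filtfilt(b, a, ecg_like)              # zero-phase digital filtering

        peaks, _ = find_peaks(clean, height=0.5, distance=0.4 * fs)   # R-wave-like peaks
        print(f"detected {peaks.size} beats, mean interval {np.diff(peaks).mean() / fs:.2f} s")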

  2. Development and implementation of the Caribbean Laboratory Quality Management Systems Stepwise Improvement Process (LQMS-SIP) Towards Accreditation

    PubMed Central

    Alemnji, George; Edghill, Lisa; Wallace-Sankarsingh, Sacha; Albalak, Rachel; Cognat, Sebastien; Nkengasong, John; Gabastou, Jean-Marc

    2017-01-01

    Background: Implementing quality management systems and accrediting laboratories in the Caribbean has been a challenge. Objectives: We report the development of a stepwise process for quality systems improvement in the Caribbean Region. Methods: The Caribbean Laboratory Stakeholders met under a joint Pan American Health Organization/US Centers for Disease Control and Prevention initiative and developed a user-friendly framework called ‘Laboratory Quality Management System – Stepwise Improvement Process (LQMS-SIP) Towards Accreditation’ to support countries in strengthening laboratory services through a stepwise approach toward fulfilling the ISO 15189: 2012 requirements. Results: This approach consists of a three-tiered framework. Tier 1 represents the minimum requirements corresponding to the mandatory criteria for obtaining a licence from the Ministry of Health of the participating country. The next two tiers are quality improvement milestones that are achieved through the implementation of specific quality management system requirements. Laboratories that meet the requirements of the three tiers will be encouraged to apply for accreditation. The Caribbean Regional Organisation for Standards and Quality hosts the LQMS-SIP Secretariat and will work with countries, including the Ministry of Health and stakeholders, including laboratory staff, to coordinate and implement LQMS-SIP activities. The Caribbean Public Health Agency will coordinate and advocate for the LQMS-SIP implementation. Conclusion: This article presents the Caribbean LQMS-SIP framework and describes how it will be implemented among various countries in the region to achieve quality improvement. PMID:28879149

  3. Classification of Movement and Inhibition Using a Hybrid BCI.

    PubMed

    Chmura, Jennifer; Rosing, Joshua; Collazos, Steven; Goodwin, Shikha J

    2017-01-01

    Brain-computer interfaces (BCIs) are an emerging technology capable of turning brain electrical activity into commands for an external device. Motor imagery (MI), in which a person imagines a motion without executing it, is widely employed in BCI devices for motor control because of the endogenous origin of its neural control mechanisms and the similarity in brain activation to actual movements. Challenges with translating an MI-BCI into a practical device used outside laboratories include the extensive training required, often due to poor user engagement and visual feedback response delays; poor user flexibility/freedom to time the execution/inhibition of their movements and to control the movement type (right arm vs. left leg) and characteristics (reaching vs. grabbing); and high false-positive rates of motion control. Solutions to improve sensorimotor activation and user performance of MI-BCIs have been explored. Virtual reality (VR) motor-execution tasks have replaced simpler visual feedback (smiling faces, arrows) and have solved this problem to an extent. Hybrid BCIs (hBCIs) implementing an additional control signal to MI have improved user control capabilities to a limited extent. These hBCIs either fail to allow patients to gain asynchronous control of their movements or have a high false-positive rate. We propose an immersive VR environment which provides visual feedback that is both engaging and immediate, but also uniquely engages a different cognitive process in the patient that generates event-related potentials (ERPs). These ERPs provide a key executive function for the users to execute/inhibit movements. Additionally, we propose signal processing strategies and machine learning algorithms to move BCIs toward developing long-term signal stability in patients with distinctive brain signals and capabilities to control motor signals. The hBCI itself and the VR environment we propose would help to move BCI technology outside laboratory environments for motor rehabilitation in hospitals, and potentially for controlling a prosthetic.
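
    ERP-based control signals of the kind proposed here are commonly classified with linear discriminant analysis; the sketch below shows that generic, standard approach on synthetic epochs and is not the authors' pipeline.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_epochs, n_channels, n_times = 200, 8, 64

        X = rng.normal(size=(n_epochs, n_channels, n_times))   # synthetic EEG epochs
        y = rng.integers(0, 2, n_epochs)                       # 1 = intended execution cue
        X[y == 1, :, 30:40] += 0.5                             # crude ERP-like deflection

        scores = cross_val_score(LinearDiscriminantAnalysis(),
                                 X.reshape(n_epochs, -1), y, cv=5)
        print(f"mean cross-validated accuracy: {scores.mean():.2f}")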

  4. Classification of Movement and Inhibition Using a Hybrid BCI

    PubMed Central

    Chmura, Jennifer; Rosing, Joshua; Collazos, Steven; Goodwin, Shikha J.

    2017-01-01

    Brain-computer interfaces (BCIs) are an emerging technology capable of turning brain electrical activity into commands for an external device. Motor imagery (MI), in which a person imagines a motion without executing it, is widely employed in BCI devices for motor control because of the endogenous origin of its neural control mechanisms and the similarity in brain activation to actual movements. Challenges with translating an MI-BCI into a practical device used outside laboratories include the extensive training required, often due to poor user engagement and visual feedback response delays; poor user flexibility/freedom to time the execution/inhibition of their movements and to control the movement type (right arm vs. left leg) and characteristics (reaching vs. grabbing); and high false-positive rates of motion control. Solutions to improve sensorimotor activation and user performance of MI-BCIs have been explored. Virtual reality (VR) motor-execution tasks have replaced simpler visual feedback (smiling faces, arrows) and have solved this problem to an extent. Hybrid BCIs (hBCIs) implementing an additional control signal to MI have improved user control capabilities to a limited extent. These hBCIs either fail to allow patients to gain asynchronous control of their movements or have a high false-positive rate. We propose an immersive VR environment which provides visual feedback that is both engaging and immediate, but also uniquely engages a different cognitive process in the patient that generates event-related potentials (ERPs). These ERPs provide a key executive function for the users to execute/inhibit movements. Additionally, we propose signal processing strategies and machine learning algorithms to move BCIs toward developing long-term signal stability in patients with distinctive brain signals and capabilities to control motor signals. The hBCI itself and the VR environment we propose would help to move BCI technology outside laboratory environments for motor rehabilitation in hospitals, and potentially for controlling a prosthetic. PMID:28860986

  5. Novel Networked Remote Laboratory Architecture for Open Connectivity Based on PLC-OPC-LabVIEW-EJS Integration. Application in Remote Fuzzy Control and Sensors Data Acquisition.

    PubMed

    González, Isaías; Calderón, Antonio José; Mejías, Andrés; Andújar, José Manuel

    2016-10-31

    In this paper the design and implementation of a network for integrating Programmable Logic Controllers (PLC), the Object-Linking and Embedding for Process Control protocol (OPC) and the open-source Easy Java Simulations (EJS) package is presented. A LabVIEW interface and the Java-Internet-LabVIEW (JIL) server complete the scheme for data exchange. This configuration allows the user to remotely interact with the PLC. Such integration can be considered a novelty in the scientific literature for remote control and sensor data acquisition of industrial plants. An experimental application devoted to remote laboratories is developed to demonstrate the feasibility and benefits of the proposed approach. The experiment to be conducted is the parameterization and supervision of a fuzzy controller of a DC servomotor. The graphical user interface has been developed with EJS and the fuzzy control is carried out by our own PLC. In fact, the distinctive features of the proposed novel network application are the integration of the OPC protocol to share information with the PLC and the application under control. The user can perform the tuning of the controller parameters online and observe in real time the effect on the servomotor behavior. The target group is remote engineering users, specifically in control- and automation-related tasks. The proposed system architecture is described and experimental results are presented.
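
    The paper's scheme uses the classic OPC protocol with LabVIEW and EJS; as a present-day approximation of the same read/write pattern, this sketch uses OPC UA via the python-opcua package instead. The endpoint URL and node identifiers are invented for illustration.

        from opcua import Client

        client = Client("opc.tcp://plc.example.local:4840")   # hypothetical PLC endpoint
        client.connect()
        try:
            setpoint = client.get_node("ns=2;s=Servo.FuzzySetpoint")  # invented node ids
            speed = client.get_node("ns=2;s=Servo.MeasuredSpeed")

            setpoint.set_value(120.0)                 # tune the controller remotely
            print("measured speed:", speed.get_value())
        finally:
            client.disconnect()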

  6. Novel Networked Remote Laboratory Architecture for Open Connectivity Based on PLC-OPC-LabVIEW-EJS Integration. Application in Remote Fuzzy Control and Sensors Data Acquisition

    PubMed Central

    González, Isaías; Calderón, Antonio José; Mejías, Andrés; Andújar, José Manuel

    2016-01-01

    In this paper the design and implementation of a network for integrating Programmable Logic Controllers (PLC), the Object-Linking and Embedding for Process Control protocol (OPC) and the open-source Easy Java Simulations (EJS) package is presented. A LabVIEW interface and the Java-Internet-LabVIEW (JIL) server complete the scheme for data exchange. This configuration allows the user to remotely interact with the PLC. Such integration can be considered a novelty in the scientific literature for remote control and sensor data acquisition of industrial plants. An experimental application devoted to remote laboratories is developed to demonstrate the feasibility and benefits of the proposed approach. The experiment to be conducted is the parameterization and supervision of a fuzzy controller of a DC servomotor. The graphical user interface has been developed with EJS and the fuzzy control is carried out by our own PLC. In fact, the distinctive features of the proposed novel network application are the integration of the OPC protocol to share information with the PLC and the application under control. The user can perform the tuning of the controller parameters online and observe in real time the effect on the servomotor behavior. The target group is remote engineering users, specifically in control- and automation-related tasks. The proposed system architecture is described and experimental results are presented. PMID:27809229

  7. epiPATH: an information system for the storage and management of molecular epidemiology data from infectious pathogens.

    PubMed

    Amadoz, Alicia; González-Candelas, Fernando

    2007-04-20

    Most research scientists working in the fields of molecular epidemiology, population and evolutionary genetics are confronted with the management of large volumes of data. Moreover, the data used in studies of infectious diseases are complex and usually derive from different institutions such as hospitals or laboratories. Since no public database scheme incorporating clinical and epidemiological information about patients and molecular information about pathogens is currently available, we have developed an information system, composed of a main database and a web-based interface, which integrates both types of data and satisfies requirements of good organization, simple accessibility, data security and multi-user support. From the moment a patient arrives at a hospital or health centre until the processing and analysis of molecular sequences obtained from infectious pathogens in the laboratory, a large amount of information is collected from different sources. We have divided the most relevant data into 12 conceptual modules around which we have organized the database schema. Our schema is comprehensive: it covers many aspects of sample sources, samples, laboratory processes, molecular sequences, phylogenetics results, clinical tests and results, clinical information, treatments, pathogens, transmissions, outbreaks and bibliographic information. Communication between end-users and the selected Relational Database Management System (RDBMS) is carried out by default through a command-line window or through a user-friendly, web-based interface which provides access and management tools for the data. epiPATH is an information system for managing clinical and molecular information from infectious diseases. It facilitates daily work related to infectious pathogens and sequences obtained from them. This software is intended for local installation in order to safeguard private data and provides advanced SQL users with the flexibility to adapt it to their needs. The database schema, tool scripts and web-based interface are free software, but data stored in our database server are not publicly available. epiPATH is distributed under the terms of the GNU General Public License. More details about epiPATH can be found at http://genevo.uv.es/epipath.
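
    The epiPATH schema itself is not reproduced in the abstract, so the sketch below only illustrates the general pattern of such a schema, relational tables linking samples to the molecular sequences derived from them, using Python's sqlite3. Table and column names are invented, not taken from epiPATH.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE sample (
            sample_id   INTEGER PRIMARY KEY,
            patient_ref TEXT NOT NULL,          -- anonymized link to the clinical record
            collected   DATE,
            source      TEXT                    -- e.g. hospital or laboratory of origin
        );
        CREATE TABLE sequence (
            seq_id     INTEGER PRIMARY KEY,
            sample_id  INTEGER REFERENCES sample(sample_id),
            gene       TEXT,
            bases      TEXT
        );
        """)
        con.execute("INSERT INTO sample VALUES (1, 'P-0042', '2006-11-03', 'Hospital A')")
        con.execute("INSERT INTO sequence VALUES (1, 1, 'pol', 'ACGT')")
        row = con.execute("""SELECT s.patient_ref, q.gene FROM sample s
                             JOIN sequence q USING (sample_id)""").fetchone()
        print(row)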

  8. The ideal laboratory information system.

    PubMed

    Sepulveda, Jorge L; Young, Donald S

    2013-08-01

    Laboratory information systems (LIS) are critical components of the operation of clinical laboratories. However, the functionalities of LIS have lagged significantly behind the capacities of current hardware and software technologies, while the complexity of the information produced by clinical laboratories has been increasing over time and will soon undergo rapid expansion with the use of new, high-throughput and high-dimensionality laboratory tests. In the broadest sense, LIS are essential to manage the flow of information between health care providers, patients, and laboratories and should be designed to optimize not only laboratory operations but also personalized clinical care. Objective: To list suggestions for designing LIS with the goal of optimizing the operation of clinical laboratories while improving clinical care by intelligent management of laboratory information. Data sources: Literature review, interviews with laboratory users, and personal experience and opinion. Conclusions: Laboratory information systems can improve laboratory operations and improve patient care. Specific suggestions for improving the function of LIS are listed under the following sections: (1) Information Security, (2) Test Ordering, (3) Specimen Collection, Accessioning, and Processing, (4) Analytic Phase, (5) Result Entry and Validation, (6) Result Reporting, (7) Notification Management, (8) Data Mining and Cross-sectional Reports, (9) Method Validation, (10) Quality Management, (11) Administrative and Financial Issues, and (12) Other Operational Issues.

  9. Database Access Manager for the Software Engineering Laboratory (DAMSEL) user's guide

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Operating instructions for the Database Access Manager for the Software Engineering Laboratory (DAMSEL) system are presented. Step-by-step instructions for performing various data entry and report generation activities are included. Sample sessions showing the user interface display screens are also included. Instructions for generating reports are accompanied by sample outputs for each of the reports. The document groups the available software functions by the classes of users that may access them.

  10. Purple Computational Environment With Mappings to ACE Requirements for the General Availability User Environment Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barney, B; Shuler, J

    2006-08-21

    Purple is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Lawrence Livermore National Laboratory (LLNL). The Purple Computational Environment documents the capabilities and the environment provided for the FY06 LLNL Level 1 General Availability Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories, but also documents needs of the LLNL and Alliance users working in the unclassified environment. Additionally, the Purple Computational Environment maps the provided capabilities to the Trilab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the General Availability user environment capabilities of the ASC community. Appendix A lists these requirements and includes a description of ACE requirements met and those requirements that are not met for each section of this document. The Purple Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the Tri-lab community.

  11. Technology demonstration of space intravehicular automation and robotics

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Barker, L. Keith

    1994-01-01

    Automation and robotic technologies are being developed and capabilities demonstrated which would increase the productivity of microgravity science and materials processing in the space station laboratory module, especially when the crew is not present. The Automation Technology Branch at NASA Langley has been working in the area of intravehicular automation and robotics (IVAR) to provide a user-friendly development facility, to determine customer requirements for automated laboratory systems, and to improve the quality and efficiency of commercial production and scientific experimentation in space. This paper will describe the IVAR facility and present the results of a demonstration using a simulated protein crystal growth experiment inside a full-scale mockup of the space station laboratory module using a unique seven-degree-of-freedom robot.

  12. Virtual Partnerships in Research and Education.

    ERIC Educational Resources Information Center

    Payne, Deborah A.; Keating, Kelly A.; Myers, James D.

    The William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) at the Pacific Northwest National Laboratory (Washington) is a collaborative user facility with many unique scientific capabilities. The EMSL expects to support many of its remote users and collaborators by electronic means and is creating a collaborative environment for this…

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saunders, P.

    The majority of general-purpose low-temperature handheld radiation thermometers are severely affected by the size-of-source effect (SSE). Calibration of these instruments is pointless unless the SSE is accounted for in the calibration process. Traditional SSE measurement techniques, however, are costly and time consuming, and because the instruments are direct-reading in temperature, traditional SSE results are not easily interpretable, particularly by the general user. This paper describes a simplified method for measuring the SSE, suitable for second-tier calibration laboratories and requiring no additional equipment, and proposes a means of reporting SSE results on a calibration certificate that should be easily understood by the non-specialist user.

  14. AMPA experimental communications systems

    NASA Technical Reports Server (NTRS)

    Beckerman, D.; Fass, S.; Keon, T.; Sielman, P.

    1982-01-01

    The program was conducted to demonstrate the satellite communication advantages of Adaptive Phased Array Technology. A laboratory-based experiment was designed and implemented to demonstrate a low earth orbit satellite communications system. Using a 32-element L-band phased array augmented with 4 sets of weights (2 for reception and 2 for transmission) and a high-speed digital processing system, and operating against multiple user terminals and interferers, the AMPA system demonstrated: communications with austere user terminals, frequency reuse, communications in the face of interference, and geolocation. The program and experiment objectives are described, the system hardware and software/firmware are defined, and the tests performed and the resulting test data are presented.

  15. When is it time to get married? Or when should the assay user and the assay developer collaborate?

    PubMed Central

    Swan, S H; Lasley, B L

    1991-01-01

    Hormone assays are being developed in the laboratory to detect specific molecular markers in nonclinical populations. Epidemiology is increasingly using these assays to improve the precision with which disease processes and exposures can be defined. This growing body of molecular epidemiology requires a high degree of cooperation between the assay developer and the assay user. We draw on our experience in using a sensitive hormone assay for the detection of early pregnancy via urinary human chorionic gonadotropin to illustrate these points. We conclude that this collaborative effort, in addition to making this study possible, has provided unexpected rewards. PMID:1954925

  16. A Renewal Plan for the Advanced Photon Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fischetti, Robert F.; Fuoss, Paul H.; Gerig, Rodney E.

    2010-06-23

    With coordination from the APS Renewal Steering Committee (the members of which are the co-authors of this paper), staff and users of the U.S. Department of Energy's Advanced Photon Source (APS) at Argonne National Laboratory are in the process of developing a renewal plan for the facility. The renewal is a coordinated upgrade of the accelerator, beamlines, and associated technical structure that will enable users of the APS to address key scientific challenges in the coming decades. The cost of the renewal is estimated to be from $300M to $400M and to take approximately six years from start to finish.

  17. QCloud: A cloud-based quality control system for mass spectrometry-based proteomics laboratories

    PubMed Central

    Chiva, Cristina; Olivella, Roger; Borràs, Eva; Espadas, Guadalupe; Pastor, Olga; Solé, Amanda

    2018-01-01

    The increasing number of biomedical and translational applications in mass spectrometry-based proteomics poses new analytical challenges and raises the need for automated quality control systems. Despite previous efforts to set standard file formats, data processing workflows and key evaluation parameters for quality control, automated quality control systems are not yet widespread among proteomics laboratories, which limits the acquisition of high-quality results, inter-laboratory comparisons and the assessment of variability of instrumental platforms. Here we present QCloud, a cloud-based system to support proteomics laboratories in daily quality assessment using a user-friendly interface, easy setup, automated data processing and archiving, and unbiased instrument evaluation. QCloud supports the most common targeted and untargeted proteomics workflows, accepts data formats from different vendors, and enables the annotation of acquired data and the reporting of incidents. A complete version of the QCloud system has successfully been developed and is now open to the proteomics community (http://qcloud.crg.eu). The QCloud system is an open source project, publicly available under a Creative Commons License Attribution-ShareAlike 4.0. PMID:29324744

  18. Users Do the Darndest Things: True Stories from the CyLab Usable Privacy and Security Laboratory

    NASA Astrophysics Data System (ADS)

    Cranor, Lorrie Faith

    How can we make security and privacy software more usable? The first step is to study our users. Ideally, we would watch them interacting with security or privacy software in situations where they face actual risk. But everyday computer users don't sit around fiddling with security software, and subjecting users to actual security attacks raises ethical and legal concerns. Thus, it can be difficult to observe users interacting with security and privacy software in their natural habitat. At the CyLab Usable Privacy and Security Laboratory, we've conducted a wide variety of studies aimed at understanding how users think about security and privacy and how they interact with security and privacy software. In this talk I'll give a behind the scenes tour of some of the techniques we've used to study users both in the laboratory and in the wild. I'll discuss the trials and tribulations of designing and carrying out security and privacy user studies, and highlight some of our surprising observations. Find out what privacy-sensitive items you can actually get study participants to purchase, how you can observe users' responses to a man-in-the-middle attack without actually conducting such an attack, why it's hard to get people to use high tech cell phones even when you give them away, and what's actually in that box behind the couch in my office.

  19. Software Engineering Laboratory (SEL). Data base organization and user's guide, revision 1

    NASA Technical Reports Server (NTRS)

    Lo, P. S.; Wyckoff, D.; Page, J.; Mcgarry, F. E.

    1983-01-01

    The structure of the Software Engineering Laboratory (SEL) data base is described. It defines each data base file in detail and provides information about how to access and use the data for programmers and other users. Several data base reporting programs are described also.

  20. Software Engineering Laboratory (SEL) data base reporting software user's guide and system description. Volume 1: Introduction and user's guide

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Reporting software programs provide formatted listings and summary reports of the Software Engineering Laboratory (SEL) data base contents. The operating procedures and system information for 18 different reporting software programs are described. Sample output reports from each program are provided.

  1. Make it better but don't change anything.

    PubMed

    Wright, Jerry M

    2009-11-19

    With massive amounts of data being generated in electronic format, there is a need in basic science laboratories to adopt new methods for tracking and analyzing data. An electronic laboratory notebook (ELN) is not just a replacement for a paper lab notebook; it is a new method of storing and organizing data while maintaining the data entry flexibility and legal recording functions of paper notebooks. Paper notebooks are regarded as highly flexible since the user can configure them to store almost anything that can be written or physically pasted onto the pages. However, data retrieval and data sharing from paper notebooks are labor-intensive processes, and notebooks can be misplaced, a single point of failure that loses all entries in the volume. Additional features provided by electronic notebooks include searchable indices, data sharing, automatic archiving for security against loss, and ease of data duplication. Furthermore, ELNs can be tasked with additional functions not commonly found in paper notebooks, such as inventory control. While ELNs have been on the market for some time now, adoption of ELNs in academic basic science laboratories has been lagging. Issues that have restrained the development and adoption of ELNs in research laboratories are the sheer variety and frequency of changes in protocols, combined with the need for users to control notebook configuration outside the framework of professional IT staff support. In this commentary, we look at some of the issues and experiences in academic laboratories that have proved challenging in implementing an electronic lab notebook.

  2. Francis bitter national magnet laboratory annual report, July 1991 through June 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-06-01

    Contents: Reports on Laboratory Research Programs--Magneto-Optics and Semiconductor Physics, Superconductivity and Magnetism, Solid State Nuclear Magnetic Resonance, Condensed Matter Chemistry, Biomagnetism, Magnet Technology, Instrumentation, Molecular Biophysics, Carbon Filters and Fullerenes; Reports of Visiting Scientists--Reports of Users of the High Magnetic Field Facility, Reports of the Users of the Pulsed Field Facility, Reports of the Users of the High Field NMR Facility; Appendices--Publications and Meeting Speeches, Organization, Summary of High Magnetic Field Facility Use--January 1, 1984 through June 30, 1992, Geographic Distribution of High Magnetic Field Facility Users (Excluding FBNML Staff), Summary of Educational Activities.

  3. Constraints on transportation of reagents, reference materials and samples, difficulties and possible solutions: a user's perspective.

    PubMed

    Blanchard, P C

    2006-01-01

    The air transportation of infectious materials is regulated by international air transport associations and based on United Nations Model regulations which have become more practical in addressing animal disease agents. However, individual countries' import and interstate requirements determine what materials can be imported and transported, and this approval process can be long, resulting in delays in organism confirmation, use of international OIE and other reference laboratories, and acquisition of reference materials, proficiency test panels, and reagents for performing necessary testing. Delays can be prevented for permits that are required for the routine work performed by a laboratory through the use of comprehensive and annually renewed permits. This process, however, does not address new and exotic agents where time is critical to an effective emergency response. This paper suggests actions by both the OIE and regulatory authorities which can assist in streamlining and expediting the permit process.

  4. A straightforward graphical user interface for basic and advanced signal processing of thermographic infrared sequences

    NASA Astrophysics Data System (ADS)

    Klein, Matthieu T.; Ibarra-Castanedo, Clemente; Maldague, Xavier P.; Bendada, Abdelhakim

    2008-03-01

    IR-View is a free and open-source Matlab application that was released in 1998 at the Computer Vision and Systems Laboratory (CVSL) at Université Laval, Canada, as an answer to many common and recurrent needs in infrared thermography. IR-View has proven to be a useful tool at CVSL for the past 10 years. The software by itself and/or its concept and functions may be of interest for other laboratories and companies conducting research in the IR NDT field. This article describes the functions and processing techniques integrated into IR-View, freely downloadable under the GNU license at http://mivim.gel.ulaval.ca. Demonstration of IR-View functionalities will also be given during the DSS08 SPIE Defense and Security Symposium.

  5. User's and reference guide to the INEL RML/analytical radiochemistry sample tracking database version 1.00

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Femec, D.A.

    This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.
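
    The report does not reproduce the input format consumed by CREATE-SCHEMA, so the sketch below invents a tiny dictionary-based table description and emits SQL from it, purely to illustrate the code-generation idea behind such tools.

        SCHEMA = {
            "sample": {"sample_id": "INTEGER PRIMARY KEY",
                       "received": "DATE",
                       "status": "TEXT"},
            "analysis": {"analysis_id": "INTEGER PRIMARY KEY",
                         "sample_id": "INTEGER REFERENCES sample(sample_id)",
                         "nuclide": "TEXT"},
        }

        def generate_ddl(schema: dict) -> str:
            """Emit one CREATE TABLE statement per table description."""
            statements = []
            for table, columns in schema.items():
                cols = ",\n    ".join(f"{name} {sqltype}" for name, sqltype in columns.items())
                statements.append(f"CREATE TABLE {table} (\n    {cols}\n);")
            return "\n\n".join(statements)

        print(generate_ddl(SCHEMA))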

  6. Airland Battlefield Environment (ALBE) Tactical Decision Aid (TDA) Demonstration Program,

    DTIC Science & Technology

    1987-11-12

    Management System (DBMS) software, GKS graphics libraries, and user interface software. These components of the ATB system software architecture will be... knowledge base and augment the decision-making process by providing information useful in the formulation and execution of battlefield strategies... Topographic Laboratories as an Engineer. Ms. Capps is managing the software development of the AirLand Battlefield Environment (ALBE) geographic

  7. Machine learning for micro-tomography

    NASA Astrophysics Data System (ADS)

    Parkinson, Dilworth Y.; Pelt, Daniël. M.; Perciano, Talita; Ushizima, Daniela; Krishnan, Harinarayan; Barnard, Harold S.; MacDowell, Alastair A.; Sethian, James

    2017-09-01

    Machine learning has revolutionized a number of fields, but many micro-tomography users have never used it for their work. The micro-tomography beamline at the Advanced Light Source (ALS), in collaboration with the Center for Applied Mathematics for Energy Research Applications (CAMERA) at Lawrence Berkeley National Laboratory, has now deployed a series of tools to automate data processing for ALS users using machine learning. This includes new reconstruction algorithms, feature extraction tools, and image classification and recommendation systems for scientific images. Some of these tools run either in automated pipelines that operate on data as it is collected or as stand-alone software. Others are deployed on computing resources at Berkeley Lab, from workstations to supercomputers, and made accessible to users through either scripting or easy-to-use graphical interfaces. This paper presents a progress report on this work.
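
    As a generic sketch of one tool family mentioned above, supervised pixel classification for segmenting reconstructed slices, here is a scikit-learn example on synthetic data; it is not the ALS/CAMERA code, and the features and labels are illustrative.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)
        slice_img = rng.normal(size=(128, 128))
        slice_img[32:96, 32:96] += 2.0                 # bright "material" region

        # One extra feature per pixel: local mean over a 3x3 neighborhood.
        padded = np.pad(slice_img, 1, mode="edge")
        local_mean = sum(padded[dy:dy + 128, dx:dx + 128]
                         for dy in range(3) for dx in range(3)) / 9.0
        features = np.stack([slice_img.ravel(), local_mean.ravel()], axis=1)

        labels = np.zeros_like(slice_img, dtype=int)
        labels[32:96, 32:96] = 1                       # stand-in for sparse manual annotation

        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        clf.fit(features, labels.ravel())
        segmented = clf.predict(features).reshape(slice_img.shape)
        print("segmented material fraction:", segmented.mean())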

  8. This Is not Participatory Design - A Critical Analysis of Eight Living Laboratories.

    PubMed

    Bygholm, Ann; Kanstrup, Anne Marie

    2017-01-01

    Design of health technology for the elderly and care personnel has high priority because of a sharp increase in the number of elderly citizens in need of health care combined with decreasing resources in the health care sector. The desire to maintain and improve the quality of care while reducing costs has resulted in a search for approaches that support co-operation between technology designers, elderly persons and health care professionals on innovating future care technology. Living laboratories, where areas of a care environment are transformed into a so-called platform for technology innovation, are popular. Expectations for living laboratories are high, but examinations of how such laboratories support the intended participatory innovation are few. This paper presents and examines eight living laboratories set up in Danish nursing homes for technology innovation. We present the notion of a living laboratory and explicate the aspirations and expectations of this approach, and discuss why these expectations are hard to meet both on a general level and in the investigated labs. We question the basic assumption that the different interests of the stakeholders involved can be reconciled. In our analysis we focus on users in the living laboratories. We use guiding principles developed within Participatory Design to reveal the role and participation of the users - the health care professionals and the elderly - in the eight living laboratories. In general, these users played a minor role in the labs, where resolving technical problems turned out to be the main activity. We conclude that living laboratories do not nullify different or conflicting interests and that a real-life setting by itself is no guarantee of user participation.

  9. Resolving complex research data management issues in biomedical laboratories: Qualitative study of an industry-academia collaboration.

    PubMed

    Myneni, Sahiti; Patel, Vimla L; Bova, G Steven; Wang, Jian; Ackerman, Christopher F; Berlinicke, Cynthia A; Chen, Steve H; Lindvall, Mikael; Zack, Donald J

    2016-04-01

    This paper describes a distributed collaborative effort between industry and academia to systematize data management in an academic biomedical laboratory. The heterogeneous and voluminous nature of research data created in biomedical laboratories makes information management difficult and research unproductive. One such collaborative effort was evaluated over a period of four years using data collection methods including ethnographic observations, semi-structured interviews, web-based surveys, progress reports, conference call summaries, and face-to-face group discussions. Data were analyzed using qualitative methods to (1) characterize specific problems faced by biomedical researchers with traditional information management practices, (2) identify intervention areas for introducing a new research information management system called Labmatrix, and (3) evaluate and delineate the general collaboration (intervention) characteristics that can optimize outcomes of an implementation process in biomedical laboratories. Results emphasize the importance of end-user perseverance, human-centric interoperability evaluation, and demonstration of return on investment of the effort and time of laboratory members and industry personnel for the success of the implementation process. In addition, there is an intrinsic learning component associated with the implementation of an information management system. Technology transfer in a complex environment such as the biomedical laboratory can be eased by information systems that support human and cognitive interoperability. Such informatics features can also contribute to successful collaboration and, hopefully, to scientific productivity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. Designing Online Resources in Preparation for Authentic Laboratory Experiences

    PubMed Central

    Boulay, Rachel; Parisky, Alex; Leong, Peter

    2013-01-01

    Professional development for science teachers can benefit from active learning in science laboratories. However, how online training materials can be used to complement traditional laboratory training is less well understood. This paper explores the design of online training modules for teaching molecular biology, and user perception of those modules, which were part of an intensive molecular biology “boot camp” targeting high school biology teachers in the State of Hawaii. The John A. Burns School of Medicine at the University of Hawaii had an opportunity to design and develop professional development that prepares science teachers with an introduction to the skills, techniques, and applications their students need to conduct medical research in a laboratory setting. A group of 29 experienced teachers shared their opinions of the online materials and reported on how they used them in their learning process or teaching. PMID:24319698

  11. Leaf LIMS: A Flexible Laboratory Information Management System with a Synthetic Biology Focus.

    PubMed

    Craig, Thomas; Holland, Richard; D'Amore, Rosalinda; Johnson, James R; McCue, Hannah V; West, Anthony; Zulkower, Valentin; Tekotte, Hille; Cai, Yizhi; Swan, Daniel; Davey, Robert P; Hertz-Fowler, Christiane; Hall, Anthony; Caddick, Mark

    2017-12-15

    This paper presents Leaf LIMS, a flexible laboratory information management system (LIMS) designed to address the complexity of synthetic biology workflows. At the project's inception there was no LIMS designed specifically for synthetic biology processes; most systems focused on either next-generation sequencing or biobanks and clinical sample handling. Leaf LIMS implements integrated project, item, and laboratory stock tracking, offering complete sample and construct genealogy, materials and lot tracking, and modular assay data capture. It thus enables highly configurable task-based workflows and supports data capture from project inception to completion. In addition to supporting synthetic biology, it is therefore well suited to many laboratory environments with multiple projects and users. The system is deployed as a web application through Docker and is provided under a permissive MIT license. It is freely available for download at https://leaflims.github.io.

  12. QADATA user's manual; an interactive computer program for the retrieval and analysis of the results from the external blind sample quality- assurance project of the U.S. Geological Survey

    USGS Publications Warehouse

    Lucey, K.J.

    1990-01-01

    The U.S. Geological Survey conducts an external blind sample quality-assurance project for its National Water Quality Laboratory in Denver, Colorado, based on the analysis of reference water samples. Reference samples containing selected inorganic and nutrient constituents are disguised as environmental samples at the Survey's office in Ocala, Florida, and are sent periodically through other Survey offices to the laboratory. The results of this blind sample project indicate the quality of analytical data produced by the laboratory. This report provides instructions on the use of QADATA, an interactive, menu-driven program that allows users to retrieve the results of the blind sample quality-assurance project. The QADATA program, which is available on the U.S. Geological Survey's national computer network, accesses a blind sample database that contains more than 50,000 determinations from the last five water years for approximately 40 constituents at various concentrations. The data can be retrieved from the database for any user-defined time period and for any or all available constituents. After the user defines the retrieval, the program prepares statistical tables, control charts, and precision plots and generates a report which can be transferred to the user's office through the computer network. A discussion of the interpretation of the program output is also included. This quality assurance information will permit users to document the quality of the analytical results received from the laboratory. The blind sample data are entered into the database within weeks after being produced by the laboratory and can be retrieved to meet the needs of specific projects or programs. (USGS)
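
    The kind of summary the QADATA program prepares can be illustrated with a short, hypothetical sketch: per-constituent bias (accuracy) and spread (precision) computed from blind reference-sample results. The column names and numbers below are invented, not drawn from the USGS database.

      import pandas as pd

      # Blind-sample results: value reported by the laboratory vs. the
      # known reference value of the disguised sample.
      results = pd.DataFrame({
          "constituent": ["Cl", "Cl", "Cl", "NO3", "NO3", "NO3"],
          "reported":    [25.3, 24.8, 25.9, 4.1, 3.8, 4.4],
          "true_value":  [25.0, 25.0, 25.0, 4.0, 4.0, 4.0],
      })
      results["bias"] = results["reported"] - results["true_value"]

      # Mean bias indicates accuracy; standard deviation indicates precision.
      summary = results.groupby("constituent")["bias"].agg(["mean", "std", "count"])
      print(summary)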

  13. Measurement Assurance for End-Item Users

    NASA Technical Reports Server (NTRS)

    Mimbs, Scott M.

    2008-01-01

    The goal of a Quality Management System (QMS) as specified in ISO 9001 and AS9100 is to assure that the end product meets specifications and customer requirements. Measuring devices, often called measuring and test equipment (MTE), provide the evidence of product conformity to the prescribed requirements. The processes which employ MTE can therefore become a weak link in the overall QMS if proper attention is not given to the development and execution of these processes. Traditionally, calibration of MTE receives more focus in industry standards and process control efforts than the equally important proper usage of the same equipment. It is a common complaint of calibration laboratory personnel that MTE users are only interested in "a sticker." If the QMS requires the MTE "to demonstrate conformity of the product," then the quality of the measurement process must be adequate for the task. This leads to an ad hoc definition: measurement assurance is a discipline that assures that all processes, activities, environments, standards, and procedures involved in making a measurement produce a result that can be rigorously evaluated for validity and accuracy. To evaluate whether existing measurement processes provide an adequate level of quality to support the decisions based upon the measurement data, an understanding of measurement assurance basics is essential. This topic is complementary to the calibration standard ANSI/NCSL Z540.3-2006, which targets the calibration of MTE at the organizational level. This paper will discuss general measurement assurance when MTE is used to provide evidence of product conformity; the target audience is therefore end-item users of MTE. A central focus of the paper will be the verification of tolerances and the associated risks, so calibration professionals may find the paper useful in communicating with their customers, the MTE users.

  14. New reporting procedures based on long-term method detection levels and some considerations for interpretations of water-quality data provided by the U.S. Geological Survey National Water Quality Laboratory

    USGS Publications Warehouse

    Childress, Carolyn J. Oblinger; Foreman, William T.; Connor, Brooke F.; Maloney, Thomas J.

    1999-01-01

    This report describes the U.S. Geological Survey National Water Quality Laboratory's approach for determining long-term method detection levels and establishing reporting levels, details relevant new reporting conventions, and provides preliminary guidance on interpreting data reported with the new conventions. At the long-term method detection level concentration, the risk of a false positive detection (analyte reported present at the long-term method detection level when not in sample) is no more than 1 percent. However, at the long-term method detection level, the risk of a false negative occurrence (analyte reported not present when present at the long-term method detection level concentration) is up to 50 percent. Because this false negative rate is too high for use as a default 'less than' reporting level, a more reliable laboratory reporting level is set at twice the determined long-term method detection level. For all methods, concentrations measured between the laboratory reporting level and the long-term method detection level will be reported as estimated concentrations. Non-detections will be censored to the laboratory reporting level. Adoption of the new reporting conventions requires a full understanding of how low-concentration data can be used and interpreted and places responsibility for using and presenting final data with the user rather than with the laboratory. Users must consider that (1) new laboratory reporting levels may differ from previously established minimum reporting levels, (2) long-term method detection levels and laboratory reporting levels may change over time, and (3) estimated concentrations are less certain than concentrations reported above the laboratory reporting level. The availability of uncensored but qualified low-concentration data for interpretation and statistical analysis is a substantial benefit to the user. A decision to censor data after they are reported from the laboratory may still be made by the user, if merited, on the basis of the intended use of the data.
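
    The censoring rules above are mechanical enough to express in code. The following minimal sketch applies the convention as described (the laboratory reporting level set at twice the long-term method detection level, estimated values flagged between the two levels, non-detections censored to the reporting level); the function name and numbers are illustrative only.

      def report_concentration(measured, lt_mdl, detected=True):
          """Apply the reporting convention to a single analytical result."""
          lrl = 2.0 * lt_mdl                 # laboratory reporting level
          if not detected:
              return f"< {lrl:g}"            # non-detections censored to the LRL
          if measured < lrl:
              return f"E {measured:g}"       # estimated concentration
          return f"{measured:g}"             # quantified above the LRL

      print(report_concentration(0.07, lt_mdl=0.05))                 # E 0.07
      print(report_concentration(0.20, lt_mdl=0.05))                 # 0.2
      print(report_concentration(0.0, lt_mdl=0.05, detected=False))  # < 0.1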

  15. Current State of Agile User-Centered Design: A Survey

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in industry every year. However, these methods still lack usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. Responses were received from 92 practitioners worldwide. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in improved usability and quality of the product developed; and has increased the satisfaction of the product's end-users. The most widely used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  16. The Underwater Shock Analysis Code (USA-Version 3): A Reference Manual.

    DTIC Science & Technology

    1980-09-15

    Library Manager EZ-DAL of the NOSTRA Data Management System", LMSC-D626839, Lockheed Palo Alto Research Laboratory, Palo Alto, California, July 1978... Lockheed Palo Alto Research Laboratory, Palo Alto, California, September 1977 [23] C. A. Felippa, "Skymatrix Processing Utility Library (SKYPUL) Users...

  17. Design of Web-based Fuzzy Input Expert System for the analysis of serology laboratory tests.

    PubMed

    Başçiftçi, Fatih; Incekara, Hayri

    2012-08-01

    This study aims to use the Web-based Expert System with Fuzzy Input (WESFI) to convert patients' (users') Serology Laboratory Test (SLT) results into linguistic statements (low, normal, high) and, by analyzing these, give the user (patient) feedback on potential signs of disease. The feedback given to patients consists of interpretations stored in the database, prepared in advance by doctors. Furthermore, the SLT terms (Brucella Coombs, AMA, P-Protein, etc.) are explained in a way that the user can understand. The WESFI is published with an interface in the web environment. In order to determine the success rate of the WESFI, users evaluated the system by answering the question "How do you find the evaluation?". The question was answered by 461 users. As a result, it was observed that 90% of female users, 92% of male users, and 91% of all users found the system useful.
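
    The core idea of a fuzzy-input mapping from numeric test values to linguistic statements can be sketched compactly. The membership shapes, reference ranges, and names below are assumptions for illustration, not the WESFI system's actual rules.

      def trapezoid(x, a, b, c, d):
          """Trapezoidal membership: rises over a..b, flat over b..c, falls over c..d."""
          if x <= a or x >= d:
              return 0.0
          if b <= x <= c:
              return 1.0
          return (x - a) / (b - a) if x < b else (d - x) / (d - c)

      def linguistic(value, low_edge, high_edge, fuzz):
          """Return the (label, degree) pair with the strongest membership."""
          memberships = {
              "low":    trapezoid(value, -1e9, -1e9, low_edge - fuzz, low_edge + fuzz),
              "normal": trapezoid(value, low_edge - fuzz, low_edge + fuzz,
                                  high_edge - fuzz, high_edge + fuzz),
              "high":   trapezoid(value, high_edge - fuzz, high_edge + fuzz, 1e9, 1e9),
          }
          return max(memberships.items(), key=lambda kv: kv[1])

      # A test with a hypothetical reference range of 10-40 units:
      print(linguistic(42.0, low_edge=10.0, high_edge=40.0, fuzz=2.0))  # ('high', 1.0)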

  18. User input verification and test driven development in the NJOY21 nuclear data processing code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trainer, Amelia Jo; Conlin, Jeremy Lloyd; McCartney, Austin Paul

    Before physically meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g., cross sections and angular distributions). Perhaps the most popular and widely trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides input validation to check user input. By providing rapid and helpful responses to users while they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document serves as a discussion of the current state of input checking and testing practices in NJOY21.
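
    The two practices the record emphasizes, immediate validation of user input and tests that pin that behavior down, can be illustrated with a small sketch. This is not NJOY21 code; the card fields, messages, and ranges are invented.

      def validate_card(card):
          """Return a list of human-readable problems with one input 'card'."""
          errors = []
          if "material" not in card:
              errors.append("missing required field 'material'")
          elif not isinstance(card["material"], int) or card["material"] <= 0:
              errors.append(f"'material' must be a positive integer, got {card['material']!r}")
          temperature = card.get("temperature", 293.6)
          if not 0.0 < temperature < 1.0e5:
              errors.append(f"'temperature' {temperature} K is outside the accepted range")
          return errors

      # A pytest-style regression test that locks in the validation behavior:
      def test_validate_card_rejects_bad_material():
          assert validate_card({"material": -1}) == [
              "'material' must be a positive integer, got -1"
          ]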

  19. Modeling microbiological and chemical processes in municipal solid waste bioreactor, Part II: Application of numerical model BIOKEMOD-3P.

    PubMed

    Gawande, Nitin A; Reinhart, Debra R; Yeh, Gour-Tsyh

    2010-02-01

    Biodegradation process modeling of municipal solid waste (MSW) bioreactor landfills requires knowledge of the various process reactions and the corresponding kinetic parameters. Mechanistic models available to date are able to simulate biodegradation processes with the help of pre-defined species and reactions. Some of these models consider the effect of critical parameters such as moisture content, pH, and temperature. Biomass concentration is a vital parameter for any biomass growth model, yet it is often not compared with field and laboratory results. A more complex biodegradation model includes a large number of chemical and microbiological species. Increasing the number of species and user-defined process reactions in the simulation requires a robust numerical tool. A generalized microbiological and chemical model, BIOKEMOD-3P, was developed to simulate biodegradation processes in three phases (Gawande et al. 2009). This paper presents the application of this model to simulate laboratory-scale MSW bioreactors under anaerobic conditions. BIOKEMOD-3P was able to closely reproduce the experimental data. The results from this study may help in applying this model to full-scale landfill operation.
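
    For readers unfamiliar with this class of model, the following tiny sketch integrates the kind of kinetics such simulators build on: Monod growth of a single biomass species on a single substrate in a closed reactor. The parameter values and names are illustrative and are not taken from BIOKEMOD-3P.

      from scipy.integrate import solve_ivp

      mu_max, K_s, Y, k_d = 0.3, 50.0, 0.4, 0.02   # 1/d, mg/L, -, 1/d

      def rhs(t, y):
          S, X = y                                 # substrate, biomass (mg/L)
          mu = mu_max * S / (K_s + S)              # Monod specific growth rate
          return [-mu * X / Y, (mu - k_d) * X]

      sol = solve_ivp(rhs, (0.0, 60.0), [500.0, 10.0])
      print(sol.y[:, -1])                          # final substrate and biomass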

  20. NASA's Laboratory Astrophysics Workshop: Opening Remarks

    NASA Technical Reports Server (NTRS)

    Hasan, Hashima

    2002-01-01

    The Astronomy and Physics Division at NASA Headquarters has an active and vibrant program in Laboratory Astrophysics. The objective of the program is to provide the spectroscopic data required by observers to analyze data from NASA space astronomy missions. The program also supports theoretical investigations to provide those spectroscopic parameters that cannot be obtained in the laboratory; simulations of the space environment to understand the formation of certain molecules, dust grains, and ices; and the production of critically compiled databases of spectroscopic parameters. NASA annually solicits proposals and utilizes the peer review process to select meritorious investigations for funding. As the mission of NASA evolves, new missions are launched, and old ones are terminated, the Laboratory Astrophysics program needs to evolve accordingly. Consequently, it is advantageous for NASA and the astronomical community to periodically conduct a dialog to assess the status of the program. This Workshop provides a forum for producers and users of laboratory data to get together and understand each other's needs and limitations. A multi-wavelength approach enables a cross-fertilization of ideas across wavelength bands.

  1. iVirtualWorld: A Domain-Oriented End-User Development Environment for Building 3D Virtual Chemistry Experiments

    ERIC Educational Resources Information Center

    Zhong, Ying

    2013-01-01

    Virtual worlds are well-suited for building virtual laboratories for educational purposes to complement hands-on physical laboratories. However, educators may face technical challenges because developing virtual worlds requires skills in programming and 3D design. Current virtual world building tools are developed for users who have programming…

  2. AV Instructional Materials Manual; A Self-Instructional Guide to AV Laboratory Experiences. Third Edition.

    ERIC Educational Resources Information Center

    Brown, James W., Ed.; Lewis, Richard B., Ed.

    This self-instructional guide to audiovisual laboratory experiences contains 50 exercises designed to give the user active experience with the practical problems of choosing, using, and inventing instructional materials and of operating audiovisual equipment. With the exception of the first four exercises (which introduce the user to the manual…

  3. Mixing HTC and HPC Workloads with HTCondor and Slurm

    NASA Astrophysics Data System (ADS)

    Hollowell, C.; Barnett, J.; Caramarcu, C.; Strecker-Kellogg, W.; Wong, A.; Zaytsev, A.

    2017-10-01

    Traditionally, the RHIC/ATLAS Computing Facility (RACF) at Brookhaven National Laboratory (BNL) has only maintained High Throughput Computing (HTC) resources for our HEP/NP user community. We’ve been using HTCondor as our batch system for many years, as this software is particularly well suited for managing HTC processor farm resources. Recently, the RACF has also begun to design/administer some High Performance Computing (HPC) systems for a multidisciplinary user community at BNL. In this paper, we’ll discuss our experiences using HTCondor and Slurm in an HPC context, and our facility’s attempts to allow our HTC and HPC processing farms/clusters to make opportunistic use of each other’s computing resources.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kegel, T.M.

    Calibration laboratories are faced with the need to become accredited or registered to one or more quality standards. One requirement common to all of these standards is the need to have in place a measurement assurance program. What is a measurement assurance program? Brian Belanger, in Measurement Assurance Programs: Part 1, describes it as "a quality assurance program for a measurement process that quantifies the total uncertainty of the measurements (both random and systematic components of error) with respect to national or designated standards and demonstrates that the total uncertainty is sufficiently small to meet the user's requirements." Rolf Schumacher is more specific in Measurement Assurance in Your Own Laboratory. He states, "Measurement assurance is the application of broad quality control principles to measurements of calibrations." Here, the focus is on one important part of any measurement assurance program: implementation of statistical process control (SPC). Paraphrasing Juran's Quality Control Handbook, a process is in statistical control if the only observed variations are those that can be attributed to random causes. Conversely, a process that exhibits variations due to assignable causes is not in a state of statistical control. Finally, Carrol Croarkin states, "In the measurement assurance context the measurement algorithm including instrumentation, reference standards and operator interactions is the process that is to be controlled, and its direct product is the measurement per se. The measurements are assumed to be valid if the measurement algorithm is operating in a state of control." Implicit in this statement is the important fact that an out-of-control process cannot produce valid measurements. 7 figs.
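
    The SPC idea summarized above reduces, in its simplest form, to establishing control limits from a baseline of check-standard measurements and flagging any later measurement that falls outside them. The following sketch assumes a Shewhart-style chart with limits at the mean plus or minus three standard deviations; the data are invented.

      import statistics

      baseline = [10.02, 9.98, 10.01, 9.99, 10.00, 10.03, 9.97, 10.01]
      mean = statistics.fmean(baseline)
      sigma = statistics.stdev(baseline)
      ucl, lcl = mean + 3 * sigma, mean - 3 * sigma   # control limits

      for value in [10.01, 9.99, 10.12]:
          state = "in control" if lcl <= value <= ucl else "OUT OF CONTROL"
          print(f"{value:6.2f}  {state}  (limits {lcl:.3f}..{ucl:.3f})")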

  5. Hypermedia Laboratory, Defense Applied Information Technology Center; Review for 1988

    DTIC Science & Technology

    1988-12-01

    Information System (DGIS): The Department of Defense Microcomputer User's Gateway to the World;" Microcomputers for Information Management: An International...accessing. "Knowledge Gateways: The Building Blocks." Information Processing & Management, Volume 24, Number 4, pp. 459-468, 1988. Donald T. Hawkins...intelligence and hypermedia. This activity is managed by the Defense Technical Information Center (DTIC). Much of our development at this point is

  6. Laboratory testing in primary care: A systematic review of health IT impacts.

    PubMed

    Maillet, Éric; Paré, Guy; Currie, Leanne M; Raymond, Louis; Ortiz de Guinea, Ana; Trudel, Marie-Claude; Marsan, Josianne

    2018-08-01

    Laboratory testing in primary care is a fundamental process that supports patient management and care. Any breakdown in the process may alter clinical information gathering and decision-making activities and can lead to medical errors and potentially adverse outcomes for patients. Various information technologies are used in primary care with the goal of supporting the process, maximizing patient benefits, and reducing medical errors. However, the overall impact of health information technologies on laboratory testing processes has not been evaluated. To synthesize the positive and negative impacts resulting from the use of health information technology in each phase of the laboratory 'total testing process' in primary care, we conducted a systematic review. Databases including Medline, PubMed, CINAHL, Web of Science, and Google Scholar were searched. Studies eligible for inclusion reported empirical data on 1) the use of a specific IT system and 2) the impacts of the system on the laboratory testing process, and 3) were conducted in primary care settings (including ambulatory care and primary care offices). Our final sample consisted of 22 empirical studies, which were mapped to a framework that outlines the phases of the laboratory total testing process, focusing on phases where medical errors may occur. Health information technology systems support several phases of the laboratory testing process, from ordering the test to following up with patients. This is a growing field of research, with most studies focusing on the use of information technology during the final phases of the laboratory total testing process. The findings were largely positive. Positive impacts included easier access to test results by primary care providers, reduced turnaround times, and more tests prescribed in line with best practice guidelines. Negative impacts were reported in several studies: paper-based processes employed in parallel to the electronic process increased the potential for medical errors due to clinicians' cognitive overload; systems deemed not reliable or user-friendly hampered clinicians' performance; and organizational issues arose when results tracking relied on the prescribers' memory. The potential of health information technology lies not only in the exchange of health information, but also in knowledge sharing among clinicians. This review has underscored the important role played by cognitive factors, which are critical to the clinician's decision-making, the selection of the most appropriate tests, correct interpretation of the results, and efficient interventions. By providing the right information at the right time to the right clinician, many IT solutions adequately support the laboratory testing process and help primary care clinicians make better decisions. However, several technological and organizational barriers require more attention to fully support the highly fragmented and error-prone process of laboratory testing. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. SCAMP and the ASP

    NASA Astrophysics Data System (ADS)

    Idehara, H.; Carbon, D. F.

    2004-12-01

    We present two new, publicly available tools to support the examination and interpretation of spectra. SCAMP is a specialized graphical user interface for MATLAB. It allows researchers to rapidly intercompare sets of observational, theoretical, and/or laboratory spectra. Users have extensive control over the colors and placement of individual spectra, and over spectrum normalization from one spectral region to another. Spectra can be interactively assigned to user-defined groups and the groupings recalled at a later time. The user can measure/record positions and intensities of spectral features, interactively spline-fit spectra, and normalize spectra by fitted splines. User-defined wavelengths can be automatically highlighted in SCAMP plots. The user can save/print annotated graphical output suitable for a scientific notebook depicting the work at any point. The ASP is a WWW portal that provides interactive access to two spectrum data sets: a library of synthetic stellar spectra and a library of laboratory PAH spectra. The synthetic stellar spectra in the ASP are appropriate to the giant branch with an assortment of compositions. Each spectrum spans the full range from 2 to 600 microns at a variety of resolutions. The ASP is designed to allow users to quickly identify individual features at any resolution that arise from any of the included isotopic species. The user may also retrieve the depth of formation of individual features at any resolution. PAH spectra accessible through the ASP are drawn from the extensive library of spectra measured by the NASA Ames Astrochemistry Laboratory. The user may interactively choose any subset of PAHs in the data set, combine them with user-defined weights and temperatures, and view/download the resultant spectrum at any user-defined resolution. This work was funded by the NASA Advanced Supercomputing Division, NASA Ames Research Center.
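
    The spline-fit-and-normalize step that SCAMP offers interactively can be sketched in a few lines with SciPy; the synthetic "spectrum" below stands in for real data, and the smoothing factor is an arbitrary choice.

      import numpy as np
      from scipy.interpolate import UnivariateSpline

      wavelength = np.linspace(400.0, 700.0, 300)
      continuum = 1.0 + 0.002 * (wavelength - 400.0)   # slowly varying trend
      line = -0.4 * np.exp(-0.5 * ((wavelength - 550.0) / 2.0) ** 2)
      flux = continuum + line + np.random.default_rng(0).normal(0.0, 0.01, 300)

      # A heavily smoothed spline approximates the continuum, ignoring the line
      spline = UnivariateSpline(wavelength, flux, s=3.0)
      normalized = flux / spline(wavelength)           # continuum now near 1.0
      print(float(normalized.min()), float(normalized.max()))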

  8. Automated synthesis of image processing procedures using AI planning techniques

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Mortensen, Helen

    1994-01-01

    This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985, Pemberthy & Weld, 1992, Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.
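
    The planning idea is easy to miniaturize: chain image-processing "operators" whose preconditions are met until their cumulative effects cover the user's goals. The forward-chaining sketch below is far simpler than MVP's planner, and the operators and goals are invented.

      # Each operator lists the image states it needs and the states it gives.
      operators = {
          "radiometric_correct": {"needs": set(),           "gives": {"radiometric"}},
          "geometric_correct":   {"needs": {"radiometric"}, "gives": {"geometric"}},
          "map_project":         {"needs": {"geometric"},   "gives": {"projected"}},
      }

      def plan(goal, have=frozenset()):
          """Greedily pick applicable operators until the goal states hold."""
          steps, have = [], set(have)
          while goal - have:
              # next() raises StopIteration if the goal is unreachable
              step = next(name for name, op in operators.items()
                          if op["needs"] <= have and op["gives"] - have)
              steps.append(step)
              have |= operators[step]["gives"]
          return steps

      print(plan({"projected"}))
      # ['radiometric_correct', 'geometric_correct', 'map_project']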

  9. The Unified Database for BM@N experiment data handling

    NASA Astrophysics Data System (ADS)

    Gertsenberger, Konstantin; Rogachevsky, Oleg

    2018-04-01

    The article describes the developed Unified Database, designed as a comprehensive relational data storage for the BM@N experiment at the Joint Institute for Nuclear Research in Dubna. The BM@N experiment, one of the main elements of the first stage of the NICA project, is a fixed-target experiment at extracted Nuclotron beams of the Laboratory of High Energy Physics (LHEP JINR). The structure and purposes of the BM@N setup are briefly presented. The article considers the scheme of the Unified Database, its attributes, and its implemented features in detail. The use of the developed BM@N database provides correct multi-user access to current experiment information for data processing. It stores information on the experiment runs, detectors and their geometries, and the different configuration, calibration, and algorithm parameters used in offline data processing. User interfaces, an important part of any database, are also presented.
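
    A much-simplified impression of such a schema (experiment runs keyed to versioned detector geometries) can be given with SQLite; every table and column name below is invented for illustration and does not reflect the actual BM@N database design.

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE detector_geometry (
          geometry_id  INTEGER PRIMARY KEY,
          description  TEXT,
          root_file    BLOB   -- serialized geometry used in offline processing
      );
      CREATE TABLE experiment_run (
          run_number   INTEGER PRIMARY KEY,
          beam         TEXT NOT NULL,
          energy_gev   REAL,
          start_time   TEXT,
          geometry_id  INTEGER REFERENCES detector_geometry(geometry_id)
      );
      """)
      con.execute("INSERT INTO detector_geometry VALUES (1, 'illustrative setup', NULL)")
      con.execute("INSERT INTO experiment_run VALUES (1000, 'Ar', 3.2, '2018-03-12T08:00', 1)")
      for row in con.execute("SELECT run_number, beam, energy_gev FROM experiment_run"):
          print(row)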

  10. Rapid access to information resources in clinical biochemistry: medical applications of Personal Digital Assistants (PDA).

    PubMed

    Serdar, Muhittin A; Turan, Mustafa; Cihan, Murat

    2008-06-01

    Laboratory specialists currently need to access scientifically based information at any time and anywhere. A considerable amount of time and effort is required to access this information through existing accumulated data. Personal digital assistants (PDA) with commercial software are expected to provide an effective solution to this problem. In this study, 11 commercial software products (UpToDate, ePocrates, Inforetrive, Pepid, eMedicine, FIRST Consult, and 5 laboratory e-books released by Skyscape and/or Isilo) were selected, and the benefits of their use were evaluated by seven laboratory specialists. The assessment of the software was based on the number of tests included; the inclusion of detailed information for each test, such as process, method, interpretation of results, reference ranges, critical values, interferences, equations, and pathophysiology; supplementary technical details such as sample collection principles; and additional information such as linked references, evidence-based data, test cost, etc. In technical terms, the following items were considered: the amount of memory required to run the software, the user-friendliness of the graphical user interface, and the frequency of new and/or updated releases. As we had anticipated, there is still no perfect program. Interpretation of laboratory results may require software with an integrated program. However, methodological data are mostly not included in the software evaluated. It seems that these shortcomings will be fixed in the near future, and PDAs and relevant medical applications will become indispensable for all physicians, including laboratory specialists, both in training/education and in patient care.

  11. Flaws in current human training protocols for spontaneous Brain-Computer Interfaces: lessons learned from instructional design

    PubMed Central

    Lotte, Fabien; Larrue, Florian; Mühl, Christian

    2013-01-01

    While recent research on Brain-Computer Interfaces (BCI) has highlighted their potential for many applications, they remain barely used outside laboratories. The main reason is their lack of robustness. Indeed, with current BCI, mental state recognition is usually slow and often incorrect. Spontaneous BCI (i.e., mental imagery-based BCI) often rely on mutual learning efforts by the user and the machine, with BCI users learning to produce stable ElectroEncephaloGraphy (EEG) patterns (spontaneous BCI control being widely acknowledged as a skill) while the computer learns to automatically recognize these EEG patterns, using signal processing. Most research so far was focused on signal processing, mostly neglecting the human in the loop. However, how well the user masters the BCI skill is also a key element explaining BCI robustness. Indeed, if the user is not able to produce stable and distinct EEG patterns, then no signal processing algorithm would be able to recognize them. Unfortunately, despite the importance of BCI training protocols, they have been scarcely studied so far, and used mostly unchanged for years. In this paper, we advocate that current human training approaches for spontaneous BCI are most likely inappropriate. We notably study instructional design literature in order to identify the key requirements and guidelines for a successful training procedure that promotes a good and efficient skill learning. This literature study highlights that current spontaneous BCI user training procedures satisfy very few of these requirements and hence are likely to be suboptimal. We therefore identify the flaws in BCI training protocols according to instructional design principles, at several levels: in the instructions provided to the user, in the tasks he/she has to perform, and in the feedback provided. For each level, we propose new research directions that are theoretically expected to address some of these flaws and to help users learn the BCI skill more efficiently. PMID:24062669

  12. NASA Astrophysics Data System (ADS)

    Knosp, B.; Neely, S.; Zimdars, P.; Mills, B.; Vance, N.

    2007-12-01

    The Microwave Limb Sounder (MLS) Science Computing Facility (SCF) stores over 50 terabytes of data, has over 240 computer processing hosts, and 64 users from around the world. These resources are spread over three primary geographical locations - the Jet Propulsion Laboratory (JPL), Raytheon RIS, and New Mexico Institute of Mining and Technology (NMT). A need for a grid network system was identified and defined to solve the problem of users competing for finite, and increasingly scarce, MLS SCF computing resources. Using Sun's Grid Engine software, a grid network was successfully created in a development environment that connected the JPL and Raytheon sites, established master and slave hosts, and demonstrated that transfer queues for jobs can work among multiple clusters in the same grid network. This poster will first describe MLS SCF resources and the lessons that were learned in the design and development phase of this project. It will then go on to discuss the test environment and plans for deployment by highlighting benchmarks and user experiences.

  13. Development and construction of a comprehensive set of research diagnostics for the FLARE user facility

    NASA Astrophysics Data System (ADS)

    Yoo, Jongsoo; Jara-Almonte, J.; Majeski, S.; Frank, S.; Ji, H.; Yamada, M.

    2016-10-01

    FLARE (Facility for Laboratory Reconnection Experiments) will be operated as a flexible user facility, and so a complete set of research diagnostics is under development, including magnetic probe arrays, Langmuir probes, Mach probes, spectroscopic probes, and a laser interferometer. In order to accommodate the various requirements of users, large-scale (1 m), variable resolution (0.5-4 cm) magnetic probes have been designed, and are currently being prototyped. Moreover, a fully fiber-coupled laser interferometer has been designed to measure the line-integrated electron density. This fiber-coupled interferometer system will reduce the complexity of alignment processes and minimize maintenance of the system. Finally, improvements to the electrostatic probes and spectroscopic probes currently used in the Magnetic Reconnection Experiment (MRX) are discussed. The specifications of other subsystems, such as integrators and digitizers, are also presented. This work is supported by DoE Contract No. DE-AC0209CH11466.

  14. The Construction of the Siam Photon Laboratory and Its Ripple Effects

    NASA Astrophysics Data System (ADS)

    Ishii, Takehiko

    2004-03-01

    The Siam Photon Laboratory of the National Synchrotron Research Center (NSRC) is a synchrotron radiation research facility built to promote the scientific and technological research activity of the country and to enhance human resources development. The accelerator complex was originally owned by the SORTEC Laboratory in Tsukuba and was transferred to NSRC gratis. The storage ring design was renewed, and the construction of the whole accelerator complex with the reformed storage ring was completed two years ago. In the course of the construction, we found many problems characteristic of second-hand machines. The maximum stored current and the beam lifetime at present are 210 mA and 6 hr at 100 mA, respectively. One beam line for photoemission experiments has been opened to outside users. The first experimental studies, made on Ni(111) by our staff members, have been completed. Since the project started from scratch, NSRC was asked to carry out all work necessary for opening the facility to outside users. This work includes recruiting users and setting up the users' organization. In industrial applications, for instance, we have to find government or private sectors interested in fundamental technological research using synchrotron radiation. Then, the training of users from the relevant organizations will start. Since the establishment of the Siam Photon Laboratory, the trend toward promoting pertinent research has increased. More fundamental human resources development, including graduate school education, is underway around the Siam Photon Laboratory. The growth of enterprises as a part of the infrastructure is slow but steady.

  15. Multi-modal virtual environment research at Armstrong Laboratory

    NASA Technical Reports Server (NTRS)

    Eggleston, Robert G.

    1995-01-01

    One mission of the Paul M. Fitts Human Engineering Division of Armstrong Laboratory is to improve the user interface for complex systems through user-centered exploratory development and research activities. In support of this goal, many current projects attempt to advance and exploit user-interface concepts made possible by virtual reality (VR) technologies. Virtual environments may be used as a general purpose interface medium, an alternative display/control method, a data visualization and analysis tool, or a graphically based performance assessment tool. An overview is given of research projects within the division on prototype interface hardware/software development, integrated interface concept development, interface design and evaluation tool development, and user and mission performance evaluation tool development.

  16. SIP: A Web-Based Astronomical Image Processing Program

    NASA Astrophysics Data System (ADS)

    Simonetti, J. H.

    1999-12-01

    I have written an astronomical image processing and analysis program designed to run over the internet in a Java-compatible web browser. The program, Sky Image Processor (SIP), is accessible at the SIP webpage (http://www.phys.vt.edu/SIP). Since nothing is installed on the user's machine, there is no need to download upgrades; the latest version of the program is always instantly available. Furthermore, the Java programming language is designed to work on any computer platform (any machine and operating system). The program could be used with students in web-based instruction or in a computer laboratory setting; it may also be of use in some research or outreach applications. While SIP is similar to other image processing programs, it is unique in some important respects. For example, SIP can load images from the user's machine or from the Web. An instructor can put images on a web server for students to load and analyze on their own personal computer. Or, the instructor can inform the students of images to load from any other web server. Furthermore, since SIP was written with students in mind, the philosophy is to present the user with the most basic tools necessary to process and analyze astronomical images. Images can be combined (by addition, subtraction, multiplication, or division), multiplied by a constant, smoothed, cropped, flipped, rotated, and so on. Statistics can be gathered for pixels within a box drawn by the user. Basic tools are available for gathering data from an image which can be used for performing simple differential photometry, or astrometry. Therefore, students can learn how astronomical image processing works. Since SIP is not part of a commercial CCD camera package, the program is written to handle the most common denominator image file, the FITS format.
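
    The basic operations the abstract lists, image arithmetic and statistics within a user-drawn box, look roughly like this with NumPy arrays standing in for FITS images; the box coordinates and counts are illustrative.

      import numpy as np

      rng = np.random.default_rng(1)
      light = rng.poisson(200, size=(128, 128)).astype(float)   # exposure
      dark = rng.poisson(20, size=(128, 128)).astype(float)     # dark frame

      calibrated = light - dark        # image subtraction
      scaled = calibrated * 1.5        # multiplication by a constant

      y0, y1, x0, x1 = 30, 50, 40, 60  # a "box drawn by the user"
      box = scaled[y0:y1, x0:x1]
      print(f"box mean={box.mean():.1f}  std={box.std():.1f}  max={box.max():.0f}")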

  17. 9 CFR 130.11 - User fees for inspecting and approving import/export facilities and establishments.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... hourly user fee rate in § 130.30(2) applies to biosecurity level two laboratories. (b) [Reserved] [65 FR... approval) Per year $537.00 $553.00 $570.00 $587.00 $604.00 Inspection for approval of biosecurity level three labs (all inspections related to approving the laboratory for handling one defined set of...

  18. Teaching clinical reasoning and decision-making skills to nursing students: Design, development, and usability evaluation of a serious game.

    PubMed

    Johnsen, Hege Mari; Fossum, Mariann; Vivekananda-Schmidt, Pirashanthie; Fruhling, Ann; Slettebø, Åshild

    2016-10-01

    Serious games (SGs) are a type of simulation technology that may provide nursing students with the opportunity to practice their clinical reasoning and decision-making skills in a safe and authentic environment. Despite the growing number of SGs developed for healthcare professionals, few SGs are video based or address the domain of home health care. This paper aims to describe the design, development, and usability evaluation of a video based SG for teaching clinical reasoning and decision-making skills to nursing students who care for patients with chronic obstructive pulmonary disease (COPD) in home healthcare settings. A prototype SG was developed. A unified framework of usability called TURF (Task, User, Representation, and Function) and SG theory were employed to ensure a user-centered design. The educational content was based on the clinical decision-making model, Bloom's taxonomy, and a Bachelor of Nursing curriculum. A purposeful sample of six participants evaluated the SG prototype in a usability laboratory. Cognitive walkthrough evaluations, a questionnaire, and individual interviews were used for the usability evaluation. The data were analyzed using qualitative deductive content analysis based on the TURF framework elements and related usability heuristics. The SG was perceived as being realistic, clinically relevant, and at an adequate level of complexity for the intended users. Usability issues regarding functionality and the user-computer interface design were identified. However, the SG was perceived as being easy to learn, and participants suggested that the SG could serve as a supplement to traditional training in laboratory and clinical settings. Using video based scenarios with an authentic COPD patient and a home healthcare registered nurse as actors contributed to increased realism. Using different theoretical approaches in the SG design was considered an advantage of the design process. The SG was perceived as being useful, usable, and satisfying. The achievement of the desired functionality and the minimization of user-computer interface issues emphasize the importance of conducting a usability evaluation during the SG development process. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. Optimized R functions for analysis of ecological community data using the R virtual laboratory (RvLab)

    PubMed Central

    Varsos, Constantinos; Patkos, Theodore; Pavloudi, Christina; Gougousis, Alexandros; Ijaz, Umer Zeeshan; Filiopoulou, Irene; Pattakos, Nikolaos; Vanden Berghe, Edward; Fernández-Guerra, Antonio; Faulwetter, Sarah; Chatzinikolaou, Eva; Pafilis, Evangelos; Bekiari, Chryssoula; Doerr, Martin; Arvanitidis, Christos

    2016-01-01

    Background: Parallel data manipulation using R has previously been addressed by members of the R community; however, most of these studies produce ad hoc solutions that are not readily available to the average R user. Our targeted users, ranging from expert ecologists/microbiologists to computational biologists, often experience difficulties in finding optimal ways to exploit the full capacity of their computational resources. In addition, improving the performance of commonly used R scripts becomes increasingly difficult, especially with large datasets. Furthermore, the implementations described here can be of significant interest to expert bioinformaticians or R developers. Therefore, our goals can be summarized as: (i) description of a complete methodology for the analysis of large datasets by combining capabilities of diverse R packages, (ii) presentation of their application through a virtual R laboratory (RvLab) that makes execution of complex functions and visualization of results easy and readily available to the end-user. New information: In this paper, the novelty stems from implementations of parallel methodologies which rely on the processing of data at different levels of abstraction and the availability of these processes through an integrated portal. Parallel implementation R packages, such as the pbdMPI (Programming with Big Data - Interface to MPI) package, are used to implement Single Program Multiple Data (SPMD) parallelization of primitive mathematical operations, allowing for interplay with functions of the vegan package. The dplyr and RPostgreSQL R packages are further integrated, offering connections to dataframe-like objects (databases) as secondary storage solutions whenever memory demands exceed available RAM resources. The RvLab runs on a PC cluster, using R version 3.1.2 (2014-10-31) on an x86_64-pc-linux-gnu (64-bit) platform, and offers an intuitive virtual environment interface enabling users to perform analysis of ecological and microbial communities based on optimized vegan functions. A beta version of the RvLab is available after registration at: https://portal.lifewatchgreece.eu/ PMID:27932907

  20. Optimized R functions for analysis of ecological community data using the R virtual laboratory (RvLab).

    PubMed

    Varsos, Constantinos; Patkos, Theodore; Oulas, Anastasis; Pavloudi, Christina; Gougousis, Alexandros; Ijaz, Umer Zeeshan; Filiopoulou, Irene; Pattakos, Nikolaos; Vanden Berghe, Edward; Fernández-Guerra, Antonio; Faulwetter, Sarah; Chatzinikolaou, Eva; Pafilis, Evangelos; Bekiari, Chryssoula; Doerr, Martin; Arvanitidis, Christos

    2016-01-01

    Parallel data manipulation using R has previously been addressed by members of the R community; however, most of these studies produce ad hoc solutions that are not readily available to the average R user. Our targeted users, ranging from expert ecologists/microbiologists to computational biologists, often experience difficulties in finding optimal ways to exploit the full capacity of their computational resources. In addition, improving the performance of commonly used R scripts becomes increasingly difficult, especially with large datasets. Furthermore, the implementations described here can be of significant interest to expert bioinformaticians or R developers. Therefore, our goals can be summarized as: (i) description of a complete methodology for the analysis of large datasets by combining capabilities of diverse R packages, (ii) presentation of their application through a virtual R laboratory (RvLab) that makes execution of complex functions and visualization of results easy and readily available to the end-user. In this paper, the novelty stems from implementations of parallel methodologies which rely on the processing of data at different levels of abstraction and the availability of these processes through an integrated portal. Parallel implementation R packages, such as the pbdMPI (Programming with Big Data - Interface to MPI) package, are used to implement Single Program Multiple Data (SPMD) parallelization of primitive mathematical operations, allowing for interplay with functions of the vegan package. The dplyr and RPostgreSQL R packages are further integrated, offering connections to dataframe-like objects (databases) as secondary storage solutions whenever memory demands exceed available RAM resources. The RvLab runs on a PC cluster, using R version 3.1.2 (2014-10-31) on an x86_64-pc-linux-gnu (64-bit) platform, and offers an intuitive virtual environment interface enabling users to perform analysis of ecological and microbial communities based on optimized vegan functions. A beta version of the RvLab is available after registration at: https://portal.lifewatchgreece.eu/.

  1. Flow Cytometry Scientist | Center for Cancer Research

    Cancer.gov

    PROGRAM DESCRIPTION The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). KEY ROLES/RESPONSIBILITIES The Flow Cytometry Core (Flow Core) in the Cancer and Inflammation Program (CIP) is a service core which supports the research efforts of the CCR by providing expertise in the field of flow cytometry (using analyzers and sorters) with the goal of gaining a more thorough understanding of the biology of the immune system, cancer, and inflammation processes. The Flow Core provides service to 12-15 CIP laboratories and more than 22 non-CIP laboratories. Flow Core staff provide technical advice on the experimental design of applications, which include immunological phenotyping, cell function assays, and cell cycle analysis. Work is performed per customer requirements, and no independent research is involved. The Flow Cytometry Scientist will be responsible for:
    - Daily management of the Flow Cytometry Core, including the supervision and guidance of technical staff members
    - Monitoring performance of, and maintaining, high-dimensional flow cytometer analyzers and cell sorters
    - Operating high-dimensional flow cytometer analyzers and cell sorters
    - Providing scientific expertise to the user community and facilitating the development of cutting-edge technologies
    - Interacting with Flow Core users and customers, and providing technical and scientific advice and guidance regarding their experiments, including possible collaborations
    - Training staff and scientific end users on the use of flow cytometry in their research, as well as teaching them how to operate and troubleshoot the bench-top analyzer instruments
    - Preparing and delivering lectures, as well as one-on-one training sessions, with customers/users
    - Ensuring that protocols are up to date and appropriately adhered to
    Experience with sterile technique and tissue culture is also required.

  2. The AtChem On-line model and Electronic Laboratory Notebook (ELN): A free community modelling tool with provenance capture

    NASA Astrophysics Data System (ADS)

    Young, J. C.; Boronska, K.; Martin, C. J.; Rickard, A. R.; Vázquez Moreno, M.; Pilling, M. J.; Haji, M. H.; Dew, P. M.; Lau, L. M.; Jimack, P. K.

    2010-12-01

    AtChem On-line is a simple-to-use zero-dimensional box modelling toolkit, developed for use by laboratory, field and chamber scientists. Any set of chemical reactions can be simulated, in particular the whole Master Chemical Mechanism (MCM) or any subset of it. Parameters and initial data can be provided through a self-explanatory web form, and the resulting model is compiled and run on a dedicated server. The core part of the toolkit, providing a robust solver for thousands of chemical reactions, is written in Fortran and uses the SUNDIALS CVODE libraries. Chemical systems can be constrained at multiple, user-determined timescales; this has enabled studies of radical chemistry at one-minute timescales. AtChem On-line is free to use and requires no installation - a web browser, text editor and any compressing software is all the user needs. CPU and storage are provided by the server (input and output data are saved indefinitely). An off-line version is also being developed, which will provide batch processing, an advanced graphical user interface and post-processing tools, for example, Rate of Production Analysis (ROPA) and chain-length analysis. The source code is freely available for advanced users wishing to adapt and run the program locally. Data management, dissemination and archiving are essential in all areas of science. In order to do this in an efficient and transparent way, there is a critical need to capture high-quality metadata/provenance for modelling activities. An Electronic Laboratory Notebook (ELN) has been developed in parallel with AtChem On-line as part of the EC EUROCHAMP2 project. In order to use controlled chamber experiments to evaluate the MCM, we need to be able to archive, track and search information on all associated chamber model runs, so that they can be used in subsequent mechanism development. Therefore it would be extremely useful if experiment and model metadata/provenance could be easily and automatically stored electronically. Archiving metadata/provenance via an ELN makes it easier to write a paper or thesis and for mechanism developers/evaluators/peer reviewers to search for appropriate experimental and modelling results and conclusions. The development of an ELN in the context of mechanism evaluation/development using large experimental chamber datasets is presented.
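
    A zero-dimensional box model of this kind boils down to integrating stiff chemical rate equations. The toy sketch below uses two reactions of the classic NOx-ozone system, with SciPy's BDF integrator standing in for the Fortran CVODE core; rate constants and concentrations are illustrative only.

      from scipy.integrate import solve_ivp

      k1 = 1.8e-14     # O3 + NO -> NO2 + O2, cm3 molecule-1 s-1 (illustrative)
      j_no2 = 8.0e-3   # NO2 photolysis frequency, s-1 (illustrative)

      def rhs(t, c):
          o3, no, no2 = c
          r1 = k1 * o3 * no    # titration of ozone by NO
          r2 = j_no2 * no2     # NO2 + hv -> NO + O(3P) -> O3, collapsed to one step
          return [r2 - r1, r2 - r1, r1 - r2]

      c0 = [7.4e11, 2.5e10, 0.0]   # initial O3, NO, NO2 in molecules cm-3
      sol = solve_ivp(rhs, (0.0, 600.0), c0, method="BDF")
      print(sol.y[:, -1])          # concentrations after ten minutes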

  3. Distress intolerance moderation of motivated attention to cannabis and negative stimuli after induced stress among cannabis users: an ERP study.

    PubMed

    Macatee, Richard J; Okey, Sarah A; Albanese, Brian J; Schmidt, Norman B; Cougle, Jesse R

    2018-05-07

    Prevalence of cannabis use is increasing, but many regular users do not develop cannabis use disorder (CUD); thus, CUD risk identification among current users is vital for targeted intervention development. Existing data suggest that high distress intolerance (DI), an individual difference reflective of the ability to tolerate negative affect, may be linked to CUD, but no studies have tested possible neurophysiological mechanisms. Increased motivated attentional processing of cannabis and negative emotional stimuli as indexed by neurophysiology [i.e. the late positive potential (LPP)], particularly during acute stress, may contribute to CUD among high DI users. Frequent cannabis users with high (n = 61) and low DI (n = 44) viewed cannabis, negative, and matched neutral images during electroencephalography (EEG) recording before and after a laboratory stressor. Cannabis cue-elicited modulation of the 1000- to 3000-milliseconds LPP was larger in high DI users at post-stressor only, although the effect was only robust in the 1000- to 2000-milliseconds window. Further, modulation magnitude in the high DI group covaried with stress-relief craving and some CUD indices in the 400- to 1000-milliseconds and 1000- to 3000-milliseconds windows, respectively. No significant effects of DI on negative stimuli-elicited LPP modulation were found, although inverse associations with some CUD indices were observed. Finally, exploratory analyses revealed some evidence for DI moderation of the relation between subjective stressor reactivity and negative stimuli-elicited LPP modulation such that greater stressor reactivity was associated with blunted versus enhanced modulation in the high and low DI groups, respectively. Negative and cannabis stimuli-elicited LPP modulation appear to index distinct, CUD-relevant neural processes in high DI cannabis users. © 2018 Society for the Study of Addiction.

  4. E-facts: business process management in clinical data repositories.

    PubMed

    Wattanasin, Nich; Peng, Zhaoping; Raine, Christine; Mitchell, Mariah; Wang, Charles; Murphy, Shawn N

    2008-11-06

    The Partners Healthcare Research Patient Data Registry (RPDR) is a centralized data repository that gathers clinical data from various hospital systems. The RPDR allows clinical investigators to obtain aggregate numbers of patients with user-defined characteristics such as diagnoses, procedures, medications, and laboratory values. They may then obtain patient identifiers and electronic medical records with prior IRB approval. Moreover, accurately identifying and efficiently populating worthwhile, quantifiable facts from doctors' reports into the RPDR is a significant process. As part of our ongoing e-Fact project, this work describes a new business process management technology that helps coordinate and simplify this procedure.
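
    As a hypothetical illustration of the aggregate queries such a repository supports, the sketch below counts distinct patients matching a user-defined diagnosis criterion over a flat table of coded facts; the column names and codes are invented and do not reflect the RPDR schema.

        # Hypothetical aggregate count of patients matching user-defined criteria,
        # over a flat extract of coded clinical facts (columns invented for illustration).
        import pandas as pd

        facts = pd.DataFrame({
            "patient_id": [1, 1, 2, 3, 3, 4],
            "fact_type":  ["diagnosis", "lab", "diagnosis", "medication", "lab", "diagnosis"],
            "code":       ["E11", "HbA1c>7", "E11", "metformin", "HbA1c>7", "I10"],
        })

        # Aggregate number of distinct patients with a given diagnosis code
        n = facts.loc[(facts.fact_type == "diagnosis") & (facts.code == "E11"),
                      "patient_id"].nunique()
        print(f"{n} patients match the criteria")  # identifiers withheld until IRB approval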

  5. Applications of digital image processing techniques to problems of data registration and correlation

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview is presented of the evolution of the computer configuration at JPL's Image Processing Laboratory (IPL). The development of techniques for the geometric transformation of digital imagery is discussed and consideration is given to automated and semiautomated image registration, and the registration of imaging and nonimaging data. The increasing complexity of image processing tasks at IPL is illustrated with examples of various applications from the planetary program and earth resources activities. It is noted that the registration of existing geocoded databases with Landsat imagery will continue to be important if the Landsat data is to be of genuine use to the user community.
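
    As a minimal sketch of the geometric-transformation step at the core of image registration, the snippet below resamples an image under a known affine mapping; a full registration pipeline would first estimate that mapping, for example from control points. The rotation and translation values are invented.

        # Resample one image into the coordinate frame of another under a known
        # affine transform (the core geometric operation in image registration).
        import numpy as np
        from scipy import ndimage

        image = np.random.rand(128, 128)           # stand-in for a sensed image
        theta = np.deg2rad(5.0)                    # assumed rotation between datasets
        A = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        offset = np.array([3.0, -2.0])             # assumed translation, pixels

        # affine_transform maps output coords -> input coords via A @ x + offset
        registered = ndimage.affine_transform(image, A, offset=offset, order=1)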

  6. Design and Implementation Issues for Modern Remote Laboratories

    ERIC Educational Resources Information Center

    Guimaraes, E. G.; Cardozo, E.; Moraes, D. H.; Coelho, P. R.

    2011-01-01

    The design and implementation of remote laboratories present different levels of complexity according to the nature of the equipments operated by the remote laboratory, the requirements imposed on the accessing computers, the network linking the user to the laboratory, and the type of experiments the laboratory supports. This paper addresses the…

  7. PTT Advisor: A CDC-supported initiative to develop a mobile clinical laboratory decision support application for the iOS platform

    PubMed Central

    Savel, Thomas G.; Lee, Brian A.; Ledbetter, Greg; Brown, Sara; LaValley, Dale; Taylor, Julie; Thompson, Pam

    2013-01-01

    Objectives: This manuscript describes the development of PTT (Partial Thromboplastin Time) Advisor, one of the first of a handful of iOS-based mobile applications to be released by the US Centers for Disease Control and Prevention (CDC). PTT Advisor has been a collaboration between two groups at CDC (Informatics R&D and Laboratory Science), and one partner team (Clinical Laboratory Integration into Healthcare Collaborative - CLIHC). The application offers clinicians a resource to quickly select the appropriate follow-up tests to evaluate patients with a prolonged PTT and a normal Prothrombin Time (PT) laboratory result. Methods: The application was designed leveraging an agile methodology, and best practices in user experience (UX) design and mobile application development. Results: As it is an open-source project, the code to PTT Advisor was made available to the public under the Apache Software License. On July 6, 2012, the free app was approved by Apple, and was published to their App Store. Conclusions: Regardless of the complexity of the mobile application, the level of effort required in the development process should not be underestimated. There are several issues that make designing the UI for a mobile phone challenging (not just small screen size): the touchscreen, users' mobile mindset (tasks need to be quick and focused), and the fact that mobile UI conventions/expectations are still being defined and refined (due to the maturity level of the field of mobile application development). PMID:23923100

  8. PTT Advisor: A CDC-supported initiative to develop a mobile clinical laboratory decision support application for the iOS platform.

    PubMed

    Savel, Thomas G; Lee, Brian A; Ledbetter, Greg; Brown, Sara; Lavalley, Dale; Taylor, Julie; Thompson, Pam

    2013-01-01

    This manuscript describes the development of PTT (Partial Thromboplastin Time) Advisor, one of the first of a handful of iOS-based mobile applications to be released by the US Centers for Disease Control and Prevention (CDC). PTT Advisor has been a collaboration between two groups at CDC (Informatics R&D and Laboratory Science), and one partner team (Clinical Laboratory Integration into Healthcare Collaborative - CLIHC). The application offers clinicians a resource to quickly select the appropriate follow-up tests to evaluate patients with a prolonged PTT and a normal Prothrombin Time (PT) laboratory result. The application was designed leveraging an agile methodology, and best practices in user experience (UX) design and mobile application development. As it is an open-source project, the code to PTT Advisor was made available to the public under the Apache Software License. On July 6, 2012, the free app was approved by Apple, and was published to their App Store. Regardless of the complexity of the mobile application, the level of effort required in the development process should not be underestimated. There are several issues that make designing the UI for a mobile phone challenging (not just small screen size): the touchscreen, users' mobile mindset (tasks need to be quick and focused), and the fact that mobile UI conventions/expectations are still being defined and refined (due to the maturity level of the field of mobile application development).

  9. Routine operation of an Elliott 903 computer in a clinical chemistry laboratory

    PubMed Central

    Whitby, L. G.; Simpson, D.

    1973-01-01

    Experience gained in the last four years concerning the capabilities and limitations of an 8K Elliott 903 (18-bit word) computer with magnetic tape backing store in the routine operation of a clinical chemistry laboratory is described. Designed as a total system, routine operation has latterly had to be confined to data acquisition and process control functions, due primarily to limitations imposed by the choice of hardware early in the project. In this final report of a partially successful experiment the opportunity is taken to review mistakes made, especially at the start of the project, to warn potential computer users of pitfalls to be avoided. PMID:4580240

  10. GEOS. User Tutorials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Pengchen; Settgast, Randolph R.; Johnson, Scott M.

    2014-12-17

    GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as a part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.

  11. CONCEPTUAL DESIGN REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ROBINSON,K.

    2006-12-31

    Brookhaven National Laboratory has prepared a conceptual design for a world class user facility for scientific research using synchrotron radiation. This facility, called the "National Synchrotron Light Source II" (NSLS-II), will provide ultra high brightness and flux and exceptional beam stability. It will also provide advanced insertion devices, optics, detectors, and robotics, and a suite of scientific instruments designed to maximize the scientific output of the facility. Together these will enable the study of material properties and functions with a spatial resolution of ~1 nm, an energy resolution of ~0.1 meV, and the ultra high sensitivity required to perform spectroscopy on a single atom. The overall objective of the NSLS-II project is to deliver a research facility to advance fundamental science and have the capability to characterize and understand physical properties at the nanoscale, the processes by which nanomaterials can be manipulated and assembled into more complex hierarchical structures, and the new phenomena resulting from such assemblages. It will also be a user facility made available to researchers engaged in a broad spectrum of disciplines from universities, industries, and other laboratories.

  12. Bushland Evapotranspiration and Agricultural Remote Sensing System (BEARS) software

    NASA Astrophysics Data System (ADS)

    Gowda, P. H.; Moorhead, J.; Brauer, D. K.

    2017-12-01

    Evapotranspiration (ET) is a major component of the hydrologic cycle. ET data are used for a variety of water management and research purposes such as irrigation scheduling, water and crop modeling, streamflow, water availability, and many more. Remote sensing products have been widely used to create spatially representative ET data sets which provide important information from field to regional scales. As UAV capabilities increase, remote sensing use is likely to also increase. For that purpose, scientists at the USDA-ARS research laboratory in Bushland, TX developed the Bushland Evapotranspiration and Agricultural Remote Sensing System (BEARS) software. The BEARS software is a Java-based application that allows users to process remote sensing data to generate ET outputs using predefined models, or to enter custom equations and models. The capability to define new equations and build new models expands the applicability of the BEARS software beyond ET mapping to any remote sensing application. The software also includes an image viewing tool that allows users to visualize outputs, as well as draw an area of interest using various shapes. This software is freely available from the USDA-ARS Conservation and Production Research Laboratory website.
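
    As a sketch of the user-defined-equation idea (BEARS itself is Java; this Python analogue is illustrative only), the snippet below applies a simplified Priestley-Taylor-style expression per pixel to stand-in raster arrays; the coefficients and input values are assumptions, not BEARS defaults.

        # Apply a user-defined ET-style equation per pixel to raster arrays
        # (simplified Priestley-Taylor form; coefficients are illustrative).
        import numpy as np

        rn = np.random.uniform(300, 600, (100, 100))   # net radiation, W m^-2 (stand-in)
        g = 0.1 * rn                                   # soil heat flux, assumed fraction
        delta, gamma, alpha = 0.145, 0.066, 1.26       # kPa/degC slopes and PT coefficient

        # Latent heat flux, then convert W m^-2 to mm/day (1 W m^-2 ~ 0.0353 mm/day)
        le = alpha * (delta / (delta + gamma)) * (rn - g)
        et_mm_day = le * 0.0353
        print(et_mm_day.mean())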

  13. Development of coffee maker service robot using speech and face recognition systems using POMDP

    NASA Astrophysics Data System (ADS)

    Budiharto, Widodo; Meiliana; Santoso Gunawan, Alexander Agung

    2016-07-01

    There have been many developments of intelligent service robots designed to interact with users naturally. This can be done by embedding speech and face recognition abilities for specific tasks into the robot. In this research, we propose an Intelligent Coffee Maker Robot whose speech recognition is based on the Indonesian language and powered by statistical dialog systems. This kind of robot can be used in the office, supermarket or restaurant. In our scenario, the robot recognizes the user's face and then accepts commands from the user to perform an action, specifically making a coffee. Based on our previous work, the accuracy of speech recognition is about 86% and of face recognition about 93% in laboratory experiments. The main problem here is to determine the user's intention regarding how sweet the coffee should be. The intelligent coffee maker robot should infer the user's intention through conversation under unreliable automatic speech recognition in a noisy environment. In this paper, this spoken dialog problem is treated as a partially observable Markov decision process (POMDP). We describe how this formulation establishes a promising framework, supported by empirical results. Dialog simulations are presented which demonstrate significant quantitative outcomes.
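
    A minimal sketch of the POMDP bookkeeping this involves: a Bayesian belief update over the user's intended coffee sweetness given a noisy speech observation. The states, words and observation probabilities are invented for illustration.

        # Belief update over a hidden user intention (coffee sweetness) from a
        # noisy speech observation, the core POMDP bookkeeping step.
        import numpy as np

        states = ["no_sugar", "medium", "sweet"]
        belief = np.array([1 / 3, 1 / 3, 1 / 3])      # uniform prior over intentions

        # P(heard word | true intention): rows = states, cols = recognized words
        # (error rates invented; a real model would come from ASR confusion data)
        obs_model = np.array([
            [0.7, 0.2, 0.1],   # true "no_sugar"
            [0.2, 0.6, 0.2],   # true "medium"
            [0.1, 0.2, 0.7],   # true "sweet"
        ])

        def update(belief, word_idx):
            """Bayes rule: b'(s) is proportional to P(o|s) * b(s)."""
            b = obs_model[:, word_idx] * belief
            return b / b.sum()

        belief = update(belief, word_idx=2)           # robot hears "sweet"
        print(dict(zip(states, belief.round(3))))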

  14. User's manual for the National Water-Quality Assessment Program Invertebrate Data Analysis System (IDAS) software, version 5

    USGS Publications Warehouse

    Cuffney, Thomas F.; Brightbill, Robin A.

    2011-01-01

    The Invertebrate Data Analysis System (IDAS) software was developed to provide an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. The IDAS software is a stand-alone program for personal computers that run Microsoft Windows®. It allows users to read data downloaded from the NAWQA Program Biological Transactional Database (Bio-TDB) or to import data from other sources either as Microsoft Excel® or Microsoft Access® files. The program consists of five modules: Edit Data, Data Preparation, Calculate Community Metrics, Calculate Diversities and Similarities, and Data Export. The Edit Data module allows the user to subset data on the basis of taxonomy or sample type, extract a random subsample of data, combine or delete data, summarize distributions, resolve ambiguous taxa (see glossary) and conditional/provisional taxa, import non-NAWQA data, and maintain and create files of invertebrate attributes that are used in the calculation of invertebrate metrics. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa on the basis of laboratory processing notes, delete pupae or terrestrial adults, combine lifestages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa on the basis of the number of sites where a taxon occurs and (or) the abundance of a taxon in a sample, and resolve taxonomic ambiguities by one of four methods. The Calculate Community Metrics module allows the user to calculate 184 community metrics, including metrics based on organism tolerances, functional feeding groups, and behavior. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages (CANOCO, Primer, PC-ORD, MVSP) and produce tables of community data that can be imported into spreadsheet, database, graphics, statistics, and word-processing programs. The IDAS program facilitates the documentation of analyses by keeping a log of the data that are processed, the files that are generated, and the program settings used to process the data. Though the IDAS program was developed to process NAWQA Program invertebrate data downloaded from Bio-TDB, the Edit Data module includes tools that can be used to convert non-NAWQA data into Bio-TDB format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used to process data generated outside of the NAWQA Program.
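
    As a flavor of what the diversity and similarity module computes, the sketch below evaluates the Shannon diversity index H' = -sum(p_i ln p_i) and a Bray-Curtis similarity from taxa counts; the abundance data are invented and the formulas are the standard textbook definitions, not necessarily IDAS's exact implementations.

        # Shannon diversity (H') and a simple pairwise similarity from taxa counts,
        # the kind of indices an invertebrate data analysis module computes.
        import numpy as np

        counts_site1 = np.array([30, 5, 12, 0, 3])   # abundances per taxon (invented)
        counts_site2 = np.array([10, 0, 20, 8, 2])

        def shannon(counts):
            p = counts[counts > 0] / counts.sum()
            return -(p * np.log(p)).sum()

        def bray_curtis_similarity(x, y):
            return 1.0 - np.abs(x - y).sum() / (x + y).sum()

        print(f"H' site1 = {shannon(counts_site1):.3f}")
        print(f"Bray-Curtis similarity = {bray_curtis_similarity(counts_site1, counts_site2):.3f}")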

  15. Robotic laboratory for distance education

    NASA Astrophysics Data System (ADS)

    Luciano, Sarah C.; Kost, Alan R.

    2016-09-01

    This project involves the construction of a remote-controlled laboratory experiment that can be accessed by online students. The project addresses the need to provide a laboratory experience for students taking online courses, comparable to an in-class experience. The chosen task for the remote user is an optical engineering experiment, specifically aligning a spatial filter. We instrumented the physical laboratory setup in Tucson, AZ at the University of Arizona. The hardware in the spatial filter experiment is augmented by motors and cameras to allow the user to remotely control the hardware. The user interacts with software on their computer, which communicates via an Internet connection with a server on the host computer in the Optics Laboratory at the University of Arizona. Our final overall system comprises several subsystems. These are the optical experiment set-up, which is a spatial filter experiment; the mechanical subsystem, which interfaces the motors with the micrometers to move the optical hardware; the electrical subsystem, which enables electrical communications from the remote computer to the host computer to the hardware; and finally the software subsystem, which is the means by which messages are communicated throughout the system. The goal of the project is to convey as much of an in-lab experience as possible by allowing the user to directly manipulate hardware and receive visual feedback in real time. Thus, the remote user is able to learn important concepts from this particular experiment and to connect theory to the physical world by actually seeing the outcome of a procedure. The latter is a learning experience that is often lost with distance learning and one that this project aims to provide.
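
    A hedged sketch of the message path described above, assuming a simple line-oriented TCP protocol between the student's client and the host computer driving the motors; the host name, port number and command syntax are all invented for illustration.

        # Hypothetical client for a remote optics lab: send a motor step command
        # over TCP to the host computer (protocol and port invented for illustration).
        import socket

        HOST, PORT = "optics-lab.example.edu", 5025   # assumed host and port

        def move_micrometer(axis: str, steps: int) -> str:
            """Send one line-oriented command and return the host's reply."""
            with socket.create_connection((HOST, PORT), timeout=5) as sock:
                sock.sendall(f"MOVE {axis} {steps}\n".encode())
                return sock.recv(1024).decode().strip()

        # e.g. nudge the spatial filter's x-axis micrometer by 10 motor steps:
        # print(move_micrometer("X", 10))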

  16. Goal Management Training and Mindfulness Meditation improve executive functions and transfer to ecological tasks of daily life in polysubstance users enrolled in therapeutic community treatment.

    PubMed

    Valls-Serrano, Carlos; Caracuel, Alfonso; Verdejo-Garcia, Antonio

    2016-08-01

    We have previously shown that Goal Management Training+Mindfulness Meditation (GMT+MM) improves executive functions in polysubstance users enrolled in outpatient treatment. The aim of this study was to establish whether GMT+MM has similar positive effects on executive functions in polysubstance users in residential treatment, and whether executive-function gains transfer to more ecologically valid goal-oriented tasks. Thirty-two polysubstance users were randomly allocated to eight weeks of GMT+MM (n=16) or control, i.e., no intervention (n=16); both groups received treatment as usual. Outcome measures included performance in laboratory tasks of basic and complex executive functions (i.e., basic: working memory and inhibition; complex: planning and self-regulation) and in an ecological task of goal-directed behavior (the Multiple Errands Test - contextualized version, MET-CV), measured post-intervention. Results showed that GMT+MM was superior to control in improving basic measures of working memory (Letter-number sequencing; F=4.516, p=0.049) and reflection impulsivity (Information Sampling Test; F=6.217, p=0.018), along with initial thinking times during planning (Zoo Map Test; F=8.143, p=0.008). In addition, GMT+MM was superior to control in improving performance in the MET-CV (task failures; F=8.485, p=0.007). Our findings demonstrate that GMT+MM increases reflective processes and the achievement of goals in daily activities; furthermore, ecological tests can detect changes more readily than laboratory tasks. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. The Prosocial Effects of 3,4-methylenedioxymethamphetamine (MDMA): Controlled Studies in Humans and Laboratory Animals

    PubMed Central

    Kamilar-Britt, Philip; Bedi, Gillinder

    2015-01-01

    Users of ±3,4-Methylenedioxymethamphetamine (MDMA; ‘ecstasy’) report prosocial effects such as sociability and empathy. Supporting these apparently unique social effects, data from controlled laboratory studies indicate that MDMA alters social feelings, information processing, and behavior in humans, and social behavior in rodents. Here, we review this growing body of evidence. In rodents, MDMA increases passive prosocial behavior (adjacent lying) and social reward while decreasing aggression, effects that may involve serotonin 1A receptor mediated oxytocin release interacting with vasopressin receptor 1A. In humans, MDMA increases plasma oxytocin and produces feelings of social affiliation. It decreases identification of negative facial expressions (cognitive empathy) and blunts responses to social rejection, while enhancing responses to others’ positive emotions (emotional empathy) and increasing social approach. Thus, consistent with drug folklore, laboratory administration of MDMA robustly alters social processing in humans and increases social approach in humans and animals. Effects are consistent with increased sociability, with mixed evidence about enhanced empathy. These neurobiologically-complex prosocial effects likely motivate recreational ecstasy use. PMID:26408071

  18. The prosocial effects of 3,4-methylenedioxymethamphetamine (MDMA): Controlled studies in humans and laboratory animals.

    PubMed

    Kamilar-Britt, Philip; Bedi, Gillinder

    2015-10-01

    Users of ±3,4-methylenedioxymethamphetamine (MDMA; 'ecstasy') report prosocial effects such as sociability and empathy. Supporting these apparently unique social effects, data from controlled laboratory studies indicate that MDMA alters social feelings, information processing, and behavior in humans, and social behavior in rodents. Here, we review this growing body of evidence. In rodents, MDMA increases passive prosocial behavior (adjacent lying) and social reward while decreasing aggression, effects that may involve serotonin 1A receptor mediated oxytocin release interacting with vasopressin receptor 1A. In humans, MDMA increases plasma oxytocin and produces feelings of social affiliation. It decreases identification of negative facial expressions (cognitive empathy) and blunts responses to social rejection, while enhancing responses to others' positive emotions (emotional empathy) and increasing social approach. Thus, consistent with drug folklore, laboratory administration of MDMA robustly alters social processing in humans and increases social approach in humans and animals. Effects are consistent with increased sociability, with mixed evidence about enhanced empathy. These neurobiologically-complex prosocial effects likely motivate recreational ecstasy use. Copyright © 2015. Published by Elsevier Ltd.

  19. Harmonization activities of Noklus - a quality improvement organization for point-of-care laboratory examinations.

    PubMed

    Stavelin, Anne; Sandberg, Sverre

    2018-05-16

    Noklus is a non-profit quality improvement organization that focuses on improving all elements of the total testing process. The aim is to ensure that all medical laboratory examinations are ordered, performed and interpreted correctly and in accordance with the patients' needs for investigation, treatment and follow-up. For 25 years, Noklus has focused on point-of-care (POC) testing in primary healthcare laboratories and has more than 3100 voluntary participants. The Noklus quality system uses different tools to obtain harmonization and improvement: (1) external quality assessment for the pre-examination, examination and postexamination phases to monitor the harmonization process and to identify areas that need improvement and harmonization, (2) manufacturer-independent evaluations of the analytical quality and user-friendliness of POC instruments and (3) close interactions with and follow-up of the participants through site visits, courses, training and guidance. Noklus also recommends which tests should be performed in different facilities such as general practitioner offices, nursing homes, home care, etc. About 400 courses with more than 6000 delegates are organized annually. In 2017, more than 21,000 e-learning programs were completed.

  20. Utilization of the High Flux Isotope Reactor at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Selby, Douglas L; Bilheux, Hassina Z; Meilleur, Flora

    2015-01-01

    This paper addresses several aspects of the scientific utilization of the Oak Ridge National Laboratory High Flux Isotope Reactor (HFIR). Topics covered include: 1) HFIR neutron scattering instruments and the formal instrument user program; 2) recent upgrades to the neutron scattering instrument stations at the reactor; and 3) eMod, a new tool for addressing instrument modifications and providing configuration control and a design process for scientific instruments at HFIR and the Spallation Neutron Source (SNS). There are 15 operating neutron instrument stations at HFIR, with 12 of them organized into a formal user program. Since the last presentation on HFIR instruments at IGORR we have installed a Single Crystal Quasi-Laue Diffractometer instrument called IMAGINE, and we have made significant upgrades to HFIR neutron scattering instruments including the Cold Triple Axis Instrument, the Wide Angle Neutron Diffractometer, the Powder Diffractometer, and the Neutron Imaging station. In addition, we have initiated upgrades to the Thermal Triple Axis Instrument and the Bio-SANS cold neutron instrument detector system. All of these upgrades are tied to a continuous effort to maintain a high-level neutron scattering user program at the HFIR. For the purpose of tracking modifications such as those mentioned, and for configuration control, we have been developing an electronic system for entering instrument modification requests that follows a modification or instrument project through concept development, design, fabrication, installation, and commissioning. This system, which we call eMod, electronically leads the task leader through a series of questions and checklists that identify such things as ES&H and radiological issues and then automatically designates specific individuals for the activity review process. The system has been in use for less than a year and we are still working out some of the inefficiencies, but we believe that this will become a very effective tool for achieving the configuration and process control believed to be necessary for scientific instrument systems.

  1. Shifting Sands and Turning Tides: Using 3D Visualization Technology to Shape the Environment for Undergraduate Students

    NASA Astrophysics Data System (ADS)

    Jenkins, H. S.; Gant, R.; Hopkins, D.

    2014-12-01

    Teaching natural science in a technologically advancing world requires that our methods reach beyond the traditional computer interface. Innovative 3D visualization techniques and real-time augmented user interfaces enable students to create realistic environments to understand the world around them. Here, we present a series of laboratory activities that utilize an Augmented Reality Sandbox to teach basic concepts of hydrology, geology, and geography to undergraduates at Harvard University and the University of Redlands. The Augmented Reality (AR) Sandbox utilizes a real sandbox that is overlain by a digital projection of topography and a color elevation map. A Microsoft Kinect 3D camera feeds altimetry data into a software program that maps this information onto the sand surface using a digital projector. Students can then manipulate the sand and observe as the Sandbox augments their manipulations with projections of contour lines, an elevation color map, and a simulation of water. The idea for the AR Sandbox was conceived at MIT by the Tangible Media Group in 2002, and the simulation software used here was written and developed by Dr. Oliver Kreylos of the University of California - Davis as part of the NSF-funded LakeViz3D project. Between 2013 and 2014, we installed AR Sandboxes at Harvard and the University of Redlands and developed laboratory exercises to teach flooding hazard, erosion and watershed development in undergraduate earth and environmental science courses. In 2013, we introduced a series of AR Sandbox laboratories in Introductory Geology, Hydrology, and Natural Disasters courses. We found that laboratories utilizing the AR Sandbox at both universities allowed students to become quickly immersed in the learning process, enabling a more intuitive understanding of the processes that govern the natural world. The physical interface of the AR Sandbox reduces barriers to learning, can be used to rapidly illustrate basic concepts of geology, geography and hydrology, and enabled our undergraduate students to understand topography intuitively. We therefore find the AR Sandbox to be a novel teaching tool and an effective demonstration of the capabilities of 3D visualization and real-time augmented user interfaces that enable students to better understand environmental processes.
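
    A minimal sketch of the sandbox's core mapping, assuming a Kinect-style depth frame as input: convert depth to relative elevation, apply a hypsometric colormap and overlay contour lines. The synthetic frame and the 25 mm contour interval are illustrative.

        # Map a depth frame to a colored elevation image with contour lines,
        # the basic transformation behind an AR sandbox projection.
        import numpy as np
        import matplotlib.pyplot as plt

        # Smooth synthetic stand-in for a Kinect depth frame (millimeters)
        y, x = np.mgrid[0:240, 0:320]
        depth_mm = 1000 + 50 * np.sin(x / 40.0) * np.cos(y / 30.0)
        elevation = depth_mm.max() - depth_mm      # nearer sand reads as higher ground

        # Hypsometric tint plus contour lines every 25 mm of relief
        fig, ax = plt.subplots()
        ax.imshow(elevation, cmap="terrain")
        ax.contour(elevation, levels=np.arange(0, 100, 25), colors="k", linewidths=0.5)
        ax.set_axis_off()
        fig.savefig("sandbox_projection.png", dpi=150)  # frame sent to the projector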

  2. Medical Services: Department of Defense Veterinary/Medical Laboratory Food Safety and Quality Assurance Program

    DTIC Science & Technology

    1995-02-01

    ice (cream) milk and yogurt processing at the retail/user level (chap 5). o Revises and prescribes the use of DD Form 2385 (Microbiological Quality... yogurt, sandwiches and spreads, salad type convenience foods, and other processed/pre-packaged and ready-to-eat (RTE) foo...dairy products to include, but not be limited to, fresh and cultured dairy products, frozen desserts, soft serve ice cream/milk and yogurt mix. (3) RTE

  3. Virtual Geophysics Laboratory: Exploiting the Cloud and Empowering Geophysicists

    NASA Astrophysics Data System (ADS)

    Fraser, Ryan; Vote, Josh; Goh, Richard; Cox, Simon

    2013-04-01

    Over the last five decades geoscientists from Australian state and federal agencies have collected and assembled around 3 Petabytes of geoscience data sets under public funding. As a consequence of technological progress, data is now being acquired at exponential rates and in higher resolution than ever before. Effective use of these big data sets challenges the storage and computational infrastructure of most organizations. The Virtual Geophysics Laboratory (VGL) is a scientific workflow portal that addresses some of the resulting issues by providing Australian geophysicists with access to a Web 2.0 or Rich Internet Application (RIA) based integrated environment that exploits eResearch tools and Cloud computing technology, and promotes collaboration within the user community. VGL simplifies and automates large portions of what were previously manually intensive scientific workflow processes, allowing scientists to focus on the natural science problems rather than computer science and IT. A number of geophysical processing codes are incorporated to support multiple workflows. For example, a gravity inversion can be performed by combining the Escript/Finley codes (from the University of Queensland) with the gravity data registered in VGL. Likewise, tectonic processes can be modeled by combining the Underworld code (from Monash University) with one of the various 3D models available to VGL. Cloud services provide scalable and cost-effective compute resources. VGL is built on top of mature standards-compliant information services, many deployed using the Spatial Information Services Stack (SISS), which provides direct access to geophysical data. A large number of data sets from Geoscience Australia assist users in data discovery. GeoNetwork provides a metadata catalog to store workflow results for future use, discovery and provenance tracking. VGL has been developed in collaboration with the research community using incremental software development practices and open source tools. While developed to provide the geophysics research community with a sustainable platform and scalable infrastructure, VGL has also produced a number of concepts, patterns and generic components which have been reused for cases beyond geophysics, including natural hazards, satellite processing and other areas requiring spatial data discovery and processing. Future plans for VGL include a number of improvements in both functional and non-functional areas in response to its user community's needs and advancement in information technologies. In particular, research is underway in the following areas: (a) distributed and parallel workflow processing in the cloud, (b) seamless integration with various cloud providers, and (c) integration with virtual laboratories representing other science domains. Acknowledgements: VGL was developed by CSIRO in collaboration with Geoscience Australia, National Computational Infrastructure, Australia National University, Monash University and University of Queensland, and has been supported by the Australian Government's Education Investment Funds through NeCTAR.

  4. User Access | Energy Systems Integration Facility | NREL

    Science.gov Websites

    The ESIF houses an unparalleled collection of state-of-the-art capabilities. Through its user access program, the ESIF allows researchers access to its premier laboratories in support of research and development that aims to optimize our entire energy system at full power. Requests for access

  5. Space nuclear safety from a user's viewpoint

    NASA Technical Reports Server (NTRS)

    Campbell, R. W.

    1985-01-01

    The National Aeronautics and Space Administration (NASA) launched the Jet Propulsion Laboratory's (JPL) two Voyager spacecraft to Jupiter in 1977, each using three radioisotope thermoelectric generators (RTGs) supplied by the Department of Energy (DOE) for onboard electric power. In 1986 NASA will launch JPL's Galileo spacecraft to Jupiter equipped with two DOE supplied RTGs of an improved design. NASA and JPL are also responsible for obtaining a single RTG of this type from DOE and supplying it to the European Space Agency as part of its participation in the International Solar Polar Mission. As a result of these missions, JPL has been deeply involved in space nuclear safety as a user. This paper will give a brief review of the user contributions by JPL - and NASA in general - to the nuclear safety processes and relate them to the overall nuclear safety program necessary for the launch of an RTG. The two major safety areas requiring user support are the ground operations involving RTGs at the launch site and the failure modes and probabilities associated with launch accidents.

  6. The evaluation of a web-based incident reporting system.

    PubMed

    Kuo, Ya-Hui; Lee, Ting-Ting; Mills, Mary Etta; Lin, Kuan-Chia

    2012-07-01

    A Web-based reporting system is essential to report incident events anonymously and confidentially. The purpose of this study was to evaluate a Web-based reporting system in Taiwan. User satisfaction and impact of system use were evaluated through a survey answered by 249 nurses. Incident events reported in paper and electronic systems were collected for comparison purposes. Study variables included system user satisfaction, willingness to report, number of reports, severity of the events, and efficiency of the reporting process. Results revealed that senior nurses were less willing to report events, nurses on internal medicine units had higher satisfaction than others, and lowest satisfaction was related to the time it took to file a report. In addition, the Web-based reporting system was used more often than the paper system. The percentages of events reported were significantly higher in the Web-based system in laboratory, environment/device, and incidents occurring in other units, whereas the proportions of reports involving bedsores and dislocation of endotracheal tubes were decreased. Finally, moderate injury event reporting decreased, whereas minor or minimal injury event reporting increased. The study recommends that the data entry process be simplified and the network system be improved to increase user satisfaction and reporting rates.

  7. National Ignition Facility, High-Energy-Density and Inertial Confinement Fusion, Peer-Review Panel (PRP) Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keane, C. J.

    2014-01-28

    The National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL) is operated as a National Nuclear Security Administration (NNSA) user facility in accordance with Department of Energy (DOE) best practices, including peer-reviewed experiments, regular external reviews of performance, and the use of a management structure that facilitates user and stakeholder feedback. NIF facility time is managed using processes similar to those in other DOE science facilities and is tailored to meet the mix of missions and customers that NIF supports. The NIF Governance Plan describes the process for allocating facility time on NIF and for creating the shot schedule. It also includes the flow of responsibility from entity to entity. The plan works to ensure that NIF meets its mission goals using the principles of scientific peer review, including transparency and cooperation among the sponsor, the NIF staff, and the various user communities. The NIF Governance Plan, dated September 28, 2012, was accepted and signed by LLNL Director Parney Albright, NIF Director Ed Moses, and Don Cook and Thomas D’Agostino of NNSA. Figure 1 shows the organizational structure for NIF Governance.

  8. Diagnostic equipment outside the laboratory.

    PubMed Central

    Burrin, J M; Fyffe, J A

    1988-01-01

    A questionnaire was circulated to clinical biochemistry laboratories in the North West Thames region of the United Kingdom requesting information on extralaboratory equipment. Data on the types and numbers of instruments in use, their relationship with the laboratory, and quality assurance procedures were obtained. Laboratories were prepared to maintain equipment over which they had no responsibility for purchase, training of users, or use. The quality assurance of these instruments gave even greater cause for concern. Although internal quality control procedures were performed on many of the instruments, laboratories were involved in only a minority of these procedures. Quality control procedures and training of users were undertaken on site in less than 50% of blood gas analysers and bilirubin meters and in less than 25% of glucose meters. External quality assessment procedures were non-existent for all of the instruments in use with the exception of glucose stick meters in two laboratories. PMID:3192750

  9. DOE - BES Nanoscale Science Research Centers (NSRCs)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beecher, Cathy Jo

    2016-11-14

    These are slides from a PowerPoint presentation shown to guests during tours of the Center for Integrated Nanotechnologies (CINT) at Los Alamos National Laboratory. The slides show the five DOE-BES Nanoscale Science Research Centers (NSRCs), which are located at different national laboratories throughout the country, and then go into detail specifically about the Center for Integrated Nanotechnologies at LANL, including statistics on its user community and CINT's New Mexico industrial users.

  10. The price elasticity of demand for heroin: matched longitudinal and experimental evidence

    PubMed Central

    Olmstead, Todd A.; Alessi, Sheila M.; Kline, Brendan; Pacula, Rosalie Liccardo; Petry, Nancy M.

    2015-01-01

    This paper reports estimates of the price elasticity of demand for heroin based on a newly constructed dataset. The dataset has two matched components concerning the same sample of regular heroin users: longitudinal information about real-world heroin demand (actual price and actual quantity at daily intervals for each heroin user in the sample) and experimental information about laboratory heroin demand (elicited by presenting the same heroin users with scenarios in a laboratory setting). Two empirical strategies are used to estimate the price elasticity of demand for heroin. The first strategy exploits the idiosyncratic variation in the price experienced by a heroin user over time that occurs in markets for illegal drugs. The second strategy exploits the experimentally-induced variation in price experienced by a heroin user across experimental scenarios. Both empirical strategies result in the estimate that the conditional price elasticity of demand for heroin is approximately −0.80. PMID:25702687
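
    The elasticity itself is the slope d ln Q / d ln P; the sketch below shows the standard log-log regression behind an estimate like -0.80, using invented price-quantity pairs rather than the paper's data.

        # Estimate price elasticity of demand as the slope of ln(quantity) on
        # ln(price); data are invented to illustrate the calculation only.
        import numpy as np

        price = np.array([8.0, 10.0, 12.0, 15.0, 20.0])    # $/unit (invented)
        quantity = np.array([5.2, 4.4, 3.9, 3.3, 2.6])     # units/day (invented)

        slope, intercept = np.polyfit(np.log(price), np.log(quantity), 1)
        print(f"estimated price elasticity = {slope:.2f}")  # about -0.76 for these numbers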

  11. The Sixth Omega Laser Facility Users Group Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petrasso, R. D.

    A capacity gathering of over 100 researchers from 25 universities and laboratories met at the Laboratory for Laser Energetics (LLE) for the Sixth Omega Laser Facility Users Group (OLUG) workshop. The purpose of the 2.5-day workshop was to facilitate communications and exchanges among individual OMEGA users, and between users and the LLE management; to present ongoing and proposed research; to encourage research opportunities and collaborations that could be undertaken at the Omega Laser Facility and in a complementary fashion at other facilities [such as the National Ignition Facility (NIF) or the Laboratoire pour l’Utilisation des Lasers Intenses (LULI)]; to provide an opportunity for students, postdoctoral fellows, and young researchers to present their research in an informal setting; and to provide feedback from the users to LLE management about ways to improve and keep the facility and future experimental campaigns at the cutting edge.

  12. The Fifth Omega Laser Facility Users Group Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petrasso, R. D.

    A capacity gathering of over 100 researchers from 25 universities and laboratories met at the Laboratory for Laser Energetics (LLE) for the Fifth Omega Laser Facility Users Group (OLUG) workshop. The purpose of the 2.5-day workshop was to facilitate communications and exchanges among individual Omega users and between users and the LLE management; to present ongoing and proposed research; to encourage research opportunities and collaborations that could be undertaken at the Omega Laser Facility and in a complementary fashion at other facilities [such as the National Ignition Facility (NIF) or the Laboratoire pour l’Utilisation des Lasers Intenses (LULI)]; to provide an opportunity for students, postdoctoral fellows, and young researchers to present their research in an informal setting; and to provide feedback to LLE management from the users about ways to improve the facility and future experimental campaigns.

  13. Using Evernote as an electronic lab notebook in a translational science laboratory.

    PubMed

    Walsh, Emily; Cho, Ilseung

    2013-06-01

    Electronic laboratory notebooks (ELNs) offer significant advantages over traditional paper laboratory notebooks (PLNs), yet most research labs today continue to use paper documentation. While biopharmaceutical companies represent the largest portion of ELN users, government and academic labs trail far behind in their usage. Our lab, a translational science laboratory at New York University School of Medicine (NYUSoM), wanted to determine if an ELN could effectively replace PLNs in an academic research setting. Over 6 months, we used the program Evernote to record all routine experimental information. We also surveyed students working in research laboratories at NYUSoM on the relative advantages and limitations of ELNs and PLNs and discovered that electronic and paper notebook users alike reported the inability to freehand into a notebook as a limitation when using electronic methods. Using Evernote, we found that the numerous advantages of ELNs greatly outweighed the inability to freehand directly into a notebook. We also used imported snapshots and drawing program add-ons to obviate the need for freehanding. Thus, we found that using Evernote as an ELN not only effectively replaces PLNs in an academic research setting but also provides users with a wealth of other advantages over traditional paper notebooks.

  14. NSUF Irradiated Materials Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, James Irvin

    The Nuclear Science User Facilities (NSUF) has been in the process of establishing an innovative Irradiated Materials Library concept for maximizing the value of previous and on-going materials and nuclear fuels irradiation test campaigns, including utilization of real-world components retrieved from current and decommissioned reactors. When the ATR National Scientific User Facility was established in 2007, one of the goals of the program was to establish a library of irradiated samples for users to access and conduct research on through a competitively reviewed proposal process. As part of the initial effort, staff at the user facility identified legacy materials from previous programs that are still being stored in laboratories and hot-cell facilities at the INL. In addition, other materials of interest stored outside the INL were identified, which their current owners have volunteered to enter into the library. Finally, over the course of the last several years, the ATR NSUF has irradiated more than 3500 specimens as part of NSUF competitively awarded research projects. The logistics of managing this large inventory of highly radioactive specimens pose unique challenges. This document describes the materials in the library, outlines the policy for accessing these materials, and puts forth a strategy for making new additions to the library, as well as establishing guidelines for the minimum pedigree needed for inclusion in the library, to limit the amount of material stored indefinitely without identified value.

  15. Comparing Binaural Pre-processing Strategies II: Speech Intelligibility of Bilateral Cochlear Implant Users.

    PubMed

    Baumgärtel, Regina M; Hu, Hongmei; Krawczyk-Becker, Martin; Marquardt, Daniel; Herzke, Tobias; Coleman, Graham; Adiloğlu, Kamil; Bomke, Katrin; Plotz, Karsten; Gerkmann, Timo; Doclo, Simon; Kollmeier, Birger; Hohmann, Volker; Dietz, Mathias

    2015-12-30

    Several binaural audio signal enhancement algorithms were evaluated with respect to their potential to improve speech intelligibility in noise for users of bilateral cochlear implants (CIs). 50% speech reception thresholds (SRT50) were assessed using an adaptive procedure in three distinct, realistic noise scenarios. All scenarios were highly nonstationary, complex, and included a significant amount of reverberation. Other aspects, such as the perfectly frontal target position, were idealized laboratory settings, allowing the algorithms to perform better than in corresponding real-world conditions. Eight bilaterally implanted CI users, wearing devices from three manufacturers, participated in the study. In all noise conditions, a substantial improvement in SRT50 compared to the unprocessed signal was observed for most of the algorithms tested, with the largest improvements generally provided by binaural minimum variance distortionless response (MVDR) beamforming algorithms. The largest overall improvement in speech intelligibility was achieved by an adaptive binaural MVDR in a spatially separated, single competing talker noise scenario. A no-pre-processing condition and adaptive differential microphones without a binaural link served as the two baseline conditions. SRT50 improvements provided by the binaural MVDR beamformers surpassed the performance of the adaptive differential microphones in most cases. Speech intelligibility improvements predicted by instrumental measures were shown to account for some but not all aspects of the perceptually obtained SRT50 improvements measured in bilaterally implanted CI users. © The Author(s) 2015.
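
    A compact sketch of the MVDR beamformer central to the best-performing algorithms: the weights w = R^-1 d / (d^H R^-1 d) minimize output noise power while passing the target direction undistorted. The microphone count, steering vector and noise covariance below are simulated, not taken from the study.

        # Narrowband MVDR beamformer: w = R^-1 d / (d^H R^-1 d), which keeps the
        # target direction undistorted while minimizing noise power (simulated data).
        import numpy as np

        rng = np.random.default_rng(0)
        n_mics, n_frames = 4, 2000
        d = np.ones(n_mics, dtype=complex)    # steering vector for a frontal target

        # Simulated noise snapshots and sample covariance (diagonal loading for stability)
        noise = (rng.standard_normal((n_mics, n_frames))
                 + 1j * rng.standard_normal((n_mics, n_frames)))
        R = noise @ noise.conj().T / n_frames + 1e-3 * np.eye(n_mics)

        Rinv_d = np.linalg.solve(R, d)
        w = Rinv_d / (d.conj() @ Rinv_d)      # MVDR weights
        print(abs(w.conj() @ d))              # distortionless constraint: prints 1.0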

  16. Scaling of an information system in a public healthcare market--infrastructuring from the vendor's perspective.

    PubMed

    Johannessen, Liv Karen; Obstfelder, Aud; Lotherington, Ann Therese

    2013-05-01

    The purpose of this paper is to explore the making and scaling of information infrastructures, as well as how the conditions for scaling a component may change for the vendor. The first research question is how the making and scaling of a healthcare information infrastructure can be done and by whom. The second question is what scope for manoeuvre there might be for vendors aiming to expand their market. This case study is based on an interpretive approach, whereby data is gathered through participant observation and semi-structured interviews. A case study of the making and scaling of an electronic system for general practitioners ordering laboratory services from hospitals is described as comprising two distinct phases. The first may be characterized as an evolving phase, when development, integration and implementation were achieved in small steps, and the vendor, together with end users, had considerable freedom to create the solution according to the users' needs. The second phase was characterized by a large-scale procurement process over which regional healthcare authorities exercised much more control and the needs of groups other than the end users influenced the design. The making and scaling of healthcare information infrastructures is not simply a process of evolution, in which the end users use and change the technology. It also consists of large steps, during which different actors, including vendors and healthcare authorities, may make substantial contributions. This process requires work, negotiation and strategies. The conditions for the vendor may change dramatically, from considerable freedom and close relationships with users and customers in the small-scale development, to losing control of the product and being required to engage in more formal relations with customers in the wider public healthcare market. Onerous procurement processes may be one of the reasons why large-scale implementation of information projects in healthcare is difficult and slow. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  17. A Tool for Automatic Verification of Real-Time Expert Systems

    NASA Technical Reports Server (NTRS)

    Traylor, B.; Schwuttke, U.; Quan, A.

    1994-01-01

    The creation of an automated, user-driven tool for expert system development, validation, and verification is currently ongoing at NASA's Jet Propulsion Laboratory. In the new age of faster, better, cheaper missions, there is an increased willingness to utilize embedded expert systems for encapsulating and preserving mission expertise in systems which combine conventional algorithmic processing and artificial intelligence. The once-questioned role of automation in spacecraft monitoring is now becoming one of increasing importance.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, Nancy S.; Showalter, Mary Ann

    This report describes the activities and research performed at the Environmental Molecular Sciences Laboratory, a Department of Energy national scientific user facility at Pacific Northwest National Laboratory, during Fiscal Year 2006.

  19. Privacy Policy | Frederick National Laboratory for Cancer Research

    Cancer.gov

    The privacy of our users is of utmost importance to Frederick National Laboratory. The policy outlined below establishes how Frederick National Laboratory will use the information we gather about you from your visit to our website. We may coll

  20. Implementation of 5S Method for Ergonomic Laboratory

    NASA Astrophysics Data System (ADS)

    Dila Sari, Amarria; Ilma Rahmillah, Fety; Prabowo Aji, Bagus

    2017-06-01

    This article discusses 5S implementation in the Work System Design and Ergonomic (DSK&E) Laboratory, Department of Industrial Engineering, Islamic University of Indonesia. There are some problems related to equipment arrangement for student activities, such as files accumulated over previous years' practicums, as well as time wasted due to the misplacement of items. Therefore, this study aims to apply the 5S method in the DSK&E laboratory to facilitate work processes and reduce waste. The project was performed by laboratory management using 5S methods in the spirit of continuous improvement (Kaizen). Moreover, some strategies and suggestions are proposed to implement the 5S system within the laboratory. As a result, tidiness and cleanliness can be achieved, leading to better performance by laboratory users. The assessment score before implementing 5S in the DSK&E laboratory was 64 (2.56), while the score after implementation was 32 (1.28), an improvement of 50%. This has implications for better use of the laboratory area, time saved when looking for tools and materials thanks to their placement and good visual control, as well as an improved '5S' culture and spirit among staff and a better working environment.

  1. Experiences in Training End-User Searchers.

    ERIC Educational Resources Information Center

    Haines, Judith S.

    1982-01-01

    Describes study of chemists in the Chemistry Division, Organic Research Laboratory, Eastman Kodak Company, as end-user searchers on the DIALOG system searching primarily the "Chemical Abstracts" database. Training, level of use, online browsing, types of searches, satisfaction, costs, and value of end-user searching are highlighted.…

  2. A laboratory procedure for measuring and georeferencing soil colour

    NASA Astrophysics Data System (ADS)

    Marques-Mateu, A.; Balaguer-Puig, M.; Moreno-Ramon, H.; Ibanez-Asensio, S.

    2015-04-01

    Remote sensing and geospatial applications very often require ground truth data to assess outcomes from spatial analyses or environmental models. Those data sets, however, may be difficult to collect in proper format or may even be unavailable. In the particular case of soil colour, the collection of reliable ground data can be cumbersome due to measuring methods, colour communication issues, and other practical factors, which leads to a lack of a standard procedure for soil colour measurement and georeferencing. In this paper we present a laboratory procedure that provides colour coordinates of georeferenced soil samples which become useful in later processing stages of soil mapping and classification from digital images. The procedure requires a laboratory setup consisting of a light booth and a trichromatic colorimeter, together with a computer program that performs colour measurement, storage, and colour space transformation tasks. Measurement tasks are automated by means of specific data logging routines which allow storing recorded colour data in a spatial format. A key feature of the system is the ability to transform between physically-based colour spaces and the Munsell system, which is still the standard in soil science. The working scheme pursues the automation of routine tasks whenever possible and the avoidance of input mistakes by means of a convenient layout of the user interface. The program can readily manage colour and coordinate data sets which eventually allow creating spatial data sets. All the tasks regarding data joining between colorimeter measurements and sample locations are executed by the software in the background, allowing users to concentrate on sample processing. As a result, we obtained a robust and fully functional computer-based procedure which has proven a very useful tool for sample classification or cataloging purposes as well as for integrating soil colour data with other remote sensed and spatial data sets.
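
    One colour-space transformation in such a pipeline can be sketched as follows: sRGB to CIE XYZ (D65) via the standard linearization and matrix. The onward mapping to Munsell notation relies on lookup tables and interpolation and is not shown; the sample RGB value is invented.

        # sRGB -> CIE XYZ (D65): undo gamma, then apply the standard matrix.
        # (Munsell conversion would follow via lookup/interpolation, not shown.)
        import numpy as np

        M = np.array([[0.4124, 0.3576, 0.1805],
                      [0.2126, 0.7152, 0.0722],
                      [0.0193, 0.1192, 0.9505]])

        def srgb_to_xyz(rgb8):
            c = np.asarray(rgb8, dtype=float) / 255.0
            lin = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
            return M @ lin

        print(srgb_to_xyz([120, 86, 59]))   # a soil-like brown, XYZ on a [0, 1] scale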

  3. openBIS ELN-LIMS: an open-source database for academic laboratories.

    PubMed

    Barillari, Caterina; Ottoz, Diana S M; Fuentes-Serna, Juan Mariano; Ramakrishnan, Chandrasekhar; Rinn, Bernd; Rudolf, Fabian

    2016-02-15

    The open-source platform openBIS (open Biology Information System) offers an Electronic Laboratory Notebook and a Laboratory Information Management System (ELN-LIMS) solution suitable for academic life science laboratories. openBIS ELN-LIMS allows researchers to efficiently document their work, to describe materials and methods and to collect raw and analyzed data. The system comes with a user-friendly web interface where data can be added, edited, browsed and searched. The openBIS software, a user guide and a demo instance are available at https://openbis-eln-lims.ethz.ch. The demo instance contains some data from our laboratory as an example to demonstrate the possibilities of the ELN-LIMS (Ottoz et al., 2014). For rapid local testing, a VirtualBox image of the ELN-LIMS is also available. © The Author 2015. Published by Oxford University Press.

  4. Toward a Virtual Laboratory to Assess Biodiversity from Data Produced by an Underwater Microscope

    NASA Astrophysics Data System (ADS)

    Beaulieu, S.; Ball, M.; Futrelle, J.; Sosik, H. M.

    2016-12-01

    Real-time data from sensors deployed in the ocean are increasingly available online for broad use by scientists, educators, and the public. Such data have previously been limited to physical parameters, but data for biological parameters are becoming more prevalent with the development of new submersible instruments. Imaging FlowCytobot (IFCB), for example, automatically and rapidly acquires images of microscopic algae (phytoplankton) at the base of the food web in marine ecosystems. These images and products from image processing and automated classification are accessible via web services from an IFCB dashboard. However, until now, to process these data further into results representing the biodiversity of the phytoplankton required a complex workflow that could only be executed by scientists involved in the instrument development. Also, because these data have been collected near-continuously for a decade, a number of "big data" challenges arise in attempting to implement and reproduce the workflow. Our research is geared toward the development of a virtual laboratory to enable other scientists and educators, as new users of data from this underwater microscope, to generate biodiversity data products. Our solution involves an electronic notebook (Jupyter Notebook) that can be re-purposed by users with some Python programming experience. However, when we scaled the virtual laboratory to accommodate a 2-month example time series (thousands of binned files, each representing thousands of images), we needed to expand the execution environment to include batch processing outside of the notebook. We will show how we packaged these tools for other scientists to perform their own biodiversity assessments from data available on an IFCB dashboard. Additional outcomes of software development in this project include a prototype for time-series visualizations to be generated in near-real-time and recommendations for new products accessible via web services from the IFCB dashboard.

  5. Web-based interactive data processing: application to stable isotope metrology.

    PubMed

    Verkouteren, R M; Lee, J N

    2001-08-01

    To address a fundamental need in stable isotope metrology, the National Institute of Standards and Technology (NIST) has established a web-based interactive data-processing system accessible through a common gateway interface (CGI) program on the internet site http://www.nist.gov/widps-co2. This is the first application of a web-based tool that improves the measurement traceability afforded by a series of NIST standard materials. Specifically, this tool promotes the proper usage of isotope reference materials (RMs) and improves the quality of reported data from extensive measurement networks. Through the International Atomic Energy Agency (IAEA), we have defined standard procedures for stable isotope measurement and data processing, and have determined and applied consistent reference values for selected NIST and IAEA isotope RMs. Measurement data of samples and RMs are entered into specified fields on the web-based form. These data are submitted through the CGI program on a NIST web server, where appropriate calculations are performed and the results returned to the client. Several international laboratories have independently verified the accuracy of the procedures and algorithm for measurements of naturally occurring carbon-13 and oxygen-18 abundances and slightly enriched compositions up to approximately 150‰ relative to natural abundances. To conserve the use of the NIST RMs, users may determine value assignments for a secondary standard to be used in routine analysis. Users may also wish to validate proprietary algorithms embedded in their laboratory instrumentation, or specify the values of fundamental variables that are usually fixed in reduction algorithms to see the effect on the calculations. The results returned from the web-based tool are limited in quality only by the measurements themselves, and further value may be realized through the normalization function. When combined with stringent measurement protocols, two- to threefold improvements have been realized in the reproducibility of carbon-13 and oxygen-18 determinations across laboratories.
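
    The core calculation such a system performs can be illustrated by two-point normalization, in which raw delta values are rescaled so that two reference materials assume their consensus values; the sketch below uses illustrative numbers, not NIST value assignments.

      # Hedged sketch of two-point delta normalization (all values illustrative).
      def normalize_delta(d_meas, rm1_meas, rm1_true, rm2_meas, rm2_true):
          """Linearly map a measured delta onto the scale defined by two RMs."""
          slope = (rm2_true - rm1_true) / (rm2_meas - rm1_meas)
          return rm1_true + slope * (d_meas - rm1_meas)

      # A sample measured at -10.20 per mil, bracketed by two reference materials
      print(normalize_delta(-10.20, rm1_meas=-9.90, rm1_true=-10.45,
                            rm2_meas=1.85, rm2_true=1.95))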

  6. Predicting Mental Imagery-Based BCI Performance from Personality, Cognitive Profile and Neurophysiological Patterns.

    PubMed

    Jeunet, Camille; N'Kaoua, Bernard; Subramanian, Sriram; Hachet, Martin; Lotte, Fabien

    2015-01-01

    Mental-Imagery based Brain-Computer Interfaces (MI-BCIs) allow their users to send commands to a computer using their brain activity alone (typically measured by electroencephalography, EEG), which is processed while they perform specific mental tasks. While very promising, MI-BCIs remain barely used outside laboratories because of the difficulty users encounter in controlling them. Indeed, although some users obtain good control performance after training, a substantial proportion remains unable to reliably control an MI-BCI. This large variability in user performance has led the community to look for predictors of MI-BCI control ability. However, these predictors have only been explored for motor-imagery based BCIs, and mostly for a single training session per subject. In this study, 18 participants were instructed to learn to control an EEG-based MI-BCI by performing 3 MI-tasks, 2 of which were non-motor tasks, across 6 training sessions, on 6 different days. Relationships between the participants' BCI control performance and their personality, cognitive profile and neurophysiological markers were explored. While no relevant relationships with neurophysiological markers were found, strong correlations between MI-BCI performance and mental-rotation scores (reflecting spatial abilities) were revealed. A predictive model of MI-BCI performance based on psychometric questionnaire scores was also proposed. A leave-one-subject-out cross-validation process revealed the stability and reliability of this model: it made it possible to predict participants' performance with a mean error of less than 3 points. This study determined how users' profiles impact their MI-BCI control ability and thus paves the way for designing novel MI-BCI training protocols adapted to the profile of each user.
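
    The validation scheme can be sketched in a few lines: a linear model predicts performance from questionnaire scores and is scored by leave-one-subject-out cross-validation. The data below are random placeholders with the study's dimensions (18 participants), so the numbers carry no scientific meaning.

      # Hedged sketch: random placeholder data; scikit-learn assumed available.
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import LeaveOneOut, cross_val_predict

      rng = np.random.default_rng(0)
      X = rng.normal(size=(18, 4))                           # questionnaire scores
      y = 60 + 10 * X[:, 0] + rng.normal(scale=3, size=18)   # BCI accuracy (%)

      pred = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
      print('mean absolute error:', np.abs(pred - y).mean())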

  7. PIMS sequencing extension: a laboratory information management system for DNA sequencing facilities.

    PubMed

    Troshin, Peter V; Postis, Vincent Lg; Ashworth, Denise; Baldwin, Stephen A; McPherson, Michael J; Barton, Geoffrey J

    2011-03-07

    Facilities that provide a service for DNA sequencing typically support large numbers of users and experiment types. The cost of services is often reduced by the use of liquid-handling robots, but the efficiency of such facilities is hampered because the software for such robots does not usually integrate well with the systems that run the sequencing machines. Accordingly, there is a need for software systems capable of integrating different robotic systems and managing sample information for DNA sequencing services. In this paper, we describe an extension to the Protein Information Management System (PIMS) that is designed for DNA sequencing facilities. The new version of PIMS has a user-friendly web interface and integrates all aspects of the sequencing process, including sample submission, handling and tracking, together with capture and management of the data. The PIMS sequencing extension has been in production since July 2009 at the University of Leeds DNA Sequencing Facility. It has completely replaced manual data handling and simplified the tasks of data management and user communication. Samples from 45 groups have been processed with an average throughput of 10,000 samples per month. The current version of the PIMS sequencing extension works with the Applied Biosystems 3130XL 96-well plate sequencer and MWG 4204 or Aviso Theonyx liquid-handling robots, but is readily adaptable for use with other combinations of robots. PIMS has been extended to provide a user-friendly and integrated data management solution for DNA sequencing facilities that is accessed through a normal web browser and allows simultaneous access by multiple users as well as facility managers. The system integrates sequencing and liquid-handling robots, manages the data flow, and provides remote access to the sequencing results. The software is freely available for academic users from http://www.pims-lims.org/.

  8. Implementation and use of cloud-based electronic lab notebook in a bioprocess engineering teaching laboratory.

    PubMed

    Riley, Erin M; Hattaway, Holly Z; Felse, P Arthur

    2017-01-01

    Electronic lab notebooks (ELNs) are better equipped than paper lab notebooks (PLNs) to handle present-day life science and engineering experiments that generate large data sets and require high levels of data integrity. But limited training and a lack of a workforce with ELN knowledge have restricted the use of ELNs in academic and industry research laboratories, which still rely on cumbersome PLNs for recordkeeping. We used LabArchives, a cloud-based ELN, in our bioprocess engineering lab course to train students in electronic record keeping, good documentation practices (GDPs), and data integrity. The implementation of the ELN in the bioprocess engineering lab course, an analysis of user experiences, and our development actions to improve ELN training are presented here. The ELN improved the pedagogy and learning outcomes of the lab course through a streamlined workflow, quick data recording and archiving, and enhanced data sharing and collaboration. It also enabled superior data integrity, simplified information exchange, and allowed real-time and remote monitoring of experiments. Several attributes related to positive user experiences of the ELN improved between the two successive years in which it was offered. Student responses also indicate that an ELN is better than a PLN for compliance. We demonstrated that an ELN can be successfully implemented in a lab course with significant benefits to pedagogy, GDP training, and data integrity. The methods and processes presented here for ELN implementation can be adapted to many types of laboratory experiments.

  9. Molecular Genetics Information System (MOLGENIS): alternatives in developing local experimental genomics databases.

    PubMed

    Swertz, Morris A; De Brock, E O; Van Hijum, Sacha A F T; De Jong, Anne; Buist, Girbe; Baerends, Richard J S; Kok, Jan; Kuipers, Oscar P; Jansen, Ritsert C

    2004-09-01

    Genomic research laboratories need adequate infrastructure to support management of their data production and research workflow. But what makes infrastructure adequate? A lack of appropriate criteria makes any decision on buying or developing a system difficult. Here, we report on the decision process for the case of a molecular genetics group establishing a microarray laboratory. Five typical requirements for experimental genomics database systems were identified: (i) the ability to evolve with the fast-developing genomics field; (ii) a suitable data model to deal with local diversity; (iii) suitable storage of data files in the system; (iv) easy exchange with other software; and (v) low maintenance costs. The computer scientists and the researchers of the local microarray laboratory considered alternative solutions for these five requirements and chose the following options: (i) use of automatic code generation; (ii) a customized data model based on standards; (iii) storage of datasets as black boxes instead of decomposing them into database tables; (iv) loose linking to other programs for improved flexibility; and (v) a low-maintenance web-based user interface. Our team evaluated existing microarray databases and then decided to build a new system, the Molecular Genetics Information System (MOLGENIS), implemented using code generation within a period of three months. This case can provide valuable insights and lessons to both software developers and user communities embarking on large-scale genomic projects. http://www.molgenis.nl

  10. Measuring and Inferring the State of the User via the Microsoft Kinect with Application to Cyber Security Research

    DTIC Science & Technology

    2018-01-16

    US Army Research Laboratory technical report ARL-TN-0864 (January 2018): Measuring and Inferring the State of the User via the Microsoft Kinect with Application to Cyber Security Research, by Christopher J Garneau. Approved for public release; distribution is unlimited.

  11. Nuclear Science User Facilities (NSUF) Monthly Report March 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soelberg, Renae

    Nuclear Science User Facilities (NSUF), formerly the Advanced Test Reactor National Scientific User Facility (ATR NSUF). Monthly Report February 2015 Highlights: Jim Cole attended the OECD NEA Expert Group on Innovative Structural Materials meeting in Paris, France; Jim Lane and Doug Copsey of Writers Ink visited PNNL to prepare an article for the NSUF annual report; Brenden Heidrich briefed the Nuclear Energy Advisory Committee-Facilities Subcommittee on the Nuclear Energy Infrastructure Database project and provided them with custom reports for their upcoming visits to Argonne National Laboratory, Idaho National Laboratory, Oak Ridge National Laboratory and the Massachusetts Institute of Technology; and University of California-Berkeley Principal Investigator Mehdi Balooch visited PNNL to observe measurements and help finalize plans for completing the desired suite of analyses. His visit was coordinated to coincide with the visit of Jim Lane and Doug Copsey.

  12. 7 CFR 504.4 - Exemptions from user fee charges.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... OF AGRICULTURE USER FEES § 504.4 Exemptions from user fee charges. (a) USDA laboratories and ARS cooperators designated by the Curator of the ARS Patent Culture Collection are exempt from fee assessments. (b) The Curator of the ARS Patent Culture Collection is delegated the authority to approve and revoke...

  13. 7 CFR 504.4 - Exemptions from user fee charges.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... OF AGRICULTURE USER FEES § 504.4 Exemptions from user fee charges. (a) USDA laboratories and ARS cooperators designated by the Curator of the ARS Patent Culture Collection are exempt from fee assessments. (b) The Curator of the ARS Patent Culture Collection is delegated the authority to approve and revoke...

  14. 7 CFR 504.4 - Exemptions from user fee charges.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... OF AGRICULTURE USER FEES § 504.4 Exemptions from user fee charges. (a) USDA laboratories and ARS cooperators designated by the Curator of the ARS Patent Culture Collection are exempt from fee assessments. (b) The Curator of the ARS Patent Culture Collection is delegated the authority to approve and revoke...

  15. 7 CFR 504.4 - Exemptions from user fee charges.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... OF AGRICULTURE USER FEES § 504.4 Exemptions from user fee charges. (a) USDA laboratories and ARS cooperators designated by the Curator of the ARS Patent Culture Collection are exempt from fee assessments. (b) The Curator of the ARS Patent Culture Collection is delegated the authority to approve and revoke...

  16. 7 CFR 504.4 - Exemptions from user fee charges.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... OF AGRICULTURE USER FEES § 504.4 Exemptions from user fee charges. (a) USDA laboratories and ARS cooperators designated by the Curator of the ARS Patent Culture Collection are exempt from fee assessments. (b) The Curator of the ARS Patent Culture Collection is delegated the authority to approve and revoke...

  17. Computer-Based Tools for Evaluating Graphical User Interfaces

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; rather, it should be part of an iterative design cycle, with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research, an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.
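
    The idea of an embedded, performance-based evaluation can be sketched as an interface that instruments itself, recording task durations and error counts for later usability scoring. The task name below is hypothetical, not taken from the NASA tools.

      # Hedged sketch of embedded performance logging; task names are invented.
      import time
      from collections import defaultdict

      class UsabilityLogger:
          def __init__(self):
              self.events = defaultdict(list)
              self._start = {}

          def begin(self, task):
              self._start[task] = time.monotonic()

          def end(self, task, errors=0):
              elapsed = time.monotonic() - self._start.pop(task)
              self.events[task].append({'seconds': elapsed, 'errors': errors})

      log = UsabilityLogger()
      log.begin('open_payload_display')
      time.sleep(0.1)                     # stand-in for the user's interaction
      log.end('open_payload_display', errors=1)
      print(dict(log.events))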

  18. The MetabolomeExpress Project: enabling web-based processing, analysis and transparent dissemination of GC/MS metabolomics datasets.

    PubMed

    Carroll, Adam J; Badger, Murray R; Harvey Millar, A

    2010-07-14

    Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas chromatography/mass spectrometry (GC/MS) makes it an obvious target for such development. While a number of web tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g. metabolite, species, organ/biofluid). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress (https://www.metabolome-express.org) provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
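
    As an offline illustration of one re-analysis the site offers, the sketch below runs a principal components analysis on a samples-by-metabolites response matrix; the matrix is a random placeholder, not MetabolomeExpress data.

      # Hedged sketch: random placeholder matrix; scikit-learn assumed available.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      responses = rng.lognormal(size=(12, 50))    # 12 samples x 50 metabolites

      scaled = StandardScaler().fit_transform(np.log10(responses))
      scores = PCA(n_components=2).fit_transform(scaled)
      print(scores[:3])                           # PC1/PC2 coordinates per sample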

  19. Earthquake Early Warning Beta Users: Java, Modeling, and Mobile Apps

    NASA Astrophysics Data System (ADS)

    Strauss, J. A.; Vinci, M.; Steele, W. P.; Allen, R. M.; Hellweg, M.

    2014-12-01

    Earthquake Early Warning (EEW) is a system that can provide a few to tens of seconds of warning prior to ground shaking at a user's location. The goal and purpose of such a system is to reduce, or minimize, the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, the University of Washington, the USGS, and beta users in California and the Pacific Northwest. The beta users receive earthquake information very rapidly in real time and are providing feedback on their experiences of performance and potential uses within their organizations. Beta user interactions allow the ShakeAlert team to discern: which alert delivery options are most effective, what changes would make the UserDisplay more useful in a pre-disaster situation, and, most importantly, what actions users plan to take for various scenarios. Actions could include: personal safety approaches, such as drop, cover, and hold on; automated processes and procedures, such as opening elevator or fire station doors; or situational awareness. Users are beginning to determine which policy and technological changes may need to be enacted, and the funding requirements to implement their automated controls. The use of models and mobile apps is beginning to augment the basic Java desktop applet. Modeling allows beta users to test their early warning responses against various scenarios without having to wait for a real event. Mobile apps are also changing the possible response landscape, providing other avenues for people to receive information. All of these combine to improve business continuity and resiliency.

  20. How to: identify non-tuberculous Mycobacterium species using MALDI-TOF mass spectrometry.

    PubMed

    Alcaide, F; Amlerová, J; Bou, G; Ceyssens, P J; Coll, P; Corcoran, D; Fangous, M-S; González-Álvarez, I; Gorton, R; Greub, G; Hery-Arnaud, G; Hrábak, J; Ingebretsen, A; Lucey, B; Marekoviċ, I; Mediavilla-Gradolph, C; Monté, M R; O'Connor, J; O'Mahony, J; Opota, O; O'Reilly, B; Orth-Höller, D; Oviaño, M; Palacios, J J; Palop, B; Pranada, A B; Quiroga, L; Rodríguez-Temporal, D; Ruiz-Serrano, M J; Tudó, G; Van den Bossche, A; van Ingen, J; Rodriguez-Sanchez, B

    2018-06-01

    The implementation of MALDI-TOF MS for microorganism identification has changed the routine of microbiology laboratories as we knew it. Most microorganisms can now be reliably identified within minutes using this inexpensive, user-friendly methodology. However, its application to the identification of mycobacteria isolates has been hampered by the structure of their cell wall. Improvements in the sample processing method and in the available databases have proved key factors for the rapid and reliable identification of non-tuberculous mycobacteria isolates using MALDI-TOF MS. The main objective is to provide information about the procedures for the identification of non-tuberculous isolates using MALDI-TOF MS and to review different sample processing methods, available databases, and the interpretation of results. Results from relevant studies on the use of the available MALDI-TOF MS instruments, the implementation of innovative sample processing methods, or the implementation of improved databases are discussed. Insight into the methodology required for reliable identification of non-tuberculous mycobacteria and its implementation in the microbiology laboratory routine is provided. Microbiology laboratories where MALDI-TOF MS is available can benefit from its capacity to identify most clinically relevant non-tuberculous mycobacteria in a rapid, reliable, and inexpensive manner. Copyright © 2017 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  1. 75 FR 17281 - Changes in Hourly Fee Rates for Science and Technology Laboratory Services-Fiscal Years 2010-2012

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-06

    ..., residue chemistry, proximate analysis for composition, and biomolecular (DNA-based) testing. A user fee... provide greater clarity of reported test analyses and laboratory determinations. DATES: Effective April 7... analyses and laboratory determinations provided by AMS laboratory services apply only to the submitted...

  2. On the Integration of Remote Experimentation into Undergraduate Laboratories--Pedagogical Approach

    ERIC Educational Resources Information Center

    Esche, Sven K.

    2005-01-01

    This paper presents an Internet-based open approach to laboratory instruction. In this article, the author talks about an open laboratory approach using a multi-user multi-device remote facility. This approach involves both the direct contact with the computer-controlled laboratory setup of interest with the students present in the laboratory…

  3. PRISM: Processing routines in IDL for spectroscopic measurements (installation manual and user's guide, version 1.0)

    USGS Publications Warehouse

    Kokaly, Raymond F.

    2011-01-01

    This report describes procedures for installing and using the U.S. Geological Survey Processing Routines in IDL for Spectroscopic Measurements (PRISM) software. PRISM provides a framework to conduct spectroscopic analysis of measurements made using laboratory, field, airborne, and space-based spectrometers. Using PRISM functions, the user can compare the spectra of materials of unknown composition with reference spectra of known materials. This spectroscopic analysis allows the composition of the material to be identified and characterized. Among its other functions, PRISM contains routines for the storage of spectra in database files, import/export of ENVI spectral libraries, importation of field spectra, correction of spectra to absolute reflectance, arithmetic operations on spectra, interactive continuum removal and comparison of spectral features, correction of imaging spectrometer data to ground-calibrated reflectance, and identification and mapping of materials using spectral feature-based analysis of reflectance data. This report provides step-by-step instructions for installing the PRISM software and running its functions.

  4. A Secure Web Application Providing Public Access to High-Performance Data Intensive Scientific Resources - ScalaBLAST Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.

    2008-05-04

    This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP script language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.

  5. Signal evaluation environment: a new method for the design of peripheral in-vehicle warning signals.

    PubMed

    Werneke, Julia; Vollrath, Mark

    2011-06-01

    An evaluation method called the Signal Evaluation Environment (SEE) was developed for use in the early stages of the design process for peripheral in-vehicle warning signals. Accident analyses have shown that in complex driving situations, such as at intersections, the visual scan strategies of the driver contribute to overlooking other road users who have the right of way. Salient peripheral warning signals could disrupt these strategies and direct drivers' attention towards these road users. To select effective warning signals, the SEE was developed as a laboratory task requiring visual-cognitive processes similar to those used at intersections. For validation of the SEE, four experiments were conducted using different stimulus characteristics (size, colour contrast, shape, flashing) that influence peripheral vision. The results confirm that the SEE is able to differentiate between the selected stimulus characteristics. The SEE is a useful initial tool for designing peripheral signals, allowing quick and efficient preselection of beneficial signals.

  6. A comparison of representations for discrete multi-criteria decision problems

    PubMed Central

    Gettinger, Johannes; Kiesling, Elmar; Stummer, Christian; Vetschera, Rudolf

    2013-01-01

    Discrete multi-criteria decision problems with numerous Pareto-efficient solution candidates place a significant cognitive burden on the decision maker. An interactive, aspiration-based search process that iteratively progresses toward the most preferred solution can alleviate this task. In this paper, we study three ways of representing such problems in a DSS, and compare them in a laboratory experiment using subjective and objective measures of the decision process as well as solution quality and problem understanding. In addition to an immediate user evaluation, we performed a re-evaluation several weeks later. Furthermore, we consider several levels of problem complexity and user characteristics. Results indicate that different problem representations have a considerable influence on search behavior, although long-term consistency appears to remain unaffected. We also found interesting discrepancies between subjective evaluations and objective measures. Conclusions from our experiments can help designers of DSS for large multi-criteria decision problems to fit problem representations to the goals of their system and the specific task at hand. PMID:24882912

  7. Optical design and Initial Results from The National Institute of Standards and Technology’s AMMT/TEMPS Facility

    PubMed Central

    Grantham, Steven; Lane, Brandon; Neira, Jorge; Mekhontsev, Sergey; Vlasea, Mihaela; Hanssen, Leonard

    2017-01-01

    The National Institute of Standards and Technology’s (NIST) Physical Measurement Laboratory and Engineering Laboratory are jointly developing the Additive Manufacturing Measurement Testbed (AMMT)/Temperature and Emittance of Melts, Powders and Solids (TEMPS) facilities. These facilities will be co-located on an open-architecture laser-based powder-bed fusion system, allowing users full access to the system’s operation parameters. This will provide users with access to machine-independent monitoring and control of the powder-bed fusion process. In this paper the emphasis is on the AMMT, which incorporates in-line visible-light collection optics for monitoring and feedback control of the powder-bed fusion process. We present an overview of the AMMT/TEMPS program and its goals. The optical and mechanical design of the open-architecture powder-bed fusion system and the AMMT are also described. In addition, preliminary measurement results from the system and its current status are described. PMID:28579666

  8. Commerce Laboratory: Mission analysis payload integration study

    NASA Technical Reports Server (NTRS)

    Bannister, T. C.

    1984-01-01

    A mission model that will accommodate commercial users and provide a basic database for further mission planning is reported. The databases to be developed are: (1) user requirements; (2) apparatus capabilities and availabilities; and (3) carrier capabilities. These databases are synthesized in a trades-and-analysis phase along with the STS flight apparatus, and optimum missions are identified. The completed work is reported. The user requirements database was expanded to identify, within the six scientific disciplines, the areas of investigation, investigation categories and status, potential commercial applications, interested parties, processes, and experiment requirements. The scope of the apparatus database was expanded to indicate apparatus status as to whether it is ground or flight equipment and, within both categories, whether the apparatus is: (1) existing, (2) under development, (3) planned, or (4) needed. Applications for the apparatus are listed. The methodology was revised in the areas of trades and analysis and mission planning. The carrier capabilities database was updated and completed.

  9. A user-friendly LabVIEW software platform for grating based X-ray phase-contrast imaging.

    PubMed

    Wang, Shenghao; Han, Huajie; Gao, Kun; Wang, Zhili; Zhang, Can; Yang, Meng; Wu, Zhao; Wu, Ziyu

    2015-01-01

    X-ray phase-contrast imaging can provide greatly improved contrast over conventional absorption-based imaging for weakly absorbing samples, such as biological soft tissues and fibre composites. In this study, we introduced an easy and fast way to develop a user-friendly software platform dedicated to the new grating-based X-ray phase-contrast imaging setup at the National Synchrotron Radiation Laboratory of the University of Science and Technology of China. The software controls 21 motorized stages, a piezoelectric stage and an X-ray tube, and also covers image acquisition with a flat-panel detector for automatic phase-stepping scans. Moreover, a data post-processing module for signal retrieval and other custom features are in principle available. With a seamless integration of all the necessary functions in one software package, this platform greatly facilitates users' activities during experimental runs with this grating-based X-ray phase-contrast imaging setup.
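
    The signal-retrieval step such a post-processing module performs can be sketched as follows: at each pixel, the intensity over N phase steps traces a sinusoid, and an FFT along the stepping axis yields the mean (absorption), phase and visibility (dark-field) signals. The arrays below are synthetic placeholders, not data from this setup.

      # Hedged sketch of FFT-based phase-stepping retrieval on synthetic data.
      import numpy as np

      def retrieve(stack):
          """stack: (N_steps, H, W) intensities -> (mean, phase, visibility)."""
          F = np.fft.fft(stack, axis=0)
          a = np.abs(F[0]) / stack.shape[0]       # mean of the stepping curve
          b = 2 * np.abs(F[1]) / stack.shape[0]   # first-harmonic amplitude
          phi = np.angle(F[1])                    # stepping-curve phase
          return a, phi, b / a                    # visibility = b / a

      N, H, W = 8, 4, 4
      k = np.arange(N).reshape(-1, 1, 1)
      stack = np.broadcast_to(1.0 + 0.4 * np.cos(2 * np.pi * k / N + 0.3), (N, H, W))
      a, phi, vis = retrieve(stack)
      print(a[0, 0], phi[0, 0], vis[0, 0])        # approximately 1.0, 0.3, 0.4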

  10. SLIMS--a user-friendly sample operations and inventory management system for genotyping labs.

    PubMed

    Van Rossum, Thea; Tripp, Ben; Daley, Denise

    2010-07-15

    We present the Sample-based Laboratory Information Management System (SLIMS), a powerful and user-friendly open-source web application that provides all members of a laboratory with an interface to view, edit and create sample information. SLIMS aims to simplify common laboratory tasks with tools such as a user-friendly shopping cart for subjects, samples and containers that easily generates reports, shareable lists and plate designs for genotyping. Further key features include customizable data views, database change-logging and dynamically filled pre-formatted reports. Beyond being feature-rich, SLIMS derives its power from its ability to handle longitudinal data from multiple time points and biological sources. This type of data is increasingly common in studies searching for susceptibility genes for common complex diseases, which collect thousands of samples and generate millions of genotypes and overwhelming amounts of data. LIMSs provide an efficient way to deal with these data while increasing accessibility and reducing laboratory errors; however, professional LIMSs are often too costly to be practical. SLIMS gives labs a feasible alternative that is easily accessible, designed around its users and feature-rich. To facilitate system customization and use by other groups, manuals have been written for users and developers. Documentation, source code and manuals are available at http://genapha.icapture.ubc.ca/SLIMS/index.jsp. SLIMS was developed using Java 1.6.0, JSPs, Hibernate 3.3.1.GA, DB2 and mySQL, Apache Tomcat 6.0.18, NetBeans IDE 6.5, Jasper Reports 3.5.1 and JasperSoft's iReport 3.5.1.

  11. LabRS: A Rosetta stone for retrospective standardization of clinical laboratory test results.

    PubMed

    Hauser, Ronald George; Quine, Douglas B; Ryder, Alex

    2018-02-01

    Clinical laboratories in the United States do not have an explicit result standard for reporting the 7 billion laboratory test results they produce each year. The absence of standardized test results creates inefficiencies and ambiguities for secondary data users. We developed and tested a tool to standardize the results of laboratory tests in a large, multicenter clinical data warehouse. Laboratory records, each of which consisted of a laboratory result and a test identifier, from 27 diverse facilities were captured from 2000 through 2015. Each record underwent a standardization process to convert the original result into a format amenable to secondary data analysis. The standardization process included the correction of typos, normalization of categorical results, separation of inequalities from numbers, and conversion of numbers represented by words (e.g., "million") to numerals. Quality control included expert review. We obtained 1.266 × 10⁹ laboratory records and standardized 1.252 × 10⁹ records (98.9%). Of the unique unstandardized records (78.887 × 10³), most appeared <5 times (96%; e.g., typos), did not have a test identifier (47%), or belonged to an esoteric test with <100 results (2%). Overall, these 3 reasons accounted for nearly all unstandardized results (98%). Current results suggest that the tool is both scalable and generalizable among diverse clinical laboratories. Based on observed trends, the tool will require ongoing maintenance to stay current with new tests and result formats. Future work to develop and implement an explicit standard for test results would reduce the need to retrospectively standardize test results. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
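
    The flavour of such standardization rules is easy to sketch: fix known typos, separate inequality signs from numbers, and convert number words to numerals. The tiny rule set below is illustrative only, not the tool's production rules.

      # Hedged sketch of result standardization; rules are illustrative only.
      import re

      TYPOS = {'postive': 'positive', 'negitive': 'negative'}
      WORDS = {'million': 'e6', 'thousand': 'e3'}

      def standardize(result):
          r = TYPOS.get(result.strip().lower(), result.strip().lower())
          r = re.sub(r'([<>]=?)\s*(\d)', r'\1 \2', r)          # '<5' -> '< 5'
          for word, exp in WORDS.items():                      # '2.1 million' -> '2.1e6'
              r = re.sub(r'(\d+(?:\.\d+)?)\s*' + word, r'\g<1>' + exp, r)
          return r

      for raw in ['<5', 'Postive', '2.1 million']:
          print(raw, '->', standardize(raw))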

  12. Laboratory Computing Resource Center

    Science.gov Websites

    Argonne National Laboratory website for the Laboratory Computing Resource Center (LCRC), covering computing and data resources, purchasing resources, future plans, getting started with LCRC, software, best practices and policies, getting help and support, and the latest center announcements.

  13. HEP Division Argonne National Laboratory

    Science.gov Websites

    Argonne National Laboratory High Energy Physics (HEP) Division website, listing research areas (including detector design, neutrino physics, and theoretical physics) and seminar series (HEP Division seminars, lunch seminars, and theory seminars), together with contact points for division users, administrators, and the HEP webmaster.

  14. Implementation of a virtual laboratory for training on sound insulation testing and uncertainty calculations in acoustic tests.

    PubMed

    Asensio, C; Gasco, L; Ruiz, M; Recuero, M

    2015-02-01

    This paper describes a methodology and a case study for the implementation of educational virtual laboratories for practical training on acoustic tests according to international standards. The objectives of this activity are (a) to help the students understand and apply the procedures described in the standards and (b) to familiarize the students with the uncertainty in measurement and its estimation in acoustics. The virtual laboratory does not focus on the handling and set-up of real acoustic equipment but rather on procedures and uncertainty. The case study focuses on the application of the virtual laboratory to facade sound insulation tests according to ISO 140-5:1998 (International Organization for Standardization, Geneva, Switzerland, 1998), and the paper describes the causal and stochastic models and the constraints applied in the virtual environment under consideration. With a simple user interface, the laboratory provides measurement data that the students have to process to report insulation results that must converge with the "virtual true values" in the laboratory. The main advantage of the virtual laboratory is derived from the customization of the factors in which the student will be instructed or examined (for instance, background noise correction, the detection of sporadic corrupted observations, and the effect of instrument precision).
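
    One way to convey the uncertainty estimation the laboratory teaches is a Monte Carlo sketch: the standardized level difference DnT = L1 - L2 + 10*log10(T/0.5) from ISO 140-5 is recomputed many times with the inputs perturbed by assumed measurement noise. The levels, reverberation time and standard deviations below are illustrative.

      # Hedged Monte Carlo sketch; input values and spreads are illustrative.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 10_000
      L1 = rng.normal(85.0, 0.5, n)    # outdoor level, dB
      L2 = rng.normal(55.0, 0.8, n)    # indoor level, dB
      T = rng.normal(0.6, 0.05, n)     # reverberation time, s

      DnT = L1 - L2 + 10 * np.log10(T / 0.5)
      print(f'DnT = {DnT.mean():.1f} dB, u = {DnT.std(ddof=1):.2f} dB')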

  15. LabData database sub-systems for post-processing and quality control of stable isotope and gas chromatography measurements

    NASA Astrophysics Data System (ADS)

    Suckow, A. O.

    2013-12-01

    Measurements need post-processing to obtain results that are comparable between laboratories. Raw data may need to be corrected for blank, memory, drift (change of reference values with time) and linearity (dependence of the reference on signal height), and normalized to international reference materials. Post-processing parameters need to be stored for the traceability of results. State-of-the-art stable isotope correction schemes are available based on MS Excel (Geldern and Barth, 2012; Gröning, 2011) or MS Access (Coplen, 1998). These are specialized to stable isotope measurements only, often only to the post-processing of a specific run. Embedding of the algorithms into a multipurpose database system was missing. This is necessary to combine results of different tracers (³H, ³He, ²H, ¹⁸O, CFCs, SF₆, ...) or geochronological tools (e.g. sediment dating with ²¹⁰Pb or ¹³⁷Cs), to relate them to attribute data (submitter, batch, project, geographical origin, depth in core, well information, etc.) and for further interpretation tools (e.g. lumped-parameter modelling). Database sub-systems to the LabData laboratory management system (Suckow and Dumke, 2001) are presented for stable isotopes and for gas chromatographic CFC and SF₆ measurements. The sub-system for stable isotopes allows the following post-processing: 1. automated import from measurement software (Isodat, Picarro, LGR); 2. correction of the raw data for sample-to-sample memory, linearity and drift, and renormalization. The sub-system for gas chromatography covers: 1. storage of all raw data; 2. storage of peak integration parameters; 3. correction for blank, efficiency and linearity. The user interface allows interactive, graphical control of the post-processing and of all corrections, with export to and plotting in MS Excel, and is a valuable tool for quality control. The sub-databases are integrated into LabData, a multi-user client-server architecture using MS SQL Server as back-end and MS Access as front-end, installed in four laboratories to date. Attribute data storage (unique ID for each subsample, origin, project context, etc.) and laboratory management features are included. Export routines to Excel (depth profiles, time series, all possible tracer-versus-tracer plots, ...) and modelling capabilities are add-ons. The source code is public domain and available under the GNU General Public License (GNU-GPL). References: Coplen, T.B., 1998. A manual for a laboratory information management system (LIMS) for light stable isotopes, Version 7.0. USGS Open File Report 98-284. Geldern, R. v., Barth, J.A.C., 2012. Optimization of instrument setup and post-run corrections for oxygen and hydrogen stable isotope measurements of water by isotope ratio infrared spectroscopy (IRIS). Limnology and Oceanography: Methods 10, 1024-1036. Gröning, M., 2011. Improved water δ²H and δ¹⁸O calibration and calculation of measurement uncertainty using a simple software tool. Rapid Communications in Mass Spectrometry 25, 2711-2720. Suckow, A., Dumke, I., 2001. A database system for geochemical, isotope hydrological and geochronological laboratories. Radiocarbon 43, 325-337.
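
    The drift correction mentioned above can be sketched simply: a standard measured repeatedly through the run defines a time-dependent offset, which is interpolated to each sample's run position and subtracted. All numbers below are illustrative.

      # Hedged sketch of linear drift correction between bracketing standards.
      import numpy as np

      std_pos = np.array([0, 10, 20])                 # run positions of the standard
      std_meas = np.array([-10.05, -10.20, -10.35])   # measured deltas, drifting
      std_true = -10.00                               # assigned value of the standard

      sample_pos = np.array([3, 7, 14])
      sample_meas = np.array([-24.80, -31.10, -27.95])

      offset = np.interp(sample_pos, std_pos, std_meas - std_true)
      print(sample_meas - offset)                     # drift-corrected sample deltas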

  16. P43-S Computational Biology Applications Suite for High-Performance Computing (BioHPC.net)

    PubMed Central

    Pillardy, J.

    2007-01-01

    One of the challenges of high-performance computing (HPC) is user accessibility. At the Cornell University Computational Biology Service Unit, which is also a Microsoft HPC institute, we have developed a computational biology application suite that allows researchers from biological laboratories to submit their jobs to the parallel cluster through an easy-to-use Web interface. Through this system, we are providing users with popular bioinformatics tools including BLAST, HMMER, InterproScan, and MrBayes. The system is flexible and can be easily customized to include other software. It is also scalable; the installation on our servers currently processes approximately 8500 job submissions per year, many of them requiring massively parallel computations. It also has a built-in user management system, which can limit software and/or database access to specified users. TAIR, the major database of the plant model organism Arabidopsis, and SGN, the international tomato genome database, are both using our system for storage and data analysis. The system consists of a Web server running the interface (ASP.NET C#), Microsoft SQL server (ADO.NET), compute cluster running Microsoft Windows, ftp server, and file server. Users can interact with their jobs and data via a Web browser, ftp, or e-mail. The interface is accessible at http://cbsuapps.tc.cornell.edu/.

  17. The price elasticity of demand for heroin: Matched longitudinal and experimental evidence.

    PubMed

    Olmstead, Todd A; Alessi, Sheila M; Kline, Brendan; Pacula, Rosalie Liccardo; Petry, Nancy M

    2015-05-01

    This paper reports estimates of the price elasticity of demand for heroin based on a newly constructed dataset. The dataset has two matched components concerning the same sample of regular heroin users: longitudinal information about real-world heroin demand (actual price and actual quantity at daily intervals for each heroin user in the sample) and experimental information about laboratory heroin demand (elicited by presenting the same heroin users with scenarios in a laboratory setting). Two empirical strategies are used to estimate the price elasticity of demand for heroin. The first strategy exploits the idiosyncratic variation in the price experienced by a heroin user over time that occurs in markets for illegal drugs. The second strategy exploits the experimentally induced variation in price experienced by a heroin user across experimental scenarios. Both empirical strategies result in the estimate that the conditional price elasticity of demand for heroin is approximately -0.80. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. SPINS: standardized protein NMR storage. A data dictionary and object-oriented relational database for archiving protein NMR spectra.

    PubMed

    Baran, Michael C; Moseley, Hunter N B; Sahota, Gurmukh; Montelione, Gaetano T

    2002-10-01

    Modern protein NMR spectroscopy laboratories have a rapidly growing need for an easily queried local archival system for raw experimental NMR datasets. SPINS (Standardized ProteIn Nmr Storage) is an object-oriented relational database that provides facilities for high-volume NMR data archival, organization of analyses, and dissemination of results to the public domain by automatic preparation of the header files required for submission of data to the BioMagResBank (BMRB). The current version of SPINS coordinates the process from data collection to BMRB deposition of raw NMR data by standardizing and integrating the storage and retrieval of these data in a local laboratory file system. Additional facilities include a data-mining query tool, graphical database administration tools, and an NMRStar v2.1.1 file generator. SPINS also includes a user-friendly internet-based graphical user interface, which is optionally integrated with Varian VNMR NMR data collection software. This paper provides an overview of the data model underlying the SPINS database system, a description of its implementation in Oracle, and an outline of future plans for the SPINS project.

  19. An Extensible, User- Modifiable Framework for Planning Activities

    NASA Technical Reports Server (NTRS)

    Joshing, Joseph C.; Abramyan, Lucy; Mickelson, Megan C.; Wallick, Michael N.; Kurien, James A.; Crockett, Thomasa M.; Powell, Mark W.; Pyrzak, Guy; Aghevli, Arash

    2013-01-01

    This software provides a development framework that allows planning activities for the Mars Science Laboratory rover to be altered at any time, based on changes of the Activity Dictionary. The Activity Dictionary contains the definition of all activities that can be carried out by a particular asset (robotic or human). These definitions (and combinations of these definitions) are used by mission planners to give a daily plan of what a mission should do. During the development and course of the mission, the Activity Dictionary and actions that are going to be carried out will often be changed. Previously, such changes would require a change to the software and redeployment. Now, the Activity Dictionary authors are able to customize activity definitions, parameters, and resource usage without requiring redeployment. This software provides developers and end users the ability to modify the behavior of automatically generated activities using a script. This allows changes to the software behavior without incurring the burden of redeployment. This software is currently being used for the Mars Science Laboratory, and is in the process of being integrated into the LADEE (Lunar Atmosphere and Dust Environment Explorer) mission, as well as the International Space Station.

  20. LIMS user acceptance testing.

    PubMed

    Klein, Corbett S

    2003-01-01

    Laboratory Information Management Systems (LIMS) play a key role in the pharmaceutical industry. Thorough and accurate validation of such systems is critical and is a regulatory requirement. LIMS user acceptance testing is one aspect of this testing and enables the user to make a decision to accept or reject implementation of the system. This paper discusses key elements in facilitating the development and execution of a LIMS User Acceptance Test Plan (UATP).

  1. The contribution of the Geohazards Exploitation Platform for the GEO Supersites community

    NASA Astrophysics Data System (ADS)

    Manunta, Michele; Caumont, Hervé; Mora, Oscar; Casu, Francesco; Zinno, Ivana; De Luca, Claudio; Pepe, Susi; Pepe, Antonio; Brito, Fabrice; Romero, Laia; Stumpf, Andre; Malet, Jean-Philippe; Brcic, Ramon; Rodriguez Gonzalez, Fernando; Musacchio, Massimo; Buongiorno, Fabrizia; Briole, Pierre

    2016-04-01

    The European Space Agency (ESA) initiative for the creation of an ecosystem of Thematic Exploitation Platforms (TEP) focuses on the capitalization of Ground Segment capabilities and ICT technologies to maximize the exploitation of EO data from past and future missions. A TEP refers to a computing platform that complies with a set of exploitation scenarios involving scientific users, data providers and ICT providers, aggregated around an Earth Science thematic area. The Exploitation Platforms are targeted to cover different capacities, and they define, implement and validate a platform for effective exploitation of EO data sources in a given thematic area. In this framework, the Geohazards Thematic Exploitation Platform, or Geohazards TEP (GEP), aims at providing on-demand processing services for specific user needs as well as systematic processing services to address the need of the geohazards community for common information layers and, finally, to integrate newly developed processors for scientists and other expert users. The GEP has now on-boarded over 40 European early adopters and will transition during 2016 to pre-operations by developing six new pilot applications that will significantly augment the Platform's capabilities for systematic production and community building. Each project on the Platform is concerned with either integrating an application, running on-demand processing using an application available in the platform, or systematically generating a new product collection. The platform will also expand its user base in 2016, to gradually reach a total of about 65 individual users. Under a consortium led by Terradue Srl, six new pilot projects have been taken on board: photogrammetric processing using optical EO data with the University of Strasbourg (FR), an optical-based processing method for volcanic hazard monitoring with INGV (IT), systematic generation of deformation time series with Sentinel-1 data with CNR IREA (IT), systematic processing of Sentinel-1 interferometric imagery with DLR (DE), precise terrain motion mapping with the SPN Persistent Scatterers Interferometric chain of Altamira Information (ES), and a campaign to test and exploit GEP applications with the Corinth Rift Laboratory, in which Greek and French experts in seismic hazards are engaged. The consortium is in charge of resource and service management under a sustainable and fair governance model to ensure alignment of the Platform with user community needs, broad collaboration with the main data and service providers in the domain, and excellence among user initiatives willing to contribute. In this work we show how the GEO Geohazards Supersites community can fully benefit from the availability of an advanced IT infrastructure, where satellite and in-situ data, processing tools and web-based visualization instruments are at the disposal of users to address scientific questions. In particular, we focus on the contributions provided by GEP to the management of EO data, to the implementation of a European e-infrastructure, and to the monitoring and modelling of ground deformations and seismic activity.

  2. Assessment of Application Technology of Natural User Interfaces in the Creation of a Virtual Chemical Laboratory

    NASA Astrophysics Data System (ADS)

    Jagodziński, Piotr; Wolski, Robert

    2015-02-01

    Natural User Interfaces (NUI) are now widely used in electronic devices such as smartphones, tablets and gaming consoles. We have tried to apply this technology to the teaching of chemistry in middle school and high school. A virtual chemical laboratory was developed in which students can simulate the performance of laboratory activities similar to those that they perform in a real laboratory. A Kinect sensor was used for the detection and analysis of the student's hand movements, which is an example of an NUI. The studies conducted confirmed the effectiveness of the educational virtual laboratory. The extent to which the use of this teaching aid increased the students' progress in learning chemistry was examined. The results indicate that the use of NUI creates opportunities to both enhance and improve the quality of chemistry education. Working in a virtual laboratory using the Kinect interface results in greater emotional involvement and an increased sense of self-efficacy in laboratory work among students. As a consequence, students obtained higher marks and were more interested in the subject of chemistry.

  3. Spin Measurements of an Electron Bound to a Single Phosphorous Donor in Silicon

    NASA Astrophysics Data System (ADS)

    Luhman, D. R.; Nguyen, K.; Tracy, L. A.; Carr, S. M.; Borchardt, J.; Bishop, N. C.; Ten Eyck, G. A.; Pluym, T.; Wendt, J.; Carroll, M. S.; Lilly, M. P.

    2014-03-01

    The spin of an electron bound to a single donor implanted in silicon is potentially useful for quantum information processing. We report on our efforts to measure and manipulate the spin of an electron bound to a single P donor in silicon. A low number of P donors are implanted using a self-aligned process into a silicon substrate in close proximity to a single-electron-transistor (SET) defined by lithographically patterned polysilicon gates. The SET is used to sense the occupancy of the electron on the donor and for spin read-out. An adjacent transmission line allows the application of microwave pulses to rotate the spin of the electron. We will present data from various experiments designed to exploit these capabilities. This work was performed, in part, at the Center for Integrated Nanotechnologies, a U.S. DOE Office of Basic Energy Sciences user facility. The work was supported by Sandia National Laboratories Directed Research and Development Program. Sandia National Laboratories is a multi-program laboratory operated by Sandia Corporation, a Lockheed-Martin Company, for the U. S. Department of Energy under Contract No. DE-AC04-94AL85000.

  4. Automated Processing of Plasma Samples for Lipoprotein Separation by Rate-Zonal Ultracentrifugation.

    PubMed

    Peters, Carl N; Evans, Iain E J

    2016-12-01

    Plasma lipoproteins are the primary means of lipid transport among tissues. Defining alterations in lipid metabolism is critical to our understanding of disease processes. However, lipoprotein measurement is limited to specialized centers. Preparation for ultracentrifugation involves the formation of complex density gradients that is both laborious and subject to handling errors. We created a fully automated device capable of forming the required gradient. The design has been made freely available for download by the authors. It is inexpensive relative to commercial density gradient formers, which generally create linear gradients unsuitable for rate-zonal ultracentrifugation. The design can easily be modified to suit user requirements and any potential future improvements. Evaluation of the device showed reliable peristaltic pump accuracy and precision for fluid delivery. We also demonstrate accurate fluid layering with reduced mixing at the gradient layers when compared to usual practice by experienced laboratory personnel. Reduction in layer mixing is of critical importance, as it is crucial for reliable lipoprotein separation. The automated device significantly reduces laboratory staff input and reduces the likelihood of error. Overall, this device creates a simple and effective solution to formation of complex density gradients. © 2015 Society for Laboratory Automation and Screening.

  5. Economic Impact Forecast System (EIFS). Version 2.0. Users Manual. Supplement II. European Economic Impact Forecast System (EEIFS), Phase 1, (FRG/EIFS Pilot Model).

    DTIC Science & Technology

    1982-05-01

    Technical report from the U.S. Army Construction Engineering Research Laboratory, P.O. Box 4005, Champaign, IL 61820 (AD-A117 661): Economic Impact Forecast System (EIFS), Version 2.0, Users Manual, Supplement II (European Economic Impact Forecast System, Phase 1, FRG/EIFS Pilot Model); available from NTIS, 1982, 71 p.

  6. GRC Ground Support Facilities

    NASA Technical Reports Server (NTRS)

    SaintOnge, Thomas H.

    2010-01-01

    The ISS Program is conducting an "ISS Research Academy" at JSC the first week of August 2010. This Academy will be a tutorial for new users of the International Space Station, focused primarily on the new ISS National Laboratory and its members, including non-profit organizations, other government agencies, and commercial users. Presentations will cover the on-orbit research facilities' accommodations and capabilities, as well as ground-based hardware development, integration, and test facilities and capabilities. This presentation describes the GRC hardware development, test, and laboratory facilities.

  7. WHAM!: a web-based visualization suite for user-defined analysis of metagenomic shotgun sequencing data.

    PubMed

    Devlin, Joseph C; Battaglia, Thomas; Blaser, Martin J; Ruggles, Kelly V

    2018-06-25

    Exploration of large data sets, such as shotgun metagenomic sequence or expression data, by biomedical experts and medical professionals remains a major bottleneck in the scientific discovery process. Although tools for this purpose exist for 16S ribosomal RNA sequencing analysis, there is a growing but still insufficient number of user-friendly interactive visualization workflows for easy data exploration and figure generation. The development of such platforms is necessary to accelerate and streamline microbiome laboratory research. We developed the Workflow Hub for Automated Metagenomic Exploration (WHAM!) as a web-based interactive tool capable of user-directed data visualization and statistical analysis of annotated shotgun metagenomic and metatranscriptomic data sets. WHAM! includes exploratory and hypothesis-based gene and taxa search modules for visualizing differences in microbial taxa and gene family expression across experimental groups, and for creating publication-quality figures without the need for a command-line interface or in-house bioinformatics. WHAM! is an interactive and customizable tool for downstream metagenomic and metatranscriptomic analysis, providing a user-friendly interface that allows easy data exploration by microbiome and ecological experts to facilitate discovery in multi-dimensional and large-scale data sets.

  8. Making Advanced Scientific Algorithms and Big Scientific Data Management More Accessible

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venkatakrishnan, S. V.; Mohan, K. Aditya; Beattie, Keith

    2016-02-14

    Synchrotrons such as the Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory are known as user facilities. They are sources of extremely bright X-ray beams, and scientists come from all over the world to perform experiments that require these beams. As the complexity of experiments has increased, and the size and rates of data sets have exploded, managing, analyzing and presenting the data collected at synchrotrons has been an increasing challenge. The ALS has partnered with high performance computing, fast networking, and applied mathematics groups to create a "super-facility", giving users simultaneous access to the experimental, computational, and algorithmic resources to overcome this challenge. This combination forms an efficient closed loop, where data, despite its high rate and volume, is transferred and processed, in many cases immediately and automatically, on appropriate compute resources, and results are extracted, visualized, and presented to users or to the experimental control system, both to provide immediate insight and to guide decisions about subsequent experiments during beam-time. In this paper, we present work done on advanced tomographic reconstruction algorithms to support users of the 3D micron-scale imaging instrument (Beamline 8.3.2, hard X-ray micro-tomography).
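
    As a point of reference for the reconstruction step, the sketch below runs textbook filtered back-projection on a synthetic phantom using scikit-image. It illustrates the class of algorithm discussed, not the ALS production pipeline or its advanced iterative methods.

        # Textbook filtered back-projection on a synthetic phantom, for
        # illustration only; not the beamline's production algorithms.
        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon

        image = shepp_logan_phantom()                         # known test object
        theta = np.linspace(0.0, 180.0, 180, endpoint=False)  # projection angles
        sinogram = radon(image, theta=theta)                  # simulated measurement
        recon = iradon(sinogram, theta=theta)                 # ramp-filtered FBP

        rms = np.sqrt(np.mean((recon - image) ** 2))
        print(f"RMS reconstruction error: {rms:.4f}")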

  9. Comprehensive analysis of a medication dosing error related to CPOE.

    PubMed

    Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L

    2005-01-01

    This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and interface usability inspection of the ordering system. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in interaction among human and system agents that methods limited in scope to their distinct analytical domains would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.

  10. The Oliktok Point Arctic Research Facility (OPARF)

    NASA Astrophysics Data System (ADS)

    Zak, B. D.; Ivey, M.

    2011-12-01

    For the past year, the US Department of Energy, through Sandia National Laboratories, has operated a Designated User Facility at Oliktok Point, Alaska, on the Arctic Ocean coast near the western end of the Prudhoe Bay oil fields. The primary purpose of this user facility is to accommodate and support manned and unmanned airborne measurement platforms over the Arctic Ocean and adjacent coastline as the arctic sea ice recedes. The sea ice is receding considerably faster than models project, for reasons that are not fully understood. The ultimate objective is to incorporate improved understanding of the radiative and other processes affecting sea ice recession into the relevant climate models. OPARF is based at a USAF Long Range Radar Station, an old Distant Early Warning (DEW) radar station built at the height of the Cold War but still operated to track air traffic over the pole. The USAF has graciously granted Sandia and DOE use of selected facilities at Oliktok on a non-interference basis. DOE also maintains FAA-granted Restricted Airspace over Oliktok Point and the adjacent ocean. DOE has also requested that the FAA establish a Warning Area over international waters 30 miles wide and 700 miles long, stretching from near Oliktok towards the North Pole. That request is currently being processed by the FAA, with the public comment period now closed. This paper will update OPARF developments for potential users of the Oliktok user facility and other interested researchers.

  11. An exploratory investigation of the translation of Pacific Northwest Laboratory's print manuals system to an on-line manuals system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heubach, J.G.; Hunt, S.T.; Pond, L.R.

    1992-06-01

    Information management technology has proliferated in the past decade in response to the information explosion. As documentation accumulates, the need to access information residing in manuals, handbooks and regulations conveniently, accurately, and quickly has increased. However, studies show that only a fraction of the available information is read (Martin, 1978). Consequently, one of the biggest challenges in linking information and electronic management of information is to use the power of communication technology to meet the information needs of the audience. Pacific Northwest Laboratories' (PNL) investigation of translating its print manual system to an on-line system fits this challenge precisely. PNL's manuals contain a tremendous amount of information for which manual holders are responsible. To perform their tasks in compliance with policy and procedure guidelines, users need to access information accurately, conveniently, and quickly. In order to select and use information management tools wisely, answers must be sought to a few basic questions. Communication experts cite four key questions: What do users want? What do users need? What characteristics of an on-line information system affect its usefulness? Who are the users whose wants and needs are to be met? Once these questions are answered, attention can be focused on finding the best match between user requirements and technology characteristics and weighing the costs and benefits of proposed options.

  12. An e-health driven laboratory information system to support HIV treatment in Peru: E-quity for laboratory personnel, health providers and people living with HIV.

    PubMed

    García, Patricia J; Vargas, Javier H; Caballero N, Patricia; Calle V, Javier; Bayer, Angela M

    2009-12-10

    Peru has a concentrated HIV epidemic with an estimated 76,000 people living with HIV (PLHIV). Access to highly active antiretroviral therapy (HAART) expanded between 2004-2006, and the Peruvian National Institute of Health was named by the Ministry of Health as the institution responsible for carrying out testing to monitor the effectiveness of HAART. However, a national public health laboratory information system did not exist. We describe the design and implementation of an e-health driven, web-based laboratory information system--NETLAB--to communicate laboratory results for monitoring HAART to laboratory personnel, health providers and PLHIV. We carried out a needs assessment of the existing public health laboratory system, which included generating and reviewing flowcharts of laboratory testing processes in order to streamline those processes and eliminate duplication. Next, we designed NETLAB as a modular system, integrating key security functions. The system was then implemented and evaluated. The three main components of the NETLAB system, registration, reporting and education, began operating in early 2007. The number of PLHIV with recorded CD4 counts and viral loads increased by 1.5 times, reaching 18,907. Publication of test results with NETLAB took an average of 1 day, compared to a pre-NETLAB average of 60 days. NETLAB reached 2,037 users, including 944 PLHIV and 1,093 health providers, during its first year and a half. The percentage of PLHIV and health providers who were aware of NETLAB and had a NETLAB password has also increased substantially. NETLAB is an effective laboratory management tool, since it is directly integrated into the national laboratory system and streamlined existing processes at the local, regional and national levels. The system also represents the best possible source of timely laboratory information for health providers and PLHIV, allowing patients to access their own results and other helpful information about their health, extending the scope of HIV treatment beyond the health facility and providing a model for other countries to follow. The NETLAB system now includes 100 diseases of public health importance for which the Peruvian National Institute of Health and the network of public health laboratories provide testing and results.

  13. HWINPUT program users' guide

    DOT National Transportation Integrated Search

    1992-02-01

    HWINPUT is a VNTSC-developed user-friendly program written in Microsoft Fortran version 4.01 for the IBM PC/AT. This program is an integral part of the Federal Highway Administration's Mobile Noise Data Gathering and Analysis Laboratory and is ...

  14. 3D Printing in the Laboratory: Maximize Time and Funds with Customized and Open-Source Labware.

    PubMed

    Coakley, Meghan; Hurt, Darrell E

    2016-08-01

    3D printing, also known as additive manufacturing, is the computer-guided process of fabricating physical objects by depositing successive layers of material. It has transformed manufacturing across virtually every industry, bringing about incredible advances in research and medicine. The rapidly growing consumer market now includes convenient and affordable "desktop" 3D printers. These are being used in the laboratory to create custom 3D-printed equipment, and a growing community of designers are contributing open-source, cost-effective innovations that can be used by both professionals and enthusiasts. User stories from investigators at the National Institutes of Health and the biomedical research community demonstrate the power of 3D printing to save valuable time and funding. While adoption of 3D printing has been slow in the biosciences to date, the potential is vast. Market forecasts suggest that within several years, 3D printers could be commonplace in the home; with so many practical uses for 3D printing, we anticipate that the technology will also play an increasingly important role in the laboratory. © 2016 Society for Laboratory Automation and Screening.

  15. An Online Virtual Laboratory of Electricity

    ERIC Educational Resources Information Center

    Gómez Tejedor, J. A.; Moltó Martínez, G.; Barros Vidaurre, C.

    2008-01-01

    In this article, we describe a Java-based virtual laboratory, accessible via the Internet by means of a Web browser. This remote laboratory enables the students to build both direct and alternating current circuits. The program includes a graphical user interface which resembles the connection board, and also the electrical components and tools…

  16. [Clinical application of mass spectrometry in the pediatric field: current topics].

    PubMed

    Yamaguchi, Seiji

    2013-09-01

    Mass spectrometry, including tandem mass spectrometry (MS/MS) and gas chromatography-mass spectrometry (GC/MS), is becoming prominent in the diagnosis of metabolic disorders in the pediatric field. It enables biochemical diagnosis of metabolic disorders from the metabolic profiles obtained by MS/MS and/or GC/MS. In neonatal mass screening for inherited metabolic disease (IMD) using MS/MS, amino acids and acylcarnitines on dried blood spots are analyzed. The target diseases include amino acidemia, urea cycle disorder, organic acidemia, and fatty acid oxidation disorder. In the MS/MS screening, organic acid analysis using GC/MS is required for differential and/or definite diagnosis of the IMDs. GC/MS data processing, however, is difficult, and metabolic diagnosis often requires considerable skill and expertise. We developed an automated system for GC/MS data processing and autodiagnosis, making biochemical diagnosis using GC/MS markedly easier and more user-friendly. Mass spectrometric techniques will expand from research laboratories to clinical laboratories in the near future.

  17. Application of Cognitive Task Analysis in User Requirements and Prototype Design Presentation/Briefing

    DTIC Science & Technology

    2005-10-01

    AFRL-HE-WP-TP-2005-0030, Air Force Research Laboratory. Application of Cognitive Task Analysis in User Requirements Definition and Prototype Design (presentation/briefing by Christopher Curtis); contract FA8650-04-C-6406.

  18. ORNL Fuels, Engines, and Emissions Research Center (FEERC)

    ScienceCinema

    None

    2018-02-13

    This video highlights the Vehicle Research Laboratory's capabilities at the Fuels, Engines, and Emissions Research Center (FEERC). FEERC is a Department of Energy user facility located at the Oak Ridge National Laboratory.

  19. Factors Affecting Acceptance of Hospital Information Systems Based on Extended Technology Acceptance Model: A Case Study in Three Paraclinical Departments.

    PubMed

    Nadri, Hamed; Rahimi, Bahlol; Lotfnezhad Afshar, Hadi; Samadbeik, Mahnaz; Garavand, Ali

    2018-04-01

    Regardless of the acceptance of users, information and communication systems can be considered as a health intervention designed to improve the care delivered to patients. This study aimed to determine the adoption and use of the extended Technology Acceptance Model (TAM2) by users of the hospital information system (HIS) in paraclinical departments, including laboratory, radiology, and nutrition, and to investigate the key factors in the adoption and use of these systems. A standard questionnaire was used to collect data from 253 users of these systems in paraclinical departments of eight university hospitals in two different cities of Iran. A total of 202 questionnaires with valid responses were used in this study (105 in Urmia and 97 in Khorramabad). The data were processed using LISREL and SPSS software, and the statistical analysis technique was based on structural equation modeling (SEM). It was found that the original TAM constructs had a significant impact on the staffs' behavioral intention to adopt HIS in paraclinical departments. The results of this study indicated that cognitive instrumental processes (job relevance, output quality, result demonstrability, and perceived ease of use), except for result demonstrability, were significant predictors of intention to use, whereas the results revealed no significant relationship between social influence processes (subjective norm, voluntariness, and image) and the users' behavioral intention to use the system. The results confirmed that several factors in the TAM2 that were important in previous studies were not significant in paraclinical departments and in government-owned hospitals. Users' behavioral factors are essential for successful usage of the system and should be considered. The study provides valuable information for hospital system providers and policy makers in understanding adoption challenges, as well as practical guidance for the successful implementation of information systems in paraclinical departments. Schattauer GmbH Stuttgart.

  20. Isolation gowns in health care settings: Laboratory studies, regulations and standards, and potential barriers of gown selection and use

    PubMed Central

    Kilinc Balci, F. Selcen

    2016-01-01

    Although they play an important role in infection prevention and control, textile materials and personal protective equipment (PPE) used in health care settings are a known source of cross-infection. Gowns are recommended to prevent transmission of infectious diseases in certain settings; however, laboratory and field studies have produced mixed results on their efficacy. PPE used in health care is regulated as either class I (low risk) or class II (intermediate risk) devices in the United States. Many organizations have published guidelines for the use of PPE, including isolation gowns, in health care settings. In addition, the Association for the Advancement of Medical Instrumentation published a guidance document on the selection of gowns and a classification standard on liquid barrier performance for both surgical and isolation gowns. However, there is currently no standard specific to isolation gowns that considers not only barrier resistance but also the wide array of attributes end users desire. As a result, infection preventionists and purchasing agents face several difficulties in the selection process, and end users have limited or no information on the levels of protection provided by isolation gowns. Lack of knowledge about the performance of protective clothing used in health care became more apparent during the 2014 Ebola epidemic. This article reviews laboratory studies, regulations, guidelines and standards pertaining to isolation gowns, characterization problems, and other potential barriers to isolation gown selection and use. PMID:26391468

  1. Houston Methodist Variant Viewer: An Application to Support Clinical Laboratory Interpretation of Next-generation Sequencing Data for Cancer

    PubMed Central

    Christensen, Paul A.; Ni, Yunyun; Bao, Feifei; Hendrickson, Heather L.; Greenwood, Michael; Thomas, Jessica S.; Long, S. Wesley; Olsen, Randall J.

    2017-01-01

    Introduction: Next-generation sequencing (NGS) is increasingly used in clinical and research protocols for patients with cancer. NGS assays are routinely used in clinical laboratories to detect mutations bearing on cancer diagnosis, prognosis and personalized therapy. A typical assay may interrogate 50 or more gene targets that encompass many thousands of possible gene variants. Analysis of NGS data in cancer is a labor-intensive process that can become overwhelming to the molecular pathologist or research scientist. Although commercial tools for NGS data analysis and interpretation are available, they are often costly, lack key functionality or cannot be customized by the end user. Methods: To facilitate NGS data analysis in our clinical molecular diagnostics laboratory, we created a custom bioinformatics tool termed Houston Methodist Variant Viewer (HMVV). HMVV is a Java-based solution that integrates sequencing instrument output, bioinformatics analysis, storage resources and the end user interface. Results: Compared to the predicate method used in our clinical laboratory, HMVV markedly simplifies the bioinformatics workflow for the molecular technologist and facilitates the variant review by the molecular pathologist. Importantly, HMVV reduces time spent researching the biological significance of the variants detected, standardizes the online resources used to perform the variant investigation and assists generation of the annotated report for the electronic medical record. HMVV also maintains a searchable variant database, including the variant annotations generated by the pathologist, which is useful for downstream quality improvement and research projects. Conclusions: HMVV is a clinical-grade, low-cost, feature-rich, highly customizable platform that we have made available for continued development by the pathology informatics community. PMID:29226007
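
    To make the triage step concrete, the hypothetical sketch below filters called variants by read depth and allele fraction and attaches a lookup URL for pathologist review. The field names, thresholds, and ClinVar query are illustrative assumptions, not HMVV's actual schema or resources.

        # Hypothetical variant-triage step of the kind such a viewer automates:
        # filter calls by depth and allele fraction, then attach a lookup URL
        # for pathologist review. All names and thresholds are illustrative.
        variants = [  # invented example calls
            {"gene": "KRAS", "variant": "p.G12D", "depth": 1500, "af": 0.42},
            {"gene": "TP53", "variant": "p.R175H", "depth": 80, "af": 0.04},
        ]
        MIN_DEPTH, MIN_AF = 100, 0.05     # assumed review thresholds

        for v in variants:
            ok = v["depth"] >= MIN_DEPTH and v["af"] >= MIN_AF
            status = "review" if ok else "below-threshold"
            url = f"https://www.ncbi.nlm.nih.gov/clinvar/?term={v['gene']}"
            print(v["gene"], v["variant"], status, url)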

  2. The impact of SLMTA in improving laboratory quality systems in the Caribbean Region.

    PubMed

    Guevara, Giselle; Gordon, Floris; Irving, Yvette; Whyms, Ismae; Parris, Keith; Beckles, Songee; Maruta, Talkmore; Ndlovu, Nqobile; Albalak, Rachel; Alemnji, George

    Past efforts to improve laboratory quality systems and to achieve accreditation for better patient care in the Caribbean Region have been slow. This study describes the impact of the Strengthening of Laboratory Management Toward Accreditation (SLMTA) training programme and mentorship amongst five clinical laboratories in the Caribbean after 18 months. Five national reference laboratories from four countries participated in the SLMTA programme, which incorporated classroom teaching and implementation of improvement projects. Mentors were assigned to the laboratories to guide trainees on their improvement projects and to assist in the development of Quality Management Systems (QMS). Audits were conducted at baseline, six months, exit (at 12 months) and post-SLMTA (at 18 months) using the Stepwise Laboratory Quality Improvement Process Towards Accreditation (SLIPTA) checklist to measure changes in implementation of the QMS during the period. At the end of each audit, a comprehensive implementation plan was developed in order to address gaps. Baseline audit scores ranged from 19% to 52%, corresponding to 0 stars on the SLIPTA five-star scale. After 18 months, one laboratory reached four stars, two reached three stars and two reached two stars. There was a corresponding decrease in nonconformities and development of over 100 management and technical standard operating procedures in each of the five laboratories. The tremendous improvement in these five Caribbean laboratories shows that SLMTA coupled with mentorship is an effective, user-friendly, flexible and customisable approach to the implementation of laboratory QMS. It is recommended that other laboratories in the region consider using the SLMTA training programme as they engage in quality systems improvement and preparation for accreditation.

  3. Chemical applications of synchrotron radiation: Workshop report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1989-04-01

    The most recent in a series of topical meetings for Advanced Photon Source user subgroups, the Workshop on Chemical Applications of Synchrotron Radiation (held at Argonne National Laboratory, October 3-4, 1988) dealt with surfaces and kinetics, spectroscopy, small-angle scattering, diffraction, and topography and imaging. The primary objectives were to provide an educational resource for the chemistry community on the scientific research being conducted at existing synchrotron sources and to indicate some of the unique opportunities that will be made available with the Advanced Photon Source. The workshop organizers were also interested in gauging the interest of chemists in the field of synchrotron radiation. Interest expressed at the meeting has led to initial steps toward formation of a Chemistry Users Group at the APS. Individual projects are processed separately for the data bases.

  4. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently csv format), performs a series of quality control tests defined by the user, and creates reports that include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML-formatted reports can be sent via email or hosted on a website. To compare performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.
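
    The sketch below illustrates the kind of workflow described: build a time-indexed series with pandas, apply user-defined quality-control tests, and write an HTML summary. The column name, thresholds, and report file name are assumptions for illustration; this is not DPM's actual API.

        # Minimal time-series QC sketch in the spirit of the workflow above.
        # Column name, bounds, and output file are illustrative assumptions.
        import numpy as np
        import pandas as pd

        # Synthetic one-day, 1-minute series standing in for a monitored database.
        idx = pd.date_range("2016-01-01", periods=1440, freq="min")
        power = np.random.default_rng(0).uniform(0, 900, len(idx))
        df = pd.DataFrame({"power_w": power}, index=idx)

        qc_tests = {
            "in_range": lambda s: s.between(0, 1000),   # plausible power bounds
            "not_missing": lambda s: s.notna(),
        }
        lines = [f"{name}: {test(df['power_w']).mean():.1%} of records pass"
                 for name, test in qc_tests.items()]

        with open("qc_report.html", "w") as fh:          # simple HTML summary
            fh.write("<h1>Daily QC report</h1>")
            fh.write(df["power_w"].describe().to_frame().to_html())
            fh.writelines(f"<p>{line}</p>" for line in lines)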

  5. Predicting mesoscale microstructural evolution in electron beam welding

    DOE PAGES

    Rodgers, Theron M.; Madison, Jonathan D.; Tikare, Veena; ...

    2016-03-16

    Using the Stochastic Parallel PARticle Kinetic Simulator (SPPARKS), a kinetic Monte Carlo code from Sandia National Laboratories, a user routine has been developed to simulate mesoscale predictions of grain structure near a moving heat source. Here, we demonstrate the use of this user routine to produce voxelized, synthetic, three-dimensional microstructures for electron-beam welding by comparing them with experimentally produced microstructures. When simulation input parameters are matched to experimental process parameters, qualitative and quantitative agreement for both grain size and grain morphology are achieved. The method is capable of simulating both single- and multipass welds. As a result, the simulations provide an opportunity not only for accelerated design but also for the integration of simulation and experiments in design, such that simulations can receive parameter bounds from experiments and, in turn, provide predictions of the resultant microstructure.
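
    For orientation, the toy sketch below runs a Metropolis Potts-model simulation, the class of kinetic Monte Carlo grain-growth model that SPPARKS implements at scale. It is isothermal, omits the moving heat source, and all parameters are illustrative.

        # Toy Metropolis Potts-model grain-growth sketch (Python); the same
        # class of kinetic Monte Carlo model SPPARKS runs at scale. Isothermal,
        # no moving heat source; all parameters are illustrative.
        import numpy as np

        rng = np.random.default_rng(0)
        N, Q, KT = 64, 20, 0.5      # lattice size, grain orientations, temperature
        spins = rng.integers(0, Q, size=(N, N))

        def unlike_bonds(i, j, s):
            """Count neighbors of site (i, j) whose orientation differs from s."""
            return sum(spins[(i + di) % N, (j + dj) % N] != s
                       for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))

        for _ in range(100_000):    # attempted orientation flips
            i, j = rng.integers(0, N, size=2)
            new = rng.integers(0, Q)
            dE = unlike_bonds(i, j, new) - unlike_bonds(i, j, spins[i, j])
            if dE <= 0 or rng.random() < np.exp(-dE / KT):   # Metropolis rule
                spins[i, j] = new

        boundary = 0.5 * (np.mean(spins != np.roll(spins, 1, axis=0))
                          + np.mean(spins != np.roll(spins, 1, axis=1)))
        print(f"grain-boundary density after annealing: {boundary:.3f}")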

  6. Center for Integrated Nanotechnologies 2011 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanders, Antonya

    We are pleased to share with you this 2011 edition of the Annual Report from the Center for Integrated Nanotechnologies (CINT) and the growing excitement we feel around cementing our brand as a leader in nanoscience integration. This can be seen most readily in the momentum we have achieved in our signature Integration Focus Activities (IFAs). These efforts unite our scientists across our four scientific Thrust areas with our users to concentrate research on larger-scale nanoscience integration challenges for specific classes of nanomaterials, systems, and phenomena. All three of our current IFAs (p. 10) now have a full head of steam, and nearly 30% of our current user projects map in some meaningful way to one of these IFAs. As part of our redoubled effort to increase our industrial user base, we are also looking to leverage these IFAs to build a stronger link to and spur recruitment within our industrial user community. We believe that the IFAs are a natural community-building tool with an intrinsic value proposition for industry: an R&D pipeline that can lead to more mature, more commercially well-positioned technologies. Finally, as nanoscience and nanotechnology are maturing, we as a research community are beginning to see our efforts extend in many exciting new directions. Our focus on nanoscience integration positions us very well to capitalize on new opportunities, including the emerging Mesoscale Initiative within the DOE Office of Science. Many aspects of mesoscale science are embodied in the integration of nanoscale building blocks. We are equally proud of our continuing strong performance in support of our user program. We have fully transitioned to our new user proposal database, providing enhanced convenience and flexibility for proposal submission and review. In our two regular proposal calls this year we received a total of 225 proposals, an increase of 10% over our 2010 performance. Our official count of users for the period remains at approximately 350 and continues to reflect full engagement of our scientific staff. We are also seeing a steady increase in our industrial user base, with the number of industrial proposals (including Rapid Access proposals) doubling in 2011. We attribute this in part to our outreach efforts, including our focused industrial session in each of our past two annual User Conferences. The Center for Integrated Nanotechnologies (CINT) is a Department of Energy/Office of Science Nanoscale Science Research Center (NSRC) operating as a national user facility devoted to establishing the scientific principles that govern the design, performance, and integration of nanoscale materials. Jointly operated by Los Alamos and Sandia National Laboratories, CINT explores the continuum from scientific discovery to use-inspired research, with a focus on the integration of nanoscale materials and structures to achieve new properties and performance and their incorporation into the micro- and macro worlds. Through its Core Facility at Sandia National Laboratories and its Gateway Facility at Los Alamos National Laboratory, CINT provides open access to tools and expertise needed to explore the continuum from scientific discovery to the integration of nanostructures into the micro- and macro worlds.
    In its overall operations, CINT strives to achieve the following goals common to all Nanoscale Science Research Centers: (1) conduct forefront research in nanoscale science; (2) operate as a user facility for scientific research; (3) provide user access to the relevant BES-supported expertise and capabilities at the host national laboratory; and (4) leverage other relevant national laboratory capabilities to enhance scientific opportunities for the nanoscience user community. Two additional goals are specific to the unique CINT mission: (5) establish and lead a scientific community dedicated to solving nanoscale science integration challenges; and (6) create a single user facility program that combines expertise and facilities at both Los Alamos and Sandia National Laboratories. The CINT user program provides the international scientific community with open access to world-class scientific staff and state-of-the-art facilities for theory and simulation, nanomaterials synthesis and characterization, and unique capabilities for nanoscale materials integration, from the level of nanoscale synthesis to the fabrication of micro- and macroscale structures and devices. The staff of CINT includes laboratory scientists, postdocs and technical support staff who are leaders in the nanoscience research programs in CINT scientific thrust areas: (1) Nanoscale Electronics and Mechanics, (2) Nanophotonics and Optical Nanomaterials, (3) Soft, Biological and Composite Nanomaterials, and (4) Theory and Simulation of Nanoscale Phenomena.

  7. URPD: a specific product primer design tool

    PubMed Central

    2012-01-01

    Background Polymerase chain reaction (PCR) plays an important role in molecular biology. Primer design fundamentally determines its results. Here, we present currently available software that is aimed not at analyzing large sequences but at providing a straightforward way of visualizing the primer design process for infrequent users. Findings URPD (yoUR Primer Design), a web-based specific product primer design tool, combines the NCBI Reference Sequences (RefSeq), UCSC In-Silico PCR, and memetic algorithm (MA) and genetic algorithm (GA) primer design methods to obtain specific primer sets. A friendly user interface is achieved through built-in parameter settings. The incorporated smooth pipeline operations effectively guide both occasional and advanced users. URPD contains an automated process, which produces feasible primer pairs that satisfy the specific needs of the experimental design with practical PCR amplifications. Visual virtual gel electrophoresis and in silico PCR provide a simulated PCR environment. Comparing practical gel electrophoresis results to the virtual gel facilitates and verifies the PCR experiment. Wet-laboratory validation proved that the system provides feasible primers. Conclusions URPD is a user-friendly tool that provides specific primer design results. The pipeline design path makes it easy to operate for beginners. URPD also provides a high-throughput primer design function. Moreover, the advanced parameter settings assist sophisticated researchers in performing experimental PCR. Several novel functions, such as nucleotide accession number template sequence input, local and global specificity estimation, primer pair redesign, user-interactive sequence scale selection, and virtual versus practical PCR gel electrophoresis comparison, have been developed and integrated into URPD. The URPD program is implemented in JAVA and freely available at http://bio.kuas.edu.tw/urpd/. PMID:22713312

  8. URPD: a specific product primer design tool.

    PubMed

    Chuang, Li-Yeh; Cheng, Yu-Huei; Yang, Cheng-Hong

    2012-06-19

    Polymerase chain reaction (PCR) plays an important role in molecular biology. Primer design fundamentally determines its results. Here, we present currently available software that is aimed not at analyzing large sequences but at providing a straightforward way of visualizing the primer design process for infrequent users. URPD (yoUR Primer Design), a web-based specific product primer design tool, combines the NCBI Reference Sequences (RefSeq), UCSC In-Silico PCR, and memetic algorithm (MA) and genetic algorithm (GA) primer design methods to obtain specific primer sets. A friendly user interface is achieved through built-in parameter settings. The incorporated smooth pipeline operations effectively guide both occasional and advanced users. URPD contains an automated process, which produces feasible primer pairs that satisfy the specific needs of the experimental design with practical PCR amplifications. Visual virtual gel electrophoresis and in silico PCR provide a simulated PCR environment. Comparing practical gel electrophoresis results to the virtual gel facilitates and verifies the PCR experiment. Wet-laboratory validation proved that the system provides feasible primers. URPD is a user-friendly tool that provides specific primer design results. The pipeline design path makes it easy to operate for beginners. URPD also provides a high-throughput primer design function. Moreover, the advanced parameter settings assist sophisticated researchers in performing experimental PCR. Several novel functions, such as nucleotide accession number template sequence input, local and global specificity estimation, primer pair redesign, user-interactive sequence scale selection, and virtual versus practical PCR gel electrophoresis comparison, have been developed and integrated into URPD. The URPD program is implemented in JAVA and freely available at http://bio.kuas.edu.tw/urpd/.
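
    As a taste of the arithmetic behind primer screening, the snippet below computes GC content and the textbook Wallace-rule melting temperature, Tm = 2(A+T) + 4(G+C), valid for short oligonucleotides. This is standard chemistry, not URPD's actual MA/GA scoring functions, and the example sequence is invented.

        # GC content and Wallace-rule melting temperature, Tm = 2(A+T) + 4(G+C),
        # for short oligos; textbook arithmetic, not URPD's MA/GA scoring.
        def gc_content(primer: str) -> float:
            primer = primer.upper()
            return (primer.count("G") + primer.count("C")) / len(primer)

        def wallace_tm(primer: str) -> int:
            primer = primer.upper()
            at = primer.count("A") + primer.count("T")
            gc = primer.count("G") + primer.count("C")
            return 2 * at + 4 * gc

        primer = "ATGCGTACCTGAAGTCA"   # invented example sequence
        print(f"GC {gc_content(primer):.0%}, Tm ~{wallace_tm(primer)} degrees C")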

  9. Predicting Mental Imagery-Based BCI Performance from Personality, Cognitive Profile and Neurophysiological Patterns

    PubMed Central

    Jeunet, Camille; N’Kaoua, Bernard; Subramanian, Sriram; Hachet, Martin; Lotte, Fabien

    2015-01-01

    Mental-Imagery based Brain-Computer Interfaces (MI-BCIs) allow their users to send commands to a computer using their brain activity alone (typically measured by ElectroEncephaloGraphy, EEG), which is processed while they perform specific mental tasks. While very promising, MI-BCIs remain barely used outside laboratories because of the difficulty users encounter in controlling them. Indeed, although some users obtain good control performances after training, a substantial proportion remains unable to reliably control an MI-BCI. This large variability in user performance led the community to look for predictors of MI-BCI control ability. However, these predictors were only explored for motor-imagery based BCIs, and mostly for a single training session per subject. In this study, 18 participants were instructed to learn to control an EEG-based MI-BCI by performing 3 MI-tasks, 2 of which were non-motor tasks, across 6 training sessions on 6 different days. Relationships between the participants' BCI control performances and their personality, cognitive profile and neurophysiological markers were explored. While no relevant relationships with neurophysiological markers were found, strong correlations between MI-BCI performances and mental-rotation scores (reflecting spatial abilities) were revealed. Also, a predictive model of MI-BCI performance based on psychometric questionnaire scores was proposed. A leave-one-subject-out cross-validation process revealed the stability and reliability of this model: it predicted participants' performance with a mean error of less than 3 points. This study determined how users' profiles impact their MI-BCI control ability and thus clears the way for designing novel MI-BCI training protocols adapted to the profile of each user. PMID:26625261
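
    The validation scheme the study describes is easy to sketch: with one feature vector per subject, leave-one-subject-out cross-validation reduces to leave-one-out. The placeholder below fits a linear model on random stand-in data; it is not the study's actual model or measurements.

        # Leave-one-subject-out sketch of the predictive model described,
        # using a linear model on questionnaire scores. The data here are
        # random placeholders, not the study's psychometric measurements.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut

        rng = np.random.default_rng(1)
        X = rng.normal(size=(18, 3))     # e.g. mental-rotation and other scores
        y = 60 + 5 * X[:, 0] + rng.normal(scale=2.0, size=18)  # synthetic accuracy

        errors = []
        for train, test in LeaveOneOut().split(X):   # one subject held out per fold
            model = LinearRegression().fit(X[train], y[train])
            errors.append(abs(model.predict(X[test])[0] - y[test][0]))

        print(f"mean absolute prediction error: {np.mean(errors):.2f} points")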

  10. GREENSCOPE: A Method for Modeling Chemical Process ...

    EPA Pesticide Factsheets

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered on three focal points. One is a taxonomy of impacts that describes the indicators and provides absolute scales for their evaluation. Setting best- and worst-case limits for each indicator allows the user to know the status of the process under study relative to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is advancing definitions of data needs for the many indicators of the taxonomy. Each indicator has specific data that are necessary for its calculation. Values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evaluation.
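
    One plausible way to express the best/worst-limit scaling described above is a linear map of an indicator's actual value onto a 0-100% scale between its worst- and best-case limits, as sketched below. The general shape follows the text, but the function name and the example limits and values are illustrative assumptions.

        # Linear best/worst-limit scaling for a sustainability indicator, as
        # the text describes: 0% at the worst-case limit, 100% at the best.
        # Function name and example numbers are illustrative assumptions.
        def indicator_score(actual, worst, best):
            """Percent score of an indicator between its worst and best limits."""
            score = (actual - worst) / (best - worst) * 100.0
            return max(0.0, min(100.0, score))   # clamp values beyond the limits

        # e.g. energy intensity in MJ/kg product (lower is better, so best < worst)
        print(indicator_score(actual=12.0, worst=30.0, best=5.0))  # -> 72.0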

  11. EML Gamma Spectrometry Data Evaluation Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Decker, Karin M.

    2001-01-01

    This report presents the results of the analyses for the third EML Gamma Spectrometry Data Evaluation Program (October 1999). This program assists laboratories in providing more accurate gamma spectra analysis results and provides a means for users of gamma data to assess how a laboratory performed on various types of gamma spectrometry analyses. This is accomplished through the use of synthetic gamma spectra. A calibration spectrum, a background spectrum, and three sample spectra are sent to each participant in the spectral file format requested by the laboratory. The calibration spectrum contains nuclides covering the energy range from 59.5 keV to 1836 keV. The participants are told fallout and fission product nuclides could be present. The sample spectra are designed to test the ability of the software and user to properly resolve multiplets and to identify and quantify nuclides in a complicated fission product spectrum. The participants were asked to report values and uncertainties as becquerels per sample with no decay correction. Thirty-one sets of results were reported from a total of 60 laboratories that received the spectra. Six foreign laboratories participated. The percentage of results within 1σ of the expected value was 68, 33, and 46 for samples 1, 2, and 3, respectively. Across all three samples, 18% of the results were more than 3σ from the expected value. Eighty-three (12%) values out of a total of 682 expected results were not reported for the three samples. Approximately 30% of these false negatives were due to laboratories not reporting 144Pr in sample 2, which was present at the minimum detectable activity level. There were 53 false positives reported, with 25% of these responses due to problems with background subtraction. The results show improvement in the ability of the software or user to resolve peaks separated by 1 keV. Improvement is still needed either in the analysis report produced by the software or in the review of these results by the users.
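
    The comparison implied above, scoring each reported activity against the known value in units of its standard uncertainty, can be sketched as follows. The nuclides, numbers, and flag thresholds below are invented for illustration, not taken from the program's data.

        # Hedged sketch: score a reported activity against the known value in
        # units of its standard uncertainty. Numbers are invented.
        def sigma_deviation(reported, expected, uncertainty):
            """Number of standard deviations between reported and expected."""
            return (reported - expected) / uncertainty

        results = [("137Cs", 102.0, 100.0, 2.5), ("144Pr", 4.1, 5.0, 0.4)]
        for nuclide, reported, expected, unc in results:
            z = sigma_deviation(reported, expected, unc)
            flag = "ok" if abs(z) <= 1 else ("warning" if abs(z) <= 3 else "action")
            print(f"{nuclide}: {z:+.1f} sigma ({flag})")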

  12. Simulation and experimental studies of operators' decision styles and crew composition while using an ecological and traditional user interface for the control room of a nuclear power plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meshkati, N.; Buller, B.J.; Azadeh, M.A.

    1995-04-01

    The goal of this research is threefold: (1) use of the Skill-, Rule-, and Knowledge-based levels of cognitive control -- the SRK framework -- to develop an integrated information processing conceptual framework (for integration of workstation, job, and team design); (2) to evaluate the user interface component of this framework -- the Ecological display; and (3) to analyze the effect of operators' individual information processing behavior and decision styles on handling plant disturbances, plus their performance on, and preference for, Traditional and Ecological user interfaces. A series of studies were conducted. In Part I, a computer simulation model and a mathematical model were developed. In Part II, an experiment was designed and conducted at the EBR-II plant of the Argonne National Laboratory-West in Idaho Falls, Idaho. It is concluded that: the integrated SRK-based information processing model for control room operations is superior to the conventional rule-based model; operators' individual decision styles and the combination of their styles play a significant role in effective handling of nuclear power plant disturbances; use of the Ecological interface results in significantly more accurate event diagnosis and recall of various plant parameters, faster response to plant transients, and higher ratings of subject preference; and operators' decision styles affect both their performance on, and their preference for, the Ecological interface.

  13. JPSS Science Data Services for the Direct Readout Community

    NASA Technical Reports Server (NTRS)

    Chander, Gyanesh; Lutz, Bob

    2014-01-01

    The Suomi National Polar-orbiting Partnership (S-NPP) and Joint Polar Satellite System (JPSS) High Rate Data (HRD) link provides Direct Broadcast data to users in real time, utilizing their own remote field terminals. The Field Terminal Support (FTS) element supplies the resources the Direct Readout communities need, providing software, documentation, and periodic updates that enable them to produce data products from S-NPP and JPSS. The FTS distribution server will also provide the necessary ancillary and auxiliary data needed for processing the broadcasts, as well as making orbital data available to assist in locating the satellites of interest. In addition, the FTS provides development support for the algorithms and software through the GSFC Direct Readout Laboratory (DRL) International Polar Orbiter Processing Package (IPOPP) and the University of Wisconsin (UWISC) Community Satellite Processing Package (CSPP), enabling users to integrate the algorithms into their remote terminals. The support the JPSS Program provides to the institutions developing and maintaining these two software packages will demonstrate the ability to produce ready-to-use products from the HRD link and provide a risk reduction effort at minimal cost. This paper discusses the key functions and system architecture of FTS.

  14. Evaluation of the Capillary Blood Glucose Self-monitoring Program

    PubMed Central

    Augusto, Mariana Cristina; Nitsche, Maria José Trevizani; Parada, Cristina Maria Garcia de Lima; Zanetti, Maria Lúcia; Carvalhaes, Maria Antonieta de Barros Leite

    2014-01-01

    OBJECTIVE: to evaluate the structure, process and results of the Capillary Blood Glucose Self-monitoring Program in a Brazilian city. METHOD: epidemiological, cross-sectional study. The methodological framework of Donabedian was used to construct indicators of structure, process and outcome. A random sample of enrolled users (n = 288) and 96 health professionals who worked in the program were studied. Two questionnaires constructed for this study were used, one for professionals and one for users, both containing data for the evaluation of structure, process and outcome. Anthropometric measures and laboratory results were collected by consulting the patients' health records. The analysis involved descriptive statistics. RESULTS: most of the professionals were not qualified to work in the program and were not knowledgeable about the set of criteria for patient registration. None of the patients received complete and correct orientation about the program, and the percentage with the skills to manage their own care autonomously was 10%. As regards the result indicators, 86.4% of the patients and 81.3% of the professionals evaluated the program positively. CONCLUSION: the evaluation indicators designed revealed that one of the main objectives of the program, self-care skills, has not been achieved. PMID:25493676

  15. 42 CFR 93.213 - Institution.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES PUBLIC HEALTH SERVICE POLICIES ON RESEARCH... research, biomedical or behavioral research training, or activities related to that research or training... research laboratories, research and development centers, national user facilities, industrial laboratories...

  16. The Effect of Procedural Guidance on Students' Skill Enhancement in a Virtual Chemistry Laboratory

    ERIC Educational Resources Information Center

    Ullah, Sehat; Ali, Numan; Rahman, Sami Ur

    2016-01-01

    Various cognitive aids (such as change of color, arrows, etc.) are provided in virtual environments to assist users in task realization. These aids increase users' performance but lead to reduced learning because there is less cognitive load on the users. In this paper we present a new concept of procedural guidance in which textual information…

  17. Use of artificial intelligence in analytical systems for the clinical laboratory

    PubMed Central

    Truchaud, Alain; Ozawa, Kyoichi; Pardue, Harry; Schnipelsky, Paul

    1995-01-01

    The incorporation of information-processing technology into analytical systems in the form of standard computing software has recently been advanced by the introduction of artificial intelligence (AI), both as expert systems and as neural networks. This paper considers the role of software in system operation, control and automation, and attempts to define intelligence. AI is characterized by its ability to deal with incomplete and imprecise information and to accumulate knowledge. Expert systems, building on standard computing techniques, depend heavily on the domain experts and knowledge engineers that have programmed them to represent the real world. Neural networks are intended to emulate the pattern-recognition and parallel processing capabilities of the human brain and are taught rather than programmed. The future may lie in a combination of the recognition ability of the neural network and the rationalization capability of the expert system. In the second part of the paper, examples are given of applications of AI in stand-alone systems for knowledge engineering and medical diagnosis and in embedded systems for failure detection, image analysis, user interfacing, natural language processing, robotics and machine learning, as related to clinical laboratories. It is concluded that AI constitutes a collective form of intellectual property, and that there is a need for better documentation, evaluation and regulation of the systems already being used in clinical laboratories. PMID:18924784

  18. User Manual and Source Code for a LAMMPS Implementation of Constant Energy Dissipative Particle Dynamics (DPD-E)

    DTIC Science & Technology

    2014-06-01

    User Manual and Source Code for a LAMMPS Implementation of Constant Energy Dissipative Particle Dynamics (DPD-E), by James P. Larentzos ...; Army Research Laboratory, Aberdeen Proving Ground, MD 21005-5069; report ARL-SR-290, June 2014 (covering September 2013-February 2014).

  19. Voting with Their Seats: Computer Laboratory Design and the Casual User

    ERIC Educational Resources Information Center

    Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David

    2007-01-01

    Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…

  20. A Choice of Terminals: Spatial Patterning in Computer Laboratories

    ERIC Educational Resources Information Center

    Spennemann, Dirk; Cornforth, David; Atkinson, John

    2007-01-01

    Purpose: This paper seeks to examine the spatial patterns of student use of machines in each laboratory to determine whether there are underlying commonalities. Design/methodology/approach: The research was carried out by assessing user behaviour in 16 computer laboratories at a regional university in Australia. Findings: The study found that computers…

  1. Future Shop: A Model Career Placement & Transition Laboratory.

    ERIC Educational Resources Information Center

    Floyd, Deborah L.; And Others

    During 1988-89, the Collin County Community College District (CCCCD) conducted a project to develop, implement, and evaluate a model career laboratory called a "Future Shop." The laboratory was designed to let users explore diverse career options, job placement opportunities, and transfer resources. The Future Shop lab had three major components:…

  2. A user-friendly approach to cost accounting in laboratory animal facilities.

    PubMed

    Baker, David G

    2011-08-19

    Cost accounting is an essential activity for laboratory animal facility management. In this report, the author describes basic principles of cost accounting and outlines steps for carrying out cost accounting in laboratory animal facilities. Methods of post hoc cost accounting analysis for maximizing the efficiency of facility operations are also described.

  3. Radiological Monitoring Equipment For Real-Time Quantification Of Area Contamination In Soils And Facility Decommissioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. V. Carpenter; Jay A. Roach; John R Giles

    2005-09-01

    The environmental restoration industry offers several systems that perform scan-type characterization of radiologically contaminated areas. The Idaho National Laboratory (INL) has developed and deployed a suite of field systems that rapidly scan, characterize, and analyse radiological contamination in surface soils. The base system consists of a detector, such as a sodium iodide (NaI) spectrometer, a global positioning system (GPS), and an integrated user-friendly computer interface. This mobile concept was initially developed to provide precertification analyses of soils contaminated with uranium, thorium, and radium at the Fernald Closure Project, near Cincinnati, Ohio. INL has expanded the functionality of this basic system to create a suite of integrated field-deployable analytical systems. Using its engineering and radiation measurement expertise, aided by computer hardware and software support, INL has streamlined the data acquisition and analysis process to provide real-time information presented on wireless screens and in the form of coverage maps immediately available to field technicians. In addition, custom software offers a user-friendly interface with user-selectable alarm levels and automated data quality monitoring functions that validate the data. This system is deployed from various platforms, depending on the nature of the survey. The deployment platforms include a small all-terrain vehicle used to survey large, relatively flat areas, a hand-pushed unit for areas where manoeuvrability is important, an excavator-mounted system used to scan pits and trenches where personnel access is restricted, and backpack-mounted systems to survey rocky shoreline features and other physical settings that preclude vehicle-based deployment. Variants of the base system include sealed proportional counters for measuring actinides (i.e., plutonium-238 and americium-241) in building demolitions, soil areas, roadbeds, and process line routes at the Miamisburg Closure Project near Dayton, Ohio. In addition, INL supports decontamination operations at the Oak Ridge National Laboratory.

  4. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud

    PubMed Central

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Background Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. Results We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. Conclusions This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation. PMID:26501966

  5. Usability Evaluation of Laboratory Information Systems.

    PubMed

    Mathews, Althea; Marc, David

    2017-01-01

    Numerous studies have revealed widespread clinician frustration with the usability of electronic health records (EHRs) that is counterproductive to adoption of EHR systems to meet the aims of health-care reform. With poor system usability comes increased risk of negative unintended consequences. Usability issues could lead to user error and workarounds that have the potential to compromise patient safety and negatively impact the quality of care.[1] While there is ample research on EHR usability, there is little information on the usability of laboratory information systems (LISs). Yet, LISs facilitate the timely provision of a great deal of the information needed by physicians to make patient care decisions.[2] Medical and technical advances in genomics that require processing of an increased volume of complex laboratory data further underscore the importance of developing user-friendly LISs. This study aims to add to the body of knowledge on LIS usability. A survey was distributed among LIS users at hospitals across the United States. The survey consisted of the ten-item System Usability Scale (SUS). In addition, participants were asked to rate the ease of performing 24 common tasks with a LIS. Finally, respondents provided comments on what they liked and disliked about using the LIS to provide diagnostic insight into LIS perceived usability. The overall mean SUS score of 59.7 for the LISs evaluated is significantly lower than the benchmark of 68 (P < 0.001). All LISs evaluated received mean SUS scores below 68 except for Orchard Harvest (78.7). While years of experience using the LIS were found to have a statistically significant influence on mean SUS scores, the combined effect of years of experience and LIS used did not account for the statistically significant difference in mean SUS scores between Orchard Harvest and each of the other LISs evaluated. The results of this study indicate that the overall usability of LISs is poor, lagging behind that of the systems evaluated across 446 usability surveys.
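
    For context on the scale used here: a SUS score is derived from ten 1-5 Likert items and normalized onto a 0-100 range, which is what makes the benchmark of 68 comparable across systems. A minimal sketch of the standard scoring rule:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 responses.

    Odd-numbered items are positively worded (contribution = r - 1);
    even-numbered items are negatively worded (contribution = 5 - r).
    The summed contributions are scaled by 2.5 onto a 0-100 range.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses, each from 1 to 5")
    return 2.5 * sum(r - 1 if i % 2 == 0 else 5 - r
                     for i, r in enumerate(responses))

print(sus_score([3] * 10))  # 50.0, well below the 68 benchmark
```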

  6. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud.

    PubMed

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation.

  7. Prospective memory functioning among ecstasy/polydrug users: evidence from the Cambridge Prospective Memory Test (CAMPROMPT).

    PubMed

    Hadjiefthyvoulou, Florentia; Fisk, John E; Montgomery, Catharine; Bridges, Nikola

    2011-06-01

    Prospective memory (PM) deficits in recreational drug users have been documented in recent years. However, the assessment of PM has largely been restricted to self-reported measures that fail to capture the distinction between event-based and time-based PM. The aim of the present study is to address this limitation. Extending our previous research, we augmented the range of laboratory measures of PM by employing the CAMPROMPT test battery to investigate the impact of illicit drug use on prospective remembering in a sample of cannabis-only users, ecstasy/polydrug users, and non-users of illicit drugs, separating event- and time-based PM performance. We also administered measures of executive function and retrospective memory in order to establish whether ecstasy/polydrug deficits in PM were mediated by group differences in these processes. Ecstasy/polydrug users performed significantly worse on both event- and time-based prospective memory tasks in comparison to both the cannabis-only and non-user groups. Furthermore, it was found that across the whole sample, better retrospective memory and executive functioning were associated with superior PM performance. Nevertheless, this association did not mediate the drug-related effects that were observed. Consistent with our previous study, recreational use of cocaine was linked to PM deficits. PM deficits have again been found among ecstasy/polydrug users, which appear to be unrelated to group differences in executive function and retrospective memory. However, the possibility that these are attributable to cocaine use cannot be excluded.

  8. The EnzymeTracker: an open-source laboratory information management system for sample tracking.

    PubMed

    Triplet, Thomas; Butler, Gregory

    2012-01-26

    In many laboratories, researchers store experimental data on their own workstation using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets inherent limitations, a number of proprietary systems have been developed, which laboratories need to pay expensive license fees for. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is freely available online at http://cubique.fungalgenomics.ca/enzymedb/index.html under the GNU GPLv3 license.
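
    The QR matrix barcoding mentioned above is straightforward to reproduce: each sample identifier is encoded as a QR image that can be printed onto a label. The sketch below is not the EnzymeTracker's own code; it assumes the third-party qrcode package (pip install "qrcode[pil]") and an invented sample ID scheme.

```python
import qrcode

def make_sample_label(sample_id: str, out_path: str) -> None:
    """Encode a sample identifier as a QR matrix barcode image."""
    img = qrcode.make(sample_id)  # returns a PIL-backed image object
    img.save(out_path)

# Hypothetical sample ID, purely for illustration:
make_sample_label("ENZ-2012-000042", "ENZ-2012-000042.png")
```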

  9. The EnzymeTracker: an open-source laboratory information management system for sample tracking

    PubMed Central

    2012-01-01

    Background In many laboratories, researchers store experimental data on their own workstation using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets inherent limitations, a number of proprietary systems have been developed, which laboratories need to pay expensive license fees for. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. Results In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. Conclusions The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists. The EnzymeTracker is freely available online at http://cubique.fungalgenomics.ca/enzymedb/index.html under the GNU GPLv3 license. PMID:22280360

  10. Data analysis considerations for pesticides determined by National Water Quality Laboratory schedule 2437

    USGS Publications Warehouse

    Shoda, Megan E.; Nowell, Lisa H.; Stone, Wesley W.; Sandstrom, Mark W.; Bexfield, Laura M.

    2018-04-02

    In 2013, the U.S. Geological Survey National Water Quality Laboratory (NWQL) made a new method available for the analysis of pesticides in filtered water samples: laboratory schedule 2437. Schedule 2437 is an improvement on previous analytical methods because it determines the concentrations of 225 fungicides, herbicides, insecticides, and associated degradates in one method at similar or lower concentrations than previously available methods. Additionally, the pesticides included in schedule 2437 were strategically identified in a prioritization analysis that assessed likelihood of occurrence, prevalence of use, and potential toxicity. When the NWQL reports pesticide concentrations for analytes in schedule 2437, the laboratory also provides supplemental information useful to data users for assessing method performance and understanding data quality. That supplemental information is discussed in this report, along with an initial analysis of analytical recovery of pesticides in water-quality samples analyzed by schedule 2437 during 2013–2015. A total of 523 field matrix spike samples and their paired environmental samples and 277 laboratory reagent spike samples were analyzed for this report (1,323 samples total). These samples were collected in the field as part of the U.S. Geological Survey National Water-Quality Assessment groundwater and surface-water studies and as part of the NWQL quality-control program. This report reviews how pesticide samples are processed by the NWQL, addresses how to obtain all the data necessary to interpret pesticide concentrations, explains the circumstances that result in a reporting level change or the occurrence of a raised reporting level, and describes the calculation and assessment of recovery. This report also discusses reasons why a data user might choose to exclude data in an interpretive analysis and outlines the approach used to identify the potential for decreased data quality in the assessment of method recovery. The information provided in this report is essential to understanding pesticide data determined by schedule 2437 and should be reviewed before interpretation of these data.
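
    The recovery assessment discussed in the report follows the standard matrix-spike convention; a minimal sketch of that arithmetic (not code from the report; all values must share the same units):

```python
def percent_recovery(spiked_conc, background_conc, spike_added):
    """Percent recovery for a field matrix spike sample.

    spiked_conc:     concentration measured in the spiked sample
    background_conc: concentration in the paired environmental sample
    spike_added:     known concentration added by the spike
    """
    return 100.0 * (spiked_conc - background_conc) / spike_added

# e.g., 0.095 ug/L measured, 0.010 ug/L background, 0.100 ug/L added:
print(round(percent_recovery(0.095, 0.010, 0.100), 1))  # 85.0
```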

  11. Automatic Radiated Susceptibility Test System for Payload Equipment

    NASA Technical Reports Server (NTRS)

    Ngo, Hoai T.; Sturman, John C.; Sargent, Noel B.

    1995-01-01

    An automatic radiated susceptibility test system (ARSTS) was developed for NASA Lewis Research Center's Electro-magnetic Interference laboratory. According to MSFC-SPEC 521B, any electrical or electronic equipment that will be transported by the spacelab and space shuttle must be tested for susceptibility to electromagnetic interference. This state-of-the-art automatic test system performs necessary calculations; analyzes, processes, and records a great quantity of measured data; and monitors the equipment being tested in real-time and with minimal user intervention. ARSTS reduces costly test time, increases test accuracy, and provides reliable test results.

  12. KNET - DISTRIBUTED COMPUTING AND/OR DATA TRANSFER PROGRAM

    NASA Technical Reports Server (NTRS)

    Hui, J.

    1994-01-01

    KNET facilitates distributed computing between a UNIX compatible local host and a remote host which may or may not be UNIX compatible. It is capable of automatic remote login. That is, it performs on the user's behalf the chore of handling host selection, user name, and password to the designated host. Once the login has been successfully completed, the user may interactively communicate with the remote host. Data output from the remote host may be directed to the local screen, to a local file, and/or to a local process. Conversely, data input from the keyboard, a local file, or a local process may be directed to the remote host. KNET takes advantage of the multitasking and terminal mode control features of the UNIX operating system. A parent process is used as the upper layer for interfacing with the local user. A child process is used for a lower layer for interfacing with the remote host computer, and optionally one or more child processes can be used for the remote data output. Output may be directed to the screen and/or to the local processes under the control of a data pipe switch. In order for KNET to operate, the local and remote hosts must observe a common communications protocol. KNET is written in ANSI standard C-language for computers running UNIX. It has been successfully implemented on several Sun series computers and a DECstation 3100 and used to run programs remotely on VAX VMS and UNIX based computers. It requires 100K of RAM under SunOS and 120K of RAM under DEC RISC ULTRIX. An electronic copy of the documentation is provided on the distribution medium. The standard distribution medium for KNET is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. KNET was developed in 1991 and is a copyrighted work with all copyright vested in NASA. UNIX is a registered trademark of AT&T Bell Laboratories. Sun and SunOS are trademarks of Sun Microsystems, Inc. DECstation, VAX, VMS, and ULTRIX are trademarks of Digital Equipment Corporation.
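
    KNET's layered design (a parent process for the local user, a child process for the remote host, and a "data pipe switch" fanning output to the screen, a file, or another process) maps naturally onto modern process APIs. The sketch below re-creates the pattern in Python, with ssh standing in for KNET's own transport; the host, command, and log file name are placeholders.

```python
import subprocess
import sys

def run_remote(host: str, command: str, logfile: str) -> int:
    """Run a command on a remote host, streaming its output to both
    the local screen and a local file (a two-way "pipe switch")."""
    child = subprocess.Popen(["ssh", host, command],
                             stdout=subprocess.PIPE, text=True)
    with open(logfile, "w") as log:
        for line in child.stdout:   # stream output as it arrives
            sys.stdout.write(line)  # direct to the local screen
            log.write(line)         # ... and to a local file
    return child.wait()

if __name__ == "__main__":
    run_remote("remote.example.org", "uname -a", "session.log")
```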

  13. ProteoSign: an end-user online differential proteomics statistical analysis platform.

    PubMed

    Efstathiou, Georgios; Antonakis, Andreas N; Pavlopoulos, Georgios A; Theodosiou, Theodosios; Divanach, Peter; Trudgian, David C; Thomas, Benjamin; Papanikolaou, Nikolas; Aivaliotis, Michalis; Acuto, Oreste; Iliopoulos, Ioannis

    2017-07-03

    Profiling of proteome dynamics is crucial for understanding cellular behavior in response to intrinsic and extrinsic stimuli and maintenance of homeostasis. Over the last 20 years, mass spectrometry (MS) has emerged as the most powerful tool for large-scale identification and characterization of proteins. Bottom-up proteomics, the most common MS-based proteomics approach, has always been challenging in terms of data management, processing, analysis and visualization, with modern instruments capable of producing several gigabytes of data out of a single experiment. Here, we present ProteoSign, a freely available web application, dedicated in allowing users to perform proteomics differential expression/abundance analysis in a user-friendly and self-explanatory way. Although several non-commercial standalone tools have been developed for post-quantification statistical analysis of proteomics data, most of them are not end-user appealing as they often require very stringent installation of programming environments, third-party software packages and sometimes further scripting or computer programming. To avoid this bottleneck, we have developed a user-friendly software platform accessible via a web interface in order to enable proteomics laboratories and core facilities to statistically analyse quantitative proteomics data sets in a resource-efficient manner. ProteoSign is available at http://bioinformatics.med.uoc.gr/ProteoSign and the source code at https://github.com/yorgodillo/ProteoSign. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
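
    The statistical core of such a differential-abundance analysis can be illustrated compactly: a per-protein two-sample test on log-transformed intensities, followed by multiple-testing correction. The sketch below is a generic illustration with simulated data, not ProteoSign's actual pipeline.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
control = rng.normal(20.0, 1.0, size=(100, 3))  # log2 intensities
treated = rng.normal(20.0, 1.0, size=(100, 3))
treated[:5] += 2.0                              # five true changes

_, p = stats.ttest_ind(treated, control, axis=1)  # per-protein t-test
reject, q, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
log2_fc = treated.mean(axis=1) - control.mean(axis=1)

for i in np.where(reject)[0]:
    print(f"protein {i}: log2FC = {log2_fc[i]:+.2f}, q = {q[i]:.3g}")
```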

  14. Physiological markers of biased decision-making in problematic Internet users.

    PubMed

    Nikolaidou, Maria; Fraser, Danaë Stanton; Hinvest, Neal

    2016-09-01

    Background and aims Addiction has been reliably associated with biased emotional reactions to risky choices. Problematic Internet use (PIU) is a relatively new concept and its classification as an addiction is debated. Implicit emotional responses were measured in individuals expressing nonproblematic and problematic Internet behaviors while they made risky/ambiguous decisions to explore whether they showed similar responses to those found in agreed-upon addictions. Methods The design of the study was cross-sectional. Participants were adult Internet users (N = 72). All testing took place in the Psychophysics Laboratory at the University of Bath, UK. Participants were given the Iowa Gambling Task (IGT), which provides an index of an individual's ability to process and learn probabilities of reward and loss. Integration of emotions into current decision-making frameworks is vital for optimal performance on the IGT and thus, skin conductance responses (SCRs) to reward, punishment, and in anticipation of both were measured to assess emotional function. Results Performance on the IGT did not differ between the groups of Internet users. However, problematic Internet users expressed increased sensitivity to punishment as revealed by stronger SCRs to trials with higher punishment magnitude. Discussion and conclusions PIU appears to differ from agreed-upon addictions at both the behavioral and physiological levels. However, our data imply that problematic Internet users were more risk-sensitive, a suggestion that needs to be incorporated into any measure and, potentially, any intervention for PIU.

  15. Marijuana withdrawal and aggression among a representative sample of U.S. marijuana users

    PubMed Central

    Smith, Philip H.; Homish, Gregory G.; Leonard, Kenneth E.; Collins, R. Lorraine

    2013-01-01

    Background Previous laboratory-based research suggests that withdrawal from marijuana may cause increased aggression. It is unclear whether this finding extends beyond the laboratory setting to the general population of marijuana users. The purpose of this study was to test a cross-sectional association between marijuana withdrawal symptoms and aggression among a representative sample of U.S. adult marijuana users, and to test whether this association was moderated by previous history of aggression. Methods Data were analyzed from the National Epidemiologic Survey on Alcohol and Related Conditions. Wave Two data (2004–2005) were used for all variables except for history of aggression, which was assessed during the Wave One interview (2001–2002). Two outcomes were examined: self-report general aggression and relationship aggression. Odds ratios for aggression based on withdrawal symptoms and the interaction between withdrawal symptoms and history of aggression were calculated using logistic regression, adjusting for covariates and accounting for the complex survey design. Results Among marijuana users with a history of aggression, marijuana withdrawal was associated with approximately 60% higher odds of past year relationship aggression (p < 0.05). There was no association between withdrawal symptoms and relationship aggression among those without a history of aggression, and no association with general aggression regardless of history of aggression. Conclusions The findings from this study support the notion that laboratory-based increases in aggression due to marijuana withdrawal extend to the general population of marijuana users who have a previous history of aggression. PMID:23380439
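
    The odds ratios reported above come from the usual logistic-regression recipe: fit the model, then exponentiate the coefficients. A toy sketch of that pattern follows; the data are simulated, and the study's covariate adjustment and complex survey weighting are deliberately omitted.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
withdrawal = rng.integers(0, 2, n)   # 1 = withdrawal symptoms
history = rng.integers(0, 2, n)      # 1 = history of aggression
logit = -2.0 + 0.5 * withdrawal + 0.8 * history
aggression = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([withdrawal, history]))
fit = sm.Logit(aggression, X).fit(disp=False)
print(np.exp(fit.params[1:]))  # odds ratios for the two predictors
```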

  16. Red Storm Usage Model: Version 1.12.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jefferson, Karen L.; Sturtevant, Judith E.

    Red Storm is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Sandia National Laboratories (SNL). The Red Storm Usage Model (RSUM) documents the capabilities and the environment provided for the FY05 Tri-Lab Level II Limited Availability Red Storm User Environment Milestone and the FY05 SNL Level II Limited Availability Red Storm Platform Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and SNL. Additionally, the Red Storm Usage Model maps the provided capabilities to the Tri-Lab ASC Computing Environment (ACE) requirements. The ACE requirements reflect the high performance computing requirements for the ASC community and have been updated in FY05 to reflect the community's needs. For each section of the RSUM, Appendix I maps the ACE requirements to the Limited Availability User Environment capabilities and includes a description of ACE requirements met and those requirements that are not met in that particular section. The Red Storm Usage Model, along with the ACE mappings, has been issued and vetted throughout the Tri-Lab community.

  17. User Needs, Benefits, and Integration of Robotic Systems in a Space Station Laboratory

    NASA Technical Reports Server (NTRS)

    Dodd, W. R.; Badgley, M. B.; Konkel, C. R.

    1989-01-01

    The methodology, results and conclusions of all tasks of the User Needs, Benefits, and Integration Study (UNBIS) of Robotic Systems in a Space Station Laboratory are summarized. Study goals included the determination of user requirements for robotics within the Space Station United States Laboratory. In Task 1, three experiments were selected to determine user needs and to allow detailed investigation of microgravity requirements. In Task 2, a NASTRAN analysis of Space Station response to robotic disturbances, and acceleration measurement of a standard industrial robot (Intelledex Model 660), resulted in selection of two ranges of microgravity manipulation: Level 1 (10^-3 to 10^-5 g at greater than 1 Hz) and Level 2 (less than or equal to 10^-6 g at 0.1 Hz). This task included an evaluation of microstepping methods for controlling stepper motors and concluded that an industrial robot actuator can perform milli-g motion without modification. Relative merits of end-effectors and manipulators were studied in Task 3 in order to determine their ability to perform a range of tasks related to the three microgravity experiments. An Effectivity Rating was established for evaluating these robotic system capabilities. Preliminary interface requirements for an orbital flight demonstration were determined in Task 4. Task 5 assessed the impact of robotics.

  18. Personal Electronic Health Records: Understanding User Requirements and Needs in Chronic Cancer Care

    PubMed Central

    Winkler, Eva; Kamradt, Martina; Längst, Gerda; Eckrich, Felicitas; Heinze, Oliver; Bergh, Bjoern; Szecsenyi, Joachim; Ose, Dominik

    2015-01-01

    Background The integration of new information and communication technologies (ICTs) is becoming increasingly important in reorganizing health care. Adapting ICTs as supportive tools to users' needs and daily practices is vital for adoption and use. Objective In order to develop a Web-based personal electronic health record (PEPA), we explored user requirements and needs with regard to desired information and functions. Methods A qualitative study across health care sectors and health professions was conducted in a regional health care setting in Germany. Overall, 10 semistructured focus groups were performed, collecting views of 3 prospective user groups: patients with colorectal cancer (n=12) and representatives from patient support groups (n=2), physicians (n=17), and non-medical HCPs (n=16). Data were audio- and videotaped, transcribed verbatim, and thematically analyzed using qualitative content analysis. Results For both patients and HCPs, it was central to have a tool representing the chronology of illness and its care processes, for example, patients wanted to track their long-term laboratory findings (eg, tumor markers). Designing health information in a patient accessible way was highlighted as important. Users wanted to have general and tumor-specific health information available in a PEPA. Functions such as filtering information and adding information by patients (eg, on their well-being or electronic communication with HCPs via email) were discussed. Conclusions In order to develop a patient/user centered tool that is tailored to user needs, it is essential to address their perspectives. A challenge for implementation will be how to design PEPA’s health data in a patient accessible way. Adequate patient support and technical advice for users have to be addressed. PMID:25998006

  19. Acquisition and production of skilled behavior in dynamic decision-making tasks. Semiannual Status Report M.S. Thesis - Georgia Inst. of Tech., Nov. 1992

    NASA Technical Reports Server (NTRS)

    Kirlik, Alex; Kossack, Merrick Frank

    1993-01-01

    This status report consists of a thesis entitled 'Ecological Task Analysis: A Method for Display Enhancements.' Previous use of various analysis processes for the purpose of display interface design or enhancement has run the risk of failing to improve user performance, because such analyses result in only a sequential listing of user tasks. Adopting an ecological approach to performing the task analysis, however, may result in the necessary modeling of an unpredictable and variable task domain required to improve user performance. Kirlik has proposed an Ecological Task Analysis framework which is designed for this purpose. It is the purpose of this research to measure this framework's effectiveness at enhancing display interfaces in order to improve user performance. Following the proposed framework, an ecological task analysis of experienced users of a complex and dynamic laboratory task, Star Cruiser, was performed. Based on this analysis, display enhancements were proposed and implemented. An experiment was then conducted to compare this new version of Star Cruiser to the original. By measuring user performance at different tasks, it was determined that during early sessions, use of the enhanced display contributed to better user performance compared to that achieved using the original display. Furthermore, the results indicate that the enhancements proposed as a result of the ecological task analysis affected user performance differently depending on whether they aid in the selection of a possible action or in the performance of an action. Generalizations of these findings to larger, more complex systems were avoided since the analysis was performed on only this one particular system.

  20. Review of wireless and wearable electroencephalogram systems and brain-computer interfaces--a mini-review.

    PubMed

    Lin, Chin-Teng; Ko, Li-Wei; Chang, Meng-Hsiu; Duann, Jeng-Ren; Chen, Jing-Ying; Su, Tung-Ping; Jung, Tzyy-Ping

    2010-01-01

    Biomedical signal monitoring systems have rapidly advanced in recent years, propelled by significant advances in electronic and information technologies. The brain-computer interface (BCI) is one of the important research branches and has become a hot topic in the study of neural engineering, rehabilitation, and brain science. Traditionally, most BCI systems use bulky, wired, laboratory-oriented sensing equipment to measure brain activity under well-controlled conditions within a confined space. Using bulky sensing equipment not only is uncomfortable and inconvenient for users, but also impedes their ability to perform routine tasks in daily operational environments. Furthermore, owing to large data volumes, signal processing of BCI systems is often performed off-line using high-end personal computers, hindering the applications of BCI in real-world environments. To be practical for routine use by unconstrained, freely-moving users, BCI systems must be noninvasive, nonintrusive, lightweight and capable of online signal processing. This work reviews recent online BCI systems, focusing especially on wearable, wireless and real-time systems. Copyright 2009 S. Karger AG, Basel.

  1. PIMS sequencing extension: a laboratory information management system for DNA sequencing facilities

    PubMed Central

    2011-01-01

    Background Facilities that provide a service for DNA sequencing typically support large numbers of users and experiment types. The cost of services is often reduced by the use of liquid handling robots but the efficiency of such facilities is hampered because the software for such robots does not usually integrate well with the systems that run the sequencing machines. Accordingly, there is a need for software systems capable of integrating different robotic systems and managing sample information for DNA sequencing services. In this paper, we describe an extension to the Protein Information Management System (PIMS) that is designed for DNA sequencing facilities. The new version of PIMS has a user-friendly web interface and integrates all aspects of the sequencing process, including sample submission, handling and tracking, together with capture and management of the data. Results The PIMS sequencing extension has been in production since July 2009 at the University of Leeds DNA Sequencing Facility. It has completely replaced manual data handling and simplified the tasks of data management and user communication. Samples from 45 groups have been processed with an average throughput of 10000 samples per month. The current version of the PIMS sequencing extension works with Applied Biosystems 3130XL 96-well plate sequencer and MWG 4204 or Aviso Theonyx liquid handling robots, but is readily adaptable for use with other combinations of robots. Conclusions PIMS has been extended to provide a user-friendly and integrated data management solution for DNA sequencing facilities that is accessed through a normal web browser and allows simultaneous access by multiple users as well as facility managers. The system integrates sequencing and liquid handling robots, manages the data flow, and provides remote access to the sequencing results. The software is freely available, for academic users, from http://www.pims-lims.org/. PMID:21385349

  2. User Facilities

    Science.gov Websites

    Los Alamos National Laboratory user facilities and related programs include the Center for Integrated Nanotechnologies (CINT), the Los Alamos Neutron Science Center, the Los Alamos Collaboration for Explosives Detection (LACED), SensorNexus, and the Exascale Computing Project (ECP).

  3. Usability testing of a prototype multi-user telehealth kiosk.

    PubMed

    Courtney, Karen L; Matthews, Judith T; McMillan, Julie M; Person Mecca, Laurel; Smailagic, Asim; Siewiorek, Daniel

    2015-01-01

    The overall purpose of this study was to learn how community-dwelling older adults would interact with our prototype multi-user telehealth kiosk and their views about its usability. Seven subjects participated in laboratory-based usability sessions to evaluate the physical design, appearance, functionality and perceived ease of use of a multi-user telehealth kiosk prototype. During usability testing participants recommended 18 new features (29% of comments), identified 15 software errors (23% of comments) and 29 user interface errors (47% of comments).

  4. Laboratory x-ray micro-computed tomography: a user guideline for biological samples

    PubMed Central

    2017-01-01

    Abstract Laboratory x-ray micro-computed tomography (micro-CT) is a fast-growing method in scientific research applications that allows for non-destructive imaging of morphological structures. This paper provides an easily operated “how to” guide for new potential users and describes the various steps required for successful planning of research projects that involve micro-CT. Background information on micro-CT is provided, followed by relevant setup, scanning, reconstructing, and visualization methods and considerations. Throughout the guide, a Jackson's chameleon specimen, which was scanned at different settings, is used as an interactive example. The ultimate aim of this paper is to make new users familiar with the concepts and applications of micro-CT in an attempt to promote its use in future scientific studies. PMID:28419369

  5. A user's guide to the Mariner 9 television reduced data record

    NASA Technical Reports Server (NTRS)

    Seidman, J. B.; Green, W. B.; Jepsen, P. L.; Ruiz, R. M.; Thorpe, T. E.

    1973-01-01

    The Mariner 9 television experiment used two cameras to photograph Mars from an orbiting spacecraft. For quantitative analysis of the image data transmitted to earth, the pictures were processed by digital computer to remove camera-induced distortions. The removal process was performed by the JPL Image Processing Laboratory (IPL) using calibration data measured during prelaunch testing of the cameras. The Reduced Data Record (RDR) is the set of data which results from the distortion-removal, or decalibration, process. The principal elements of the RDR are numerical data on magnetic tape and photographic data. Numerical data are the result of correcting for geometric and photometric distortions and residual-image effects. Photographic data are reproduced on negative and positive transparency films, strip contact and enlargement prints, and microfiche positive transparency film. The photographic data consist of two versions of each TV frame created by applying two special enhancement processes to the numerical data.
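
    Photometric decalibration of the kind described (removing camera-induced response variations using prelaunch calibration data) is conventionally a dark-subtraction and flat-field division. The sketch below shows that generic operation in numpy; it illustrates the principle only and is not the IPL's actual algorithm.

```python
import numpy as np

def decalibrate(raw, dark, flat):
    """Remove additive (dark) and multiplicative (flat-field)
    camera-induced distortions from an image frame."""
    gain = flat - dark
    gain /= gain.mean()        # normalize the response map
    return (raw - dark) / gain

# Toy uniform frames, purely for illustration:
raw = np.full((4, 4), 120.0)
dark = np.full((4, 4), 20.0)
flat = np.full((4, 4), 220.0)
print(decalibrate(raw, dark, flat))  # uniform 100.0 after correction
```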

  6. Software Engineering Laboratory (SEL) data base reporting software user's guide and system description. Volume 2: Program descriptions

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The structure and functions of each reporting software program for the Software Engineering Laboratory data base are described. Baseline diagrams, module descriptions, and listings of program generation files are included.

  7. Reflectance Experiment Laboratory (RELAB) Description and User's Manual

    NASA Technical Reports Server (NTRS)

    Pieters, Carle M.; Hiroi, Takahiro; Pratt, Steve F.; Patterson, Bill

    2004-01-01

    Spectroscopic data acquired in the laboratory provide the interpretive foundation upon which compositional information about unexplored or unsampled planetary surfaces is derived from remotely obtained reflectance spectra. The RELAB is supported by NASA as a multi-user spectroscopy facility, and laboratory time can be made available at no charge to investigators who are in funded NASA programs. RELAB has two operational spectrometers available to NASA scientists: 1) a near-ultraviolet, visible, and near-infrared bidirectional spectrometer and 2) a near- and mid-infrared FT-IR spectrometer. The overall purpose of the design and operation of the RELAB bidirectional spectrometer is to obtain high-precision, high-spectral-resolution, bidirectional reflectance spectra of Earth and planetary materials. One of the key elements of its design is the ability to measure samples using viewing geometries specified by the user. This allows investigators to simulate, under laboratory conditions, reflectance spectra obtained remotely (i.e., with spaceborne, telescopic, and airborne systems) as well as to investigate geometry-dependent reflectance properties of geologic materials. The Nicolet 740 FT-IR spectrometer currently operates in reflectance mode from 0.9 to 25 µm. Use and scheduling of the RELAB is monitored by a four-member advisory committee. NASA investigators should direct inquiries to the Science Manager or RELAB Operator.

  8. An automated metrics system to measure and improve the success of laboratory automation implementation.

    PubMed

    Benn, Neil; Turlais, Fabrice; Clark, Victoria; Jones, Mike; Clulow, Stephen

    2007-03-01

    The authors describe a system for collecting usage metrics from widely distributed automation systems. An application that records and stores usage data centrally, calculates run times, and charts the data was developed. Data were collected over 20 months from at least 28 workstations. The application was used to plot bar charts of date versus run time for individual workstations, the automation in a specific laboratory, or automation of a specified type. The authors show that revised user training, redeployment of equipment, and running complementary processes on one workstation can increase the average number of runs by up to 20-fold and run times by up to 450%. Active monitoring of usage leads to more effective use of automation. Usage data could be used to determine whether purchasing particular automation was a good investment.
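
    Aggregating such usage metrics is a simple log-analysis exercise; below is a sketch of the date-versus-run-time charting described above, assuming a hypothetical CSV log with one row per run (the file name and column names are invented).

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical log: one row per run with workstation, start, end.
runs = pd.read_csv("automation_runs.csv", parse_dates=["start", "end"])
runs["run_minutes"] = (runs["end"] - runs["start"]).dt.total_seconds() / 60

monthly = (runs
           .groupby([runs["start"].dt.to_period("M"), "workstation"])
           .agg(n_runs=("run_minutes", "size"),
                total_minutes=("run_minutes", "sum")))

monthly["total_minutes"].unstack("workstation").plot.bar()
plt.ylabel("Run time (minutes)")
plt.tight_layout()
plt.savefig("usage_by_workstation.png")
```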

  9. Human factors in telemanipulation: Perspectives from the Oak Ridge National Laboratory experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draper, J.V.

    1994-01-01

    Personnel at the Robotics and Process Systems Division (RPSD) of the Oak Ridge National Laboratory (ORNL) have extensive experience designing, building, and operating teleoperators for a variety of settings, including space, battlefields, nuclear fuel reprocessing plants, and hazardous waste retrieval. In the course of the last decade and a half, the RPSD designed, built, and operated 4 telemanipulators (M-2, ASM, LTM, CESAR arm) and operated another half dozen (M-8, Model 50, TOS SM-229, RM-10, PaR 5000, BilArm 83A). During this period, human factors professionals have been closely integrated with RPSD design teams, investigating telemanipulator feedback and feed forward, designing cockpits and control rooms, training users and designers, and helping to develop performance specifications for telemanipulators. This paper presents a brief review of this and other work, with an aim towards providing perspectives on some of the human factors aspects of telemanipulation. The first section of the paper examines user tasks during supervisory control and discusses how telemanipulator responsiveness determines the appropriate control metaphor for continuous manual control. The second section provides an ecological perspective on telemanipulator feedback and feed-forward. The third section briefly describes the RPSD control room design approach and how design projects often serve as systems integrators.

  10. A clinically useful diabetes electronic medical record: lessons from the past; pointers toward the future.

    PubMed

    Gorman, C; Looker, J; Fisk, T; Oelke, W; Erickson, D; Smith, S; Zimmerman, B

    1996-01-01

    We have analysed the deficiencies of paper medical records in facilitating the care of patients with diabetes and have developed an electronic medical record that corrects some of them. The diabetes electronic medical record (DEMR) is designed to facilitate the work of a busy diabetes clinic. Design principles include heavy reliance on graphic displays of laboratory and clinical data, consistent color coding and aggregation of data needed to facilitate the different types of clinical encounter (initial consultation, continuing care visit, insulin adjustment visit, dietitian encounter, nurse educator encounter, obstetric patient, transplant patient, visits for problems unrelated to diabetes). Data input is by autoflow from the institutional laboratories, by desk attendants or on-line by all users. Careful attention has been paid to making data entry a point and click process wherever possible. Opportunity for free text comment is provided on every screen. On completion of the encounter a narrative text summary of the visit is generated by the computer and is annotated by the care giver. Currently there are about 7800 patients in the system. Remaining challenges include the adaptation of the system to accommodate the occasional user, development of portable laptop derivatives that remain compatible with the parent system and improvements in the screen structure and graphic display formats.

  11. User Facilities | Argonne National Laboratory

    Science.gov Websites

    More than 7,000 scientists conduct experiments at Argonne user facilities, in fields including biology and medicine. Associated facilities and institutes include the Institute for Genomics and Systems Biology (IGSB), the Institute for Molecular Engineering (IME), the Joint Center for Energy Storage Research (JCESR), and the Structural Biology Center (SBC).

  12. Everyday and prospective memory deficits in ecstasy/polydrug users.

    PubMed

    Hadjiefthyvoulou, Florentia; Fisk, John E; Montgomery, Catharine; Bridges, Nikola

    2011-04-01

    The impact of ecstasy/polydrug use on real-world memory (i.e. everyday memory, cognitive failures and prospective memory [PM]) was investigated in a sample of 42 ecstasy/polydrug users and 31 non-ecstasy users. Laboratory-based PM tasks were administered along with self-reported measures of PM to test whether any ecstasy/polydrug-related impairment on the different aspects of PM was present. Self-reported measures of everyday memory and cognitive failures were also administered. Ecstasy/polydrug associated deficits were observed on both laboratory and self-reported measures of PM and everyday memory. The present study extends previous research by demonstrating that deficits in PM are real and cannot be simply attributed to self-misperceptions. The deficits observed reflect some general capacity underpinning both time- and event-based PM contexts and are not task specific. Among this group of ecstasy/polydrug users recreational use of cocaine was also prominently associated with PM deficits. Further research might explore the differential effects of individual illicit drugs on real-world memory.

  13. The impact of SLMTA in improving laboratory quality systems in the Caribbean Region

    PubMed Central

    Gordon, Floris; Irving, Yvette; Whyms, Ismae; Parris, Keith; Beckles, Songee; Maruta, Talkmore; Ndlovu, Nqobile; Albalak, Rachel; Alemnji, George

    2014-01-01

    Background Past efforts to improve laboratory quality systems and to achieve accreditation for better patient care in the Caribbean Region have been slow. Objective To describe the impact of the Strengthening of Laboratory Management Toward Accreditation (SLMTA) training programme and mentorship amongst five clinical laboratories in the Caribbean after 18 months. Method Five national reference laboratories from four countries participated in the SLMTA programme that incorporated classroom teaching and implementation of improvement projects. Mentors were assigned to the laboratories to guide trainees on their improvement projects and to assist in the development of Quality Management Systems (QMS). Audits were conducted at baseline, six months, exit (at 12 months) and post-SLMTA (at 18 months) using the Stepwise Laboratory Quality Improvement Process Towards Accreditation (SLIPTA) checklist to measure changes in implementation of the QMS during the period. At the end of each audit, a comprehensive implementation plan was developed in order to address gaps. Results Baseline audit scores ranged from 19% to 52%, corresponding to 0 stars on the SLIPTA five-star scale. After 18 months, one laboratory reached four stars, two reached three stars and two reached two stars. There was a corresponding decrease in nonconformities and development of over 100 management and technical standard operating procedures in each of the five laboratories. Conclusion The tremendous improvement in these five Caribbean laboratories shows that SLMTA coupled with mentorship is an effective, user-friendly, flexible and customisable approach to the implementation of laboratory QMS. It is recommended that other laboratories in the region consider using the SLMTA training programme as they engage in quality systems improvement and preparation for accreditation. PMID:27066396
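
    The star ratings referred to above are derived from SLIPTA checklist percentage scores. The sketch below uses the commonly cited score-to-star bands as an assumption; the current checklist revision should be consulted before relying on them.

```python
def slipta_stars(score_percent: float) -> int:
    """Map a SLIPTA checklist score to a star rating using commonly
    cited bands (assumed here; verify against the current checklist)."""
    bands = [(95, 5), (85, 4), (75, 3), (65, 2), (55, 1)]
    for threshold, stars in bands:
        if score_percent >= threshold:
            return stars
    return 0

# The baseline scores of 19%-52% reported above all map to 0 stars:
print([slipta_stars(s) for s in (19, 52, 70, 88, 96)])  # [0, 0, 2, 4, 5]
```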

  14. Semi-automated Anatomical Labeling and Inter-subject Warping of High-Density Intracranial Recording Electrodes in Electrocorticography.

    PubMed

    Hamilton, Liberty S; Chang, David L; Lee, Morgan B; Chang, Edward F

    2017-01-01

    In this article, we introduce img_pipe, our open source python package for preprocessing of imaging data for use in intracranial electrocorticography (ECoG) and intracranial stereo-EEG analyses. The process of electrode localization, labeling, and warping for use in ECoG currently varies widely across laboratories, and it is usually performed with custom, lab-specific code. This python package aims to provide a standardized interface for these procedures, as well as code to plot and display results on 3D cortical surface meshes. It gives the user an easy interface to create anatomically labeled electrodes that can also be warped to an atlas brain, starting with only a preoperative T1 MRI scan and a postoperative CT scan. We describe the full capabilities of our imaging pipeline and present a step-by-step protocol for users.
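
    Conceptually, the warping step maps electrode coordinates from the subject's anatomy into a common atlas space. The numpy sketch below shows only the simplest (affine) version of that mapping with toy values; it is not img_pipe's API, and real inter-subject warps are typically nonlinear and handled through FreeSurfer.

```python
import numpy as np

def apply_affine(affine: np.ndarray, coords: np.ndarray) -> np.ndarray:
    """Map an (N, 3) array of electrode coordinates through a 4x4
    affine transform (rotation/scale/translation) into atlas space."""
    homogeneous = np.column_stack([coords, np.ones(len(coords))])
    return (homogeneous @ affine.T)[:, :3]

subject_to_atlas = np.eye(4)
subject_to_atlas[:3, 3] = [2.0, -1.5, 0.5]    # toy translation (mm)
electrodes = np.array([[31.2, -44.0, 12.7],
                       [33.8, -41.9, 13.1]])  # toy RAS coordinates (mm)
print(apply_affine(subject_to_atlas, electrodes))
```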

  15. Analysis of severe storm data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.

    1983-01-01

    The Mesoscale Analysis and Space Sensor (MASS) Data Management and Analysis System developed by Atsuko Computing International (ACI) on the MASS HP-1000 Computer System within the Systems Dynamics Laboratory of the Marshall Space Flight Center is described. The MASS Data Management and Analysis System was successfully implemented and utilized daily by atmospheric scientists to graphically display and analyze large volumes of conventional and satellite derived meteorological data. The scientists can process interactively various atmospheric data (Sounding, Single Level, Gird, and Image) by utilizing the MASS (AVE80) share common data and user inputs, thereby reducing overhead, optimizing execution time, and thus enhancing user flexibility, useability, and understandability of the total system/software capabilities. In addition ACI installed eight APPLE III graphics/imaging computer terminals in individual scientist offices and integrated them into the MASS HP-1000 Computer System thus providing significant enhancement to the overall research environment.

  16. MIRATE: MIps RATional dEsign Science Gateway.

    PubMed

    Busato, Mirko; Distefano, Rosario; Bates, Ferdia; Karim, Kal; Bossi, Alessandra Maria; López Vilariño, José Manuel; Piletsky, Sergey; Bombieri, Nicola; Giorgetti, Alejandro

    2018-06-13

    Molecularly imprinted polymers (MIPs) are high affinity robust synthetic receptors, which can be optimally synthesized and manufactured more economically than their biological equivalents (i.e. antibody). In MIPs production, rational design based on molecular modeling is a commonly employed technique. This mostly aids in (i) virtual screening of functional monomers (FMs), (ii) optimization of monomer-template ratio, and (iii) selectivity analysis. We present MIRATE, an integrated science gateway for the intelligent design of MIPs. By combining and adapting multiple state-of-the-art bioinformatics tools into automated and innovative pipelines, MIRATE guides the user through the entire process of MIPs' design. The platform allows the user to fully customize each stage involved in the MIPs' design, with the main goal to support the synthesis in the wet-laboratory. MIRATE is freely accessible with no login requirement at http://mirate.di.univr.it/. All major browsers are supported.

  17. Semi-automated Anatomical Labeling and Inter-subject Warping of High-Density Intracranial Recording Electrodes in Electrocorticography

    PubMed Central

    Hamilton, Liberty S.; Chang, David L.; Lee, Morgan B.; Chang, Edward F.

    2017-01-01

    In this article, we introduce img_pipe, our open source python package for preprocessing of imaging data for use in intracranial electrocorticography (ECoG) and intracranial stereo-EEG analyses. The process of electrode localization, labeling, and warping for use in ECoG currently varies widely across laboratories, and it is usually performed with custom, lab-specific code. This python package aims to provide a standardized interface for these procedures, as well as code to plot and display results on 3D cortical surface meshes. It gives the user an easy interface to create anatomically labeled electrodes that can also be warped to an atlas brain, starting with only a preoperative T1 MRI scan and a postoperative CT scan. We describe the full capabilities of our imaging pipeline and present a step-by-step protocol for users. PMID:29163118

  18. Evaluation of the performance of the OneTouch Select Plus blood glucose test system against ISO 15197:2013.

    PubMed

    Setford, Steven; Smith, Antony; McColl, David; Grady, Mike; Koria, Krisna; Cameron, Hilary

    2015-01-01

    Assess laboratory and in-clinic performance of the OneTouch Select® Plus test system against the ISO 15197:2013 standard for measurement of blood glucose. System performance was assessed in the laboratory against key patient, environmental and pharmacologic factors. User performance was assessed in clinic by system-naïve lay-users. Healthcare professionals assessed system accuracy on diabetes subjects in clinic. The system demonstrated high levels of performance, meeting ISO 15197:2013 requirements in laboratory testing (precision, linearity, hematocrit, temperature, humidity and altitude). System performance was tested against 28 interferents, with an adverse interfering effect only being recorded for pralidoxime iodide. Clinic user performance results fulfilled ISO 15197:2013 accuracy criteria. Subjects agreed that the color range indicator clearly showed if they were low, in-range or high and helped them better understand glucose results. The system evaluated is accurate and meets all ISO 15197:2013 requirements as per the tests described. The color range indicator helped subjects understand glucose results and supports patients in following healthcare professional recommendations on glucose targets.
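
    The headline ISO 15197:2013 system-accuracy criterion is commonly summarized as: at least 95% of meter results within ±15 mg/dL of the reference below 100 mg/dL, or within ±15% at or above 100 mg/dL. The sketch below encodes that summary only; the standard contains further requirements (e.g., consensus error grid analysis) that it omits.

```python
def within_limits(meter: float, reference: float) -> bool:
    """ISO 15197:2013 headline accuracy limits (mg/dL), as commonly
    summarized; consult the standard for the full requirements."""
    if reference < 100.0:
        return abs(meter - reference) <= 15.0
    return abs(meter - reference) <= 0.15 * reference

def passes_accuracy(pairs) -> bool:
    """True if at least 95% of (meter, reference) pairs are in limits."""
    hits = sum(within_limits(m, r) for m, r in pairs)
    return hits / len(pairs) >= 0.95

# Toy data, purely for illustration:
print(passes_accuracy([(92, 88), (250, 230), (105, 118)]))  # True
```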

  19. External Quality Assurance Programs Managed by the U.S. Geological Survey in Support of the National Atmospheric Deposition Program/Mercury Deposition Network

    USGS Publications Warehouse

    Latysh, Natalie E.; Wetherbee, Gregory A.

    2007-01-01

    The U.S. Geological Survey (USGS) Branch of Quality Systems operates external quality assurance programs for the National Atmospheric Deposition Program/Mercury Deposition Network (NADP/MDN). Beginning in 2004, three programs have been implemented: the system blank program, the interlaboratory comparison program, and the blind audit program. Each program was designed to measure error contributed by specific components in the data-collection process. The system blank program assesses contamination that may result from sampling equipment, field exposure, and routine handling and processing of the wet-deposition samples. The interlaboratory comparison program evaluates bias and precision of analytical results produced by the Mercury Analytical Laboratory (HAL) for the NADP/MDN, operated by Frontier GeoSciences, Inc. The HAL's performance is compared with the performance of five other laboratories. The blind audit program assesses bias and variability of MDN data produced by the HAL using solutions disguised as environmental samples to ascertain true laboratory performance. This report documents the implementation of quality assurance procedures for the NADP/MDN and the operating procedures for each of the external quality assurance programs conducted by the USGS. The USGS quality assurance information provides a measure of confidence to NADP/MDN data users that measurement variability is distinguished from environmental signals.

  20. Theory and High-Energy-Density Laser Experiments Relevant to Accretion Processes in Cataclysmic Variables

    NASA Astrophysics Data System (ADS)

    Krauland, Christine; Drake, R.; Loupias, B.; Falize, E.; Busschaert, C.; Ravasio, A.; Yurchak, R.; Pelka, A.; Koenig, M.; Kuranz, C. C.; Plewa, T.; Huntington, C. M.; Kaczala, D. N.; Klein, S.; Sweeney, R.; Villete, B.; Young, R.; Keiter, P. A.

    2012-05-01

    We present results from high-energy-density (HED) laboratory experiments that explore the contribution of radiative shock waves to the evolving dynamics of the cataclysmic variable (CV) systems in which they reside. CVs can be classified under two main categories, non-magnetic and magnetic. In the process of accretion, both types involve strongly radiating shocks that provide the main source of radiation in the binary systems. This radiation can cause varying structure to develop depending on the optical properties of the material on either side of the shock. The ability of high-intensity lasers to create large energy densities in targets of millimeter-scale volume makes it feasible to create similar radiative shocks in the laboratory. We provide an overview of both CV systems and their connection to laboratory experiments designed and executed at two laser facilities. Available data and accompanying simulations will likewise be shown. Funded by the NNSA-DS and SC-OFES Joint Prog. in High-Energy-Density Lab. Plasmas, by the Nat. Laser User Facility Prog. in NNSA-DS and by the Predictive Sci. Acad. Alliances Prog. in NNSA-ASC, under grant numbers DE-FG52-09NA29548, DE-FG52-09NA29034, and DE-FC52-08NA28616.

  1. Phaedra, a protocol-driven system for analysis and validation of high-content imaging and flow cytometry.

    PubMed

    Cornelissen, Frans; Cik, Miroslav; Gustin, Emmanuel

    2012-04-01

    High-content screening has brought new dimensions to cellular assays by generating rich data sets that characterize cell populations in great detail and detect subtle phenotypes. To derive relevant, reliable conclusions from these complex data, it is crucial to have informatics tools supporting quality control, data reduction, and data mining. These tools must reconcile the complexity of advanced analysis methods with the user-friendliness demanded by the user community. After review of existing applications, we realized the possibility of adding innovative new analysis options. Phaedra was developed to support workflows for drug screening and target discovery, interact with several laboratory information management systems, and process data generated by a range of techniques including high-content imaging, multicolor flow cytometry, and traditional high-throughput screening assays. The application is modular and flexible, with an interface that can be tuned to specific user roles. It offers user-friendly data visualization and reduction tools for HCS but also integrates Matlab for custom image analysis and the Konstanz Information Miner (KNIME) framework for data mining. Phaedra features efficient JPEG2000 compression and full drill-down functionality from dose-response curves down to individual cells, with exclusion and annotation options, cell classification, statistical quality controls, and reporting.
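
    The drill-down from dose-response curves to individual cells rests on routine curve fitting; the sketch below fits the standard four-parameter logistic model with scipy, on invented data, as a generic illustration rather than Phaedra's own fitting code.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    """Four-parameter logistic dose-response model."""
    return bottom + (top - bottom) / (1.0 + (ec50 / x) ** hill)

dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])   # uM
resp = np.array([2.0, 3.1, 8.5, 30.0, 70.0, 92.0, 98.0])  # % activity

params, _ = curve_fit(four_pl, dose, resp, p0=[0.0, 100.0, 0.5, 1.0])
bottom, top, ec50, hill = params
print(f"EC50 = {ec50:.3g} uM, Hill slope = {hill:.2f}")
```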

  2. AccessScope project: Accessible light microscope for users with upper limb mobility or visual impairments.

    PubMed

    Mansoor, Awais; Ahmed, Wamiq M; Samarapungavan, Ala; Cirillo, John; Schwarte, David; Robinson, J Paul; Duerstock, Bradley S

    2010-01-01

    A web-based application was developed to remotely view slide specimens and control all functions of a research-level light microscopy workstation, called AccessScope. Students and scientists with upper limb mobility and visual impairments are often unable to use a light microscope by themselves and must depend on others to operate it. Users with upper limb mobility impairments and low vision were recruited to assist in the design process of the AccessScope personal computer (PC) user interface. Participants with these disabilities were evaluated in their ability to use AccessScope to perform microscopy tasks. AccessScope usage was compared with inspecting prescanned slide images by grading participants' identification and understanding of histological features and knowledge of microscope operation. With AccessScope, subjects were able to independently perform common light microscopy functions through an Internet browser by employing different PC pointing devices or accessibility software according to individual abilities. Subjects answered more histology and microscope usage questions correctly after first participating in an AccessScope test session. AccessScope allowed users with upper limb or visual impairments to successfully perform light microscopy without assistance. This unprecedented capability is crucial for students and scientists with disabilities to perform laboratory coursework or microscope-based research and pursue science, technology, engineering, and mathematics fields.

  3. The new NCPSS BL19U2 beamline at the SSRF for small-angle X-ray scattering from biological macromolecules in solution.

    PubMed

    Li, Na; Li, Xiuhong; Wang, Yuzhu; Liu, Guangfeng; Zhou, Ping; Wu, Hongjin; Hong, Chunxia; Bian, Fenggang; Zhang, Rongguang

    2016-10-01

    The beamline BL19U2 is located in the Shanghai Synchrotron Radiation Facility (SSRF) and is its first beamline dedicated to biological material small-angle X-ray scattering (BioSAXS). The X-rays are generated by an undulator, which provides high brilliance for the BL19U2 end stations. A flat double-crystal silicon (111) monochromator is used in BL19U2, with a tunable monochromatic photon energy ranging from 7 to 15 keV. To meet the rapidly growing demands of crystallographers, biochemists and structural biologists, the BioSAXS beamline allows manual and automatic sample loading/unloading. A Pilatus 1M detector (Dectris) is employed for data collection, characterized by a high dynamic range and a short readout time. The highly automated data processing pipeline SASFLOW was integrated into BL19U2, with help from the BioSAXS group of the European Molecular Biology Laboratory (EMBL, Hamburg), which provides a user-friendly interface for data processing. The BL19U2 beamline was officially opened to users in March 2015. To date, feedback from users has been positive and the number of experimental proposals at BL19U2 is increasing. A description of the new BioSAXS beamline and the setup characteristics is given, together with examples of data obtained.
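
    For context, the momentum-transfer range a SAXS station covers follows from the photon energy and the sample-detector geometry. The short sketch below applies the standard relations lambda[Angstrom] = 12.398 / E[keV] and q = 4*pi*sin(theta)/lambda with illustrative numbers; it is not taken from the BL19U2 software.

        import math

        def wavelength_angstrom(energy_keV):
            """lambda [Angstrom] = 12.398 / E [keV] (hc expressed in keV*Angstrom)."""
            return 12.398 / energy_keV

        def q_inverse_angstrom(energy_keV, radius_mm, detector_distance_mm):
            """Scattering vector magnitude q = 4*pi*sin(theta)/lambda for a pixel
            at radial distance r from the beam centre (2*theta from geometry)."""
            two_theta = math.atan(radius_mm / detector_distance_mm)
            return 4.0 * math.pi * math.sin(two_theta / 2.0) / wavelength_angstrom(energy_keV)

        # Illustrative: 10 keV photons, pixel 80 mm off-axis, detector at 2000 mm.
        print(f"q = {q_inverse_angstrom(10.0, 80.0, 2000.0):.4f} A^-1")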

  4. Diagnostics aid for mass spectrometer trouble-shooting

    NASA Astrophysics Data System (ADS)

    Filby, E. E.; Rankin, R. A.; Webb, G. W.

    The MS Expert system provides problem diagnostics for instruments used in the Mass Spectrometry Laboratory (MSL). The most critical results generated on these mass spectrometers are the uranium concentration and isotopic content data used for process control and materials accountability at the Idaho Chemical Processing Plant. The two purposes of the system are: (1) to minimize instrument downtime and thereby provide the best possible support to the Plant, and (2) to improve long-term data quality. This system combines the knowledge of several experts on mass spectrometry to provide a diagnostic tool, and can make these skills available on a more timely basis. It integrates code written in the Pascal language with a knowledge base entered into a commercial expert system shell. The user performs some preliminary status checks, and then selects from among several broad diagnostic categories. These initial steps provide input to the rule base. The overall analysis provides the user with a set of possible solutions to the observed problems, graded as to their probabilities. Besides the trouble-shooting benefits expected from this system, it will also provide structured diagnostic training for lab personnel. In addition, development of the system knowledge base has already produced a better understanding of instrument behavior. Two key findings are that a good user interface is necessary for full acceptance of the tool, and a development system should include standard programming capabilities as well as the expert system shell.
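
    A minimal sketch of the kind of graded rule evaluation such a shell performs is shown below; the symptoms, faults, and weights are hypothetical stand-ins, not the MS Expert knowledge base.

        # Hypothetical rules mapping observed symptoms to candidate faults,
        # each with a rough prior weight; all names are illustrative only.
        RULES = [
            ({"low_ion_current", "filament_ok"}, "dirty ion source", 0.6),
            ({"low_ion_current"}, "failing filament", 0.3),
            ({"unstable_baseline"}, "vacuum leak", 0.7),
            ({"unstable_baseline", "high_pressure"}, "vacuum leak", 0.9),
        ]

        def diagnose(observations):
            """Return candidate faults graded by the strongest matching rule."""
            scores = {}
            for conditions, fault, weight in RULES:
                if conditions <= observations:  # all rule conditions observed
                    scores[fault] = max(scores.get(fault, 0.0), weight)
            return sorted(scores.items(), key=lambda kv: -kv[1])

        print(diagnose({"unstable_baseline", "high_pressure", "filament_ok"}))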

  5. Comparing Binaural Pre-processing Strategies II

    PubMed Central

    Hu, Hongmei; Krawczyk-Becker, Martin; Marquardt, Daniel; Herzke, Tobias; Coleman, Graham; Adiloğlu, Kamil; Bomke, Katrin; Plotz, Karsten; Gerkmann, Timo; Doclo, Simon; Kollmeier, Birger; Hohmann, Volker; Dietz, Mathias

    2015-01-01

    Several binaural audio signal enhancement algorithms were evaluated with respect to their potential to improve speech intelligibility in noise for users of bilateral cochlear implants (CIs). 50% speech reception thresholds (SRT50) were assessed using an adaptive procedure in three distinct, realistic noise scenarios. All scenarios were highly nonstationary, complex, and included a significant amount of reverberation. Other aspects, such as the perfectly frontal target position, were idealized laboratory settings, allowing the algorithms to perform better than in corresponding real-world conditions. Eight bilaterally implanted CI users, wearing devices from three manufacturers, participated in the study. In all noise conditions, a substantial improvement in SRT50 compared to the unprocessed signal was observed for most of the algorithms tested, with the largest improvements generally provided by binaural minimum variance distortionless response (MVDR) beamforming algorithms. The largest overall improvement in speech intelligibility was achieved by an adaptive binaural MVDR in a spatially separated, single competing talker noise scenario. A no-pre-processing condition and adaptive differential microphones without a binaural link served as the two baseline conditions. SRT50 improvements provided by the binaural MVDR beamformers surpassed the performance of the adaptive differential microphones in most cases. Speech intelligibility improvements predicted by instrumental measures were shown to account for some but not all aspects of the perceptually obtained SRT50 improvements measured in bilaterally implanted CI users. PMID:26721921
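
    The MVDR weights referred to above have the closed form w = R^-1 d / (d^H R^-1 d) for noise covariance R and steering vector d. The sketch below evaluates this with NumPy on synthetic data and checks the distortionless constraint; it is a textbook illustration, not the algorithm evaluated in the study.

        import numpy as np

        def mvdr_weights(noise_cov, steering):
            """MVDR beamformer: minimize output noise power subject to a
            distortionless response in the target direction."""
            r_inv_d = np.linalg.solve(noise_cov, steering)
            return r_inv_d / (steering.conj().T @ r_inv_d)

        # Illustrative 4-microphone example with a diagonally loaded covariance.
        rng = np.random.default_rng(0)
        noise = rng.standard_normal((4, 1000)) + 1j * rng.standard_normal((4, 1000))
        R = noise @ noise.conj().T / 1000 + 1e-3 * np.eye(4)
        d = np.ones(4, dtype=complex)  # steering vector for a frontal target
        w = mvdr_weights(R, d)
        print("distortionless check:", np.round(w.conj().T @ d, 3))  # ~1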

  6. Laboratory Investigations for the Role of Flushing Media in Diamond Drilling of Marble

    NASA Astrophysics Data System (ADS)

    Bhatnagar, A.; Khandelwal, Manoj; Rao, K. U. M.

    2011-05-01

    Marble has been used as a natural stone for decorative purposes for ages. Marble is a crystalline rock, composed predominantly of calcite, dolomite or serpentine. The presence of impurities imparts decorative patterns and colors. Diamond-based operations are used extensively in the mining and processing of marble. Marble is mined out in the form of cuboid-shaped blocks and has to undergo extensive processing to make it suitable for end users. The processing operations include slabbing, sizing, polishing, etc. Diamond drilling is also commonly used for the exploration of different mineral deposits throughout the world. In this paper an attempt has been made to enhance the performance of diamond drilling on marble rocks by adding polyethylene oxide (PEO) to the flushing water. The effect of PEO added to the drilling water was studied in the laboratory by varying different machine parameters and the flushing-media concentration. The responses were rate of penetration and torque at the bit-rock interface. Different physico-mechanical properties of marble were also determined. It was found that flushing water with added PEO can substantially enhance penetration rates and reduce the torque developed at the bit-rock interface as compared to plain flushing water.

  7. Improvised double-embedding technique of minute biopsies: a mega boon to histopathology laboratory.

    PubMed

    Yadav, Lokendra; Thomas, Sarega; Kini, Usha

    2015-01-01

    Optimal orientation of minute mucosal biopsies is essential for a definite diagnosis in gastrointestinal pathology or to visualize neural plexuses in Hirschsprung disease. The problem of the minute size of the biopsy and its orientation gets compounded when specimens are from neonates and mandates exhaustive strip cuts, thus delaying reporting. A modified agar-paraffin technique aims to make tissue embedding efficient and user-friendly by inking mapping biopsies (one or more), either fresh or fixed, with surgical coloring inks, embedding them first in agar after orientation, and following with processing, re-embedding in paraffin wax, sectioning and staining. The tissues in the agar-paraffin block were found to be well processed, firm, held secure and well preserved. The blocks were easy to cut, with serial sections of thickness 2-3 μm that were easy to spread. The colored inks remained permanently on the tissues both in the block as well as on the sections, which helped in easy identification of tissues. Agar did not interfere with any stain such as Hematoxylin and Eosin or with histochemical stains, enzyme histochemistry or immunohistochemistry. Inking biopsies and pooling them in a block when obtained from the same patient reduced the number of tissue blocks. The modified agar-paraffin embedding technique is a simple, reliable, user-friendly method that can greatly improve the quality of diagnostic information from minute biopsies through optimal orientation, better quality of sections, faster turnaround time and cost-effectiveness by economizing on the number of paraffin blocks, manpower, chemical reagents and laboratory infrastructure.

  8. Access to the NCAR Research Data Archive via the Globus Data Transfer Service

    NASA Astrophysics Data System (ADS)

    Cram, T.; Schuster, D.; Ji, Z.; Worley, S. J.

    2014-12-01

    The NCAR Research Data Archive (RDA; http://rda.ucar.edu) contains a large and diverse collection of meteorological and oceanographic observations, operational and reanalysis outputs, and remote sensing datasets to support atmospheric and geoscience research. The RDA contains more than 600 dataset collections which support the varying needs of a diverse user community. The number of RDA users is increasing annually, and the most popular method used to access the RDA data holdings is through web-based protocols, such as wget and cURL based scripts. In the year 2013, 10,000 unique users downloaded more than 820 terabytes of data from the RDA, and customized data products were prepared for more than 29,000 user-driven requests. In order to further support this increase in web download usage, the RDA is implementing the Globus data transfer service (www.globus.org) to provide a GridFTP data transfer option for the user community. The Globus service is broadly scalable, has an easy-to-install client, is sustainably supported, and provides a robust, efficient, and reliable data transfer option for RDA users. This paper highlights the main functionality and usefulness of the Globus data transfer service for accessing the RDA holdings. The Globus data transfer service, developed and supported by the Computation Institute at The University of Chicago and Argonne National Laboratory, uses GridFTP as a fast, secure, and reliable method for transferring data between two endpoints. A Globus user account is required to use this service, and data transfer endpoints are defined on the Globus web interface. In the RDA use cases, the access endpoint is created on the RDA data server at NCAR. The data user defines the receiving endpoint for the data transfer, which can be the main file system at a host institution, a personal workstation, or a laptop. Once initiated, the data transfer runs as an unattended background process managed by Globus, which ensures that the transfer is accurately fulfilled. Users can monitor the data transfer progress on the Globus web interface and optionally receive an email notification once it is complete. Globus also provides a command-line interface to support scripted transfers, which can be useful when embedded in data processing workflows.
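
    A scripted transfer of the kind described might look like the following sketch using the globus-sdk Python package; the access token, endpoint UUIDs, and file paths below are placeholders, not real RDA identifiers.

        import globus_sdk  # pip install globus-sdk

        # Assumptions: a valid OAuth2 transfer token and the UUIDs of the
        # source (RDA) and destination endpoints; all three are hypothetical.
        tc = globus_sdk.TransferClient(
            authorizer=globus_sdk.AccessTokenAuthorizer("TRANSFER_TOKEN"))

        task = globus_sdk.TransferData(tc, "SRC_ENDPOINT_UUID", "DST_ENDPOINT_UUID",
                                       label="RDA dataset pull")
        task.add_item("/path/on/rda/file.grib2", "/home/user/file.grib2")

        result = tc.submit_transfer(task)  # runs unattended in the background
        print("task id:", result["task_id"])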

  9. Space Radiation Effects Laboratory

    NASA Technical Reports Server (NTRS)

    1969-01-01

    The SREL User's Handbook is designed to provide information needed by those who plan experiments involving the accelerators at this laboratory. Thus the Handbook will contain information on the properties of the machines, the beam parameters, the facilities and services provided for experimenters, etc. This information will be brought up to date as new equipment is added and modifications accomplished. This Handbook is influenced by the many excellent models prepared at other accelerator laboratories. In particular, the CERN Synchrocyclotron User's Handbook (November 1967) is closely followed in some sections, since the SREL Synchrocyclotron is a duplicate of the CERN machine. We wish to thank Dr. E. G. Michaelis for permission to draw so heavily on his work, particularly in Section II of this Handbook. We hope that the Handbook will prove useful, and will welcome suggestions and criticism.

  10. Single-centre experience with Renal PatientView, a web-based system that provides patients with access to their laboratory results.

    PubMed

    Woywodt, Alexander; Vythelingum, Kervina; Rayner, Scott; Anderton, John; Ahmed, Aimun

    2014-10-01

    Renal PatientView (RPV) is a novel, web-based system in the UK that provides patients with access to their laboratory results, in conjunction with patient information. To study how renal patients within our centre access and use RPV. We sent out questionnaires in December 2011 to all 651 RPV users under our care. We collected information on aspects such as the frequency and timing of RPV usage, the parameters viewed by users, and the impact of RPV on their care. A total of 295 (45 %) questionnaires were returned. The predominant users of RPV were transplant patients (42 %) followed by pre-dialysis chronic kidney disease patients (37 %). Forty-two percent of RPV users accessed their results after their clinic appointments, 38 % prior to visiting the clinic. The majority of patients (76 %) had used the system to discuss treatment with their renal physician, while 20 % of patients gave permission to other members of their family to use RPV to monitor results on their behalf. Most users (78 %) reported accessing RPV on average 1-5 times/month. Most patients used RPV to monitor their kidney function, 81 % to check creatinine levels, 57 % to check potassium results. Ninety-two percent of patients found RPV easy to use and 93 % felt that overall the system helps them in taking care of their condition; 53 % of patients reported high satisfaction with RPV. Our results provide interesting insight into use of a system that gives patients web-based access to laboratory results. The fact that 20 % of patients delegate access to relatives also warrants further study. We propose that online access to laboratory results should be offered to all renal patients, although clinicians need to be mindful of the 'digital divide', i.e. part of the population that is not amenable to IT-based strategies for patient empowerment.

  11. Inventory and Billing Systems for Multiple Users.

    ERIC Educational Resources Information Center

    Frazier, Lavon

    Washington State University developed a comprehensive supplies inventory system and a generalized billing system with multiple users in mind. The supplies inventory control system developed for Central Stores, a self-sustaining service center that purchases and warehouses office, laboratory, and hardware supplies, was called AIMS, An Inventory…

  12. Audiovisual Speech Web-Lab: an Internet teaching and research laboratory.

    PubMed

    Gordon, M S; Rosenblum, L D

    2001-05-01

    Internet resources now enable laboratories to make full-length experiments available on line. A handful of existing web sites offer users the ability to participate in experiments and generate usable data. We have integrated this technology into a web site that also provides full discussion of the theoretical and methodological aspects of the experiments using text and simple interactive demonstrations. The content of the web site (http://www.psych.ucr.edu/avspeech/lab) concerns audiovisual speech perception and its relation to face perception. The site is designed to be useful for users of multiple interests and levels of expertise.

  13. The Impact of Audiovisual Feedback on the Learning Outcomes of a Remote and Virtual Laboratory Class

    ERIC Educational Resources Information Center

    Lindsay, E.; Good, M.

    2009-01-01

    Remote and virtual laboratory classes are an increasingly prevalent alternative to traditional hands-on laboratory experiences. One of the key issues with these modes of access is the provision of adequate audiovisual (AV) feedback to the user, which can be a complicated and resource-intensive challenge. This paper reports on a comparison of two…

  14. Community Extreme Tonnage User Service (CETUS): A 5000 Ton Open Research Facility in the United States

    NASA Astrophysics Data System (ADS)

    Danielson, L. R.; Righter, K.; Vander Kaaden, K. E.; Rowland, R. L., II; Draper, D. S.; McCubbin, F. M.

    2017-12-01

    Large-sample-volume 5000 ton multi-anvil presses have contributed to the exploration of deep Earth and planetary interiors, synthesis of ultra-hard and other novel materials, and serve as a sample complement to pressure and temperature regimes already attainable by diamond anvil cell experiments. However, no such facility exists in the Western Hemisphere. We are establishing an open user facility for the entire research community, with the unique capability of a 5000 ton multi-anvil and deformation press, HERA (High pressure Experimental Research Apparatus), supported by a host of extant co-located experimental and analytical laboratories and research staff. We offer a wide range of complementary and/or preparatory experimental options. Any required synthesis of materials or follow-up experiments can be carried out in controlled-atmosphere furnaces, piston cylinders, the multi-anvil press, or the experimental impact apparatus. Additionally, our division houses two machine shops that facilitate any modification or custom work necessary for the development of CETUS, one for general fabrication and one located specifically within our experimental facilities. We also have a general sample preparation laboratory, specifically for experimental samples, that allows users to quickly and easily prepare samples for e-beam analyses and more. Our focus as contract staff is on serving the scientific needs of our users and collaborators. We are seeking community expert input on multiple aspects of this facility, such as experimental assembly design, module modifications, immediate projects, and future innovation initiatives. We have built a cooperative network of 12 (and growing) collaborating institutions, including COMPRES. CETUS is a coordinated effort leveraging HERA with our extant experimental, analytical, and planetary process modelling instrumentation and expertise in order to create a comprehensive model of the origin and evolution of our solar system and beyond. We are looking to engage the community in how the CETUS facility can best serve your needs.

  15. A client/server system for Internet access to biomedical text/image databanks.

    PubMed

    Thoma, G R; Long, L R; Berman, L E

    1996-01-01

    Internet access to mixed text/image databanks is finding application in the medical world. An example is a database of medical X-rays and associated data consisting of demographic, socioeconomic, physician's exam, medical laboratory and other information collected as part of a nationwide health survey conducted by the government. Another example is a collection of digitized cryosection images, CT and MR taken of cadavers as part of the National Library of Medicine's Visible Human Project. In both cases, the challenge is to provide access to both the image and the associated text for a wide end-user community to create atlases, conduct epidemiological studies, and develop image-specific algorithms for compression, enhancement and other types of image processing, among many other applications. The databanks mentioned above are being created in prototype form. This paper describes the prototype system developed for the archiving of the data and the client software to enable a broad range of end users to access the archive, retrieve text and image data, display the data and manipulate the images. System design considerations include: data organization in a relational database management system with object-oriented extensions; a hierarchical organization of the image data by different resolution levels for different user classes; client design based on common hardware and software platforms incorporating SQL search capability, X Window, Motif and TAE (a development environment supporting rapid prototyping and management of graphic-oriented user interfaces); the potential to include ultra-high-resolution display monitors as a user option; an intuitive user interface paradigm for building complex queries; and contrast enhancement, magnification and mensuration tools for better viewing by the user.
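
    The hierarchical organization of image data by resolution level can be illustrated with a simple pyramid builder. The sketch below uses the Pillow library and a hypothetical file name; it is an illustration of the idea, not the prototype's code.

        from PIL import Image  # pip install Pillow

        def build_pyramid(path, levels=4):
            """Store an image at successively halved resolutions so that
            different user classes can fetch only the detail they need."""
            base = Image.open(path)
            pyramid = [base]
            for _ in range(levels - 1):
                prev = pyramid[-1]
                pyramid.append(prev.resize((max(1, prev.width // 2),
                                            max(1, prev.height // 2))))
            return pyramid

        # Usage (hypothetical file name):
        # for i, im in enumerate(build_pyramid("xray_0001.png")):
        #     im.save(f"xray_0001_level{i}.png")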

  16. Development of a Fan-Filter Unit Test Standard, Laboratory Validations, and its Applications across Industries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Tengfang

    2006-10-20

    Lawrence Berkeley National Laboratory (LBNL) is now finalizing the Phase 2 Research and Demonstration Project on characterizing 2-foot x 4-foot (61-cm x 122-cm) fan-filter units in the market using the first-ever standard laboratory test method developed at LBNL.[1][2][3] Fan-filter units deliver re-circulated air and provide particle filtration control for clean environments. Much of the energy in cleanrooms (and minienvironments) is consumed by 2-foot x 4-foot (61-cm x 122-cm) or 4-foot x 4-foot (122-cm x 122-cm) fan-filter units that are typically located in the ceiling (25-100% coverage) of cleanroom controlled environments. Thanks to funding support by the California Energy Commission's Industrial Program of the Public Interest Energy Research (PIER) Program, and significant participation from manufacturers and users of fan-filter units from around the world, LBNL has developed and performed a series of standard laboratory tests and reporting on a variety of 2-foot x 4-foot (61-cm x 122-cm) fan-filter units (FFUs). Standard laboratory testing reports have been completed and reported back to anonymous individual participants in this project. To date, such reports on standard testing of FFU performance have provided rigorous and useful data for suppliers and end users to better understand, and more importantly, to quantitatively characterize performance of FFU products under a variety of operating conditions.[1] In the course of the project, the standard laboratory method previously developed at LBNL has been under continuous evaluation and update.[2][3] Based upon the updated standard, it becomes feasible for users and suppliers to characterize and evaluate energy performance of FFUs in a consistent way.
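
    Two figures of merit commonly used to compare fan operating points are air power divided by electric power (total efficiency) and electric power per unit airflow. The sketch below computes both for an illustrative operating point; it is a generic illustration, not the LBNL test method itself.

        def ffu_energy_performance(airflow_m3_per_s, pressure_rise_pa, electric_power_w):
            """Two common figures of merit for a fan-filter unit operating point:
            total efficiency (air power / electric power) and the specific
            electric power drawn per unit of airflow delivered."""
            air_power_w = airflow_m3_per_s * pressure_rise_pa
            return {"total_efficiency": air_power_w / electric_power_w,
                    "w_per_m3_s": electric_power_w / airflow_m3_per_s}

        # Illustrative operating point: 0.45 m^3/s at 60 Pa drawing 90 W.
        print(ffu_energy_performance(0.45, 60.0, 90.0))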

  17. CRIM-TRACK: sensor system for detection of criminal chemical substances

    NASA Astrophysics Data System (ADS)

    Munk, Jens K.; Buus, Ole T.; Larsen, Jan; Dossi, Eleftheria; Tatlow, Sol; Lässig, Lina; Sandström, Lars; Jakobsen, Mogens H.

    2015-10-01

    Detection of illegal compounds requires a reliable, selective and sensitive detection device. The successful device features automated target acquisition, identification and signal processing. It is portable, fast, user friendly, sensitive, specific, and cost efficient. Law enforcement agencies (LEAs) are in need of such technology. CRIM-TRACK is developing a sensing device based on these requirements. We engage highly skilled specialists from research institutions, industry, SMEs and LEAs and rely on a team of end users to benefit maximally from our prototypes. Currently we can detect minute quantities of drugs, explosives and precursors thereof in laboratory settings. Using colorimetric technology we have developed prototypes that employ disposable sensing chips. Ease of operation and intuitive sensor response are highly prioritized features that we implement as we gather data to feed into machine learning. With machine learning our ability to detect threat compounds amidst harmless substances improves. Different end users prefer their equipment optimized for their specific field. In an explosives-detecting scenario, the end user may prefer false positives over false negatives, while the opposite may be true in a drug-detecting scenario. Such decisions will be programmed to match user preference. Sensor output can be as detailed as the sensor allows. The user can be informed of the statistics behind the detection, identities of all detected substances, and quantities thereof. The response can also be simplified to "yes" vs. "no". The technology under development in CRIM-TRACK will provide customs officers, police and other authorities with an effective tool to control trafficking of illegal drugs and drug precursors.
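
    Programming the detector to match user preference amounts to choosing a decision threshold on the classifier score. The sketch below bounds the false-negative rate for an explosives-style scenario using synthetic calibration scores; all numbers are illustrative, not CRIM-TRACK data.

        import numpy as np

        def threshold_for_fnr(scores_threat, target_fnr):
            """Pick a threshold so that at most target_fnr of true threat
            samples fall below it (bounding the false-negative rate)."""
            return float(np.quantile(scores_threat, target_fnr))

        # Illustrative sensor-chip scores from calibration runs.
        rng = np.random.default_rng(1)
        threat = rng.normal(2.0, 1.0, 500)   # scores for known threat compounds
        benign = rng.normal(0.0, 1.0, 500)   # scores for harmless substances

        t = threshold_for_fnr(threat, 0.02)  # explosives mode: few misses allowed
        fpr = float((benign >= t).mean())
        print(f"threshold={t:.2f}, resulting false-positive rate={fpr:.2%}")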

  18. Evaluation of the impact of a total automation system in a large core laboratory on turnaround time.

    PubMed

    Lou, Amy H; Elnenaei, Manal O; Sadek, Irene; Thompson, Shauna; Crocker, Bryan D; Nassar, Bassam

    2016-11-01

    Growing financial and workload pressures on laboratories, coupled with user demands for faster turnaround time (TAT), have steered the implementation of total laboratory automation (TLA). The current study evaluates the impact of a complex TLA on core laboratory efficiency through the analysis of the In-lab to Report TAT (IR-TAT) for five representative tests based on the different requested priorities. Mean, median and outlier percentages (OP) for IR-TAT were determined following TLA implementation and, where possible, compared to the pre-TLA era. The shortest mean IR-TAT via the priority lanes of the TLA was 22 min for Complete Blood Count (CBC), followed by 34 min, 39 min and 40 min for Prothrombin time (PT), urea and potassium testing respectively. The mean IR-TAT for STAT CBC loaded directly on to the analyzers was 5 min shorter than that processed via the TLA. The mean IR-TATs for both STAT potassium and urea via offline centrifugation were comparable to those processed by the TLA. The longest mean IR-TAT via the regular lanes of the TLA was 62 min for Thyroid-Stimulating Hormone (TSH) while the shortest was 17 min for CBC. All IR-TAT parameters for the CBC and PT tests decreased significantly post-TLA across all requested priorities, in particular the OP at 30 and 60 min. TLA helps to efficiently manage substantial volumes of samples across all requested priorities. Manual processing for small STAT volumes, at both the initial centrifugation stage and front loading directly on to analyzers, is, however, likely to yield the shortest IR-TAT.
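
    The statistics reported here (mean, median, and outlier percentages at fixed limits) are straightforward to compute; a minimal sketch with illustrative turnaround times follows.

        import numpy as np

        def ir_tat_summary(minutes, outlier_limits=(30, 60)):
            """Mean, median, and outlier percentages (share of samples exceeding
            each limit) for a set of In-lab to Report turnaround times."""
            m = np.asarray(minutes, dtype=float)
            summary = {"mean": float(m.mean()), "median": float(np.median(m))}
            for limit in outlier_limits:
                summary[f"OP>{limit}min"] = 100.0 * float((m > limit).mean())
            return summary

        # Illustrative STAT CBC turnaround times in minutes.
        print(ir_tat_summary([18, 22, 19, 25, 41, 20, 17, 63, 24, 21]))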

  19. Computational Science News | Computational Science | NREL

    Science.gov Websites

    News items from the NREL Computational Science Center website: "…-Cooled High-Performance Computing Technology at the ESIF" (February 28, 2018); and "NREL Launches New Website for High-Performance Computing System Users": The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) …

  20. Accommodation requirements for microgravity science and applications research on space station

    NASA Technical Reports Server (NTRS)

    Uhran, M. L.; Holland, L. R.; Wear, W. O.

    1985-01-01

    Scientific research conducted in the microgravity environment of space represents a unique opportunity to explore and exploit the benefits of materials processing in the virtual absence of gravity-induced forces. NASA has initiated the preliminary design of a permanently manned space station that will support technological advances in process science and stimulate the development of new and improved materials having applications across the commercial spectrum. A study is performed to define, from the researchers' perspective, the requirements for laboratory equipment to accommodate microgravity experiments on the space station. The accommodation requirements focus on the microgravity science disciplines including combustion science, electronic materials, metals and alloys, fluids and transport phenomena, glasses and ceramics, and polymer science. User requirements have been identified in eleven research classes, each of which contains an envelope of functional requirements for related experiments having similar characteristics, objectives, and equipment needs. Based on these functional requirements, seventeen items of experiment apparatus and twenty items of core supporting equipment have been defined which represent currently identified equipment requirements for a pressurized laboratory module at the initial operating capability of the NASA space station.

  1. A broadband multimedia TeleLearning system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ruiping; Karmouch, A.

    1996-12-31

    In this paper we discuss a broadband multimedia TeleLearning system under development in the Multimedia Information Research Laboratory at the University of Ottawa. The system aims at providing a seamless environment for TeleLearning using the latest telecommunication and multimedia information processing technology. It basically consists of a media production center, a courseware author site, a courseware database, a courseware user site, and an on-line facilitator site. All these components are distributed over an ATM network and work together to offer a multimedia interactive courseware service. An MHEG-based model is exploited in designing the system architecture to achieve real-time, interactive, and reusable information interchange across heterogeneous platforms. The system architecture, courseware processing strategies, and courseware document models are presented.

  2. Inter-laboratory agreement on embryo classification and clinical decision: Conventional morphological assessment vs. time lapse.

    PubMed

    Martínez-Granados, Luis; Serrano, María; González-Utor, Antonio; Ortíz, Nereyda; Badajoz, Vicente; Olaya, Enrique; Prados, Nicolás; Boada, Montse; Castilla, Jose A

    2017-01-01

    The aim of this study is to determine inter-laboratory variability on embryo assessment using time-lapse platform and conventional morphological assessment. This study compares the data obtained from a pilot study of external quality control (EQC) of time lapse, performed in 2014, with the classical EQC of the Spanish Society for the Study of Reproductive Biology (ASEBIR) performed in 2013 and 2014. In total, 24 laboratories (8 using EmbryoScope™, 15 using Primo Vision™ and one with both platforms) took part in the pilot study. The clinics that used EmbryoScope™ analysed 31 embryos and those using Primo Vision™ analysed 35. The classical EQC was implemented by 39 clinics, based on an analysis of 25 embryos per year. Both groups were required to evaluate various qualitative morphological variables (cell fragmentation, the presence of vacuoles, blastomere asymmetry and multinucleation), to classify the embryos in accordance with ASEBIR criteria and to stipulate the clinical decision taken. In the EQC time-lapse pilot study, the groups were asked to determine, as well as the above characteristics, the embryo development times, the number, opposition and size of pronuclei, the direct division of 1 into 3 cells and/or of 3 into 5 cells and false divisions. The degree of agreement was determined by calculating the intra-class correlation coefficients and the coefficient of variation for the quantitative variables and the Gwet index for the qualitative variables. For both EmbryoScope™ and Primo Vision™, two periods of greater inter-laboratory variability were observed in the times of embryo development events. One peak of variability was recorded among the laboratories addressing the first embryo events (extrusion of the second polar body and the appearance of pronuclei); the second peak took place between the times corresponding to the 8-cell and morula stages. In most of the qualitative variables analysed regarding embryo development, there was almost-perfect inter-laboratory agreement among conventional morphological assessment (CMA), EmbryoScope™ and Primo Vision™, except for false divisions, vacuoles and asymmetry (users of all methods) and multinucleation (users of Primo Vision™), where the degree of agreement was lower. The inter-laboratory agreement on embryo classification according to the ASEBIR criteria was moderate-substantial (Gwet 0.41-0.80) for the laboratories using CMA and EmbryoScope™, and fair-moderate (Gwet 0.21-0.60) for those using Primo Vision™. The inter-laboratory agreement for clinical decision was moderate (Gwet 0.41-0.60) on day 5 for CMA users and almost perfect (Gwet 0.81-1) for time-lapse users. In conclusion, time-lapse technology does not improve inter-laboratory agreement on embryo classification or the analysis of each morphological variable. Moreover, depending on the time-lapse platform used, inter-laboratory agreement may be lower than that obtained by CMA. However, inter-laboratory agreement on clinical decisions is improved with the use of time lapse, regardless of the platform used.
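
    For two raters and a binary judgment, the Gwet agreement index used above has a simple closed form: AC1 = (pa - pe)/(1 - pe), with chance agreement pe = 2*pi*(1 - pi) for overall prevalence pi. The sketch below evaluates it on illustrative transfer/discard calls; the data are invented, not from the study.

        def gwet_ac1(rater_a, rater_b):
            """Gwet's first-order agreement coefficient for two raters and a
            binary classification: AC1 = (pa - pe) / (1 - pe), where the
            chance agreement is pe = 2*pi*(1 - pi) for overall prevalence pi."""
            n = len(rater_a)
            pa = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            pi = (sum(rater_a) + sum(rater_b)) / (2 * n)
            pe = 2 * pi * (1 - pi)
            return (pa - pe) / (1 - pe)

        # Illustrative: two labs scoring 10 embryos as transfer (1) / discard (0).
        print(round(gwet_ac1([1, 1, 0, 1, 0, 1, 1, 0, 1, 1],
                             [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]), 3))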

  3. Inter-laboratory agreement on embryo classification and clinical decision: Conventional morphological assessment vs. time lapse

    PubMed Central

    Serrano, María; González-Utor, Antonio; Ortíz, Nereyda; Badajoz, Vicente; Olaya, Enrique; Prados, Nicolás; Boada, Montse; Castilla, Jose A.

    2017-01-01

    The aim of this study is to determine inter-laboratory variability on embryo assessment using time-lapse platform and conventional morphological assessment. This study compares the data obtained from a pilot study of external quality control (EQC) of time lapse, performed in 2014, with the classical EQC of the Spanish Society for the Study of Reproductive Biology (ASEBIR) performed in 2013 and 2014. In total, 24 laboratories (8 using EmbryoScope™, 15 using Primo Vision™ and one with both platforms) took part in the pilot study. The clinics that used EmbryoScope™ analysed 31 embryos and those using Primo Vision™ analysed 35. The classical EQC was implemented by 39 clinics, based on an analysis of 25 embryos per year. Both groups were required to evaluate various qualitative morphological variables (cell fragmentation, the presence of vacuoles, blastomere asymmetry and multinucleation), to classify the embryos in accordance with ASEBIR criteria and to stipulate the clinical decision taken. In the EQC time-lapse pilot study, the groups were asked to determine, as well as the above characteristics, the embryo development times, the number, opposition and size of pronuclei, the direct division of 1 into 3 cells and/or of 3 into 5 cells and false divisions. The degree of agreement was determined by calculating the intra-class correlation coefficients and the coefficient of variation for the quantitative variables and the Gwet index for the qualitative variables. For both EmbryoScope™ and Primo Vision™, two periods of greater inter-laboratory variability were observed in the times of embryo development events. One peak of variability was recorded among the laboratories addressing the first embryo events (extrusion of the second polar body and the appearance of pronuclei); the second peak took place between the times corresponding to the 8-cell and morula stages. In most of the qualitative variables analysed regarding embryo development, there was almost-perfect inter-laboratory agreement among conventional morphological assessment (CMA), EmbryoScope™ and Primo Vision™, except for false divisions, vacuoles and asymmetry (users of all methods) and multinucleation (users of Primo Vision™), where the degree of agreement was lower. The inter-laboratory agreement on embryo classification according to the ASEBIR criteria was moderate-substantial (Gwet 0.41–0.80) for the laboratories using CMA and EmbryoScope™, and fair-moderate (Gwet 0.21–0.60) for those using Primo Vision™. The inter-laboratory agreement for clinical decision was moderate (Gwet 0.41–0.60) on day 5 for CMA users and almost perfect (Gwet 0.81–1) for time-lapse users. In conclusion, time-lapse technology does not improve inter-laboratory agreement on embryo classification or the analysis of each morphological variable. Moreover, depending on the time-lapse platform used, inter-laboratory agreement may be lower than that obtained by CMA. However, inter-laboratory agreement on clinical decisions is improved with the use of time lapse, regardless of the platform used. PMID:28841654

  4. XYFREZ.4 User’s Manual.

    DTIC Science & Technology

    1987-12-01

    Special Report 87-26, December 1987, US Army Corps of Engineers, Cold Regions Research & Engineering Laboratory: XYFREZ.4 User's Manual. Using the program XYFREZ, version 4, one may simulate two-dimensional conduction of heat, with or without phase change. The mathematical method employed uses finite elements in space and …

  5. AVIRIS Reflectance Retrievals: UCSB Users Manual. Appendix 1

    NASA Technical Reports Server (NTRS)

    Roberts, Dar A.; Prentiss, Dylan

    2001-01-01

    The following write-up is designed to help students and researchers take Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) radiance data and retrieve surface reflectance. In the event that the software is not available, but a user has access to a reflectance product, this document is designed to provide a better understanding of how AVIRIS reflectance was retrieved. This guide assumes that the reader has a basic understanding of both the UNIX computing environment and spectroscopy. Knowledge of the Interactive Data Language (IDL) and the Environment for Visualizing Images (ENVI) is helpful. This is a working document, and many of the fine details described in the following pages have been previously undocumented. After having read this document the reader should be able to process AVIRIS to reflectance, provided they have access to all of the code. The AVIRIS radiance data itself is pre-processed at the Jet Propulsion Laboratory (JPL) in Pasadena, California. The first section of this paper describes how to read data from tape and byte-swap the data. Section 2 describes the procedure for preparing support files before running the 'h2o' suite of programs. Section 3 describes the four programs used in the process: h2olut9.f, h2ospl9.f, vlsfit9.f and rfl9.f.
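
    The byte-swapping step mentioned for Section 1 can be done concisely with NumPy; the sketch below reads big-endian 16-bit radiance values and converts them to native byte order. The file name is hypothetical, and the dtype assumes the common AVIRIS convention of big-endian signed 16-bit integers.

        import numpy as np

        def read_big_endian_int16(path, swap_to_native=True):
            """Read 16-bit radiance values stored big-endian; on little-endian
            hosts the values must be byte-swapped before use."""
            raw = np.fromfile(path, dtype=">i2")          # interpret as big-endian
            return raw.astype("=i2") if swap_to_native else raw

        # Usage (hypothetical file name):
        # band = read_big_endian_int16("aviris_scene_radiance.img")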

  6. Argonne Collaborative Center for Energy Storage Science (ACCESS)

    Science.gov Websites

    Web page for the Argonne Collaborative Center for Energy Storage Science (ACCESS), describing Argonne's proven capabilities and facilities, including the Analysis, Modeling and Prototyping (CAMP) facility, the Electrochemical Analysis and Diagnostics Laboratory (EADL), and the Post-Test Facility, along with Argonne user facilities serving industries such as transportation, consumer electronics, defense, and electric …

  7. Laboratory Directed Research & Development (LDRD)

    Science.gov Websites

    Los Alamos National Laboratory web page on the Laboratory Directed Research & Development (LDRD) program and innovation in New Mexico, with links to the Los Alamos Collaboration for Explosives Detection (LACED), SensorNexus, the Exascale Computing Project (ECP), and user facilities including the Center for Integrated Nanotechnologies (CINT) and the Los Alamos Neutron …

  8. Bar-Code System for a Microbiological Laboratory

    NASA Technical Reports Server (NTRS)

    Law, Jennifer; Kirschner, Larry

    2007-01-01

    A bar-code system has been assembled for a microbiological laboratory that must examine a large number of samples. The system includes a commercial bar-code reader, computer hardware and software components, plus custom-designed database software. The software generates a user-friendly, menu-driven interface.

  9. Concept for a commercial space station laboratory

    NASA Technical Reports Server (NTRS)

    Wood, P. W.; Stark, P. M.

    1984-01-01

    The concept of a privately owned and operated fee-for-service laboratory as an element of a civil manned space station, envisioned as the venture of a group of private investors and an experienced laboratory operator to be undertaken with the cooperation of NASA, is discussed. This group would acquire, outfit, activate, and operate the laboratory on a fee-for-service basis, providing laboratory services to commercial firms, universities, and government agencies, including NASA. This concept was developed to identify, stimulate, and assist potential commercial users of a manned space station. A number of issues related to the concept are discussed, including the terms under which NASA might consider permitting private ownership and operation of a major space station component, the policies with respect to international participation in the construction and use of the space station, the basis for charging users for services received from the space station, and the types of support that NASA might be willing to provide to assist private industry in carrying out such a venture.

  10. Effects of Varenicline Alone and in Combination With Low-dose Naltrexone on Alcohol-primed Smoking in Heavy-drinking Tobacco Users: A Preliminary Laboratory Study.

    PubMed

    Roberts, Walter; Shi, Julia M; Tetrault, Jeanette M; McKee, Sherry A

    Heavy-drinking tobacco users are less likely to successfully quit smoking than their moderate-drinking counterparts, even when they are prescribed smoking cessation medication. One strategy for improving treatment outcomes in this subgroup of tobacco users may be to combine medication therapies to target both alcohol and tobacco use simultaneously. Adding naltrexone to frontline smoking cessation treatments may improve treatment outcomes in this group. This double-blind, placebo-controlled human laboratory study examined the effects of varenicline (2 mg/d) alone and varenicline (2 mg/d) combined with a low dose of naltrexone (25 mg/d) on alcohol-primed smoking behavior in a laboratory model of smoking relapse in heavy-drinking tobacco users (n = 30). Participants attended a laboratory session and received an alcohol challenge (target breath alcohol concentration = 0.030 g/dL). They completed a smoking delay task that assessed their ability to resist smoking, followed by an ad libitum smoking phase (primary outcomes). They also provided ratings of subjective drug effects and craving, and carbon monoxide levels were measured after smoking (secondary outcomes). Participants receiving varenicline monotherapy delayed smoking longer and smoked fewer cigarettes than those on placebo. Participants receiving varenicline + low-dose naltrexone did not delay smoking longer than those receiving varenicline alone. Participants in both active medication arms smoked fewer cigarettes ad libitum than those receiving placebo. Varenicline can improve smoking outcomes even after an alcohol prime, supporting its use in heavy drinkers who wish to quit smoking. Findings did not support increased efficacy of combined varenicline + low-dose naltrexone relative to varenicline monotherapy.

  11. Development of hospital information systems: user participation and factors affecting it.

    PubMed

    Rahimi, Bahlol; Safdari, Reza; Jebraeily, Mohamad

    2014-12-01

    Given the large volume of data generated in hospitals, using a hospital information system (HIS) is critical for managing them efficiently. User participation is one of the major factors in the success of an HIS; it leads information needs and processes to be correctly predicted and augments users' commitment to HIS development. The purpose of this study is to investigate the participation rate of users in different stages of HIS development as well as to identify the factors affecting it. This is a descriptive cross-sectional study conducted in 2014. The study population consists of 140 HIS users (across different job types, including physicians, nurses, laboratory, radiology and HIM staff) from teaching hospitals affiliated to Urmia University of Medical Sciences. Data were collected using a self-structured questionnaire which was estimated as both reliable and valid. The data were analyzed in SPSS using descriptive and analytical statistics (t-test and chi-square). The highest participation rate of users in the four-stage development of the HIS was in the implementation phase (2.88) and the lowest in analysis (1.23). The test results showed that the rate of user participation was not satisfactory in any of the stages of development (P < 0.05). The most important factors in increasing user participation include established teamwork among end-users and support from top managers for HIS development. According to the results obtained from the study, it seems that health care administrators must have a detailed plan for user participation prior to the development and purchase of an HIS, so that they identify the real needs as well as increase users' commitment and motivation to develop, maintain and upgrade the system; in this way, the success of the system will be assured.

  12. Level 1 Processing of MODIS Direct Broadcast Data at the GSFC DAAC

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Kempler, Steven J. (Technical Monitor)

    2001-01-01

    The GSFC DAAC is working to test and package the MODIS Level 1 Processing software for Aqua Direct Broadcast data. This entails the same code base, but different lookup tables for Aqua and Terra. However, the most significant change is the use of ancillary attitude and ephemeris files instead of orbit/attitude information within the science data stream (as with Terra). In addition, we are working on Linux ports of the algorithms, which could eventually enable processing on PC clusters. Finally, the GSFC DAAC is also working with the GSFC Direct Readout laboratory to ingest Level 0 data from the GSFC DB antenna into the main DAAC, enabling Level 1 production in near real time in support of applications users, such as the Synergy project. The mechanism developed for this could conceivably be extended to other participating stations.

  13. Fabricating micro-instruments in surface-micromachined polycrystalline silicon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Comtois, J.H.; Michalicek, M.A.; Barron, C.C.

    1997-04-01

    Smaller, lighter instruments can be fabricated as Micro-Electro-Mechanical Systems (MEMS), having micron-scale moving parts packaged together with associated control and measurement electronics. Batch fabrication will make these devices economical for applications such as condition-based machine maintenance and remote sensing. The choice of instrumentation is limited only by the designer's imagination. This paper presents one genre of MEMS fabrication, surface-micromachined polycrystalline silicon (polysilicon). Two currently available but slightly different polysilicon processes are presented. One is the ARPA-sponsored "Multi-User MEMS ProcesS" (MUMPS), available commercially through MCNC; the other is the Sandia National Laboratories "Sandia Ultra-planar Multilevel MEMS Technology" (SUMMiT). Example components created in both processes will be presented, with an emphasis on actuators, actuator force-testing instruments, and incorporating actuators into larger instruments.

  14. Robot-assisted preparation of oncology drugs: the role of nurses.

    PubMed

    Palma, Elisabetta; Bufarini, Celestino

    2012-12-15

    Since 2007, the preparation of cancer drugs at the Pharmacy of the University Hospital of Ancona has been progressively robotized. Currently, the process of preparation of intravenous cancer drugs is almost totally automated (95%). At present, the Cytotoxic laboratory of Ancona is the only one in Europe that can count on two robots. The robots handle 56 oncology molecules, which correspond to more than 160 different vials. The production rate in 2011 exceeded 19,000 preparations. The quality of compounding and the sterility controls are satisfactory, and every step of the process is traceable. The nursing staff played a fundamental role in the robot development process. The nursing staff and the pharmacists are still collaborating with the robotic engineers in order to increase the efficiency, ergonomics and user-friendliness of the robots.

  15. Using XML Configuration-Driven Development to Create a Customizable Ground Data System

    NASA Technical Reports Server (NTRS)

    Nash, Brent; DeMore, Martha

    2009-01-01

    The Mission Data Processing and Control Subsystem (MPCS) is being developed as a multi-mission Ground Data System with the Mars Science Laboratory (MSL) as the first fully supported mission. MPCS is a fully featured, Java-based Ground Data System (GDS) for telecommand and telemetry processing based on Configuration-Driven Development (CDD). The eXtensible Markup Language (XML) is the ideal language for CDD because it is easily readable and editable by all levels of users and is also backed by a World Wide Web Consortium (W3C) standard and numerous powerful processing tools that make it uniquely flexible. The CDD approach adopted by MPCS minimizes changes to compiled code by using XML to create a series of configuration files that provide both coarse- and fine-grained control over all aspects of GDS operation.
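
    A minimal illustration of the CDD idea, not MPCS code: behavior is driven by an XML file that users can edit without recompiling. The channel dictionary below is hypothetical.

        import xml.etree.ElementTree as ET

        # Hypothetical telemetry-channel dictionary in the spirit of CDD:
        # behavior changes by editing the XML, not the compiled code.
        CONFIG = """
        <channels>
          <channel id="THRM-0001" name="battery_temp" type="float" units="degC"/>
          <channel id="PWR-0042"  name="bus_voltage"  type="float" units="V"/>
        </channels>
        """

        def load_channels(xml_text):
            root = ET.fromstring(xml_text)
            return {c.get("id"): {"name": c.get("name"),
                                  "type": c.get("type"),
                                  "units": c.get("units")}
                    for c in root.findall("channel")}

        print(load_channels(CONFIG)["THRM-0001"])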

  16. Implementation of the qualities of radiodiagnostic: mammography

    NASA Astrophysics Data System (ADS)

    Pacífico, L. C.; Magalhães, L. A. G.; Peixoto, J. G. P.; Fernandes, E.

    2018-03-01

    The objective of the present study was to evaluate the expanded uncertainty of the mammographic calibration process and present the results of the internal audit performed at the Laboratory of Radiological Sciences (LCR). The mammographic beam qualities that serve as references in the LCR comprise two irradiation conditions: non-attenuated beam and attenuated beam. Both had satisfactory results, with an expanded uncertainty equal to 2.1%. The internal audit was performed, and the degree of conformity with ISO/IEC 17025 was evaluated. The result of the internal audit was satisfactory. We conclude that the LCR can perform calibrations on mammography qualities for end users.
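
    Expanded uncertainty is conventionally obtained by combining independent standard uncertainties in quadrature and multiplying by a coverage factor k (k = 2 gives roughly 95% coverage). The sketch below shows the arithmetic with an illustrative budget, not the LCR's actual one.

        import math

        def expanded_uncertainty(standard_uncertainties_pct, k=2.0):
            """Combine independent standard uncertainties in quadrature and
            expand with coverage factor k (k = 2 for ~95% coverage)."""
            combined = math.sqrt(sum(u * u for u in standard_uncertainties_pct))
            return k * combined

        # Illustrative budget (%): reference chamber, positioning,
        # air-density correction, and readout repeatability.
        print(f"U = {expanded_uncertainty([0.9, 0.3, 0.2, 0.3]):.1f}%")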

  17. The Federal Aviation Administration/Massachusetts Institute of Technology (FAA/MIT) Lincoln Laboratory Doppler weather radar program

    NASA Technical Reports Server (NTRS)

    Evans, James E.

    1988-01-01

    The program focuses on providing real-time information on hazardous aviation weather to end users such as air traffic control and pilots. Existing systems will soon be replaced by a Next Generation Weather Radar (NEXRAD), which will be concerned with detecting such hazards as heavy rain and hail, turbulence, low-altitude wind shear, and mesocyclones and tornadoes. Other systems in process are the Central Weather Processor (CWP), and the terminal Doppler weather radar (TDWR). Weather measurements near Memphis are central to ongoing work, especially in the area of microbursts and wind shear.

  18. Forest Resource Information System. Phase 3: System transfer report

    NASA Technical Reports Server (NTRS)

    Mroczynski, R. P. (Principal Investigator)

    1981-01-01

    The transfer of the Forest Resource Information System (FRIS) from the Laboratory for Applications of Remote Sensing (LARS) to St. Regis Paper Company is described. Modifications required for the transfer of the LARSYS image processing software are discussed. The reformatting, geometric correction, image registration, and documentation performed for preprocessing transfer are described. Data turnaround was improved, and geometrically corrected, ground-registered CCT LANDSAT 3 data were provided to the user. The technology transfer activities are summarized. An application test performed in order to assess a Florida land acquisition is described. A benefit/cost analysis of FRIS is presented.

  19. Physiological markers of biased decision-making in problematic Internet users

    PubMed Central

    Nikolaidou, Maria; Fraser, Danaë Stanton

    2016-01-01

    Background and aims Addiction has been reliably associated with biased emotional reactions to risky choices. Problematic Internet use (PIU) is a relatively new concept and its classification as an addiction is debated. Implicit emotional responses were measured in individuals expressing nonproblematic and problematic Internet behaviors while they made risky/ambiguous decisions to explore whether they showed similar responses to those found in agreed-upon addictions. Methods The design of the study was cross-sectional. Participants were adult Internet users (N = 72). All testing took place in the Psychophysics Laboratory at the University of Bath, UK. Participants were given the Iowa Gambling Task (IGT), which provides an index of an individual's ability to process and learn probabilities of reward and loss. Integration of emotions into current decision-making frameworks is vital for optimal performance on the IGT and thus, skin conductance responses (SCRs) to reward, punishment, and in anticipation of both were measured to assess emotional function. Results Performance on the IGT did not differ between the groups of Internet users. However, problematic Internet users expressed increased sensitivity to punishment as revealed by stronger SCRs to trials with higher punishment magnitude. Discussion and conclusions PIU seems to differ on behavioral and physiological levels from other addictions. However, our data imply that problematic Internet users were more risk-sensitive, a suggestion that needs to be incorporated into any measure and, potentially, any intervention for PIU. PMID:27554505

  20. Where's My Data - WMD

    NASA Technical Reports Server (NTRS)

    Quach, William L.; Sesplaukis, Tadas; Owen-Mankovich, Kyran J.; Nakamura, Lori L.

    2012-01-01

    WMD provides a centralized interface to access data stored in the Mission Data Processing and Control System (MPCS) GDS (Ground Data Systems) databases during MSL (Mars Science Laboratory) Testbeds and ATLO (Assembly, Test, and Launch Operations) test sessions. The MSL project organizes its data based on venue (Testbed, ATLO, Ops), with each venue's data stored on a separate database, making it cumbersome for users to access data across the various venues. WMD allows sessions to be retrieved through a Web-based search using several criteria: host name, session start date, or session ID number. Sessions matching the search criteria will be displayed and users can then select a session to obtain and analyze the associated data. The uniqueness of this software comes from its collection of data retrieval and analysis features provided through a single interface. This allows users to obtain their data and perform the necessary analysis without having to worry about where and how to get the data, which may be stored in various locations. Additionally, this software is a Web application that only requires a standard browser without additional plug-ins, providing a cross-platform, lightweight solution for users to retrieve and analyze their data. This software solves the problem of efficiently and easily finding and retrieving data from thousands of MSL Testbed and ATLO sessions. WMD allows the user to retrieve their session in as little as one mouse click, and then to quickly retrieve additional data associated with the session.
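
    A session search of the kind WMD exposes reduces to filtered queries over a sessions table. The sketch below uses an in-memory SQLite database with a deliberately simplified, hypothetical schema; the host names and IDs are invented.

        import sqlite3

        # Hypothetical, simplified schema for illustrating the search criteria
        # WMD exposes (host name, session start date, session ID).
        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE sessions
                        (id INTEGER PRIMARY KEY, host TEXT, start_date TEXT)""")
        conn.executemany("INSERT INTO sessions VALUES (?, ?, ?)",
                         [(101, "tb-flight-1", "2011-06-02"),
                          (102, "atlo-ws-3", "2011-06-03")])

        def find_sessions(host=None, start_date=None, session_id=None):
            clauses, args = [], []
            for col, val in (("host", host), ("start_date", start_date),
                             ("id", session_id)):
                if val is not None:
                    clauses.append(f"{col} = ?")
                    args.append(val)
            where = " WHERE " + " AND ".join(clauses) if clauses else ""
            return conn.execute("SELECT * FROM sessions" + where, args).fetchall()

        print(find_sessions(host="atlo-ws-3"))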

  1. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2012-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  2. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2011-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  3. Molecular Environmental Science: An Assessment of Research Accomplishments, Available Synchrotron Radiation Facilities, and Needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, G

    2004-02-05

    Synchrotron-based techniques are fundamental to research in "Molecular Environmental Science" (MES), an emerging field that involves molecular-level studies of chemical and biological processes affecting the speciation, properties, and behavior of contaminants, pollutants, and nutrients in the ecosphere. These techniques enable the study of aqueous solute complexes, poorly crystalline materials, solid-liquid interfaces, mineral-aqueous solution interactions, microbial biofilm-heavy metal interactions, heavy metal-plant interactions, complex material microstructures, and nanomaterials, all of which are important components or processes in the environment. Basic understanding of environmental materials and processes at the molecular scale is essential for risk assessment and management, and reduction of environmental pollutants at field, landscape, and global scales. One of the main purposes of this report is to illustrate the role of synchrotron radiation (SR)-based studies in environmental science and related fields and their impact on environmental problems of importance to society. A major driving force for MES research is the need to characterize, treat, and/or dispose of vast quantities of contaminated materials, including groundwater, sediments, and soils, and to process wastes, at an estimated cost exceeding 150 billion dollars through 2070. A major component of this problem derives from high-level nuclear waste. Other significant components come from mining and industrial wastes, atmospheric pollutants derived from fossil fuel consumption, agricultural pesticides and fertilizers, and the pollution problems associated with animal waste run-off, all of which have major impacts on human health and welfare. Addressing these problems requires the development of new characterization and processing technologies--efforts that require information on the chemical speciation of heavy metals, radionuclides, and xenobiotic organic compounds and their reactions with environmental materials. To achieve this goal, both fundamental and targeted studies of complex environmental systems at a molecular level are needed, and examples of both types of studies are presented herein. These examples illustrate the fact that MES SR studies have led to a revolution in our understanding of the fundamental physical and chemical aspects of natural systems. The MES SR user community has continued to experience strong growth at U.S. SR laboratories, with MES researchers comprising up to 15% of the total user base. Further growth and development of the MES community is being hindered by insufficient resources, including support personnel, materials preparation facilities, and available beam time at U.S. SR laboratories. "EnviroSync" recommends the following actions, in cooperation with U.S. SR laboratory directors, to meet the MES community's needs.

  4. Clinical analysis of genome next-generation sequencing data using the Omicia platform

    PubMed Central

    Coonrod, Emily M; Margraf, Rebecca L; Russell, Archie; Voelkerding, Karl V; Reese, Martin G

    2013-01-01

    Aims Next-generation sequencing is being implemented in the clinical laboratory environment for the purposes of candidate causal variant discovery in patients affected with a variety of genetic disorders. The successful implementation of this technology for diagnosing genetic disorders requires a rapid, user-friendly method to annotate variants and generate short lists of clinically relevant variants of interest. This report describes Omicia’s Opal platform, a new software tool designed for variant discovery and interpretation in a clinical laboratory environment. The software allows clinical scientists to process, analyze, interpret and report on personal genome files. Materials & Methods To demonstrate the software, the authors describe the interactive use of the system for the rapid discovery of disease-causing variants using three cases. Results & Conclusion Here, the authors show the features of the Opal system and their use in uncovering variants of clinical significance. PMID:23895124
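
    The central filtering step described here, reducing a personal genome's variants to a clinically relevant short list, might look like the following sketch. The field names and thresholds are illustrative assumptions, not Omicia's actual implementation.

    ```python
    # Illustrative variant-shortlisting step; field names and thresholds are
    # assumptions, not the actual Opal implementation.

    def shortlist(variants, max_population_freq=0.01):
        """Keep rare, well-covered variants with a predicted functional impact."""
        damaging = {"missense", "nonsense", "frameshift", "splice_site"}
        return [
            v for v in variants
            if v["population_freq"] <= max_population_freq  # rare in population
            and v["effect"] in damaging                     # predicted impact
            and v["read_depth"] >= 20                       # adequate coverage
        ]

    variants = [
        {"gene": "MYBPC3", "effect": "missense", "population_freq": 0.0002, "read_depth": 45},
        {"gene": "TTN", "effect": "synonymous", "population_freq": 0.2, "read_depth": 60},
    ]
    print(shortlist(variants))  # only the rare, damaging, well-covered variant remains
    ```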

  5. Setup, Validation and Quality Control of a Centralized WGS laboratory - Lessons Learned.

    PubMed

    Arnold, Cath; Edwards, Kirstin; Desai, Meeta; Platt, Steve; Green, Jonathan; Conway, David

    2018-04-25

    Routine whole genome sequencing (WGS) analysis for infectious diseases can inform various Public Health scenarios, including identification of microbial pathogens; relating individual cases to an outbreak of infectious disease; establishing an association between an outbreak of food poisoning and a specific food vehicle; inferring drug susceptibility; source tracing of contaminants; and study of how variations in the genome affect pathogenicity/virulence. We describe the setup, validation and ongoing verification of a centralised WGS laboratory to carry out the sequencing for these public health functions for the National Infection Services, Public Health England in the UK. The performance characteristics and Quality Control metrics measured during validation and verification of the entire end-to-end process (accuracy, precision, reproducibility and repeatability) are described, and include information regarding the automated pass and release of data to service users without intervention. © Crown copyright 2018.
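
    The automated pass-and-release step can be pictured as a simple gate over run-level QC metrics, as in this sketch; the metric names and thresholds are assumptions for illustration, not the validated PHE criteria.

    ```python
    # Sketch of an automated QC gate for releasing WGS data without manual
    # intervention; metric names and thresholds are illustrative assumptions.

    QC_THRESHOLDS = {
        "q30_fraction": 0.80,    # minimum fraction of bases at Q30 or better
        "mean_coverage": 30.0,   # minimum mean depth across the genome
        "contamination": 0.02,   # maximum estimated cross-sample contamination
    }

    def passes_qc(metrics):
        return (
            metrics["q30_fraction"] >= QC_THRESHOLDS["q30_fraction"]
            and metrics["mean_coverage"] >= QC_THRESHOLDS["mean_coverage"]
            and metrics["contamination"] <= QC_THRESHOLDS["contamination"]
        )

    run = {"q30_fraction": 0.91, "mean_coverage": 54.2, "contamination": 0.004}
    print("release to service user" if passes_qc(run) else "hold for manual review")
    ```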

  6. CD-ROM source data uploaded to the operating and storage devices of an IBM 3090 mainframe through a PC terminal.

    PubMed

    Boros, L G; Lepow, C; Ruland, F; Starbuck, V; Jones, S; Flancbaum, L; Townsend, M C

    1992-07-01

    A powerful method of processing MEDLINE and CINAHL source data uploaded to the IBM 3090 mainframe computer through an IBM/PC is described. Data are first downloaded from the CD-ROM's PC devices to floppy disks. These disks are then uploaded to the mainframe computer through an IBM/PC equipped with the WordPerfect text editor and a computer network connection (SONNGATE). Before downloading, keywords specifying the information to be accessed are typed at the FIND prompt of the CD-ROM station. The resulting abstracts are downloaded into a file called DOWNLOAD.DOC. The floppy disks containing the information are simply carried to an IBM/PC which has a terminal emulation (TELNET) connection to the university-wide computer network (SONNET) at the Ohio State University Academic Computing Services (OSU ACS). WordPerfect (5.1) processes and saves the text in DOS format. Using the File Transfer Protocol (FTP, 130,000 bytes/s) of SONNET, the entire text containing the information obtained through the MEDLINE and CINAHL search is transferred to the remote mainframe computer for further processing. At this point, abstracts in the specified area are ready for immediate access and multiple retrieval by any PC having a network switch or dial-in connection after the USER ID, PASSWORD and ACCOUNT NUMBER are specified by the user. The system provides the user an on-line, very powerful and quick method of searching for words specifying: diseases, agents, experimental methods, animals, authors, and journals in the research area downloaded. The user can also copy the TItles, AUthors and SOurce with optional parts of abstracts into papers being edited. This arrangement serves the special demands of a research laboratory by handling MEDLINE and CINAHL source data resulting after a search is performed with keywords specified for ongoing projects. Since the Ohio State University has a centrally funded mainframe system, the data upload, storage and mainframe operations are free.
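
    The transfer step of this workflow relied on FTP; for comparison, the equivalent upload in a modern scripting language is only a few lines (the host name, credentials, and file name below are placeholders).

    ```python
    # Modern equivalent of the FTP upload step described above; the host,
    # credentials, and file name are placeholders.
    from ftplib import FTP

    with FTP("mainframe.example.edu") as ftp:      # placeholder host
        ftp.login(user="USERID", passwd="PASSWORD")
        with open("DOWNLOAD.DOC", "rb") as fh:
            # STOR sends the search-results file to the remote system
            ftp.storbinary("STOR DOWNLOAD.DOC", fh)
    ```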

  7. Assessing electronic cigarette effects and regulatory impact: Challenges with user self-reported device power.

    PubMed

    Rudy, Alyssa K; Leventhal, Adam M; Goldenson, Nicholas I; Eissenberg, Thomas

    2017-10-01

    Electronic cigarettes (ECIGs) aerosolize liquids for user inhalation that usually contain nicotine. ECIG nicotine emission is determined, in part, by user behavior, liquid nicotine concentration, and electrical power. Whether users are able to accurately report nicotine concentration and device electrical power has not been evaluated. This study's purpose was to examine whether ECIG users could provide data relevant to understanding ECIG nicotine emission, particularly liquid nicotine concentration (mg/ml) as well as battery voltage (V) and heater resistance (ohms, Ω), which are needed to calculate power (watts, W). Adult ECIG users (N=165) were recruited from Los Angeles, CA for research studies examining the effects of ECIG use. We asked all participants who visited the laboratory to report liquid nicotine concentration, V, and Ω. Liquid nicotine concentration was reported by 89.7% (mean=9.5 mg/ml, SD=7.3), and responses were consistent with the distribution of liquids available in commonly marketed products. The majority could not report voltage (51.5%) or resistance (63.6%). Of the 40 participants (24.8%) who reported voltage and resistance, there was a substantial power range (2.2-32,670 W), the upper limit of which exceeds that of the highest-power ECIG reported by any user to our knowledge (i.e., 2512 W). If 2512 W is taken as the upper limit, only 30 (18.2%) reported valid results (mean=237.3 W, SD=370.6; range=2.2-1705.3 W). Laboratory, survey, and other researchers interested in understanding ECIG effects to inform users and policymakers may need to use methods other than user self-report to obtain information regarding device power. Copyright © 2017 Elsevier B.V. All rights reserved.
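
    The device power in question follows from the reported voltage and coil resistance as P = V²/R, which is presumably how the ranges above were computed; a quick check:

    ```python
    # Power from user-reported voltage and coil resistance: P = V^2 / R.
    def ecig_power_watts(volts, ohms):
        return volts ** 2 / ohms

    # A typical report (3.7 V across 1.5 ohms) gives a plausible power,
    print(ecig_power_watts(3.7, 1.5))    # ~9.1 W
    # while an implausible pairing shows how self-reports can imply powers
    # far beyond any real device (cf. the paper's 32,670 W upper limit).
    print(ecig_power_watts(180.8, 1.0))  # ~32,689 W
    ```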

  8. Organization of Biomedical Data for Collaborative Scientific Research: A Research Information Management System

    PubMed Central

    Myneni, Sahiti; Patel, Vimla L.

    2010-01-01

    Biomedical researchers often work with massive, detailed and heterogeneous datasets. These datasets raise new challenges of information organization and management for scientific interpretation, as they demand much of the researchers’ time and attention. The current study investigated the nature of the problems that researchers face when dealing with such data. Four major problems identified with existing biomedical scientific information management methods were related to data organization, data sharing, collaboration, and publications. Therefore, there is a compelling need to develop an efficient and user-friendly information management system to handle the biomedical research data. This study evaluated the implementation of an information management system, which was introduced as part of the collaborative research to increase scientific productivity in a research laboratory. Laboratory members seemed to exhibit frustration during the implementation process. However, empirical findings revealed that they gained new knowledge and completed specified tasks while working together with the new system. Hence, researchers are urged to persist and persevere when dealing with any new technology, including an information management system in a research laboratory environment. PMID:20543892

  9. 3D Printing in the Laboratory: Maximize Time and Funds with Customized and Open-Source Labware

    PubMed Central

    Coakley, Meghan; Hurt, Darrell E.

    2016-01-01

    3D printing, also known as additive manufacturing, is the computer-guided process of fabricating physical objects by depositing successive layers of material. It has transformed manufacturing across virtually every industry, bringing about incredible advances in research and medicine. The rapidly growing consumer market now includes convenient and affordable “desktop” 3D printers. These are being used in the laboratory to create custom 3D-printed equipment, and a growing community of designers are contributing open-source, cost-effective innovations that can be used by both professionals and enthusiasts. User stories from investigators at the National Institutes of Health and the biomedical research community demonstrate the power of 3D printing to save valuable time and funding. While adoption of 3D printing has been slow in the biosciences to date, the potential is vast. The market predicts that within several years, 3D printers could be commonplace within the home; with so many practical uses for 3D printing, we anticipate that the technology will also play an increasingly important role in the laboratory. PMID:27197798

  10. Essential attributes identified in the design of a Laboratory Information Management System for a high throughput siRNA screening laboratory.

    PubMed

    Grandjean, Geoffrey; Graham, Ryan; Bartholomeusz, Geoffrey

    2011-11-01

    In recent years high throughput screening operations have become a critical application in functional and translational research. Although a seemingly unmanageable amount of data is generated by these high-throughput, large-scale techniques, through careful planning, an effective Laboratory Information Management System (LIMS) can be developed and implemented in order to streamline all phases of a workflow. Just as important as data mining and analysis procedures at the end of complex processes is the tracking of individual steps of applications that generate such data. Ultimately, the use of a customized LIMS will enable users to extract meaningful results from large datasets while trusting the robustness of their assays. To illustrate the design of a custom LIMS, this practical example is provided to highlight the important aspects of the design of a LIMS to effectively modulate all aspects of an siRNA screening service. This system incorporates inventory management, control of workflow, data handling and interaction with investigators, statisticians and administrators. All these modules are regulated in a synchronous manner within the LIMS. © 2011 Bentham Science Publishers
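
    A minimal sketch of how such synchronized modules might be composed around one coordinating object follows; the module names and fields are illustrative, not the authors' schema.

    ```python
    # Minimal sketch of LIMS modules regulated through a single coordinating
    # object; module responsibilities mirror those named above, but all names
    # and fields are illustrative assumptions.
    from dataclasses import dataclass, field

    @dataclass
    class Inventory:
        plates: dict = field(default_factory=dict)   # plate_id -> siRNA library

    @dataclass
    class Workflow:
        steps: list = field(default_factory=list)    # (sample_id, step) history
        def advance(self, sample_id, step):
            self.steps.append((sample_id, step))     # track every workflow step

    @dataclass
    class LIMS:
        inventory: Inventory = field(default_factory=Inventory)
        workflow: Workflow = field(default_factory=Workflow)
        results: dict = field(default_factory=dict)  # sample_id -> readouts
        def record_result(self, sample_id, readout):
            self.results.setdefault(sample_id, []).append(readout)

    lims = LIMS()
    lims.inventory.plates["P001"] = "kinase siRNA library"
    lims.workflow.advance("S42", "transfection")
    lims.record_result("S42", {"viability": 0.63})
    ```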

  11. Organization of Biomedical Data for Collaborative Scientific Research: A Research Information Management System.

    PubMed

    Myneni, Sahiti; Patel, Vimla L

    2010-06-01

    Biomedical researchers often work with massive, detailed and heterogeneous datasets. These datasets raise new challenges of information organization and management for scientific interpretation, as they demand much of the researchers' time and attention. The current study investigated the nature of the problems that researchers face when dealing with such data. Four major problems identified with existing biomedical scientific information management methods were related to data organization, data sharing, collaboration, and publications. Therefore, there is a compelling need to develop an efficient and user-friendly information management system to handle the biomedical research data. This study evaluated the implementation of an information management system, which was introduced as part of the collaborative research to increase scientific productivity in a research laboratory. Laboratory members seemed to exhibit frustration during the implementation process. However, empirical findings revealed that they gained new knowledge and completed specified tasks while working together with the new system. Hence, researchers are urged to persist and persevere when dealing with any new technology, including an information management system in a research laboratory environment.

  12. U.S. Army Research Laboratory Image Enhancement Test Bed User’s Manual

    DTIC Science & Technology

    2013-07-01

    Reference fragments only (no abstract recovered): ARL-TR-6393, U.S. Army Research Laboratory: Adelphi, MD, March 2013; Young, S. Susan; Driggers, Ronald G., Superresolution Image Reconstruction; Young, S. Susan; Theilke, Matthew; Schuler, Jonathan M., Superresolution Performance for Undersampled Imagers, Optical Engineering 2005, 44 (01).

  13. A tailored 200 parameter VME based data acquisition system for IBA at the Lund Ion Beam Analysis Facility - Hardware and software

    NASA Astrophysics Data System (ADS)

    Elfman, Mikael; Ros, Linus; Kristiansson, Per; Nilsson, E. J. Charlotta; Pallon, Jan

    2016-03-01

    With the recent advances towards modern Ion Beam Analysis (IBA), going from one- or few-parameter detector systems to multi-parameter systems, it has been necessary to expand and replace the more than twenty-year-old CAMAC based system. A new VME multi-parameter (presently up to 200 channels) data acquisition and control system has been developed and implemented at the Lund Ion Beam Analysis Facility (LIBAF). The system is based on the VX-511 Single Board Computer (SBC), acting as master with arbiter functionality, and consists of standard VME modules like Analog to Digital Converters (ADCs), Charge to Digital Converters (QDCs), Time to Digital Converters (TDCs), scalers, IO-cards, high voltage and waveform units. The modules have been specially selected to support all of the present detector systems in the laboratory, with the option of future expansion. Typically, the detector systems consist of silicon strip detectors, silicon drift detectors and scintillator detectors, for detection of charged particles, X-rays and γ-rays. The data flow of the raw data buffers out from the VME bus to the final storage place on a 16 terabyte network attached storage disc (NAS-disc) is described. The acquisition process, remotely controlled over one of the SBC's Ethernet channels, is also discussed. The user interface is written in the Kmax software package, and is used to control the acquisition process as well as for advanced online and offline data analysis through a user-friendly graphical user interface (GUI). In this work the system implementation, layout and performance are presented. The user interface and possibilities for advanced offline analysis are also discussed and illustrated.
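
    The raw-buffer data flow described, module buffers read out over the VME bus and appended to NAS storage, can be sketched abstractly as below; the VmeModule interface is a hypothetical stand-in, since real readout goes through vendor drivers on the single-board computer.

    ```python
    # Abstract sketch of the described data flow: read raw buffers from VME
    # modules and append them, length-prefixed, to a NAS-mounted file. The
    # VmeModule interface is a hypothetical stand-in for vendor drivers.
    import struct
    import time

    class VmeModule:                       # stand-in for an ADC/QDC/TDC/scaler
        def __init__(self, base_address):
            self.base_address = base_address
        def read_buffer(self):
            return b""                     # real hardware returns raw event words

    def acquisition_loop(modules, path, duration_s=1.0):
        t_end = time.time() + duration_s
        with open(path, "ab") as storage:  # NAS-mounted output file
            while time.time() < t_end:
                for m in modules:
                    buf = m.read_buffer()
                    if buf:                # write module address, length, payload
                        storage.write(struct.pack("<II", m.base_address, len(buf)))
                        storage.write(buf)
                time.sleep(0.001)          # yield between polling passes

    acquisition_loop([VmeModule(0x400000), VmeModule(0x410000)], "run0001.dat")
    ```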

  14. Integration of analytical instruments with computer scripting.

    PubMed

    Carvalho, Matheus C

    2013-08-01

    Automation of laboratory routines aided by computer software enables high productivity and is the norm nowadays. However, the integration of different instruments made by different suppliers is still difficult, because to accomplish it, the user must have knowledge of electronics and/or low-level programming. An alternative approach is to control different instruments without an electronic connection between them, relying only on their software interface on a computer. This can be achieved through scripting, which is the emulation of user operations (mouse clicks and keyboard inputs) on the computer. The main advantages of this approach are its simplicity, which enables people with minimal knowledge of computer programming to employ it, and its universality, which enables the integration of instruments made by different suppliers, meaning that the user is totally free to choose the devices to be integrated. Therefore, scripting can be a useful, accessible, and economic solution for laboratory automation.
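
    A minimal example of this approach using the pyautogui package is shown below; the screen coordinates and field layout are assumptions that would be specific to each instrument's own software.

    ```python
    # Minimal GUI-scripting sketch with pyautogui: emulate the mouse clicks and
    # keystrokes an operator would perform in an instrument's control software.
    # All screen coordinates and the field layout are illustrative assumptions.
    import time
    import pyautogui

    pyautogui.click(x=120, y=240)                  # focus the sample-ID field
    pyautogui.write("SAMPLE-0042", interval=0.05)  # type an identifier
    pyautogui.press("tab")                         # move to the next field
    pyautogui.click(x=300, y=480)                  # press the software's Run button
    time.sleep(60)                                 # wait for the measurement
    pyautogui.hotkey("ctrl", "s")                  # save via the app's shortcut
    ```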

  15. [Experimental nuclear physics]. Annual report 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1988-05-01

    This is the May 1988 annual report of the Nuclear Physics Laboratory of the University of Washington. It contains chapters on astrophysics, giant resonances, heavy ion induced reactions, fundamental symmetries, polarization in nuclear reactions, medium energy reactions, accelerator mass spectrometry (AMS), research by outside users, Van de Graaff and ion sources, the Laboratory's booster linac project work, instrumentation, and computer systems. An appendix lists Laboratory personnel, Ph.D. degrees granted in the 1987-88 academic year, and publications. Refs., 27 figs., 4 tabs.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zwicker, Andrew P.; Bloom, Josh; Albertson, Robert

    3D printing has become popular for a variety of users, from industrial to the home hobbyist, to scientists and engineers interested in producing their own laboratory equipment. In order to determine the suitability of 3D printed parts for our plasma physics laboratory, we measured the accuracy, strength, vacuum compatibility, and electrical properties of pieces printed in plastic. The flexibility of rapidly creating custom parts has led to the 3D printer becoming an invaluable resource in our laboratory and is equally suitable for producing equipment for advanced undergraduate laboratories.

  17. User interface issues in supporting human-computer integrated scheduling

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.; Biefeld, Eric W.

    1991-01-01

    Explored here are the user interface problems encountered with the Operations Missions Planner (OMP) project at the Jet Propulsion Laboratory (JPL). OMP uses a unique iterative approach to planning that places additional requirements on the user interface, particularly to support system development and maintenance. These requirements are necessary to support the concepts of heuristically controlled search, in-progress assessment, and iterative refinement of the schedule. The techniques used to address the OMP interface needs are given.

  18. Tapping the Molecular Potential of Microalgae to Produce Biomass (JGI Seventh Annual User Meeting 2012: Genomics of Energy and Environment)

    ScienceCinema

    Sayre, Richard; Kyrpides, Nikos

    2018-05-03

    Richard Sayre, from Los Alamos National Laboratory, presents a talk titled "Tapping the Molecular Potential of Microalgae to Produce Biomass" at the JGI 7th Annual Users Meeting: Genomics of Energy & Environment Meeting on March 22, 2012 in Walnut Creek, California.

  19. The Linac Coherent Light Source

    DOE PAGES

    White, William E.; Robert, Aymeric; Dunne, Mike

    2015-05-01

    The Linac Coherent Light Source (LCLS) at the SLAC National Accelerator Laboratory was the first hard X-ray free-electron laser (FEL) to operate as a user facility. After five years of operation, LCLS is now a mature FEL user facility. Our personal views about opportunities and challenges inherent to these unique light sources are discussed.

  20. End-User Searching in a Large Library Network: A Case Study of Patent Attorneys.

    ERIC Educational Resources Information Center

    Vollaro, Alice J.; Hawkins, Donald T.

    1986-01-01

    Reports results of study of a group of end users (patent attorneys) doing their own online searching at AT&T Bell Laboratories. Highlights include DIALOG databases used by the attorneys, locations and searching modes, characteristics of patent attorney searchers, and problem areas. Questionnaire is appended. (5 references) (EJS)

  1. Tapping the Molecular Potential of Microalgae to Produce Biomass (JGI Seventh Annual User Meeting 2012: Genomics of Energy and Environment)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sayre, Richard; Kyrpides, Nikos

    2012-03-22

    Richard Sayre, from Los Alamos National Laboratory, presents a talk titled "Tapping the Molecular Potential of Microalgae to Produce Biomass" at the JGI 7th Annual Users Meeting: Genomics of Energy & Environment Meeting on March 22, 2012 in Walnut Creek, California.

  2. Estimating fire behavior with FIRECAST: user's manual

    Treesearch

    Jack D. Cohen

    1986-01-01

    FIRECAST is a computer program that estimates fire behavior in terms of six fire parameters. Required inputs vary depending on the outputs desired by the fire manager. Fuel model options available to users are these: Northern Forest Fire Laboratory (NFFL), National Fire Danger Rating System (NFDRS), and southern California brushland (SCAL). The program has been...

  3. Genome-Scale Discovery of Cell Wall Biosynthesis Genes in Populus (JGI Seventh Annual User Meeting 2012: Genomics of Energy and Environment)

    ScienceCinema

    Muchero, Wellington

    2018-01-15

    Wellington Muchero from Oak Ridge National Laboratory gives a talk titled "Discovery of Cell Wall Biosynthesis Genes in Populus" at the JGI 7th Annual Users Meeting: Genomics of Energy & Environment Meeting on March 22, 2012 in Walnut Creek, California.

  4. EMSL Quarterly Highlights Report Second Quarter, Fiscal Year 2010 (January 1, 2010 through March 31, 2010)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, Staci A.; Showalter, Mary Ann; Manke, Kristin L.

    2010-04-20

    The Environmental Molecular Sciences Laboratory (EMSL) is a U.S. Department of Energy (DOE) national scientific user facility located at Pacific Northwest National Laboratory (PNNL) in Richland, Washington. EMSL is operated by PNNL for the DOE-Office of Biological and Environmental Research. At one location, EMSL offers a comprehensive array of leading-edge resources and expertise. Access to the instrumentation and expertise is obtained on a peer-reviewed proposal basis. Staff members work with researchers to expedite access to these capabilities. The "EMSL Quarterly Highlights Report" documents current research and activities of EMSL staff and users.

  5. A 'smart' tube holder enables real-time sample monitoring in a standard lab centrifuge.

    PubMed

    Hoang, Tony; Moskwa, Nicholas; Halvorsen, Ken

    2018-01-01

    The centrifuge is among the oldest and most widely used pieces of laboratory equipment, with significant applications that include clinical diagnostics and biomedical research. A major limitation of laboratory centrifuges is their "black box" nature, limiting sample observation to before and after centrifugation. Thus, optimized protocols require significant trial and error, while unoptimized protocols waste time by centrifuging longer than necessary or material due to incomplete sedimentation. Here, we developed an instrumented centrifuge tube receptacle compatible with several commercial benchtop centrifuges that can provide real-time sample analysis during centrifugation. We demonstrated the system by monitoring cell separations during centrifugation for different spin speeds, concentrations, buffers, cell types, and temperatures. We show that the collected data are valuable for analytical purposes (e.g. quality control), or as feedback to the user or the instrument. For the latter, we verified an adaptation where complete sedimentation turned off the centrifuge and notified the user by a text message. Our system adds new functionality to existing laboratory centrifuges, saving users time and providing useful feedback. This add-on potentially enables new analytical applications for an instrument that has remained largely unchanged for decades.
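
    The feedback adaptation described, stopping the run and notifying the user once sedimentation completes, amounts to a simple control loop, as in this sketch; the sensor, centrifuge, and messaging interfaces are hypothetical stand-ins for the authors' hardware.

    ```python
    # Control-loop sketch of the described adaptation: monitor sedimentation in
    # real time, stop the spin when complete, and text the user. The sensor,
    # centrifuge, and SMS interfaces are hypothetical stand-ins.
    import time

    def sedimentation_complete(signal, baseline, threshold=0.98):
        """Treat a supernatant signal near its clear-buffer baseline as done."""
        return signal >= threshold * baseline

    def monitor(read_signal, stop_centrifuge, send_sms, baseline, poll_s=1.0):
        while True:
            if sedimentation_complete(read_signal(), baseline):
                stop_centrifuge()                  # end the run early
                send_sms("Sedimentation complete; centrifuge stopped.")
                return
            time.sleep(poll_s)                     # poll the optical sensor

    # Demo with simulated hardware:
    readings = iter([0.20, 0.55, 0.83, 0.99])
    monitor(lambda: next(readings), lambda: print("stopped"),
            lambda msg: print("SMS:", msg), baseline=1.0, poll_s=0.0)
    ```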

  6. A ‘smart’ tube holder enables real-time sample monitoring in a standard lab centrifuge

    PubMed Central

    Hoang, Tony; Moskwa, Nicholas

    2018-01-01

    The centrifuge is among the oldest and most widely used pieces of laboratory equipment, with significant applications that include clinical diagnostics and biomedical research. A major limitation of laboratory centrifuges is their “black box” nature, limiting sample observation to before and after centrifugation. Thus, optimized protocols require significant trial and error, while unoptimized protocols waste time by centrifuging longer than necessary or material due to incomplete sedimentation. Here, we developed an instrumented centrifuge tube receptacle compatible with several commercial benchtop centrifuges that can provide real-time sample analysis during centrifugation. We demonstrated the system by monitoring cell separations during centrifugation for different spin speeds, concentrations, buffers, cell types, and temperatures. We show that the collected data are valuable for analytical purposes (e.g. quality control), or as feedback to the user or the instrument. For the latter, we verified an adaptation where complete sedimentation turned off the centrifuge and notified the user by a text message. Our system adds new functionality to existing laboratory centrifuges, saving users time and providing useful feedback. This add-on potentially enables new analytical applications for an instrument that has remained largely unchanged for decades. PMID:29659624

  7. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1992-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  8. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1993-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  9. A PDA study management tool (SMT) utilizing wireless broadband and full DICOM viewing capability

    NASA Astrophysics Data System (ADS)

    Documet, Jorge; Liu, Brent; Zhou, Zheng; Huang, H. K.; Documet, Luis

    2007-03-01

    During the last 4 years, the IPI (Image Processing and Informatics) Laboratory has been developing a web-based Study Management Tool (SMT) application that allows radiologists, film librarians and PACS-related (Picture Archiving and Communication System) users to dynamically and remotely perform Query/Retrieve operations in a PACS network. Using a regular PDA (Personal Digital Assistant), users can remotely query a PACS archive to distribute any study to an existing DICOM (Digital Imaging and Communications in Medicine) node. This application, which has proven convenient for managing the study workflow [1, 2], has been extended to include a DICOM viewing capability on the PDA. With this new feature, users can take a quick view of DICOM images, providing them mobility and convenience at the same time. In addition, we are extending this application to metropolitan-area wireless broadband networks. This feature requires smart phones that are capable of working as a PDA and have access to broadband wireless services. With the extended application to wireless broadband technology and the preview of DICOM images, the Study Management Tool becomes an even more powerful tool for clinical workflow management.
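
    The underlying DICOM Query/Retrieve step can be sketched with the open-source pynetdicom library; the archive host, port, and AE title below are placeholders, and this is a generic C-FIND example rather than the IPI Laboratory's implementation.

    ```python
    # Generic DICOM C-FIND (query) sketch with pydicom/pynetdicom; the host,
    # port, and AE title are placeholders, and this is not the actual SMT code.
    from pydicom.dataset import Dataset
    from pynetdicom import AE
    from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelFind

    ae = AE(ae_title="SMT_PDA")
    ae.add_requested_context(StudyRootQueryRetrieveInformationModelFind)

    query = Dataset()
    query.QueryRetrieveLevel = "STUDY"
    query.PatientName = "DOE^JOHN"     # search key
    query.StudyInstanceUID = ""        # empty element = requested return field

    assoc = ae.associate("pacs.example.org", 104)  # placeholder archive
    if assoc.is_established:
        responses = assoc.send_c_find(query,
                                      StudyRootQueryRetrieveInformationModelFind)
        for status, identifier in responses:
            if status and status.Status in (0xFF00, 0xFF01):  # pending = match
                print(identifier.StudyInstanceUID)
        assoc.release()
    ```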

  10. Effect of input compression and input frequency response on music perception in cochlear implant users.

    PubMed

    Halliwell, Emily R; Jones, Linor L; Fraser, Matthew; Lockley, Morag; Hill-Feltham, Penelope; McKay, Colette M

    2015-06-01

    A study was conducted to determine whether modifications to input compression and input frequency response characteristics can improve music-listening satisfaction in cochlear implant users. Experiment 1 compared three pre-processed versions of music and speech stimuli in a laboratory setting: original, compressed, and flattened frequency response. Music excerpts comprised three music genres (classical, country, and jazz), and a running speech excerpt was compared. Experiment 2 implemented a flattened input frequency response in the speech processor program. In a take-home trial, participants compared unaltered and flattened frequency responses. Ten and twelve adult Nucleus Freedom cochlear implant users participated in Experiments 1 and 2, respectively. Experiment 1 revealed a significant preference for music stimuli with a flattened frequency response compared to both original and compressed stimuli, whereas there was a significant preference for the original (rising) frequency response for speech stimuli. Experiment 2 revealed no significant mean preference for the flattened frequency response, with 9 of 11 subjects preferring the rising frequency response. Input compression did not alter music enjoyment. Comparison of the two experiments indicated that individual frequency response preferences may depend on the genre or familiarity, and particularly whether the music contained lyrics.

  11. Subsurface Transport Over Multiple Phases Demonstration Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-01-05

    The STOMP simulator is a suite of numerical simulators developed by Pacific Northwest National Laboratory for addressing problems involving coupled multifluid hydrologic, thermal, geochemical, and geomechanical processes in the subsurface. The simulator has been applied to problems concerning environmental remediation, environmental stewardship, carbon sequestration, conventional petroleum production, and the production of unconventional hydrocarbon fuels. The simulator is copyrighted by Battelle Memorial Institute, and is available outside of PNNL via use agreements. To promote the open exchange of scientific ideas the simulator is provided as source code. A demonstration version of the simulator has been developed, which will provide potential new users with an executable (not source code) implementation of the software royalty free. Demonstration versions will be offered via the STOMP website for all currently available operational modes of the simulator. The demonstration versions of the simulator will be configured with the direct banded linear system solver and have a limit of 1,000 active grid cells. This will provide potential new users with an opportunity to apply the code to simple problems, including many of the STOMP short course problems, without having to pay a license fee. Users will be required to register on the STOMP website prior to receiving an executable.

  12. The Development of a Microbial Challenge Test with Acholeplasma laidlawii To Rate Mycoplasma-Retentive Filters by Filter Manufacturers.

    PubMed

    Folmsbee, Martha; Lentine, Kerry Roche; Wright, Christine; Haake, Gerhard; Mcburnie, Leesa; Ashtekar, Dilip; Beck, Brian; Hutchison, Nick; Okhio-Seaman, Laura; Potts, Barbara; Pawar, Vinayak; Windsor, Helena

    2014-01-01

    Mycoplasma are bacteria that can penetrate 0.2 and 0.22 μm rated sterilizing-grade filters and even some 0.1 μm rated filters. Primary applications for mycoplasma filtration include large scale mammalian and bacterial cell culture media and serum filtration. The Parenteral Drug Association recognized the absence of standard industry test parameters for testing and classifying 0.1 μm rated filters for mycoplasma clearance and formed a task force to formulate consensus test parameters. The task force established some test parameters by common agreement, based upon general industry practices, without the need for additional testing. However, the culture medium and incubation conditions, for generating test mycoplasma cells, varied from filter company to filter company and was recognized as a serious gap by the task force. Standardization of the culture medium and incubation conditions required collaborative testing in both commercial filter company laboratories and in an Independent laboratory (Table I). The use of consensus test parameters will facilitate the ultimate cross-industry goal of standardization of 0.1 μm filter claims for mycoplasma clearance. However, it is still important to recognize filter performance will depend on the actual conditions of use. Therefore end users should consider, using a risk-based approach, whether process-specific evaluation of filter performance may be warranted for their application. Mycoplasma are small bacteria that have the ability to penetrate sterilizing-grade filters. Filtration of large-scale mammalian and bacterial cell culture media is an example of an industry process where effective filtration of mycoplasma is required. The Parenteral Drug Association recognized the absence of industry standard test parameters for evaluating mycoplasma clearance filters by filter manufacturers and formed a task force to formulate such a consensus among manufacturers. The use of standardized test parameters by filter manufacturers, including the preparation of the culture broth, will facilitate the end user's evaluation of the mycoplasma clearance claims provided by filter vendors. However, it is still important to recognize filter performance will depend on the actual conditions of use; therefore end users should consider, using a risk-based approach, whether process-specific evaluation of filter performance may be warranted for their application. © PDA, Inc. 2014.

  13. Analytical performance, agreement and user-friendliness of six point-of-care testing urine analysers for urinary tract infection in general practice

    PubMed Central

    Schot, Marjolein J C; van Delft, Sanne; Kooijman-Buiting, Antoinette M J; de Wit, Niek J; Hopstaken, Rogier M

    2015-01-01

    Objective Various point-of-care testing (POCT) urine analysers are commercially available for routine urine analysis in general practice. The present study compares analytical performance, agreement and user-friendliness of six different POCT urine analysers for diagnosing urinary tract infection in general practice. Setting All testing procedures were performed at a diagnostic centre for primary care in the Netherlands. Urine samples were collected at four general practices. Primary and secondary outcome measures Analytical performance and agreement of the POCT analysers regarding nitrite, leucocytes and erythrocytes, with the laboratory reference standard, was the primary outcome measure, and analysed by calculating sensitivity, specificity, positive and negative predictive value, and Cohen's κ coefficient for agreement. Secondary outcome measures were the user-friendliness of the POCT analysers, in addition to other characteristics of the analysers. Results The following six POCT analysers were evaluated: Uryxxon Relax (Macherey Nagel), Urisys 1100 (Roche), Clinitek Status (Siemens), Aution 11 (Menarini), Aution Micro (Menarini) and Urilyzer (Analyticon). Analytical performance was good for all analysers. Compared with laboratory reference standards, overall agreement was good, but differed per parameter and per analyser. Concerning the nitrite test, the most important test for clinical practice, all but one showed perfect agreement with the laboratory standard. For leucocytes and erythrocytes specificity was high, but sensitivity was considerably lower. Agreement for leucocytes varied between good to very good, and for the erythrocyte test between fair and good. First-time users indicated that the analysers were easy to use. They expected higher productivity and accuracy when using these analysers in daily practice. Conclusions The overall performance and user-friendliness of all six commercially available POCT urine analysers was sufficient to justify routine use in suspected urinary tract infections in general practice. PMID:25986635
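
    Each agreement statistic reported here follows directly from an analyser's 2x2 table against the laboratory reference, as the sketch below shows (the counts are made-up examples, not study data).

    ```python
    # Sensitivity, specificity, predictive values, and Cohen's kappa from a
    # 2x2 table of POCT result vs. laboratory reference. Counts are made up.
    def diagnostics(tp, fp, fn, tn):
        n = tp + fp + fn + tn
        sens = tp / (tp + fn)                 # true positive rate
        spec = tn / (tn + fp)                 # true negative rate
        ppv = tp / (tp + fp)                  # positive predictive value
        npv = tn / (tn + fn)                  # negative predictive value
        observed = (tp + tn) / n
        # chance agreement from the marginal totals
        expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
        kappa = (observed - expected) / (1 - expected)
        return sens, spec, ppv, npv, kappa

    sens, spec, ppv, npv, kappa = diagnostics(tp=38, fp=3, fn=2, tn=157)
    print(f"sens={sens:.2f} spec={spec:.2f} PPV={ppv:.2f} "
          f"NPV={npv:.2f} kappa={kappa:.2f}")
    ```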

  14. User interface development and metadata considerations for the Atmospheric Radiation Measurement (ARM) archive

    NASA Technical Reports Server (NTRS)

    Singley, P. T.; Bell, J. D.; Daugherty, P. F.; Hubbs, C. A.; Tuggle, J. G.

    1993-01-01

    This paper will discuss user interface development and the structure and use of metadata for the Atmospheric Radiation Measurement (ARM) Archive. The ARM Archive, located at Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee, is the data repository for the U.S. Department of Energy's (DOE's) ARM Project. After a short description of the ARM Project and the ARM Archive's role, we will consider the philosophy and goals, constraints, and prototype implementation of the user interface for the archive. We will also describe the metadata that are stored at the archive and support the user interface.

  15. Safety Sensor Testing Laboratory | Hydrogen and Fuel Cells | NREL

    Science.gov Websites

    NREL's Safety Sensor Testing Laboratory supports collaborations, trainings and workshops, and academic research and development. Work in the laboratory covers environmental parameters (temperature, pressure, and relative humidity) and gas parameters (flow and composition). The laboratory offers quantitative sensor services to assist end-users with sensor selection and use, and assists developers in quantitative assessment of sensor performance.

  16. Appropriate Use Policy | High-Performance Computing | NREL

    Science.gov Websites

    Describes the appropriate use policy for users of the National Renewable Energy Laboratory (NREL) High Performance Computing (HPC) resources, including terms that apply whether the user is from a government agency, National Laboratory, University, or private entity; intellectual property terms; and authentication requirements (users are issued a multifactor token, which may be a physical token or a virtual token used with a one-time password).

  17. High-throughput neuroimaging-genetics computational infrastructure

    PubMed Central

    Dinov, Ivo D.; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Hobel, Sam; Vespa, Paul; Woo Moon, Seok; Van Horn, John D.; Franco, Joseph; Toga, Arthur W.

    2014-01-01

    Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining, and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate, and disseminate novel scientific methods, computational resources, and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval, and aggregation. Computational processing involves the necessary software, hardware, and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical, and phenotypic data and meta-data. Data mining refers to the process of automatically extracting data features, characteristics and associations, which are not readily visible by human exploration of the raw dataset. Result interpretation includes scientific visualization, community validation of findings and reproducibility of findings. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at the University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. In addition, the institute provides a large number of software tools for image and shape analysis, mathematical modeling, genomic sequence processing, and scientific visualization. A unique feature of this architecture is the Pipeline environment, which integrates data management, processing, transfer, and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize diverse suites of software tools and web-services. These pipeline workflows are represented as portable XML objects, which transfer the execution instructions and user specifications from the client user machine to remote pipeline servers for distributed computing. Using Alzheimer's and Parkinson's data, we provide several examples of translational applications using this infrastructure. PMID:24795619

  18. User roles and contributions during the new product development process in collaborative innovation communities.

    PubMed

    Guo, Wei; Zheng, Qing; An, Weijin; Peng, Wei

    2017-09-01

    Collaborative innovation (co-innovation) community emerges as a new product design platform where companies involve users in the new product development (NPD) process. Large numbers of users participate and contribute to the process voluntarily. This exploratory study investigates the heterogeneous roles of users based on a global co-innovation project in online community. Content analysis, social network analysis and cluster method are employed to measure user behaviors, distinguish user roles, and analyze user contributions. The study identifies six user roles that emerge during the NPD process in co-innovation community: project leader, active designer, generalist, communicator, passive designer, and observer. The six user roles differ in their contribution forms and quality. This paper contributes to research on co-innovation in online communities, including design team structure, user roles and their contribution to design task and solution, as well as user value along the process. In addition, the study provides practices guidance on implementing project, attracting users, and designing platform for co-innovation community practitioners. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Students Using a Novel Web-Based Laboratory Class Support System: A Case Study in Food Chemistry Education

    ERIC Educational Resources Information Center

    van der Kolk, Koos; Beldman, Gerrit; Hartog, Rob; Gruppen, Harry

    2012-01-01

    The design, usage, and evaluation of a Web-based laboratory manual (WebLM) are described. The main aim of the WebLM is to support students while working in the laboratory by providing them with just-in-time information. Design guidelines for this electronic manual were derived from literature on cognitive load and user interface design. The WebLM…

  20. Electronic laboratory notebook: the academic point of view.

    PubMed

    Rudolphi, Felix; Goossen, Lukas J

    2012-02-27

    Based on a requirement analysis and alternative design considerations, a platform-independent electronic laboratory notebook (ELN) has been developed that specifically targets academic users. Its intuitive design and numerous productivity features motivate chemical researchers and students to record their data electronically. The data are stored in a highly structured form that offers substantial benefits over laboratory notebooks written on paper with regard to data retrieval, data mining, and exchange of results.

  1. Electronic laboratory system reduces errors in National Tuberculosis Program: a cluster randomized controlled trial.

    PubMed

    Blaya, J A; Shin, S S; Yale, G; Suarez, C; Asencios, L; Contreras, C; Rodriguez, P; Kim, J; Cegielski, P; Fraser, H S F

    2010-08-01

    To evaluate the impact of the e-Chasqui laboratory information system in reducing reporting errors compared to the current paper system. Cluster randomized controlled trial in 76 health centers (HCs) between 2004 and 2008. Baseline data were collected every 4 months for 12 months. HCs were then randomly assigned to intervention (e-Chasqui) or control (paper). Further data were collected for the same months the following year. Comparisons were made between intervention and control HCs, and before and after the intervention. Intervention HCs had respectively 82% and 87% fewer errors in reporting results for drug susceptibility tests (2.1% vs. 11.9%, P = 0.001, OR 0.17, 95%CI 0.09-0.31) and cultures (2.0% vs. 15.1%, P < 0.001, OR 0.13, 95%CI 0.07-0.24), than control HCs. Preventing missing results through online viewing accounted for at least 72% of all errors. e-Chasqui users sent on average three electronic error reports per week to the laboratories. e-Chasqui reduced the number of missing laboratory results at point-of-care health centers. Clinical users confirmed viewing electronic results not available on paper. Reporting errors to the laboratory using e-Chasqui promoted continuous quality improvement. The e-Chasqui laboratory information system is an important part of laboratory infrastructure improvements to support multidrug-resistant tuberculosis care in Peru.

  2. Battlefield decision aid for acoustical ground sensors with interface to meteorological data sources

    NASA Astrophysics Data System (ADS)

    Wilson, D. Keith; Noble, John M.; VanAartsen, Bruce H.; Szeto, Gregory L.

    2001-08-01

    The performance of acoustical ground sensors depends heavily on the local atmospheric and terrain conditions. This paper describes a prototype physics-based decision aid, called the Acoustic Battlefield Aid (ABFA), for predicting these environmental effects. ABFA integrates advanced models for acoustic propagation, atmospheric structure, and array signal processing into a convenient graphical user interface. The propagation calculations are performed in the frequency domain on user-definable target spectra. The solution method involves a parabolic approximation to the wave equation combined with a terrain diffraction model. Sensor performance is characterized with Cramer-Rao lower bounds (CRLBs). The CRLB calculations include randomization of signal energy and wavefront orientation resulting from atmospheric turbulence. Available performance characterizations include signal-to-noise ratio, probability of detection, direction-finding accuracy for isolated receiving arrays, and location-finding accuracy for networked receiving arrays. A suite of integrated tools allows users to create new target descriptions from standard digitized audio files and to design new sensor array layouts. These tools optionally interface with the ARL Database/Automatic Target Recognition (ATR) Laboratory, providing access to an extensive library of target signatures. ABFA also includes a Java-based capability for network access of near real-time data from surface weather stations or forecasts from the Army's Integrated Meteorological System. As an example, the detection footprint of an acoustical sensor, as it evolves over a 13-hour period, is calculated.
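
    For reference, the bound used here to characterize sensor performance states that any unbiased estimator of a parameter such as bearing angle has variance no smaller than the inverse Fisher information; this is the standard statement, not an ABFA-specific formula.

    ```latex
    % Cramer-Rao lower bound for an unbiased estimator \hat{\theta} of \theta:
    \operatorname{var}(\hat{\theta}) \;\ge\; \frac{1}{\mathcal{I}(\theta)},
    \qquad
    \mathcal{I}(\theta) = \mathbb{E}\!\left[\left(
      \frac{\partial}{\partial\theta} \ln p(\mathbf{x};\theta)\right)^{2}\right]
    ```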

  3. A SOAP Web Service for accessing MODIS land product subsets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SanthanaVannan, Suresh K; Cook, Robert B; Pan, Jerry Yun

    2011-01-01

    Remote sensing data from satellites have provided valuable information on the state of the earth for several decades. Since March 2000, the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor on board NASA's Terra and Aqua satellites has been providing estimates of several land parameters useful in understanding earth system processes at global, continental, and regional scales. However, the HDF-EOS file format, specialized software needed to process the HDF-EOS files, data volume, and the high spatial and temporal resolution of MODIS data make it difficult for users wanting to extract small but valuable amounts of information from the MODIS record. To overcome this usability issue, the NASA-funded Distributed Active Archive Center (DAAC) for Biogeochemical Dynamics at Oak Ridge National Laboratory (ORNL) developed a Web service that provides subsets of MODIS land products using Simple Object Access Protocol (SOAP). The ORNL DAAC MODIS subsetting Web service is a unique way of serving satellite data that exploits a fairly established and popular Internet protocol to allow users access to massive amounts of remote sensing data. The Web service provides MODIS land product subsets up to 201 x 201 km in a non-proprietary comma-delimited text file format. Users can programmatically query the Web service to extract MODIS land parameters for real-time data integration into models or decision support tools, or connect to workflow software. Information regarding the MODIS SOAP subsetting Web service is available on the World Wide Web (WWW) at http://daac.ornl.gov/modiswebservice.
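
    A SOAP client for such a service needs only a few lines, as the sketch below using the zeep library shows; the WSDL location and the operation name and argument order are assumptions based on the description above, so the service documentation should be consulted for the real interface.

    ```python
    # Sketch of calling a MODIS land-product subset SOAP service with zeep.
    # The WSDL URL and the operation name/signature are assumptions based on
    # the service description, not the documented interface.
    from zeep import Client

    WSDL = "https://daac.ornl.gov/MODIS/modiswebservice.wsdl"  # assumed location
    client = Client(WSDL)

    # Request a small spatial subset of one MODIS product around a site.
    subset = client.service.getsubset(          # assumed operation name
        40.03, -105.55,                  # latitude, longitude
        "MOD13Q1", "250m_16_days_NDVI",  # product and band (assumed order)
        "A2010001", "A2010032",          # start/end in MODIS composite format
        1, 1,                            # km above/below and left/right of site
    )
    print(subset)                        # comma-delimited subset rows
    ```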

  4. A Robust Camera-Based Interface for Mobile Entertainment

    PubMed Central

    Roig-Maimó, Maria Francesca; Manresa-Yee, Cristina; Varona, Javier

    2016-01-01

    Camera-based interfaces in mobile devices are starting to be used in games and apps, but few works have evaluated them in terms of usability or user perception. Due to the changing nature of mobile contexts, this evaluation requires extensive studies to consider the full spectrum of potential users and contexts. However, previous works usually evaluate these interfaces in controlled environments such as laboratory conditions; therefore, the findings cannot be generalized to real users and real contexts. In this work, we present a robust camera-based interface for mobile entertainment. The interface detects and tracks the user's head by processing the frames provided by the mobile device's front camera, and its position is then used to interact with the mobile apps. First, we evaluate the interface as a pointing device to study its accuracy, and different factors to configure such as the gain or the device's orientation, as well as the optimal target size for the interface. Second, we present an in-the-wild study to evaluate the usage and the user's perception when playing a game controlled by head motion. Finally, the game is published in an application store to make it available to a large number of potential users and contexts, and we register usage data. Results show the feasibility of using this robust camera-based interface for mobile entertainment in different contexts and by different people. PMID:26907288
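
    The detect-and-track step can be approximated with OpenCV's bundled face detector, as in the generic sketch below; this illustrates the camera-based pointing idea rather than the authors' tracker.

    ```python
    # Generic head-position sketch with OpenCV's bundled Haar face detector;
    # the face centre stands in for the head position that drives the pointer.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cam = cv2.VideoCapture(0)                 # front-facing camera

    while True:
        ok, frame = cam.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            x, y, w, h = faces[0]
            cx, cy = x + w // 2, y + h // 2   # head centre drives the pointer
            print("pointer position:", cx, cy)
        cv2.imshow("preview", frame)
        if cv2.waitKey(1) == 27:              # Esc to quit
            break
    cam.release()
    ```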

  5. Sally Ride EarthKAM - Automated Image Geo-Referencing Using Google Earth Web Plug-In

    NASA Technical Reports Server (NTRS)

    Andres, Paul M.; Lazar, Dennis K.; Thames, Robert Q.

    2013-01-01

    Sally Ride EarthKAM is an educational program funded by NASA that aims to provide the public the ability to picture Earth from the perspective of the International Space Station (ISS). A computer-controlled camera is mounted on the ISS in a nadir-pointing window; however, timing limitations in the system cause inaccurate positional metadata. Manually correcting images within an orbit allows the positional metadata to be improved using mathematical regressions. The manual correction process is time-consuming and thus unfeasible for a large number of images. The standard Google Earth program allows for the importing of previously created KML (Keyhole Markup Language) files. These KML file-based overlays could then be manually manipulated as image overlays, saved, and then uploaded to the project server, where they are parsed and the metadata in the database is updated. The new interface eliminates the need to save, download, open, re-save, and upload the KML files. Everything is processed on the Web, and all manipulations go directly into the database. Administrators also have the control to discard any single correction that was made and to validate a correction. This program streamlines a process that previously required several critical steps and was probably too complex for the average user to complete successfully. The new process is theoretically simple enough for members of the public to make use of and contribute to the success of the Sally Ride EarthKAM project. Using the Google Earth Web plug-in, EarthKAM images, and associated metadata, this software allows users to interactively manipulate an EarthKAM image overlay, and update and improve the associated metadata. The Web interface uses the Google Earth JavaScript API along with PHP-PostgreSQL to present the user the same interface capabilities without leaving the Web. The simpler graphical user interface will allow the public to participate directly and meaningfully with EarthKAM. The use of similar techniques is being investigated to place ground-based observations in a Google Mars environment, giving the MSL (Mars Science Laboratory) Science Team a means to visualize the rover and its environment.
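
    The overlays being manipulated are ordinary KML ground overlays. A minimal sketch of generating one programmatically is shown below; the image URL and bounding box are placeholders, not EarthKAM data.

      # Build a KML GroundOverlay of the kind the correction interface manipulates.
      # The href and LatLonBox values are illustrative placeholders.
      def ground_overlay_kml(name, href, north, south, east, west, rotation=0.0):
          return f"""<?xml version="1.0" encoding="UTF-8"?>
      <kml xmlns="http://www.opengis.net/kml/2.2">
        <GroundOverlay>
          <name>{name}</name>
          <Icon><href>{href}</href></Icon>
          <LatLonBox>
            <north>{north}</north><south>{south}</south>
            <east>{east}</east><west>{west}</west>
            <rotation>{rotation}</rotation>
          </LatLonBox>
        </GroundOverlay>
      </kml>"""

      print(ground_overlay_kml("EarthKAM frame (placeholder)",
                               "https://example.org/frame.jpg",
                               35.2, 34.1, -117.0, -118.4, rotation=12.5))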

  6. Status of stable isotope enrichment, products, and services at the Oak Ridge National Laboratory

    NASA Astrophysics Data System (ADS)

    Scott Aaron, W.; Tracy, Joe G.; Collins, Emory D.

    1997-02-01

    The Oak Ridge National Laboratory (ORNL) has been supplying enriched stable and radioactive isotopes to the research, medical, and industrial communities for over 50 years. Very significant changes have occurred in this effort over the past several years, and, while many of these changes have had a negative impact on the availability of enriched isotopes, more recent developments are actually improving the situation for both the users and the producers of enriched isotopes. ORNL is still a major producer and distributor of radioisotopes, but future isotope enrichment operations to be conducted at the Isotope Enrichment Facility (IEF) will be limited to stable isotopes. Among the positive changes in the enriched stable isotope area are a well-functioning, long-term contract program, which offers stability and pricing advantages; the resumption of calutron operations; the adoption of prorated conversion charges, which greatly improves the pricing of isotopes to small users; ISO 9002 registration of the IEF's quality management system; and a much more customer-oriented business philosophy. Efforts are also being made to restore and improve upon the extensive chemical and physical form processing capabilities that once existed in the enriched stable isotope program. Innovative ideas are being pursued in both technical and administrative areas to encourage the beneficial use of enriched stable isotopes and the development of related technologies.

  7. Neurovascular Network Explorer 2.0: A Simple Tool for Exploring and Sharing a Database of Optogenetically-evoked Vasomotion in Mouse Cortex In Vivo.

    PubMed

    Uhlirova, Hana; Tian, Peifang; Kılıç, Kıvılcım; Thunemann, Martin; Sridhar, Vishnu B; Chmelik, Radim; Bartsch, Hauke; Dale, Anders M; Devor, Anna; Saisan, Payam A

    2018-05-04

    The importance of sharing experimental data in neuroscience grows with the amount and complexity of data acquired and the various techniques used to obtain and process these data. However, the majority of experimental data, especially from individual studies in regular-sized laboratories, never reaches the wider research community. A graphical user interface (GUI) engine called Neurovascular Network Explorer 2.0 (NNE 2.0) has been created as a tool for simple and low-cost sharing and exploring of vascular imaging data. NNE 2.0 interacts with a database containing optogenetically-evoked dilation/constriction time-courses of individual vessels measured in mouse somatosensory cortex in vivo by 2-photon microscopy. NNE 2.0 enables selection and display of the time-courses based on different criteria (subject, branching order, cortical depth, vessel diameter, arteriolar tree) as well as simple mathematical manipulation (e.g. averaging, peak-normalization) and data export. It supports visualization of the vascular network in 3D and enables localization of the individual functional vessel diameter measurements within vascular trees. NNE 2.0, its source code, and the corresponding database are freely downloadable from the UCSD Neurovascular Imaging Laboratory website. The source code can be utilized by users to explore the associated database or as a template for databasing and sharing their own experimental results, provided the data are in the appropriate format.
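
    The "simple mathematical manipulation" mentioned above amounts to a few lines of array arithmetic. A minimal sketch with synthetic time-courses (not NNE 2.0 code):

      # Peak-normalize single-vessel diameter time-courses and average them.
      # The Gaussian "responses" below are synthetic stand-ins for database records.
      import numpy as np

      t = np.linspace(0, 20, 200)                      # time, s
      timecourses = [np.exp(-(t - 5) ** 2 / 2) * a for a in (1.0, 2.5, 4.0)]

      def peak_normalize(y):
          y = np.asarray(y, dtype=float)
          return y / np.abs(y).max()                   # scale the peak response to 1

      normalized = np.array([peak_normalize(y) for y in timecourses])
      mean_response = normalized.mean(axis=0)          # average across vessels
      print(round(mean_response.max(), 3))             # ~1.0 at the shared peak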

  8. Engineered nanomaterials: toward effective safety management in research laboratories.

    PubMed

    Groso, Amela; Petri-Fink, Alke; Rothen-Rutishauser, Barbara; Hofmann, Heinrich; Meyer, Thierry

    2016-03-15

    It is still unknown which types of nanomaterials and associated doses represent an actual danger to humans and the environment. Meanwhile, there is consensus on applying the precautionary principle to these novel materials until more information is available. To deal with the rapid evolution of research, including the fast turnover of collaborators, a user-friendly and easy-to-apply risk assessment tool offering adequate preventive and protective measures has to be provided. Based on new information concerning the hazards of engineered nanomaterials, we improved a previously developed risk assessment tool by following a simple scheme to gain in efficiency. In the first step, using a logical decision tree, one of three hazard levels, from H1 to H3, is assigned to the nanomaterial. Using a combination of decision trees and matrices, the second step links the hazard with the emission and exposure potential to assign one of three nanorisk levels (Nano 3 highest risk; Nano 1 lowest risk) to the activity, as sketched after this abstract. These operations are repeated at each process step, leading to the laboratory classification. The third step provides detailed preventive and protective measures for the determined level of nanorisk. We developed an adapted, simple, and intuitive method for nanomaterial risk management in research laboratories. It allows the classification of nanoactivities into three levels, additionally proposing concrete preventive and protective measures and associated actions. This method is a valuable tool for all participants in nanomaterial safety. Users experience an essential learning opportunity and increase their safety awareness. Laboratory managers have a reliable tool to obtain an overview of the operations involving nanomaterials in their laboratories; this is essential, as they are responsible for employee safety but are sometimes unaware of the work performed. Bringing this risk to a three-band scale (like other types of risks, such as biological, radiation, chemical, etc.) facilitates management for occupational health and safety specialists. Institute and school managers can obtain the necessary information to implement an adequate safety management system. Having an easy-to-use tool enables a dialog between all these partners, whose semantics and priorities in terms of safety often differ.
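
    The two-step scheme lends itself to a simple lookup structure. The sketch below shows that structure only; the hazard/exposure-to-risk mapping is a placeholder, not the published matrix.

      # Illustrative decision matrix: (hazard level, exposure potential) -> nanorisk.
      # The actual mapping is defined in the paper; this table is a placeholder.
      RISK_MATRIX = {
          ("H1", "low"): 1, ("H1", "medium"): 1, ("H1", "high"): 2,
          ("H2", "low"): 1, ("H2", "medium"): 2, ("H2", "high"): 3,
          ("H3", "low"): 2, ("H3", "medium"): 3, ("H3", "high"): 3,
      }

      def nanorisk(hazard, exposure):
          """Return the Nano 1..3 risk level for one process step."""
          return RISK_MATRIX[(hazard, exposure)]

      # The laboratory classification follows from its worst process step:
      steps = [("H2", "medium"), ("H3", "low"), ("H1", "high")]
      print("Nano", max(nanorisk(h, e) for h, e in steps))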

  9. Interlaboratory Reproducibility of Droplet Digital Polymerase Chain Reaction Using a New DNA Reference Material Format.

    PubMed

    Pinheiro, Leonardo B; O'Brien, Helen; Druce, Julian; Do, Hongdo; Kay, Pippa; Daniels, Marissa; You, Jingjing; Burke, Daniel; Griffiths, Kate; Emslie, Kerry R

    2017-11-07

    Use of droplet digital PCR technology (ddPCR) is expanding rapidly in both the diversity of applications and the number of users around the world. Access to relatively simple and affordable commercial ddPCR technology has attracted wide interest in the use of this technology as a molecular diagnostic tool. For ddPCR to transition effectively to a molecular diagnostic setting, processes for method validation and verification, together with demonstration of reproducible instrument performance, are required. In this study, we describe the development and characterization of a DNA reference material (NMI NA008 High GC reference material) comprising a challenging methylated GC-rich DNA template in a novel 96-well microplate format. A scalable process using high-precision acoustic dispensing technology was validated to produce the DNA reference material with a certified reference value expressed in amount of DNA molecules per well. An interlaboratory study, conducted using blinded NA008 High GC reference material to assess reproducibility among seven independent laboratories, demonstrated a reproducibility relative standard deviation of less than 4.5%. With the exclusion of one laboratory, the participating laboratories had appropriate technical competency, fully functional instrumentation, and suitable reagents to perform accurate ddPCR-based DNA quantification measurements at the time of the study. The study results confirmed that NA008 High GC reference material is fit for the purpose of being used for quality control of ddPCR systems, consumables, instrumentation, and workflow.
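
    The headline statistic here, reproducibility relative standard deviation, is simple to compute once each laboratory's mean measurement is in hand. A minimal sketch with invented numbers:

      # Interlaboratory reproducibility RSD: standard deviation of per-laboratory
      # mean copy-number results divided by the grand mean. Values are made up.
      import numpy as np

      lab_means = np.array([985.0, 1012.0, 997.0, 1031.0, 968.0, 1005.0, 990.0])

      grand_mean = lab_means.mean()
      rsd_percent = 100.0 * lab_means.std(ddof=1) / grand_mean
      print(f"interlaboratory RSD = {rsd_percent:.1f}%")  # study criterion: < 4.5%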

  10. Detecting the Norovirus Season in Sweden Using Search Engine Data – Meeting the Needs of Hospital Infection Control Teams

    PubMed Central

    Edelstein, Michael; Wallensten, Anders; Zetterqvist, Inga; Hulth, Anette

    2014-01-01

    Norovirus outbreaks severely disrupt healthcare systems. We evaluated whether Websök, an internet-based surveillance system using search engine data, improved norovirus surveillance and response in Sweden. We compared Websök users' characteristics with the general population, cross-correlated weekly Websök searches with laboratory notifications between 2006 and 2013, compared the time Websök and laboratory data crossed the epidemic threshold and surveyed infection control teams about their perception and use of Websök. Users of Websök were not representative of the general population. Websök correlated with laboratory data (b = 0.88-0.89) and gave an earlier signal to the onset of the norovirus season compared with laboratory-based surveillance. 17/21 (81%) infection control teams answered the survey, of which 11 (65%) believed Websök could help with infection control plans. Websök is a low-resource, easily replicable system that detects the norovirus season as reliably as laboratory data, but earlier. Using Websök in routine surveillance can help infection control teams prepare for the yearly norovirus season. PMID:24955857
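
    The cross-correlation step described above can be sketched in a few lines; the weekly series below are synthetic stand-ins for Websök search volumes and laboratory notifications.

      # Cross-correlate standardized weekly series and report the best lag.
      import numpy as np

      rng = np.random.default_rng(0)
      weeks = np.arange(104)
      lab = np.sin(2 * np.pi * weeks / 52) + rng.normal(0, 0.2, weeks.size)
      searches = np.roll(lab, -2) + rng.normal(0, 0.2, weeks.size)  # leads by ~2 weeks

      def standardize(x):
          return (x - x.mean()) / x.std()

      xc = np.correlate(standardize(searches), standardize(lab), "full") / weeks.size
      lags = np.arange(-weeks.size + 1, weeks.size)
      best = lags[np.argmax(xc)]                   # a negative lag means searches lead
      print(f"peak correlation {xc.max():.2f} at lag {best} weeks")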

  11. Web Delivery of Interactive Laboratories: Comparison of Three Authoring Tools.

    NASA Astrophysics Data System (ADS)

    Silbar, Richard R.

    2001-11-01

    It is well known that the more the end user (e.g., a student) interacts with a subject, the better he or she will learn it. This is particularly so in technical subjects. One way to do this is to have "laboratories" in which the student manipulates objects on the screen with keyboard or mouse and then sees the outcome of those actions. An example of such a laboratory can be seen at http://www.whistlesoft.com/silbar/demo/vecadd, which deals with the addition of two vectors in the geometric approach. This laboratory was built using Macromedia's Authorware. The problem with Authorware for this purpose is that, if one wants to deliver the training over the Web, it requires the download and installation of a big plug-in. As an experiment, I recently tried to build clones of the Vector Addition Laboratory using Macromedia's Director or Flash, each of which has a smaller plug-in that is often already installed in the user's browser. I was able to come up with Director and Flash versions that are similar to (but definitely not the same as) the Authorware version. This talk goes into these differences and demonstrates the techniques used.

  12. Detecting the norovirus season in Sweden using search engine data--meeting the needs of hospital infection control teams.

    PubMed

    Edelstein, Michael; Wallensten, Anders; Zetterqvist, Inga; Hulth, Anette

    2014-01-01

    Norovirus outbreaks severely disrupt healthcare systems. We evaluated whether Websök, an internet-based surveillance system using search engine data, improved norovirus surveillance and response in Sweden. We compared Websök users' characteristics with the general population, cross-correlated weekly Websök searches with laboratory notifications between 2006 and 2013, compared the time Websök and laboratory data crossed the epidemic threshold and surveyed infection control teams about their perception and use of Websök. Users of Websök were not representative of the general population. Websök correlated with laboratory data (b = 0.88-0.89) and gave an earlier signal to the onset of the norovirus season compared with laboratory-based surveillance. 17/21 (81%) infection control teams answered the survey, of which 11 (65%) believed Websök could help with infection control plans. Websök is a low-resource, easily replicable system that detects the norovirus season as reliably as laboratory data, but earlier. Using Websök in routine surveillance can help infection control teams prepare for the yearly norovirus season.

  13. Common HEP UNIX Environment

    NASA Astrophysics Data System (ADS)

    Taddei, Arnaud

    After it had been decided to design a common user environment for UNIX platforms among HEP laboratories, a joint project between DESY and CERN was started. The project consists of two phases: 1. Provide a common user environment at shell level; 2. Provide a common user environment at graphical level (X11). Phase 1 is in production at DESY and at CERN, as well as at PISA and RAL. It has been developed around the scripts originally designed at DESY Zeuthen, improved and extended during a 2-month project at CERN with a contribution from DESY Hamburg. It consists of a set of files which customize the environment for the 6 main shells (sh, csh, ksh, bash, tcsh, zsh) on the main platforms (AIX, HP-UX, IRIX, SunOS, Solaris 2, OSF/1, ULTRIX, etc.) and is divided into several "sociological" levels: HEP, site, machine, cluster, group of users, and user, with some levels being optional. The second phase is under design and a first proposal has been published. A first version of phase 2 already exists for AIX and Solaris, and it should be available for all other platforms by the time of the conference. This is a major collective work between several HEP laboratories involved in the HEPiX-scripts and HEPiX-X11 working groups.
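
    The layered customization can be pictured as a chain of overrides, resolved from the most general level to the most specific. A toy sketch of that resolution order follows (the contents are illustrative; the actual HEPiX scripts are shell files, not Python):

      # Resolve a layered environment: later ("sociological") levels override
      # earlier ones, and missing levels are simply optional.
      LEVELS = ["hep", "site", "machine", "cluster", "group", "user"]

      settings = {
          "hep":     {"PATH": "/hep/bin", "EDITOR": "vi"},
          "site":    {"PRINTER": "cern-b513"},
          "cluster": {"SCRATCH": "/scratch"},
          "user":    {"EDITOR": "emacs"},          # user overrides the HEP default
      }

      def resolve_environment(settings, levels=LEVELS):
          env = {}
          for level in levels:
              env.update(settings.get(level, {}))  # absent levels are skipped
          return env

      print(resolve_environment(settings))         # EDITOR resolves to "emacs"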

  14. A microfluidic device for preparing next generation DNA sequencing libraries and for automating other laboratory protocols that require one or more column chromatography steps.

    PubMed

    Tan, Swee Jin; Phan, Huan; Gerry, Benjamin Michael; Kuhn, Alexandre; Hong, Lewis Zuocheng; Min Ong, Yao; Poon, Polly Suk Yean; Unger, Marc Alexander; Jones, Robert C; Quake, Stephen R; Burkholder, William F

    2013-01-01

    Library preparation for next-generation DNA sequencing (NGS) remains a key bottleneck in the sequencing process which can be relieved through improved automation and miniaturization. We describe a microfluidic device for automating laboratory protocols that require one or more column chromatography steps and demonstrate its utility for preparing Next Generation sequencing libraries for the Illumina and Ion Torrent platforms. Sixteen different libraries can be generated simultaneously with significantly reduced reagent cost and hands-on time compared to manual library preparation. Using an appropriate column matrix and buffers, size selection can be performed on-chip following end-repair, dA tailing, and linker ligation, so that the libraries eluted from the chip are ready for sequencing. The core architecture of the device ensures uniform, reproducible column packing without user supervision and accommodates multiple routine protocol steps in any sequence, such as reagent mixing and incubation; column packing, loading, washing, elution, and regeneration; capture of eluted material for use as a substrate in a later step of the protocol; and removal of one column matrix so that two or more column matrices with different functional properties can be used in the same protocol. The microfluidic device is mounted on a plastic carrier so that reagents and products can be aliquoted and recovered using standard pipettors and liquid handling robots. The carrier-mounted device is operated using a benchtop controller that seals and operates the device with programmable temperature control, eliminating any requirement for the user to manually attach tubing or connectors. In addition to NGS library preparation, the device and controller are suitable for automating other time-consuming and error-prone laboratory protocols requiring column chromatography steps, such as chromatin immunoprecipitation.

  15. A Microfluidic Device for Preparing Next Generation DNA Sequencing Libraries and for Automating Other Laboratory Protocols That Require One or More Column Chromatography Steps

    PubMed Central

    Tan, Swee Jin; Phan, Huan; Gerry, Benjamin Michael; Kuhn, Alexandre; Hong, Lewis Zuocheng; Min Ong, Yao; Poon, Polly Suk Yean; Unger, Marc Alexander; Jones, Robert C.; Quake, Stephen R.; Burkholder, William F.

    2013-01-01

    Library preparation for next-generation DNA sequencing (NGS) remains a key bottleneck in the sequencing process which can be relieved through improved automation and miniaturization. We describe a microfluidic device for automating laboratory protocols that require one or more column chromatography steps and demonstrate its utility for preparing Next Generation sequencing libraries for the Illumina and Ion Torrent platforms. Sixteen different libraries can be generated simultaneously with significantly reduced reagent cost and hands-on time compared to manual library preparation. Using an appropriate column matrix and buffers, size selection can be performed on-chip following end-repair, dA tailing, and linker ligation, so that the libraries eluted from the chip are ready for sequencing. The core architecture of the device ensures uniform, reproducible column packing without user supervision and accommodates multiple routine protocol steps in any sequence, such as reagent mixing and incubation; column packing, loading, washing, elution, and regeneration; capture of eluted material for use as a substrate in a later step of the protocol; and removal of one column matrix so that two or more column matrices with different functional properties can be used in the same protocol. The microfluidic device is mounted on a plastic carrier so that reagents and products can be aliquoted and recovered using standard pipettors and liquid handling robots. The carrier-mounted device is operated using a benchtop controller that seals and operates the device with programmable temperature control, eliminating any requirement for the user to manually attach tubing or connectors. In addition to NGS library preparation, the device and controller are suitable for automating other time-consuming and error-prone laboratory protocols requiring column chromatography steps, such as chromatin immunoprecipitation. PMID:23894273

  16. Multiyear Program Plan for the High Temperature Materials Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arvid E. Pasto

    2000-03-17

    Recently, the U.S. Department of Energy's (DOE) Office of Heavy Vehicle Technologies (OHVT) prepared a Technology Roadmap describing the challenges facing development of higher fuel efficiency, less polluting sport utility vehicles, vans, and commercial trucks. Based on this roadmap, a multiyear program plan (MYPP) was also developed, in which approaches to solving the numerous challenges are enumerated. Additional planning has been performed by DOE and national laboratory staff on approaches to solving the numerous challenges faced by heavy vehicle system improvements. Workshops and planning documents have been developed concerning advanced aerodynamics, frictional and other parasitic losses, and thermal management. Similarly, the Heavy Vehicle Propulsion Materials Program has developed its own multiyear program plan. The High Temperature Materials Laboratory, a major user facility sponsored by OHVT, has now developed its program plan, described herein. Information was gathered via participation in the development of OHVT's overall Technology Roadmap and MYPP, through personal contacts within the materials-user community, and from attendance at conferences and expositions. Major materials issues for the heavy vehicle industry currently center on trying to increase the efficiency of (diesel) engines while at the same time reducing emissions (particularly NOx and particulates). These requirements dictate the use of increasingly stronger, higher-temperature-capable and more corrosion-resistant materials of construction, as well as advanced catalysts, particulate traps, and other pollution-control devices. Exhaust gas recirculation (EGR) is a technique which will certainly be applied to diesel engines in the near future, and its use represents a formidable challenge, as will be described later. Energy-efficient, low-cost materials processing methods and surface treatments to improve wear, fracture, and corrosion resistance are also required.

  17. Stand-alone polarization-modulation infrared reflection absorption spectroscopy instrument optimized for the study of catalytic processes at elevated pressures

    NASA Astrophysics Data System (ADS)

    Kestell, John D.; Mudiyanselage, Kumudu; Ye, Xinyi; Nam, Chang-Yong; Stacchiola, Dario; Sadowski, Jerzy; Boscoboinik, J. Anibal

    2017-10-01

    This paper describes the design and construction of a compact, "user-friendly" polarization-modulation infrared reflection absorption spectroscopy (PM-IRRAS) instrument at the Center for Functional Nanomaterials (CFN) of Brookhaven National Laboratory, which allows studying surfaces at pressures ranging from ultra-high vacuum to 100 Torr. Surface infrared spectroscopy is ideally suited for studying these processes, as the vibrational frequencies of the IR chromophores are sensitive to the nature of the bonding environment on the surface. Relying on the surface selection rules, by modulating the polarization of the incident light it is possible to separate the contributions of the isotropic gas or solution phase from the surface-bound species. A spectral frequency range between 1000 cm^-1 and 4000 cm^-1 can be acquired. While typical spectra with a good signal-to-noise ratio can be obtained at elevated pressures of gases in ~2 min at 4 cm^-1 resolution, we have also acquired higher-resolution spectra at 0.25 cm^-1 with longer acquisition times. By way of verification, CO uptake on a heavily oxidized Ru(0001) sample was studied. As part of this test study, the presence of CO adsorbed on Ru bridge sites was confirmed, in agreement with previous ambient-pressure X-ray photoelectron spectroscopy studies. In terms of instrument performance, it was also determined that the gas-phase contribution from CO could be completely removed even up to pressures close to 100 Torr. A second test study demonstrated the use of the technique for studying morphological properties of a spin-coated polymer on a conductive surface. Note that this is a novel application of this technique. In this experiment, the polarization of the incident light was modulated manually (vs. through a photoelastic modulator). It was demonstrated, in good agreement with the literature, that the polymer chains preferentially lie parallel to the surface. This PM-IRRAS system is small, modular, and easily reconfigurable. It also features a "vacuum suitcase" that allows for the integration of the PM-IRRAS system with the rest of the suite of instrumentation at our laboratory available to external users through the CFN user proposal system.
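
    At the data level, the polarization trick reduces to a normalized difference of the p- and s-polarized reflectances: the isotropic gas-phase absorption appears in both and cancels, while the surface species (visible only to p-polarized light, by the surface selection rule) survives. A synthetic-spectrum sketch:

      # PM-IRRAS signal sketch: (Rp - Rs) / (Rp + Rs) cancels the isotropic
      # gas-phase band and retains the surface-bound band. Spectra are synthetic.
      import numpy as np

      wn = np.linspace(1800, 2200, 400)                   # wavenumber, cm^-1
      gas = 0.30 * np.exp(-((wn - 2143) / 20.0) ** 2)     # gas-phase CO (isotropic)
      surface = 0.05 * np.exp(-((wn - 2000) / 8.0) ** 2)  # adsorbed CO (p only)

      r_p = 1.0 - gas - surface                           # p-polarized reflectance
      r_s = 1.0 - gas                                     # s sees no surface signal
      pm_signal = (r_p - r_s) / (r_p + r_s)               # gas contribution cancels

      print(wn[np.argmin(pm_signal)])                     # ~2000 cm^-1 surface band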

  18. LimsPortal and BonsaiLIMS: development of a lab information management system for translational medicine

    PubMed Central

    2011-01-01

    Background: Laboratory Information Management Systems (LIMS) are an increasingly important part of modern laboratory infrastructure. As typically very sophisticated software products, LIMS often require considerable resources to select, deploy and maintain. Larger organisations may have access to specialist IT support to assist with requirements elicitation and software customisation; however, smaller groups will often have limited IT support to perform the kind of iterative development that can resolve the difficulties that biologists often have when specifying requirements. Translational medicine aims to accelerate the process of treatment discovery by bringing together multiple disciplines to discover new approaches to treating disease, or novel applications of existing treatments. The diverse set of disciplines and the complexity of processing procedures involved, especially with the use of high-throughput technologies, bring difficulties in customizing a generic LIMS to provide a single system for managing sample-related data within a translational medicine research setting, especially where limited IT support is available. Results: We have designed and developed a LIMS, BonsaiLIMS, around a very simple data model that can be easily implemented using a variety of technologies, and can be easily extended as specific requirements dictate. A reference implementation using an Oracle 11g database and the Python framework Django is presented. Conclusions: By focusing on a minimal feature set and a modular design we have been able to deploy the BonsaiLIMS system very quickly. The benefits to our institute have been the avoidance of the prolonged implementation timescales, budget overruns, scope creep, off-specifications and user fatigue issues that typify many enterprise software implementations. The transition away from using local, uncontrolled records in spreadsheet and paper formats to a centrally held, secured and backed-up database brings the immediate benefits of improved data visibility, audit and overall data quality. The open-source availability of this software allows others to rapidly implement a LIMS which in itself might sufficiently address user requirements. In situations where this software does not meet requirements, it can serve to elicit more accurate specifications from end-users for a more heavyweight LIMS by acting as a demonstrable prototype. PMID:21569484
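
    As an indication of what "a very simple data model" can look like in the stack mentioned above, here is a minimal Django model sketch; the fields are illustrative assumptions, not the published BonsaiLIMS schema.

      # A minimal sample-tracking model in Django, in the spirit of BonsaiLIMS.
      from django.db import models

      class Sample(models.Model):
          barcode = models.CharField(max_length=32, unique=True)
          sample_type = models.CharField(max_length=64)          # e.g. plasma, RNA
          parent = models.ForeignKey("self", null=True, blank=True,
                                     on_delete=models.SET_NULL,
                                     related_name="aliquots")    # derivation history
          location = models.CharField(max_length=128)            # freezer/box/slot
          received = models.DateTimeField(auto_now_add=True)
          notes = models.TextField(blank=True)

          def __str__(self):
              return f"{self.barcode} ({self.sample_type})"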

  19. LimsPortal and BonsaiLIMS: development of a lab information management system for translational medicine.

    PubMed

    Bath, Timothy G; Bozdag, Selcuk; Afzal, Vackar; Crowther, Daniel

    2011-05-13

    Laboratory Information Management Systems (LIMS) are an increasingly important part of modern laboratory infrastructure. As typically very sophisticated software products, LIMS often require considerable resources to select, deploy and maintain. Larger organisations may have access to specialist IT support to assist with requirements elicitation and software customisation; however, smaller groups will often have limited IT support to perform the kind of iterative development that can resolve the difficulties that biologists often have when specifying requirements. Translational medicine aims to accelerate the process of treatment discovery by bringing together multiple disciplines to discover new approaches to treating disease, or novel applications of existing treatments. The diverse set of disciplines and the complexity of processing procedures involved, especially with the use of high-throughput technologies, bring difficulties in customizing a generic LIMS to provide a single system for managing sample-related data within a translational medicine research setting, especially where limited IT support is available. We have designed and developed a LIMS, BonsaiLIMS, around a very simple data model that can be easily implemented using a variety of technologies, and can be easily extended as specific requirements dictate. A reference implementation using an Oracle 11g database and the Python framework Django is presented. By focusing on a minimal feature set and a modular design we have been able to deploy the BonsaiLIMS system very quickly. The benefits to our institute have been the avoidance of the prolonged implementation timescales, budget overruns, scope creep, off-specifications and user fatigue issues that typify many enterprise software implementations. The transition away from using local, uncontrolled records in spreadsheet and paper formats to a centrally held, secured and backed-up database brings the immediate benefits of improved data visibility, audit and overall data quality. The open-source availability of this software allows others to rapidly implement a LIMS which in itself might sufficiently address user requirements. In situations where this software does not meet requirements, it can serve to elicit more accurate specifications from end-users for a more heavyweight LIMS by acting as a demonstrable prototype.

  20. Pharmacology Portal: An Open Database for Clinical Pharmacologic Laboratory Services.

    PubMed

    Karlsen Bjånes, Tormod; Mjåset Hjertø, Espen; Lønne, Lars; Aronsen, Lena; Andsnes Berg, Jon; Bergan, Stein; Otto Berg-Hansen, Grim; Bernard, Jean-Paul; Larsen Burns, Margrete; Toralf Fosen, Jan; Frost, Joachim; Hilberg, Thor; Krabseth, Hege-Merete; Kvan, Elena; Narum, Sigrid; Austgulen Westin, Andreas

    2016-01-01

    More than 50 Norwegian public and private laboratories provide one or more analyses for therapeutic drug monitoring or testing for drugs of abuse. Practices differ among laboratories, and analytical repertoires can change rapidly as new substances become available for analysis. The Pharmacology Portal was developed to provide an overview of these activities and to standardize the practices and terminology among laboratories. The Pharmacology Portal is a modern dynamic web database comprising all available analyses within therapeutic drug monitoring and testing for drugs of abuse in Norway. Content can be retrieved by using the search engine or by scrolling through substance lists. The core content is a substance registry updated by a national editorial board of experts within the field of clinical pharmacology. This ensures quality and consistency regarding substance terminologies and classification. All laboratories publish their own repertoires in a user-friendly workflow, adding laboratory-specific details to the core information in the substance registry. The user management system ensures that laboratories are restricted from editing content in the database core or in repertoires within other laboratory subpages. The portal is for nonprofit use, and has been fully funded by the Norwegian Medical Association, the Norwegian Society of Clinical Pharmacology, and the 8 largest pharmacologic institutions in Norway. The database server runs an open-source content management system that ensures flexibility with respect to further development projects, including the potential expansion of the Pharmacology Portal to other countries. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.

  1. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  2. Software algorithms for false alarm reduction in LWIR hyperspectral chemical agent detection

    NASA Astrophysics Data System (ADS)

    Manolakis, D.; Model, J.; Rossacci, M.; Zhang, D.; Ontiveros, E.; Pieper, M.; Seeley, J.; Weitz, D.

    2008-04-01

    The long-wave infrared (LWIR) hyperspectral sensing modality is often used for the detection and identification of chemical warfare agents (CWAs) in both military and civilian situations. The inherent nature and complexity of background clutter dictate a need for sophisticated and robust statistical models, which are then used in the design of optimum signal processing algorithms that provide the best exploitation of hyperspectral data and ultimately support decisions on the absence or presence of potentially harmful CWAs. This paper describes the basic elements of an automated signal processing pipeline developed at MIT Lincoln Laboratory. In addition to describing this signal processing architecture in detail, we briefly describe the key signal models that form the foundation of these algorithms as well as some spatial processing techniques used for false alarm mitigation. Finally, we apply this processing pipeline to real data measured by the Telops FIRST hyperspectral sensor to demonstrate its practical utility for the user community.
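
    The core of such a pipeline is a detector that scores each pixel spectrum against a target signature after background whitening. Below is a textbook spectral matched filter (a generic sketch, not the Lincoln Laboratory algorithms):

      # Spectral matched filter: score = (t-mu)' C^-1 (x-mu) / sqrt((t-mu)' C^-1 (t-mu)).
      # Background statistics are estimated from the data cube itself.
      import numpy as np

      def matched_filter_scores(cube, target):
          """cube: (n_pixels, n_bands) spectra; target: (n_bands,) signature."""
          mu = cube.mean(axis=0)
          cov = np.cov(cube, rowvar=False) + 1e-6 * np.eye(cube.shape[1])
          w = np.linalg.solve(cov, target - mu)        # whitened target direction
          return (cube - mu) @ w / np.sqrt((target - mu) @ w)

      rng = np.random.default_rng(1)
      scene = rng.normal(0, 1, (5000, 30))             # synthetic background clutter
      signature = np.linspace(0, 1, 30)                # synthetic agent signature
      scene[0] += 3.0 * signature                      # implant one weak plume pixel
      scores = matched_filter_scores(scene, signature)
      print(scores.argmax() == 0)                      # the implanted pixel wins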

  3. Space Station Freedom user's guide

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This guide is intended to inform prospective users of the accommodations and resources provided by the Space Station Freedom program. Using this information, they can determine if Space Station Freedom is an appropriate laboratory or facility for their research objectives. The steps that users must follow to fly a payload on Freedom are described. This guide covers the accommodations and resources available on the Space Station during the Man-Tended Capability (MTC) period, scheduled to begin at the end of 1996, and the Permanently Manned Capability (PMC) period, beginning in late 1999.

  4. Social image quality

    NASA Astrophysics Data System (ADS)

    Qiu, Guoping; Kheiri, Ahmed

    2011-01-01

    Current subjective image quality assessments have been developed in laboratory environments, under controlled conditions, and are dependent on the participation of limited numbers of observers. In this research, with the help of Web 2.0 and social media technology, a new method for building a subjective image quality metric has been developed in which the observers are Internet users. A website with a simple user interface that enables Internet users from anywhere at any time to vote for the better-quality version of a pair of the same image has been constructed. Users' votes are recorded and used to rank the images according to their perceived visual qualities. We have developed three rank aggregation algorithms to process the recorded pair comparison data: the first uses a naive approach, the second employs a Condorcet method, and the third uses Dykstra's extension of the Bradley-Terry method. The website has been collecting data for about three months and has accumulated over 10,000 votes at the time of writing this paper. Results show that the Internet and its allied technologies, such as crowdsourcing, offer a promising new paradigm for image and video quality assessment in which hundreds of thousands of Internet users can contribute to building more robust image quality metrics. We have made Internet-user-generated social image quality (SIQ) data of a public image database available online (http://www.hdri.cs.nott.ac.uk/siq/) to provide the image quality research community with a new source of ground truth data. The website continues to collect votes and will include more public image databases and will also be extended to include videos to collect social video quality (SVQ) data. All data will be publicly available on the website in due course.
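
    Of the three aggregation methods, the Bradley-Terry model is the most compact to sketch: each image gets a latent quality score, fitted so that the model reproduces the observed pairwise vote counts. The following is a generic implementation of the standard MM (Zermelo) iteration, not the paper's code:

      # Bradley-Terry rank aggregation from pairwise votes.
      import numpy as np

      def bradley_terry(wins, iters=200):
          """wins[i, j] = number of votes preferring image i over image j."""
          n = wins.shape[0]
          p = np.ones(n)
          games = wins + wins.T                    # comparisons per pair
          for _ in range(iters):
              for i in range(n):
                  denom = (games[i] / (p[i] + p)).sum()   # the j = i term is 0
                  p[i] = wins[i].sum() / denom
              p /= p.sum()                         # fix the arbitrary scale
          return p                                 # higher score = better quality

      votes = np.array([[0, 8, 9],                 # image 0 beat image 1 eight times
                        [2, 0, 7],
                        [1, 3, 0]])
      print(bradley_terry(votes))                  # image 0 ranks highest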

  5. Brain-computer interface technology: a review of the first international meeting.

    PubMed

    Wolpaw, J R; Birbaumer, N; Heetderks, W J; McFarland, D J; Peckham, P H; Schalk, G; Donchin, E; Quatrano, L A; Robinson, C J; Vaughan, T M

    2000-06-01

    Over the past decade, many laboratories have begun to explore brain-computer interface (BCI) technology as a radically new communication option for those with neuromuscular impairments that prevent them from using conventional augmentative communication methods. BCIs provide these users with communication channels that do not depend on peripheral nerves and muscles. This article summarizes the first international meeting devoted to BCI research and development. Current BCIs use electroencephalographic (EEG) activity recorded at the scalp or single-unit activity recorded from within cortex to control cursor movement, select letters or icons, or operate a neuroprosthesis. The central element in each BCI is a translation algorithm that converts electrophysiological input from the user into output that controls external devices. BCI operation depends on effective interaction between two adaptive controllers: the user, who encodes his or her commands in the electrophysiological input provided to the BCI, and the BCI, which recognizes the commands contained in the input and expresses them in device control. Current BCIs have maximum information transfer rates of 5-25 b/min. Achievement of greater speed and accuracy depends on improvements in signal processing, translation algorithms, and user training. These improvements depend on increased interdisciplinary cooperation between neuroscientists, engineers, computer programmers, psychologists, and rehabilitation specialists, and on the adoption and widespread application of objective methods for evaluating alternative methods. The practical use of BCI technology depends on the development of appropriate applications, identification of appropriate user groups, and careful attention to the needs and desires of individual users. BCI research and development will also benefit from greater emphasis on peer-reviewed publications, and from the adoption of standard venues for presentations and discussion.
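
    Figures like "5-25 b/min" come from the standard information-transfer-rate calculation for an N-choice selection task with accuracy P (the formula commonly attributed to Wolpaw and colleagues):

      # Information transfer rate per selection:
      #   B = log2(N) + P*log2(P) + (1 - P)*log2((1 - P)/(N - 1))
      # multiplied by the selection rate to give bits per minute.
      import math

      def itr_bits_per_min(n_targets, accuracy, selections_per_min):
          p, n = accuracy, n_targets
          if p >= 1.0:
              bits = math.log2(n)
          else:
              bits = (math.log2(n) + p * math.log2(p)
                      + (1 - p) * math.log2((1 - p) / (n - 1)))
          return bits * selections_per_min

      print(itr_bits_per_min(4, 0.90, 10))   # ~13.7 b/min for a 4-choice BCI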

  6. The wired patient: patterns of electronic patient portal use among patients with cardiac disease or diabetes.

    PubMed

    Jones, James Brian; Weiner, Jonathan P; Shah, Nirav R; Stewart, Walter F

    2015-02-20

    As providers develop an electronic health record-based infrastructure, patients are increasingly using Web portals to access their health information and participate electronically in the health care process. Little is known about how such portals are actually used. In this paper, our goal was to describe the types and patterns of portal users in an integrated delivery system. We analyzed 12 months of data from Web server log files on 2282 patients using a Web-based portal to their electronic health record (EHR). We obtained data for patients with cardiovascular disease and/or diabetes who had a Geisinger Clinic primary care provider and were registered "MyGeisinger" Web portal users. Hierarchical cluster analysis was applied to longitudinal data to profile users based on their frequency, intensity, and consistency of use. User types were characterized by basic demographic data from the EHR. We identified eight distinct portal user groups. The two largest groups (41.98%, 948/2258 and 24.84%, 561/2258) logged into the portal infrequently but had markedly different levels of engagement with their medical record. Other distinct groups were characterized by tracking biometric measures (10.54%, 238/2258), sending electronic messages to their provider (9.25%, 209/2258), preparing for an office visit (5.98%, 135/2258), and tracking laboratory results (4.16%, 94/2258). There are naturally occurring groups of EHR Web portal users within a population of adult primary care patients with chronic conditions. More than half of the patient cohort exhibited distinct patterns of portal use linked to key features. These patterns of portal access and interaction provide insight into opportunities for electronic patient engagement strategies.
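
    The clustering step described above is standard agglomerative clustering on per-user usage features. A sketch with synthetic frequency/intensity/consistency features (not the study's data or exact feature definitions):

      # Profile portal users by hierarchical (Ward) clustering of usage features.
      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      rng = np.random.default_rng(2)
      # columns: logins/month, pages per login, std dev of monthly logins
      features = np.vstack([
          rng.normal([1, 3, 0.5], 0.3, (100, 3)),   # infrequent, shallow users
          rng.normal([1, 12, 0.5], 0.3, (60, 3)),   # infrequent but engaged users
          rng.normal([8, 6, 2.0], 0.5, (40, 3)),    # frequent users
      ])

      z = linkage(features, method="ward")          # agglomerative clustering
      labels = fcluster(z, t=3, criterion="maxclust")
      print(np.bincount(labels)[1:])                # sizes of the three user groups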

  7. Progress in the development of paper-based diagnostics for low-resource point-of-care settings

    PubMed Central

    Byrnes, Samantha; Thiessen, Gregory; Fu, Elain

    2014-01-01

    This Review focuses on recent work in the field of paper microfluidics that specifically addresses the goal of translating the multistep processes that are characteristic of gold-standard laboratory tests to low-resource point-of-care settings. A major challenge is to implement multistep processes with the robust fluid control required to achieve the necessary sensitivity and specificity of a given application in a user-friendly package that minimizes equipment. We review key work in the areas of fluidic controls for automation in paper-based devices, readout methods that minimize dedicated equipment, and power and heating methods that are compatible with low-resource point-of-care settings. We also highlight a focused set of recent applications and discuss future challenges. PMID:24256361

  8. User Experience in Digital Games: Differences between Laboratory and Home

    ERIC Educational Resources Information Center

    Takatalo, Jari; Hakkinen, Jukka; Kaistinen, Jyrki; Nyman, Gote

    2011-01-01

    Playing entertainment computer, video, and portable games, namely, digital games, is receiving more and more attention in academic research. Games are studied in different situations with numerous methods, but little is known about if and how the playing situation affects the user experience (UX) in games. In addition, it is hard to understand and…

  9. Crystalline-silicon reliability lessons for thin-film modules

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.

    1985-01-01

    The reliability of crystalline silicon modules has been brought to a high level with lifetimes approaching 20 years, and excellent industry credibility and user satisfaction. The transition from crystalline modules to thin film modules is comparable to the transition from discrete transistors to integrated circuits. New cell materials and monolithic structures will require new device processing techniques, but the package function and design will evolve to a lesser extent. Although there will be new encapsulants optimized to take advantage of the mechanical flexibility and low temperature processing features of thin films, the reliability and life degradation stresses and mechanisms will remain mostly unchanged. Key reliability technologies in common between crystalline and thin film modules include hot spot heating, galvanic and electrochemical corrosion, hail impact stresses, glass breakage, mechanical fatigue, photothermal degradation of encapsulants, operating temperature, moisture sorption, circuit design strategies, product safety issues, and the process required to achieve a reliable product from a laboratory prototype.

  10. JPL Project Information Management: A Continuum Back to the Future

    NASA Technical Reports Server (NTRS)

    Reiz, Julie M.

    2009-01-01

    This slide presentation reviews the practices and architecture that support information management at JPL. These practices have allowed concurrent use and reuse of information by primary and secondary users. Their use is illustrated in the evolution of the Mars rovers from Mars Pathfinder to the development of the Mars Science Laboratory. The recognition of the importance of information management during all phases of a project life cycle has resulted in the design of an information system that includes metadata, has reduced the risk of information loss through the use of in-process appraisal, and has shaped projects' appreciation for capturing and managing the information on one project for re-use by future projects as a natural outgrowth of the process. This process has also assisted in connecting geographically dispersed partners into a team through shared information, common tools, and collaboration.

  11. Virtual Special Issue on Catalysis at the U.S. Department of Energy’s National Laboratories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pruski, Marek; Sadow, Aaron; Slowing, Igor

    Catalysis research at the U.S. Department of Energy's (DOE's) National Laboratories covers a wide range of research topics in heterogeneous catalysis, homogeneous/molecular catalysis, electrocatalysis, and surface science. Since much of the work at National Laboratories is funded by DOE, the research is largely focused on addressing DOE's mission to ensure America's security and prosperity by addressing its energy, environmental, and nuclear challenges through transformative science and technology solutions. The catalysis research carried out at the DOE National Laboratories ranges from very fundamental catalysis science, funded by DOE's Office of Basic Energy Sciences (BES), to applied research and development (R&D) in areas such as biomass conversion to fuels and chemicals, fuel cells, and vehicle emission control with primary funding from DOE's Office of Energy Efficiency and Renewable Energy. National Laboratories are home to many DOE Office of Science national scientific user facilities that provide researchers with the most advanced tools of modern science, including accelerators, colliders, supercomputers, light sources, and neutron sources, as well as facilities for studying the nanoworld and the terrestrial environment. National Laboratory research programs typically feature teams of researchers working closely together, often joining scientists from different disciplines to attack scientific and technical problems using a variety of tools and techniques available at the DOE national scientific user facilities. Along with collaboration between National Laboratory scientists, interactions with university colleagues are common in National Laboratory catalysis R&D. In some cases, scientists have joint appointments at a university and a National Laboratory.

  12. Virtual Special Issue on Catalysis at the U.S. Department of Energy’s National Laboratories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pruski, Marek; Sadow, Aaron D.; Slowing, Igor I.

    Catalysis research at the U.S. Department of Energy's (DOE's) National Laboratories covers a wide range of research topics in heterogeneous catalysis, homogeneous/molecular catalysis, biocatalysis, electrocatalysis, and surface science. Since much of the work at National Laboratories is funded by DOE, the research is largely focused on addressing DOE's mission to ensure America's security and prosperity by addressing its energy, environmental, and nuclear challenges through transformative science and technology solutions. The catalysis research carried out at the DOE National Laboratories ranges from very fundamental catalysis science, funded by DOE's Office of Basic Energy Sciences (BES), to applied research and development (R&D) in areas such as biomass conversion to fuels and chemicals, fuel cells, and vehicle emission control with primary funding from DOE's Office of Energy Efficiency and Renewable Energy. National Laboratories are home to many DOE Office of Science national scientific user facilities that provide researchers with the most advanced tools of modern science, including accelerators, colliders, supercomputers, light sources, and neutron sources, as well as facilities for studying the nanoworld and the terrestrial environment. National Laboratory research programs typically feature teams of researchers working closely together, often joining scientists from different disciplines to tackle scientific and technical problems using a variety of tools and techniques available at the DOE national scientific user facilities. Along with collaboration between National Laboratory scientists, interactions with university colleagues are common in National Laboratory catalysis R&D. In some cases, scientists have joint appointments at a university and a National Laboratory.

  13. Modeling Laser-Driven Laboratory Astrophysics Experiments Using the CRASH Code

    NASA Astrophysics Data System (ADS)

    Grosskopf, Michael; Keiter, P.; Kuranz, C. C.; Malamud, G.; Trantham, M.; Drake, R.

    2013-06-01

    Laser-driven, laboratory astrophysics experiments can provide important insight into the physical processes relevant to astrophysical systems. The radiation hydrodynamics code developed by the Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan has been used to model experimental designs for high-energy-density laboratory astrophysics campaigns on OMEGA and other high-energy laser facilities. This code is an Eulerian, block-adaptive AMR hydrodynamics code with implicit multigroup radiation transport and electron heat conduction. The CRASH model has been used on many applications including: radiative shocks, Kelvin-Helmholtz and Rayleigh-Taylor experiments on the OMEGA laser; as well as laser-driven ablative plumes in experiments by the Astrophysical Collisionless Shocks Experiments with Lasers (ACSEL) collaboration. We report a series of results with the CRASH code in support of design work for upcoming high-energy-density physics experiments, as well as comparison between existing experimental data and simulation results. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via grant DE-FC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, grant number DE-FG52-09NA29548, and by the National Laser User Facility Program, grant number DE-NA0000850.

  14. Cost and implementation analysis of a personal digital assistant system for laboratory data collection.

    PubMed

    Blaya, J A; Gomez, W; Rodriguez, P; Fraser, H

    2008-08-01

    SETTING: One hundred and twenty-six public health centers and laboratories in Lima, Peru, without internet access. We have previously shown that a personal digital assistant (PDA) based system reduces data collection delays and errors for tuberculosis (TB) laboratory results when compared to a paper system. OBJECTIVE: To assess the data collection efficiency of each system and the resources required to develop, implement and transfer the PDA-based system to a resource-poor setting. DESIGN: Time-motion study of data collectors using the PDA-based and paper systems; cost analysis of developing, implementing and transferring the PDA-based system to a local organization and their redeployment of the system. RESULTS: Work hours spent collecting and processing results decreased by 60% (P < 0.001). Users perceived this decrease to be 70% and had no technical problems they failed to fix. The total cost and time to develop and implement the intervention was US$26092 and 22 weeks. The cost to extend the system to cover nine more districts was $1125, and to implement collecting patient weights was $4107. CONCLUSION: A PDA-based system drastically reduced the effort required to collect TB laboratory results from remote locations. With the framework described, open-source software and local development, organizations in resource-poor settings could reap the benefits of this technology.

  15. Three-Dimensional Online Visualization and Engagement Tools for the Geosciences

    NASA Astrophysics Data System (ADS)

    Cockett, R.; Moran, T.; Pidlisecky, A.

    2013-12-01

    Educational tools often sacrifice interactivity in favour of scalability so they can reach more users. This compromise leads to tools that may be viewed as second tier when compared to more engaging activities performed in a laboratory; however, the resources required to deliver laboratory exercises that are scalable are often impractical. Geoscience education is well situated to benefit from interactive online learning tools that allow users to work in a 3D environment. Visible Geology (http://3ptscience.com/visiblegeology) is an innovative web-based application designed to enable visualization of geologic structures and processes through the use of interactive 3D models. The platform allows users to conceptualize difficult yet important geologic principles in a scientifically accurate manner by developing unique geologic models. The environment allows students to interactively practice their visualization and interpretation skills by creating and interacting with their own models and terrains. Visible Geology has been designed from a user-centric perspective, resulting in a simple and intuitive interface. The platform directs students to build their own geologic models by adding beds and creating geologic events such as tilting, folding, or faulting. The level of ownership and interactivity encourages engagement, leading learners to discover geologic relationships on their own, in the context of guided assignments. In January 2013, an interactive geologic history assignment was developed for a 700-student introductory geology class at The University of British Columbia. The assignment required students to distinguish the relative age of geologic events to construct a geologic history. Traditionally this type of exercise has been taught through the use of simple geologic cross-sections showing crosscutting relationships; from these cross-sections students infer the relative age of geologic events. In contrast, the Visible Geology assignment offers students a unique experience in which they first create their own geologic events, allowing them to directly see how the timing of a geologic event manifests in the model and resulting cross-sections. By creating each geologic event in the model themselves, the students gain a deeper understanding of the processes and relative order of events. The resulting models can be shared amongst students, and provide instructors with a basis for guiding inquiry to address misconceptions. The ease of use of the assignment, including automatic assessment, made this tool practical for deployment in this 700-person class. The outcome of this type of large-scale deployment is that students who would normally not experience a lab exercise gain exposure to interactive 3D thinking. Engaging tools and software that put the user in control of their learning experiences are critical for moving to scalable, yet engaging, online learning environments.

  16. The Effects of Web-Based Patient Access to Laboratory Results in British Columbia: A Patient Survey on Comprehension and Anxiety.

    PubMed

    Mák, Geneviève; Smith Fowler, Heather; Leaver, Chad; Hagens, Simon; Zelmer, Jennifer

    2015-08-04

    Web-based patient access to personal health information is limited but increasing in Canada and internationally. This exploratory study aimed to increase understanding of how Web-based access to laboratory test results in British Columbia (Canada), which has been broadly available since 2010, affects patients' experiences. In November 2013, we surveyed adults in British Columbia who had had a laboratory test in the previous 12 months. Using a retrospective cohort design, we compared reported wait-time for results, test result comprehension, and anxiety levels of "service users" who had Web-based access to their test results (n=2047) with those of a general population panel that did not have Web-based access (n=1245). The vast majority of service users (83.99%, 95% CI 82.31-85.67) said they received their results within "a few days", compared to just over a third of the comparison group (37.84%, 95% CI 34.96-40.73). Most in both groups said they understood their test results, but the rate was lower for service users than the comparison group (75.55%, 95% CI 73.58-77.49 vs 84.69%, 95% CI 82.59-86.81). There was no significant difference between groups in levels of reported anxiety after receiving test results. While most patients who received their laboratory test results online reported little anxiety after receiving their results and were satisfied with the service, there may be opportunities to improve comprehension of results.

  17. Developing a Collaborative Agenda for Humanities and Social Scientific Research on Laboratory Animal Science and Welfare.

    PubMed

    Davies, Gail F; Greenhough, Beth J; Hobson-West, Pru; Kirk, Robert G W; Applebee, Ken; Bellingan, Laura C; Berdoy, Manuel; Buller, Henry; Cassaday, Helen J; Davies, Keith; Diefenbacher, Daniela; Druglitrø, Tone; Escobar, Maria Paula; Friese, Carrie; Herrmann, Kathrin; Hinterberger, Amy; Jarrett, Wendy J; Jayne, Kimberley; Johnson, Adam M; Johnson, Elizabeth R; Konold, Timm; Leach, Matthew C; Leonelli, Sabina; Lewis, David I; Lilley, Elliot J; Longridge, Emma R; McLeod, Carmen M; Miele, Mara; Nelson, Nicole C; Ormandy, Elisabeth H; Pallett, Helen; Poort, Lonneke; Pound, Pandora; Ramsden, Edmund; Roe, Emma; Scalway, Helen; Schrader, Astrid; Scotton, Chris J; Scudamore, Cheryl L; Smith, Jane A; Whitfield, Lucy; Wolfensohn, Sarah

    2016-01-01

    Improving laboratory animal science and welfare requires both new scientific research and insights from research in the humanities and social sciences. Whilst scientific research provides evidence to replace, reduce and refine procedures involving laboratory animals (the '3Rs'), work in the humanities and social sciences can help understand the social, economic and cultural processes that enhance or impede humane ways of knowing and working with laboratory animals. However, communication across these disciplinary perspectives is currently limited, and they design research programmes, generate results, engage users, and seek to influence policy in different ways. To facilitate dialogue and future research at this interface, we convened an interdisciplinary group of 45 life scientists, social scientists, humanities scholars, non-governmental organisations and policy-makers to generate a collaborative research agenda. This drew on methods employed by other agenda-setting exercises in science policy, using a collaborative and deliberative approach for the identification of research priorities. Participants were recruited from across the community, invited to submit research questions and vote on their priorities. They then met at an interactive workshop in the UK, discussed all 136 questions submitted, and collectively defined the 30 most important issues for the group. The output is a collaborative future agenda for research in the humanities and social sciences on laboratory animal science and welfare. The questions indicate a demand for new research in the humanities and social sciences to inform emerging discussions and priorities on the governance and practice of laboratory animal research, including on issues around: international harmonisation, openness and public engagement, 'cultures of care', harm-benefit analysis and the future of the 3Rs. The process outlined below underlines the value of interdisciplinary exchange for improving communication across different research cultures and identifies ways of enhancing the effectiveness of future research at the interface between the humanities, social sciences, science and science policy.

  18. Developing a Collaborative Agenda for Humanities and Social Scientific Research on Laboratory Animal Science and Welfare

    PubMed Central

    Davies, Gail F.; Greenhough, Beth J; Hobson-West, Pru; Kirk, Robert G. W.; Applebee, Ken; Bellingan, Laura C.; Berdoy, Manuel; Buller, Henry; Cassaday, Helen J.; Davies, Keith; Diefenbacher, Daniela; Druglitrø, Tone; Escobar, Maria Paula; Friese, Carrie; Herrmann, Kathrin; Hinterberger, Amy; Jarrett, Wendy J.; Jayne, Kimberley; Johnson, Adam M.; Johnson, Elizabeth R.; Konold, Timm; Leach, Matthew C.; Leonelli, Sabina; Lewis, David I.; Lilley, Elliot J.; Longridge, Emma R.; McLeod, Carmen M.; Miele, Mara; Nelson, Nicole C.; Ormandy, Elisabeth H.; Pallett, Helen; Poort, Lonneke; Pound, Pandora; Ramsden, Edmund; Roe, Emma; Scalway, Helen; Schrader, Astrid; Scotton, Chris J.; Scudamore, Cheryl L.; Smith, Jane A.; Whitfield, Lucy; Wolfensohn, Sarah

    2016-01-01

    Improving laboratory animal science and welfare requires both new scientific research and insights from research in the humanities and social sciences. Whilst scientific research provides evidence to replace, reduce and refine procedures involving laboratory animals (the ‘3Rs’), work in the humanities and social sciences can help understand the social, economic and cultural processes that enhance or impede humane ways of knowing and working with laboratory animals. However, communication across these disciplinary perspectives is currently limited, and they design research programmes, generate results, engage users, and seek to influence policy in different ways. To facilitate dialogue and future research at this interface, we convened an interdisciplinary group of 45 life scientists, social scientists, humanities scholars, non-governmental organisations and policy-makers to generate a collaborative research agenda. This drew on methods employed by other agenda-setting exercises in science policy, using a collaborative and deliberative approach for the identification of research priorities. Participants were recruited from across the community, invited to submit research questions and vote on their priorities. They then met at an interactive workshop in the UK, discussed all 136 questions submitted, and collectively defined the 30 most important issues for the group. The output is a collaborative future agenda for research in the humanities and social sciences on laboratory animal science and welfare. The questions indicate a demand for new research in the humanities and social sciences to inform emerging discussions and priorities on the governance and practice of laboratory animal research, including on issues around: international harmonisation, openness and public engagement, ‘cultures of care’, harm-benefit analysis and the future of the 3Rs. The process outlined below underlines the value of interdisciplinary exchange for improving communication across different research cultures and identifies ways of enhancing the effectiveness of future research at the interface between the humanities, social sciences, science and science policy. PMID:27428071

  19. Experience with Data Science as an Intern with the Jet Propulsion Laboratory

    NASA Astrophysics Data System (ADS)

    Whittell, J.; Mattmann, C. A.; Whitehall, K. D.; Ramirez, P.; Goodale, C. E.; Boustani, M.; Hart, A. F.; Kim, J.; Waliser, D. E.; Joyce, M. J.

    2013-12-01

    The Regional Climate Model Evaluation System (RCMES, http://rcmes.jpl.nasa.gov) at NASA's Jet Propulsion Laboratory seeks to improve regional climate model output by comparing past model predictions with Earth-orbiting satellite data (Mattmann et al. 2013). RCMES ingests satellite and RCM data and processes these data into a common format; as needed, the software queries the RCMES database for these datasets, on which it runs a series of statistical metrics including model-satellite comparisons. The development of the RCMES software relies on collaboration between climatologists and computer scientists, as evinced by RCMES' longstanding work with CORDEX (Kim et al. 2012). Over a total of 17 weeks in 2011, 2012, and 2013, I worked as an intern at NASA's Jet Propulsion Laboratory in a supportive capacity for RCMES. A high school student, I had no formal background in either Earth science or computer technology, but was immersed in both fields. In 2011, I researched three Earth-science data management projects, producing a high-level explanation of these endeavors. The following year, I studied Python, contributing a command-line user interface to the RCMES project code. In 2013, I assisted with data acquisition, wrote a file header information plugin, and worked with the visualization tool GrADS. The experience demonstrated the importance of an interdisciplinary approach to data processing: to streamline data ingestion and processing, scientists must understand, at least at a high level, any programs they might utilize, while software engineers, to best serve the needs of Earth scientists, must understand the science behind the data they handle.
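
    A command-line front end of the kind described might look like the following sketch (the flag names and the evaluate() stub are hypothetical, not the actual RCMES interface):

      # Toy command-line interface for a model-vs-observation comparison.
      import argparse

      def evaluate(model_path, obs_path, metric):
          # Stand-in for the real comparison: report what would be computed.
          print(f"comparing {model_path} vs {obs_path} using {metric}")

      def main():
          parser = argparse.ArgumentParser(description="Toy RCM evaluation CLI")
          parser.add_argument("--model", required=True, help="RCM output file")
          parser.add_argument("--obs", required=True, help="satellite observation file")
          parser.add_argument("--metric", default="bias", choices=["bias", "rmse", "corr"])
          args = parser.parse_args()
          evaluate(args.model, args.obs, args.metric)

      if __name__ == "__main__":
          main()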

  20. External quality-assurance programs managed by the U.S. Geological Survey in support of the National Atmospheric Deposition Program/National Trends Network

    USGS Publications Warehouse

    Latysh, Natalie E.; Wetherbee, Gregory A.

    2005-01-01

    The U.S. Geological Survey, Branch of Quality Systems, operates the external quality-assurance programs for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN). Beginning in 1978, six different programs have been implemented: the intersite-comparison program, the blind-audit program, the sample-handling evaluation program, the field-audit program, the interlaboratory-comparison program, and the collocated-sampler program. Each program was designed to measure error contributed by specific components in the data-collection process. The intersite-comparison program, which was discontinued in 2004, was designed to assess the accuracy and reliability of field pH and specific-conductance measurements made by site operators. The blind-audit and sample-handling evaluation programs, which were also discontinued, in 2002 and 2004, respectively, assessed contamination that may result from sampling equipment and routine handling and processing of the wet-deposition samples. The field-audit program assesses the effects of sample handling, processing, and field exposure. The interlaboratory-comparison program evaluates bias and precision of analytical results produced by the contract laboratory for NADP (the Illinois State Water Survey's Central Analytical Laboratory) and compares its performance with the performance of international laboratories. The collocated-sampler program assesses the overall precision of wet-deposition data collected by NADP/NTN. This report documents historical operations and the operating procedures for each of these external quality-assurance programs. USGS quality-assurance information allows NADP/NTN data users to distinguish between actual environmental trends and inherent measurement variability.

  1. User's manual for a material transport code on the Octopus Computer Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naymik, T.G.; Mendez, G.D.

    1978-09-15

    A code to simulate material transport through porous media was developed at Oak Ridge National Laboratory. This code has been modified and adapted for use at Lawrence Livermore Laboratory. This manual, in conjunction with report ORNL-4928, explains the input, output, and execution of the code on the Octopus Computer Network.

  2. Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud.

    PubMed

    Cianfrocco, Michael A; Leschziner, Andres E

    2015-05-08

    The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available 'off-the-shelf' computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16-480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM.
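
    The cost range quoted above follows directly from instances x hours x hourly price; the sketch below uses assumed prices (not figures quoted by the paper or by Amazon) purely to show the arithmetic:

      # Back-of-the-envelope cloud cost estimate (prices are assumptions).
      def cloud_cost(n_instances, hours, price_per_hour_usd):
          return n_instances * hours * price_per_hour_usd

      print(cloud_cost(4, 25, 0.50))    # ~$50: small cluster for a day
      print(cloud_cost(30, 100, 0.50))  # ~$1500: large cluster for several days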

  3. Analyzing checkpointing trends for applications on the IBM Blue Gene/P system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naik, H.; Gupta, R.; Beckman, P.

    Current petascale systems have tens of thousands of hardware components and complex system software stacks, which increase the probability of faults occurring during the lifetime of a process. Checkpointing has been a popular method of providing fault tolerance in high-end systems. While considerable research has been done to optimize checkpointing, in practice the method still involves a high-cost overhead for users. In this paper, we study the checkpointing overhead seen by applications running on leadership-class machines such as the IBM Blue Gene/P at Argonne National Laboratory. We study various applications and design a methodology to assist users in understanding and choosing checkpointing frequency and reducing the overhead incurred. In particular, we study three popular applications -- the Grid-Based Projector-Augmented Wave application, the Carr-Parrinello Molecular Dynamics application, and a Nek5000 computational fluid dynamics application -- and analyze their memory usage and possible checkpointing trends on 32,768 processors of the Blue Gene/P system.
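
    For readers wanting a first-order starting point, a common rule of thumb (not necessarily the method used in this paper) is Young's approximation, which balances checkpoint cost against expected losses from failures:

      # Young's approximation for the optimal checkpoint interval (illustrative).
      from math import sqrt

      def young_interval(checkpoint_cost_s, mtbf_s):
          # interval ~ sqrt(2 * checkpoint cost * mean time between failures)
          return sqrt(2.0 * checkpoint_cost_s * mtbf_s)

      # e.g. a 5-minute checkpoint on a system with a 24-hour MTBF:
      print(young_interval(300, 24 * 3600) / 3600, "hours")  # ~2.0 hours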

  4. Healthwatch-2 System Overview

    NASA Technical Reports Server (NTRS)

    Barszcz, Eric; Mosher, Marianne; Huff, Edward M.

    2004-01-01

    Healthwatch-2 (HW-2) is a research tool designed to facilitate the development and testing of in-flight health monitoring algorithms. HW-2 software is written in C/C++ and executes on an x86-based computer running the Linux operating system. The executive module has interfaces for collecting various signal data, such as vibration, torque, tachometer, and GPS. It is designed to perform in-flight time or frequency averaging based on specifications defined in a user-supplied configuration file. Averaged data are then passed to a user-supplied algorithm written as a Matlab function. This allows researchers a convenient method for testing in-flight algorithms. In addition to its in-flight capabilities, HW-2 software is also capable of reading archived flight data and processing it as if collected in-flight. This allows algorithms to be developed and tested in the laboratory before being flown. Currently HW-2 has passed its checkout phase and is collecting data on a Bell OH-58C helicopter operated by the U.S. Army at NASA Ames Research Center.
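
    The configuration-plus-callback pattern described can be sketched as follows (in Python for brevity; HW-2 itself is C/C++ with user-supplied Matlab functions, and all names here are illustrative):

      # Config-driven averaging handed to a researcher-supplied algorithm.
      import numpy as np

      def time_average(frames):
          return np.mean(frames, axis=0)

      def freq_average(frames):
          return np.mean(np.abs(np.fft.rfft(frames, axis=1)), axis=0)

      def run(frames, config, user_algorithm):
          avg = time_average(frames) if config["mode"] == "time" else freq_average(frames)
          return user_algorithm(avg)  # user code sees only the averaged data

      frames = np.random.randn(16, 1024)  # e.g. 16 records of vibration samples
      print(run(frames, {"mode": "freq"}, lambda spectrum: spectrum.max()))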

  5. Institutional Transformation 2.5 Building Module Help Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villa, Daniel

    The Institutional Transformation (IX) building module is a software tool developed at Sandia National Laboratories to evaluate energy conservation measures (ECMs) on hundreds of DOE-2 building energy models simultaneously. In IX, ECMs can be designed through parameterizing DOE-2 building models and doing further processing via Visual Basic for Applications subroutines. IX provides the functionality to handle multiple building models for different years, which enables incrementally changing a site of hundreds of buildings over time. It also enables evaluation of the effects of changing climate, comparisons between data and modeling results, and energy use of centralized utility buildings (CUBs). IX consists of a Microsoft Excel® user interface, Microsoft Access® database, and Microsoft Excel® CUB build utility, whose functionalities are described in detail in this report. In addition to descriptions of the user interfaces, descriptions of every ECM already designed in IX are included. SAND2016-8983 IX 2.5 Help Manual

  6. Measurement and Instrumentation Challenges at X-ray Free Electron Lasers

    NASA Astrophysics Data System (ADS)

    Feng, Yiping

    2015-03-01

    X-ray Free Electron Laser sources based on the Self Amplified Spontaneous Emission process are intrinsically chaotic, giving rise to pulse-to-pulse fluctuations in all physical properties, including intensity, position and pointing, spatial and temporal profiles, spectral content, timing, and coherence. These fluctuations represent special challenges to users whose experiments are designed to reveal small changes in the underlying physical quantities, which would otherwise be completely washed out without the proper diagnostic tools. Due to the X-ray FEL's unique characteristics, such as the unprecedented peak power and nearly full spatial coherence, there are many technical challenges in conceiving and implementing diagnostic devices that are highly transmissive, provide sufficient signal-to-noise ratio, and, most importantly, work in single-shot mode. Portions of this research were carried out at the Linac Coherent Light Source (LCLS) at the SLAC National Accelerator Laboratory. LCLS is an Office of Science User Facility operated for the U.S. Department of Energy Office of Science by Stanford Univ.

  7. FIREDOC users manual, 3rd edition

    NASA Astrophysics Data System (ADS)

    Jason, Nora H.

    1993-12-01

    FIREDOC is the on-line bibliographic database which reflects the holdings (published reports, journal articles, conference proceedings, books, and audiovisual items) of the Fire Research Information Services (FRIS) at the Building and Fire Research Laboratory (BFRL), National Institute of Standards and Technology (NIST). This manual provides step-by-step procedures for entering and exiting the database via telecommunication lines, as well as a number of techniques for searching the database and processing the results of the searches. This Third Edition is necessitated by the change to a UNIX platform. The new computer allows for faster response time when searching via a modem and, in addition, offers Internet accessibility. FIREDOC may be used with personal computers running DOS or Windows, with Macintosh computers, and with workstations. New sections are included on how to access the Internet and how to obtain the references of interest. Appendix F: Quick Guide to Getting Started will be useful to both modem and Internet users.

  8. Perceptions of a medical microbiology service: a survey of laboratory users.

    PubMed Central

    Morgan, M S

    1995-01-01

    AIM--To ascertain the perception of laboratory users regarding the quality of the medical microbiology services in a district general hospital. METHODS--Detailed questionnaires were circulated to all clinicians in the locality, with headings covering the quality of medical advice provided, the availability of information on specimen collection, format of request forms, specimen transport arrangements, turnaround times, the quality and need for interpretative advice, and the overall impression of the quality of the services provided. RESULTS--Two hundred and thirty five replies were received, giving a response rate of 69%. Transportation of specimens and communication of reports were identified as priority areas for improvement. The overall quality of the service was perceived as satisfactory, although areas were identified where substantial improvements could be made, some at little or no cost to the laboratory. CONCLUSIONS--The survey focused clinicians' attention on the service, raised the profile of the laboratory, and resulted in improved communications and a better understanding of customer needs. Overall, the exercise was felt to be extremely useful, and worthwhile repeating to gauge the effect of the changes instituted as a result. PMID:8537489

  9. A generalized architecture of quantum secure direct communication for N disjointed users with authentication

    NASA Astrophysics Data System (ADS)

    Farouk, Ahmed; Zakaria, Magdy; Megahed, Adel; Omara, Fatma A.

    2015-11-01

    In this paper, we generalize a secure direct communication process between N users with partial and full cooperation of a quantum server. Thus, N - 1 disjointed users u1, u2, …, uN-1 can transmit a secret message of classical bits to a remote user uN by utilizing the property of dense coding and Pauli unitary transformations. The authentication process between the quantum server and the users is validated by EPR entangled pairs and CNOT gates. Afterwards, the remaining EPR pairs generate shared GHZ states, which are used for directly transmitting the secret message. The partial cooperation process indicates that N - 1 users can transmit a secret message directly to a remote user uN through a quantum channel. Furthermore, N - 1 users and a remote user uN can communicate without an established quantum channel among them by a full cooperation process. The security analysis of the authentication and communication processes against many types of attacks proved that an attacker cannot gain any information while intercepting either the authentication or the communication process. Hence, the security of the transmitted message among N users is ensured, as the attacker introduces an error probability irrespective of the sequence of measurement.
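
    The dense-coding primitive the protocol builds on can be demonstrated in a few lines of numpy: two classical bits are encoded by a Pauli operation on one half of a shared EPR pair, and the four resulting Bell states are mutually orthogonal, so the receiver can recover both bits. This is the textbook primitive only, not the paper's full N-user protocol:

      # Dense coding on a shared EPR pair (textbook sketch).
      import numpy as np

      I = np.eye(2)
      X = np.array([[0, 1], [1, 0]])
      Z = np.array([[1, 0], [0, -1]])

      bell = np.array([1, 0, 0, 1]) / np.sqrt(2)         # |Phi+>, shared in advance
      encode = {"00": I, "01": X, "10": Z, "11": Z @ X}  # Pauli on sender's qubit

      def send(bits):
          return np.kron(encode[bits], I) @ bell  # sender acts on her half only

      basis = {b: send(b) for b in encode}  # the four (orthogonal) Bell states
      for bits in encode:
          decoded = max(basis, key=lambda b: abs(np.vdot(basis[b], send(bits))))
          print(bits, "->", decoded)  # the Bell measurement recovers both bits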

  10. A generalized architecture of quantum secure direct communication for N disjointed users with authentication.

    PubMed

    Farouk, Ahmed; Zakaria, Magdy; Megahed, Adel; Omara, Fatma A

    2015-11-18

    In this paper, we generalize a secure direct communication process between N users with partial and full cooperation of a quantum server. Thus, N - 1 disjointed users u1, u2, …, uN-1 can transmit a secret message of classical bits to a remote user uN by utilizing the property of dense coding and Pauli unitary transformations. The authentication process between the quantum server and the users is validated by EPR entangled pairs and CNOT gates. Afterwards, the remaining EPR pairs generate shared GHZ states, which are used for directly transmitting the secret message. The partial cooperation process indicates that N - 1 users can transmit a secret message directly to a remote user uN through a quantum channel. Furthermore, N - 1 users and a remote user uN can communicate without an established quantum channel among them by a full cooperation process. The security analysis of the authentication and communication processes against many types of attacks proved that an attacker cannot gain any information while intercepting either the authentication or the communication process. Hence, the security of the transmitted message among N users is ensured, as the attacker introduces an error probability irrespective of the sequence of measurement.

  11. In Situ Observation of Directed Nanoparticle Aggregation During the Synthesis of Ordered Nanoporous Metal in Soft Templates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parent, Lucas R.; Robinson, David B.; Cappillino, Patrick J.

    2014-02-11

    The prevalent approach to developing new nanomaterials is a trial and error process of iteratively altering synthesis procedures and then characterizing the resulting nanostructures. This is fundamentally limited in that the growth processes that occur during synthesis can only be inferred from the final synthetic structure. Directly observing real-time nanomaterial growth provides unprecedented insight into the relationship between synthesis conditions and product evolution, and facilitates a mechanistic approach to nanomaterial development. Here we use in situ liquid stage scanning transmission electron microscopy to observe the growth of mesoporous palladium in a solvated block copolymer (BCP) template under various synthesis conditions, and ultimately determine a refined synthesis procedure that yields ordered pores. We find that at low organic solvent (tetrahydrofuran, THF) content, the BCP assembles into a rigid, cylindrical micelle array with a high degree of short-range order, but poor long-range order. Upon slowing the THF evaporation rate using a solvent-vapor anneal step, the long-range order is greatly improved. The electron beam induces nucleation of small particles in the aqueous phase around the micelles. The small particles then flocculate and grow into denser structures that surround the micelles, forming an ordered mesoporous structure. The microscope observations revealed that template disorder can be addressed prior to reaction, and is not invariably induced by the growth process itself, allowing us to more quickly optimize the synthetic method. This work was conducted in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL), a national scientific user facility sponsored by DOE's Office of Biological and Environmental Research and located at Pacific Northwest National Laboratory. The Pacific Northwest National Laboratory is operated by Battelle for the U.S. Department of Energy under contract DE-AC05-76RL01830. This research was funded in part by: the Presidential Early Career Award for Scientist and Engineers for I.A., the University of California Academic Senate and the University of California Laboratory fee research grant, the Laboratory-Directed Research and Development program at Sandia National Laboratories, and the Chemical Imaging Initiative at Pacific Northwest National Laboratory. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  12. GMOseek: a user friendly tool for optimized GMO testing.

    PubMed

    Morisset, Dany; Novak, Petra Kralj; Zupanič, Darko; Gruden, Kristina; Lavrač, Nada; Žel, Jana

    2014-08-01

    With the increasing pace at which new Genetically Modified Organisms (GMOs) are authorized or in the pipeline for commercialization worldwide, the task of the laboratories in charge of testing the compliance of food, feed or seed samples with the relevant regulations has become difficult and costly. Many of them have already adopted the so-called "matrix approach" to rationalize the resources and efforts used to increase their efficiency within a limited budget. Most of the time, the "matrix approach" is implemented using limited information and some proprietary (if any) computational tool to efficiently use the available data. The developed GMOseek software is designed to support decision making in all the phases of routine GMO laboratory testing, including the interpretation of wet-lab results. The tool makes use of a tabulated matrix of GM events and their genetic elements, of the laboratory analysis history, and of the available information about the sample at hand. The tool uses an optimization approach to suggest the screening assays best suited to the given sample. The practical GMOseek user interface allows the user to customize the search for a cost-efficient combination of screening assays to be employed on a given sample. It further guides the user to select appropriate analyses to determine the presence of individual GM events in the analyzed sample, and it helps in taking a final decision regarding the GMO composition of the sample. GMOseek can also be used to evaluate new, previously unused GMO screening targets and to estimate the profitability of developing new GMO screening methods. The presented freely available software tool offers GMO testing laboratories the possibility to select combinations of assays (e.g. quantitative real-time PCR tests) needed for their task, by allowing the expert to express his/her preferences in terms of multiplexing and cost. The utility of GMOseek is exemplified by analyzing selected food, feed and seed samples from a national reference laboratory for GMO testing and by comparing its performance to existing tools which use the matrix approach. GMOseek proves superior when tested on real samples in terms of GMO coverage and the cost efficiency of its screening strategies, including its capacity for simple interpretation of the testing results.
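
    The core of the matrix approach is a covering problem: choose a cheap set of screening assays whose combined targets span all GM events of concern. GMOseek's actual optimizer is more sophisticated, but a greedy cover conveys the idea; the assay names, costs, and target sets below are invented for illustration:

      # Greedy cost-weighted set cover over a (toy) assay/event matrix.
      def pick_assays(assays, events):
          """assays: {name: (cost, set of events detected)} -> chosen assay names."""
          uncovered, chosen = set(events), []
          while uncovered:
              # best cost per newly covered event
              name, (_, hits) = min(
                  ((n, a) for n, a in assays.items() if a[1] & uncovered),
                  key=lambda item: item[1][0] / len(item[1][1] & uncovered))
              chosen.append(name)
              uncovered -= hits
          return chosen

      assays = {"P35S":   (1.0, {"MON810", "Bt176", "GT73"}),
                "T-NOS":  (1.0, {"MON810", "GT73"}),
                "CryIAb": (2.5, {"Bt176"})}
      print(pick_assays(assays, {"MON810", "Bt176", "GT73"}))  # ['P35S']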

  13. End-user perspective of low-cost sensors for outdoor air pollution monitoring.

    PubMed

    Rai, Aakash C; Kumar, Prashant; Pilla, Francesco; Skouloudis, Andreas N; Di Sabatino, Silvana; Ratti, Carlo; Yasar, Ansar; Rickerby, David

    2017-12-31

    Low-cost sensor technology can potentially revolutionise the area of air pollution monitoring by providing high-density spatiotemporal pollution data. Such data can be utilised for supplementing traditional pollution monitoring, improving exposure estimates, and raising community awareness about air pollution. However, data quality remains a major concern that hinders the widespread adoption of low-cost sensor technology. Unreliable data may mislead unsuspecting users and potentially lead to alarming consequences such as reporting acceptable air pollutant levels when they are above the limits deemed safe for human health. This article provides scientific guidance to end-users for effectively deploying low-cost sensors for monitoring air pollution and people's exposure, while ensuring reasonable data quality. We review the performance characteristics of several low-cost particle and gas monitoring sensors and provide recommendations to end-users for making proper sensor selection by summarizing the capabilities and limitations of such sensors. The challenges, best practices, and future outlook for effectively deploying low-cost sensors and maintaining data quality are also discussed. For data quality assurance, a two-stage sensor calibration process is recommended, which includes laboratory calibration under controlled conditions by the manufacturer supplemented with routine calibration checks performed by the end-user under final deployment conditions. For large sensor networks where routine calibration checks are impractical, statistical techniques for data quality assurance should be utilised. Further advancements and adoption of sophisticated mathematical and statistical techniques for sensor calibration, fault detection, and data quality assurance can indeed help to realise the promised benefits of a low-cost air pollution sensor network.
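
    In the simplest case, the recommended two-stage process reduces to fitting a laboratory correction and then checking it in the field. The linear model and all numbers below are assumptions for illustration, not values from the article:

      # Two-stage calibration sketch: lab fit, then a field drift check.
      import numpy as np

      def fit_calibration(raw, reference):
          # Least-squares gain/offset so that gain*raw + offset ~ reference.
          gain, offset = np.polyfit(raw, reference, 1)
          return gain, offset

      # Stage 1: laboratory calibration under controlled conditions.
      gain, offset = fit_calibration(np.array([5.0, 12.0, 25.0, 40.0]),
                                     np.array([4.1, 10.8, 23.9, 39.2]))

      # Stage 2: routine field check against a reference monitor.
      field_raw, field_ref = np.array([8.0, 30.0]), np.array([7.1, 28.6])
      drift = np.mean(np.abs(gain * field_raw + offset - field_ref))
      print(f"gain={gain:.3f} offset={offset:.3f} mean field error={drift:.2f}")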

  14. Specific attentional dysfunction in adults following early start of cannabis use.

    PubMed

    Ehrenreich, H; Rinn, T; Kunert, H J; Moeller, M R; Poser, W; Schilling, L; Gigerenzer, G; Hoehe, M R

    1999-03-01

    The present study tested the hypothesis that chronic interference by cannabis with endogenous cannabinoid systems during peripubertal development causes specific and persistent brain alterations in humans. As an index of cannabinoid action, visual scanning, along with other attentional functions, was chosen. Visual scanning undergoes a major maturation process around age 12-15 years and, in addition, the visual system is known to react specifically and sensitively to cannabinoids. From 250 individuals consuming cannabis regularly, 99 healthy pure cannabis users were selected. They were free of any other past or present drug abuse, or history of neuropsychiatric disease. After an interview, physical examination, analysis of routine laboratory parameters, plasma/urine analyses for drugs, and MMPI testing, users and respective controls were subjected to a computer-assisted attention test battery comprising visual scanning, alertness, divided attention, flexibility, and working memory. Of the potential predictors of test performance within the user group, including present age, age of onset of cannabis use, degree of acute intoxication (THC+THCOH plasma levels), and cumulative toxicity (estimated total life dose), an early age of onset turned out to be the only predictor, predicting impaired reaction times exclusively in visual scanning. Early-onset users (onset before age 16; n = 48) showed a significant impairment in reaction times in this function, whereas late-onset users (onset after age 16; n = 51) did not differ from controls (n = 49). These data suggest that beginning cannabis use during early adolescence may lead to enduring effects on specific attentional functions in adulthood. Apparently, vulnerable periods during brain development exist that are subject to persistent alterations by interfering exogenous cannabinoids.

  15. Multiscale Laboratory Infrastructure and Services to users: Plans within EPOS

    NASA Astrophysics Data System (ADS)

    Spiers, Chris; Willingshofer, Ernst; Drury, Martyn; Funiciello, Francesca; Rosenau, Matthias; Scarlato, Piergiorgio; Sagnotti, Leonardo; EPOS WG6, Corrado Cimarelli

    2015-04-01

    The participant countries in EPOS embody a wide range of world-class laboratory infrastructures ranging from high temperature and pressure experimental facilities, to electron microscopy, micro-beam analysis, analogue modeling and paleomagnetic laboratories. Most data produced by the various laboratory centres and networks are presently available only in limited "final form" in publications. Many data remain inaccessible and/or poorly preserved. However, the data produced at the participating laboratories are crucial to serving society's need for geo-resources exploration and for protection against geo-hazards. Indeed, to model resource formation and system behaviour during exploitation, we need an understanding from the molecular to the continental scale, based on experimental data. This contribution will describe the plans that the laboratories community in Europe is making, in the context of EPOS. The main objectives are:

    • To collect and harmonize available and emerging laboratory data on the properties and processes controlling rock system behaviour at multiple scales, in order to generate products accessible and interoperable through services for supporting research activities.

    • To co-ordinate the development, integration and trans-national usage of the major solid Earth Science laboratory centres and specialist networks. The length scales encompassed by the infrastructures included range from the nano- and micrometer levels (electron microscopy and micro-beam analysis) to the scale of experiments on centimetre-sized samples, and to analogue model experiments simulating the reservoir scale, the basin scale and the plate scale.

    • To provide products and services supporting research into Geo-resources and Geo-storage, Geo-hazards and Earth System Evolution.

    If the EPOS Implementation Phase proposal presently under construction is successful, then a range of services and transnational activities will be put in place to realize these objectives.

  16. Facilities for macromolecular crystallography at the Helmholtz-Zentrum Berlin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Uwe; Darowski, Nora; Fuchs, Martin R.

    2012-03-20

    Three macromolecular crystallography (MX) beamlines at the Helmholtz-Zentrum Berlin (HZB) are available for the regional, national and international structural biology user community. The state-of-the-art synchrotron beamlines for MX BL14.1, BL14.2 and BL14.3 are located within the low-β section of the BESSY II electron storage ring. All beamlines are fed from a superconducting 7 T wavelength-shifter insertion device. BL14.1 and BL14.2 are energy tunable in the range 5-16 keV, while BL14.3 is a fixed-energy side station operated at 13.8 keV. All beamlines are equipped with CCD detectors. BL14.1 and BL14.2 are in regular user operation providing about 200 beam days per year and about 600 user shifts to approximately 50 research groups across Europe. BL14.3 has initially been used as a test facility and was brought into regular user mode operation during the year 2010. BL14.1 has recently been upgraded with a microdiffractometer including a mini-κ goniometer and an automated sample changer. Other user facilities include office space adjacent to the beamlines, a sample preparation laboratory, a biology laboratory (safety level 1) and high-end computing resources. In this article the instrumentation of the beamlines is described, and a summary of the experimental possibilities of the beamlines and the provided ancillary equipment for the user community is given.

  17. Facilities for macromolecular crystallography at the Helmholtz-Zentrum Berlin

    PubMed Central

    Mueller, Uwe; Darowski, Nora; Fuchs, Martin R.; Förster, Ronald; Hellmig, Michael; Paithankar, Karthik S.; Pühringer, Sandra; Steffien, Michael; Zocher, Georg; Weiss, Manfred S.

    2012-01-01

    Three macromolecular crystallography (MX) beamlines at the Helmholtz-Zentrum Berlin (HZB) are available for the regional, national and international structural biology user community. The state-of-the-art synchrotron beamlines for MX BL14.1, BL14.2 and BL14.3 are located within the low-β section of the BESSY II electron storage ring. All beamlines are fed from a superconducting 7 T wavelength-shifter insertion device. BL14.1 and BL14.2 are energy tunable in the range 5–16 keV, while BL14.3 is a fixed-energy side station operated at 13.8 keV. All three beamlines are equipped with CCD detectors. BL14.1 and BL14.2 are in regular user operation providing about 200 beam days per year and about 600 user shifts to approximately 50 research groups across Europe. BL14.3 has initially been used as a test facility and was brought into regular user mode operation during the year 2010. BL14.1 has recently been upgraded with a microdiffractometer including a mini-κ goniometer and an automated sample changer. Additional user facilities include office space adjacent to the beamlines, a sample preparation laboratory, a biology laboratory (safety level 1) and high-end computing resources. In this article the instrumentation of the beamlines is described, and a summary of the experimental possibilities of the beamlines and the provided ancillary equipment for the user community is given. PMID:22514183
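
    As a quick reference for the energies quoted in these two records, photon energy in keV converts to X-ray wavelength in angstroms via lambda = hc/E ≈ 12.398/E. The snippet below is a worked conversion, not material from the articles:

      # Photon energy (keV) to wavelength (Angstrom): lambda ~ 12.39842 / E.
      def kev_to_angstrom(energy_kev):
          return 12.39842 / energy_kev

      for e in (5.0, 13.8, 16.0):  # tuning-range ends and BL14.3's fixed energy
          print(f"{e:5.1f} keV -> {kev_to_angstrom(e):.3f} A")
      # 13.8 keV corresponds to ~0.898 A.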

  18. MyEEW: A Smartphone App for the ShakeAlert System

    NASA Astrophysics Data System (ADS)

    Strauss, J. A.; Allen, S.; Allen, R. M.; Hellweg, M.

    2015-12-01

    Earthquake Early Warning (EEW) is a system that can provide a few seconds to tens of seconds of warning prior to ground shaking at a user's location. The goal of such a system is to minimize the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, the University of Washington, the USGS, and beta users in California and the Pacific Northwest. The UC Berkeley Seismological Laboratory has created a smartphone app called MyEEW, which interfaces with the ShakeAlert system to deliver early warnings to individual users. Many critical facilities (transportation, police, and fire) have control rooms which could run a centralized interface, but our ShakeAlert beta testers have also expressed a need for mobile options. This app augments the basic ShakeAlert Java desktop applet by allowing workers off-site (or merely out of hearing range) to be informed of coming hazards. MyEEW receives information from the ShakeAlert system to provide users with real-time information about shaking that is about to happen at their individual location. It includes a map, timer, and earthquake information similar to the Java desktop User Display. The app will also feature educational material to help users craft their own response and resiliency strategies. The app will be open to UC Berkeley Earthquake Research Affiliates members for testing in the near future.

  19. Space station related investigations in Europe

    NASA Astrophysics Data System (ADS)

    Wienss, W.; Vallerain, E.

    1984-10-01

    Studies pertaining to the definition of Europe's role in the Space Station program are described, with consideration given to such elements as pressurized modules as laboratories for materials processing and life sciences, unpressurized elements, and service vehicles for on-orbit maintenance and repair activities. Candidate elements were selected against such criteria as clean interfaces, the satisfaction of European user needs, new technology items, and European financial capabilities; and their technical and programmatic implications were examined. Different scenarios were considered, ranging from a fully Space-Station-dependent case to a completely autonomous, free-flying man-tendable configuration. Recommendations on a collaboration between Europe and the United States are presented.

  20. Feeding People's Curiosity: Leveraging the Cloud for Automatic Dissemination of Mars Images

    NASA Technical Reports Server (NTRS)

    Knight, David; Powell, Mark

    2013-01-01

    Smartphones and tablets have made wireless computing ubiquitous, and users expect instant, on-demand access to information. The Mars Science Laboratory (MSL) operations software suite, MSL InterfaCE (MSLICE), employs a different back-end image processing architecture compared to that of the Mars Exploration Rovers (MER) in order to better satisfy modern consumer-driven usage patterns and to offer greater server-side flexibility. Cloud services are a centerpiece of the server-side architecture that allows new image data to be delivered automatically to both scientists using MSLICE and the general public through the MSL website (http://mars.jpl.nasa.gov/msl/).
