Sample records for process control laboratory

  1. 21 CFR 111.110 - What quality control operations are required for laboratory operations associated with the...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... laboratory operations associated with the production and process control system? 111.110 Section 111.110 Food... OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for Quality Control... production and process control system? Quality control operations for laboratory operations associated with...

  2. A senior manufacturing laboratory for determining injection molding process capability

    NASA Technical Reports Server (NTRS)

    Wickman, Jerry L.; Plocinski, David

    1992-01-01

    The following is a laboratory experiment designed to further the understanding of materials science. The subject material is directed at upper-level undergraduate/graduate students in an Engineering or Engineering Technology program, and it is assumed that the student has a thorough understanding of the process and of quality control. The format of this laboratory does not follow the normally recommended format because of the nature of process capability and of the injection molding equipment and tooling. Instead, the laboratory is developed as a point of departure for determining process capability for any process, in either a quality control laboratory or a manufacturing environment where control charts, process capability, and experimental or product design are considered important topics.

  3. VIEW OF THE INTERIOR OF BUILDING 125, THE STANDARDS LABORATORY. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF THE INTERIOR OF BUILDING 125, THE STANDARDS LABORATORY. THE PRIMARY FUNCTION OF THE STANDARDS LABORATORY WAS TO ENSURE AND IMPLEMENT A SYSTEM OF QUALITY CONTROL FOR INCOMING MATERIALS USED IN MANUFACTURING PROCESSES. SEVERAL ENGINEERING CONTROLS WERE USED TO ASSURE ACCURACY OF THE CALIBRATION PROCESSES INCLUDING: FLEX-FREE GRANITE TABLES, AIR LOCKED DOORS, TEMPERATURE CONTROLS, AND A SUPER-CLEAN ENVIRONMENT - Rocky Flats Plant, Standards Laboratory, Immediately north of 215A water tower & adjacent to Third Street, Golden, Jefferson County, CO

  4. Testing a Constrained MPC Controller in a Process Control Laboratory

    ERIC Educational Resources Information Center

    Ricardez-Sandoval, Luis A.; Blankespoor, Wesley; Budman, Hector M.

    2010-01-01

    This paper describes an experiment performed by the fourth year chemical engineering students in the process control laboratory at the University of Waterloo. The objective of this experiment is to test the capabilities of a constrained Model Predictive Controller (MPC) to control the operation of a Double Pipe Heat Exchanger (DPHE) in real time.…
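
    The constrained MPC scheme tested in this experiment can be illustrated with a minimal receding-horizon sketch: at each sampling instant an input sequence is optimized over a finite horizon subject to actuator bounds, and only the first move is applied. The first-order plant, horizon length, bounds, and weights below are illustrative assumptions, not the double pipe heat exchanger model or tuning used in the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy first-order plant standing in for the heat exchanger outlet temperature:
    # T[k+1] = a*T[k] + b*u[k], with u the (bounded) heating input.
    a, b = 0.8, 0.2            # assumed plant parameters (not the DPHE model)
    N = 10                     # prediction horizon
    u_min, u_max = 0.0, 100.0  # actuator constraints
    setpoint = 60.0

    def predict(t0, u_seq):
        t, traj = t0, []
        for u in u_seq:
            t = a * t + b * u
            traj.append(t)
        return np.array(traj)

    def cost(u_seq, t0):
        traj = predict(t0, u_seq)
        # tracking error plus a small penalty on input moves
        return np.sum((traj - setpoint) ** 2) + 0.01 * np.sum(np.diff(u_seq) ** 2)

    def mpc_step(t0, u_guess):
        res = minimize(cost, u_guess, args=(t0,), bounds=[(u_min, u_max)] * N)
        return res.x[0], res.x    # apply only the first optimized move

    temperature, u_guess = 25.0, np.full(N, 50.0)
    for k in range(20):
        u0, u_guess = mpc_step(temperature, u_guess)
        temperature = a * temperature + b * u0    # simulated plant response
        print(f"k={k:2d}  u={u0:6.2f}  T={temperature:6.2f}")
    ```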

  5. Systems integration for the Kennedy Space Center (KSC) Robotics Applications Development Laboratory (RADL)

    NASA Technical Reports Server (NTRS)

    Davis, V. Leon; Nordeen, Ross

    1988-01-01

    A laboratory for developing robotics technology for hazardous and repetitive Shuttle and payload processing activities is discussed. An overview of the computer hardware and software responsible for integrating the laboratory systems is given. The center's anthropomorphic robot is placed on a track allowing it to be moved to different stations. Various aspects of the laboratory equipment are described, including industrial robot arm control, smart systems integration, the supervisory computer, programmable process controller, real-time tracking controller, image processing hardware, and control display graphics. Topics of research include: automated loading and unloading of hypergolics for space vehicles and payloads; the use of mobile robotics for security, fire fighting, and hazardous spill operations; nondestructive testing for SRB joint and seal verification; Shuttle Orbiter radiator damage inspection; and Orbiter contour measurements. The possibility of expanding the laboratory in the future is examined.

  6. Safety in the Chemical Laboratory: Fire Safety and Fire Control in the Chemistry Laboratory.

    ERIC Educational Resources Information Center

    Wilbraham, A. C.

    1979-01-01

    Discusses fire safety and fire control in the chemistry laboratory. The combustion process, extinguishing equipment, extinguisher maintenance and location, and fire safety and practices are included. (HM)

  7. Improved compliance by BPM-driven workflow automation.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is improved transparency of processes and alignment of process diagrams with technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. As an automation language able to control every kind of activity or subprocess, BPMN 2.0 is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). This means that the BPM standard also provides a communication method for sharing process knowledge among laboratories. © 2014 Society for Laboratory Automation and Screening.

  8. 1. VIEW OF A PORTION OF THE HYDRIDE PROCESSING LABORATORY. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VIEW OF A PORTION OF THE HYDRIDE PROCESSING LABORATORY. OPERATIONS IN THE GLOVE BOX IN THE BACKGROUND OF THE PHOTOGRAPH INCLUDED HYDRIDING OF PLUTONIUM AND HYDRIDE SEPARATION. IN THE FOREGROUND, THE VACUUM MONITOR CONTROL PANEL MEASURED TEMPERATURES WITHIN THE GLOVEBOX. THE CENTER CONTROL PANEL REGULATED THE FURNACE INSIDE THE GLOVE BOX USED IN THE HYDRIDING PROCESSES. THIS EQUIPMENT WAS ESSENTIAL TO THE HYDRIDING PROCESS, AS WELL AS OTHER GLOVE BOX OPERATIONS. - Rocky Flats Plant, Plutonium Laboratory, North-central section of industrial area at 79 Drive, Golden, Jefferson County, CO

  9. PLACE: an open-source python package for laboratory automation, control, and experimentation.

    PubMed

    Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper

    2015-02-01

    In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation. © 2014 Society for Laboratory Automation and Screening.
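
    PLACE itself is the package to consult for its real interface; the sketch below only illustrates, under assumed class and method names that are not PLACE's actual API, the modular pattern such automation packages encourage: instrument modules sharing a common interface, orchestrated by a small experiment runner that collects one record per step.

    ```python
    from abc import ABC, abstractmethod


    class Instrument(ABC):
        """Common interface for instrument modules (names are illustrative, not PLACE's API)."""

        @abstractmethod
        def configure(self, **settings): ...

        @abstractmethod
        def update(self, step: int) -> dict: ...

        def cleanup(self):
            pass


    class DummyStage(Instrument):
        """Stand-in for a motorized stage stepped through positions."""

        def configure(self, start=0.0, increment=1.0):
            self.position, self.increment = start, increment

        def update(self, step):
            if step > 0:
                self.position += self.increment
            return {"position_mm": self.position}


    class DummyScope(Instrument):
        """Stand-in for a digitizer returning a placeholder trace."""

        def configure(self, samples=8):
            self.samples = samples

        def update(self, step):
            return {"trace": [0.0] * self.samples}


    def run_experiment(instruments, n_steps):
        """Orchestrate configured modules and collect one record per step."""
        records = []
        for step in range(n_steps):
            record = {"step": step}
            for inst in instruments:
                record.update(inst.update(step))
            records.append(record)
        for inst in instruments:
            inst.cleanup()
        return records


    if __name__ == "__main__":
        stage = DummyStage(); stage.configure(start=0.0, increment=0.5)
        scope = DummyScope(); scope.configure(samples=4)
        for rec in run_experiment([stage, scope], n_steps=3):
            print(rec)
    ```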

  10. Laboratory Control System's Effects on Student Achievement and Attitudes

    ERIC Educational Resources Information Center

    Cicek, Fatma Gozalan; Taspinar, Mehmet

    2016-01-01

    Problem Statement: The current study investigates whether the learning environment designed based on the laboratory control system affects the academic achievement, the attitude toward the learning-teaching process and the retention of the students in computer education. Purpose of Study: The study aims to identify the laboratory control system…

  11. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. The ISO standard Business Process Model and Notation (BPMN) 2.X provides a system-independent graphical process control notation, accepted across disciplines, that allows process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of numerous different interfaces between BPMSs and subsystems, and the simplification of complex process models. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application consisting of a mixture of manual and automated subprocesses demonstrates the presented BPMS-LES approach.

  12. A Multi-User Remote Academic Laboratory System

    ERIC Educational Resources Information Center

    Barrios, Arquimedes; Panche, Stifen; Duque, Mauricio; Grisales, Victor H.; Prieto, Flavio; Villa, Jose L.; Chevrel, Philippe; Canu, Michael

    2013-01-01

    This article describes the development, implementation and preliminary operation assessment of Multiuser Network Architecture to integrate a number of Remote Academic Laboratories for educational purposes on automatic control. Through the Internet, real processes or physical experiments conducted at the control engineering laboratories of four…

  13. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for the screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control designed to detect errors, and compares its data with other laboratories through external quality control. In this way it has a tool to verify that the objectives set are being fulfilled and, in case of errors, to allow corrective actions to be taken and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment at 6-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically summarized as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals, in order to ensure that they meet pre-determined specifications and, if not, apply the appropriate corrective actions. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
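
    The statistics named in this abstract (mean, standard deviation, coefficient of variation, and systematic, random, and total error) can be computed from control-material runs as in the short sketch below. The target value and replicate data are invented, and the total-error convention of bias plus 1.65 times CV is one common choice rather than necessarily the formula used in the study.

    ```python
    import statistics

    # Hypothetical replicate results for one control material (made-up numbers)
    results = [4.9, 5.1, 5.0, 5.2, 4.8, 5.1, 5.0, 4.9]
    target = 5.0  # assigned/peer-group value for the control

    mean = statistics.mean(results)
    sd = statistics.stdev(results)
    cv = 100 * sd / mean                      # random error, as %
    bias = 100 * (mean - target) / target     # systematic error, as %
    total_error = abs(bias) + 1.65 * cv       # one common total-error convention

    print(f"mean={mean:.3f}  SD={sd:.3f}  CV={cv:.2f}%  bias={bias:.2f}%  TE={total_error:.2f}%")
    ```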

  14. [Establishment of Quality Control System of Nucleic Acid Detection for Ebola Virus in Sierra Leone-China Friendship Biological Safety Laboratory].

    PubMed

    Wang, Qin; Zhang, Yong; Nie, Kai; Wang, Huanyu; Du, Haijun; Song, Jingdong; Xiao, Kang; Lei, Wenwen; Guo, Jianqiang; Wei, Hejiang; Cai, Kun; Wang, Yanhai; Wu, Jiang; Gerald, Bangura; Kamara, Idrissa Laybohr; Liang, Mifang; Wu, Guizhen; Dong, Xiaoping

    2016-03-01

    The quality control process throughout Ebola virus nucleic acid detection in the Sierra Leone-China Friendship Biological Safety Laboratory (SLE-CHN Biosafety Lab) is described in detail, in order to comprehensively display the scientific, rigorous, accurate and efficient practice in Ebola virus detection by the first batch detection team in the SLE-CHN Biosafety Lab. Firstly, the key points of the laboratory quality control system are described, including management and organization, quality control documents and information management, instruments, reagents and supplies, assessment, facilities design and space allocation, laboratory maintenance and biosecurity. Secondly, the application of quality control methods throughout the whole Ebola virus detection process, including before, during and after the test, is analyzed. The excellent and professional laboratory staff and the implementation of humanized management are the cornerstones of this success; high-level biological safety protection is the premise for effective quality control and completion of Ebola virus detection tasks; and professional logistics is a prerequisite for launching the laboratory diagnosis of Ebola virus. The establishment and running of the SLE-CHN Biosafety Lab has landmark significance for the friendship between Sierra Leone and China, and the lab has become the most important base for Ebola virus laboratory testing in Sierra Leone.

  15. Developing Learning Tool of Control System Engineering Using Matrix Laboratory Software Oriented on Industrial Needs

    NASA Astrophysics Data System (ADS)

    Isnur Haryudo, Subuh; Imam Agung, Achmad; Firmansyah, Rifqi

    2018-04-01

    The purpose of this research is to develop learning media for control engineering using Matrix Laboratory software with an industry-requirements approach. Learning media serve as a tool for creating a better and more effective teaching and learning situation because they can accelerate the learning process and enhance the quality of learning. Teaching control engineering with Matrix Laboratory software can increase students' interest and attention, provide real experience, and foster an independent attitude. The research design follows research and development (R&D) methods modified by a multi-disciplinary team of researchers. The research used a computer-based learning method consisting of a computer and Matrix Laboratory software integrated with physical props. Matrix Laboratory can visualize control system theory and analysis, integrating computing, visualization and programming in an easy-to-use environment. The resulting instructional media applies mathematical models in Matrix Laboratory software to a control system application with a DC motor plant and a PID (Proportional-Integral-Derivative) controller. This is relevant because Distributed Control Systems (DCSs), Programmable Controllers (PLCs), and Microcontrollers (MCUs) that apply PID control in production processes are widely used in industry.
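
    As a rough companion to the PID application described above, the discrete PID update law can be sketched in a few lines of Python (the paper itself works in Matrix Laboratory). The gains, sample time, and first-order DC-motor surrogate are illustrative assumptions, not the plant or tuning from the study.

    ```python
    # Minimal discrete PID loop on a first-order surrogate plant (illustrative values only)
    kp, ki, kd = 5.0, 5.0, 0.05      # assumed gains
    dt = 0.01                        # sample time [s]
    tau = 0.5                        # assumed motor time constant [s]
    setpoint = 100.0                 # target speed [rad/s]

    speed, integral, prev_error = 0.0, 0.0, 0.0
    for k in range(500):
        error = setpoint - speed
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative   # controller output (e.g., voltage)
        prev_error = error
        # First-order motor surrogate: tau * d(speed)/dt = u - speed
        speed += dt * (u - speed) / tau
    print(f"final speed after 5 s: {speed:.1f} rad/s")
    ```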

  16. Interfacing LabVIEW With Instrumentation for Electronic Failure Analysis and Beyond

    NASA Technical Reports Server (NTRS)

    Buchanan, Randy K.; Bryan, Coleman; Ludwig, Larry

    1996-01-01

    The Laboratory Virtual Instrumentation Engineering Workstation (LabVIEW) software is designed such that equipment and processes related to control systems can be operationally linked and controlled by the use of a computer. Various processes within the failure analysis laboratories of NASA's Kennedy Space Center (KSC) demonstrate the need for modernization and, in some cases, automation, using LabVIEW. An examination of procedures and practices within the Failure Analysis Laboratory resulted in the conclusion that some device was necessary to elevate potential users of LabVIEW to an operational level in minimum time. This paper outlines the process involved in creating a tutorial application to enable personnel to apply LabVIEW to their specific projects. Suggestions for furthering the extent to which LabVIEW is used are provided in the areas of data acquisition and process control.

  17. Walk-through survey report: control technology for fermentation processes at Wyeth Laboratories, Inc. , West Chester, Pennsylvania

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinez, K.F.

    A walk-through survey was conducted at Wyeth Laboratories, Incorporated, West Chester, Pennsylvania in November, 1983. The purpose of the survey was to evaluate the control technology for the fermentation processes. The facility produced penicillin-V and penicillin-G using the microbial strain Penicillium-chrysogenum. Medical examinations were available for fermentation and extraction process workers. Safety shoes and glasses and disposable dust respirators were provided. The author concludes that Wyeth has in operation an apparently effective system of control measures.

  18. 40 CFR 455.21 - Specialized definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... pollution control blowdown, steam jet blowdown, vacuum pump water, pump seal water, safety equipment.../process laboratory quality control wastewater. Notwithstanding any other regulation, process wastewater...

  19. 40 CFR 455.21 - Specialized definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... pollution control blowdown, steam jet blowdown, vacuum pump water, pump seal water, safety equipment.../process laboratory quality control wastewater. Notwithstanding any other regulation, process wastewater...

  20. 40 CFR 455.21 - Specialized definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... pollution control blowdown, steam jet blowdown, vacuum pump water, pump seal water, safety equipment.../process laboratory quality control wastewater. Notwithstanding any other regulation, process wastewater...

  1. International Co-Operation in Control Engineering Education Using Online Experiments

    ERIC Educational Resources Information Center

    Henry, Jim; Schaedel, Herbert M.

    2005-01-01

    This paper describes the international co-operation experience in teaching control engineering with laboratories being conducted remotely by students via the Internet. This paper describes how the students ran the experiments and their personal experiences with the laboratory. A tool for process identification and controller tuning based on…

  2. Application of Sigma Metrics Analysis for the Assessment and Modification of Quality Control Program in the Clinical Chemistry Laboratory of a Tertiary Care Hospital.

    PubMed

    Iqbal, Sahar; Mustansar, Tazeen

    2017-03-01

    Sigma is a metric that quantifies the performance of a process as a rate of defects per million opportunities. In clinical laboratories, sigma metric analysis is used to assess the performance of the laboratory process system. The sigma metric is also used as a quality management strategy for a laboratory process, improving quality by addressing errors after identification. The aim of this study is to evaluate the errors in quality control of the analytical phase of the laboratory system by sigma metric. For this purpose, sigma metric analysis was done for analytes using internal and external quality control as quality indicators. The results of sigma metric analysis were used to identify gaps and the need for modification in the strategy of the laboratory quality control procedure. The sigma metric was calculated for the quality control program of ten clinical chemistry analytes, including glucose, chloride, cholesterol, triglyceride, HDL, albumin, direct bilirubin, total bilirubin, protein and creatinine, at two control levels. To calculate the sigma metric, imprecision and bias were calculated from internal and external quality control data, respectively. The minimum acceptable performance was considered to be 3 sigma. Westgard sigma rules were applied to customize the quality control procedure. The sigma level was found acceptable (≥3) for glucose (L2), cholesterol, triglyceride, HDL, direct bilirubin and creatinine at both levels of control. For the rest of the analytes the sigma metric was found to be <3. The lowest sigma value was found for chloride (1.1) at L2, and the highest for creatinine (10.1) at L3. HDL was found to have the highest sigma values at both control levels (8.8 and 8.0 at L2 and L3, respectively). We conclude that analytes with a sigma value <3 require strict monitoring and modification of the quality control procedure. In this study, the application of sigma rules provided a practical solution for an improved and focused design of the QC procedure.
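
    The sigma-metric calculation summarized above is commonly expressed as sigma = (TEa − |bias|) / CV, with the allowable total error TEa, the bias, and the CV all in percent. The sketch below applies that formula to invented values; the analyte numbers are not taken from the study.

    ```python
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma = (allowable total error - |bias|) / imprecision, all in percent."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    # Hypothetical analytes: (TEa %, bias % from EQA, CV % from IQC) -- made-up values
    analytes = {
        "glucose":    (10.0, 2.0, 2.5),
        "chloride":   (5.0,  2.5, 2.3),
        "creatinine": (15.0, 1.0, 1.4),
    }

    for name, (tea, bias, cv) in analytes.items():
        s = sigma_metric(tea, bias, cv)
        flag = "acceptable (>=3)" if s >= 3 else "needs stricter QC (<3)"
        print(f"{name:10s} sigma = {s:4.1f}  -> {flag}")
    ```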

  3. Multiple-User, Multitasking, Virtual-Memory Computer System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1993-01-01

    Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and analysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.

  4. Assessing technical performance in differential gene expression experiments with external spike-in RNA control ratio mixtures.

    PubMed

    Munro, Sarah A; Lund, Steven P; Pine, P Scott; Binder, Hans; Clevert, Djork-Arné; Conesa, Ana; Dopazo, Joaquin; Fasold, Mario; Hochreiter, Sepp; Hong, Huixiao; Jafari, Nadereh; Kreil, David P; Łabaj, Paweł P; Li, Sheng; Liao, Yang; Lin, Simon M; Meehan, Joseph; Mason, Christopher E; Santoyo-Lopez, Javier; Setterquist, Robert A; Shi, Leming; Shi, Wei; Smyth, Gordon K; Stralis-Pavese, Nancy; Su, Zhenqiang; Tong, Weida; Wang, Charles; Wang, Jian; Xu, Joshua; Ye, Zhan; Yang, Yong; Yu, Ying; Salit, Marc

    2014-09-25

    There is a critical need for standard approaches to assess, report and compare the technical performance of genome-scale differential gene expression experiments. Here we assess technical performance with a proposed standard 'dashboard' of metrics derived from analysis of external spike-in RNA control ratio mixtures. These control ratio mixtures with defined abundance ratios enable assessment of diagnostic performance of differentially expressed transcript lists, limit of detection of ratio (LODR) estimates and expression ratio variability and measurement bias. The performance metrics suite is applicable to analysis of a typical experiment, and here we also apply these metrics to evaluate technical performance among laboratories. An interlaboratory study using identical samples shared among 12 laboratories with three different measurement processes demonstrates generally consistent diagnostic power across 11 laboratories. Ratio measurement variability and bias are also comparable among laboratories for the same measurement process. We observe different biases for measurement processes using different mRNA-enrichment protocols.

  5. Preanalytical errors in medical laboratories: a review of the available methodologies of data collection and analysis.

    PubMed

    West, Jamie; Atherton, Jennifer; Costelloe, Seán J; Pourmahram, Ghazaleh; Stretton, Adam; Cornes, Michael

    2017-01-01

    Preanalytical errors have previously been shown to contribute a significant proportion of errors in laboratory processes and contribute to a number of patient safety risks. Accreditation against ISO 15189:2012 requires that laboratory Quality Management Systems consider the impact of preanalytical processes in areas such as the identification and control of non-conformances, continual improvement, internal audit and quality indicators. Previous studies have shown that there is a wide variation in the definition, repertoire and collection methods for preanalytical quality indicators. The International Federation of Clinical Chemistry Working Group on Laboratory Errors and Patient Safety has defined a number of quality indicators for the preanalytical stage, and the adoption of harmonized definitions will support interlaboratory comparisons and continual improvement. There are a variety of data collection methods, including audit, manual recording processes, incident reporting mechanisms and laboratory information systems. Quality management processes such as benchmarking, statistical process control, Pareto analysis and failure mode and effect analysis can be used to review data and should be incorporated into clinical governance mechanisms. In this paper, The Association for Clinical Biochemistry and Laboratory Medicine PreAnalytical Specialist Interest Group review the various data collection methods available. Our recommendation is the use of the laboratory information management systems as a recording mechanism for preanalytical errors as this provides the easiest and most standardized mechanism of data capture.
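
    Of the review techniques listed in this abstract, Pareto analysis is the simplest to automate: rank the preanalytical error categories by count and identify the few categories that account for most of the errors. The categories and counts below are invented for illustration.

    ```python
    # Hypothetical monthly preanalytical error counts (invented numbers)
    errors = {
        "haemolysed sample": 180,
        "insufficient volume": 95,
        "clotted sample": 60,
        "mislabelled tube": 25,
        "wrong container": 15,
    }

    total = sum(errors.values())
    cumulative = 0.0
    print(f"{'category':22s}{'count':>7s}{'cum %':>8s}")
    for category, count in sorted(errors.items(), key=lambda kv: kv[1], reverse=True):
        cumulative += 100 * count / total
        print(f"{category:22s}{count:7d}{cumulative:8.1f}")
    # Categories reaching ~80% cumulative are the 'vital few' to target first.
    ```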

  6. An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.

    PubMed Central

    Undrill, P E; Frazer, S C

    1979-01-01

    A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
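
    The cumulative sum monitoring mentioned above can be sketched as a tabular CUSUM on control-serum results: accumulate deviations from the assigned value beyond a slack value k and flag a run when either sum exceeds a decision interval h. The target, k, h, and data below are illustrative assumptions, not the cited system's actual parameters.

    ```python
    # Tabular CUSUM on fixed-level control serum results (illustrative values)
    target, sd = 5.0, 0.10          # assigned value and known SD of the control
    k, h = 0.5 * sd, 4.0 * sd       # slack value and decision interval

    results = [5.02, 4.98, 5.05, 5.11, 5.09, 5.13, 5.16, 5.12, 5.15, 5.14]  # made-up run
    c_plus = c_minus = 0.0
    for i, x in enumerate(results, start=1):
        c_plus = max(0.0, c_plus + (x - target) - k)
        c_minus = max(0.0, c_minus + (target - x) - k)
        status = "OUT OF CONTROL" if max(c_plus, c_minus) > h else "in control"
        print(f"run {i}: x={x:.2f}  C+={c_plus:.3f}  C-={c_minus:.3f}  {status}")
    ```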

  7. Laboratory Processes for Confirmation of Lymphogranuloma Venereum Infection During a 2015 Investigation of a Cluster of Cases in the United States.

    PubMed

    Kersh, Ellen N; Pillay, Allan; de Voux, Alex; Chen, Cheng

    2017-11-01

    In September 2015, the Centers for Disease Control and Prevention were notified of a suspected outbreak investigation of lymphogranuloma venereum (LGV) cases by the Michigan Department of Health and Human Services. The Centers for Disease Control and Prevention offered support with a laboratory-developed polymerase chain reaction test for LGV. This note describes the laboratory workflow and procedures used for the laboratory confirmation of LGV infection.

  8. ELECTRONICS UPGRADE TO THE SAVANNAH RIVER NATIONAL LABORATORY COULOMETER FOR PLUTONIUM AND NEPTUNIUM ASSAY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cordaro, J.; Holland, M.; Reeves, G.

    The Savannah River Site (SRS) has the analytical measurement capability to perform high-precision plutonium concentration measurements by controlled-potential coulometry. State-of-the-art controlled-potential coulometers were designed and fabricated by the Savannah River National Laboratory (SRNL) and installed in the Analytical Laboratories process control laboratory. The Analytical Laboratories use coulometry for routine accountability measurements and for verification of standard preparations used to calibrate other plutonium measurement systems routinely applied to process control, nuclear safety, and other accountability applications. The SRNL coulometer has a demonstrated measurement reliability of ~0.05% for 10 mg samples. The system has also been applied to the characterization of neptunium standard solutions with comparable reliability. The SRNL coulometer features: a patented current integration system; continuous electrical calibration versus Faraday's constant and Ohm's law; the control-potential adjustment technique for enhanced application of the Nernst equation; a wide operating room-temperature range; and a fully automated instrument control and data acquisition capability. Systems have been supplied to the International Atomic Energy Agency (IAEA), Russia, the Japanese Atomic Energy Agency (JAEA) and the New Brunswick Laboratory (NBL). The most recent vintage of electronics was based on early 1990s integrated circuits, and many of the components are no longer available. At the request of the IAEA and the Department of State, SRNL has completed an electronics upgrade of their controlled-potential coulometer design. Three systems have been built with the new design: one for the IAEA, which was installed at SAL in May 2011; one for Los Alamos National Laboratory (LANL); and one for the SRS Analytical Laboratory. The LANL and SRS systems are undergoing startup testing, with installation scheduled for this summer.
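
    The controlled-potential coulometry described above rests on Faraday's law: the integrated charge fixes the amount of analyte oxidized or reduced. The back-calculation below uses illustrative numbers (and assumes the one-electron Pu(III)/Pu(IV) couple commonly used for plutonium assay); it is not SRNL measurement data.

    ```python
    # Faraday's-law back-calculation for a controlled-potential coulometric assay
    # (illustrative numbers only, not SRNL data)
    FARADAY = 96485.332      # C/mol
    M_PU = 239.05            # g/mol, assuming Pu-239
    n_electrons = 1          # assumed one-electron Pu(III) -> Pu(IV) step

    charge_coulombs = 4.04   # integrated charge from the current integrator (made up)
    moles = charge_coulombs / (n_electrons * FARADAY)
    mass_mg = moles * M_PU * 1000
    print(f"{charge_coulombs} C corresponds to about {mass_mg:.2f} mg Pu")
    ```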

  9. The Biochemistry of the Muscle Contraction Process: An Undergraduate Laboratory Experiment Using Viscosity to Follow the Progress of a Reaction.

    ERIC Educational Resources Information Center

    Belliveau, James F.; And Others

    1981-01-01

    Describes an undergraduate laboratory experiment using viscosity to follow the progress of the contractile process in muscles. This simple, short experiment illustrates the action of ATP as the source of energy in the contractile process and the catalytic effect of calcium ions as a control in the energy producing process. (CS)

  10. Efficiency in pathology laboratories: a survey of operations management in NHS bacteriology.

    PubMed

    Szczepura, A K

    1991-01-01

    In recent years pathology laboratory services in the U.K. have experienced large increases in demand. But the extent to which U.K. laboratories have introduced controls to limit unnecessary procedures within the laboratory was previously unclear. This paper presents the results of a survey of all 343 NHS bacteriology laboratories which records the extent to which such operations management controls are now in place. The survey shows large differences between laboratories. Quality controls over inputs, the use of screening tests as a culture substitute, the use of direct susceptibility testing, controls over routine antibiotic susceptibility testing, and controls over reporting of results all vary widely. The survey also records the prevalence of hospital antibiotic policies, the extent to which laboratories produce antibiograms for user clinicians, the degree of computerisation in data handling, and the degree of automation in processing specimens. Finally, the survey uncovers a large variation between NHS labs in the percentage of bacteriology samples which prove positive and lead to antibiotic susceptibility tests being carried out.

  11. Can a combination of average of normals and "real time" External Quality Assurance replace Internal Quality Control?

    PubMed

    Badrick, Tony; Graham, Peter

    2018-03-28

    Internal Quality Control and External Quality Assurance are separate but related processes that have developed independently in laboratory medicine over many years. They have different sample frequencies, statistical interpretations and immediacy. Both processes have evolved absorbing new understandings of the concept of laboratory error, sample material matrix and assay capability. However, we do not believe at the coalface that either process has led to much improvement in patient outcomes recently. It is the increasing reliability and automation of analytical platforms along with improved stability of reagents that has reduced systematic and random error, which in turn has minimised the risk of running less frequent IQC. We suggest that it is time to rethink the role of both these processes and unite them into a single approach using an Average of Normals model supported by more frequent External Quality Assurance samples. This new paradigm may lead to less confusion for laboratory staff and quicker responses to and identification of out of control situations.
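
    A minimal sketch of the Average of Normals idea discussed above: keep a moving window of patient results, discard values outside truncation limits, and flag when the window mean drifts outside control limits set around the expected population mean. The analyte, limits, window size, and data are all assumptions for illustration, not a validated protocol.

    ```python
    import random
    from collections import deque
    from statistics import mean

    # Average of Normals monitor (illustrative parameters, not a validated protocol)
    POP_MEAN, CONTROL_DELTA = 140.0, 1.5      # e.g., sodium mmol/L: expected mean and allowed drift
    TRUNC_LOW, TRUNC_HIGH = 130.0, 150.0      # exclude grossly abnormal patient results
    WINDOW = 20

    window = deque(maxlen=WINDOW)

    def check_result(value):
        """Add one patient result; return an alert once the window is full and has drifted."""
        if TRUNC_LOW <= value <= TRUNC_HIGH:
            window.append(value)
        if len(window) == WINDOW and abs(mean(window) - POP_MEAN) > CONTROL_DELTA:
            return f"ALERT: window mean {mean(window):.1f} outside {POP_MEAN}±{CONTROL_DELTA}"
        return "ok"

    # Simulated result stream with a positive shift appearing partway through (made-up data)
    random.seed(0)
    stream = [random.gauss(140, 2) for _ in range(30)] + [random.gauss(143, 2) for _ in range(30)]
    for i, v in enumerate(stream):
        status = check_result(v)
        if status != "ok":
            print(f"result #{i}: {status}")
            break
    ```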

  12. Materials Science Laboratory - Columnar-to-Equiaxed Transition in Solidification Processing and Microstructure Formation in Casting of Technical Alloys under Diffusive and Magnetically Controlled Convective Conditions

    NASA Technical Reports Server (NTRS)

    Gandin, Charles-Andre; Ratke, Lorenz

    2008-01-01

    The Materials Science Laboratory - Columnar-to-Equiaxed Transition in Solidification Processing and Microstructure Formation in Casting of Technical Alloys under Diffusive and Magnetically Controlled Convective Conditions (MSL-CETSOL and MICAST) are two investigations which supports research into metallurgical solidification, semiconductor crystal growth (Bridgman and zone melting), and measurement of thermo-physical properties of materials. This is a cooperative investigation with the European Space Agency (ESA) and National Aeronautics and Space Administration (NASA) for accommodation and operation aboard the International Space Station (ISS). Research Summary: Materials Science Laboratory - Columnar-to-Equiaxed Transition in Solidification Processing (CETSOL) and Microstructure Formation in Casting of Technical Alloys under Diffusive and Magnetically Controlled Convective Conditions (MICAST) are two complementary investigations which will examine different growth patterns and evolution of microstructures during crystallization of metallic alloys in microgravity. The aim of these experiments is to deepen the quantitative understanding of the physical principles that govern solidification processes in cast alloys by directional solidification.

  13. Colour stainability of indirect CAD-CAM processed composites vs. conventionally laboratory processed composites after immersion in staining solutions.

    PubMed

    Arocha, Mariana A; Basilio, Juan; Llopis, Jaume; Di Bella, Enrico; Roig, Miguel; Ardu, Stefano; Mayoral, Juan R

    2014-07-01

    The aim of this study was to determine, by using a spectrophotometer device, the colour stainability of two indirect CAD/CAM processed composites in comparison with two conventionally laboratory-processed composites after being immersed 4 weeks in staining solutions such as coffee, black tea and red wine, using distilled water as control group. Two indirect CAD/CAM composites (Lava Ultimate and Paradigm MZ100) and two conventionally laboratory-processed composites (SR Adoro and Premise Indirect) of shade A2 were selected (160 disc samples). Colour stainability was measured after 4 weeks of immersion in three staining solutions (black tea, coffee, red wine) and distilled water. Specimens' colour was measured each week by means of a spectrophotometer (CIE L*a*b* system). Statistical analysis was carried out using repeated-measures ANOVA and Tukey's HSD test to evaluate differences in ΔE00 measurements between groups; the interactions among composites, staining solutions and time duration were also evaluated. All materials showed significant discoloration (p<0.01) when compared to the control group. The highest ΔE00 observed was with red wine, whereas black tea showed the lowest one. Indirect laboratory-processed resin composites showed the highest colour stability compared with CAD/CAM resin blocks. CAD/CAM processed composites immersed in staining solutions showed lower colour stability when compared to conventionally laboratory-processed resin composites. The demand for CAD/CAM restorations has been increasing; however, colour stainability for such material has been insufficiently studied. Moreover, this has not been performed comparing CAD/CAM processed composites versus laboratory-processed indirect composites by immersing in staining solutions for long immersion periods. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. The Individualized Quality Control Plan - Coming Soon to Clinical Microbiology Laboratories Everywhere!

    PubMed

    Anderson, Nancy

    2015-11-15

    As of January 1, 2016, microbiology laboratories can choose to adopt a new quality control option, the Individualized Quality Control Plan (IQCP), under the Clinical Laboratory Improvement Amendments of 1988 (CLIA). This voluntary approach increases flexibility for meeting regulatory requirements and provides laboratories the opportunity to customize QC for their testing in their unique environments and by their testing personnel. IQCP is an all-inclusive approach to quality based on risk management to address potential errors in the total testing process. It includes three main steps, (1) performing a risk assessment, (2) developing a QC plan, and (3) monitoring the plan through quality assessment. Resources are available from the Centers for Medicare & Medicaid Services, Centers for Disease Control and Prevention, American Society for Microbiology, Clinical and Laboratory Standards Institute, and accrediting organizations, such as the College of American Pathologists and Joint Commission, to assist microbiology laboratories implementing IQCP.

  15. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds †

    PubMed Central

    Frank, Jared A.; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464

  16. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds.

    PubMed

    Frank, Jared A; Brill, Anthony; Kapila, Vikram

    2016-08-20

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.

  17. REDUCING WASTEWATER FROM CUCUMBER PICKLING PROCESS BY CONTROLLED CULTURE FERMENTATION

    EPA Science Inventory

    On a demonstration scale, the controlled culture fermentation process (CCF) developed by the U.S. Food Fermentation Laboratory was compared with the conventional natural fermentation process (NF) in regard to product quality and yield and volume and concentration of wastewaters. ...

  18. Quality-assurance plan for the analysis of fluvial sediment by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory

    USGS Publications Warehouse

    Shreve, Elizabeth A.; Downs, Aimee C.

    2005-01-01

    This report describes laboratory procedures used by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory for the processing and analysis of fluvial-sediment samples for concentration of sand and finer material. The report details the processing of a sediment sample through the laboratory from receiving the sediment sample, through the analytical process, to compiling results of the requested analysis. Procedures for preserving sample integrity, calibrating and maintaining of laboratory and field instruments and equipment, analyzing samples, internal quality assurance and quality control, and validity of the sediment-analysis results also are described. The report includes a list of references cited and a glossary of sediment and quality-assurance terms.

  19. Application of statistical process control to qualitative molecular diagnostic assays.

    PubMed

    O'Brien, Cathal P; Finn, Stephen P

    2014-01-01

    Modern pathology laboratories, and in particular high-throughput laboratories such as clinical chemistry, have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data were also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.
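
    The frequency estimate coupled with a confidence interval described above can be sketched as follows: compute a binomial confidence interval around the observed mutation-positivity rate for a batch and flag batches whose interval excludes the expected frequency. The normal-approximation interval and the numbers used are illustrative assumptions, not the authors' exact method.

    ```python
    import math

    def positivity_check(positives, total, expected_rate, z=1.96):
        """Flag a batch whose 95% CI for the positivity rate excludes the expected frequency."""
        p = positives / total
        half_width = z * math.sqrt(p * (1 - p) / total)   # normal-approximation CI
        low, high = p - half_width, p + half_width
        in_control = low <= expected_rate <= high
        return p, (low, high), in_control

    # Hypothetical example: ~15% expected mutation frequency, 200 samples, 18 positives
    p, ci, ok = positivity_check(positives=18, total=200, expected_rate=0.15)
    print(f"observed {p:.1%}, 95% CI {ci[0]:.1%}-{ci[1]:.1%}, "
          f"{'consistent with expectation' if ok else 'OUT OF CONTROL'}")
    ```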

  20. Sonication standard laboratory module

    DOEpatents

    Beugelsdijk, Tony; Hollen, Robert M.; Erkkila, Tracy H.; Bronisz, Lawrence E.; Roybal, Jeffrey E.; Clark, Michael Leon

    1999-01-01

    A standard laboratory module for automatically producing a solution of contaminants from a soil sample. A sonication tip agitates a solution containing the soil sample in a beaker while a stepper motor rotates the sample. An aspirator tube, connected to a vacuum, draws the upper layer of solution from the beaker through a filter and into another beaker. This beaker can thereafter be removed for analysis of the solution. The standard laboratory module encloses an embedded controller providing process control, status feedback information and maintenance procedures for the equipment and operations within the standard laboratory module.

  1. Mechanisms and kinetics of cellulose fermentation for protein production

    NASA Technical Reports Server (NTRS)

    Dunlap, C. A.

    1971-01-01

    The development of a process (and ancillary processing and analytical techniques) to produce bacterial single-cell protein of good nutritional quality from waste cellulose is discussed. A fermentation pilot plant and laboratory were developed and have been in operation for about two years. Single-cell protein (SCP) can be produced from sugarcane bagasse--a typical agricultural cellulosic waste. The optimization and understanding of this process and its controlling variables are examined. Both batch and continuous fermentation runs have been made under controlled conditions in the 535 liter pilot plant vessel and in the laboratory 14-liter fermenters.

  2. METHOD-SPECIFIC PRECISION AND BIAS RELATIONSHIPS DEVELOPED FROM DATA SUBMITTED DURING USEPA DRINKING WATER LABORATORY PERFORMANCE EVALUATION STUDIES

    EPA Science Inventory

    This paper documents the process used by the United States Environmental Protection Agency (USEPA) to estimate the mean and standard deviation of data reported by in-control drinking water laboratories during Water Supply (WS) studies. This process is then applied to the data re...

  3. METHOD-SPECIFIC PRECISION AND BIAS RELATIONSHIPS DEVELOPED FROM DATA SUBMITTED DURING USEPA WASTEWATER LABORATORY PERFORMANCE EVALUATION STUDIES

    EPA Science Inventory

    This paper documents the process used by the United States Environmental Protection Agency (USEPA) to estimate the mean and standard deviation of data reported by in-control wastewater laboratories during Water Pollution (WP) studies. This process is then applied to the data rep...

  4. Errors in clinical laboratories or errors in laboratory medicine?

    PubMed

    Plebani, Mario

    2006-01-01

    Laboratory testing is a highly complex process and, although laboratory services are relatively safe, they are not as safe as they could or should be. Clinical laboratories have long focused their attention on quality control methods and quality assessment programs dealing with analytical aspects of testing. However, a growing body of evidence accumulated in recent decades demonstrates that quality in clinical laboratories cannot be assured by merely focusing on purely analytical aspects. The more recent surveys on errors in laboratory medicine conclude that in the delivery of laboratory testing, mistakes occur more frequently before (pre-analytical) and after (post-analytical) the test has been performed. Most errors are due to pre-analytical factors (46-68.2% of total errors), while a high error rate (18.5-47% of total errors) has also been found in the post-analytical phase. Errors due to analytical problems have been significantly reduced over time, but there is evidence that, particularly for immunoassays, interference may have a serious impact on patients. A description of the most frequent and risky pre-, intra- and post-analytical errors and advice on practical steps for measuring and reducing the risk of errors is therefore given in the present paper. Many mistakes in the Total Testing Process are called "laboratory errors", although these may be due to poor communication, action taken by others involved in the testing process (e.g., physicians, nurses and phlebotomists), or poorly designed processes, all of which are beyond the laboratory's control. Likewise, there is evidence that laboratory information is only partially utilized. A recent document from the International Organization for Standardization (ISO) recommends a new, broader definition of the term "laboratory error" and a classification of errors according to different criteria. In a modern approach to total quality, centered on patients' needs and satisfaction, the risk of errors and mistakes in pre- and post-examination steps must be minimized to guarantee the total quality of laboratory services.

  5. Laboratory cost control and financial management software.

    PubMed

    Mayer, M

    1998-02-09

    Economical constraints within the health care system advocate the introduction of tighter control of costs in clinical laboratories. Detailed cost information forms the basis for cost control and financial management. Based on the cost information, proper decisions regarding priorities, procedure choices, personnel policies and investments can be made. This presentation outlines some principles of cost analysis, describes common limitations of cost analysis, and exemplifies use of software to achieve optimized cost control. One commercially available cost analysis software, LabCost, is described in some detail. In addition to provision of cost information, LabCost also serves as a general management tool for resource handling, accounting, inventory management and billing. The application of LabCost in the selection process of a new high throughput analyzer for a large clinical chemistry service is taken as an example for decisions that can be assisted by cost evaluation. It is concluded that laboratory management that wisely utilizes cost analysis to support the decision-making process will undoubtedly have a clear advantage over those laboratories that fail to employ cost considerations to guide their actions.

  6. QCloud: A cloud-based quality control system for mass spectrometry-based proteomics laboratories

    PubMed Central

    Chiva, Cristina; Olivella, Roger; Borràs, Eva; Espadas, Guadalupe; Pastor, Olga; Solé, Amanda

    2018-01-01

    The increasing number of biomedical and translational applications in mass spectrometry-based proteomics poses new analytical challenges and raises the need for automated quality control systems. Despite previous efforts to set standard file formats, data processing workflows and key evaluation parameters for quality control, automated quality control systems are not yet widespread among proteomics laboratories, which limits the acquisition of high-quality results, inter-laboratory comparisons and the assessment of variability of instrumental platforms. Here we present QCloud, a cloud-based system to support proteomics laboratories in daily quality assessment using a user-friendly interface, easy setup, automated data processing and archiving, and unbiased instrument evaluation. QCloud supports the most common targeted and untargeted proteomics workflows, it accepts data formats from different vendors and it enables the annotation of acquired data and reporting incidences. A complete version of the QCloud system has successfully been developed and it is now open to the proteomics community (http://qcloud.crg.eu). QCloud system is an open source project, publicly available under a Creative Commons License Attribution-ShareAlike 4.0. PMID:29324744

  7. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to the possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above mentioned problems. It has been tested on serum potassium which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.

  8. [Role of the independent microbiology laboratory in supporting infection control programs in small to mid-sized hospitals].

    PubMed

    Yanagisawa, Hideji

    2009-05-01

    With the revision of the Medical Service Law in 2006 by the Japanese Ministry of Health, Labour and Welfare (MHLW), all healthcare institutions are now required to implement a healthcare risk management program including infection control program. At a national level, an infection control surveillance program (JANIS) was implemented in July 2007. Regular weekly, monthly, and yearly infection control surveillance reports from independent microbiology laboratories can make significant contributions to infection control programs in small to mid-sized hospitals; furthermore, such programs are consistent with the framework of the MHLW's objective of strengthening risk management in healthcare institutions. Against the backdrop of current efforts to improve risk management, independent laboratories can make a significant contribution. Independent laboratories must play a role beyond merely receiving and processing specimens for microbiological examination. In addition to generating results for patients, hospital epidemiological data that contribute to local infection control programs must be a value-added component of the service. A major obstacle for independent laboratories to make a significant contribution to risk management is the current reimbursement system, which makes it economically impossible for independent laboratories to support infection control programs in healthcare institutions.

  9. An Interactive Computer-Aided Instructional Strategy and Assessment Methods for System Identification and Adaptive Control Laboratory

    ERIC Educational Resources Information Center

    Özbek, Necdet Sinan; Eker, Ilyas

    2015-01-01

    This study describes a set of real-time interactive experiments that address system identification and model reference adaptive control (MRAC) techniques. In constructing laboratory experiments that contribute to efficient teaching, experimental design and instructional strategy are crucial, but a process for doing this has yet to be defined. This…

  10. What's to Be Done About Laboratory Quality? Process Indicators, Laboratory Stewardship, the Outcomes Problem, Risk Assessment, and Economic Value: Responding to Contemporary Global Challenges.

    PubMed

    Meier, Frederick A; Badrick, Tony C; Sikaris, Kenneth A

    2018-02-17

    For 50 years, structure, process, and outcomes measures have assessed health care quality. For clinical laboratories, structural quality has generally been assessed by inspection. For assessing process, quality indicators (QIs), statistical monitors of steps in the clinical laboratory total testing, have proliferated across the globe. Connections between structural and process laboratory measures and patient outcomes, however, have rarely been demonstrated. To inform further development of clinical laboratory quality systems, we conducted a selective but worldwide review of publications on clinical laboratory quality assessment. Some QIs, like seven generic College of American Pathologists Q-Tracks monitors, have demonstrated significant process improvement; other measures have uncovered critical opportunities to improve test selection and result management. The College of Pathologists of Australasia Key Indicator Monitoring and Management System has deployed risk calculations, introduced from failure mode effects analysis, as surrogate measures for outcomes. Showing economic value from clinical laboratory testing quality is a challenge. Clinical laboratories should converge on fewer (7-14) rather than more (21-35) process monitors; monitors should cover all steps of the testing process under laboratory control and include especially high-risk specimen-quality QIs. Clinical laboratory stewardship, the combination of education interventions among clinician test orderers and report consumers with revision of test order formats and result reporting schemes, improves test ordering, but improving result reception is more difficult. Risk calculation reorders the importance of quality monitors by balancing three probabilities: defect frequency, weight of potential harm, and detection difficulty. The triple approach of (1) a more focused suite of generic consensus quality indicators, (2) more active clinical laboratory testing stewardship, and (3) integration of formal risk assessment, rather than competing with economic value, enhances it.
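
    The risk calculation mentioned above follows the failure mode and effects analysis pattern of scoring each failure mode on occurrence, severity, and detectability and ranking by their product (the risk priority number). The scales and example failure modes below are illustrative, not the KIMMS scoring scheme.

    ```python
    # FMEA-style risk priority numbers for candidate quality monitors
    # (occurrence, severity, detectability each scored 1-10; values are illustrative)
    failure_modes = {
        "haemolysed specimen accepted":       (7, 6, 4),
        "wrong patient identification":       (2, 10, 6),
        "delayed critical-result phone call": (4, 8, 5),
    }

    ranked = sorted(
        ((name, occ * sev * det) for name, (occ, sev, det) in failure_modes.items()),
        key=lambda item: item[1],
        reverse=True,
    )
    for name, rpn in ranked:
        print(f"RPN {rpn:4d}  {name}")
    ```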

  11. Managing laboratory automation

    PubMed Central

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved and how they interrelate. A big-picture, or continuum, view is presented, and some of the reasons for the success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed. PMID:18925018

  12. Managing laboratory automation.

    PubMed

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved and how they interrelate. A big-picture, or continuum, view is presented, and some of the reasons for the success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  13. Development of the Global Measles Laboratory Network.

    PubMed

    Featherstone, David; Brown, David; Sanders, Ray

    2003-05-15

    The routine reporting of suspected measles cases and laboratory testing of samples from these cases is the backbone of measles surveillance. The Global Measles Laboratory Network (GMLN) has developed standards for laboratory confirmation of measles and provides training resources for staff of network laboratories, reference materials and expertise for the development and quality control of testing procedures, and accurate information for the Measles Mortality Reduction and Regional Elimination Initiative. The GMLN was developed along the lines of the successful Global Polio Laboratory Network, and much of the polio laboratory infrastructure was utilized for measles. The GMLN has developed as countries focus on measles control activities following successful eradication of polio. Currently more than 100 laboratories are part of the global network and follow standardized testing and reporting procedures. A comprehensive laboratory accreditation process will be introduced in 2002 with six quality assurance and performance indicators.

  14. Pre-Analytical Components of Risk in Four Branches of Clinical Laboratory in Romania--Prospective Study.

    PubMed

    David, Remona E; Dobreanu, Minodora

    2016-01-01

    Development of quality measurement principles is a strategic point for each clinical laboratory. The pre-examination process is the most critical and the most difficult to manage. The aim of this study is to use quality indicators to identify, quantify, and monitor nonconformities of the pre-analytical process that can affect patient safety, in four different locations of a Romanian private clinical laboratory. The study group consisted of all the analysis requests received by the departments of biochemistry, hematology, and coagulation from January through March 2015. To collect the pre-analytical nonconformities, we created a "Risk Budget" using the entries from the "Evidence notebook--non-conform samples" of the above-mentioned departments. The laboratory established the quality indicators by means of a risk management technique, FMEA (Failure Mode and Effects Analysis), implemented and monitored for its own purposes and specific needs, in order to identify and control the sources of error. To assess the level of control over the processes, the results were transformed onto the Six Sigma scale using the Westgard calculation method (https://www.westgard.com/six-sigma-calculators.htm), thereby obtaining the frequency with which an error may occur. The results show that quantification and monitoring of the indicators can serve as a control instrument for pre-analytical activities. Calculation of the Six Sigma value adds information to the study because it allows detection of the processes that need improvement (a Sigma value higher than 4 represents a well-controlled process). The highest rates were observed for hemolyzed and lipemic samples in the biochemistry department, and for hemolyzed, insufficient-volume, or clotted samples in the hematology and coagulation departments. Statistically significant differences between the participating laboratories were recorded for these indicators. Conducting the study across the four branches of a Romanian private clinical laboratory was a challenge, but it supported strategic decisions aimed at improving patient safety in the institution, in line with the accreditation requirements of ISO 15189:2013.
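
    The Westgard Six Sigma conversion referred to above maps an observed error frequency onto the sigma scale. A minimal sketch of that conversion follows; the 1.5-sigma shift convention and the sample numbers are assumptions for illustration, not data from the study.

      from statistics import NormalDist

      def sigma_from_error_rate(errors, opportunities, shift=1.5):
          """Convert an observed defect rate to a sigma level using the
          conventional 1.5-sigma long-term shift (assumed convention)."""
          defect_fraction = errors / opportunities
          return NormalDist().inv_cdf(1.0 - defect_fraction) + shift

      # Hypothetical example: 155 hemolyzed samples out of 25,000 requests.
      print(round(sigma_from_error_rate(155, 25_000), 2))  # ~4.0 sigma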

  15. A Process Dynamics and Control Experiment for the Undergraduate Laboratory

    ERIC Educational Resources Information Center

    Spencer, Jordan L.

    2009-01-01

    This paper describes a process control experiment. The apparatus includes a three-vessel glass flow system with a variable flow configuration, means for feeding dye solution controlled by a stepper-motor driven valve, and a flow spectrophotometer. Students use impulse response data and nonlinear regression to estimate three parameters of a model…

  16. Mistakes in a stat laboratory: types and frequency.

    PubMed

    Plebani, M; Carraro, P

    1997-08-01

    Application of Total Quality Management concepts to laboratory testing requires that the total process, including preanalytical and postanalytical phases, be managed so as to reduce or, ideally, eliminate all defects within the process itself. Indeed a "mistake" can be defined as any defect during the entire testing process, from ordering tests to reporting results. We evaluated the frequency and types of mistakes found in the "stat" section of the Department of Laboratory Medicine of the University-Hospital of Padova by monitoring four different departments (internal medicine, nephrology, surgery, and intensive care unit) for 3 months. Among a total of 40490 analyses, we identified 189 laboratory mistakes, a relative frequency of 0.47%. The distribution of mistakes was: preanalytical 68.2%, analytical 13.3%, and postanalytical 18.5%. Most of the laboratory mistakes (74%) did not affect patients' outcome. However, in 37 patients (19%), laboratory mistakes were associated with further inappropriate investigations, thus resulting in an unjustifiable increase in costs. Moreover, in 12 patients (6.4%) laboratory mistakes were associated with inappropriate care or inappropriate modification of therapy. The promotion of quality control and continuous improvement of the total testing process, including pre- and postanalytical phases, seems to be a prerequisite for an effective laboratory service.

  17. Microprocessors: Laboratory Simulation of Industrial Control Applications.

    ERIC Educational Resources Information Center

    Gedeon, David V.

    1981-01-01

    Describes a course to make technical managers more aware of computer technology and how data loggers, programmable controllers, and larger computer systems interact in a hierarchical configuration of manufacturing process control. (SK)

  18. Long story short: an introduction to the short-term and long-term Six Sigma quality and its importance in the laboratory medicine for the management of extra-analytical processes.

    PubMed

    Ialongo, Cristiano; Bernardini, Sergio

    2018-06-18

    There is a compelling need for quality tools that enable effective control of the extra-analytical phase. In this regard, Six Sigma offers a valid methodological and conceptual opportunity, and the International Federation of Clinical Chemistry and Laboratory Medicine has recently adopted it for stating the performance requirements of non-analytical laboratory processes. However, Six Sigma implies a distinction between short-term and long-term quality that is based on the dynamics of the processes. These concepts are still not widely known or applied in the field of laboratory medicine, although they are of fundamental importance for exploiting the full potential of the methodology. This paper reviews the Six Sigma quality concepts and shows how they originated from Shewhart's control charts, to which they are a complement rather than an alternative. It also discusses the dynamic nature of processes and how it arises, particularly with regard to long-term variation of the process mean, and explains why this leads to the fundamental distinction of quality mentioned above.

  19. Experimental Replication of an Aeroengine Combustion Instability

    NASA Technical Reports Server (NTRS)

    Cohen, J. M.; Hibshman, J. R.; Proscia, W.; Rosfjord, T. J.; Wake, B. E.; McVey, J. B.; Lovett, J.; Ondas, M.; DeLaat, J.; Breisacher, K.

    2000-01-01

    Combustion instabilities in gas turbine engines are most frequently encountered during the late phases of engine development, at which point they are difficult and expensive to fix. The ability to replicate an engine-traceable combustion instability in a laboratory-scale experiment offers the opportunity to economically diagnose the problem (to determine the root cause), and to investigate solutions to the problem, such as active control. The development and validation of active combustion instability control requires that the causal dynamic processes be reproduced in experimental test facilities which can be used as a test bed for control system evaluation. This paper discusses the process through which a laboratory-scale experiment was designed to replicate an instability observed in a developmental engine. The scaling process used physically-based analyses to preserve the relevant geometric, acoustic and thermo-fluid features. The process increases the probability that results achieved in the single-nozzle experiment will be scalable to the engine.

  20. Automation and Robotics in the Laboratory.

    ERIC Educational Resources Information Center

    DiCesare, Frank; And Others

    1985-01-01

    A general laboratory course featuring microcomputer interfacing for data acquisition, process control and automation, and robotics was developed at Rensselaer Polytechnic Institute and is now available to all junior engineering students. The development and features of the course are described. (JN)

  1. Statistical process control in Deep Space Network operation

    NASA Technical Reports Server (NTRS)

    Hodder, J. A.

    2002-01-01

    This report describes how the Deep Space Mission System (DSMS) Operations Program Office at the Jet Propulsion Laboratory (JPL) uses Statistical Process Control (SPC) to monitor performance and evaluate initiatives for improving processes on the National Aeronautics and Space Administration's (NASA) Deep Space Network (DSN).

  2. Analytical control test plan and microbiological methods for the water recovery test

    NASA Technical Reports Server (NTRS)

    Traweek, M. S. (Editor); Tatara, J. D. (Editor)

    1994-01-01

    Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more options or processes. It is therefore essential that the handling of laboratory samples and the analytical operations employed are performed with a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and misinformed decisions. This document provides the analytical control specifications that will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). It describes the process that will be used to verify analytical data generated throughout the test period, identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed, and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.

  3. The concept and science process skills analysis in bomb calorimeter experiment as a foundation for the development of virtual laboratory of bomb calorimeter

    NASA Astrophysics Data System (ADS)

    Kurniati, D. R.; Rohman, I.

    2018-05-01

    This study aims to analyze the concepts and science process skills in the bomb calorimeter experiment as a basis for developing a virtual laboratory of the bomb calorimeter. The study employed the research and development (R&D) method to answer the proposed problems. This paper discusses the concept and process skills analysis. The essential concepts and process skills associated with the bomb calorimeter were analyzed by optimizing the bomb calorimeter experiment. The concept analysis identified seven fundamental concepts to be addressed in developing the virtual laboratory: internal energy, heat of combustion, perfect combustion, incomplete combustion, calorimeter constant, bomb calorimeter, and Black's principle. Because the concepts of the bomb calorimeter and of perfect and incomplete combustion represent real situations and contain controllable variables, these concepts are displayed in the virtual laboratory as simulations. The remaining four concepts are presented as animations because they contain no variables to be controlled. The process skills analysis identified four notable skills to be developed: observation, experimental design, interpretation, and communication.

  4. [Study of continuous quality improvement for clinical laboratory processes via the platform of Hospital Group].

    PubMed

    Song, Wenqi; Shen, Ying; Peng, Xiaoxia; Tian, Jian; Wang, Hui; Xu, Lili; Nie, Xiaolu; Ni, Xin

    2015-05-26

    A program of continuous quality improvement in clinical laboratory processes for the complete blood count (CBC) was launched via the platform of the Beijing Children's Hospital Group in order to improve the quality of pediatric clinical laboratories. Fifteen children's hospitals of the Beijing Children's Hospital Group were investigated using a Chinese-adapted continuous quality improvement method based on PDCA (Plan-Do-Check-Act). A questionnaire survey and an inter-laboratory comparison were conducted to identify existing problems, analyze their causes, set quality targets, and put them into practice. Targeted training was then provided to the 15 hospitals, followed by a second questionnaire survey and self-examinations by the clinical laboratories. At the same time, the Group's online internal quality control platform was established. The overall effects of the program were evaluated to lay a foundation for the next PDCA cycle. Both the quality control system documents and the CBC internal quality control schemes of all the clinical laboratories were improved through this program. In addition, standardization of performance verification improved, with the comparability verification rate for precision and intra-laboratory results reaching 100%. In terms of instrument calibration and mandatory diagnostic rates, only three of the 15 hospitals (20%) failed in 2014, down from seven (46.67%) in 2013. Abnormal intraday precision coefficients of variation for the five CBC parameters (WBC, RBC, Hb, Plt and Hct) across all 15 laboratories accounted for 1.2% (2/165) in 2014, a marked decrease from 9.6% (14/145) in 2013. The number of hospitals using only one level of quality control material for daily quality control dropped from five to three. The 15 hospitals organized 263 training sessions in 2014, up 160% from 101 in 2013. The quality improvement program launched via the Hospital Group platform can promote joint development of the pediatric clinical laboratory discipline across the member hospitals, with remarkable results, and the experience is worth wider rollout.

  5. Sigma Metrics Across the Total Testing Process.

    PubMed

    Charuruks, Navapun

    2017-03-01

    Laboratory quality control has been developed over several decades to ensure patient safety, expanding from a statistical quality control focus on the analytical phase to the total laboratory process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defects-per-million and sigma metric equations. Participation in a sigma verification program can be a convenient way to monitor continuous quality improvement of analytical performance. Our data show improvement in sigma-scale performance. New tools and techniques for integration are needed.
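
    For the analytical phase, the sigma metric equation mentioned above is conventionally written as sigma = (TEa - |bias|) / CV, with all terms expressed in percent; extra-analytical steps are instead converted from defects per million. A minimal sketch follows; the analyte and its performance figures are hypothetical.

      def analytical_sigma(tea_pct, bias_pct, cv_pct):
          """Sigma metric for an analyte: (allowable total error - |bias|) / imprecision.
          Values near 3 usually flag a method needing tighter QC rules; 6 or more
          is commonly read as world-class performance."""
          return (tea_pct - abs(bias_pct)) / cv_pct

      # Hypothetical glucose example: TEa 10%, bias 1.2%, CV 1.8%.
      print(round(analytical_sigma(10.0, 1.2, 1.8), 2))  # ~4.89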

  6. Instrument Quality Control.

    PubMed

    Jayakody, Chatura; Hull-Ryde, Emily A

    2016-01-01

    Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
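
    Gravimetric QC of a liquid dispenser, as described above, reduces to weighing replicate dispenses and converting mass to volume through the liquid density. A minimal sketch follows; the target volume, density, and replicate weights are illustrative values, not data from the chapter.

      from statistics import mean, stdev

      def gravimetric_qc(masses_mg, density_mg_per_ul, target_ul):
          """Return accuracy (% of target) and precision (%CV) from replicate dispense weights."""
          volumes = [m / density_mg_per_ul for m in masses_mg]
          accuracy = mean(volumes) / target_ul * 100.0
          cv = stdev(volumes) / mean(volumes) * 100.0
          return round(accuracy, 1), round(cv, 2)

      # Ten replicate 50 uL water dispenses weighed in mg (water density ~0.998 mg/uL at 20 C).
      weights = [49.8, 50.1, 49.9, 50.0, 49.7, 50.2, 49.9, 50.0, 49.8, 50.1]
      print(gravimetric_qc(weights, 0.998, 50.0))  # -> (100.1, 0.32)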

  7. A Systematic Approach to Capacity Strengthening of Laboratory Systems for Control of Neglected Tropical Diseases in Ghana, Kenya, Malawi and Sri Lanka

    PubMed Central

    Njelesani, Janet; Dacombe, Russell; Palmer, Tanith; Smith, Helen; Koudou, Benjamin; Bockarie, Moses; Bates, Imelda

    2014-01-01

    Background: The lack of capacity in laboratory systems is a major barrier to achieving the aims of the London Declaration (2012) on neglected tropical diseases (NTDs). To counter this, capacity strengthening initiatives have been carried out in NTD laboratories worldwide. Many of these initiatives focus on individuals' skills or institutional processes and structures, ignoring the crucial interactions between the laboratory and the wider national and international context. Furthermore, rigorous methods to assess these initiatives once they have been implemented are scarce. To address these gaps we developed a set of assessment and monitoring tools that can be used to determine the capacities required and achieved by laboratory systems at the individual, organizational, and national/international levels to support the control of NTDs. Methodology and principal findings: We developed a set of qualitative and quantitative assessment and monitoring tools based on published evidence on optimal laboratory capacity. We implemented the tools with laboratory managers in Ghana, Malawi, Kenya, and Sri Lanka. Using the tools enabled us to identify strengths and gaps in the laboratory systems from the following perspectives: laboratory quality benchmarked against ISO 15189 standards, the potential for the laboratories to provide support to national and regional NTD control programmes, and the laboratory's position within relevant national and international networks and collaborations. Conclusion: We have developed a set of mixed methods assessment and monitoring tools based on evidence derived from the components needed to strengthen the capacity of laboratory systems to control NTDs. Our tools help to systematically assess and monitor individual, organizational, and wider system level capacity of laboratory systems for NTD control and can be applied in different country contexts. PMID:24603407

  8. LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER

    NASA Technical Reports Server (NTRS)

    Will, H.

    1994-01-01

    The complex environment of the typical research laboratory requires flexible process control. This program provides natural language process control from an IBM PC or compatible machine. Process control schedules sometimes require frequent changes, even several times per day; such changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set that allows the user to define the commands to be executed by the computer. To set the system up, the operator writes device driver routines for all of the controlled devices. Once set up, the system requires only an input file containing natural language command lines that tell the system what to do and when to do it. The operator can define custom commands for operating, and taking data from, external research equipment at any time of the day or night without being in attendance. This process control system requires a personal computer running MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. The program was developed in 1989 and has a memory requirement of about 62 Kbytes.
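
    The abstract does not reproduce the command syntax, so the following is only a hedged sketch of how natural-language-style schedule lines might be dispatched to user-written device drivers; the command grammar, driver names, and timing format are invented for illustration.

      import re

      # Hypothetical driver registry; in the original system these would be
      # user-written FORTRAN device-driver subroutines.
      DRIVERS = {
          "set temperature": lambda value: print(f"furnace -> {value} C"),
          "open valve":      lambda value: print(f"valve {value} opened"),
          "log reading":     lambda value: print(f"logged channel {value}"),
      }

      def run_schedule(lines):
          """Execute commands of the assumed form 'AT <minutes> <verb phrase> <value>'."""
          pattern = re.compile(r"AT\s+(\d+)\s+(.+)\s+(\S+)", re.IGNORECASE)
          for line in lines:
              m = pattern.match(line.strip())
              if not m:
                  continue  # skip blank lines and unrecognized input
              minutes, phrase, value = int(m.group(1)), m.group(2).strip().lower(), m.group(3)
              action = DRIVERS.get(phrase)
              if action:
                  print(f"t+{minutes} min: ", end="")
                  action(value)

      run_schedule([
          "AT 0  set temperature 250",
          "AT 15 open valve       3",
          "AT 20 log reading      7",
      ])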

  9. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    PubMed

    Ishibashi, Midori

    2015-01-01

    Cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in evaluating clinical study participants' entry and safety, and drug efficacy. Providing high-quality laboratory tests is therefore mandatory, and adequate quality assurance requires quality control in the three fields of pre-analytical, analytical, and post-analytical processes. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, a portion of the tests is performed in offsite central laboratories after specimen shipping. Individual and inter-individual variations are well-known factors affecting laboratory tests. Beyond these, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference interval are also important factors. It is concluded that, to overcome the problems arising from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  10. Virtual and remote robotic laboratory using EJS, MATLAB and LabVIEW.

    PubMed

    Chaos, Dictino; Chacón, Jesús; Lopez-Orozco, Jose Antonio; Dormido, Sebastián

    2013-02-21

    This paper describes the design and implementation of a virtual and remote laboratory based on Easy Java Simulations (EJS) and LabVIEW. The main application of this laboratory is to improve the study of sensors in mobile robotics, dealing with the problems that arise in real-world experiments. The laboratory allows users to work from home, tele-operating a real robot that takes measurements from its sensors in order to build a map of its environment. In addition, the application allows interaction with a robot simulation (virtual laboratory) or with a real robot (remote laboratory), through the same simple and intuitive graphical user interface in EJS. Thus, students can develop signal processing and control algorithms for the robot in simulation and then deploy them on the real robot for testing. Practical examples of the application of the laboratory in the inter-University Master of Systems Engineering and Automatic Control are presented.

  11. Virtual and Remote Robotic Laboratory Using EJS, MATLAB and LabVIEW

    PubMed Central

    Chaos, Dictino; Chacón, Jesús; Lopez-Orozco, Jose Antonio; Dormido, Sebastián

    2013-01-01

    This paper describes the design and implementation of a virtual and remote laboratory based on Easy Java Simulations (EJS) and LabVIEW. The main application of this laboratory is to improve the study of sensors in mobile robotics, dealing with the problems that arise in real-world experiments. The laboratory allows users to work from home, tele-operating a real robot that takes measurements from its sensors in order to build a map of its environment. In addition, the application allows interaction with a robot simulation (virtual laboratory) or with a real robot (remote laboratory), through the same simple and intuitive graphical user interface in EJS. Thus, students can develop signal processing and control algorithms for the robot in simulation and then deploy them on the real robot for testing. Practical examples of the application of the laboratory in the inter-University Master of Systems Engineering and Automatic Control are presented. PMID:23429578

  12. Trends in International Persuasion: Persuasion in the Arms Control Negotiations.

    ERIC Educational Resources Information Center

    Hopmann, P. Terrence; Walcott, Charles

    An analysis of the bargaining process in international arms control negotiations is possible by developing a framework of interrelated hypotheses, by delineating and practicing an interaction analysis technique called "Bargaining Process Analysis," and by formulating procedural steps that bridge the gap between laboratory studies and "real world" situations. In…

  13. Validating the Equilibrium Stage Model for an Azeotropic System in a Laboratorial Distillation Column

    ERIC Educational Resources Information Center

    Duarte, B. P. M.; Coelho Pinheiro, M. N.; Silva, D. C. M.; Moura, M. J.

    2006-01-01

    The experiment described is an excellent opportunity to apply theoretical concepts of distillation, thermodynamics of mixtures and process simulation at laboratory scale, and simultaneously enhance the ability of students to operate, control and monitor complex units.

  14. Sense and nonsense in the process of accreditation of a pathology laboratory.

    PubMed

    Long-Mira, Elodie; Washetine, Kevin; Hofman, Paul

    2016-01-01

    The aim of accreditation of a pathology laboratory is to control and optimize, in a permanent manner, good professional practice in clinical and molecular pathology, as defined by internationally established standards. Accreditation of a pathology laboratory is ultimately a key element in increasing recognition of the quality of the analyses performed by a laboratory and in improving the care it provides to patients. One of the accreditation standards applied to clinical chemistry and pathology laboratories in the European Union is the ISO 15189 norm. Continued functioning of a pathology laboratory might in time depend on whether or not it has succeeded in the accreditation process. Necessary requirements for accreditation, according to the ISO 15189 norm, include an operational quality management system and continuous control of the methods used for diagnostic purposes. Given these goals, one would expect all pathologists to agree on the positive effects of accreditation. Yet some of the requirements stipulated in the accreditation standards, coming from the bodies that accredit pathology laboratories, and certain normative issues are perceived as arduous and sometimes not adapted to, or even useless in, daily pathology practice. The aim of this review is to elaborate why it is necessary to obtain accreditation, but also why certain requirements for accreditation might be experienced as inappropriate.

  15. Flexible software architecture for user-interface and machine control in laboratory automation.

    PubMed

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.

  16. An intelligent CNC machine control system architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.J.; Loucks, C.S.

    1996-10-01

    Intelligent, agile manufacturing relies on automated programming of digitally controlled processes. Currently, processes such as Computer Numerically Controlled (CNC) machining are difficult to automate because of highly restrictive controllers and poor software environments. It is also difficult to utilize sensors and process models for adaptive control, or to integrate machining processes with other tasks within a factory floor setting. As part of a Laboratory Directed Research and Development (LDRD) program, a CNC machine control system architecture based on object-oriented design and graphical programming has been developed to address some of these problems and to demonstrate automated agile machining applications using platform-independent software.

  17. Proceedings of the 3rd Annual SCOLE Workshop

    NASA Technical Reports Server (NTRS)

    Taylor, Lawrence W., Jr. (Compiler)

    1987-01-01

    Topics addressed include: modeling and controlling the Spacecraft Control Laboratory Experiment (SCOLE) configurations; slewing maneuvers; mathematical models; vibration damping; gravitational effects; structural dynamics; finite element method; distributed parameter system; on-line pulse control; stability augmentation; and stochastic processes.

  18. FORCED AIR VENTILATION FOR REMEDIATION OF UNSATURATED SOILS CONTAMINATED BY VOC

    EPA Science Inventory

    Parameters which were expected to control the removal process of VOCs from contaminated soil during the SVE operation were studied by means of numerical simulations and laboratory experiments in this project. Experimental results of SVE with soil columns in the laboratory indicat...

  19. An Advanced Undergraduate Chemistry Laboratory Experiment Exploring NIR Spectroscopy and Chemometrics

    ERIC Educational Resources Information Center

    Wanke, Randall; Stauffer, Jennifer

    2007-01-01

    An advanced undergraduate chemistry laboratory experiment to study the advantages and hazards of the coupling of NIR spectroscopy and chemometrics is described. The combination is commonly used for analysis and process control of various ingredients used in agriculture, petroleum and food products.

  20. Laboratory automation: trajectory, technology, and tactics.

    PubMed

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a modular approach, from a hardware-driven system to process control, from a one-of-a-kind novelty toward a standardized product, and from an in vitro diagnostics novelty to a marketing tool. Multiple vendors are present in the marketplace, many of whom are in vitro diagnostics manufacturers providing an automation solution coupled with their instruments, whereas others are focused automation companies. Automation technology continues to advance, acceptance continues to climb, and payback and cost justification methods are developing.
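
    The process control software described above (repeat testing, reflex testing, transportation management) is, at its core, a rule engine applied to incoming results. A minimal sketch follows; the analytes, cutoffs, and follow-up orders are invented for illustration and are not part of the paper.

      from dataclasses import dataclass

      @dataclass
      class Result:
          sample_id: str
          analyte: str
          value: float

      # Hypothetical reflex rules: (analyte, trigger predicate, follow-up order).
      REFLEX_RULES = [
          ("TSH",     lambda v: v > 4.5,  "FT4"),    # abnormal TSH reflexes to free T4
          ("GLUCOSE", lambda v: v > 11.0, "HBA1C"),  # mmol/L cutoff, illustrative only
      ]

      def route(result):
          """Return follow-up actions for one result: reflex orders or release to the LIS."""
          actions = [f"order {follow_up} on {result.sample_id}"
                     for analyte, triggers, follow_up in REFLEX_RULES
                     if result.analyte == analyte and triggers(result.value)]
          return actions or [f"release {result.sample_id} to LIS"]

      print(route(Result("S-1001", "TSH", 7.2)))      # reflexes to FT4
      print(route(Result("S-1002", "GLUCOSE", 5.4)))  # released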

  1. In situ and laboratory investigations into contaminant migration

    NASA Astrophysics Data System (ADS)

    Williams, G. M.; Higgo, J. J. W.

    1994-07-01

    Predicting the spread of groundwater pollution demands a detailed understanding of the physical, chemical and microbial processes that control contaminant mobility in aquifers. Many field studies have been carried out around pollutant sources in an attempt to understand these processes, but quantitative results are often difficult to obtain because of the number of assumptions that have to be made about the flow regime or the source term which has given rise to the pollution. Models can be constructed with emphases on different processes to describe the known distribution of contaminants at any one time. However, if these models are to be used for predictive purposes, or to help remediation, it is important to identify and quantify individual processes precisely by independent or direct methods and not to rely on inference alone. Laboratory tests suffer from the fact that aquifer material has to be sampled and transferred to the laboratory. In the process, the sample may be disturbed physically thus altering its porosity, permeability and dispersive properties. It may be oxidised, thereby altering its chemistry, and the numbers, activity and character of any microbial population may change. In situ tracer experiments attempt to overcome the limitations of the laboratory by maintaining natural conditions, but at the same time allowing the injection of solute to be accurately defined and the hydraulic regime to be well controlled and monitored. Examples are given showing how integrated laboratory and field approaches have been used to study: (1) organic degradation in a pollution plume resulting from the disposal of industrial wastes and (2) the role of colloids in transporting radionuclides in an intergranular aquifer.

  2. Compliance of clinical microbiology laboratories in the United States with current recommendations for processing respiratory tract specimens from patients with cystic fibrosis.

    PubMed

    Zhou, Juyan; Garber, Elizabeth; Desai, Manisha; Saiman, Lisa

    2006-04-01

    Respiratory tract specimens from patients with cystic fibrosis (CF) require unique processing by clinical microbiology laboratories to ensure detection of all potential pathogens. The present study sought to determine the compliance of microbiology laboratories in the United States with recently published recommendations for CF respiratory specimens. Microbiology laboratory protocols from 150 of 190 (79%) CF care sites were reviewed. Most described the use of selective media for Burkholderia cepacia complex (99%), Staphylococcus aureus (82%), and Haemophilus influenzae (89%) and identified the species of all gram-negative bacilli (87%). Only 52% delineated the use of agar diffusion assays for susceptibility testing of Pseudomonas aeruginosa. Standardizing laboratory practices will improve treatment, infection control, and our understanding of the changing epidemiology of CF microbiology.

  3. [Automation and organization of technological process of urinalysis].

    PubMed

    Kolenkin, S M; Kishkun, A A; Kol'chenko, O L

    2000-12-01

    The results of introducing into practice a working model of industrial technology for laboratory studies, using KONE Specific Supra and Miditron M analyzers, are shown with clinical urinalysis as an example. This technology helps standardize all stages and operations, improves the efficiency of quality control of laboratory studies, rationally organizes the work at all stages, and creates a system for continuous improvement of the efficiency of investigations at the preanalytical, analytical, and postanalytical stages of the laboratory process. As a result of introducing this technology into laboratory practice, violations of the quality criteria of clinical urinalysis decreased from 15 to 8% at the preanalytical stage and from 6 to 3% at the analytical stage. Automation of the analysis decreased reagent consumption 3-fold and improved productivity at the analytical stage 4-fold.

  4. Quality-assurance plan for the analysis of fluvial sediment by laboratories of the U.S. Geological Survey

    USGS Publications Warehouse

    Matthes, Wilbur J.; Sholar, Clyde J.; George, John R.

    1992-01-01

    This report describes procedures used by the Iowa District sediment laboratory of the U.S. Geological Survey to assure the quality of sediment-laboratory data. These procedures can be used by other U.S. Geological Survey laboratories regardless of size and type of operation for quality assurance and quality control of specific sediment-laboratory processes. Also described are the equipment, specifications, calibration and maintenance, and the protocol for methods used in the analyses of fluvial sediment for concentration or particle size.

  5. PROCESS WATER BUILDING, TRA605. CONTROL PANEL SUPPLIES STATUS INDICATORS. CARD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605. CONTROL PANEL SUPPLIES STATUS INDICATORS. CARD IN LOWER RIGHT WAS INSERTED BY INL PHOTOGRAPHER TO COVER AN OBSOLETE SECURITY RESTRICTION ON ORIGINAL NEGATIVE. INL NEGATIVE NO. 4219. Unknown Photographer, 2/13/1952 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  6. Automating multistep flow synthesis: approach and challenges in integrating chemistry, machines and logic

    PubMed Central

    Shukla, Chinmay A

    2017-01-01

    The implementation of automation in multistep flow synthesis is essential for transforming laboratory-scale chemistry into a reliable industrial process. In this review, we briefly introduce the role of automation based on its applications in synthesis, viz. auto-sampling and inline monitoring, optimization, and process control. Subsequently, we critically review several multistep flow syntheses and suggest possible control strategies that would help reliably transfer a laboratory-scale synthesis to pilot scale at its optimum conditions. Because of the vast literature on multistep synthesis, we have classified the case studies according to a few criteria, viz. type of reaction, heating method, processes involving in-line separation units, telescoped synthesis, processes involving in-line quenching, and processes with the smallest time scale of operation. This classification covers a broad range of the multistep synthesis literature. PMID:28684977

  7. Use of a Business Approach to Improve Disease Surveillance Data Management Systems and Information Technology Process in Florida's Bureau of STD Prevention and Control.

    PubMed

    Shiver, Stacy A; Schmitt, Karla; Cooksey, Adrian

    2009-01-01

    The business of sexually transmitted disease (STD) prevention and control demands technology that is capable of supporting a wide array of program activities, from the processing of laboratory test results to the complex and confidential process involved in contact investigation. The need for a tool that enables public health officials to successfully manage the complex operations encountered in an STD prevention and control program, and the need to operate in an increasingly poor resource environment, led the Florida Bureau of STD to develop the Patient Reporting Investigation Surveillance Manager. Its unique approach, technical architecture, and sociotechnical philosophy have made this business application successful in real-time monitoring of disease burden for local communities, identification of emerging outbreaks, monitoring and assurance of appropriate treatments, improving access to laboratory data, and improving the quality of data for epidemiologic analysis. Additionally, the effort attempted to create and release a product that promoted the Centers for Disease Control and Prevention's ideas for integration of programs and processes.

  8. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal

    PubMed Central

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M. Juliana; Hural, John

    2014-01-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control and sample quality data is uploaded directly to the database by the central laboratory. Four year cumulative data covering 23,477 blood draws reveals an average fresh PBMC yield of 1.45×10^6 ±0.48×10^6 cells per milliliter of usable whole blood. 95% of samples were within the acceptable range for fresh cell yield of 0.8–3.2×10^6 cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day 2 thawed viability of 83.1% and recovery of 67.5%. Since then, four year cumulative data covering 3338 specimens used in immunologic assays shows that 99.88% had acceptable viabilities (>66%) for use in cellular assays (mean, 91.46% ±4.5%), and 96.2% had acceptable recoveries (50%–130%) with a mean recovery of 85.8% ±19.12% of the originally cryopreserved cells. EQC testing revealed that since August 2009, failed recoveries dropped from 4.1% to 1.6% and failed viabilities dropped from 1.0% to 0.3%. The HVTN PBMC quality program provides for laboratory assessment, training and tools for identifying problems, implementing corrective action and monitoring for improvements. These data support the benefits of implementing a comprehensive, web-based PBMC quality program for large clinical trials networks. PMID:24709391
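
    The acceptance criteria quoted above (fresh yield 0.8-3.2×10^6 cells/ml, thawed viability >66%, recovery 50-130%) translate directly into a simple specimen-level check. A minimal sketch follows; it is not the HVTN portal code, only an illustration of the thresholds reported in the abstract.

      def pbmc_qc(yield_per_ml, viability_pct, recovery_pct):
          """Flag a PBMC specimen against the acceptance ranges reported in the abstract."""
          checks = {
              "yield":     0.8e6 <= yield_per_ml <= 3.2e6,
              "viability": viability_pct > 66.0,
              "recovery":  50.0 <= recovery_pct <= 130.0,
          }
          failed = [name for name, ok in checks.items() if not ok]
          return "accept" if not failed else "review: " + ", ".join(failed)

      print(pbmc_qc(1.45e6, 91.5, 85.8))  # typical values from the abstract -> accept
      print(pbmc_qc(0.6e6, 62.0, 140.0))  # -> review: yield, viability, recovery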

  9. Causes of cine image quality deterioration in cardiac catheterization laboratories.

    PubMed

    Levin, D C; Dunham, L R; Stueve, R

    1983-10-01

    Deterioration of cineangiographic image quality can result from malfunctions or technical errors at a number of points along the cine imaging chain: generator and automatic brightness control, x-ray tube, x-ray beam geometry, image intensifier, optics, cine camera, cine film, film processing, and cine projector. Such malfunctions or errors can result in loss of image contrast, loss of spatial resolution, improper control of film optical density (brightness), or some combination thereof. While the electronic and photographic technology involved is complex, physicians who perform cardiac catheterization should be conversant with the problems and what can be done to solve them. Catheterization laboratory personnel have control over a number of factors that directly affect image quality, including radiation dose rate per cine frame, kilovoltage or pulse width (depending on type of automatic brightness control), cine run time, selection of small or large focal spot, proper object-intensifier distance and beam collimation, aperture of the cine camera lens, selection of cine film, processing temperature, processing immersion time, and selection of developer.

  10. Quality Control in Clinical Laboratory Samples

    DTIC Science & Technology

    2015-01-01

    is able to find and correct flaws in the analytical processes of a lab before potentially incorrect patient results are released. According to...verifies that the results produced are accurate and precise. Clinical labs use management of documentation as well as incorporation of a continuous...improvement process to streamline the overall quality control process. QC samples are expected to be identical and tested identically to patient

  11. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    PubMed

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle of quality for an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method that is empirical, inductive and deductive, systematic, data-driven, and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, in which 11 samples were repetitively analysed and Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, representing both a management discipline and a standardised approach to problem solving and process optimisation.

  12. The optimization of total laboratory automation by simulation of a pull-strategy.

    PubMed

    Yang, Taho; Wang, Teng-Kuan; Li, Vincent C; Su, Chia-Lo

    2015-01-01

    Laboratory results are essential for physicians to diagnose medical conditions. Because of the critical role of medical laboratories, an increasing number of hospitals use total laboratory automation (TLA) to improve laboratory performance. Although the benefits of TLA are well documented, systems occasionally become congested, particularly when hospitals face peak demand. This study optimizes TLA operations. Firstly, value stream mapping (VSM) is used to identify the non-value-added time. Subsequently, batch processing control and parallel scheduling rules are devised and a pull mechanism that comprises a constant work-in-process (CONWIP) is proposed. Simulation optimization is then used to optimize the design parameters and to ensure a small inventory and a shorter average cycle time (CT). For empirical illustration, this approach is applied to a real case. The proposed methodology significantly improves the efficiency of laboratory work and leads to a reduction in patient waiting times and increased service level.
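
    A CONWIP pull mechanism of the kind proposed above simply caps the number of specimens allowed on the automation line at any one time, which lowers work-in-process and cycle time without reducing throughput while the analyzer stays busy. A toy single-analyzer sketch follows; the cap, arrival gap, and service time are illustrative, not the paper's parameters.

      import random

      def simulate_conwip(n_jobs, wip_cap, arrival_gap=1.5, mean_service=2.0, seed=42):
          """Toy single-analyzer line: a specimen is released onto the track only after
          the specimen wip_cap positions ahead of it has completed, so at most wip_cap
          specimens are in process. Returns (makespan, mean release-to-completion time)."""
          random.seed(seed)
          release, done = [], []
          for i in range(n_jobs):
              arrive = i * arrival_gap
              gate = done[i - wip_cap] if i >= wip_cap else 0.0  # CONWIP release gate
              r = max(arrive, gate)
              start = max(r, done[-1] if done else 0.0)
              done.append(start + random.expovariate(1.0 / mean_service))
              release.append(r)
          cycle = [d - r for d, r in zip(done, release)]
          return done[-1], sum(cycle) / len(cycle)

      for cap in (50, 5):
          makespan, ct = simulate_conwip(200, cap)
          print(f"WIP cap {cap:2d}: makespan {makespan:6.1f} min, mean cycle time {ct:5.1f} min")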

  13. Implementing a resource management program for accreditation process at the medical laboratory.

    PubMed

    Yenice, Sedef

    2009-03-01

    To plan for and provide adequate resources to meet the mission and goals of a medical laboratory in compliance with the requirements for laboratory accreditation by Joint Commission International. The related policies and procedures were developed based on standard requirements for resource management. Competency assessment provided continuing education and performance feedback to laboratory employees. Laboratory areas were designed for the efficient and safe performance of laboratory work. A physical environment was built up where hazards were controlled and personnel activities were managed to reduce the risk of injuries. An Employees Occupational Safety and Health Program (EOSHP) was developed to address all types of hazardous materials and wastes. Guidelines were defined to verify that the methods would produce accurate and reliable results. An active resource management program will be an effective way of assuring that systems are in control and continuous improvement is in progress.

  14. Validation of a 2 percent lactic acid antimicrobial rinse for mobile poultry slaughter operations.

    PubMed

    Killinger, Karen M; Kannan, Aditi; Bary, Andy I; Cogger, Craig G

    2010-11-01

    Poultry processing antimicrobial interventions are critical for pathogen control, and organic, mobile operations in Washington seek alternatives to chlorine. Laboratory and field studies (three replications each) evaluated lactic acid efficacy as a chlorine alternative. For the laboratory study, retail-purchased, conventionally processed chicken wings inoculated with Salmonella were randomly assigned to the following treatments: Salmonella inoculation followed by no treatment (10 wings) or by 3-min rinses of water, 50 to 100 ppm of chlorine, or 2% lactic acid (20 wings for each rinse treatment). Wings were sampled for Salmonella enumeration on xylose lysine desoxycholate agar. During pastured poultry processing at mobile slaughter units for each field study replication, 20 chicken carcasses were randomly assigned to each treatment: untreated control or 3-min immersion in lactic acid or chlorine. Whole-carcass rinses were examined for aerobic plate count (APC) on tryptic soy agar and coliforms on violet red bile agar. Untreated controls were also examined for Salmonella. In the laboratory study, lactic acid produced a significant (P < 0.01) Salmonella reduction compared with the inoculated no-rinse, water, and chlorine treatments, which were statistically similar to each other. In the field study, no Salmonella was detected on untreated controls. Lactic acid produced significant >2-log (P < 0.01) reductions in APC and coliforms, whereas chlorine resulted in slight, but significant 0.4-log reductions (P < 0.01) and 0.21-log reductions (P < 0.05) in APC and coliforms compared with untreated controls. Considering laboratory and field studies, lactic acid produced greater reductions in Salmonella, APC, and coliforms, validating its effectiveness as a chlorine alternative in mobile poultry slaughter operations.
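
    The >2-log and 0.4-log figures above are log10 reductions relative to the untreated control. A minimal worked example follows; the plate counts are hypothetical and chosen only to reproduce reductions of roughly that size.

      import math

      def log_reduction(cfu_control, cfu_treated):
          """log10 reduction of a treatment relative to the untreated control."""
          return math.log10(cfu_control / cfu_treated)

      # Hypothetical aerobic plate counts (CFU per carcass rinse).
      print(round(log_reduction(5.0e5, 4.2e3), 2))  # lactic acid: ~2.08-log reduction
      print(round(log_reduction(5.0e5, 2.0e5), 2))  # chlorine:    ~0.40-log reduction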

  15. The Impact of Inquiry Based Instruction on Science Process Skills and Self-Efficacy Perceptions of Pre-Service Science Teachers at a University Level Biology Laboratory

    ERIC Educational Resources Information Center

    Sen, Ceylan; Sezen Vekli, Gülsah

    2016-01-01

    The aim of this study is to determine the influence of inquiry-based teaching approach on pre-service science teachers' laboratory self-efficacy perceptions and scientific process skills. The quasi experimental model with pre-test-post-test control group design was used as an experimental design in this research. The sample of this study included…

  16. Apollo 16 photographic standards documentation

    NASA Technical Reports Server (NTRS)

    Bourque, P. F.

    1972-01-01

    The activities of the Photographic Technology Division, and particularly the Photo Science Office, the Precision Processing Laboratory, and the Motion Picture Laboratory, in connection with the scientific photography of the Apollo 16 manned space mission are documented. Described are the preflight activities involved in establishing a standard process for each of the flight films, the manner in which flight films were handled upon arrival at the Manned Spacecraft Center in Houston, Texas, and how the flight films were processed and duplicated. The tone reproduction method of duplication is described. The specific sensitometric and chemical process controls are not included.

  17. Crystallization of Calcium Carbonate in a Large Scale Field Study

    NASA Astrophysics Data System (ADS)

    Ueckert, Martina; Wismeth, Carina; Baumann, Thomas

    2017-04-01

    The long-term efficiency of geothermal facilities and aquifer thermal energy storage in the carbonaceous Malm aquifer in the Bavarian Molasse Basin is seriously affected by precipitation of carbonates, mainly caused by pressure and temperature changes that lead to oversaturation during production. Crystallization starts with polymorphic nuclei of calcium carbonate and is often described as diffusion-reaction controlled: calcite crystallization is favoured by high concentration gradients, while aragonite crystallization occurs at high reaction rates. The factors affecting the crystallization processes have been described for simplified, well-controlled laboratory experiments, but knowledge about the behaviour in more complex natural systems is still limited. The crystallization process of the polymorphic forms of calcium carbonate was investigated during a heat storage test at our test site in the eastern part of the Bavarian Molasse Basin, and complementary laboratory experiments in an autoclave were run. Both the field and laboratory experiments were conducted with carbonaceous tap water; in the laboratory experiments, ultrapure water was additionally used. To avoid precipitation from the tap water, a calculated amount of CO2 was added prior to heating the water from 45 - 110°C (laboratory) resp. 65 - 110°C (field). A total water volume of 0.5 L (laboratory) resp. 1 L (field) was immediately sampled and filtrated through 10 - 0.1

  18. Workbook, Basic Mathematics and Wastewater Processing Calculations.

    ERIC Educational Resources Information Center

    New York State Dept. of Environmental Conservation, Albany.

    This workbook serves as a self-learning guide to basic mathematics and treatment plant calculations and also as a reference and source book for the mathematics of sewage treatment and processing. In addition to basic mathematics, the workbook discusses processing and process control, laboratory calculations and efficiency calculations necessary in…

  19. Fracture induced electromagnetic emissions: extending laboratory findings by observations at the geophysical scale

    NASA Astrophysics Data System (ADS)

    Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Constantinos

    2014-05-01

    Under natural conditions, it is practically impossible to install an experimental network at the geophysical scale using the same instrumentation as in laboratory experiments for understanding, through the states of stress and strain and their time variation, the laws that govern friction during the last stages of earthquake generation, or to monitor (much less to control) the principal characteristics of a fracture process. Fracture-induced electromagnetic emissions (EME) in a wide range of frequency bands are sensitive to micro-structural changes; their study therefore constitutes a nondestructive method for monitoring the evolution of the damage process at the laboratory scale. It has been suggested that fracture-induced MHz-kHz electromagnetic (EM) emissions, which emerge from a few days up to a few hours before the main seismic shock, permit real-time monitoring of the damage process during the last stages of earthquake preparation, as happens at the laboratory scale. Since EME are produced both during laboratory-scale fracture and during the earthquake preparation process (geophysical-scale fracture), they should present similar characteristics at the two scales. Therefore, both laboratory experimenters and the experimental scientists studying pre-earthquake EME could benefit from each other's results. Importantly, when the fracture process is studied by means of laboratory experiments, the fault growth process normally occurs violently in a fraction of a second. However, a major difference between the laboratory and natural processes is the order-of-magnitude difference in scale (in space and time), which allows experimental observation at the geophysical scale of a range of physical processes that are not observable at the laboratory scale. The study of fracture-induced EME is therefore expected to reveal more information, especially for the last stages of the fracture process, when it is conducted at the geophysical scale. As a characteristic example, we discuss the case of electromagnetic silence before the global rupture, which was first observed in preseismic EME and recently was also observed in EME measured during laboratory fracture experiments, completely revising the earlier views about fracture-induced electromagnetic emissions.

  20. A whole process quality control system for energy measuring instruments inspection based on IOT technology

    NASA Astrophysics Data System (ADS)

    Yin, Bo; Liu, Li; Wang, Jiahan; Li, Xiran; Liu, Zhenbo; Li, Dewei; Wang, Jun; Liu, Lu; Wu, Jun; Xu, Tingting; Cui, He

    2017-10-01

    Electric energy measurement is fundamental work: accurate measurements are vital to the economic interests of both parties to a power supply contract, and the standardized management of measurement laboratories at all levels directly affects the fairness of measurement. Currently, metering laboratories generally rely on one-dimensional bar codes for identification, advance the testing process through manual management, and require manual data entry to generate most test reports. This process carries many problems and potential risks: data cannot be saved completely, the status of an inspection cannot be traced, the inspection process is not fully controllable, and so on. To meet a provincial metrology center's requirements for whole-process management of electric energy meter performance testing, we used large-capacity RF tags as the process-management information medium and developed a general measurement experiment management system. We formulated a standardized full performance test process, improved raw data recording during experiments, developed an automatic warehouse inventory device, and established a strict system for test sample transfer and storage. As a result, all raw inspection data can be traced, full life-cycle control of each sample is achieved, and the quality control level and effectiveness of inspection work are significantly improved.
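
    The record gives no implementation details, but its core idea is whole-process traceability keyed on an RFID tag. A minimal sketch of such a life-cycle record is shown below; the stage names, tag value, and operators are hypothetical assumptions, not taken from the described system.

```python
# Hypothetical sketch of whole-process sample traceability keyed on an RFID tag.
# Stage names, the tag value, and operators are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone

STAGES = ["received", "stored", "under_test", "tested", "reported", "returned"]

@dataclass
class SampleRecord:
    rfid_tag: str
    instrument_type: str
    history: list = field(default_factory=list)  # (timestamp, stage, operator, note)

    def advance(self, stage: str, operator: str, note: str = "") -> None:
        """Record a life-cycle transition; reject stages outside the defined flow."""
        if stage not in STAGES:
            raise ValueError(f"unknown stage: {stage}")
        self.history.append((datetime.now(timezone.utc), stage, operator, note))

    def trace(self) -> str:
        """Return a human-readable audit trail for the sample."""
        return "\n".join(f"{t.isoformat()}  {s:<10} {op}  {n}" for t, s, op, n in self.history)

if __name__ == "__main__":
    meter = SampleRecord(rfid_tag="E2801160600002", instrument_type="energy meter")
    meter.advance("received", operator="lab-intake")
    meter.advance("under_test", operator="bench-3", note="full performance test")
    meter.advance("reported", operator="reviewer-1")
    print(meter.trace())
```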

  1. Laboratory investigations of earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Xia, Kaiwen

    In this thesis, the dynamics of earthquakes is investigated through controlled laboratory experiments that are designed to mimic natural earthquake scenarios. The earthquake dynamic rupturing process itself is a complicated phenomenon, involving dynamic friction, wave propagation, and heat production. Because controlled experiments can produce results without the assumptions needed in theoretical and numerical analysis, the experimental method is advantageous over theoretical and numerical methods. Our laboratory fault is composed of carefully cut photoelastic polymer plates (Homalite-100, polycarbonate) held together by uniaxial compression. As a unique element of the experimental design, a controlled exploding-wire technique provides the triggering mechanism of laboratory earthquakes. Three important components of real earthquakes (i.e., pre-existing fault, tectonic loading, and triggering mechanism) correspond to and are simulated by frictional contact, uniaxial compression, and the exploding-wire technique. Dynamic rupturing processes are visualized using the photoelastic method and are recorded via a high-speed camera. Our experimental methodology, which is full-field, in situ, and non-intrusive, has better control and diagnostic capacity compared to other existing experimental methods. Using this experimental approach, we have investigated several problems: the dynamics of earthquake faulting occurring along homogeneous faults separating identical materials, earthquake faulting along inhomogeneous faults separating materials with different wave speeds, and earthquake faulting along faults with a finite low-wave-speed fault core. We have observed supershear ruptures, sub-Rayleigh to supershear rupture transition, crack-like to pulse-like rupture transition, self-healing (Heaton) pulses, and rupture directionality.

  2. Benefits of an automated GLP final report preparation software solution.

    PubMed

    Elvebak, Larry E

    2011-07-01

    The final product of analytical laboratories performing US FDA-regulated (or GLP) method validation and bioanalysis studies is the final report. Although there are commercial-off-the-shelf (COTS) software/instrument systems available to laboratory managers to automate and manage almost every aspect of the instrumental and sample-handling processes of GLP studies, there are few software systems available to fully manage the GLP final report preparation process. This lack of appropriate COTS tools results in the implementation of rather Byzantine and manual processes to cobble together all the information needed to generate a GLP final report. The manual nature of these processes results in the need for several iterative quality control and quality assurance events to ensure data accuracy and report formatting. The industry is in need of a COTS solution that gives laboratory managers and study directors the ability to manage as many portions as possible of the GLP final report writing process and the ability to generate a GLP final report with the click of a button. This article describes the COTS software features needed to give laboratory managers and study directors such a solution.

  3. The Role of Laboratory-Based Studies of the Physical and Biological Properties of Sea Ice in Supporting the Observation and Modeling of Ice Covered Seas

    NASA Astrophysics Data System (ADS)

    Light, B.; Krembs, C.

    2003-12-01

    Laboratory-based studies of the physical and biological properties of sea ice are an essential link between high latitude field observations and existing numerical models. Such studies promote improved understanding of climatic variability and its impact on sea ice and the structure of ice-dependent marine ecosystems. Controlled laboratory experiments can help identify feedback mechanisms between physical and biological processes and their response to climate fluctuations. Climatically sensitive processes occurring between sea ice and the atmosphere and sea ice and the ocean determine surface radiative energy fluxes and the transfer of nutrients and mass across these boundaries. High temporally and spatially resolved analyses of sea ice under controlled environmental conditions lend insight to the physics that drive these transfer processes. Techniques such as optical probing, thin section photography, and microscopy can be used to conduct experiments on natural sea ice core samples and laboratory-grown ice. Such experiments yield insight on small scale processes from the microscopic to the meter scale and can be powerful interdisciplinary tools for education and model parameterization development. Examples of laboratory investigations by the authors include observation of the response of sea ice microstructure to changes in temperature, assessment of the relationships between ice structure and the partitioning of solar radiation by first-year sea ice covers, observation of pore evolution and interfacial structure, and quantification of the production and impact of microbial metabolic products on the mechanical, optical, and textural characteristics of sea ice.

  4. Probing the Reproducibility of Leaf Growth and Molecular Phenotypes: A Comparison of Three Arabidopsis Accessions Cultivated in Ten Laboratories

    PubMed Central

    Massonnet, Catherine; Vile, Denis; Fabre, Juliette; Hannah, Matthew A.; Caldana, Camila; Lisec, Jan; Beemster, Gerrit T.S.; Meyer, Rhonda C.; Messerli, Gaëlle; Gronlund, Jesper T.; Perkovic, Josip; Wigmore, Emma; May, Sean; Bevan, Michael W.; Meyer, Christian; Rubio-Díaz, Silvia; Weigel, Detlef; Micol, José Luis; Buchanan-Wollaston, Vicky; Fiorani, Fabio; Walsh, Sean; Rinn, Bernd; Gruissem, Wilhelm; Hilson, Pierre; Hennig, Lars; Willmitzer, Lothar; Granier, Christine

    2010-01-01

    A major goal of the life sciences is to understand how molecular processes control phenotypes. Because understanding biological systems relies on the work of multiple laboratories, biologists implicitly assume that organisms with the same genotype will display similar phenotypes when grown in comparable conditions. We investigated to what extent this holds true for leaf growth variables and metabolite and transcriptome profiles of three Arabidopsis (Arabidopsis thaliana) genotypes grown in 10 laboratories using a standardized and detailed protocol. A core group of four laboratories generated similar leaf growth phenotypes, demonstrating that standardization is possible. But some laboratories presented significant differences in some leaf growth variables, sometimes changing the genotype ranking. Metabolite profiles derived from the same leaf displayed a strong genotype × environment (laboratory) component. Genotypes could be separated on the basis of their metabolic signature, but only when the analysis was limited to samples derived from one laboratory. Transcriptome data revealed considerable plant-to-plant variation, but the standardization ensured that interlaboratory variation was not considerably larger than intralaboratory variation. The different impacts of the standardization on phenotypes and molecular profiles could result from differences of temporal scale between processes involved at these organizational levels. Our findings underscore the challenge of describing, monitoring, and precisely controlling environmental conditions but also demonstrate that dedicated efforts can result in reproducible data across multiple laboratories. Finally, our comparative analysis revealed that small variations in growing conditions (light quality principally) and handling of plants can account for significant differences in phenotypes and molecular profiles obtained in independent laboratories. PMID:20200072

  5. Controlling Laboratory Processes From A Personal Computer

    NASA Technical Reports Server (NTRS)

    Will, H.; Mackin, M. A.

    1991-01-01

    Computer program provides natural-language process control from IBM PC or compatible computer. Sets up process-control system that either runs without operator or is run by workers who have limited programming skills. Includes three smaller programs. Two of them, written in FORTRAN 77, record data and control research processes. Third program, written in Pascal, generates FORTRAN subroutines used by other two programs to identify user commands with device-driving routines written by user. Also includes set of input data allowing user to define user commands to be executed by computer. Requires personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. Also requires FORTRAN 77 compiler and device drivers written by user.
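
    The record describes the general mechanism (user-defined commands bound to user-written device drivers) without showing code. A rough Python sketch of that idea follows; the command words and driver functions are hypothetical, and the original system generated FORTRAN subroutines rather than using a runtime dispatch table.

```python
# Minimal sketch of natural-language-style command dispatch to device drivers.
# Command words and driver functions are hypothetical; the system described in the
# record generated FORTRAN subroutines for this mapping instead of a runtime table.

def open_valve(args):
    print(f"driver: opening valve {args[0] if args else '?'}")

def set_heater(args):
    print(f"driver: heater setpoint {args[0]} degC")

COMMANDS = {
    "OPEN": open_valve,   # e.g. "OPEN V1"
    "HEAT": set_heater,   # e.g. "HEAT 150"
}

def execute(line: str) -> None:
    """Parse one user command line and call the matching device driver."""
    word, *args = line.split()
    handler = COMMANDS.get(word.upper())
    if handler is None:
        print(f"unknown command: {word}")
    else:
        handler(args)

execute("OPEN V1")
execute("heat 150")
```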

  6. Guidelines for the processing and quality assurance of benthic invertebrate samples collected as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.

    1993-01-01

    Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. The intent of sample processing is to determine the quantity of each taxon present in the semi-quantitative samples or to list the taxa present in qualitative samples. The processing guidelines provide standardized laboratory forms, sample labels, detailed sample processing flow charts, standardized format for electronic data, quality-assurance procedures and checks, sample tracking standards, and target levels for taxonomic determinations. The contract laboratory (1) is responsible for identifications and quantifications, (2) constructs reference collections, (3) provides data in hard copy and electronic forms, (4) follows specified quality-assurance and quality-control procedures, and (5) returns all processed and unprocessed portions of the samples. The U.S. Geological Survey's Quality Management Group maintains a Biological Quality-Assurance Unit, located at the National Water-Quality Laboratory, Arvada, Colorado, to oversee the use of contract laboratories and ensure the quality of data obtained from these laboratories according to the guidelines established in this document. This unit establishes contract specifications, reviews contractor performance (timeliness, accuracy, and consistency), enters data into the National Water Information System-II data base, maintains in-house reference collections, deposits voucher specimens in outside museums, and interacts with taxonomic experts within and outside the U.S. Geological Survey. 
This unit also modifies the existing sample processing and quality-assurance guidelines, establishes criteria and testing procedures for qualifying potential contract laboratories, identifies qualified taxonomic experts, and establishes voucher collections.

  7. Slew maneuvers on the SCOLE Laboratory Facility

    NASA Technical Reports Server (NTRS)

    Williams, Jeffrey P.

    1987-01-01

    The Spacecraft Control Laboratory Experiment (SCOLE) was conceived to provide a physical test bed for the investigation of control techniques for large flexible spacecraft. The control problems studied are slewing maneuvers and pointing operations. The slew is defined as a minimum-time maneuver that brings the antenna line-of-sight (LOS) pointing to within an error limit of the pointing target. The second objective is to rotate about the LOS while staying within the 0.02 degree error limit. The SCOLE problem is defined as two design challenges: designing control laws for a mathematical model of a large antenna attached to the Space Shuttle by a long flexible mast, and implementing a control scheme based on those laws on a laboratory representation of the structure. Control sensors and actuators are typical of those which the control designer would have to deal with on an actual spacecraft. Computational facilities consist of microcomputer-based central processing units with appropriate analog interfaces for implementation of the primary control system and the attitude estimation algorithm. Preliminary results of some slewing control experiments are given.

  8. Translational Behavior Analysis: From Laboratory Science in Stimulus Control to Intervention with Persons with Neurodevelopmental Disabilities

    ERIC Educational Resources Information Center

    McIlvane, William J.

    2009-01-01

    Throughout its history, laboratory research in the experimental analysis of behavior has been successful in elucidating and clarifying basic learning principles and processes in both humans and nonhumans. In parallel, applied behavior analysis has shown how fundamental behavior-analytic principles and procedures can be employed to promote…

  9. Virtual Control Systems Environment (VCSE)

    ScienceCinema

    Atkins, Will

    2018-02-14

    Will Atkins, a Sandia National Laboratories computer engineer, discusses cybersecurity research work for process control systems. Will explains his work on the Virtual Control Systems Environment project to develop a modeling and simulation framework of the U.S. electric grid in order to study and mitigate possible cyberattacks on infrastructure.

  10. Quality of Liver and Kidney Function Tests among Public Medical Laboratories in Western Region of Amhara National Regional State of Ethiopia.

    PubMed

    Teka, Abaynesh; Kibatu, Girma

    2012-03-01

    Medical laboratories play essential roles in measuring substances in body fluids for diagnosis, treatment, prevention, and a greater understanding of disease processes. The data they generate therefore have to be reliable, which requires strict quality control, management, and assurance. The aim of this study was to assess the accuracy and precision of clinical chemistry laboratories in the western region of the Amhara national regional state of Ethiopia in testing liver and kidney function. Eight hospital laboratories and a Regional Health Research Laboratory Center participated in this study from February to March 2011. Each participant was requested to measure six specimens for six chemistry tests from two control samples. If all measurements could be made, 324 test results were expected from the participating laboratories for collection and statistical evaluation. None of the participating laboratories could deliver all six tests for estimating both liver and renal function simultaneously during the study period. Only 213 of the expected 324 values were reported, and about 65% of those 213 values fell outside the allowable limits of error for the chemistry tests of the control specimens used. These findings show a lack of accuracy and precision in the chemistry measurements. A regular survey of medical laboratories should be conducted to question the accuracy and precision of their analyses in order to sustain improvements in the quality of services provided to patients. Laboratory quality management systems underscore the need for regular quality control and quality assessment schemes in medical laboratories.
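
    The accuracy screen described above amounts to checking each reported result against an allowable limit of error around the control sample's target value. A small illustrative sketch, with invented target, limit, and results:

```python
# Sketch of the kind of accuracy screen described above: flag each reported result
# as acceptable if it falls within the allowable limit of error around the control
# sample's target value. Target, limit, and results are invented for illustration.

def within_allowable_error(result: float, target: float, allowable_pct: float) -> bool:
    return abs(result - target) <= target * allowable_pct / 100.0

target_alt = 30.0          # hypothetical ALT target, U/L
allowable = 20.0           # hypothetical allowable total error, percent
reported = [28.5, 36.9, 31.2, 44.0, 29.8]

ok = [within_allowable_error(r, target_alt, allowable) for r in reported]
print(f"{sum(ok)}/{len(ok)} results within allowable limits")
```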

  11. [Development of pseudoviral competitive internal controls for RT-PCR detection of dengue virus].

    PubMed

    Hang, Xiao-Tong; Li, Jian-Dong; Zhang, Quan-Fu; Li, Chuan; Zhang, Shuo; Liang, Mi-Fang; Li, De-Xin

    2010-02-01

    The objective was to develop pseudoviral competitive internal controls for RT-PCR laboratory detection of dengue virus. The internal control target gene was obtained by inserting a 180 bp unrelated DNA fragment into the RT-PCR detection target of dengue virus between the forward and reverse PCR primer binding regions. A yellow fluorescent protein reporter gene was introduced downstream of the internal control target gene via an internal ribosome entry site. HEK 293T cells were transfected with a plasmid containing this whole cassette and a lentiviral packaging support plasmid. Pseudoviral particles were recovered from the supernatant and analyzed quantitatively and qualitatively in simulated samples in the same tube under different experimental conditions. The established pseudoviral competitive internal controls can be used in RT-PCR detection of different dengue virus serotypes, allowing the whole detection process to be monitored. The amplified fragment is easy to differentiate by agarose gel electrophoresis. The pseudoviral competitive internal controls could be used for quality control of the laboratory diagnosis process; they are simple to prepare, stable in storage, and easy to adapt as internal controls for other RNA viruses.

  12. Summary Report for the Evaluation of Current QA Processes Within the FRMAC FAL and EPA MERL.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shanks, Sonoya T.; Redding, Ted; Jaussi, Lynn

    The Federal Radiological Monitoring and Assessment Center (FRMAC) relies on accurate and defensible analytical laboratory data to support its mission. Therefore, FRMAC must ensure that the environmental analytical laboratories providing analytical services maintain an ongoing capability to provide accurate analytical results to DOE. It is undeniable that the more Quality Assurance (QA) and Quality Control (QC) measures required of the laboratory, the fewer resources are available for analysis of response samples. Because QA and QC measures generally comprise a major effort in a laboratory's operations, requirements should only be considered if they are deemed "value-added" for the FRMAC mission. This report provides observations of areas for improvement and potential interoperability opportunities in the areas of Batch Quality Control Requirements, Written Communications, Data Review Processes, and Data Reporting Processes, along with lessons learned as they apply to items in the early phase of a response that will be critical for developing a more efficient, integrated response for future interactions between FRMAC and EPA assets.

  13. Wastewater treatment of chemical laboratory using electro assisted-phytoremediation (EAPR)

    NASA Astrophysics Data System (ADS)

    Putra, Rudy Syah; Trahadinata, Gilang Ahmad; Latif, Arif; Solehudin, Mochamad

    2017-03-01

    The EAPR process using water hyacinth (Eichhornia crassipes) for the treatment of chemical laboratory wastewater was evaluated. The purpose of the EAPR process was to decrease the BOD, COD, and heavy metal concentrations in the wastewater. The effectiveness of the treatment was evaluated using the COD, BOD, and heavy metal (Pb, Cu) concentrations. The results showed that the EAPR process decreased the COD, BOD, Pb, and Cu within 4 h of treatment. These concentrations met the class IV water quality standard according to government regulation No. 82/2001 on water quality management and water pollution control of the Republic of Indonesia.

  14. Updated standards and processes for accreditation of echocardiographic laboratories from The European Association of Cardiovascular Imaging.

    PubMed

    Popescu, Bogdan A; Stefanidis, Alexandros; Nihoyannopoulos, Petros; Fox, Kevin F; Ray, Simon; Cardim, Nuno; Rigo, Fausto; Badano, Luigi P; Fraser, Alan G; Pinto, Fausto; Zamorano, Jose Luis; Habib, Gilbert; Maurer, Gerald; Lancellotti, Patrizio; Andrade, Maria Joao; Donal, Erwan; Edvardsen, Thor; Varga, Albert

    2014-07-01

    Standards for echocardiographic laboratories were proposed by the European Association of Echocardiography (now the European Association of Cardiovascular Imaging) 7 years ago in order to raise standards of practice and improve the quality of care. Criteria and requirements were published at that time for transthoracic, transoesophageal, and stress echocardiography. This paper reassesses and updates the quality standards to take account of experience and the technical developments of modern echocardiographic practice. It also discusses quality control, the incentives for laboratories to apply for accreditation, the reaccreditation criteria, and the current status and future prospects of the laboratory accreditation process. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2014. For permissions please email: journals.permissions@oup.com.

  15. Interfacing laboratory instruments to multiuser, virtual memory computers

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Stang, David B.; Roth, Don J.

    1989-01-01

    Incentives, problems and solutions associated with interfacing laboratory equipment with multiuser, virtual memory computers are presented. The major difficulty concerns how to utilize these computers effectively in a medium sized research group. This entails optimization of hardware interconnections and software to facilitate multiple instrument control, data acquisition and processing. The architecture of the system that was devised, and associated programming and subroutines are described. An example program involving computer controlled hardware for ultrasonic scan imaging is provided to illustrate the operational features.

  16. An Ontology for Identifying Cyber Intrusion Induced Faults in Process Control Systems

    NASA Astrophysics Data System (ADS)

    Hieb, Jeffrey; Graham, James; Guan, Jian

    This paper presents an ontological framework that permits formal representations of process control systems, including elements of the process being controlled and the control system itself. A fault diagnosis algorithm based on the ontological model is also presented. The algorithm can identify traditional process elements as well as control system elements (e.g., IP network and SCADA protocol) as fault sources. When these elements are identified as a likely fault source, the possibility exists that the process fault is induced by a cyber intrusion. A laboratory-scale distillation column is used to illustrate the model and the algorithm. Coupled with a well-defined statistical process model, this fault diagnosis approach provides cyber security enhanced fault diagnosis information to plant operators and can help identify that a cyber attack is underway before a major process failure is experienced.
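
    A rough sketch of the central idea, representing both process elements and control-system elements as candidate fault sources and flagging a possible cyber intrusion when control-system elements are implicated, is given below; the class names and example diagnosis are illustrative assumptions, not the authors' ontology.

```python
# Rough sketch of the idea in the record: represent both process elements and
# control-system elements as candidate fault sources, and flag a possible cyber
# intrusion when the diagnosis implicates control-system elements. Class names
# and the example diagnosis are illustrative assumptions, not the authors' ontology.

class FaultSource:
    def __init__(self, name: str):
        self.name = name

class ProcessElement(FaultSource):        # e.g. reboiler, feed pump
    pass

class ControlSystemElement(FaultSource):  # e.g. IP network, SCADA protocol stack
    pass

def possible_cyber_induced(ranked_sources: list[FaultSource]) -> bool:
    """True if any of the top-ranked fault sources is a control-system element."""
    return any(isinstance(s, ControlSystemElement) for s in ranked_sources)

diagnosis = [ControlSystemElement("SCADA protocol"), ProcessElement("reflux valve")]
print(possible_cyber_induced(diagnosis))  # True -> alert operators to a possible intrusion
```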

  17. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal.

    PubMed

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M Juliana; Hural, John

    2014-07-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure that viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control and sample quality data is uploaded directly to the database by the central laboratory. Four year cumulative data covering 23,477 blood draws reveals an average fresh PBMC yield of 1.45×10(6)±0.48 cells per milliliter of useable whole blood. 95% of samples were within the acceptable range for fresh cell yield of 0.8-3.2×10(6) cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day 2 thawed viability of 83.1% and a recovery of 67.5%. Since then, four year cumulative data covering 3338 specimens used in immunologic assays shows that 99.88% had acceptable viabilities (>66%) for use in cellular assays (mean, 91.46% ±4.5%), and 96.2% had acceptable recoveries (50%-130%) with a mean of recovery of 85.8% ±19.12% of the originally cryopreserved cells. EQC testing revealed that since August 2009, failed recoveries dropped from 4.1% to 1.6% and failed viabilities dropped from 1.0% to 0.3%. The HVTN PBMC quality program provides for laboratory assessment, training and tools for identifying problems, implementing corrective action and monitoring for improvements. These data support the benefits of implementing a comprehensive, web-based PBMC quality program for large clinical trials networks. Copyright © 2014 Elsevier B.V. All rights reserved.
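
    The acceptance criteria quoted in the record translate directly into simple checks; a sketch using those published thresholds follows (the individual sample values are invented).

```python
# Sketch applying the acceptance criteria quoted in the record: fresh yield of
# 0.8-3.2 x 10^6 PBMC per ml of usable blood, post-thaw viability > 66%, and
# post-thaw recovery of 50-130% of the cryopreserved cells. The sample values
# passed in below are invented for illustration.

def fresh_yield_ok(cells_per_ml: float) -> bool:
    return 0.8e6 <= cells_per_ml <= 3.2e6

def thaw_ok(viability_pct: float, recovery_pct: float) -> bool:
    return viability_pct > 66.0 and 50.0 <= recovery_pct <= 130.0

print(fresh_yield_ok(1.45e6))      # True  (the network's average yield)
print(thaw_ok(91.5, 85.8))         # True  (close to the reported means)
print(thaw_ok(60.0, 140.0))        # False (would be flagged by EQC)
```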

  18. Generic System for Remote Testing and Calibration of Measuring Instruments: Security Architecture

    NASA Astrophysics Data System (ADS)

    Jurčević, M.; Hegeduš, H.; Golub, M.

    2010-01-01

    Testing and calibration of laboratory instruments and reference standards is a routine activity and a resource- and time-consuming process. Since many modern instruments include communication interfaces, it is possible to create a remote calibration system. This approach addresses a wide range of possible applications and makes it possible to drive a number of different devices. On the other hand, the remote calibration process involves a number of security issues with respect to the recommendations specified in the ISO/IEC 17025 standard, since it is not under the total control of the calibration laboratory personnel who will sign the calibration certificate. This implies that the traceability and integrity of the calibration process depend directly on the collected measurement data. Reliable and secure remote control and monitoring of instruments is therefore a crucial aspect of an internet-enabled calibration procedure.
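
    The paper's security architecture is not reproduced here, but one conventional way to protect the integrity of remotely collected measurement data is to attach a keyed message authentication code so that tampering in transit can be detected. A generic sketch under that assumption (the key and record fields are hypothetical):

```python
# One conventional way to protect the integrity of remotely collected measurement
# data: attach a keyed MAC so the calibration laboratory can detect tampering in
# transit. This is a generic sketch, not the architecture described in the paper;
# the shared key and record fields are hypothetical.
import hashlib
import hmac
import json

SHARED_KEY = b"calibration-lab-demo-key"  # hypothetical; would be provisioned securely

def sign_record(record: dict) -> str:
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

def verify_record(record: dict, tag: str) -> bool:
    return hmac.compare_digest(sign_record(record), tag)

measurement = {"instrument": "DMM-1234", "quantity": "DC voltage", "reading": 9.99987, "unit": "V"}
tag = sign_record(measurement)
print(verify_record(measurement, tag))    # True
measurement["reading"] = 10.00020         # simulated tampering
print(verify_record(measurement, tag))    # False
```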

  19. Computer Instructional Aids for Undergraduate Control Education.

    ERIC Educational Resources Information Center

    Volz, Richard A.; And Others

    Engineering is coming to rely more and more heavily upon the computer for computations, analyses, and graphic displays which aid the design process. A general purpose simulation system, the Time-shared Automatic Control Laboratory (TACL), and a set of computer-aided design programs, Control Oriented Interactive Graphic Analysis and Design…

  20. Development and expansion of high-quality control region databases to improve forensic mtDNA evidence interpretation.

    PubMed

    Irwin, Jodi A; Saunier, Jessica L; Strouss, Katharine M; Sturk, Kimberly A; Diegoli, Toni M; Just, Rebecca S; Coble, Michael D; Parson, Walther; Parsons, Thomas J

    2007-06-01

    In an effort to increase the quantity, breadth and availability of mtDNA databases suitable for forensic comparisons, we have developed a high-throughput process to generate approximately 5000 control region sequences per year from regional US populations, global populations from which the current US population is derived and global populations currently under-represented in available forensic databases. The system utilizes robotic instrumentation for all laboratory steps from pre-extraction through sequence detection, and a rigorous eight-step, multi-laboratory data review process with entirely electronic data transfer. Over the past 3 years, nearly 10,000 control region sequences have been generated using this approach. These data are being made publicly available and should further address the need for consistent, high-quality mtDNA databases for forensic testing.

  1. Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.

    PubMed

    Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun

    2018-01-01

    Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) is used to identify and solve problems. The laboratory turnaround time for individual tests, total delay time in the sample reception area, and percentage of steps involving risks of medical errors and biological hazards in the overall process are measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.
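
    As a small illustration of the "measure" and "control" phases, the before/after figures quoted above can be turned into improvement percentages, and a residual defect proportion can be converted to a conventional short-term sigma level; treating the remaining 3% of risky steps as a defect rate is purely for illustration.

```python
# Sketch of the kind of before/after metrics tracked in the "measure" and "control"
# phases of DMAIC, using the figures quoted in the record plus the conventional
# short-term sigma-level conversion (with the customary 1.5-sigma shift).
from statistics import NormalDist

def pct_improvement(before: float, after: float) -> float:
    return 100.0 * (before - after) / before

def sigma_level(defect_rate: float) -> float:
    """Convert a long-term defect proportion to a short-term sigma level."""
    return NormalDist().inv_cdf(1.0 - defect_rate) + 1.5

print(round(pct_improvement(68, 59), 1))   # stat turnaround time improvement, %
print(round(pct_improvement(30, 3), 1))    # reduction in error/hazard-prone steps, %
print(round(sigma_level(0.03), 2))         # sigma level if 3% of steps remain risky
```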

  2. Integration of the Uncertainties of Anion and TOC Measurements into the Flammability Control Strategy for Sludge Batch 8 at the DWPF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, T. B.

    2013-03-14

    The Savannah River National Laboratory (SRNL) has been working with the Savannah River Remediation (SRR) Defense Waste Processing Facility (DWPF) in the development and implementation of a flammability control strategy for DWPF’s melter operation during the processing of Sludge Batch 8 (SB8). SRNL’s support has been in response to technical task requests that have been made by SRR’s Waste Solidification Engineering (WSE) organization. The flammability control strategy relies on measurements that are performed on Slurry Mix Evaporator (SME) samples by the DWPF Laboratory. Measurements of nitrate, oxalate, formate, and total organic carbon (TOC) standards generated by the DWPF Laboratory are presented in this report, and an evaluation of the uncertainties of these measurements is provided. The impact of the uncertainties of these measurements on DWPF’s strategy for controlling melter flammability also is evaluated. The strategy includes monitoring each SME batch for its nitrate content and its TOC content relative to the nitrate content and relative to the antifoam additions made during the preparation of the SME batch. A linearized approach for monitoring the relationship between TOC and nitrate is developed, equations are provided that integrate the measurement uncertainties into the flammability control strategy, and sample calculations for these equations are shown to illustrate the impact of the uncertainties on the flammability control strategy.
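
    The report's actual equations are not reproduced here; the sketch below only illustrates the general shape of a linearized screen in which measured TOC, widened by its uncertainty, is compared against a limit that is a linear function of measured nitrate. The coefficients and uncertainty fractions are hypothetical placeholders, not DWPF values.

```python
# Generic sketch of a linearized flammability screen of the kind described: compare
# the measured TOC against a limit that is a linear function of the measured nitrate,
# after widening both by one-sided measurement uncertainties so the comparison is
# conservative. The coefficients a, b and the uncertainty fractions are hypothetical
# placeholders, not the DWPF equations or acceptance values.

def toc_acceptable(toc_meas: float, nitrate_meas: float,
                   u_toc: float, u_nitrate: float,
                   a: float, b: float) -> bool:
    toc_upper = toc_meas * (1.0 + u_toc)               # worst-case high TOC
    nitrate_lower = nitrate_meas * (1.0 - u_nitrate)   # worst-case low nitrate
    return toc_upper <= a + b * nitrate_lower          # linearized limit: TOC <= a + b*NO3

# Example with placeholder concentrations (mg/kg) and 10% / 5% relative uncertainties.
print(toc_acceptable(toc_meas=12000, nitrate_meas=20000,
                     u_toc=0.10, u_nitrate=0.05, a=1000, b=0.7))
```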

  3. Project development laboratories energy fuels and oils based on NRU “MPEI”

    NASA Astrophysics Data System (ADS)

    Burakov, I. A.; Burakov, A. Y.; Nikitina, I. S.; Khomenkov, A. M.; Paramonova, A. O.; Khtoo Naing, Aung

    2017-11-01

    The use of high-quality fuels and lubricants is a central topic in improving the efficiency of power plants. During transportation, preparation for use, storage, and maintenance, the properties of fuels and lubricants may deteriorate, which reduces power plant efficiency. One way to prevent this deterioration is timely analysis by the relevant laboratories. At present, the provision of energy-fuel and energy-oil laboratories at thermal power stations is satisfactory. However, the training of qualified personnel to work in these laboratories is a serious problem, as is the inability of these laboratories to perform the complete list of required tests. A solution to this problem is to explore the application of fuel and lubricant property analysis methods at the stage of training and retraining qualified personnel. To this end, laboratory projects for solid, liquid, and gaseous fuels and for power-plant oils and lubricants have been developed on the basis of NRU MPEI. The projects allow the complete list of tests required for timely property control and prevention of property deterioration to be performed. The financial component of implementing the developed projects is assessed on the basis of the modern test equipment used.

  4. [Information system of the national network of public health laboratories in Peru (Netlab)].

    PubMed

    Vargas-Herrera, Javier; Segovia-Juarez, José; Garro Nuñez, Gladys María

    2015-01-01

    Clinical laboratory information systems improve the quality of information, reduce service costs, and diminish wait times for results, among other benefits. As part of building such an information system, the National Institute of Health (NIH) of Peru has developed and implemented a web-based application, called NETLAB, to communicate to health personnel (laboratory workers, epidemiologists, health strategy managers, physicians, etc.) the results of laboratory tests performed at the Peruvian NIH or in the laboratories of the National Network of Public Health Laboratories. This article presents the experience of implementing NETLAB, its current situation, the outlook for its use, and its contribution to the prevention and control of diseases in Peru.

  5. Phase B: Final definition and preliminary design study for the initial Atmospheric Cloud Physics Laboratory (ACPL): A spacelab mission payload. Final review (DR-MA-03)

    NASA Technical Reports Server (NTRS)

    Clausen, O. W.

    1976-01-01

    Systems design for an initial atmospheric cloud physics laboratory to study microphysical processes in zero gravity is presented. Included are descriptions of the fluid, thermal, mechanical, control and data, and electrical distribution interfaces with Spacelab. Schedule and cost analysis are discussed.

  6. Control of Infectious Diseases in the Era of European Clinical Microbiology Laboratory Consolidation: New Challenges and Opportunities for the Patient and for Public Health Surveillance.

    PubMed

    Vandenberg, Olivier; Kozlakidis, Zisis; Schrenzel, Jacques; Struelens, Marc Jean; Breuer, Judith

    2018-01-01

    Many new innovative diagnostic approaches have been made available during the last 10 years with major impact on patient care and public health surveillance. In parallel, to enhance the cost-effectiveness of the clinical microbiology laboratories (CMLs), European laboratory professionals have streamlined their organization leading to amalgamation of activities and restructuring of their professional relationships with clinicians and public health specialists. Through this consolidation process, an operational model has emerged that combines large centralized clinical laboratories performing most tests on one high-throughput analytical platform connected to several distal laboratories dealing locally with urgent analyses at near point of care. The centralization of diagnostic services over a large geographical region has given rise to the concept of regional-scale "microbiology laboratories network." Although the volume-driven cost savings associated with such laboratory networks seem self-evident, the consequence(s) for the quality of patient care and infectious disease surveillance and control remain less obvious. In this article, we describe the range of opportunities that the changing landscape of CMLs in Europe can contribute toward improving the quality of patient care but also the early detection and enhanced surveillance of public health threats caused by infectious diseases. The success of this transformation of health services is reliant on the appropriate preparation in terms of staff, skills, and processes that would be inclusive of stakeholders. In addition, rigorous metrics are needed to set out more concrete laboratory service performance objectives and assess the expected benefits to society in terms of saving lives and preventing diseases.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kegel, T.M.

    Calibration laboratories are faced with the need to become accredited or registered to one or more quality standards. One requirement common to all of these standards is the need to have in place a measurement assurance program. What is a measurement assurance program? Brian Belanger, in Measurement Assurance Programs: Part 1, describes it as a "quality assurance program for a measurement process that quantifies the total uncertainty of the measurements (both random and systematic components of error) with respect to national or designated standards and demonstrates that the total uncertainty is sufficiently small to meet the user's requirements." Rolf Schumacher is more specific in Measurement Assurance in Your Own Laboratory. He states, "Measurement assurance is the application of broad quality control principles to measurements of calibrations." Here, the focus is on one important part of any measurement assurance program: implementation of statistical process control (SPC). Paraphrasing Juran's Quality Control Handbook, a process is in statistical control if the only observed variations are those that can be attributed to random causes. Conversely, a process that exhibits variations due to assignable causes is not in a state of statistical control. Finally, Carrol Croarkin states, "In the measurement assurance context the measurement algorithm including instrumentation, reference standards and operator interactions is the process that is to be controlled, and its direct product is the measurement per se. The measurements are assumed to be valid if the measurement algorithm is operating in a state of control." Implicit in this statement is the important fact that an out-of-control process cannot produce valid measurements. 7 figs.
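
    A minimal sketch of the SPC step described above: derive control limits from baseline measurements of a check standard and flag later readings outside mean ± 3 standard deviations as evidence of an assignable cause (all numbers invented for illustration).

```python
# Minimal sketch of the SPC idea described above: establish control limits from
# baseline measurements of a check standard, then flag later measurements that
# fall outside mean +/- 3 standard deviations as evidence of an assignable cause.
# The numbers are invented for illustration.
from statistics import mean, stdev

baseline = [10.001, 9.999, 10.002, 10.000, 9.998, 10.001, 10.000, 9.999]
center = mean(baseline)
sigma = stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

for reading in [10.001, 9.997, 10.006]:
    status = "in control" if lcl <= reading <= ucl else "OUT OF CONTROL"
    print(f"{reading:.3f}  {status}")
```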

  8. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation', to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.

  9. Recommendations for accreditation of laboratories in molecular biology of hematologic malignancies.

    PubMed

    Flandrin-Gresta, Pascale; Cornillet, Pascale; Hayette, Sandrine; Gachard, Nathalie; Tondeur, Sylvie; Mauté, Carole; Cayuela, Jean-Michel

    2015-01-01

    Over recent years, the development of molecular biology techniques has improved the diagnosis and follow-up of hematological diseases. Consequently, these techniques are widely used in the biological screening of these diseases; therefore, hemato-oncology molecular diagnostics laboratories must be actively involved in the accreditation process according to the ISO 15189 standard. The French group of molecular biologists (GBMHM) provides requirements for the implementation of quality assurance in medical molecular laboratories. This guideline states recommendations for pre-analytical, analytical (method validation procedures, quality controls, reagents), and post-analytical conditions. In addition, we state herein a strategy for internal quality control management. These recommendations will be updated regularly.

  10. Laboratory process control using natural language commands from a personal computer

    NASA Technical Reports Server (NTRS)

    Will, Herbert A.; Mackin, Michael A.

    1989-01-01

    PC software is described which provides flexible natural language process control capability with an IBM PC or compatible machine. Hardware requirements include the PC, and suitable hardware interfaces to all controlled devices. Software required includes the Microsoft Disk Operating System (MS-DOS) operating system, a PC-based FORTRAN-77 compiler, and user-written device drivers. Instructions for use of the software are given as well as a description of an application of the system.

  11. Role of the New South Wales Department of Primary Industries' Laboratory Information Management System (LIMS) in the 2007 equine influenza emergency animal disease response.

    PubMed

    Croft, M G; Fraser, G C; Gaul, W N

    2011-07-01

    A Laboratory Information Management System (LIMS) was used to manage the laboratory data and support planning and field activities as part of the response to the equine influenza outbreak in Australia in 2007. The database structure of the LIMS and the system configurations that were made to best handle the laboratory implications of the disease response are discussed. The operational aspects of the LIMS and the related procedures used at the laboratory to process the increased sample throughput are reviewed, as is the interaction of the LIMS with other corporate systems used in the management of the response. Outcomes from this tailored configuration and operation of the LIMS resulted in effective provision and control of the laboratory and laboratory information aspects of the response. The extent and immediate availability of the information provided from the LIMS was critical to some of the activities of key operatives involved in controlling the response. © 2011 The Authors. Australian Veterinary Journal © 2011 Australian Veterinary Association.

  12. [The external evaluation of study quality: the role in maintaining the reliability of laboratory information].

    PubMed

    Men'shikov, V V

    2013-08-01

    External quality assessment of clinical laboratory examinations was gradually introduced in USSR medical laboratories from the 1970s. In Russia, in the mid-1990s, a unified national system of external quality assessment was organized, known as the Federal center for external quality assessment, on the basis of a laboratory of the State Research Center of Preventive Medicine. The main policy positions in this area were clearly formulated in guidance documents of the Ministry of Health. Nowadays, the center for external quality assessment offers more than 100 types of control studies and continually extends their range in line with the interests of the different disciplines of clinical medicine. Consistent participation of laboratories in external quality assessment cycles promotes improvement in the trueness and precision of analysis results and increases the reliability of laboratory information. However, a significant percentage of laboratories do not participate in external quality assessment at all, or take part in the control process irregularly and for a limited number of tests. The managers of a number of medical organizations disregard the opportunities offered to increase the reliability of laboratory information and limit the financing of quality control activities. The article proposes adopting a national standard on the basis of ISO/IEC 17043, "Conformity assessment - General requirements for proficiency testing".

  13. B827 Chemical Synthesis Project - Industrial Control System Integration - Statement of Work & Specification with Attachments 1-14

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wade, F. E.

    The Chemical Synthesis Pilot Process at the Lawrence Livermore National Laboratory (LLNL) Site 300 827 Complex will be used to synthesize small quantities of material to support research and development. The project will modernize and increase current capabilities for chemical synthesis at LLNL. The primary objective of this project is the conversion of a non-automated hands-on process to a remote-operation process, while providing enhanced batch process step control, stored recipe-specific parameter sets, process variable visibility, monitoring, alarm and warning handling, and comprehensive batch record data logging. This Statement of Work and Specification provides the industrial-grade process control requirements for the chemical synthesis batching control system, hereafter referred to as the “Control System”, to be delivered by the System Integrator.
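
    The statement of work specifies requirements rather than an implementation; as a toy sketch of two of the named requirements, stored recipe-specific parameter sets and batch record data logging, one might write the following (recipe names, parameters, and steps are hypothetical and unrelated to the actual B827 process).

```python
# Toy sketch of two requirements named above: stored recipe-specific parameter
# sets and batch record data logging. Recipe names, parameters, and steps are
# hypothetical and unrelated to the actual B827 process.
from datetime import datetime, timezone

RECIPES = {
    "demo-recipe-A": {"reactor_temp_c": 25.0, "stir_rpm": 200, "hold_min": 30},
}

def run_batch(recipe_name: str) -> list[dict]:
    """Execute a (simulated) batch and return its record for the batch log."""
    params = RECIPES[recipe_name]
    record = []
    for step in ("charge", "react", "quench"):
        record.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "recipe": recipe_name,
            "step": step,
            "setpoints": params,   # recipe-specific parameter set in force for this step
        })
    return record

for entry in run_batch("demo-recipe-A"):
    print(entry["time"], entry["step"], entry["setpoints"])
```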

  14. Institutional practices and policies in acid-base testing: a self reported Croatian survey study on behalf of the Croatian society of medical biochemistry and laboratory medicine Working Group for acid-base balance.

    PubMed

    Dukić, Lora; Simundić, Ana-Maria

    2014-01-01

    The aim of this survey study was to assess the current practices and policies related to the various steps of the blood gas testing process across hospital laboratories in Croatia. A first questionnaire was sent by email to all medical biochemistry laboratories (N = 104) within general, specialized, and clinical hospitals and university hospital centres to identify laboratories which perform blood gas analysis. A second questionnaire, with detailed questions about sample collection, analysis, and quality control procedures, was sent only to the 47 laboratories identified by the first survey; it was designed as a combination of questions and statements with a Likert scale. A third questionnaire was sent to all participating laboratories (N = 47) for additional clarification of indeterminate or unclear answers. Blood gas analysis is performed in 47/104 hospital laboratories in Croatia. In 25/41 (0.61) of the laboratories, capillary blood gas sampling is the preferred sample type for the adult patient population, whereas arterial blood samples are preferentially used in only 5/44 laboratories (0.11). Blood sampling and sample processing for capillary samples are almost always done by laboratory technicians (36/41 and 37/44, respectively), whereas arterial blood sampling is almost always done by a physician (24/29) and only rarely by a nurse (5/28). Sample acceptance criteria and sample analysis are in accordance with international recommendations in the majority of laboratories. 43/44 laboratories participate in the national EQA program. POCT analyzers are installed outside the laboratory in 20/47 (0.43) institutions. Laboratory staff are responsible for education and training of ward personnel, quality control, and instrument maintenance in only 12/22, 11/20, and 9/20 institutions, respectively. Practices related to blood gas collection and analysis in Croatia are not standardised and vary substantially between laboratories. POCT analyzers are not under the direct supervision of laboratory personnel in a large proportion of the surveyed institutions. Collective efforts should be made to harmonize and improve policies and procedures related to blood gas testing in Croatian laboratories.

  15. The organization of perception and action in complex control skills

    NASA Technical Reports Server (NTRS)

    Miller, Richard A.; Jagacinski, Richard J.

    1989-01-01

    An attempt was made to describe the perceptual, cognitive, and action processes that account for highly skilled human performance in complex task environments. In order to study such performance in a controlled setting, a laboratory task was constructed and three experiments were performed using human subjects. A general framework was developed for describing the organization of perceptual, cognitive, and action processes.

  16. The LabTube - a novel microfluidic platform for assay automation in laboratory centrifuges.

    PubMed

    Kloke, A; Fiebach, A R; Zhang, S; Drechsel, L; Niekrawietz, S; Hoehl, M M; Kneusel, R; Panthel, K; Steigert, J; von Stetten, F; Zengerle, R; Paust, N

    2014-05-07

    Assay automation is the key for successful transformation of modern biotechnology into routine workflows. Yet, it requires considerable investment in processing devices and auxiliary infrastructure, which is not cost-efficient for laboratories with low or medium sample throughput or point-of-care testing. To close this gap, we present the LabTube platform, which is based on assay specific disposable cartridges for processing in laboratory centrifuges. LabTube cartridges comprise interfaces for sample loading and downstream applications and fluidic unit operations for release of prestored reagents, mixing, and solid phase extraction. Process control is achieved by a centrifugally-actuated ballpen mechanism. To demonstrate the workflow and functionality of the LabTube platform, we show two LabTube automated sample preparation assays from laboratory routines: DNA extractions from whole blood and purification of His-tagged proteins. Equal DNA and protein yields were observed compared to manual reference runs, while LabTube automation could significantly reduce the hands-on-time to one minute per extraction.

  17. The Impact of Internet Virtual Physics Laboratory Instruction on the Achievement in Physics, Science Process Skills and Computer Attitudes of 10th-Grade Students

    NASA Astrophysics Data System (ADS)

    Yang, Kun-Yuan; Heh, Jia-Sheng

    2007-10-01

    The purpose of this study was to investigate and compare the impact of Internet Virtual Physics Laboratory (IVPL) instruction with that of traditional laboratory instruction on the physics academic achievement, performance of science process skills, and computer attitudes of tenth-grade students. One hundred and fifty students from four classes at one private senior high school in Taoyuan County, Taiwan, R.O.C., were sampled. The four classes were divided into an experimental group and a control group of 75 students each. The pre-test results indicated that the students' entry-level physics academic achievement, science process skills, and computer attitudes were equal for both groups. On the post-test, the experimental group achieved significantly higher mean scores in physics academic achievement and science process skills. There was no significant difference in computer attitudes between the groups. We concluded that the IVPL had potential to help tenth graders improve their physics academic achievement and science process skills.

  18. Informatics applied to cytology

    PubMed Central

    Hornish, Maryanne; Goulart, Robert A.

    2008-01-01

    Automation and emerging information technologies are being adopted by cytology laboratories to augment Pap test screening and improve diagnostic accuracy. As a result, informatics, the application of computers and information systems to information management, has become essential for the successful operation of the cytopathology laboratory. This review describes how laboratory information management systems can be used to achieve an automated and seamless workflow process. The utilization of software, electronic databases and spreadsheets to perform necessary quality control measures is discussed, as well as the use of a Lean production system and Six Sigma approach to reduce errors in the cytopathology laboratory. PMID:19495402

  19. Testing of the Defense Waste Processing Facility Cold Chemical Dissolution Method in Sludge Batch 9 Qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, T.; Pareizs, J.; Coleman, C.

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) tests the applicability of the digestion methods used by the DWPF Laboratory for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) Receipt samples and SRAT Product process control samples. DWPF SRAT samples are typically dissolved using a method referred to as the DWPF Cold Chemical or Cold Chem Method (CC) (see DWPF Procedure SW4-15.201). Testing indicated that the CC method produced mixed results. The CC method did not result in complete dissolution of either the SRAT Receipt or the SRAT Product, with some fine, dark solids remaining. However, elemental analyses did not reveal extreme biases for the major elements in the sludge when compared with analyses obtained following dissolution by the hot aqua regia (AR) or sodium peroxide fusion (PF) methods. The CC elemental analyses agreed with the AR and PF methods well enough that the CC method should be adequate for routine process control analyses in the DWPF, once much more extensive side-by-side tests of the CC method and the PF method have been performed on the first 10 SRAT cycles of the Sludge Batch 9 (SB9) campaign. The DWPF Laboratory should continue with their plans for further tests of the CC method during these 10 SRAT cycles.

  20. INTERIOR OF SECOND FLOOR CONTROL ROOM OF FUEL STORAGE BUILDING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR OF SECOND FLOOR CONTROL ROOM OF FUEL STORAGE BUILDING (CPP-603). PHOTO TAKEN LOOKING SOUTHWEST. INL PHOTO NUMBER HD-54-19-2. Mike Crane, Photographer, 8/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  1. Zero-Gravity Atmospheric Cloud Physics Experiment Laboratory engineering concepts/design tradeoffs. Volume 1: Study results

    NASA Technical Reports Server (NTRS)

    Greco, R. V.; Eaton, L. R.; Wilkinson, H. C.

    1974-01-01

    The work accomplished from January 1974 to October 1974 for the Zero-Gravity Atmospheric Cloud Physics Laboratory is summarized. The definition and development of an atmospheric cloud physics laboratory and the selection and delineation of candidate experiments that require the unique environment of zero gravity or near-zero gravity are reported. The experiment program and the laboratory concept for a Spacelab payload to perform cloud microphysics research are defined. This multimission laboratory is planned to be available to the entire scientific community for use in furthering the basic understanding of cloud microphysical processes and phenomena, thereby contributing to improved weather prediction and, ultimately, to beneficial weather control and modification.

  2. Flow Control and Measurement in Electric Propulsion Systems: Towards an AIAA Reference Standard

    NASA Technical Reports Server (NTRS)

    Snyder, John Steven; Baldwin, Jeff; Frieman, Jason D.; Walker, Mitchell L. R.; Hicks, Nathan S.; Polzin, Kurt A.; Singleton, James T.

    2013-01-01

    Accurate control and measurement of propellant flow to a thruster is one of the most basic and fundamental requirements for operation of electric propulsion systems, whether they be in the laboratory or on flight spacecraft. Hence, it is important for the electric propulsion community to have a common understanding of typical methods for flow control and measurement. This paper addresses the topic of propellant flow primarily for the gaseous propellant systems which have dominated laboratory research and flight application over the last few decades, although other types of systems are also briefly discussed. While most flight systems have employed a type of pressure-fed flow restrictor for flow control, both thermal-based and pressure-based mass flow controllers are routinely used in laboratories. Fundamentals and theory of operation of these types of controllers are presented, along with sources of uncertainty associated with their use. Methods of calibration and recommendations for calibration processes are presented. Finally, details of uncertainty calculations are presented for some common calibration methods and for the linear fits to calibration data that are commonly used.
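    As an illustration of the last point, the sketch below fits a straight line to hypothetical mass-flow-controller calibration data and propagates the residual scatter into standard uncertainties for the slope and intercept. The data, names, and the ordinary-least-squares treatment are illustrative assumptions, not material from the paper.

      # Least-squares calibration of a flow controller against a reference standard,
      # with simple standard uncertainties for the fitted slope and intercept.
      import numpy as np

      # Hypothetical calibration data: commanded setpoint vs. reference reading (sccm)
      setpoint  = np.array([10.0, 20.0, 40.0, 60.0, 80.0, 100.0])
      reference = np.array([10.3, 20.1, 39.6, 59.8, 80.4, 99.7])

      n = setpoint.size
      a, b = np.polyfit(setpoint, reference, 1)        # slope and intercept
      pred = a * setpoint + b

      # Residual standard deviation of the fit (n - 2 degrees of freedom)
      s = np.sqrt(np.sum((reference - pred) ** 2) / (n - 2))

      # Classical OLS standard uncertainties of slope and intercept
      sxx = np.sum((setpoint - setpoint.mean()) ** 2)
      u_slope = s / np.sqrt(sxx)
      u_intercept = s * np.sqrt(1.0 / n + setpoint.mean() ** 2 / sxx)

      print(f"fit: reference = {a:.4f} * setpoint + {b:+.4f} sccm")
      print(f"u(slope) = {u_slope:.4f}, u(intercept) = {u_intercept:.4f} sccm, s = {s:.4f} sccm")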

  3. [Logistics of collection and transportation of biological samples and the organization of the central laboratory in the ELSA-Brasil].

    PubMed

    Fedeli, Ligia G; Vidigal, Pedro G; Leite, Claudia Mendes; Castilhos, Cristina D; Pimentel, Robércia Anjos; Maniero, Viviane C; Mill, Jose Geraldo; Lotufo, Paulo A; Pereira, Alexandre C; Bensenor, Isabela M

    2013-06-01

    The ELSA (Estudo Longitudinal de Saúde do Adulto - Brazilian Longitudinal Study for Adult Health) is a multicenter cohort study which aims at the identification of risk factors associated with type 2 diabetes and cardiovascular diseases in the Brazilian population. The paper describes the strategies for the collection, processing, transportation, and quality control of blood and urine tests in the ELSA. It was decided to centralize the tests at a single laboratory. The processing of the samples was performed at the local laboratories, reducing the weight of the material to be transported and diminishing the costs of transportation to the central laboratory at the Universidade de São Paulo Hospital. The study included tests for the evaluation of diabetes, insulin resistance, dyslipidemia, electrolyte abnormalities, thyroid hormones, uric acid, hepatic enzyme abnormalities, inflammation, and total blood cell count. In addition, leukocyte DNA, urine, plasma and serum samples were stored. The central laboratory performed approximately 375,000 tests.

  4. Technology to improve quality and accountability.

    PubMed

    Kay, Jonathan

    2006-01-01

    A body of evidence has been accumulated to demonstrate that current practice is not sufficiently safe for several stages of central laboratory testing. In particular, while analytical and perianalytical steps that take place within the laboratory are subjected to quality control procedures, this is not the case for several pre- and post-analytical steps. The ubiquitous application of auto-identification technology seems to represent a valuable tool for reducing error rates. A series of projects in Oxford has attempted to improve processes which support several areas of laboratory medicine, including point-of-care testing, blood transfusion, delivery and interpretation of reports, and support of decision-making by clinicians. The key tools are auto-identification, Internet communication technology, process re-engineering, and knowledge management.

  5. KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, Center Director Roy Bridges (left), Program Manager of the International Space Station (ISS) Randy Brinkley (second from left) and STS-98 Commander Ken Cockrell (right) applaud the unveiling of the name "Destiny" for the U.S. Laboratory module. The lab, which is behind them on a workstand, is scheduled to be launched on STS-98 on Space Shuttle Endeavour in early 2000. It will become the centerpiece of scientific research on the ISS. The Shuttle will spend six days docked to the Station while the laboratory is attached and three spacewalks are conducted to complete its assembly. The laboratory will be launched with five equipment racks aboard, which will provide essential functions for Station systems, including high data-rate communications, and maintain the Station's orientation using control gyroscopes launched earlier. Additional equipment and research racks will be installed in the laboratory on subsequent Shuttle flights.

    NASA Image and Video Library

    1998-12-01

    KENNEDY SPACE CENTER, FLA. - In the Space Station Processing Facility, Center Director Roy Bridges (left), Program Manager of the International Space Station (ISS) Randy Brinkley (second from left) and STS-98 Commander Ken Cockrell (right) applaud the unveiling of the name "Destiny" for the U.S. Laboratory module. The lab, which is behind them on a workstand, is scheduled to be launched on STS-98 on Space Shuttle Endeavour in early 2000. It will become the centerpiece of scientific research on the ISS. The Shuttle will spend six days docked to the Station while the laboratory is attached and three spacewalks are conducted to complete its assembly. The laboratory will be launched with five equipment racks aboard, which will provide essential functions for Station systems, including high data-rate communications, and maintain the Station's orientation using control gyroscopes launched earlier. Additional equipment and research racks will be installed in the laboratory on subsequent Shuttle flights.

  6. Control of Infectious Diseases in the Era of European Clinical Microbiology Laboratory Consolidation: New Challenges and Opportunities for the Patient and for Public Health Surveillance

    PubMed Central

    Vandenberg, Olivier; Kozlakidis, Zisis; Schrenzel, Jacques; Struelens, Marc Jean; Breuer, Judith

    2018-01-01

    Many new innovative diagnostic approaches have been made available during the last 10 years with major impact on patient care and public health surveillance. In parallel, to enhance the cost-effectiveness of the clinical microbiology laboratories (CMLs), European laboratory professionals have streamlined their organization leading to amalgamation of activities and restructuring of their professional relationships with clinicians and public health specialists. Through this consolidation process, an operational model has emerged that combines large centralized clinical laboratories performing most tests on one high-throughput analytical platform connected to several distal laboratories dealing locally with urgent analyses at near point of care. The centralization of diagnostic services over a large geographical region has given rise to the concept of regional-scale “microbiology laboratories network.” Although the volume-driven cost savings associated with such laboratory networks seem self-evident, the consequence(s) for the quality of patient care and infectious disease surveillance and control remain less obvious. In this article, we describe the range of opportunities that the changing landscape of CMLs in Europe can contribute toward improving the quality of patient care but also the early detection and enhanced surveillance of public health threats caused by infectious diseases. The success of this transformation of health services is reliant on the appropriate preparation in terms of staff, skills, and processes that would be inclusive of stakeholders. In addition, rigorous metrics are needed to set out more concrete laboratory service performance objectives and assess the expected benefits to society in terms of saving lives and preventing diseases. PMID:29457001

  7. Development and implementation of the Caribbean Laboratory Quality Management Systems Stepwise Improvement Process (LQMS-SIP) Towards Accreditation.

    PubMed

    Alemnji, George; Edghill, Lisa; Guevara, Giselle; Wallace-Sankarsingh, Sacha; Albalak, Rachel; Cognat, Sebastien; Nkengasong, John; Gabastou, Jean-Marc

    2017-01-01

    Implementing quality management systems and accrediting laboratories in the Caribbean has been a challenge. We report the development of a stepwise process for quality systems improvement in the Caribbean Region. The Caribbean Laboratory Stakeholders met under a joint Pan American Health Organization/US Centers for Disease Control and Prevention initiative and developed a user-friendly framework called 'Laboratory Quality Management System - Stepwise Improvement Process (LQMS-SIP) Towards Accreditation' to support countries in strengthening laboratory services through a stepwise approach toward fulfilling the ISO 15189: 2012 requirements. This approach consists of a three-tiered framework. Tier 1 represents the minimum requirements corresponding to the mandatory criteria for obtaining a licence from the Ministry of Health of the participating country. The next two tiers are quality improvement milestones that are achieved through the implementation of specific quality management system requirements. Laboratories that meet the requirements of the three tiers will be encouraged to apply for accreditation. The Caribbean Regional Organisation for Standards and Quality hosts the LQMS-SIP Secretariat and will work with countries, including the Ministry of Health and stakeholders, including laboratory staff, to coordinate and implement LQMS-SIP activities. The Caribbean Public Health Agency will coordinate and advocate for the LQMS-SIP implementation. This article presents the Caribbean LQMS-SIP framework and describes how it will be implemented among various countries in the region to achieve quality improvement.

  8. An overview of quality control practices in Ontario with particular reference to cholesterol analysis.

    PubMed

    Krishnan, S; Webb, S; Henderson, A R; Cheung, C M; Nazir, D J; Richardson, H

    1999-03-01

    The Laboratory Proficiency Testing Program (LPTP) assesses the analytical performance of all licensed laboratories in Ontario. The LPTP Enzymes, Cardiac Markers, and Lipids Committee conducted a "Patterns of Practice" survey to assess the in-house quality control (QC) practices of laboratories in Ontario using cholesterol as the QC paradigm. The survey was questionnaire-based seeking information on statistical calculations, software rules, review process and data retention, and so on. Copies of the in-house cholesterol QC graphs were requested. A total of 120 of 210 laboratories were randomly chosen to receive the questionnaires during 1995 and 1996; 115 laboratories responded, although some did not answer all questions. The majority calculate means and standard deviations (SD) every month, using anywhere from 4 to >100 data points. 65% use a fixed mean and SD, while 17% use means calculated from the previous month. A few use a floating or cumulative mean. Some laboratories that do not use fixed means use a fixed SD. About 90% use some form of statistical quality control rules. The most common rules used to detect random error are 1(3s)/R4s while 2(2s)/4(1s)/10x are used for systematic errors. About 20% did not assay any QC at levels >5.5 mmol/L. Quality control data are reviewed daily (technologists), weekly and monthly (supervisors/directors). Most laboratories retain their QC records for up to 3 years on paper and magnetic media. On some QC graphs the mean and SD, QC product lot number, or reference to action logs are not apparent. Quality control practices in Ontario are, therefore, disappointing. Improvement is required in the use of clinically appropriate concentrations of QC material and documentation on QC graphs.
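    For illustration only, the sketch below checks a run of hypothetical cholesterol quality control results against two of the rules cited in the survey, 1(3s) for random error and 2(2s) for systematic error. The data, the fixed mean and SD, and the rule implementations are assumptions and do not reproduce the LPTP's or any laboratory's software.

      # Two of the control rules mentioned above, applied to hypothetical QC data.
      def violates_1_3s(values, mean, sd):
          """1(3s): any single observation more than 3 SD from the mean."""
          return any(abs(v - mean) > 3 * sd for v in values)

      def violates_2_2s(values, mean, sd):
          """2(2s): two consecutive observations beyond 2 SD on the same side."""
          for prev, curr in zip(values, values[1:]):
              d_prev, d_curr = prev - mean, curr - mean
              if abs(d_prev) > 2 * sd and abs(d_curr) > 2 * sd and d_prev * d_curr > 0:
                  return True
          return False

      # Hypothetical cholesterol QC results (mmol/L) against a fixed mean and SD
      qc_values = [5.20, 5.31, 5.58, 5.62, 5.15]
      mean, sd = 5.25, 0.15

      print("1(3s) violated:", violates_1_3s(qc_values, mean, sd))   # False
      print("2(2s) violated:", violates_2_2s(qc_values, mean, sd))   # True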

  9. Quality management and accreditation in a mixed research and clinical hair testing analytical laboratory setting-a review.

    PubMed

    Fulga, Netta

    2013-06-01

    Quality management and accreditation in the analytical laboratory setting are developing rapidly and becoming the standard worldwide. Quality management refers to all the activities used by organizations to ensure product or service consistency. Accreditation is a formal recognition by an authoritative regulatory body that a laboratory is competent to perform examinations and report results. The Motherisk Drug Testing Laboratory is licensed to operate at the Hospital for Sick Children in Toronto, Ontario. The laboratory performs toxicology tests of hair and meconium samples for research and clinical purposes. Most of the samples are involved in chain-of-custody cases. Establishing a quality management system and achieving accreditation became mandatory by legislation for all Ontario clinical laboratories in 2003. The Ontario Laboratory Accreditation program is based on International Organization for Standardization 15189-Medical laboratories-Particular requirements for quality and competence, an international standard that has been adopted as a national standard in Canada. The implementation of a quality management system involves management commitment, planning and staff education, documentation of the system, validation of processes, and assessment against the requirements. The maintenance of a quality management system requires control and monitoring of the entire laboratory path of workflow. The process of transformation of a research/clinical laboratory into an accredited laboratory, and the benefits of maintaining an effective quality management system, are presented in this article.

  10. Logistics engineering education from the point of view environment

    NASA Astrophysics Data System (ADS)

    Bányai, Ágota

    2010-05-01

    A new field of MSc programme offered by the Faculty of Mechanical Engineering and Informatics of the University of Miskolc is represented by the programme in logistics engineering. The Faculty has always laid great emphasis on assigning processes connected with environment protection and globalisation issues the appropriate weight in its programmes. This is based on the fact that the Faculty has initiated and been involved in a great number of research and development projects with a substantial emphasis on the fundamental principles of sustainable development. The objective of the programme of logistics engineering is to train engineers who, in possession of the science, engineering, economic, informatics and industrial, transportation technological knowledge related to the professional field of logistics, are able to analyse, design, organise, and control logistics processes and systems (freight transportation, materials handling, storage, commissioning, loading, purchasing, distribution and waste management) as well as to design and develop machinery and equipment as the elements of logistic systems and also to be involved in their manufacture and quality control and are able to control their operation. The programme prepares its students for performing the logistics management tasks in a company, for creative participation in solving research and development problems in logistics and for pursuing logistics studies in doctoral programmes. There are several laboratories available for practice-oriented training. The 'Integrated Logistics Laboratory' consists of various fixed and mobile, real industrial, i.e. not model-level equipment, the integration of which in one system facilitates not only the presentation, examination and development of the individual self-standing facilities, but the study of their interaction as well in terms of mechatronics, engineering, control engineering, informatics, identification technology and logistics. The state-of-the-art, reliable, automated mechatronics-material flow system with its single control engineering system provides the academic staff with up-to-date research facilities, and enables the students to study sophisticated equipment and systems that could also operate under industrial conditions, thus offering knowledge that can be efficiently utilised in the industry after graduation. The laboratory measurements of the programme in logistics engineering are performed in this laboratory, and they are supplemented by the theoretical and practical measurements in the ‘Robotic Technology Assembly Laboratory', the ‘Power Electronics Laboratory', the ‘Mechatronics Laboratory', the ‘CAD/CAM Laboratory' and the ‘Acoustics and Product Laboratory'. The bodies of knowledge connected with environment protection and sustainable development can be grouped around three large topic areas. In environmental economics the objective is to present the corporate-organisational aspects of environmental management. Putting environmental management in the focal point, the objective of the programme is to impart knowledge that can be utilised in practice which can be used to shift the relation between the organisation and its environment in the direction of sustainability. The tools include environmental controlling, environmental marketing and various solutions of environmental performance evaluation. The second large topic area is globalization and its logistic aspects. 
In the field of global logistics the following knowledge carries special weight: logistic challenges in a globalised world; the concept of global logistics, its conditions and effects; delayed manufacture, assembly, packaging; the economic investigation of delayed assembly; globalised purchase and distribution in logistics; the logistic features of the globalised production supply/distribution chain; meta-logistics systems; logistics-related EU harmonisation issues; the effect of e-commerce on the global logistic system; logistic centres, connecting virtual logistic companies in a network; the environmental harmonisation of international transportation. The third large area is recycling logistics. Here the bodies of knowledge are as follows: the concept of developing a ‘closed-loop economy'; stages in the progress of products after discarding, connections between the uses of waste collection, processing, selection, deposition or reuse processes; features of European recommendations (e.g. EMAS), harmonisation of national practices and global solutions; presenting the logistics part-processes of recycling; presenting process organisation procedures for the foundation of designing one-route, multi-route, replacement container waste collecting and distributing part systems; recycling strategies with consideration of logistically serving the separation and storage of waste to be deposited, the technological processing systems of recyclable materials; presenting dismantling and product and material identification technologies, presenting logistics part-tasks, analysis of technical solutions; IT solutions for identifying products and their elements to be distributed and withdrawn from distribution after use (e.g. RFID systems) and monitoring their material flow; methodology of using efficiency analyses and incentive systems in the decision making processes of recycling processes, risk analysis for evaluating typical part processes; the methodology of recycling-oriented product design for specific product groups. Graduates of the Master programmes are able to use and utilise the knowledge obtained in practice, use problem-solving techniques; process the information, new problems and new phenomena arising in the border areas of the professional experience gained the discipline; formulate substantial criticism and opinions as far as possible, make decisions and draw conclusions; comprehending and solving the problems arising, suggesting original ideas; plan and perform tasks independently at a high professional standard; improve themselves, develop their knowledge to higher levels; view the management of technical/engineering - economic - human resources in a complex way; design complex systems in a global way based on a system-oriented and process-oriented way of thinking; use integrated knowledge from the professional fields of transport, mobile machinery, process theory, industrial production processes, electronics and informatics; combine the part processes of logistics systems and the part units performing their physical realisation (materials handling equipment, sensors, actuators, control systems, and database systems, etc.); perform state evaluations depending on their specialisation, use them to elaborate evaluations and recommendations, develop complex logistic systems, design, organise and control them at the highest level. This work was implemented with support by the European Union and co-funding of the European Social Fund.

  11. Monitoring sodium levels in commercially processed and restaurant foods - dataset and webpages.

    USDA-ARS?s Scientific Manuscript database

    Nutrient Data Laboratory (NDL), Agriculture Research Service (ARS) in collaboration with Food Surveys Research Group, ARS, and the Centers for Disease Control and Prevention has been monitoring commercially processed and restaurant foods in the United States since 2010. About 125 highly consumed, s...

  12. [Internal audit in medical laboratory: what means of control for an effective audit process?].

    PubMed

    Garcia-Hejl, Carine; Chianéa, Denis; Dedome, Emmanuel; Sanmartin, Nancy; Bugier, Sarah; Linard, Cyril; Foissaud, Vincent; Vest, Philippe

    2013-01-01

    To prepare for the French Accreditation Committee (COFRAC) visit for the initial certification of our medical laboratory, our management evaluated its quality management system (QMS) and all its technical activities. This evaluation was performed through an internal audit. The audit was outsourced: the auditors had expertise in auditing, a thorough knowledge of biological standards, and were independent. Several nonconformities were identified at that time, including a lack of control of several steps of the internal audit process. Hence, the necessary corrective actions were taken in order to meet the requirements of the standards, in particular the formalization of all stages, from the audit program to the implementation, review and follow-up of the corrective actions taken, and also the implementation of the resources needed to carry out audits within a pre-established timeframe. To ensure optimal control of each step, the main concepts of risk management were applied: process approach, root cause analysis, and failure modes, effects and criticality analysis (FMECA). After a critical analysis of our practices, this methodology allowed us to define our "internal audit" process, then to formalize it and to follow it up, supported by a complete documentary system.

  13. Development of in-situ control diagnostics for application of epitaxial superconductor and buffer layers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    B.C. Winkleman; T.V. Giel; Jason Cunningham

    1999-07-30

    The recent achievements of critical currents in excess of 1 x 10^6 amp/cm^2 at 77 K in YBCO deposited over suitably textured buffer/substrate composites have stimulated interest in the potential fabrication of these coated conductors as wire. Numerous approaches and manufacturing schemes for producing coated conductor wire are currently being developed. Recently, under the US DOE's sponsorship, the University of Tennessee Space Institute performed an extensive evaluation of leading coated conductor processing options. In general, it is their feeling that the science and chemistry that are being developed in the coated conductor wire program now need proper engineering evaluation to define the most viable options for a commercial fabrication process. All fabrication processes will need process control measurements. This report provides a specific review of the needs and available technologies for process control for many of the coated conductor processing options. This report also addresses generic process monitoring areas in which additional research and development is needed. The concentration is on the two different approaches for obtaining the textured substrates that have been identified as viable candidates. These are the Los Alamos National Laboratory's ion-beam assisted deposition, called IBAD, to obtain a highly textured yttria-stabilized zirconia (YSZ) buffer on nickel alloy strips, and Oak Ridge National Laboratory's rolling assisted, bi-axially textured substrate option, called RABiTS™.

  14. Reproducing stone monument photosynthetic-based colonization under laboratory conditions.

    PubMed

    Miller, Ana Zélia; Laiz, Leonila; Gonzalez, Juan Miguel; Dionísio, Amélia; Macedo, Maria Filomena; Saiz-Jimenez, Cesareo

    2008-11-01

    In order to understand the biodeterioration processes occurring on stone monuments, we analyzed the microbial communities involved in these processes and studied their ability to colonize stone in controlled laboratory experiments. In this study, a natural green biofilm from a limestone monument was cultivated, inoculated onto stone probes of the same lithotype and incubated in a laboratory chamber. This incubation system, which exposes the stone samples to intermittently sprinkling water, allowed the development of photosynthetic biofilms similar to those occurring on stone monuments. Denaturing gradient gel electrophoresis (DGGE) analysis was used to evaluate the major microbial components of the laboratory biofilms. Cyanobacteria, green microalgae, bacteria and fungi were identified by DNA-based molecular analysis targeting the 16S and 18S ribosomal RNA genes. The natural green biofilm was mainly composed of the Chlorophyta Chlorella, Stichococcus, and Trebouxia, and of Cyanobacteria belonging to the genera Leptolyngbya and Pleurocapsa. A number of bacteria belonging to Alphaproteobacteria, Bacteroidetes and Verrucomicrobia were identified, as well as fungi from the Ascomycota. The laboratory colonization experiment on stone probes showed a colonization pattern similar to that occurring on stone monuments. The methodology described in this paper allowed us to reproduce colonization equivalent to the natural biodeterioration process.

  15. Laboratory Modelling of Volcano Plumbing Systems: a review

    NASA Astrophysics Data System (ADS)

    Galland, Olivier; Holohan, Eoghan P.; van Wyk de Vries, Benjamin; Burchardt, Steffi

    2015-04-01

    Earth scientists have, since the 19th century, tried to replicate or model geological processes in controlled laboratory experiments. In particular, laboratory modelling has been used to study the development of volcanic plumbing systems, which set the stage for volcanic eruptions. Volcanic plumbing systems involve complex processes that act at length scales of microns to thousands of kilometres and at time scales from milliseconds to billions of years, and laboratory models appear very suitable to address them. This contribution reviews laboratory models dedicated to studying the dynamics of volcano plumbing systems (Galland et al., Accepted). The foundation of laboratory models is the choice of relevant model materials, both for rock and magma. We outline the broad range of suitable model materials used in the literature. These materials exhibit very diverse rheological behaviours, so their careful choice is a crucial first step in proper experiment design. The second step is model scaling, which successively calls upon: (1) the principle of dimensional analysis, and (2) the principle of similarity. The dimensional analysis aims to identify the dimensionless physical parameters that govern the underlying processes. The principle of similarity states that "a laboratory model is equivalent to its geological analogue if the dimensionless parameters identified in the dimensional analysis are identical, even if the values of the governing dimensional parameters differ greatly" (Barenblatt, 2003). The application of these two steps ensures a solid understanding and the geological relevance of the laboratory models. In addition, this procedure shows that laboratory models are not designed to exactly mimic a given geological system, but to understand underlying generic processes, either individually or in combination, and to identify or demonstrate the physical laws that govern these processes. From this perspective, we review the numerous applications of laboratory models to understand the distinct key features of volcanic plumbing systems: dykes, cone sheets, sills, laccoliths, caldera-related structures, ground deformation, magma/fault interactions, and explosive vents. Barenblatt, G.I., 2003. Scaling. Cambridge University Press, Cambridge. Galland, O., Holohan, E.P., van Wyk de Vries, B., Burchardt, S., Accepted. Laboratory modelling of volcanic plumbing systems: A review, in: Breitkreuz, C., Rocchi, S. (Eds.), Laccoliths, sills and dykes: Physical geology of shallow level magmatic systems. Springer.
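    The similarity step described above can be shown with a minimal sketch: the same dimensionless parameter is computed for the laboratory model and for its geological analogue and the two values are compared. The parameter chosen here (magma driving pressure over host cohesion) and all numerical values are illustrative assumptions, not figures taken from the review.

      # Similarity check between a laboratory model and nature for one governing
      # dimensionless number; parameter choice and values are illustrative only.
      def pi_driving(delta_p_pa, cohesion_pa):
          """Ratio of driving pressure to host-material cohesion (dimensionless)."""
          return delta_p_pa / cohesion_pa

      # Hypothetical nature values: ~10 MPa magma overpressure, ~40 MPa rock cohesion
      pi_nature = pi_driving(delta_p_pa=1.0e7, cohesion_pa=4.0e7)

      # Hypothetical laboratory values: ~100 Pa injection pressure, ~400 Pa granular cohesion
      pi_lab = pi_driving(delta_p_pa=1.0e2, cohesion_pa=4.0e2)

      similar = abs(pi_nature - pi_lab) / pi_nature < 0.1
      print(f"Pi_nature = {pi_nature:.2f}, Pi_lab = {pi_lab:.2f}, similar: {similar}")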

  16. Secure Control Systems for the Energy Sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Rhett; Campbell, Jack; Hadley, Mark

    2012-03-31

    Schweitzer Engineering Laboratories (SEL) will conduct the Hallmark Project to address the need to reduce the risk of energy disruptions because of cyber incidents on control systems. The goal is to develop solutions that can be both applied to existing control systems and designed into new control systems to add the security measures needed to mitigate energy network vulnerabilities. The scope of the Hallmark Project contains four primary elements: 1. Technology transfer of the Secure Supervisory Control and Data Acquisition (SCADA) Communications Protocol (SSCP) from Pacific Northwest National Laboratory (PNNL) to Schweitzer Engineering Laboratories (SEL). The project shall use this technology to develop a Federal Information Processing Standard (FIPS) 140-2 compliant original equipment manufacturer (OEM) module, to be called a Cryptographic Daughter Card (CDC), with the ability to connect directly to any PC, enabling that computer to communicate securely across serial links to field devices. The OEM capabilities will be validated with another vendor. 2. Development of a Link Authenticator Module (LAM) using the FIPS 140-2 validated Secure SCADA Communications Protocol (SSCP) CDC module with a central management software kit. 3. Validation of the CDC and Link Authenticator modules via laboratory and field tests. 4. Creation of documents that record the impact of the Link Authenticator on the operators of control systems and on the control system itself. The information in the documents can assist others with technology deployment and maintenance.
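    As a generic illustration of the idea behind link authentication (not the SSCP or the Link Authenticator design), the sketch below appends a keyed message authentication code to a serial frame so the receiving end can verify integrity and origin. The key handling, tag length, and frame bytes are illustrative assumptions.

      # Generic keyed-MAC protection of a serial frame; illustrative only.
      import hmac, hashlib, os

      key = os.urandom(32)                              # shared link key (hypothetical provisioning)

      def protect(frame: bytes) -> bytes:
          """Append a truncated HMAC-SHA-256 tag to an outgoing frame."""
          tag = hmac.new(key, frame, hashlib.sha256).digest()[:8]
          return frame + tag

      def verify(message: bytes) -> bytes:
          """Check and strip the tag on an incoming message; raise if it fails."""
          frame, tag = message[:-8], message[-8:]
          expected = hmac.new(key, frame, hashlib.sha256).digest()[:8]
          if not hmac.compare_digest(tag, expected):
              raise ValueError("link authentication failed")
          return frame

      wire = protect(b"\x01\x03\x00\x10\x00\x02")       # example request bytes
      print(verify(wire))                                # original frame if untampered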

  17. Current status of verification practices in clinical biochemistry in Spain.

    PubMed

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
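    As a minimal illustration of two of the reported criteria (verification limits and a delta check against the previous result), the sketch below either releases or holds a result. The analytes, limits, and the 20% delta threshold are hypothetical examples, not the rules of any surveyed laboratory.

      # Toy autoverification of a single result using verification limits and a delta check.
      def autoverify(analyte, value, limits, previous=None, delta_frac=0.20):
          low, high = limits[analyte]
          if not (low <= value <= high):
              return "hold: outside verification limits"
          if previous is not None and previous > 0:
              if abs(value - previous) / previous > delta_frac:
                  return "hold: delta check failed"
          return "release"

      verification_limits = {"potassium": (2.5, 6.5), "glucose": (2.0, 25.0)}   # mmol/L, hypothetical

      print(autoverify("potassium", 4.1, verification_limits, previous=4.3))    # release
      print(autoverify("potassium", 6.9, verification_limits, previous=4.3))    # outside limits
      print(autoverify("glucose", 14.0, verification_limits, previous=6.0))     # delta check failed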

  18. How do laboratory technicians perceive their role in the tuberculosis diagnostic process? A cross-sectional study among laboratory technicians in health centers of Central Java Province, Indonesia.

    PubMed

    Widjanarko, Bagoes; Widyastari, Dyah Anantalia; Martini, Martini; Ginandjar, Praba

    2016-01-01

    Detection of acid-fast bacilli in respiratory specimens serves as an initial pulmonary tuberculosis (TB) diagnosis. Laboratories are the essential and fundamental part of all health systems. This study aimed to describe how laboratory technicians perceived their own self and work. This included perceived self-efficacy, perceived role, perceived equipment availability, perceived procedures, perceived reward and job, and perceived benefit of health education, as well as level of knowledge and attitudes related to work performance of laboratory technicians. This was a cross-sectional quantitative study involving 120 laboratory technicians conducted in Central Java. Interviews and observation were conducted to measure performance and work-related variables. Among 120 laboratory technicians, 43.3% showed fairly good performance. They complied with 50%-75% of all procedures, including sputum collection, laboratory tools utilization, sputum smearing, staining, smear examination, grading of results, and universal precaution practice. Perceived role, perceived self-efficacy, and knowledge of laboratory procedures were significantly correlated to performance, besides education and years of working as a laboratory technician. Perceived equipment availability was also significantly correlated to performance after the education variable was controlled. Most of the laboratory technicians believed that they have an important role in TB patients' treatment and should display proper self-efficacy in performing laboratory activities. The result may serve as a basic consideration to develop a policy for enhancing motivation of laboratory technicians in order to improve the TB control program.

  19. Automation software for a materials testing laboratory

    NASA Technical Reports Server (NTRS)

    Mcgaw, Michael A.; Bonacuse, Peter J.

    1990-01-01

    The software environment in use at the NASA-Lewis Research Center's High Temperature Fatigue and Structures Laboratory is reviewed. This software environment is aimed at supporting the tasks involved in performing materials behavior research. The features and capabilities of the approach to specifying a materials test include static and dynamic control mode switching, enabling multimode test control; dynamic alteration of the control waveform based upon events occurring in the response variables; precise control over the nature of both command waveform generation and data acquisition; and the nesting of waveform/data acquisition strategies so that material history dependencies may be explored. To eliminate repetitive tasks in the conventional research process, a communications network software system is established which provides file interchange and remote console capabilities.
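    A minimal sketch of the kind of capability described, generating a command waveform and altering it when an event is detected in a response variable, is given below; the sinusoidal waveform, the event criterion, and all parameters are illustrative assumptions and do not represent the Lewis software.

      # Command waveform generation with an event-triggered alteration; illustrative only.
      import math

      def command(t, amplitude, freq_hz):
          """Sinusoidal strain command for one test segment."""
          return amplitude * math.sin(2 * math.pi * freq_hz * t)

      def run_segment(duration_s, dt, amplitude, freq_hz, response_of, load_limit):
          """Step through a segment; halve the amplitude when the response exceeds a limit."""
          history, t = [], 0.0
          while t < duration_s:
              cmd = command(t, amplitude, freq_hz)
              response = response_of(cmd)              # measured response (simulated here)
              if abs(response) > load_limit:           # event in the response variable
                  amplitude *= 0.5                     # dynamic alteration of the waveform
              history.append((t, cmd, response))
              t += dt
          return history, amplitude

      # Toy response model: load proportional to the strain command
      log, final_amp = run_segment(2.0, 0.01, 0.010, 1.0, lambda c: 9.0e5 * c, 8.0e3)
      print(f"{len(log)} samples recorded; amplitude reduced to {final_amp:.4f} by response events")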

  20. [The fundamental role of stage control technology on the detectability for Salmonella networking laboratory].

    PubMed

    Zhou, Yong-ming; Chen, Xiu-hua; Xu, Wen; Jin, Hui-ming; Li, Chao-qun; Liang, Wei-li; Wang, Duo-chun; Yan, Mei-ying; Lou, Jing; Kan, Biao; Ran, Lu; Cui, Zhi-gang; Wang, Shu-kun; Xu, Xue-bin

    2013-11-01

    To evaluate the fundamental role of stage control technology (SCT) in the detection capability of Salmonella networking laboratories. Appropriate Salmonella detection methods were established and optimized after evaluation of the key control points. The networking laboratories that we trained and evaluated participated in the World Health Organization Global Salmonella Surveillance project (WHO-GSS) and the China-U.S. Collaborative Program on Emerging and Re-emerging Infectious Diseases (GFN) in Shanghai. Staff members from the Yuxi City Center for Disease Control and Prevention, Yunnan, were trained in Salmonella isolation from diarrhea specimens. Data on annual Salmonella positive rates were collected from the provincial-level monitoring sites taking part in the GSS and GFN projects from 2006 to 2012. The methodology was based on the conventional Salmonella detection procedure, which involves enrichment, isolation, species identification and serotyping. These methods were used to satisfy the sensitivity requirements for non-typhoid Salmonella detection in networking laboratories. The number of public health laboratories in Shanghai grew from 5 in 2006 to 9 in 2011, and the number of clinical laboratories from 8 to 22. The number of clinical isolates, including typhoid and non-typhoid Salmonella, increased from 196 in 2006 to 1442 in 2011. The positive rate of Salmonella isolated from clinical diarrhea cases in Yuxi was 2.4% in 2012. At present, three other provincial monitoring sites use SBG as the selective enrichment broth for Salmonella isolation, with Shanghai having the most stable positive baseline. The SCT approach proved to be a prerequisite for building the networking laboratories. On this basis, improved phenotypic identification and molecular typing capabilities could reach a level equivalent to that of the national networking laboratory.

  1. Guidance, Navigation and Control Digital Emulation Technology Laboratory. Volume 1. Part 1. Task 1: Digital Emulation Technology Laboratory

    DTIC Science & Technology

    1991-09-27

    complex floating-point functions in a fraction of the time used by the best supercomputers on the market today. These co-processing boards "piggy-back...by the VNIX-based DECLARE program.

  2. Photographic consulting services to the Earth Resources program. [using aerial photography as a tool for scientific measurement

    NASA Technical Reports Server (NTRS)

    1976-01-01

    This report summarizes the recommendations, procedures, and techniques provided by the Kodak Apparatus Division to the Ames Research Center to support the Earth Resources Aircraft Program at that facility. Recommendations, procedures, and calibration data are included for sensitometry, densitometry, laboratory cleanliness, and determination of camera exposure. Additional comments are made regarding process control procedures and general laboratory operations.

  3. Computing and data processing

    NASA Technical Reports Server (NTRS)

    Smarr, Larry; Press, William; Arnett, David W.; Cameron, Alastair G. W.; Crutcher, Richard M.; Helfand, David J.; Horowitz, Paul; Kleinmann, Susan G.; Linsky, Jeffrey L.; Madore, Barry F.

    1991-01-01

    The applications of computers and data processing to astronomy are discussed. Among the topics covered are the emerging national information infrastructure, workstations and supercomputers, supertelescopes, digital astronomy, astrophysics in a numerical laboratory, community software, archiving of ground-based observations, dynamical simulations of complex systems, plasma astrophysics, and the remote control of fourth dimension supercomputers.

  4. WebLab of a DC Motor Speed Control Didactical Experiment

    ERIC Educational Resources Information Center

    Bauer, Karine; Mendes, Luciano

    2012-01-01

    Purpose: Weblabs are an additional resource for the execution of experiments in control engineering education, making the learning process more flexible both in time, by allowing laboratory activities outside class hours, and in space, by bringing the learning experience to remote locations where experimentation facilities would not be available. The purpose of this…
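    The sketch below simulates the sort of didactical experiment such a weblab might expose remotely: a discrete-time PI controller regulating the speed of a first-order DC motor model. The motor parameters, supply limits, and controller gains are illustrative assumptions, not those of the actual rig.

      # Discrete-time PI speed control of a first-order DC motor model; illustrative only.
      dt, t_end = 0.01, 2.0            # time step and simulation length, s
      K, tau = 2.0, 0.5                # motor gain (rad/s per V) and time constant (s)
      kp, ki = 1.5, 4.0                # PI gains
      setpoint = 40.0                  # desired speed, rad/s

      speed, integral, t = 0.0, 0.0, 0.0
      while t < t_end:
          error = setpoint - speed
          v_unsat = kp * error + ki * integral           # PI control law
          voltage = max(-24.0, min(24.0, v_unsat))       # saturate at supply limits
          if voltage == v_unsat:                         # simple anti-windup: freeze the
              integral += error * dt                     # integrator while saturated
          speed += dt * (K * voltage - speed) / tau      # tau*dw/dt + w = K*V (Euler step)
          t += dt

      print(f"speed after {t_end:.0f} s: {speed:.1f} rad/s (setpoint {setpoint:.0f} rad/s)")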

  5. A numerical cloud model for the support of laboratory experimentation

    NASA Technical Reports Server (NTRS)

    Hagen, D. E.

    1979-01-01

    A numerical cloud model is presented which can describe the evolution of a cloud starting from moist aerosol-laden air through the diffusional growth regime. The model is designed for the direct support of cloud chamber laboratory experimentation, i.e., experiment preparation, real-time control and data analysis. In the model the thermodynamics is uncoupled from the droplet growth processes. Analytic solutions for the cloud droplet growth equations are developed which can be applied in most laboratory situations. The model is applied to a variety of representative experiments.
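    The closed-form result below, for the classical diffusional-growth relation r dr/dt = G S, is the kind of analytic droplet-growth solution the abstract refers to; the growth parameter, supersaturation, and initial radius are illustrative values, not numbers from the model.

      # Analytic solution of r * dr/dt = G * S at constant supersaturation:
      # r(t) = sqrt(r0**2 + 2*G*S*t). Parameter values are illustrative.
      import math

      G = 1.0e-10        # m^2/s, combined diffusion/thermodynamic growth parameter
      S = 0.005          # supersaturation (0.5 %)
      r0 = 1.0e-6        # initial droplet radius, m

      def radius(t_s):
          """Droplet radius after t_s seconds of diffusional growth."""
          return math.sqrt(r0**2 + 2.0 * G * S * t_s)

      for t in (0, 10, 60, 300):
          print(f"t = {t:4d} s  ->  r = {radius(t) * 1e6:6.2f} micrometres")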

  6. [Cellular transplantation laboratory: a new field of action for nurses].

    PubMed

    Corradi, Maria Inês; da Silva, Sandra Honorato

    2008-01-01

    This article presents the experience of a nurse at a cellular transplantation laboratory. The laboratory's goal is to isolate insulin-producing cells for human transplantation. The nurse, as a member of an interdisciplinary team, took part in the planning of all work processes: working procedures and team training. The main activities under the nurse's responsibility include contamination control, on-the-job training and evaluation of the quality of the procedures developed by the interdisciplinary team. Results have shown the effectiveness of the nurse's work in this new field.

  7. A survey of coagulation laboratory practices and satisfaction ratings of member laboratories of the Thailand National External Quality Assessment Scheme for blood coagulation.

    PubMed

    Chuntarut, A; Tientadakul, P; Wongkrajang, P

    2016-06-01

    The Thailand National External Quality Assessment Scheme (NEQAS) for blood coagulation was established in 2005. The objective of this study was to collect data on coagulation laboratory practices and the satisfaction of NEQAS members. Two hundred seventy-six questionnaires were sent to laboratories that are members of NEQAS to obtain data relating to coagulation laboratory practice and satisfaction in 2014. Data from this survey were compared with data from the survey conducted in 2005 to evaluate levels of improvement. Of 276 questionnaires sent, 212 (76.8%) were returned. Improvements were characterized by the number of laboratories that (i) decreased the use of 3.8% sodium citrate as anticoagulant; (ii) implemented the use of at least two control levels for internal quality control; and (iii) implemented the reporting of reference values with results, as well as establishing their own reference range and using the geometric mean as the denominator for international normalized ratio calculation. For overall satisfaction, 179 of 206 (86.9%) participant laboratories reported being satisfied or very satisfied. Improvements in coagulation laboratory practices in Thailand were observed in every step of the total testing process. However, additional improvements are still needed, such as the determination and use of a local reference range. © 2016 John Wiley & Sons Ltd.
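    For illustration, the sketch below computes an international normalized ratio using the geometric mean of normal prothrombin times as the denominator, the practice recommended above; the PT values and the ISI are hypothetical.

      # INR from patient PT, a panel of normal-donor PTs, and the reagent ISI; values are hypothetical.
      import math

      def geometric_mean(values):
          return math.exp(sum(math.log(v) for v in values) / len(values))

      def inr(pt_patient_s, normal_pts_s, isi):
          """INR = (patient PT / mean normal PT) ** ISI, with a geometric mean denominator."""
          mnpt = geometric_mean(normal_pts_s)
          return (pt_patient_s / mnpt) ** isi

      normal_pts = [11.8, 12.1, 12.4, 11.9, 12.6, 12.2]    # seconds, hypothetical healthy donors
      print(f"INR = {inr(18.5, normal_pts, isi=1.1):.2f}")  # about 1.6 for these inputs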

  8. The effect of participation in an extended inquiry project on general chemistry student laboratory interactions, confidence, and process skills

    NASA Astrophysics Data System (ADS)

    Krystyniak, Rebecca A.

    2001-12-01

    This study explored the effect of participation by second-semester general chemistry students in an extended open-inquiry laboratory investigation on their use of science process skills and confidence in performing specific aspects of laboratory investigations. In addition, verbal interactions of a student lab team among team members and with their instructor over three open-inquiry laboratory sessions and two non-inquiry sessions were investigated. Instruments included the Test of Integrated Process Skills (TIPS), a 36-item multiple-choice instrument, and the Chemistry Laboratory Survey (CLS), a researcher co-designed 20-item 8-point instrument. Instruments were administered at the beginning and close of the semester to 157 second-semester general chemistry students at the two universities; students at only one university participated in the open-inquiry activity. A MANCOVA was performed to investigate relationships among control and experimental students, TIPS, and CLS post-test scores. Covariates were TIPS and CLS pre-test scores and prior high school and college science experience. No significant relationships were found. Wilcoxon analyses indicated that both groups showed an increase in confidence; experimental-group students with below-average TIPS pre-test scores showed a significant increase in science process skills. Transcribed audio tapes of all laboratory-based verbal interactions were analyzed. Coding categories, developed using the constant comparison method, led to an inter-rater reliability of .96. During open-inquiry activities, the lab team interacted less often, sought less guidance from their instructor, and talked less about chemistry concepts than during non-inquiry activities. Evidence confirmed that students used science process skills and engaged in higher-order thinking during both types of activities. A four-student focus group shared their experiences with open-inquiry activities, indicating that they enjoyed the experience, viewed it as worthwhile, and believed it helped them gain understanding of the nature of chemistry research. Research results indicate that participation in open-inquiry laboratory work increases student confidence and, for some students, the ability to use science process skills. Evidence documents differences in student laboratory interactions and behavior that are attributable to the type of laboratory experience. Further research into aspects of open-inquiry laboratory experiences is recommended.

  9. 48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...

  10. 48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...

  11. 48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...

  12. 48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...

  13. 48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...

  14. [Safety management in pathology laboratory: from specimen handling to confirmation of reports].

    PubMed

    Minato, Hiroshi; Nojima, Takayuki; Nakano, Mariko; Yamazaki, Michiko

    2011-03-01

    Medical errors in pathological diagnosis cause enormous physical and psychological harm to patients as well as to medical staff. Here we discuss, based on our experience, how to avoid medical errors in the surgical pathology laboratory. Handling surgical specimens and the diagnostic process are labor-intensive and involve many steps. Each hospital reports many kinds of accidents or incidents; however, many laboratories share common problems, and each process carries its own specific risk of error. We analyzed the problems in each process and concentrated on avoiding misaccessioning, mislabeling, and misreporting. We have made several changes to our system, such as barcode labels, digital images of all specimens, placing endoscopic biopsy specimens directly into embedding cassettes, and using a multitissue control block as the control in immunohistochemistry. Some problems remain, but we have reduced errors by decreasing the number of manual operations as much as possible. A pathology system that tracks whether clinicians have read the pathology reports is now under construction. We also discuss quality assurance of diagnosis, cooperation with clinicians and other co-medical staff, and organization and methods. To keep the work free of risk, it is important for all medical staff to share a common awareness of the problems, maintain careful observation, and share all information. Incorporating an organizational management tool such as ISO 15189 and utilizing the PDCA cycle is also helpful for safety management and quality improvement in the laboratory.

  15. Evaluation of the “Pipeline” for Development of Medications for Cocaine Use Disorder: A Review of Translational Preclinical, Human Laboratory, and Clinical Trial Research

    PubMed Central

    Stoops, William W.; Rush, Craig R.

    2016-01-01

    Cocaine use disorder is a persistent public health problem for which no widely effective medications exist. Self-administration procedures, which have shown good predictive validity in estimating the abuse potential of drugs, have been used in rodent, nonhuman primate, and human laboratory studies to screen putative medications. This review assessed the effectiveness of the medications development process regarding pharmacotherapies for cocaine use disorder. The primary objective was to determine whether data from animal and human laboratory self-administration studies predicted the results of clinical trials. In addition, the concordance between laboratory studies in animals and humans was assessed. More than 100 blinded, randomized, fully placebo-controlled studies of putative medications for cocaine use disorder were identified. Of the 64 drugs tested in these trials, only 10 had been examined in both human and well-controlled animal laboratory studies. Within all three stages, few studies had been conducted for each drug and when multiple studies had been conducted conclusions were sometimes contradictory. Overall, however, there was good concordance between animal and human laboratory results when the former assessed chronic drug treatment. Although only seven of the ten reviewed drugs showed fully concordant results across all three types of studies reviewed, the analysis revealed several subject-related, procedural, and environmental factors that differ between the laboratory and clinical trial settings that help explain the disagreement for other drugs. The review closes with several recommendations to enhance translation and communication across stages of the medications development process that will ultimately speed the progress toward effective pharmacotherapeutic strategies for cocaine use disorder. PMID:27255266

  16. The Prosocial Effects of 3,4-methylenedioxymethamphetamine (MDMA): Controlled Studies in Humans and Laboratory Animals

    PubMed Central

    Kamilar-Britt, Philip; Bedi, Gillinder

    2015-01-01

    Users of ±3,4-Methylenedioxymethamphetamine (MDMA; ‘ecstasy’) report prosocial effects such as sociability and empathy. Supporting these apparently unique social effects, data from controlled laboratory studies indicate that MDMA alters social feelings, information processing, and behavior in humans, and social behavior in rodents. Here, we review this growing body of evidence. In rodents, MDMA increases passive prosocial behavior (adjacent lying) and social reward while decreasing aggression, effects that may involve serotonin 1A receptor mediated oxytocin release interacting with vasopressin receptor 1A. In humans, MDMA increases plasma oxytocin and produces feelings of social affiliation. It decreases identification of negative facial expressions (cognitive empathy) and blunts responses to social rejection, while enhancing responses to others’ positive emotions (emotional empathy) and increasing social approach. Thus, consistent with drug folklore, laboratory administration of MDMA robustly alters social processing in humans and increases social approach in humans and animals. Effects are consistent with increased sociability, with mixed evidence about enhanced empathy. These neurobiologically-complex prosocial effects likely motivate recreational ecstasy use. PMID:26408071

  17. The prosocial effects of 3,4-methylenedioxymethamphetamine (MDMA): Controlled studies in humans and laboratory animals.

    PubMed

    Kamilar-Britt, Philip; Bedi, Gillinder

    2015-10-01

    Users of ±3,4-methylenedioxymethamphetamine (MDMA; 'ecstasy') report prosocial effects such as sociability and empathy. Supporting these apparently unique social effects, data from controlled laboratory studies indicate that MDMA alters social feelings, information processing, and behavior in humans, and social behavior in rodents. Here, we review this growing body of evidence. In rodents, MDMA increases passive prosocial behavior (adjacent lying) and social reward while decreasing aggression, effects that may involve serotonin 1A receptor mediated oxytocin release interacting with vasopressin receptor 1A. In humans, MDMA increases plasma oxytocin and produces feelings of social affiliation. It decreases identification of negative facial expressions (cognitive empathy) and blunts responses to social rejection, while enhancing responses to others' positive emotions (emotional empathy) and increasing social approach. Thus, consistent with drug folklore, laboratory administration of MDMA robustly alters social processing in humans and increases social approach in humans and animals. Effects are consistent with increased sociability, with mixed evidence about enhanced empathy. These neurobiologically-complex prosocial effects likely motivate recreational ecstasy use. Copyright © 2015. Published by Elsevier Ltd.

  18. Process Modeling and Dynamic Simulation for EAST Helium Refrigerator

    NASA Astrophysics Data System (ADS)

    Lu, Xiaofei; Fu, Peng; Zhuang, Ming; Qiu, Lilong; Hu, Liangbing

    2016-06-01

    In this paper, the process modeling and dynamic simulation of the EAST helium refrigerator are presented. The cryogenic process model is described and the main components are customized in detail. The process model is driven by a PLC simulator, and real-time communication between the process model and the controllers is achieved through a customized interface. The process model has been validated against EAST experimental data for the cool-down from 300 K to 80 K. Simulation results indicate that the process simulator reproduces the dynamic behavior of the EAST helium refrigerator very well for the operation of long-pulsed plasma discharges. The cryogenic process simulator, built on the control architecture, is available for operation optimization and control design of the EAST cryogenic systems to cope with long-pulsed heat loads in the future. Supported by the National Natural Science Foundation of China (No. 51306195) and the Key Laboratory of Cryogenics, Technical Institute of Physics and Chemistry, CAS (No. CRYO201408)
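
    The model-in-the-loop arrangement described above (a dynamic process model stepped in time while an external controller adjusts the actuators) can be illustrated with a deliberately simplified sketch. The first-order cold-box model, the proportional valve law, and all parameter values below are assumptions for illustration only; they are not taken from the EAST simulator.

        # Toy model-in-the-loop sketch: a first-order "cold box" temperature model is
        # stepped in time while a proportional routine (standing in for the PLC
        # simulator) sets the cooling duty. All names and numbers are illustrative.
        def simulate(setpoint_K=80.0, T_K=300.0, steps=600, dt_s=60.0):
            kp = 0.02           # proportional gain of the stand-in control loop
            max_cool = 0.05     # maximum pull-down rate at full duty, K/s
            tau_leak = 36000.0  # ambient heat-leak time constant, s
            for _ in range(steps):
                duty = min(max(kp * (T_K - setpoint_K), 0.0), 1.0)   # 0..1 valve opening
                dTdt = (300.0 - T_K) / tau_leak - duty * max_cool    # heat leak vs. cooling
                T_K += dt_s * dTdt
            return T_K

        # Settles a few kelvin above the 80 K setpoint (pure proportional offset).
        print(round(simulate(), 1))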

  19. OVERVIEW: CCL PATHOGENS RESEARCH AT NRMRL

    EPA Science Inventory


    The Microbial Contaminants Control Branch (MCCB), Water Supply and Water Resources Division, National Risk Management Research Laboratory, conducts research on microbiological problems associated with source water quality, treatment processes, distribution and storage of drin...

  20. Stochastic Multiscale Analysis and Design of Engine Disks

    DTIC Science & Technology

    2010-07-28

    [Fragment from briefing slides: existing approaches have recently been shown to fail when used with data-driven non-linear stochastic input models (KPCA, IsoMap, etc.), motivating the need for scalable exascale computing algorithms. Materials Process Design and Control Laboratory, Cornell University.]

  1. Development and implementation of the Caribbean Laboratory Quality Management Systems Stepwise Improvement Process (LQMS-SIP) Towards Accreditation

    PubMed Central

    Alemnji, George; Edghill, Lisa; Wallace-Sankarsingh, Sacha; Albalak, Rachel; Cognat, Sebastien; Nkengasong, John; Gabastou, Jean-Marc

    2017-01-01

    Background Implementing quality management systems and accrediting laboratories in the Caribbean has been a challenge. Objectives We report the development of a stepwise process for quality systems improvement in the Caribbean Region. Methods The Caribbean Laboratory Stakeholders met under a joint Pan American Health Organization/US Centers for Disease Control and Prevention initiative and developed a user-friendly framework called ‘Laboratory Quality Management System – Stepwise Improvement Process (LQMS-SIP) Towards Accreditation’ to support countries in strengthening laboratory services through a stepwise approach toward fulfilling the ISO 15189: 2012 requirements. Results This approach consists of a three-tiered framework. Tier 1 represents the minimum requirements corresponding to the mandatory criteria for obtaining a licence from the Ministry of Health of the participating country. The next two tiers are quality improvement milestones that are achieved through the implementation of specific quality management system requirements. Laboratories that meet the requirements of the three tiers will be encouraged to apply for accreditation. The Caribbean Regional Organisation for Standards and Quality hosts the LQMS-SIP Secretariat and will work with countries, including the Ministry of Health and stakeholders, including laboratory staff, to coordinate and implement LQMS-SIP activities. The Caribbean Public Health Agency will coordinate and advocate for the LQMS-SIP implementation. Conclusion This article presents the Caribbean LQMS-SIP framework and describes how it will be implemented among various countries in the region to achieve quality improvement. PMID:28879149

  2. Motor-response learning at a process control panel by an autonomous robot

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spelt, P.F.; de Saussure, G.; Lyness, E.

    1988-01-01

    The Center for Engineering Systems Advanced Research (CESAR) was founded at Oak Ridge National Laboratory (ORNL) by the Department of Energy's Office of Energy Research/Division of Engineering and Geoscience (DOE-OER/DEG) to conduct basic research in the area of intelligent machines. Researchers at the CESAR Laboratory are therefore engaged in a variety of research activities in the field of machine learning. In this paper, we describe our approach to a class of machine learning that involves motor-response acquisition using feedback from trial-and-error learning. Our formulation is being experimentally validated using an autonomous robot learning the tasks of control panel monitoring and manipulation to effect process control. The CLIPS expert system and the associated knowledge base used by the robot in the learning process, which reside in a hypercube computer aboard the robot, are described in detail. Benchmark testing of the learning process on a robot/control-panel simulation system consisting of two intercommunicating computers is presented, along with results of sample problems used to train and test the expert system. These data illustrate machine learning and the resulting performance improvement in the robot for problems similar to, but not identical with, those on which the robot was trained. Conclusions are drawn concerning the learning problems, and implications for future work on machine learning for autonomous robots are discussed. 16 refs., 4 figs., 1 tab.
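
    The trial-and-error scheme summarized above can be sketched, in highly simplified form, as reward-driven selection of panel actions. The toy panel simulator, the alarm states, the action names, and the update rule below are all hypothetical illustrations, not the CLIPS expert-system implementation used at CESAR.

        # Hedged sketch of trial-and-error motor-response acquisition: the "robot"
        # tries panel actions, keeps a value estimate for each (state, action) pair,
        # and gradually prefers actions that the simulated panel rewards.
        import random

        def panel_response(state, action):
            """Toy panel simulator: each alarm state has one correct switch."""
            correct = {"high_pressure": "open_vent", "low_flow": "start_pump"}
            return 1.0 if correct[state] == action else -0.1

        actions = ["open_vent", "start_pump", "close_valve"]
        q = {}                                   # learned value of each (state, action)
        random.seed(0)
        for trial in range(500):
            state = random.choice(["high_pressure", "low_flow"])
            if random.random() < 0.1:            # occasional exploration
                action = random.choice(actions)
            else:                                # otherwise exploit the best known response
                action = max(actions, key=lambda a: q.get((state, a), 0.0))
            reward = panel_response(state, action)
            key = (state, action)
            q[key] = q.get(key, 0.0) + 0.2 * (reward - q.get(key, 0.0))

        # After training, the preferred response to "high_pressure" is "open_vent".
        print(max(actions, key=lambda a: q.get(("high_pressure", a), 0.0)))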

  3. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    PubMed

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.

  4. An analytical chemistry laboratory's experiences under Department of Energy Order 5633. 3 - a status report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bingham, C.D.

    The U.S. Department of Energy (DOE) Order 5633.3, Control and Accountability of Nuclear Materials, initiated substantial changes to the requirements for operations involving nuclear materials. In the opinion of this author, the two most significant changes are the clarification of, and the increased emphasis on, the concept of graded safeguards and the implementation of performance requirements. Graded safeguards recognizes that some materials are more attractive than others to potential adversary actions and thus should be afforded a higher level of integrated safeguards effort. An analytical chemistry laboratory such as the New Brunswick Laboratory (NBL) typically has a small total inventory of special nuclear materials compared to, for example, a production or manufacturing facility. The NBL has a laboratory information management system (LIMS) that not only provides sample identification and tracking but also incorporates the essential features of MC&A required of NBL operations. As a consequence of Order 5633.3, NBL had to modify the LIMS to accommodate material attractiveness information during the logging process, to reflect changes in attractiveness as the material was processed through the laboratory, and to enable inventory information to be accumulated by material attractiveness codes.

  5. Inpatient preanalytic process improvements.

    PubMed

    Wagar, Elizabeth A; Phipps, Ron; Del Guidice, Robert; Middleton, Lavinia P; Bingham, John; Prejean, Cheryl; Johnson-Hamilton, Martha; Philip, Pheba; Le, Ngoc Han; Muses, Waheed

    2013-12-01

    Phlebotomy services are a common target for preanalytic improvements. Many new quality engineering tools have recently been applied in clinical laboratories; however, data on relatively few projects have been published. This example describes a complete application of current quality engineering tools to improve preanalytic phlebotomy services. The goals were to decrease the response time in the preanalytic inpatient laboratory by 25%, to reduce the number of incident reports related to preanalytic phlebotomy, and to make systematic process changes that satisfied the stakeholders. The Department of Laboratory Medicine, General Services Section, at the University of Texas MD Anderson Cancer Center (Houston) is responsible for inpatient phlebotomy in a 24-hour operation serving 689 inpatient beds. The study director was project director of the Division of Pathology and Laboratory Medicine's Quality Improvement Section and was assisted by 2 quality technologists and an industrial engineer from the MD Anderson Office of Performance Improvement. After each solution was implemented using well-recognized quality tools and metrics, the response time for blood collection decreased by 23%, close to the original responsiveness goal of 25%. The response time between collection and arrival in the laboratory decreased by 8%. Applicable laboratory-related incident reports were reduced by 43%. Comprehensive application of quality tools, such as statistical control charts, Pareto diagrams, value-stream maps, process failure modes and effects analyses, fishbone diagrams, solution prioritization matrices, and customer satisfaction surveys, can significantly improve preset goals for inpatient phlebotomy.

  6. Error identification in a high-volume clinical chemistry laboratory: Five-year experience.

    PubMed

    Jafri, Lena; Khan, Aysha Habib; Ghani, Farooq; Shakeel, Shahid; Raheem, Ahmed; Siddiqui, Imran

    2015-07-01

    Quality indicators for assessing the performance of a laboratory require a systematic and continuous approach to collecting and analyzing data. The aim of this study was to determine the frequency of errors using the quality indicators of a clinical chemistry laboratory and to convert the errors to the Sigma scale. Five years of quality indicator data from a clinical chemistry laboratory were evaluated to describe the frequency of errors. An 'error' was defined as a defect occurring at any point in the testing process, from the time the requisition was raised and phlebotomy was performed until the result was dispatched. An indicator with a Sigma value of 4 was considered good, but a process with a Sigma value of 5 (i.e. 99.977% error-free) was considered well controlled. Over the five-year period, a total of 6,792,020 specimens were received in the laboratory. Among a total of 17,631,834 analyses, 15.5% were from within the hospital. The total error rate was 0.45%, and across all the quality indicators used in this study the average Sigma level was 5.2. Three indicators - visible hemolysis, failure of proficiency testing and delay in stat tests - were below 5 on the Sigma scale and highlight the need to rigorously monitor these processes. Using Six Sigma metrics, quality in a clinical laboratory can be monitored more effectively, and benchmarks can be set for improving efficiency.
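
    For readers unfamiliar with the conversion, an error rate can be mapped onto the Sigma scale with the conventional 1.5-sigma shift; the short sketch below illustrates the arithmetic. The function name and the example figures are illustrative, not the study's code or per-indicator data.

        # Convert an observed defect proportion to a Six Sigma level using the
        # conventional 1.5-sigma shift. Illustrative only.
        from scipy.stats import norm

        def sigma_level(defects, opportunities, shift=1.5):
            p = defects / opportunities        # proportion of defective events
            return norm.ppf(1.0 - p) + shift   # z-value of the yield plus the shift

        # A 0.45% overall error rate corresponds to roughly 4.1 sigma on this convention;
        # the study's 5.2 average is computed per indicator, not from the pooled rate.
        print(round(sigma_level(4500, 1_000_000), 2))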

  7. Engineering Design and Automation in the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wantuck, P. J.; Hollen, R. M.

    2002-01-01

    This paper provides an overview of some design and automation-related projects ongoing within the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory. AET uses a diverse set of technical capabilities to develop and apply processes and technologies to applications for a variety of customers both internal and external to the Laboratory. The Advanced Recovery and Integrated Extraction System (ARIES) represents a new paradigm for the processing of nuclear material from retired weapon systems in an environment that seeks to minimize the radiation dose to workers. To achieve this goal, ARIES relies upon automation-based features to handle and process the nuclear material. Our Chemical Process Development Team specializes in fuzzy logic and intelligent control systems. Neural network technology has been utilized in some advanced control systems developed by team members. Genetic algorithms and neural networks have often been applied for data analysis. Enterprise modeling, or discrete event simulation, as well as chemical process simulation, has been employed for chemical process plant design. Fuel cell research and development has historically been an active effort within the AET organization. Under the principal sponsorship of the Department of Energy, the Fuel Cell Team is now focusing on the technologies required to produce fuel cell compatible feed gas from reformation of a variety of conventional fuels (e.g., gasoline, natural gas), principally for automotive applications. This effort involves chemical reactor design and analysis, process modeling, catalyst analysis, as well as full-scale system characterization and testing. The group's Automation and Robotics team has at its foundation many years of experience delivering automated and robotic systems for nuclear, analytical chemistry, and bioengineering applications. As an integrator of commercial systems and a developer of unique custom-made systems, the team currently supports the automation needs of many Laboratory programs.

  8. European external quality control study on the competence of laboratories to recognize rare sequence variants resulting in unusual genotyping results.

    PubMed

    Márki-Zay, János; Klein, Christoph L; Gancberg, David; Schimmel, Heinz G; Dux, László

    2009-04-01

    Depending on the method used, rare sequence variants adjacent to the single nucleotide polymorphism (SNP) of interest may cause unusual or erroneous genotyping results. Because such rare variants are known for many genes commonly tested in diagnostic laboratories, we organized a proficiency study to assess their influence on the accuracy of reported laboratory results. Four external quality control materials were processed and sent to 283 laboratories through 3 EQA organizers for analysis of the prothrombin 20210G>A mutation. Two of these quality control materials contained sequence variants introduced by site-directed mutagenesis. One hundred eighty-nine laboratories participated in the study. When samples gave a usual result with the method applied, the error rate was 5.1%. Detailed analysis showed that more than 70% of the failures were reported from only 9 laboratories. Allele-specific amplification-based PCR had a much higher error rate than other methods (18.3% vs 2.9%). The variants 20209C>T and [20175T>G; 20179_20180delAC] resulted in unusual genotyping results in 67 and 85 laboratories, respectively. Eighty-three (54.6%) of these unusual results were not recognized, 32 (21.1%) were attributed to technical issues, and only 37 (24.3%) were recognized as another sequence variant. Our findings revealed that some of the participating laboratories were not able to recognize and correctly interpret unusual genotyping results caused by rare SNPs. Our study indicates that the majority of the failures could be avoided by improved training and careful selection and validation of the methods applied.

  9. How do laboratory technicians perceive their role in the tuberculosis diagnostic process? A cross-sectional study among laboratory technicians in health centers of Central Java Province, Indonesia

    PubMed Central

    Widjanarko, Bagoes; Widyastari, Dyah Anantalia; Martini, Martini; Ginandjar, Praba

    2016-01-01

    Purpose Detection of acid-fast bacilli in respiratory specimens serves as an initial pulmonary tuberculosis (TB) diagnosis. Laboratories are the essential and fundamental part of all health systems. This study aimed to describe how laboratory technicians perceived their own self and work. This included perceived self-efficacy, perceived role, perceived equipment availability, perceived procedures, perceived reward and job, and perceived benefit of health education, as well as level of knowledge and attitudes related to work performance of laboratory technicians. Methods This was a cross-sectional quantitative study involving 120 laboratory technicians conducted in Central Java. Interviews and observation were conducted to measure performance and work-related variables. Results Among 120 laboratory technicians, 43.3% showed fairly good performance. They complied with 50%–75% of all procedures, including sputum collection, laboratory tools utilization, sputum smearing, staining, smear examination, grading of results, and universal precaution practice. Perceived role, perceived self-efficacy, and knowledge of laboratory procedures were significantly correlated to performance, besides education and years of working as a laboratory technician. Perceived equipment availability was also significantly correlated to performance after the education variable was controlled. Conclusion Most of the laboratory technicians believed that they have an important role in TB patients’ treatment and should display proper self-efficacy in performing laboratory activities. The result may serve as a basic consideration to develop a policy for enhancing motivation of laboratory technicians in order to improve the TB control program. PMID:27660502

  10. A new phosphate-selective sorbent for the Rem Nut process. Laboratory investigation and field experience at a medium size wastewater treatment plant.

    PubMed

    Petruzzelli, D; De Florio, L; Dell'Erba, A; Liberti, L; Notarnicola, M; Sengupta, A K

    2003-01-01

    P-control technologies for municipal wastewater are essentially based on "destructive" methods, which lead to the formation of concentrated solid phases (sludge) that are usually disposed of in controlled landfills. Ion exchange, as a "non-destructive" technology, allows for selective removal and simultaneous recovery of pollutants, which can be recycled to the same and/or related production lines. In this context, the REM NUT process removes the nutrient species (HPO4²⁻, NH4⁺, K⁺) present in biologically oxidised municipal effluents and recovers them in the form of struvites (MgNH4PO4; MgKPO4), premium-quality slow-release fertilisers. The main limitation to the extensive application of this ion exchange based process is the non-availability of selective exchangers for the specific removal of nutrient species. This paper illustrates the laboratory investigation and pilot-scale development of a so-called "P-driven" modified REM NUT scheme based on a new phosphate-selective sorbent developed at Lehigh University, PA, USA.

  11. Sodium content of popular commercially processed and restaurant foods in the United States

    USDA-ARS?s Scientific Manuscript database

    The Nutrient Data Laboratory (NDL) of the U.S. Department of Agriculture (USDA), in close collaboration with the U.S. Centers for Disease Control and Prevention, is monitoring the sodium content of commercially processed and restaurant foods in the United States. The main purpose of this manuscript is to prov...

  12. Monitoring and Evaluation of Alcoholic Fermentation Processes Using a Chemocapacitor Sensor Array

    PubMed Central

    Oikonomou, Petros; Raptis, Ioannis; Sanopoulou, Merope

    2014-01-01

    The alcoholic fermentation of Savatiano must variety was initiated under laboratory conditions and monitored daily with a gas sensor array without any pre-treatment steps. The sensor array consisted of eight interdigitated chemocapacitors (IDCs) coated with specific polymers. Two batches of fermented must were tested and also subjected daily to standard chemical analysis. The chemical composition of the two fermenting musts differed from day one of laboratory monitoring (due to different storage conditions of the musts) and due to a deliberate increase of the acetic acid content of one of the musts, during the course of the process, in an effort to spoil the fermenting medium. Sensor array responses to the headspace of the fermenting medium were compared with those obtained either for pure or contaminated samples with controlled concentrations of standard ethanol solutions of impurities. Results of data processing with Principal Component Analysis (PCA), demonstrate that this sensing system could discriminate between a normal and a potential spoiled grape must fermentation process, so this gas sensing system could be potentially applied during wine production as an auxiliary qualitative control instrument. PMID:25184490
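
    A minimal sketch of the pattern-recognition step described above (PCA applied to the multi-sensor responses) is given below; the 8-sensor response matrix is synthetic stand-in data, not the published measurements.

        # PCA on a toy chemocapacitor-array dataset: two batches of daily headspace
        # readings, one "normal" and one spiked to mimic acetic-acid spoilage.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        normal = rng.normal(loc=1.0, scale=0.05, size=(10, 8))    # 10 days x 8 IDC sensors
        spoiled = rng.normal(loc=1.3, scale=0.05, size=(10, 8))   # contaminated batch
        X = np.vstack([normal, spoiled])

        scores = PCA(n_components=2).fit_transform(X)
        # The first principal component separates the two batches in this toy example.
        print(scores[:10, 0].mean().round(2), scores[10:, 0].mean().round(2))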

  13. Conference on Real-Time Computer Applications in Nuclear, Particle and Plasma Physics, 6th, Williamsburg, VA, May 15-19, 1989, Proceedings

    NASA Technical Reports Server (NTRS)

    Pordes, Ruth (Editor)

    1989-01-01

    Papers on real-time computer applications in nuclear, particle, and plasma physics are presented, covering topics such as expert systems tactics in testing FASTBUS segment interconnect modules, trigger control in a high-energy physics experiment, the FASTBUS read-out system for the Aleph time projection chamber, a multiprocessor data acquisition system, DAQ software architecture for Aleph, a VME multiprocessor system for plasma control at the JT-60 upgrade, and a multitasking, multisinked, multiprocessor data acquisition front end. Other topics include real-time data reduction using a microVAX processor, a transputer-based coprocessor for VEDAS, simulation of a macropipelined multi-CPU event processor for use in FASTBUS, a distributed VME control system for the LISA superconducting Linac, and a distributed system for laboratory process automation. Additional topics include a structure macro assembler for the event handler, a data acquisition and control system for Thomson scattering on ATF, remote procedure execution software for distributed systems, and a PC-based graphic display of real-time particle beam uniformity.

  14. Evaluation of FNS control systems: software development and sensor characterization.

    PubMed

    Riess, J; Abbas, J J

    1997-01-01

    Functional Neuromuscular Stimulation (FNS) systems activate paralyzed limbs by electrically stimulating motor neurons. These systems have been used to restore functions such as standing and stepping in people with thoracic-level spinal cord injury. Research in our laboratory is directed at the design and evaluation of control algorithms for generating posture and movement. This paper describes software developed for implementing FNS control systems and the characterization of a sensor system used to implement and evaluate controllers in the laboratory. In order to assess FNS control algorithms, we have developed a versatile software package using LabVIEW (National Instruments Corp.). This package provides the ability to interface with sensor systems via serial port or A/D board, to implement data processing and real-time control algorithms, and to interface with neuromuscular stimulation devices. In our laboratory, we use the Flock of Birds (Ascension Technology Corp.) motion tracking sensor system to monitor limb segment position and orientation (6 degrees of freedom). Errors in the sensor system have been characterized, and nonlinear polynomial models have been developed to account for these errors. With this compensation, the error in the distance measurement is reduced by 90%, so that the maximum error is less than 1 cm.
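
    The polynomial compensation idea can be illustrated with a short curve-fitting sketch; the raw and reference distances below are made-up calibration points, not the laboratory's characterization data.

        # Fit a low-order polynomial that maps raw tracker distances onto reference
        # distances measured on a calibration fixture, then apply it as a correction.
        import numpy as np

        raw = np.array([10.2, 20.9, 31.8, 43.1, 54.6, 66.4])    # sensor readings, cm (assumed)
        true = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])   # fixture distances, cm (assumed)

        correct = np.poly1d(np.polyfit(raw, true, deg=3))        # 3rd-order compensation model

        err_before = np.abs(raw - true).max()
        err_after = np.abs(correct(raw) - true).max()
        print(f"max distance error: {err_before:.2f} cm -> {err_after:.2f} cm")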

  15. DEVELOPMENT OF IN-SITU CONTROL DIAGNOSTICS FOR APPLICATION OF EPITAXIAL SUPERCONDUCTOR AND BUFFER LAYERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    B.C. Winkleman; T.V. Giel, Jr.; J. Cunningham

    1999-06-30

    The recent achievements of critical currents in excess of 1×10⁶ A/cm² at 77 K in YBCO deposited over suitably textured buffer/substrate composites have stimulated interest in the potential fabrication of these coated conductors as wire. Numerous approaches and manufacturing schemes for producing coated conductor wire are currently being developed. Recently, under U.S. Department of Energy (DOE) sponsorship, the University of Tennessee Space Institute (UTSI) performed an extensive evaluation of leading coated conductor processing options. In general, it is our feeling that the science and chemistry being developed in the coated conductor wire program now need proper engineering evaluation to define the most viable options for a commercial fabrication process. All fabrication processes will need process control measurements. This report provides a specific review of the needs and available technologies for process control for many of the coated conductor processing options. This report also addresses generic process monitoring areas in which additional research and development are needed. The concentration is on the two different approaches for obtaining textured substrates that have been identified as viable candidates: Los Alamos National Laboratory's (LANL) ion-beam assisted deposition, called IBAD, used to obtain a highly textured yttria-stabilized zirconia (YSZ) buffer on nickel alloy strips, and Oak Ridge National Laboratory's (ORNL) rolling-assisted, bi-axially textured substrate option, called RABiTS™.

  16. Critical role of developing national strategic plans as a guide to strengthen laboratory health systems in resource-poor settings.

    PubMed

    Nkengasong, John N; Mesele, Tsehaynesh; Orloff, Sherry; Kebede, Yenew; Fonjungo, Peter N; Timperi, Ralph; Birx, Deborah

    2009-06-01

    Medical laboratory services are an essential, yet often neglected, component of health systems in developing countries. Their central role in public health, disease control and surveillance, and patient management is often poorly recognized by governments and donors. However, medical laboratory services in developing countries can be strengthened by leveraging funding from other sources of HIV/AIDS prevention, care, surveillance, and treatment programs. Strengthening these services will require coordinated efforts by national governments and partners and can be achieved by establishing and implementing national laboratory strategic plans and policies that integrate laboratory systems to combat major infectious diseases. These plans should take into account policy, legal, and regulatory frameworks; the administrative and technical management structure of the laboratories; human resources and retention strategies; laboratory quality management systems; monitoring and evaluation systems; procurement and maintenance of equipment; and laboratory infrastructure enhancement. Several countries have developed or are in the process of developing their laboratory plans, and others, such as Ethiopia, have implemented and evaluated their plan.

  17. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management tools employed for process improvement. Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on the Sigma scale by calculating sigma metrics for each parameter, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) needed to improve target analyte performance based on the sigma metrics. This is a retrospective study; the data required for the study were extracted between July 2015 and June 2016 from a secondary care government hospital, Chennai. The data obtained for the study were the IQC coefficient of variation percentage (CV%) and the External Quality Assurance Scheme (EQAS) bias% for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for the level 2 IQC, the same four analytes as in level 1 performed at ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below 6 sigma, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. This study shows that sigma metrics are a good quality tool for assessing the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
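
    The sigma metric and quality goal index referred to in this abstract are commonly computed from the total allowable error (TEa), the bias and the CV; the sketch below shows that arithmetic with illustrative numbers (the TEa, bias and CV values are not the study's data).

        # Standard sigma-metric and quality goal index (QGI) calculations.
        def sigma_metric(tea_pct, bias_pct, cv_pct):
            return (tea_pct - bias_pct) / cv_pct

        def quality_goal_index(bias_pct, cv_pct):
            return bias_pct / (1.5 * cv_pct)

        # Example analyte with TEa 10%, bias 2%, CV 1.5% (illustrative values):
        print(round(sigma_metric(10.0, 2.0, 1.5), 1))     # ~5.3 sigma
        print(round(quality_goal_index(2.0, 1.5), 2))     # ~0.89: between 0.8 and 1.2,
                                                          # i.e. both imprecision and inaccuracy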

  18. Sleep continuity is positively correlated with sleep duration in laboratory nighttime sleep recordings

    PubMed Central

    Van Dongen, Hans P. A.; Natelson, Benjamin H.; Bender, Amy M.; Palombini, Luciana O.; Bittencourt, Lia; Tufik, Sergio; Ayappa, Indu; Rapoport, David M.

    2017-01-01

    Sleep duration varies widely across individuals and appears to be trait-like. Differences in the stability of underlying sleep processes may underlie this phenomenon. To investigate underlying mechanisms, we examined the relationship between sleep duration and sleep continuity in baseline polysomnography (PSG) recordings from three independently collected datasets: 1) 134 healthy controls (ages 37 ± 13 years) from the São Paulo Epidemiologic Sleep Study, who spent one night in a sleep laboratory, 2) 21 obstructive sleep apnea (OSA) patients who were treated with continuous positive airway pressure for at least 2 months (45 ± 12 years, respiratory disturbance index <15), who spent one night in a sleep laboratory with previous experience of multiple PSG studies, and 3) 62 healthy controls (28 ± 6 years) who, as part of larger experiments, spent 2 consecutive nights in a sleep laboratory. For each dataset, we used total sleep time (TST) to separate subjects into those with shorter sleep (S-TST) and those with longer sleep (L-TST). In all three datasets, survival curves of continuous sleep segments showed greater sleep continuity in L-TST than in S-TST. Correlation analyses with TST as a continuous variable corroborated the results; and the results also held true after controlling for age. There were no significant differences in baseline waking performance and sleepiness between S-TST and L-TST. In conclusion, in both healthy controls and treated OSA patients, sleep continuity was positively correlated with sleep duration. These findings suggest that S-TST may differ from L-TST in processes underlying sleep continuity, shedding new light on mechanisms underlying individual differences in sleep duration. PMID:28394943

  19. Immunohistochemistry practices of cytopathology laboratories: a survey of participants in the College of American Pathologists Nongynecologic Cytopathology Education Program.

    PubMed

    Fischer, Andrew H; Schwartz, Mary R; Moriarty, Ann T; Wilbur, David C; Souers, Rhona; Fatheree, Lisa; Booth, Christine N; Clayton, Amy C; Kurtyz, Daniel F I; Padmanabhan, Vijayalakshmi; Crothers, Barbara A

    2014-09-01

    Immunohistochemistry (IHC) is important for cytology but poses special challenges because preanalytic conditions may differ from the conditions of IHC-positive controls. The objectives were to broadly survey cytology laboratories to quantify preanalytic platforms for cytology IHC, to identify problems with particular platforms or antigens, and to discover how validation guidelines for HER2 testing have affected cytology. A voluntary survey of cytology IHC practices was sent to 1899 cytology laboratories participating in the College of American Pathologists Nongynecologic Cytopathology Education Program in the fall of 2009. A total of 818 laboratories (43%) responded to the survey by April 2010. Three hundred forty-five of 791 respondents (44%) performed IHC on cytology specimens. Seventeen different fixation and processing platforms prior to antibody reaction were reported. A total of 59.2% of laboratories reported differences between the platforms for cytology specimens and positive controls, but most (155 of 184; 84%) did not alter antibody dilutions or antigen retrieval for cytology IHC. When asked to name 2 antibodies for which staining conditions differed between cytology and surgical samples, there were 18 responses listing 14 antibodies. A total of 30.6% of laboratories performing IHC offered HER2 testing before publication of the 2007 College of American Pathologists/American Society of Clinical Oncologists guidelines, compared with 33.6% afterward, with increased performance of testing by reference laboratories. Three laboratories validated a nonformalin HER2 platform. The platforms for cytology IHC and positive controls differ for most laboratories, yet conditions are uncommonly adjusted for cytology specimens. Except for the unsuitability of air-dried smears for HER2 testing, the survey did not reveal evidence of systematic problems with any antibody or platform.

  20. Quality Indicators in Laboratory Medicine: from theory to practice. Preliminary data from the IFCC Working Group Project "Laboratory Errors and Patient Safety".

    PubMed

    Sciacovelli, Laura; O'Kane, Maurice; Skaik, Younis Abdelwahab; Caciagli, Patrizio; Pellegrini, Cristina; Da Rin, Giorgio; Ivanov, Agnes; Ghys, Timothy; Plebani, Mario

    2011-05-01

    The adoption of Quality Indicators (QIs) has prompted the development of tools to measure and evaluate the quality and effectiveness of laboratory testing, first in the hospital setting and subsequently in ambulatory and other care settings. While Laboratory Medicine has an important role in the delivery of high-quality care, no consensus exists as yet on the use of QIs focussing on all steps of the laboratory total testing process (TTP), and further research in this area is required. In order to reduce errors in laboratory testing, the IFCC Working Group on "Laboratory Errors and Patient Safety" (WG-LEPS) developed a series of Quality Indicators, specifically designed for clinical laboratories. In the first phase of the project, specific QIs for key processes of the TTP were identified, including all the pre-, intra- and post-analytic steps. The overall aim of the project is to create a common reporting system for clinical laboratories based on standardized data collection, and to define state-of-the-art and Quality Specifications (QSs) for each QI independent of: a) the size of organization and type of activities; b) the complexity of processes undertaken; and c) different degree of knowledge and ability of the staff. The aim of the present paper is to report the results collected from participating laboratories from February 2008 to December 2009 and to identify preliminary QSs. The results demonstrate that a Model of Quality Indicators managed as an External Quality Assurance Program can serve as a tool to monitor and control the pre-, intra- and post-analytical activities. It might also allow clinical laboratories to identify risks that lead to errors resulting in patient harm: identification and design of practices that eliminate medical errors; the sharing of information and education of clinical and laboratory teams on practices that reduce or prevent errors; the monitoring and evaluation of improvement activities.

  1. Multiobjective optimization and multivariable control of the beer fermentation process with the use of evolutionary algorithms.

    PubMed

    Andrés-Toro, B; Girón-Sierra, J M; Fernández-Blanco, P; López-Orozco, J A; Besada-Portas, E

    2004-04-01

    This paper describes empirical research on the modelling, optimization, and supervisory control of beer fermentation. Conditions in the laboratory were made as similar as possible to brewery industry conditions. Since mathematical models that consider realistic industrial conditions were not available, a new mathematical model incorporating industrial conditions was first developed. Batch fermentations are multiobjective dynamic processes that must be guided along optimal paths to obtain good results. The paper describes a direct way to apply a Pareto set approach with multiobjective evolutionary algorithms (MOEAs). The successful identification of optimal ways to drive these processes is reported. Once obtained, the mathematical fermentation model was used to optimize the fermentation process by means of a rule-based intelligent control scheme.
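
    At the core of any MOEA is a Pareto-dominance comparison between candidate solutions; the minimal sketch below shows such a check over made-up objective vectors (e.g. negative ethanol yield, fermentation time, off-flavour level, all to be minimised). It is an illustration of the general technique, not the authors' algorithm.

        # Pareto dominance and a brute-force Pareto front over toy objective vectors.
        def dominates(a, b):
            """True if a is no worse than b in every objective and better in at least one."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def pareto_front(solutions):
            return [s for s in solutions
                    if not any(dominates(o, s) for o in solutions if o is not s)]

        # (negative yield, time in h, off-flavour index) -- hypothetical candidates
        candidates = [(-60.0, 140.0, 0.8), (-55.0, 120.0, 0.5),
                      (-60.0, 150.0, 0.9), (-58.0, 120.0, 0.6)]
        print(pareto_front(candidates))   # the third candidate is dominated by the first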

  2. Assessment of Sensor Technologies for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korsah, Kofi; Ramuhalli, Pradeep; Vlim, R.

    2016-10-01

    Sensors and measurement technologies provide information on processes, support operations, and provide indications of component health. They are therefore crucial to plant operations and to commercialization of advanced reactors (AdvRx). This report, developed by a three-laboratory team consisting of Argonne National Laboratory (ANL), Oak Ridge National Laboratory (ORNL) and Pacific Northwest National Laboratory (PNNL), provides an assessment of sensor technologies and a determination of measurement needs for AdvRx. It provides the technical basis for identifying and prioritizing research targets within the instrumentation and control (I&C) Technology Area under the Department of Energy's (DOE's) Advanced Reactor Technology (ART) program and contributes to the design and implementation of AdvRx concepts.

  3. Guidance for laboratories performing molecular pathology for cancer patients

    PubMed Central

    Cree, Ian A; Deans, Zandra; Ligtenberg, Marjolijn J L; Normanno, Nicola; Edsjö, Anders; Rouleau, Etienne; Solé, Francesc; Thunnissen, Erik; Timens, Wim; Schuuring, Ed; Dequeker, Elisabeth; Murray, Samuel; Dietel, Manfred; Groenen, Patricia; Van Krieken, J Han

    2014-01-01

    Molecular testing is becoming an important part of the diagnosis of any patient with cancer. The challenge to laboratories is to meet this need, using reliable methods and processes to ensure that patients receive a timely and accurate report on which their treatment will be based. The aim of this paper is to provide minimum requirements for the management of molecular pathology laboratories. This general guidance should be augmented by the specific guidance available for different tumour types and tests. Preanalytical considerations are important, and careful consideration of the way in which specimens are obtained and reach the laboratory is necessary. Sample receipt and handling follow standard operating procedures, but some alterations may be necessary if molecular testing is to be performed, for instance to control tissue fixation. DNA and RNA extraction can be standardised and should be checked for quality and quantity of output on a regular basis. The choice of analytical method(s) depends on clinical requirements, desired turnaround time, and expertise available. Internal quality control, regular internal audit of the whole testing process, laboratory accreditation, and continual participation in external quality assessment schemes are prerequisites for delivery of a reliable service. A molecular pathology report should accurately convey the information the clinician needs to treat the patient with sufficient information to allow for correct interpretation of the result. Molecular pathology is developing rapidly, and further detailed evidence-based recommendations are required for many of the topics covered here. PMID:25012948

  4. 21 CFR 111.315 - What are the requirements for laboratory control processes?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... specifications; (b) Use of sampling plans for obtaining representative samples, in accordance with subpart E of... for distribution rather than for return to the supplier); and (5) Packaged and labeled dietary...

  5. An Integrated RFID and Barcode Tagged Item Inventory System for Deployment at New Brunswick Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Younkin, James R; Kuhn, Michael J; Gradle, Colleen

    New Brunswick Laboratory (NBL) has a large inventory containing thousands of plutonium and uranium certified reference materials. The current manual inventory process is well established but lengthy, requiring significant oversight and double checking to ensure correctness. Oak Ridge National Laboratory has worked with NBL to develop and deploy a new inventory system, termed the Tagged Item Inventory System (TIIS), which utilizes handheld computers with barcode scanners and radio frequency identification (RFID) readers. Certified reference materials are identified by labels that incorporate RFID tags and barcodes. The label printing and RFID tag association processes are integrated into the main desktop software application. Software on the handheld computers syncs with software on designated desktop machines and the NBL inventory database to provide a seamless inventory process. This process includes: 1) identifying items to be inventoried, 2) downloading the current inventory information to the handheld computer, 3) using the handheld to read item and location labels, and 4) syncing the handheld computer with a designated desktop machine to analyze the results, print reports, etc. The security of the inventory software has been a major concern. Designated roles linked to authenticated logins are used to control access to the desktop software, while password protection and badge verification are used to control access to the handheld computers. The overall system design and deployment at NBL will be presented. The performance of the system will also be discussed with respect to a small piece of the overall inventory. Future work includes performing a full inventory at NBL with the Tagged Item Inventory System and comparing performance, cost, and radiation exposures to the current manual inventory process.
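
    The reconciliation step in the four-stage sync process (comparing the list downloaded to the handheld with the tags actually read) reduces to simple set arithmetic, as in the sketch below; the item identifiers are hypothetical.

        # Reconcile the expected inventory against the RFID/barcode reads from a walk-through.
        expected = {"NBL-0001", "NBL-0002", "NBL-0003", "NBL-0004"}   # downloaded to handheld
        scanned = {"NBL-0001", "NBL-0003", "NBL-0004", "NBL-0099"}    # reads collected on site

        missing = expected - scanned       # items that were not located
        unexpected = scanned - expected    # reads with no matching inventory record
        print(sorted(missing), sorted(unexpected))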

  6. Acute stress affects prospective memory functions via associative memory processes.

    PubMed

    Szőllősi, Ágnes; Pajkossy, Péter; Demeter, Gyula; Kéri, Szabolcs; Racsmány, Mihály

    2018-01-01

    Recent findings suggest that acute stress can improve the execution of delayed intentions (prospective memory, PM). However, it is unclear whether this improvement can be explained by altered executive control processes or by altered associative memory functioning. To investigate this issue, we used physical-psychosocial stressors to induce acute stress in laboratory settings. Then participants completed event- and time-based PM tasks requiring the different contribution of control processes and a control task (letter fluency) frequently used to measure executive functions. According to our results, acute stress had no impact on ongoing task performance, time-based PM, and verbal fluency, whereas it enhanced event-based PM as measured by response speed for the prospective cues. Our findings indicate that, here, acute stress did not affect executive control processes. We suggest that stress affected event-based PM via associative memory processes. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Portability issues for a structured clinical vocabulary: mapping from Yale to the Columbia medical entities dictionary.

    PubMed Central

    Kannry, J L; Wright, L; Shifman, M; Silverstein, S; Miller, P L

    1996-01-01

    OBJECTIVE: To examine the issues involved in mapping an existing structured controlled vocabulary, the Medical Entities Dictionary (MED) developed at Columbia University, to an institutional vocabulary, the laboratory and pharmacy vocabularies of the Yale New Haven Medical Center. DESIGN: 200 Yale pharmacy terms and 200 Yale laboratory terms were randomly selected from database files containing all of the Yale laboratory and pharmacy terms. These 400 terms were then mapped to the MED in three phases: mapping terms, mapping relationships between terms, and mapping attributes that modify terms. RESULTS: 73% of the Yale pharmacy terms mapped to MED terms. 49% of the Yale laboratory terms mapped to MED terms. After certain obsolete and otherwise inappropriate laboratory terms were eliminated, the latter rate improved to 59%. 23% of the unmatched Yale laboratory terms failed to match because of differences in granularity with MED terms. The Yale and MED pharmacy terms share 12 of 30 distinct attributes. The Yale and MED laboratory terms share 14 of 23 distinct attributes. CONCLUSION: The mapping of an institutional vocabulary to a structured controlled vocabulary requires that the mapping be performed at the level of terms, relationships, and attributes. The mapping process revealed the importance of standardization of local vocabulary subsets, standardization of attribute representation, and term granularity. PMID:8750391

  8. The gallium melting-point standard: its role in manufacture and quality control of electronic thermometers for the clinical laboratory.

    PubMed

    Sostman, H E

    1977-01-01

    I discuss the traceability of calibration of electronic thermometers to thermometric constants of nature or to the National Bureau of Standards, from a manufacturer's basic standards through the manufacturing process to the user's laboratory. Useful electrical temperature sensors, their advantages, and means for resolving their disadvantages are described. I summarize our development of a cell for realizing the melting phase equilibrium of pure gallium (at 29.770 °C) as a thermometer calibration fixed point, and enumerate its advantages in the routine calibration verification of electrical thermometers in the clinical chemistry laboratory.

  9. Integration of Biosafety into Core Facility Management

    PubMed Central

    Fontes, Benjamin

    2013-01-01

    This presentation will discuss the implementation of biosafety policies for small, medium and large core laboratories with primary shared objectives of ensuring the control of biohazards to protect core facility operators and assure conformity with applicable state and federal policies, standards and guidelines. Of paramount importance is the educational process to inform core laboratories of biosafety principles and policies and to illustrate the technology and process pathways of the core laboratory for biosafety professionals. Elevating awareness of biohazards and the biosafety regulatory landscape among core facility operators is essential for the establishment of a framework for both project and material risk assessment. The goal of the biohazard risk assessment process is to identify the biohazard risk management parameters to conduct the procedure safely and in compliance with applicable regulations. An evaluation of the containment, protective equipment and work practices for the procedure for the level of risk identified is facilitated by the establishment of a core facility registration form for work with biohazards and other biological materials with potential risk. The final step in the biocontainment process is the assumption of Principal Investigator role with full responsibility for the structure of the site-specific biosafety program plan by core facility leadership. The presentation will provide example biohazard protocol reviews and accompanying containment measures for core laboratories at Yale University.

  10. Application of lab derived kinetic biodegradation parameters at the field scale

    NASA Astrophysics Data System (ADS)

    Schirmer, M.; Barker, J. F.; Butler, B. J.; Frind, E. O.

    2003-04-01

    Estimating the intrinsic remediation potential of an aquifer typically requires the accurate assessment of the biodegradation kinetics, the level of available electron acceptors, and the flow field. Zero- and first-order degradation rates derived at the laboratory scale generally overpredict the rate of biodegradation when applied to the field scale, because limited electron acceptor availability and microbial growth are typically not considered. On the other hand, field-estimated zero- and first-order rates are often not suitable for forecasting plume development because they may oversimplify the processes at the field scale and ignore several key processes, phenomena, and characteristics of the aquifer. This study uses the numerical model BIO3D to link the laboratory and field scales by applying laboratory-derived Monod kinetic degradation parameters to simulate a dissolved gasoline field experiment at Canadian Forces Base (CFB) Borden. All additional input parameters were derived from laboratory and field measurements or taken from the literature. The simulated results match the experimental results reasonably well without calibration of the model. An extensive sensitivity analysis was performed to estimate the influence of the most uncertain input parameters and to define the key controlling factors at the field scale. It is shown that the most uncertain input parameters have only a minor influence on the simulation results. Furthermore, the flow field, the amount of electron acceptor (oxygen) available, and the Monod kinetic parameters have a significant influence on the simulated results. Under the field conditions modelled and the assumptions made for the simulations, it can be concluded that laboratory-derived Monod kinetic parameters can adequately describe field-scale degradation processes, provided that all controlling factors that are not necessarily observed at the lab scale are incorporated into the field-scale modelling. In this way, no separate scale relationships linking the laboratory and the field scale need to be found; accurately incorporating the additional processes, phenomena, and characteristics at the larger scale - such as a) advective and dispersive transport of one or more contaminants, b) advective and dispersive transport and availability of electron acceptors, c) mass transfer limitations, and d) spatial heterogeneities - and applying well-defined lab-scale parameters should accurately describe field-scale processes.
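
    The Monod-type rate law referred to above couples substrate, electron acceptor and biomass; a deliberately simplified sketch of such kinetics is given below. Parameter names and values are assumptions for illustration and are not taken from the BIO3D model or the Borden experiment.

        # Dual-Monod biodegradation with microbial growth, integrated with explicit Euler.
        def monod_step(C, O, X, dt, mu_max=1.0, Kc=2.0, Ko=0.5, Y=0.3, f=3.0):
            """C = substrate, O = oxygen, X = biomass (mg/L); dt in days."""
            growth = mu_max * X * (C / (Kc + C)) * (O / (Ko + O))   # biomass growth rate
            dC = -(growth / Y) * dt    # substrate consumed to support that growth
            dO = f * dC                # oxygen consumed stoichiometrically (f g O2 per g substrate)
            return max(C + dC, 0.0), max(O + dO, 0.0), X + growth * dt

        C, O, X = 10.0, 8.0, 0.1       # initial concentrations, mg/L
        for _ in range(200):           # 200 steps of 0.1 d = 20 days
            C, O, X = monod_step(C, O, X, dt=0.1)
        # Oxygen is exhausted long before the substrate in this toy run, illustrating
        # the electron-acceptor limitation emphasized in the abstract.
        print(round(C, 2), round(O, 2), round(X, 2))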

  11. New Bedford Harbor Superfund Project, Acushnet River Estuary Engineering Feasibility Study of Dredging and Dredged Material Disposal Alternatives. Report 9. Laboratory-Scale Application of Solidification/Stabilization Technology

    DTIC Science & Technology

    1989-01-01

    [Only fragments of this report excerpt are recoverable: unit-conversion notes (pounds-force per square inch to kilopascals, multiply by 6.894757), a flue-gas desulfurization footnote, table-of-contents entries (Dredging Control Technologies; Evaluation of Conceptual Dredging and Disposal Alternatives; Executive Summary), and text on unconfined compressive strength (UCS) measurements of solidified sediment and on reducing the solubility of metals by controlling pH and alkalinity, with additional metal immobilization obtainable by modifying the process.]

  12. Laboratory Food Acceptance in Children With Autism Spectrum Disorder Compared With Children With Typical Development.

    PubMed

    Suarez, Michelle A

    Studies using parent-report measures have described the high prevalence of food selectivity in children with autism spectrum disorder (ASD). However, few studies have documented food acceptance in a controlled laboratory environment. The objective of this study was to compare laboratory food acceptance in children with ASD with that of children with typical development (TD). In addition, the relationships between food acceptance and the child's age, sensory processing pattern, and autism severity were explored. Results indicate that children with autism (n = 31) accepted fewer foods in the laboratory environment than the children with TD (n = 21) and that food acceptance was related to age but not to ASD severity. In addition, sensory processing scores were associated with food acceptance for the combined ASD and TD groups. Results are discussed in the context of the literature. This information has the potential to support evaluation and treatment of food selectivity. Copyright © 2017 by the American Occupational Therapy Association, Inc.

  13. A teaching intervention for reading laboratory experiments in college-level introductory chemistry

    NASA Astrophysics Data System (ADS)

    Kirk, Maria Kristine

    The purpose of this study was to determine the effects that a pre-laboratory guide, conceptualized as a "scientific story grammar," has on college chemistry students' learning when they read an introductory chemistry laboratory manual and perform the experiments in the chemistry laboratory. The participants (N = 56) were students enrolled in four existing general chemistry laboratory sections taught by two instructors at a women's liberal arts college. The pre-laboratory guide consisted of eight questions about the experiment, including the purpose, chemical species, variables, chemical method, procedure, and hypothesis. The effects of the intervention were compared with those of the traditional pre-laboratory assignment for the eight chemistry experiments. Measures included quizzes, tests, chemistry achievement test, science process skills test, laboratory reports, laboratory average, and semester grade. The covariates were mathematical aptitude and prior knowledge of chemistry and science processes, on which the groups differed significantly. The study captured students' perceptions of their experience in general chemistry through a survey and interviews with eight students. The only significant differences in the treatment group's performance were in some subscores on lecture items and laboratory items on the quizzes. An apparent induction period was noted, in that significant measures occurred in mid-semester. Voluntary study with the pre-laboratory guide by control students precluded significant differences on measures given later in the semester. The groups' responses to the survey were similar. Significant instructor effects on three survey items were corroborated by the interviews. The researcher's students were more positive about their pre-laboratory tasks, enjoyed the laboratory sessions more, and were more confident about doing chemistry experiments than the laboratory instructor's groups due to differences in scaffolding by the instructors.

  14. Effect of acute heat stress and slaughter processing on poultry meat quality and postmortem carbohydrate metabolism.

    PubMed

    Wang, R H; Liang, R R; Lin, H; Zhu, L X; Zhang, Y M; Mao, Y W; Dong, P C; Niu, L B; Zhang, M H; Luo, X

    2017-03-01

    This study investigated the effects of acute heat stress and slaughter processing on poultry meat quality and carbohydrate metabolism. Broilers (200) were randomly divided into 2 groups: heat stress (HS; 36°C for 1 h) and a non-stressed control (C). At slaughter, each group was further divided into 2 groups for slaughter processing (L = laboratory; F = commercial factory). L group breasts were removed immediately after bleeding without carcass scalding or defeathering, and stored at 4°C. F group broilers were scalded (60°C, 45 s) after bleeding and defeathering. Then the breasts were removed and cooled in ice water until the core temperature was ≤4°C. Rates of Pectoralis core temperature and pH decline were changed by slaughter processing, but only HS affected ultimate pH in group L. HS muscles had higher L* values (P < 0.05) than controls at 24 h postmortem. Laboratory processing ("hot-deboning") increased drip loss, which resulted in a lower cooked loss (P < 0.05). Postmortem glycolysis was affected only by HS. Lactic acid accumulation and glycogen degradation were faster in the HS group than in controls at 5 min postmortem. During storage the glycolysis rates were not different (P > 0.05). Sarcoplasmic protein solubility was higher in F-processed birds (P < 0.05). HS decreased the solubility of myofibrillar and total protein in the L-slaughtered birds. Thus, HS caused a higher frequency of accelerated muscle glycolysis than in controls. Factory processing (chilling) could not completely eliminate the effects of accelerated glycolysis caused by pre-slaughter HS. © 2016 Poultry Science Association Inc.

  15. Microencapsulation Processes

    NASA Astrophysics Data System (ADS)

    Whateley, T. L.; Poncelet, D.

    2005-06-01

    Microencapsulation by solvent evaporation is a novel technique to enable the controlled delivery of active materials. The controlled release of drugs, for example, is a key challenge in the pharmaceutical industries. Although proposed several decades ago, it remains largely an empirical laboratory process. The Topical Team has considered its critical points and the work required to produce a more effective technology - better control of the process for industrial production, understanding of the interfacial dynamics, determination of the solvent evaporation profile, and establishment of the relation between polymer/microcapsule structures. The Team has also defined how microgravity experiments could help in better understanding microencapsulation by solvent evaporation, and it has proposed a strategy for a collaborative project on the topic.

  16. Laboratory Astrophysics Prize: Laboratory Astrophysics with Nuclei

    NASA Astrophysics Data System (ADS)

    Wiescher, Michael

    2018-06-01

    Nuclear astrophysics is concerned with the nuclear reaction and decay processes that have controlled the chemical evolution of our universe from the Big Bang to the present generation of stars. Such nuclear reactions maintain stellar life, determine stellar evolution, and finally drive stellar explosions in the circle of stellar life. Laboratory nuclear astrophysics seeks to simulate and understand the underlying processes using a broad portfolio of nuclear instrumentation, from reactor to accelerator, from stable to radioactive beams, to map the broad spectrum of nucleosynthesis processes. This talk focuses on only two aspects of the broad field: the need for deep underground accelerator facilities in cosmic-ray-free environments in order to understand nucleosynthesis in stars, and the need for high-intensity radioactive beam facilities to recreate the conditions found in stellar explosions. Both concepts represent the two main frontiers of the field, which are being pursued in the US with the CASPAR accelerator at the Sanford Underground Research Facility in South Dakota and the FRIB facility at Michigan State University.

  17. Laboratory safety and the WHO World Alliance for Patient Safety.

    PubMed

    McCay, Layla; Lemer, Claire; Wu, Albert W

    2009-06-01

    Laboratory medicine has been a pioneer in the field of patient safety; indeed, the College of American Pathologists first called attention to the issue in 1946. Delivering reliable laboratory results has long been considered a priority, as the data produced in laboratory medicine have the potential to critically influence individual patients' diagnosis and management. Until recently, most attention on laboratory safety has focused on the analytic stage of laboratory medicine. Addressing this stage has led to significant and impressive improvements in the areas over which laboratories have direct control. However, recent data demonstrate that the pre- and post-analytical phases are at least as vulnerable to errors; to further improve patient safety in laboratory medicine, attention must now be focused on the pre- and post-analytic phases, and patient safety must be better understood as a multi-disciplinary, multi-stage and multi-system concept. The World Alliance for Patient Safety (WAPS) supports improvement of patient safety globally and provides a potential framework for considering the total testing process.

  18. Dynamics of microbial growth and metabolic activity and their control by aeration.

    PubMed

    Kalina, V

    1993-01-01

    The optimization of fermentation processes depends to a large extent on the modelling of microbial activity under complex environmental conditions where aeration is an important limiting and control factor. Simple relationships are used to establish the sensitivity of cultures to oxygen stress. Specific limitation coefficients which can be determined in laboratory reactors allow a projection to industrial operation and the definition of appropriate aeration and agitation profiles. Optimum control can be assured on the basis of directly measurable process parameters. This is shown for the case of ethanol production using S. cerevisiae at high cell dry weight concentrations.

  19. Improving Histopathology Laboratory Productivity: Process Consultancy and A3 Problem Solving.

    PubMed

    Yörükoğlu, Kutsal; Özer, Erdener; Alptekin, Birsen; Öcal, Cem

    2017-01-01

    The ISO 17020 quality program has been run in our pathology laboratory for four years to establish an action plan for correction and prevention of identified errors. In this study, we aimed to evaluate the errors that we could not identify through ISO 17020 and/or solve by means of process consulting. Process consulting is carefully intervening in a group or team to help it accomplish its goals. The A3 problem solving process was run under the leadership of a 'workflow, IT and consultancy manager'. An action team was established consisting of technical staff. A root cause analysis was applied for target conditions, and the 6-S method was implemented for solution proposals. Applicable proposals were activated and the results were rated by six-sigma analysis. Non-applicable proposals were reported to the laboratory administrator. Mislabelling was the most frequently reported error and triggered all pre-analytical errors. There were 21 non-value-added steps grouped into 8 main targets on the fishbone graphic (transporting, recording, moving, individual, waiting, over-processing, over-transaction and errors). Unnecessary redundant requests, missing slides, archiving issues, redundant activities, and mislabelling errors were proposed to be solved by improving visibility and fixing spaghetti problems. Spatial re-organization, organizational marking, re-defining some operations, and labeling activities raised the six-sigma score from 24% to 68% for all phases. Operational transactions, such as implementation of a pathology laboratory system, were suggested for long-term improvement. Laboratory management is a complex process, and quality control is an effective method to improve productivity. Systematic checking in a quality program may not always find and/or solve the problems; external observation may reveal crucial indicators of system failures and provide very simple solutions.

  20. Use of Alum for Odor Reduction in Sludge and Biosolids from Different Wastewater Treatment Processes.

    PubMed

    Gruchlik, Yolanta; Fouché, Lise; Joll, Cynthia A; Heitz, Anna

    2017-12-01

      Applicability of alum addition to wastewater sludge and biosolids produced from different treatment processes was evaluated as a means of odor reduction. Four water resource recovery facilities (WRRFs) were chosen for this study: two used mesophilic anaerobic digestion and two used oxidation ditch processes. The experiments were conducted on a laboratory scale and in all cases the alum was added prior to dewatering. This is the first report of the application of alum for odor reduction in oxidation ditch processes. Alum addition was effective in reducing odors in anaerobically digested biosolids. Addition of 4% alum to anaerobically digested liquid biosolids prior to dewatering resulted in a 60% reduction in the peak odor concentration in the laboratory dewatered cake, relative to the control sample. Alum addition did not reduce odors in dewatered sludge from oxidation ditch processes.

  1. PEP Run Report for Integrated Test A, Caustic Leaching in UFP-VSL-T01A, Oxidative Leaching in UFP-VSL-T02A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guzman-Leong, Consuelo E.; Bredt, Ofelia P.; Burns, Carolyn A.

    2009-12-04

    Pacific Northwest National Laboratory (PNNL) was tasked by Bechtel National Inc. (BNI) on the River Protection Project-Hanford Tank Waste Treatment and Immobilization Plant (RPP-WTP) project to perform research and development activities to resolve technical issues identified for the Pretreatment Facility (PTF). The Pretreatment Engineering Platform (PEP) was designed, constructed and operated as part of a plan to respond to issue M12, "Undemonstrated Leaching Processes." The PEP, located in the Process Development Laboratory-West (PDLW) in Richland, Washington, is a 1/4.5-scale test platform designed to simulate the WTP pretreatment caustic leaching, oxidative leaching, ultrafiltration solids concentration, and slurry washing processes. The PEP replicates the WTP leaching processes using prototypic equipment and control strategies. The PEP also includes non-prototypic ancillary equipment to support the core processing.

  2. From Concept-to-Flight: An Active Fluid Loop Based Thermal Control System for Mars Science Laboratory Rover

    NASA Technical Reports Server (NTRS)

    Birur, Gajanana C.; Bhandari, Pradeep; Bame, David; Karlmann, Paul; Mastropietro, A. J.; Liu, Yuanming; Miller, Jennifer; Pauken, Michael; Lyra, Jacqueline

    2012-01-01

    The Mars Science Laboratory (MSL) rover, Curiosity, which was launched on November 26, 2011, incorporates a novel active thermal control system to keep the sensitive electronics and science instruments at safe operating and survival temperatures. While the diurnal temperature variations on the Mars surface range from -120 C to +30 C, the sensitive equipment is kept within -40 C to +50 C. The active thermal control system is based on a single-phase mechanically pumped fluid loop (MPFL) system which removes or recovers excess waste heat and manages it to maintain the sensitive equipment inside the rover at safe temperatures. This paper will describe the entire process of developing this active thermal control system for the MSL rover from concept to flight implementation. The development of the rover thermal control system through its architecture, design, fabrication, integration, testing, and launch is described.

  3. Lab Procedures. Sludge Treatment and Disposal Course #166. Instructor's Guide [and] Student Workbook.

    ERIC Educational Resources Information Center

    Carnegie, John W.

    Laboratory tests used to determine status and to evaluate and/or maintain process control of the various sludge treatment processes are introduced in this lesson. Neither detailed test procedures nor explanations of how the tests should be applied to every unit are explained; this information is provided in other modules. The instructor's manual…

  4. Configurable technology development for reusable control and monitor ground systems

    NASA Technical Reports Server (NTRS)

    Uhrlaub, David R.

    1994-01-01

    The control monitor unit (CMU) uses configurable software technology for real-time mission command and control, telemetry processing, simulation, data acquisition, data archiving, and ground operations automation. The base technology is currently planned for the following control and monitor systems: portable Space Station checkout systems; ecological life support systems; Space Station logistics carrier system; and the ground system of the Delta Clipper (SX-2) in the Single-Stage Rocket Technology program. The CMU makes extensive use of commercial technology to increase capability and reduce development and life-cycle costs. The concepts and technology are being developed by McDonnell Douglas Space and Defense Systems for the Real-Time Systems Laboratory at NASA's Kennedy Space Center under the Payload Ground Operations Contract. A second function of the Real-Time Systems Laboratory is development and utilization of advanced software development practices.

  5. Shuttle Mission STS-50: Orbital Processing of High-Quality CdTe Compound Semiconductors Experiment: Final Flight Sample Characterization Report

    NASA Technical Reports Server (NTRS)

    Larson, David J.; Casagrande, Luis G.; DiMarzio, Don; Alexander, J. Iwan D.; Carlson, Fred; Lee, Taipo; Dudley, Michael; Raghathamachar, Balaji

    1998-01-01

    The Orbital Processing of High-Quality Doped and Alloyed CdTe Compound Semiconductors program was initiated to investigate, quantitatively, the influences of gravitationally dependent phenomena on the growth and quality of bulk compound semiconductors. The objective was to improve crystal quality (both structural and compositional) and to better understand and control the variables within the crystal growth production process. The empirical effort entailed the development of a terrestrial (one-g) experiment baseline for quantitative comparison with microgravity (mu-g) results. This effort was supported by the development of high-fidelity process models of heat transfer, fluid flow and solute redistribution, and thermo-mechanical stress occurring in the furnace, safety cartridge, ampoule, and crystal throughout the melting, seeding, crystal growth, and post-solidification processing. In addition, the sensitivity of the orbital experiments was analyzed with respect to the residual microgravity (mu-g) environment, both steady state and g-jitter. CdZnTe crystals were grown in one-g and in mu-g. Crystals processed terrestrially were grown at the NASA Ground Control Experiments Laboratory (GCEL) and at Grumman Aerospace Corporation (now Northrop Grumman Corporation). Two mu-g crystals were grown in the Crystal Growth Furnace (CGF) during the First United States Microgravity Laboratory Mission (USML-1), STS-50, June 24 - July 9, 1992.

  6. Mindfulness training alters emotional memory recall compared to active controls: support for an emotional information processing model of mindfulness.

    PubMed

    Roberts-Wolfe, Douglas; Sacchet, Matthew D; Hastings, Elizabeth; Roth, Harold; Britton, Willoughby

    2012-01-01

    While mindfulness-based interventions have received widespread application in both clinical and non-clinical populations, the mechanism by which mindfulness meditation improves well-being remains elusive. One possibility is that mindfulness training alters the processing of emotional information, similar to prevailing cognitive models of depression and anxiety. The aim of this study was to investigate the effects of mindfulness training on emotional information processing (i.e., memory) biases in relation to both clinical symptomatology and well-being in comparison to active control conditions. Fifty-eight university students (28 female, age = 20.1 ± 2.7 years) participated in either a 12-week course containing a "meditation laboratory" or an active control course with similar content or experiential practice laboratory format (music). Participants completed an emotional word recall task and self-report questionnaires of well-being and clinical symptoms before and after the 12-week course. Meditators showed greater increases in positive word recall compared to controls [F(1, 56) = 6.6, p = 0.02]. The meditation group increased significantly more on measures of well-being [F(1, 56) = 6.6, p = 0.01], with a marginal decrease in depression and anxiety [F(1, 56) = 3.0, p = 0.09] compared to controls. Increased positive word recall was associated with increased psychological well-being (r = 0.31, p = 0.02) and decreased clinical symptoms (r = -0.29, p = 0.03). Mindfulness training was associated with greater improvements in processing efficiency for positively valenced stimuli than active control conditions. This change in emotional information processing was associated with improvements in psychological well-being and less depression and anxiety. These data suggest that mindfulness training may improve well-being via changes in emotional information processing. Future research with a fully randomized design will be needed to clarify the possible influence of self-selection.

  7. Langley Wind Tunnel Data Quality Assurance-Check Standard Results

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.

    2000-01-01

    A framework for statistical evaluation, control and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
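
    The check-standard analysis described above reduces, in its simplest form, to Shewhart control limits placed around repeat-run statistics. The sketch below is a generic three-sigma example on hypothetical check-standard values; it is not the Langley implementation or its actual data.

```python
import numpy as np

# Hypothetical check-standard results, e.g. a repeated force-coefficient measurement
check_runs = np.array([0.5012, 0.5019, 0.5008, 0.5021, 0.5015,
                       0.5010, 0.5031, 0.5013, 0.5017, 0.5009])

center = check_runs.mean()
sigma = check_runs.std(ddof=1)                       # sample standard deviation
ucl, lcl = center + 3 * sigma, center - 3 * sigma    # classic 3-sigma Shewhart limits

print(f"center = {center:.4f}, UCL = {ucl:.4f}, LCL = {lcl:.4f}")
for i, x in enumerate(check_runs, start=1):
    flag = "out of control" if (x > ucl or x < lcl) else "in control"
    print(f"run {i:2d}: {x:.4f}  {flag}")
```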

  8. A precise laboratory goniometer system to collect spectral BRDF data of materials

    NASA Astrophysics Data System (ADS)

    Jiao, Guangping; Jiao, Ziti; Wang, Jie; Zhang, Hu; Dong, Yadong

    2014-11-01

    This paper presents a precise laboratory goniometer system to quickly collect the bidirectional reflectance distribution function (BRDF) of typical materials such as soil, canopy and artificial materials in the laboratory. The system consists of the goniometer, an SVC HR1024 spectroradiometer, and a xenon long-arc lamp as the light source. The innovation of the cantilever slab reduces the shadow of the goniometer in the principal plane. The geometric precision of the footprint centre is better than +/-4 cm in most azimuth directions, and the angle-controlling accuracy is better than 0.5°. The light source shows good stability, with a 0.8% irradiance decrease in 3 hours, but its large areal heterogeneity increases the data-processing difficulty in capturing an accurate BRDF. First measurements were taken from soil at a resolution of 15° and 30° in the zenith and azimuth directions respectively, with a maximum view angle of +/-50°. More observations were taken in the hot-spot direction. The system takes about 40 minutes to complete all measurements. A Spectralon panel is measured at the beginning and end of the whole period. A simple interactive interface on the computer can automatically control all operations of the goniometer and data processing. The laboratory experiments on a soil layer and a grass lawn show that the goniometer can capture the multi-angle variation of the BRDF.
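
    The angular sampling scheme quoted above (15° zenith and 30° azimuth steps within a ±50° view angle) can be written down explicitly. The sketch below is a hypothetical reconstruction of such a grid and of the implied dwell time per direction; it is not the instrument's control software, and the exact handling of the nadir and hot-spot directions is an assumption.

```python
import itertools

# Hypothetical reconstruction of the sampling scheme described above:
# view zenith every 15 deg up to the stated +/-50 deg maximum,
# view azimuth every 30 deg over the full circle.
zeniths = [0, 15, 30, 45]            # deg; 50 deg is the stated hard limit
azimuths = list(range(0, 360, 30))   # deg

grid = [(vz, va) for vz, va in itertools.product(zeniths, azimuths) if vz > 0]
grid.insert(0, (0, 0))               # nadir needs only a single azimuth

print(f"{len(grid)} view directions per scan")
# With ~40 min for a full scan, the average time budget per direction is roughly:
print(f"~{40 * 60 / len(grid):.0f} s per direction (positioning + spectrum)")
```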

  9. Sequencing batch-reactor control using Gaussian-process models.

    PubMed

    Kocijan, Juš; Hvala, Nadja

    2013-06-01

    This paper presents a Gaussian-process (GP) model for the design of sequencing batch-reactor (SBR) control for wastewater treatment. The GP model is a probabilistic, nonparametric model with uncertainty predictions. In the case of SBR control, it is used for the on-line optimisation of the duration of the batch phases. The control algorithm follows the course of the indirect process variables (pH, redox potential and dissolved oxygen concentration) and recognises the characteristic patterns in their time profile. The control algorithm uses GP-based regression to smooth the signals and GP-based classification for the pattern recognition. When tested on the signals from an SBR laboratory pilot plant, the control algorithm provided a satisfactory agreement between the proposed completion times and the actual termination times of the biodegradation processes. In a set of tested batches the final ammonia and nitrate concentrations were below 1 and 0.5 mg/L, respectively, while the aeration time was shortened considerably. Copyright © 2013 Elsevier Ltd. All rights reserved.
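
    GP-based smoothing of an indirect signal such as pH can be illustrated with a standard GP regression library. The sketch below is a generic example on synthetic data using scikit-learn's GaussianProcessRegressor; it is not the authors' model, and the kernel choice and signal shape are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic, noisy "pH during a batch phase" signal (illustrative only)
t = np.linspace(0, 120, 60)[:, None]                  # minutes into the phase
ph = 7.2 - 0.4 * np.tanh((t.ravel() - 60) / 20) + 0.05 * np.random.randn(60)

# RBF kernel for the smooth trend plus a white-noise term for sensor noise
kernel = 1.0 * RBF(length_scale=20.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, ph)

# Smoothed signal with uncertainty bands, which pattern recognition can then use
t_fine = np.linspace(0, 120, 240)[:, None]
mean, std = gp.predict(t_fine, return_std=True)
print(f"smoothed pH near t = 90 min: {mean[180]:.2f} +/- {2 * std[180]:.2f}")
```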

  10. 21 CFR 606.140 - Laboratory controls.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 7 2012-04-01 2012-04-01 false Laboratory controls. 606.140 Section 606.140 Food... CURRENT GOOD MANUFACTURING PRACTICE FOR BLOOD AND BLOOD COMPONENTS Laboratory Controls § 606.140 Laboratory controls. Laboratory control procedures shall include: (a) The establishment of scientifically...

  11. 21 CFR 606.140 - Laboratory controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Laboratory controls. 606.140 Section 606.140 Food... CURRENT GOOD MANUFACTURING PRACTICE FOR BLOOD AND BLOOD COMPONENTS Laboratory Controls § 606.140 Laboratory controls. Laboratory control procedures shall include: (a) The establishment of scientifically...

  12. 21 CFR 606.140 - Laboratory controls.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Laboratory controls. 606.140 Section 606.140 Food... CURRENT GOOD MANUFACTURING PRACTICE FOR BLOOD AND BLOOD COMPONENTS Laboratory Controls § 606.140 Laboratory controls. Laboratory control procedures shall include: (a) The establishment of scientifically...

  13. 21 CFR 606.140 - Laboratory controls.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 7 2014-04-01 2014-04-01 false Laboratory controls. 606.140 Section 606.140 Food... CURRENT GOOD MANUFACTURING PRACTICE FOR BLOOD AND BLOOD COMPONENTS Laboratory Controls § 606.140 Laboratory controls. Laboratory control procedures shall include: (a) The establishment of scientifically...

  14. A new hyperspectral imaging based device for quality control in plastic recycling

    NASA Astrophysics Data System (ADS)

    Bonifazi, G.; D'Agostini, M.; Dall'Ava, A.; Serranti, S.; Turioni, F.

    2013-05-01

    The quality control of contamination level in recycled plastics streams has been identified as an important key factor for increasing the value of the recycled material by both plastic recycling and compounder industries. Existing quality control methods for the detection of both plastic and non-plastic contaminants in plastic waste streams at different stages of the industrial process (e.g. feed, intermediate and final products) are currently based on the manual collection of a sample from the stream and on subsequent off-line laboratory analyses. The results of such analyses are usually available only hours, or sometimes even days, after the material has been processed, and the laboratory analyses are time-consuming and expensive, both in terms of equipment cost and maintenance and of labour cost. Therefore, a fast on-line assessment to monitor the plastic waste feed streams and to characterize the composition of the different plastic products is fundamental to increase the value of secondary plastics. This paper describes and evaluates the development of an HSI-based device and of the related software architectures and processing algorithms for quality assessment of plastics in recycling plants, with particular reference to polyolefins (PO). NIR-HSI sensing devices coupled with multivariate data analysis methods were demonstrated to be an objective, rapid and non-destructive technique that can be used for on-line quality and process control in the recycling of POs. In particular, the adoption of the previously mentioned integrated hardware and software (HD&SW) architectures can provide a solution to one of the major problems of the recycling industry, which is the lack of an accurate quality certification of materials obtained by recycling processes. These results could therefore assist in developing strategies to certify the composition of recycled PO products.
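
    A typical NIR-HSI quality-control pipeline compresses each pixel spectrum with a multivariate projection and then classifies it against reference polymer classes. The sketch below is a generic PCA-plus-linear-discriminant example on synthetic spectra; the class labels (PE, PP), spectral shapes and model choices are illustrative assumptions, not the device's actual algorithms.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-ins for NIR pixel spectra of two polyolefin classes
def fake_spectra(center, n=200, bands=128):
    base = np.exp(-0.5 * ((np.arange(bands) - center) / 10.0) ** 2)
    return base + 0.05 * rng.standard_normal((n, bands))

X = np.vstack([fake_spectra(40), fake_spectra(70)])
y = np.array([0] * 200 + [1] * 200)      # 0 = "PE", 1 = "PP" (labels hypothetical)

# Multivariate data analysis: PCA compression followed by a linear classifier
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
model.fit(X, y)

# Classify a new "pixel" and report the assigned polyolefin class
new_pixel = fake_spectra(40, n=1)
print("predicted class:", "PE" if model.predict(new_pixel)[0] == 0 else "PP")
```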

  15. ISO 15189 accreditation: Requirements for quality and competence of medical laboratories, experience of a laboratory I.

    PubMed

    Guzel, Omer; Guner, Ebru Ilhan

    2009-03-01

    Medical laboratories are key partners in patient safety. Laboratory results influence 70% of medical diagnoses, and the quality of laboratory services is a major factor directly affecting the quality of health care. The clinical laboratory as a whole has to provide the best patient care by promoting excellence. International Standard ISO 15189, based upon the ISO 17025 and ISO 9001 standards, provides requirements for the competence and quality of medical laboratories, and accredited medical laboratories enhance the credibility and competency of their testing services. Our group of laboratories, one of the leading institutions in the area, had previous experience with ISO 9001 and ISO 17025 accreditation in non-medical sections. We started to prepare for ISO 15189 accreditation at the beginning of 2006 and were certified in March 2007, having spent more than a year on the preparation. The accreditation scope of our laboratory was as follows: clinical chemistry, hematology, immunology, allergology, microbiology, parasitology, molecular biology of infection serology and transfusion medicine. The total number of accredited tests is 531. We participate in five different PT programs, and inter-laboratory comparison (ILC) protocols are performed with reputable laboratories: 82 different PT program modules, 277 cycles per year for 451 tests, and 72 ILC program organizations for the remaining tests have been performed. Our laboratory also organizes a PT program for flow cytometry, in which 22 laboratories participate, with 2 cycles per year. Our laboratory has had its own custom-made web-based LIS since 2001 and serves more than 500 customers on a real-time basis. Our quality management system is also documented and processed electronically via a Document Management System (DMS) on our intranet. The preparatory phase for accreditation, data management, external quality control programs, and personnel-related issues before, during and after the accreditation process are presented. Every laboratory has to concentrate on patient safety issues related to laboratory testing and should perform quality improvement projects.

  16. Quality assurance in road traffic analyses in Switzerland.

    PubMed

    Briellmann, Thomas A; Sigrist, Thomas; Augsburger, Marc; Favrat, Bernard; Oestreich, Andrea; Deom, André

    2010-05-20

    Swiss laboratories performing toxicological road traffic analyses have been authorized for many years by the Swiss Federal Roads Office (FEDRO). In 2003, FEDRO signed a contract with the Swiss Society of Legal Medicine (SSLM) to organize the complete quality management concerning road traffic analyses. For this purpose a multidisciplinary working group was established under the name of "road traffic commission" (RTC). RTC has to organize external quality control, interpret the results of these controls, perform audits in the laboratories and report all results to FEDRO. Furthermore, the working group can be mandated for special tasks by FEDRO. As an independent organization, the Swiss Center for Quality Control (CSCQ) in Geneva has managed the external quality controls in the laboratories over the past years. All tested drugs and psychoactive substances are listed in a federal instruction. The so-called 'zero tolerance substances' (THC, morphine, cocaine, amphetamine, methamphetamine, MDMA and MDEA) and their metabolites have to be tested once a year, and all other substances (benzodiazepines, zolpidem, phenobarbital, etc.) periodically. Results over the last years show that all laboratories are generally within the confidence interval of +/-30% of the mean value. In cases of non-conformities, measures have to be taken immediately and reported to the working group. External audits are performed triennially, but accredited laboratories can combine this audit with the approval of the Swiss Accreditation Service (SAS). During the audits, a special checklist filled in by the laboratory director is assessed, and non-conformities have to be corrected. During the process of establishing new legislation, RTC had the opportunity to advise FEDRO. In collaboration with FEDRO, RTC and hence SSLM can work actively on improving quality assurance in road traffic toxicological analyses and have an opportunity to bring their professional requests to the federal authorities.
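
    The ±30% acceptance band mentioned above amounts to a simple relative-deviation check against the consensus mean value. The sketch below shows that check on hypothetical laboratory results; the substance, target value and laboratory names are invented for illustration.

```python
# Minimal sketch of the +/-30% acceptance check described above (hypothetical values)
target = 50.0        # consensus mean value for a control sample, ng/mL
tolerance = 0.30     # +/-30% acceptance band

lab_results = {"Lab A": 47.5, "Lab B": 66.0, "Lab C": 41.2}
for lab, result in lab_results.items():
    deviation = (result - target) / target
    status = "conform" if abs(deviation) <= tolerance else "non-conformity: corrective action"
    print(f"{lab}: {result:.1f} ng/mL ({deviation:+.0%}) -> {status}")
```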

  17. Time-Efficiency Analysis Comparing Digital and Conventional Workflows for Implant Crowns: A Prospective Clinical Crossover Trial.

    PubMed

    Joda, Tim; Brägger, Urs

    2015-01-01

    To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using a conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome, and was measured for every single clinical and laboratory work step in minutes. Statistical analysis was calculated with the Wilcoxon rank sum test. All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, as the sum of clinical plus laboratory work steps, was significantly different. The mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001). Therefore, digital processing for overall treatment was 16% faster. Detailed analysis for the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, with a significant decrease of 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine. This investigation shows that the digital workflow seems to be more time-efficient than the established conventional production pathway for fixed implant-supported crowns. Both clinical chair time and laboratory manufacturing steps could be effectively shortened with the digital process of intraoral scanning plus CAD/CAM technology.

  18. Office of the Chief Financial Officer Annual Report 2007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fernandez, Jeffrey

    2007-12-18

    2007 was a year of progress and challenges for the Office of the Chief Financial Officer (OCFO). I believe that with the addition of a new Controller, the OCFO senior management team is stronger than ever. With the new Controller on board, the senior management team spent two intensive days updating our strategic plan for the next five years ending in 2012, while making sure that we continue to execute on our existing strategic initiatives. In 2007 the Budget Office, teaming with Human Resources, worked diligently with our colleagues on campus to reengineer the Multi-Location Appointment (MLA) process, making it easier for our Principal Investigators (PIs) to work simultaneously between the Laboratory and UC campuses. The hiring of a point-of-contact in Human Resources to administer the program will also make the process flow smoother. In order to increase our financial flexibility, the OCFO worked with the Department of Energy (DOE) to win approval to reduce the burden rates on research and development (R&D) subcontracts and Intra-University Transfers (IUT). The Budget Office also performed a 'return on investment' (ROI) analysis to secure UCRP funding for a much needed vocational rehabilitation counselor. This new counselor now works with employees who are on medical leave to ensure that they can return to work in a more timely fashion, or if not able to return, usher them through the various options available to them. Under the direction of the new Controller, PricewaterhouseCoopers (PwC) performed their annual audit of the Laboratory's financial data and reported positive results. In partnership with the Financial Policy and Training Office, the Controller's Office also helped to launch self-assessments of some of our financial processes, including timekeeping and resource adjustments. These self-assessments were conducted to promote efficiencies and mitigate risk. In some cases they provided assurance that our practices are sound, and in others highlighted opportunities to improve. A third, and most important, assessment on funds control was also conducted that proved very useful in making sure that our financial processes are sound and of the highest ethical standards. In June of 2007 the Procurement Department was awarded the DOE's FY2006 Secretarial Small Business Award for the advancement of small business contracts at Lawrence Berkeley National Laboratory (LBNL). The award was presented in Washington, D.C. Procurement also distinguished itself by passing the triennial Procurement Evaluation and Re-engineering Team (PERT) Review of its systems and processes. We continue to reduce costs through the Supply Chain Initiative, saving the Laboratory approximately $6M to date, and have placed over 11,000 orders with over seven vendors using the eBuy system. Our wall-to-wall inventory, which was completed in March of 2007, reported a result of 99+% for item count and 99.51% by value. This was a remarkable achievement that required the hard work of every Division and the Property Department working together. Training continues to be a major initiative for the OCFO, and in 2007 we rolled out financial training programs specifically tailored to meet the needs of the scientific divisions. FY2008 presents several opportunities to enhance and improve our service to the scientific community. With the awarding of the HELIOS and JBEI programs, we will be developing new financial paradigms to provide senior management flexibility in decision making.
Last year we heard the Laboratory community loud and clear when they expressed their frustration with our current travel system. As we head into the new fiscal year, a cross-functional travel team has identified a new model for how we provide travel services. We will be implementing the Oracle PeopleSoft Travel Reimbursement system by July of 2008. The new system will be more user-friendly and provide better information to the divisions and travel operations. We will also continue to review the travel disbursements operation for further improvement. Also in FY2008, several key information systems implementation projects are under way which will strengthen the Laboratory's financial and business processes. These include Supply Chain Management, and the Budget and Planning System. Future planned systems development includes an electronic sponsored research administration system. Continuing to improve the procurement process at the Laboratory is another major priority for the OCFO. To that end, we will be working to re-engineer the 'procure-to-pay' process. The goal will be to correct process flow to maximize efficiency and effectiveness, while implementing sound business practices and incorporating strong internal controls. Along the same lines, we will also be working with the divisions to implement the Property Management Improvement Program that was identified in FY2007.

  19. Laboratory and Self-Report Methods to Assess Reappraisal and Distraction in Youth.

    PubMed

    Bettis, Alexandra H; Henry, Lauren; Prussien, Kemar V; Vreeland, Allison; Smith, Michele; Adery, Laura H; Compas, Bruce E

    2018-06-07

    Coping and emotion regulation are central features of risk and resilience in childhood and adolescence, but research on these constructs has relied on different methods of assessment. The current study aimed to bridge the gap between questionnaire and experimental methods of measuring secondary control coping strategies, specifically distraction and cognitive reappraisal, and examine associations with symptoms of anxiety and depression in youth. A community sample of 70 youth (ages 9-15) completed a novel experimental coping and emotion regulation paradigm and self-report measures of coping and emotion regulation and symptoms. Findings indicate that use of distraction and reappraisal during the laboratory paradigm was associated with lower levels of negative emotion during the task. Youth emotion ratings while implementing distraction, but not reappraisal, during the laboratory task were associated with youth self-reported use of secondary control coping in response to family stress. Youth symptoms of anxiety and depression were also significantly positively associated with negative emotion ratings during the laboratory task, and both laboratory task and self-reported coping and emotion regulation accounted for significant variance in symptoms in youth. Both questionnaire and laboratory methods to assess coping and emotion regulation in youth are important for understanding these processes as possible mechanisms of risk and resilience and continued integration of these methods is a priority for future research.

  20. USGS Blind Sample Project: monitoring and evaluating laboratory analytical quality

    USGS Publications Warehouse

    Ludtke, Amy S.; Woodworth, Mark T.

    1997-01-01

    The U.S. Geological Survey (USGS) collects and disseminates information about the Nation's water resources. Surface- and ground-water samples are collected and sent to USGS laboratories for chemical analyses. The laboratories identify and quantify the constituents in the water samples. Random and systematic errors occur during sample handling, chemical analysis, and data processing. Although all errors cannot be eliminated from measurements, the magnitude of their uncertainty can be estimated and tracked over time. Since 1981, the USGS has operated an independent, external, quality-assurance project called the Blind Sample Project (BSP). The purpose of the BSP is to monitor and evaluate the quality of laboratory analytical results through the use of double-blind quality-control (QC) samples. The information provided by the BSP assists the laboratories in detecting and correcting problems in the analytical procedures. The information also can aid laboratory users in estimating the extent that laboratory errors contribute to the overall errors in their environmental data.
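
    Double-blind QC samples with known most-probable values allow per-constituent bias (systematic error) and precision (random error) to be tracked over time. The sketch below is a generic illustration on hypothetical numbers, not the BSP's actual evaluation procedure.

```python
import statistics

# Hypothetical double-blind QC results for one constituent (mg/L) with a known
# most-probable value; names and numbers are illustrative only.
known_value = 2.50
reported = [2.47, 2.55, 2.52, 2.61, 2.44, 2.58, 2.49, 2.53]

bias = statistics.mean(reported) - known_value     # systematic error estimate
precision = statistics.stdev(reported)             # random error estimate
z_scores = [(x - known_value) / precision for x in reported]

print(f"bias = {bias:+.3f} mg/L, precision (1 s.d.) = {precision:.3f} mg/L")
print("flagged results:", [x for x, z in zip(reported, z_scores) if abs(z) > 2])
```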

  1. Electrochemical probing of high-level radioactive waste tanks containing washed sludge and precipitates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bickford, D.F.; Congdon, J.W.; Oblath, S.B.

    1987-01-01

    At the U.S. Department of Energy's Savannah River Plant, corrosion of carbon steel storage tanks containing alkaline, high-level radioactive waste is controlled by specification of limits on waste composition and temperature. Processes for the preparation of waste for final disposal will result in waste with low corrosion inhibitor concentrations and, in some cases, high aromatic organic concentrations, neither of which are characteristic of previous operations. Laboratory tests, conducted to determine minimum corrosion inhibitor levels, indicated pitting of carbon steel near the waterline for proposed storage conditions. In situ electrochemical measurements of full-scale radioactive process demonstrations have been conducted to assess the validity of laboratory tests. Probes included pH, Eh (potential relative to a standard hydrogen electrode), tank potential, and alloy coupons. In situ results are compared to those of the laboratory tests, with particular regard given to simulated solution composition.

  2. [Principles of cooperation between the specialties of internal medicine, pathology and clinical biochemistry].

    PubMed

    Hölzel, W; Baumgarten, R; Fiedler, H; Zimmermann, S

    1985-12-01

    The optimal utilization of the knowledge and possibilities of pathological and clinical biochemistry presumes a close cooperation between it and the clinical specialties. The joint working team of the GDR Society of Internal Medicine and the GDR Society for Clinical Chemistry and Laboratory Diagnostics puts forward for discussion theses on the central points of this cooperation in care, education, further education, postgraduate study and research. Essential tasks in the process of medical care are considered to be balancing the available examination programme, establishing diagnostic programmes, establishing organisational measures, ensuring use according to indication, guaranteeing representative examination material, checking plausibility and interpreting test results. Since the realization of these tasks depends to a large extent on the cooperation of the specialties in education, further education and postgraduate study, the clinician should, during further education, become acquainted with the possibilities, limits and prerequisites of laboratory diagnostic investigations, and the clinical biochemist with the problems of medical care and the value of laboratory diagnosis in the total process of treatment. In the field of research, cooperation is necessary in the clarification of patho-biochemical mechanisms, in the search for suitable laboratory diagnostic parameters for diagnostics and for monitoring the course of disease, and in establishing the validity of laboratory diagnostic parameters and parameter combinations, taking into consideration expense, benefit and risk as well as further diagnostic possibilities.

  3. Guidance for laboratories performing molecular pathology for cancer patients.

    PubMed

    Cree, Ian A; Deans, Zandra; Ligtenberg, Marjolijn J L; Normanno, Nicola; Edsjö, Anders; Rouleau, Etienne; Solé, Francesc; Thunnissen, Erik; Timens, Wim; Schuuring, Ed; Dequeker, Elisabeth; Murray, Samuel; Dietel, Manfred; Groenen, Patricia; Van Krieken, J Han

    2014-11-01

    Molecular testing is becoming an important part of the diagnosis of any patient with cancer. The challenge to laboratories is to meet this need, using reliable methods and processes to ensure that patients receive a timely and accurate report on which their treatment will be based. The aim of this paper is to provide minimum requirements for the management of molecular pathology laboratories. This general guidance should be augmented by the specific guidance available for different tumour types and tests. Preanalytical considerations are important, and careful consideration of the way in which specimens are obtained and reach the laboratory is necessary. Sample receipt and handling follow standard operating procedures, but some alterations may be necessary if molecular testing is to be performed, for instance to control tissue fixation. DNA and RNA extraction can be standardised and should be checked for quality and quantity of output on a regular basis. The choice of analytical method(s) depends on clinical requirements, desired turnaround time, and expertise available. Internal quality control, regular internal audit of the whole testing process, laboratory accreditation, and continual participation in external quality assessment schemes are prerequisites for delivery of a reliable service. A molecular pathology report should accurately convey the information the clinician needs to treat the patient with sufficient information to allow for correct interpretation of the result. Molecular pathology is developing rapidly, and further detailed evidence-based recommendations are required for many of the topics covered here. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  4. Safety management and risk assessment in chemical laboratories.

    PubMed

    Marendaz, Jean-Luc; Friedrich, Kirstin; Meyer, Thierry

    2011-01-01

    The present paper highlights a new safety management program, MICE (Management, Information, Control and Emergency), which has been specifically adapted for the academic environment. The process starts with an exhaustive hazard inventory supported by a platform assembling specific hazards encountered in laboratories and their subsequent classification. A proof of concept is given by a series of implementations in the domain of chemistry targeting workplace health protection. The methodology is expressed through three examples to illustrate how the MICE program can be used to address safety concerns regarding chemicals, strong magnetic fields and nanoparticles in research laboratories. A comprehensive chemical management program is also depicted.

  5. Selection of a Data Acquisition and Controls System Communications and Software Architecture for Johnson Space Center's Space Environment Simulation Laboratory Thermal and Vacuum Test Facilities

    NASA Technical Reports Server (NTRS)

    Jordan, Eric A.

    2004-01-01

    Upgrade of data acquisition and controls systems software at Johnson Space Center's Space Environment Simulation Laboratory (SESL) involved the definition, evaluation and selection of a system communication architecture and software components. A brief discussion of the background of the SESL and its data acquisition and controls systems provides a context for discussion of the requirements for each selection. Further framework is provided as upgrades to these systems accomplished in the 1990s and in 2003 are compared to demonstrate the role that technological advances have had in their improvement. Both of the selections were similar in their three phases; 1) definition of requirements, 2) identification of candidate products and their evaluation and testing and 3) selection by comparison of requirement fulfillment. The candidates for the communication architecture selection embraced several different methodologies which are explained and contrasted. Requirements for this selection are presented and the selection process is described. Several candidates for the software component of the data acquisition and controls system are identified, requirements for evaluation and selection are presented, and the evaluation process is described.

  6. Adverse effect versus quality control of the Fuenzalida-Palacios antirabies vaccine.

    PubMed

    Nogueira, Y L

    1998-01-01

    We evaluated the components of the Fuenzalida-Palacios antirabies vaccine, which is still used in most developing countries in human immunization for treatment and prophylaxis. This vaccine is prepared from newborn mouse brains at 1% concentration. Even though the vaccine is considered to have a low myelin content, it is not fully free of myelin or of other undesirable components that might trigger adverse effects after vaccination. The most severe effect is a post-vaccination neuroparalytic accident associated with Guillain-Barré syndrome. In the present study we demonstrate how the vaccines produced and distributed by different laboratories show different component patterns, with different degrees of impurity and varying protein concentrations, indicating that production processes can vary from one laboratory to another. These differences, which could be resolved using a better quality control process, may affect and impair immunization, with consequent risks and adverse effects after vaccination. We used crossed immunoelectrophoresis to evaluate and demonstrate the possibility of quality control in vaccine production, reducing the risk factors possibly involved in these immunizing products.

  7. 15 CFR 200.103 - Consulting and advisory services.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...., details of design and construction, operational aspects, unusual or extreme conditions, methods of statistical control of the measurement process, automated acquisition of laboratory data, and data reduction... group seminars on the precision measurement of specific types of physical quantities, offering the...

  8. 15 CFR 200.103 - Consulting and advisory services.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...., details of design and construction, operational aspects, unusual or extreme conditions, methods of statistical control of the measurement process, automated acquisition of laboratory data, and data reduction... group seminars on the precision measurement of specific types of physical quantities, offering the...

  9. Old River Control Complex Sedimentation Investigation

    DTIC Science & Technology

    2015-06-01

    [Indexing excerpts only; full abstract not recovered.] Fragments describe efforts to characterize the shoaling processes and sediment transport in the two-river system; a geomorphic assessment; the ... District, New Orleans; and an investigation conducted via a combination of field data collection and laboratory analysis, geomorphic assessments, and ...

  10. A Cyclic Voltammetry Experiment for the Instrumental Analysis Laboratory.

    ERIC Educational Resources Information Center

    Baldwin, Richard P.; And Others

    1984-01-01

    Background information and procedures are provided for experiments that illustrate the nature of cyclic voltammetry and its application in the characterization of organic electrode processes. The experiments also demonstrate the concepts of electrochemical reversibility and diffusion-controlled mass transfer. (JN)

  11. 10 CFR 74.45 - Measurements and measurement control.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... measurements, obtaining samples, and performing laboratory analyses for element concentration and isotope... of random error behavior. On a predetermined schedule, the program shall include, as appropriate: (i) Replicate analyses of individual samples; (ii) Analysis of replicate process samples; (iii) Replicate volume...

  12. 10 CFR 74.45 - Measurements and measurement control.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... measurements, obtaining samples, and performing laboratory analyses for element concentration and isotope... of random error behavior. On a predetermined schedule, the program shall include, as appropriate: (i) Replicate analyses of individual samples; (ii) Analysis of replicate process samples; (iii) Replicate volume...

  13. 10 CFR 74.45 - Measurements and measurement control.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... measurements, obtaining samples, and performing laboratory analyses for element concentration and isotope... of random error behavior. On a predetermined schedule, the program shall include, as appropriate: (i) Replicate analyses of individual samples; (ii) Analysis of replicate process samples; (iii) Replicate volume...

  14. Local area networks, laboratory information management systems, languages, and operating systems in the lab and pilot plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dessy, R.E.

    1983-08-01

    Microprocessors and microcomputers are being incorporated into the instruments and controllers in our laboratory and pilot plant. They enhance both the quality and amount of information that is produced, yet they simultaneously produce vast amounts of information that must be controlled, or scientists and engineers will become high-priced secretaries. The devices need programs that control them in a time frame relevant to the experiment. Simple, expeditious pathways to the generation of software that will run rapidly are essential, or first-class scientists and engineers become second-class system programmers! This paper attempts to develop the vocabulary by which the people involved in this technological revolution can understand and control it. We will examine the elements that synergistically make up the electronic laboratory and pilot plant. More detailed analyses of each area may be found in a series of articles entitled A/C INTERFACE (1-4). Many factors interact in the final system that we bring into our laboratory, yet many purchasers perform only a cursory evaluation of the superficial aspects of the hardware. The integrated lab and pilot plant require that microprocessors, which control and collect, be connected in a LAN to larger processors that can provide LIMS support. Statistics and scientific word processing capabilities then complete the armamentarium. The end result is a system that does things for the user, rather than doing things to him.

  15. Analysis of negative historical control group data from the in vitro micronucleus assay using TK6 cells.

    PubMed

    Lovell, David P; Fellows, Mick; Marchetti, Francesco; Christiansen, Joan; Elhajouji, Azeddine; Hashimoto, Kiyohiro; Kasamoto, Sawako; Li, Yan; Masayasu, Ozaki; Moore, Martha M; Schuler, Maik; Smith, Robert; Stankowski, Leon F; Tanaka, Jin; Tanir, Jennifer Y; Thybaud, Veronique; Van Goethem, Freddy; Whitwell, James

    2018-01-01

    The recent revisions of the Organisation for Economic Co-operation and Development (OECD) genetic toxicology test guidelines emphasize the importance of historical negative controls both for data quality and interpretation. The goal of a HESI Genetic Toxicology Technical Committee (GTTC) workgroup was to collect data from participating laboratories and to conduct a statistical analysis to understand and publish the range of values that are normally seen in experienced laboratories using TK6 cells to conduct the in vitro micronucleus assay. Data from negative control samples from in vitro micronucleus assays using TK6 cells from 13 laboratories were collected using a standard collection form. Although in some cases statistically significant differences can be seen within laboratories for different test conditions, they were very small. The mean incidence of micronucleated cells/1000 cells ranged from 3.2/1000 to 13.8/1000. These almost four-fold differences in micronucleus levels cannot be explained by differences in scoring method, presence or absence of exogenous metabolic activation (S9), length of treatment, presence or absence of cytochalasin B or different solvents used as vehicles. The range of means from the four laboratories using flow cytometry methods (3.7-fold: 3.5-12.9 micronucleated cells/1000 cells) was similar to that from the nine laboratories using other scoring methods (4.3-fold: 3.2-13.8 micronucleated cells/1000 cells). No laboratory could be identified as an outlier or as showing unacceptably high variability. Quality Control (QC) methods applied to analyse the intra-laboratory variability showed that there was evidence of inter-experimental variability greater than would be expected by chance (i.e. over-dispersion). However, in general, this was low. This study demonstrates the value of QC methods in helping to analyse the reproducibility of results, building up a 'normal' range of values, and as an aid to identify variability within a laboratory in order to implement processes to maintain and improve uniformity. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  16. Precision Heating Process

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A heat sealing process was developed by SEBRA based on technology that originated in work with NASA's Jet Propulsion Laboratory. The project involved connecting and transferring blood and fluids between sterile plastic containers while maintaining a closed system. SEBRA markets the PIRF Process to manufacturers of medical catheters. It is a precisely controlled method of heating thermoplastic materials in a mold to form or weld catheters and other products. The process offers advantages in fast, precise welding or shape forming of catheters as well as applications in a variety of other industries.

  17. Multimission Telemetry Visualization (MTV) system: A mission applications project from JPL's Multimedia Communications Laboratory

    NASA Technical Reports Server (NTRS)

    Koeberlein, Ernest, III; Pender, Shaw Exum

    1994-01-01

This paper describes the Multimission Telemetry Visualization (MTV) data acquisition/distribution system. MTV was developed by JPL's Multimedia Communications Laboratory (MCL) and designed to process and display digital, real-time science and engineering data from JPL's Mission Control Center. The MTV system can be accessed from UNIX workstations and PCs over common datacom and telecom networks from worldwide locations. It is designed to lower data distribution costs while increasing data analysis functionality by integrating low-cost, off-the-shelf desktop hardware and software. MTV is expected to significantly lower the cost of real-time data display, processing, and distribution, and to allow for greater spacecraft safety and mission data access.

  18. Laboratory formation of non-cementing, methane hydrate-bearing sands

    USGS Publications Warehouse

    Waite, William F.; Bratton, Peter M.; Mason, David H.

    2011-01-01

    Naturally occurring hydrate-bearing sands often behave as though methane hydrate is acting as a load-bearing member of the sediment. Mimicking this behavior in laboratory samples with methane hydrate likely requires forming hydrate from methane dissolved in water. To hasten this formation process, we initially form hydrate in a free-gas-limited system, then form additional hydrate by circulating methane-supersaturated water through the sample. Though the dissolved-phase formation process can theoretically be enhanced by increasing the pore pressure and flow rate and lowering the sample temperature, a more fundamental concern is preventing clogs resulting from inadvertent methane bubble formation in the circulation lines. Clog prevention requires careful temperature control throughout the circulation loop.

  19. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
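
    A minimal sketch of the sigma-metric point estimate described in this record, assuming the conventional definition sigma = (TEa - |bias|) / CV with all terms in percent at a medical decision concentration; the figures used are placeholders, not values from the article.

        # Minimal sketch of a sigma-metric point estimate for one analyte,
        # assuming the conventional definition sigma = (TEa - |bias|) / CV
        # with all quantities expressed in percent at a decision concentration.

        def sigma_metric(tea_pct, bias_pct, cv_pct):
            if cv_pct <= 0:
                raise ValueError("CV must be positive")
            return (tea_pct - abs(bias_pct)) / cv_pct

        # Illustrative values only (not taken from the article above).
        tea, bias, cv = 10.0, 1.5, 2.0      # percent
        print(f"sigma = {sigma_metric(tea, bias, cv):.1f}")   # -> 4.2 (hypothetical)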

  20. A Multi-mission Event-Driven Component-Based System for Support of Flight Software Development, ATLO, and Operations first used by the Mars Science Laboratory (MSL) Project

    NASA Technical Reports Server (NTRS)

    Dehghani, Navid; Tankenson, Michael

    2006-01-01

    This paper details an architectural description of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission ground data processing components providing uplink, downlink, and data management capabilities which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed based on a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry standard messaging bus is used to transfer information among system components. Components generate standard messages which are used to capture system information, as well as triggers to support the event-driven architecture of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data, and for supporting automation for many mission operations processes.
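
    The event-driven, message-bus style summarized above can be illustrated with a toy publish/subscribe sketch. The component and topic names below are hypothetical and do not reflect actual MPCS interfaces, which are implemented in Java on an industry-standard messaging bus.

        # Toy publish/subscribe bus illustrating an event-driven component design.
        # Component and message names are hypothetical, not MPCS interfaces.
        from collections import defaultdict

        class MessageBus:
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, topic, handler):
                self._subscribers[topic].append(handler)

            def publish(self, topic, message):
                for handler in self._subscribers[topic]:
                    handler(message)

        bus = MessageBus()

        # A downlink component reacts to telemetry-frame events...
        bus.subscribe("telemetry.frame", lambda m: print("decoding frame", m["id"]))
        # ...and a data-management component archives the same events.
        bus.subscribe("telemetry.frame", lambda m: print("archiving frame", m["id"]))

        # An upstream producer simply publishes standard messages as events occur.
        for frame_id in range(3):
            bus.publish("telemetry.frame", {"id": frame_id, "rate": "high"})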

  1. Propulsive Reaction Control System Model

    NASA Technical Reports Server (NTRS)

    Brugarolas, Paul; Phan, Linh H.; Serricchio, Frederick; San Martin, Alejandro M.

    2011-01-01

    This software models a propulsive reaction control system (RCS) for guidance, navigation, and control simulation purposes. The model includes the drive electronics, the electromechanical valve dynamics, the combustion dynamics, and thrust. This innovation follows the Mars Science Laboratory entry reaction control system design, and has been created to meet the Mars Science Laboratory (MSL) entry, descent, and landing simulation needs. It has been built to be plug-and-play on multiple MSL testbeds [analysis, Monte Carlo, flight software development, hardware-in-the-loop, and ATLO (assembly, test and launch operations) testbeds]. This RCS model is a C language program. It contains two main functions: the RCS electronics model function that models the RCS FPGA (field-programmable-gate-array) processing and commanding of the RCS valve, and the RCS dynamic model function that models the valve and combustion dynamics. In addition, this software provides support functions to initialize the model states, set parameters, access model telemetry, and access calculated thruster forces.
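
    The two-part structure described in this record (an electronics model that turns FPGA commands into valve commands, and a dynamics model that turns valve state into thrust, plus support functions for parameters and telemetry) is outlined below as a hypothetical Python sketch of the interface shape only; the actual model is a C program, and the first-order lags and numbers here are placeholders, not the MSL RCS equations.

        # Hypothetical outline, in Python, of the two-part model structure described
        # above: an electronics model that turns FPGA commands into valve commands,
        # and a dynamics model that turns valve state into thrust.  The first-order
        # lags and constants are placeholders, not the MSL RCS equations.

        class RcsThrusterModel:
            def __init__(self, max_thrust_N=250.0, valve_tau_s=0.01, thrust_tau_s=0.02):
                self.max_thrust_N = max_thrust_N
                self.valve_tau_s = valve_tau_s
                self.thrust_tau_s = thrust_tau_s
                self.valve_open = 0.0     # 0 = closed, 1 = fully open
                self.thrust_N = 0.0       # "telemetry" accessed by the caller

            def set_parameters(self, **params):          # support function
                for name, value in params.items():
                    setattr(self, name, value)

            def electronics_step(self, fpga_command, dt):
                """Valve responds to the commanded state with a first-order lag."""
                target = 1.0 if fpga_command else 0.0
                self.valve_open += dt / self.valve_tau_s * (target - self.valve_open)

            def dynamics_step(self, dt):
                """Combustion/thrust lags behind the valve opening."""
                target = self.max_thrust_N * self.valve_open
                self.thrust_N += dt / self.thrust_tau_s * (target - self.thrust_N)

        model = RcsThrusterModel()
        for step in range(5):                    # 5 ms of a firing command
            model.electronics_step(fpga_command=True, dt=0.001)
            model.dynamics_step(dt=0.001)
        print(f"thrust after 5 ms: {model.thrust_N:.1f} N")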

  2. The relation of mothers' controlling vocalizations to children's intrinsic motivation.

    PubMed

    Deci, E L; Driver, R E; Hotchkiss, L; Robbins, R J; Wilson, I M

    1993-04-01

    Twenty-six mother-child dyads played together in a laboratory setting. Play sessions were surreptitiously videotaped (with mothers' permission), and each maternal vocalization was transcribed and coded, first into 1 of 24 categories and then ipso facto into one of three supercategories--namely, controlling, autonomy supportive, and neutral. The degree of mothers' controllingness was calculated as the percentage of vocalizations coded as controlling. This index was correlated with the intrinsic motivation of their 6- or 7-year-old children, as assessed primarily by the free-choice behavioral measure and secondarily by a child self-report measure of interest and liking for the task. Both correlations were significantly negative, thereby suggesting that the robust laboratory findings of a negative relation between controlling contexts and individuals' intrinsic motivation are directly generalizable to the domain of parenting. Results are discussed in terms of the processes that undermine intrinsic motivation and the means through which parental controllingness is communicated.

  3. The AMTEX Partnership{trademark}. First quarter report, Fiscal year 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-12-01

The AMTEX Partnership is a collaborative research and development program among the US Integrated Textile Industry, DOE, the National Laboratories, other federal agencies and laboratories, and universities. The goal of AMTEX is to strengthen the competitiveness of this vital industry, thereby preserving and creating US jobs. Topics in this quarter's report include: computer-aided fabric evaluation, cotton biotechnology, demand activated manufacturing architecture, electronic embedded fingerprints, on-line process control in flexible fiber manufacturing, rapid cutting, sensors for agile manufacturing, and textile resource conservation.

  4. A Systems Approach towards an Intelligent and Self-Controlling Platform for Integrated Continuous Reaction Sequences**

    PubMed Central

    Ingham, Richard J; Battilocchio, Claudio; Fitzpatrick, Daniel E; Sliwinski, Eric; Hawkins, Joel M; Ley, Steven V

    2015-01-01

    Performing reactions in flow can offer major advantages over batch methods. However, laboratory flow chemistry processes are currently often limited to single steps or short sequences due to the complexity involved with operating a multi-step process. Using new modular components for downstream processing, coupled with control technologies, more advanced multi-step flow sequences can be realized. These tools are applied to the synthesis of 2-aminoadamantane-2-carboxylic acid. A system comprising three chemistry steps and three workup steps was developed, having sufficient autonomy and self-regulation to be managed by a single operator. PMID:25377747

  5. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

Most of the procedures in the neutron activation analysis (NAA) process that has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient, especially the sample counting and measurement process: the sample must be changed and the measurement software set up for every one-hour counting time, and both steps are performed manually for every sample. Hence, an automatic sample changer system (ASC) consisting of hardware and software has been developed to automate the counting of up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
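
    The counting loop described above (load a sample, run a one-hour measurement, repeat for up to 30 samples) can be sketched as follows. The hardware and measurement calls are invented stand-ins for the ASC driver and the GammaVision call; the real control software is a LabVIEW application.

        # Hypothetical sketch of the automated counting loop described above.
        # move_sample_to_detector() and run_gamma_measurement() are invented
        # stand-ins for the ASC hardware driver and the GammaVision call;
        # the real implementation is a LabVIEW application.
        import time

        COUNT_TIME_S = 3600          # one-hour counting time per sample

        def move_sample_to_detector(position):
            print(f"ASC: loading sample {position}")        # placeholder for hardware I/O

        def run_gamma_measurement(position, count_time_s):
            print(f"measuring sample {position} for {count_time_s} s")
            time.sleep(0.1)                                 # stand-in for the real count
            return f"sample_{position:02d}.spc"             # hypothetical spectrum file

        def count_batch(n_samples=30):
            spectra = []
            for position in range(1, n_samples + 1):
                move_sample_to_detector(position)
                spectra.append(run_gamma_measurement(position, COUNT_TIME_S))
            return spectra

        if __name__ == "__main__":
            print(count_batch(n_samples=3))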

  6. On the nature of extraversion: variation in conditioned contextual activation of dopamine-facilitated affective, cognitive, and motor processes

    PubMed Central

    Depue, Richard A.; Fu, Yu

    2013-01-01

    Research supports an association between extraversion and dopamine (DA) functioning. DA facilitates incentive motivation and the conditioning and incentive encoding of contexts that predict reward. Therefore, we assessed whether extraversion is related to the efficacy of acquiring conditioned contextual facilitation of three processes that are dependent on DA: motor velocity, positive affect, and visuospatial working memory. We exposed high and low extraverts to three days of association of drug reward (methylphenidate, MP) with a particular laboratory context (Paired group), a test day of conditioning, and three days of extinction in the same laboratory. A Placebo group and an Unpaired group (that had MP in a different laboratory context) served as controls. Conditioned contextual facilitation was assessed by (i) presenting video clips that varied in their pairing with drug and laboratory context and in inherent incentive value, and (ii) measuring increases from day 1 to Test day on the three processes above. Results showed acquisition of conditioned contextual facilitation across all measures to video clips that had been paired with drug and laboratory context in the Paired high extraverts, but no conditioning in the Paired low extraverts (nor in either of the control groups). Increases in the Paired high extraverts were correlated across the three measures. Also, conditioned facilitation was evident on the first day of extinction in Paired high extraverts, despite the absence of the unconditioned effects of MP. By the last day of extinction, responding returned to day 1 levels. The findings suggest that extraversion is associated with variation in the acquisition of contexts that predict reward. Over time, this variation may lead to differences in the breadth of networks of conditioned contexts. Thus, individual differences in extraversion may be maintained by activation of differentially encoded central representations of incentive contexts that predict reward. PMID:23785330

  7. The Subsurface Flow and Transport Laboratory: A New Department of Energy User's Facility for Intermediate-Scale Experimentation

    NASA Astrophysics Data System (ADS)

    Wietsma, T. W.; Oostrom, M.; Foster, N. S.

    2003-12-01

    Intermediate-scale experiments (ISEs) for flow and transport are a valuable tool for simulating subsurface features and conditions encountered in the field at government and private sites. ISEs offer the ability to study, under controlled laboratory conditions, complicated processes characteristic of mixed wastes and heterogeneous subsurface environments, in multiple dimensions and at different scales. ISEs may, therefore, result in major cost savings if employed prior to field studies. A distinct advantage of ISEs is that researchers can design physical and/or chemical heterogeneities in the porous media matrix that better approximate natural field conditions and therefore address research questions that contain the additional complexity of processes often encountered in the natural environment. A new Subsurface Flow and Transport Laboratory (SFTL) has been developed for ISE users in the Environmental Spectroscopy & Biogeochemistry Facility in the Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). The SFTL offers a variety of columns and flow cells, a new state-of-the-art dual-energy gamma system, a fully automated saturation-pressure apparatus, and analytical equipment for sample processing. The new facility, including qualified staff, is available for scientists interested in collaboration on conducting high-quality flow and transport experiments, including contaminant remediation. Close linkages exist between the SFTL and numerical modelers to aid in experimental design and interpretation. This presentation will discuss the facility and outline the procedures required to submit a proposal to use this unique facility for research purposes. The W. R. Wiley Environmental Molecular Sciences Laboratory, a national scientific user facility, is sponsored by the U.S. Department of Energy's Office of Biological and Environmental Research and located at Pacific Northwest National Laboratory.

  8. Variability in baseline laboratory measurements of the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil).

    PubMed

    Ladwig, R; Vigo, A; Fedeli, L M G; Chambless, L E; Bensenor, I; Schmidt, M I; Vidigal, P G; Castilhos, C D; Duncan, B B

    2016-08-01

Multi-center epidemiological studies must ascertain that their measurements are accurate and reliable. For laboratory measurements, reliability can be assessed through investigation of the reproducibility of measurements in the same individual. In this paper, we present results from the quality control analysis of the baseline laboratory measurements of the ELSA-Brasil study. The study enrolled 15,105 civil servants at 6 research centers in 3 regions of Brazil between 2008 and 2010, with multiple biochemical analytes being measured at a central laboratory. Quality control was ascertained through standard laboratory evaluation of intra- and inter-assay variability and test-retest analysis in a subset of randomly chosen participants. An additional sample of urine or blood was collected from these participants, and these samples were handled in the same manner as the original ones, locally and at the central laboratory. Reliability was assessed with the intraclass correlation coefficient (ICC), estimated through a random effects model. Coefficients of variation (CV) and Bland-Altman plots were additionally used to assess measurement variability. Laboratory intra- and inter-assay CVs varied from 0.86% to 7.77%. From the test-retest analyses, the ICCs were high for the majority of the analytes. Notably lower ICCs were observed for serum sodium (ICC=0.50; 95%CI=0.31-0.65) and serum potassium (ICC=0.73; 95%CI=0.60-0.83), due to the small biological range of these analytes. The CVs ranged from 1 to 14%. The Bland-Altman plots confirmed these results. The quality control analyses showed that the collection, processing and measurement protocols utilized in ELSA-Brasil produced reliable biochemical measurements.
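
    For the test-retest statistics used in this record, a minimal illustration is given below: a one-way random-effects intraclass correlation and a within-pair coefficient of variation computed from duplicate measurements. The ELSA-Brasil analysis fit a random effects model to its own data; the formulas here are the standard textbook forms and the duplicate values are invented.

        # Minimal sketch (illustrative only) of a one-way random-effects ICC and a
        # within-pair CV for duplicate (test-retest) laboratory measurements.
        import numpy as np

        def icc_oneway(pairs):
            """ICC(1,1) from an n x k array of repeated measurements."""
            x = np.asarray(pairs, dtype=float)
            n, k = x.shape
            grand = x.mean()
            ms_between = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
            return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

        def duplicate_cv_percent(pairs):
            """Within-pair CV (%) from duplicate measurements."""
            x = np.asarray(pairs, dtype=float)
            sd_within = np.sqrt(((x[:, 0] - x[:, 1]) ** 2 / 2).mean())
            return 100.0 * sd_within / x.mean()

        toy = [[5.1, 5.0], [4.8, 4.9], [5.6, 5.4], [5.0, 5.2]]   # invented duplicates
        print(f"ICC = {icc_oneway(toy):.2f}")
        print(f"CV% = {duplicate_cv_percent(toy):.1f}")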

  9. Outbreak of Neisseria meningitidis C in workers at a large food-processing plant in Brazil: challenges of controlling disease spread to the larger community.

    PubMed

    Iser, B P M; Lima, H C A V; de Moraes, C; de Almeida, R P A; Watanabe, L T; Alves, S L A; Lemos, A P S; Gorla, M C O; Gonçalves, M G; Dos Santos, D A; Sobel, J

    2012-05-01

An outbreak of meningococcal disease (MD) with severe morbidity and mortality was investigated in midwestern Brazil in order to identify control measures. An MD case was defined as isolation of Neisseria meningitidis, or detection of polysaccharide antigen in a sterile site, or presence of clinical purpura fulminans, or an epidemiological link with a laboratory-confirmed case-patient, between June and August 2008. In 8 out of 16 MD cases studied, serogroup C ST103 complex was identified. Five (31%) cases had neurological findings and five (31%) died. The attack rate was 12 cases/100 000 town residents and 60 cases/100 000 employees in a large local food-processing plant. We conducted a matched case-control study of eight primary laboratory-confirmed cases (1:4). Factors associated with illness in single variable analysis were work at the processing plant [matched odds ratio (mOR) 22, 95% confidence interval (CI) 2·3-207·7, P<0·01], and residing <1 year in Rio Verde (mOR 7, 95% CI 1·11-43·9, P<0·02). Mass vaccination (>10 000 plant employees) stopped propagation in the plant, but not in the larger community.

  10. Practical issues in implementing whole-genome-sequencing in routine diagnostic microbiology.

    PubMed

    Rossen, J W A; Friedrich, A W; Moran-Gilad, J

    2018-04-01

Next generation sequencing (NGS) is increasingly being used in clinical microbiology. Like every new technology adopted in microbiology, the integration of NGS into clinical and routine workflows must be carefully managed. To review the practical aspects of implementing bacterial whole genome sequencing (WGS) in routine diagnostic laboratories. Review of the literature and expert opinion. In this review, we discuss when and how to integrate whole genome sequencing (WGS) in the routine workflow of the clinical laboratory. In addition, as microbiology laboratories have to adhere to various national and international regulations and criteria for their accreditation, we deliberate on quality control issues for using WGS in microbiology, including the importance of proficiency testing. Furthermore, the current and future place of this technology in the diagnostic hierarchy of microbiology is described, as well as the necessity of maintaining backwards compatibility with already established methods. Finally, we speculate on whether WGS can entirely replace routine microbiology in the future, and on the tension between sequencers designed to process multiple samples in parallel and the one-by-one processing of samples preferred for optimal diagnosis. Special reference is made to the cost and turnaround time of WGS in diagnostic laboratories. Further development is required to improve the workflow for WGS, in particular to shorten the turnaround time, reduce costs, and streamline downstream data analyses. Only when these processes reach maturity will reliance on WGS for routine patient management and infection control management become feasible, enabling the transformation of clinical microbiology into a genome-based and personalized diagnostic field. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  11. Experimental comparison of conventional and nonlinear model-based control of a mixing tank

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haeggblom, K.E.

    1993-11-01

In this case study concerning control of a laboratory-scale mixing tank, conventional multiloop single-input single-output (SISO) control is compared with "model-based" control, where the nonlinearity and multivariable characteristics of the process are explicitly taken into account. It is shown, especially if the operating range of the process is large, that the two outputs (level and temperature) cannot be adequately controlled by multiloop SISO control even if gain scheduling is used. By nonlinear multiple-input multiple-output (MIMO) control, on the other hand, very good control performance is obtained. The basic approach to nonlinear control used in this study is first to transform the process into a globally linear and decoupled system, and then to design controllers for this system. Because of the properties of the resulting MIMO system, the controller design is very easy. Two nonlinear control system designs based on a steady-state and a dynamic model, respectively, are considered. In the dynamic case, both setpoint tracking and disturbance rejection can be addressed separately.
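
    The "transform into a globally linear and decoupled system, then design controllers" approach summarized above can be illustrated with a small input-output linearization sketch for a level/temperature mixing tank. The model equations, parameters and proportional gains below are generic textbook assumptions, not the apparatus or the controller design from this study.

        # Illustrative input-output linearization of a two-input mixing tank
        # (level h, temperature T) with hot/cold inlet flows as manipulated inputs.
        # Model, parameters and gains are generic assumptions, not the laboratory
        # apparatus or controller design from the study above.
        import numpy as np

        A = 0.05            # tank cross-section, m^2
        K_OUT = 0.002       # outlet coefficient, m^3/s per sqrt(m)
        T_HOT, T_COLD = 60.0, 15.0          # inlet temperatures, C

        def controller(h, T, h_sp, T_sp, kp_h=0.05, kp_T=0.05):
            """Pick desired dh/dt and dT/dt, then invert the model for the two flows."""
            v = np.array([kp_h * (h_sp - h),          # desired level rate
                          kp_T * (T_sp - T)])         # desired temperature rate
            f_out = K_OUT * np.sqrt(max(h, 1e-6))
            # [ 1,        1       ] [F_hot ]   [ A*v1 + f_out ]
            # [ Th - T,   Tc - T  ] [F_cold] = [ A*h*v2       ]
            M = np.array([[1.0, 1.0], [T_HOT - T, T_COLD - T]])
            b = np.array([A * v[0] + f_out, A * h * v[1]])
            return np.clip(np.linalg.solve(M, b), 0.0, 0.01)   # non-negative, bounded flows

        h, T = 0.30, 30.0                    # initial level (m) and temperature (C)
        dt = 1.0
        for _ in range(600):                 # 10 minutes of simulated operation
            F_hot, F_cold = controller(h, T, h_sp=0.40, T_sp=45.0)
            f_out = K_OUT * np.sqrt(h)
            h += dt * (F_hot + F_cold - f_out) / A
            T += dt * (F_hot * (T_HOT - T) + F_cold * (T_COLD - T)) / (A * h)
        print(f"final level {h:.2f} m, temperature {T:.1f} C")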

  12. The Role of Intelligence Quotient and Emotional Intelligence in Cognitive Control Processes

    PubMed Central

    Checa, Purificación; Fernández-Berrocal, Pablo

    2015-01-01

The relationship between intelligence quotient (IQ) and cognitive control processes has been extensively established. Several studies have shown that IQ correlates with cognitive control abilities, such as interference suppression, as measured with experimental tasks like the Stroop and Flanker tasks. By contrast, there is a debate about the role of Emotional Intelligence (EI) in individuals' cognitive control abilities. The aim of this study is to examine the relation between IQ and EI, and cognitive control abilities evaluated by a typical laboratory cognitive control task, the Stroop task. Results show a negative correlation between IQ and the interference suppression index, the ability to inhibit processing of irrelevant information. However, the Managing Emotions dimension of EI measured by the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT), but not self-reported EI, negatively correlates with the impulsivity index, the premature execution of the response. These results suggest that not only is IQ crucial, but also competences related to EI are essential to human cognitive control processes. Limitations and implications of these results are also discussed. PMID:26648901

  13. Application of the Toyota Production System improves core laboratory operations.

    PubMed

    Rutledge, Joe; Xu, Min; Simpson, Joanne

    2010-01-01

    To meet the increased clinical demands of our hospital expansion, improve quality, and reduce costs, our tertiary care, pediatric core laboratory used the Toyota Production System lean processing to reorganize our 24-hour, 7 d/wk core laboratory. A 4-month, consultant-driven process removed waste, led to a physical reset of the space to match the work flow, and developed a work cell for our random access analyzers. In addition, visual controls, single piece flow, standard work, and "5S" were instituted. The new design met our goals as reflected by achieving and maintaining improved turnaround time (TAT; mean for creatinine reduced from 54 to 23 minutes) with increased testing volume (20%), monetary savings (4 full-time equivalents), decreased variability in TAT, and better space utilization (25% gain). The project had the unanticipated consequence of eliminating STAT testing because our in-laboratory TAT for routine testing was less than our prior STAT turnaround goal. The viability of this approach is demonstrated by sustained gains and further PDCA (Plan, Do, Check, Act) improvements during the 4 years after completion of the project.

  14. Cardiovascular reactivity in real life settings: Measurement, mechanisms and meaning

    PubMed Central

    Zanstra, Ydwine Jieldouw; Johnston, Derek William

    2011-01-01

Cardiovascular reactivity to stress is most commonly studied in the laboratory. Laboratory stressors may have limited ecological validity due to the many constraints operating in controlled environments. This paper focuses on paradigms that involve the measurement of cardiovascular reactions to stress in real life using ambulatory monitors. Probably the most commonly used paradigm in this field is to measure the response to a specific real life stressor, such as sitting an exam or public speaking. A more general approach has been to derive a measure of CV variability, testing the hypothesis that more reactive participants will have more variable heart rate or blood pressure. Alternatively, self-reports of the participants' perceived stress, emotion or demands may be linked to simultaneously collected ambulatory measures of cardiovascular parameters. This paper examines the following four questions: (1) What is the form and what are the determinants of stress-induced CV reactivity in real life? (2) What are the psychophysiological processes underlying heart rate and blood pressure reactivity in real life? (3) Does CV reactivity determined in the laboratory predict CV reactivity in real life? (4) Are ambulatory cardiovascular measures predictive of cardiovascular disease? It is concluded that the hemodynamic processes that underlie the blood pressure response can reliably be measured in real life and that the psychophysiological relationships seen in the laboratory have been obtained in real life as well. Studies examining the effects of specific real life stressors show that responses obtained in real life are often larger than those obtained in the laboratory. Subjective ratings of stress, emotion and cognitive determinants of real life stress (e.g. demand, reward and control) also relate to real life CV responses. Surprisingly, ambulatory studies on real life cardiovascular reactivity to stress as a predictor of cardiovascular disease are rare. Measuring the CV response to stress in real life may provide a better measure of the stress-related processes that are hypothesized to cause disease than is possible in the laboratory. In addressing these questions, we review studies that we believe are representative of the field; therefore, this review is not comprehensive. PMID:20561941

  15. A Nonlinear, Multiinput, Multioutput Process Control Laboratory Experiment

    ERIC Educational Resources Information Center

    Young, Brent R.; van der Lee, James H.; Svrcek, William Y.

    2006-01-01

    Experience in using a user-friendly software, Mathcad, in the undergraduate chemical reaction engineering course is discussed. Example problems considered for illustration deal with simultaneous solution of linear algebraic equations (kinetic parameter estimation), nonlinear algebraic equations (equilibrium calculations for multiple reactions and…
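
    As a small counterpart to the kinetic parameter estimation by simultaneous linear equations mentioned above, the sketch below fits a first-order rate constant by linear least squares on log-transformed concentrations. The concentration-time data are invented, and the original course exercises were posed in Mathcad rather than Python.

        # Illustrative linear least-squares estimation of a first-order rate constant
        # from concentration-time data, ln(C/C0) = -k t.  The data are invented;
        # the original course exercises were posed in Mathcad.
        import numpy as np

        t = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])          # min
        C = np.array([1.00, 0.74, 0.55, 0.41, 0.30, 0.22])      # mol/L (synthetic)

        # Linear model: y = -k t, with y = ln(C/C0); solve min ||A x - y|| for x = k.
        y = np.log(C / C[0])
        A = -t.reshape(-1, 1)
        k, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
        print(f"estimated k = {k[0]:.3f} 1/min")                 # ~0.15 1/min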

  16. 40 CFR 270.30 - Conditions applicable to all permits.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... operator staffing and training, and adequate laboratory and process controls, including appropriate quality... any location. (j) Monitoring and records. (1) Samples and measurements taken for the purpose of... recordings for continuous monitoring instrumentation, copies of all reports required by this permit, the...

  17. 40 CFR 270.30 - Conditions applicable to all permits.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... operator staffing and training, and adequate laboratory and process controls, including appropriate quality... any location. (j) Monitoring and records. (1) Samples and measurements taken for the purpose of... recordings for continuous monitoring instrumentation, copies of all reports required by this permit, the...

  18. SIMPLE WAYS TO IMPROVE PH AND ALKALINITY MEASUREMENTS FOR WATER UTILITIES AND LABORATORIES

    EPA Science Inventory

    Both pH and total alkalinity determinations are critical in characterizing chemical properties of water, being important to implementing good process control, determining corrosivity and other water quality properties, and assessing changes in water characteristics. Poor charac...

  19. Short communication: Analytical method and amount of preservative added to milk samples may alter milk urea nitrogen measurements.

    PubMed

    Weeks, Holley L; Hristov, Alexander N

    2017-02-01

Milk urea N (MUN) is used by dairy nutritionists and producers to monitor dietary protein intake and is indicative of N utilization in lactating dairy cows. Two experiments were conducted to explore discrepancies in MUN results provided by 3 milk processing laboratories using different methods. An additional experiment was conducted to evaluate the effect of 2-bromo-2-nitropropane-1,3-diol (bronopol) on MUN analysis. In experiment 1, 10 replicates of bulk tank milk samples, collected from the Pennsylvania State University's Dairy Center over 5 consecutive days, were sent to 3 milk processing laboratories in Pennsylvania. Average MUN differed between laboratory A (14.9 ± 0.40 mg/dL; analyzed on MilkoScan 4000; Foss, Hillerød, Denmark), laboratory B (6.5 ± 0.17 mg/dL; MilkoScan FT + 6000), and laboratory C (7.4 ± 0.36 mg/dL; MilkoScan 6000). In experiment 2, milk samples were spiked with urea at 0 (7.3 to 15.0 mg/dL, depending on the laboratory analyzing the samples), 17.2, 34.2, and 51.5 mg/dL of milk. Two 35-mL samples from each urea level were sent to the 3 laboratories used in experiment 1. Average analyzed MUN was greater than predicted (calculated for each laboratory based on the control; 0 mg of added urea): for laboratory A (23.2 vs. 21.0 mg/dL), laboratory B (18.0 vs. 13.3 mg/dL), and laboratory C (20.6 vs. 15.2 mg/dL). In experiment 3, replicated milk samples were preserved with 0 to 1.35 mg of bronopol/mL of milk and submitted to one milk processing laboratory that analyzed MUN using 2 different methods. Milk samples with increasing amounts of bronopol ranged in MUN concentration from 7.7 to 11.9 mg/dL and from 9.0 to 9.3 mg/dL when analyzed on the MilkoScan 4000 or the CL 10 (EuroChem, Moscow, Russia), respectively. In conclusion, measured MUN concentrations varied due to the analytical procedure used by milk processing laboratories and were affected by the amount of bronopol used to preserve the milk sample when milk was analyzed using a mid-infrared analyzer. Thus, it is important to maintain consistency in milk sample preservation and analysis to ensure precision of MUN results. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  20. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.

  1. Decreasing laboratory turnaround time and patient wait time by implementing process improvement methodologies in an outpatient oncology infusion unit.

    PubMed

    Gjolaj, Lauren N; Gari, Gloria A; Olier-Pino, Angela I; Garcia, Juan D; Fernandez, Gustavo L

    2014-11-01

Prolonged patient wait times in the outpatient oncology infusion unit indicated a need to streamline phlebotomy processes by using existing resources to decrease laboratory turnaround time and improve patient wait time. Using the DMAIC (define, measure, analyze, improve, control) method, a project to streamline phlebotomy processes within the outpatient oncology infusion unit in an academic Comprehensive Cancer Center, known as the Comprehensive Treatment Unit (CTU), was completed. Laboratory turnaround time for patients who needed same-day lab and CTU services and wait time for all CTU patients were tracked for 9 weeks. During the pilot, the wait time from arrival at the CTU to sitting in the treatment area decreased by 17% for all patients treated in the CTU. A total of 528 patients were seen at the CTU phlebotomy location, representing 16% of the total patients who received treatment in the CTU, with a mean turnaround time of 24 minutes compared with a baseline turnaround time of 51 minutes. Streamlining workflows and placing a phlebotomy station inside the CTU decreased laboratory turnaround times by 53% for patients requiring same-day lab and CTU services. The success of the pilot project prompted the team to make the station a permanent fixture. Copyright © 2014 by American Society of Clinical Oncology.

  2. Production of Copper-Plated Beamline Bellows and Spools for LCLS-II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Katherine M.; Carpenter, Brian C.; Daly, Ed

The SLAC National Accelerator Laboratory is currently constructing a major upgrade to its accelerator, the Linac Coherent Light Source II (LCLS-II). Several Department of Energy national laboratories, including the Thomas Jefferson National Accelerator Facility (JLab) and Fermi National Accelerator Laboratory (FNAL), are participating in this project. The 1.3-GHz cryomodules for this project consist of eight cavities separated by bellows (expansion joints) and spools (tube sections), which are copper plated for RF conduction. JLab is responsible for procurement of these bellows and spools, which are delivered to JLab and FNAL for assembly into cryomodules. Achieving accelerator-grade copper plating is always a challenge and requires careful specification of requirements and application of quality control processes. Due to the demanding technical requirements of this part, JLab implemented procurement strategies to make the process more efficient as well as provide process redundancy. This paper discusses the manufacturing challenges that were encountered and resolved, as well as the strategies that were employed to minimize the impact of any technical issues.

  3. Defining the best quality-control systems by design and inspection.

    PubMed

    Hinckley, C M

    1997-05-01

    Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.

  4. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory

    PubMed Central

    Kumar, B. Vinodh; Mohan, Thuthi

    2018-01-01

    OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that needs to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level and for level 2 IQCs, same four analytes of level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes <6 sigma level, the quality goal index (QGI) was <0.8 indicating the area requiring improvement to be imprecision except cholesterol whose QGI >1.2 indicated inaccuracy. CONCLUSION: This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
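
    The quality goal index (QGI) used in this record to attribute sub-six-sigma performance to imprecision or inaccuracy is commonly computed as QGI = bias / (1.5 x CV), with values below 0.8 read as imprecision, above 1.2 as inaccuracy, and values in between as both. The sketch below assumes that convention; the analyte figures are placeholders, not the study's results.

        # Minimal sketch of the sigma-metric plus quality goal index (QGI) screen
        # described above, assuming QGI = bias / (1.5 * CV); all values in percent.
        # Example figures are placeholders, not the study's results.

        def sigma_metric(tea_pct, bias_pct, cv_pct):
            return (tea_pct - abs(bias_pct)) / cv_pct

        def qgi_flag(bias_pct, cv_pct):
            qgi = abs(bias_pct) / (1.5 * cv_pct)
            if qgi < 0.8:
                return qgi, "improve imprecision"
            if qgi > 1.2:
                return qgi, "improve inaccuracy (trueness)"
            return qgi, "improve both imprecision and inaccuracy"

        analytes = {                # (TEa %, bias %, CV %) -- illustrative only
            "cholesterol": (9.0, 4.5, 2.0),
            "urea":        (15.6, 3.0, 6.0),
        }
        for name, (tea, bias, cv) in analytes.items():
            sigma = sigma_metric(tea, bias, cv)
            if sigma < 6:
                qgi, action = qgi_flag(bias, cv)
                print(f"{name}: sigma={sigma:.1f}, QGI={qgi:.2f} -> {action}")
            else:
                print(f"{name}: sigma={sigma:.1f}, acceptable")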

  5. Human System Simulation in Support of Human Performance Technical Basis at NPPs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Gertman; Katya Le Blanc; alan mecham

    2010-06-01

This paper focuses on strategies and progress toward establishing the Idaho National Laboratory's (INL's) Human Systems Simulator Laboratory at the Center for Advanced Energy Studies (CAES), a consortium of Idaho's state universities. The INL is one of the National Laboratories of the US Department of Energy. One of the first planned applications for the Human Systems Simulator Laboratory is implementation of a dynamic nuclear power plant (NPP) simulation, in which studies of operator workload, situation awareness, performance and preference will be carried out in simulated control rooms, including nuclear power plant control rooms. Simulation offers a means by which to review operational concepts, improve design practices and provide a technical basis for licensing decisions. In preparation for the next generation power plant and current government and industry efforts in support of light water reactor sustainability, human operators will be attached to a suite of physiological measurement instruments and, in combination with traditional human factors measurement techniques, carry out control room tasks in simulated advanced digital and hybrid analog/digital control rooms. The current focus of the Human Systems Simulator Laboratory is building core competence in quantitative and qualitative measurements of situation awareness and workload. Of particular interest is whether introduction of digital systems, including automated procedures, has the potential to reduce workload and enhance safety while improving situation awareness, or whether workload is merely shifted and situation awareness is modified in yet to be determined ways. Data analysis is carried out by engineers and scientists and includes measures of the physical and neurological correlates of human performance. The current approach supports a user-centered design philosophy (see ISO 13407, “Human Centered Design Process for Interactive Systems,” 1999) wherein the context for task performance along with the requirements of the end user are taken into account during the design process and the validity of the design is determined through testing of real end users.

  6. Development of a candidate reference material for adventitious virus detection in vaccine and biologicals manufacturing by deep sequencing

    PubMed Central

    Mee, Edward T.; Preston, Mark D.; Minor, Philip D.; Schepelmann, Silke; Huang, Xuening; Nguyen, Jenny; Wall, David; Hargrove, Stacey; Fu, Thomas; Xu, George; Li, Li; Cote, Colette; Delwart, Eric; Li, Linlin; Hewlett, Indira; Simonyan, Vahan; Ragupathy, Viswanath; Alin, Voskanian-Kordi; Mermod, Nicolas; Hill, Christiane; Ottenwälder, Birgit; Richter, Daniel C.; Tehrani, Arman; Jacqueline, Weber-Lehmann; Cassart, Jean-Pol; Letellier, Carine; Vandeputte, Olivier; Ruelle, Jean-Louis; Deyati, Avisek; La Neve, Fabio; Modena, Chiara; Mee, Edward; Schepelmann, Silke; Preston, Mark; Minor, Philip; Eloit, Marc; Muth, Erika; Lamamy, Arnaud; Jagorel, Florence; Cheval, Justine; Anscombe, Catherine; Misra, Raju; Wooldridge, David; Gharbia, Saheer; Rose, Graham; Ng, Siemon H.S.; Charlebois, Robert L.; Gisonni-Lex, Lucy; Mallet, Laurent; Dorange, Fabien; Chiu, Charles; Naccache, Samia; Kellam, Paul; van der Hoek, Lia; Cotten, Matt; Mitchell, Christine; Baier, Brian S.; Sun, Wenping; Malicki, Heather D.

    2016-01-01

    Background Unbiased deep sequencing offers the potential for improved adventitious virus screening in vaccines and biotherapeutics. Successful implementation of such assays will require appropriate control materials to confirm assay performance and sensitivity. Methods A common reference material containing 25 target viruses was produced and 16 laboratories were invited to process it using their preferred adventitious virus detection assay. Results Fifteen laboratories returned results, obtained using a wide range of wet-lab and informatics methods. Six of 25 target viruses were detected by all laboratories, with the remaining viruses detected by 4–14 laboratories. Six non-target viruses were detected by three or more laboratories. Conclusion The study demonstrated that a wide range of methods are currently used for adventitious virus detection screening in biological products by deep sequencing and that they can yield significantly different results. This underscores the need for common reference materials to ensure satisfactory assay performance and enable comparisons between laboratories. PMID:26709640

  7. LabData database sub-systems for post-processing and quality control of stable isotope and gas chromatography measurements

    NASA Astrophysics Data System (ADS)

    Suckow, A. O.

    2013-12-01

    Measurements need post-processing to obtain results that are comparable between laboratories. Raw data may need to be corrected for blank, memory, drift (change of reference values with time), linearity (dependence of reference on signal height) and normalized to international reference materials. Post-processing parameters need to be stored for traceability of results. State of the art stable isotope correction schemes are available based on MS Excel (Geldern and Barth, 2012; Gröning, 2011) or MS Access (Coplen, 1998). These are specialized to stable isotope measurements only, often only to the post-processing of a special run. Embedding of algorithms into a multipurpose database system was missing. This is necessary to combine results of different tracers (3H, 3He, 2H, 18O, CFCs, SF6...) or geochronological tools (Sediment dating e.g. with 210Pb, 137Cs), to relate to attribute data (submitter, batch, project, geographical origin, depth in core, well information etc.) and for further interpretation tools (e.g. lumped parameter modelling). Database sub-systems to the LabData laboratory management system (Suckow and Dumke, 2001) are presented for stable isotopes and for gas chromatographic CFC and SF6 measurements. The sub-system for stable isotopes allows the following post-processing: 1. automated import from measurement software (Isodat, Picarro, LGR), 2. correction for sample-to sample memory, linearity, drift, and renormalization of the raw data. The sub-system for gas chromatography covers: 1. storage of all raw data 2. storage of peak integration parameters 3. correction for blank, efficiency and linearity The user interface allows interactive and graphical control of the post-processing and all corrections by export to and plot in MS Excel and is a valuable tool for quality control. The sub-databases are integrated into LabData, a multi-user client server architecture using MS SQL server as back-end and an MS Access front-end and installed in four laboratories to date. Attribute data storage (unique ID for each subsample, origin, project context etc.) and laboratory management features are included. Export routines to Excel (depth profiles, time series, all possible tracer-versus tracer plots...) and modelling capabilities are add-ons. The source code is public domain and available under the GNU general public licence agreement (GNU-GPL). References Coplen, T.B., 1998. A manual for a laboratory information management system (LIMS) for light stable isotopes. Version 7.0. USGS open file report 98-284. Geldern, R.v., Barth, J.A.C., 2012. Optimization of instrument setup and post-run corrections for oxygen and hydrogen stable isotope measurements of water by isotope ratio infrared spectroscopy (IRIS). Limnology and Oceanography: Methods 10, 1024-1036. Gröning, M., 2011. Improved water δ2H and δ18O calibration and calculation of measurement uncertainty using a simple software tool. Rapid Communications in Mass Spectrometry 25, 2711-2720. Suckow, A., Dumke, I., 2001. A database system for geochemical, isotope hydrological and geochronological laboratories. Radiocarbon 43, 325-337.
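
    Of the post-processing steps listed above, normalization of raw delta values to international reference materials is the simplest to illustrate: a two-point linear mapping from measured to accepted values of two reference waters, applied to the samples in the run. The sketch below assumes that common two-point scheme with invented numbers; LabData's memory, drift and linearity corrections are more involved and are not reproduced here.

        # Illustrative two-point normalization of raw delta values to the scale
        # defined by two reference waters (e.g. VSMOW2/SLAP2).  Numbers are invented;
        # LabData's memory, drift and linearity corrections are not reproduced here.
        import numpy as np

        # Accepted and measured d18O values (permil) for the two references.
        ref_accepted = {"VSMOW2": 0.0, "SLAP2": -55.5}
        ref_measured = {"VSMOW2": 0.31, "SLAP2": -54.1}

        # Fit delta_true = a * delta_measured + b through the two reference points.
        x = np.array([ref_measured["VSMOW2"], ref_measured["SLAP2"]])
        y = np.array([ref_accepted["VSMOW2"], ref_accepted["SLAP2"]])
        a = (y[1] - y[0]) / (x[1] - x[0])
        b = y[0] - a * x[0]

        samples_raw = np.array([-7.85, -12.40, -3.10])          # permil, invented
        samples_normalized = a * samples_raw + b
        print("slope a =", round(a, 4), "offset b =", round(b, 3))
        print("normalized d18O:", np.round(samples_normalized, 2))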

  8. Obstacles for optimal tuberculosis case detection in primary health centers (PHC) in Sidoarjo district, East Java, Indonesia

    PubMed Central

    Wahyuni, Chatarina U; Budiono; Rahariyani, Lutfia Dwi; Sulistyowati, Muji; Rachmawati, Tety; Djuwari; Yuliwati, Sri; van der Werf, Marieke J

    2007-01-01

    Background Pulmonary tuberculosis (TB) is a major health problem worldwide. Detection of the most infectious cases of tuberculosis – sputum smear-positive pulmonary cases – by passive case finding is an essential component of TB control. The district of Sidoarjo in East Java reported a low case detection rate (CDR) of 14% in 2003. We evaluated the diagnostic process for TB in primary health care centers (PHC) in Sidoarjo district to assess whether problems in identification of TB suspects or in diagnosing TB patients can explain the low CDR. Methods We performed interviews with the staff (general nurse, TB worker, laboratory technician, and head of health center) of the 25 PHCs of Sidoarjo district to obtain information about the knowledge of TB, health education practices, and availability of support services for TB diagnosis. The quality of the laboratory diagnosis was examined by providing 10 slides with a known result to the laboratory technicians for re-examination. Results Eighty percent of the nurses and 84% of the TB workers knew that cough >3 weeks can be a symptom of TB. Only 40% of the nurses knew the cause of TB, few could mention complications of TB and none could mention the duration of infectiousness after start of treatment. Knowledge of TB workers was much better. Information about how to produce a good sputum sample was provided to TB suspects by 76% of the nurses and 84% of the TB workers. Only few provided all information. Fifty-five percent of the 11 laboratory technicians correctly identified all positive slides as positive and 45% correctly identified 100% of the negative slides as negative. All TB workers, one general nurses and 32% of the laboratory technicians had received specific training in TB control. There has been no shortage of TB forms and laboratory materials in 96% of the PHCs. Conclusion The quality of the diagnostic process for TB at PHC in Sidoarjo district should be improved on all levels. Training in TB control of all general nurses and the laboratory technicians that have not received training would be a good first step to enhance diagnosis of TB and to improve the case detection rate. PMID:17760984

  9. 21 CFR 226.58 - Laboratory controls.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 4 2012-04-01 2012-04-01 false Laboratory controls. 226.58 Section 226.58 Food...: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR TYPE A MEDICATED ARTICLES Product Quality Control § 226.58 Laboratory controls. Laboratory controls shall include the establishment of adequate specifications and test...

  10. 21 CFR 226.58 - Laboratory controls.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Laboratory controls. 226.58 Section 226.58 Food...: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR TYPE A MEDICATED ARTICLES Product Quality Control § 226.58 Laboratory controls. Laboratory controls shall include the establishment of adequate specifications and test...

  11. 21 CFR 226.58 - Laboratory controls.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 4 2013-04-01 2013-04-01 false Laboratory controls. 226.58 Section 226.58 Food...: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR TYPE A MEDICATED ARTICLES Product Quality Control § 226.58 Laboratory controls. Laboratory controls shall include the establishment of adequate specifications and test...

  12. Large-area copper indium diselenide (CIS) process, control and manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillespie, T.J.; Lanning, B.R.; Marshall, C.H.

    1997-12-31

Lockheed Martin Astronautics (LMA) has developed a large-area (30x30 cm) sequential CIS manufacturing approach amenable to low-cost photovoltaics (PV) production. A prototype CIS manufacturing system has been designed and built, with compositional uniformity (Cu/In ratio) verified within ±4 atomic percent over the 30x30 cm area. CIS device efficiencies have been measured by the National Renewable Energy Laboratory (NREL) at 7% on a flexible non-sodium-containing substrate and 10% on a soda-lime-silica (SLS) glass substrate. Critical elements of the manufacturing capability include the CIS sequential process selection, uniform large-area material deposition, and in-situ process control. Details of the process and large-area manufacturing approach are discussed and results presented.

  13. Revitalizing chemistry laboratory instruction

    NASA Astrophysics Data System (ADS)

    McBride, Phil Blake

    This dissertation involves research in three major domains of chemical education as partial fulfillment of the requirements for the Ph.D. program in chemistry at Miami University with a major emphasis on chemical education, and concurrent study in organic chemistry. Unit I, Development and Assessment of a Column Chromatography Laboratory Activity, addresses the domain of Instructional Materials Development and Testing. This unit outlines the process of developing a publishable laboratory activity, testing and revising that activity, and subsequently sharing that activity with the chemical education community. A laboratory activity focusing on the separation of methylene blue and sodium fluorescein was developed to demonstrate the effects of both the stationary and mobile phase in conducting a separation. Unit II, Bringing Industry to the Laboratory, addresses the domain of Curriculum Development and Testing. This unit outlines the development of the Chemistry of Copper Mining module, which is intended for use in high school or undergraduate college chemistry. The module uses the learning cycle approach to present the chemistry of the industrial processes of mining copper to the students. The module includes thirteen investigations (three of which are web-based and ten which are laboratory experiments) and an accompanying interactive CD-ROM, which provides an explanation of the chemistry used in copper mining with a virtual tour of an operational copper mine. Unit III, An Alternative Method of Teaching Chemistry. Integrating Lecture and the Laboratory, is a project that addresses the domain of Research in Student Learning. Fundamental Chemistry was taught at Eastern Arizona College as an integrated lecture/laboratory course that met in two-hour blocks on Monday, Wednesday, and Friday. The students taking this integrated course were compared with students taking the traditional 1-hour lectures held on Monday, Wednesday, and Friday, with accompanying 3-hour lab on Tuesday or Thursday. There were 119 students in the test group, 522 students in the Shelton control group and 556 students in the McBride control group. Both qualitative data and quantitative data were collected. A t-test was used to test significance.

  14. Does Copper Metal React with Acetic Acid?

    ERIC Educational Resources Information Center

    DeMeo, Stephen

    1997-01-01

    Describes an activity that promotes analytical thinking and problem solving. Gives students experience with important scientific processes that can be generalized to other new laboratory experiences. Provides students with the opportunity to hypothesize answers, control variables by designing an experiment, and make logical deductions based on…

  15. Microscopic Analysis of Activated Sludge. Training Manual.

    ERIC Educational Resources Information Center

    Office of Water Program Operations (EPA), Cincinnati, OH. National Training and Operational Technology Center.

    This training manual presents material on the use of a compound microscope to analyze microscopic communities present in wastewater treatment processes for operational control. Course topics include: sampling techniques, sample handling, laboratory analysis, identification of organisms, data interpretation, and use of the compound microscope.…

  16. LABORATORY AND COMPUTATIONAL INVESTIGATIONS OF THE ATMOSPHERIC CHEMISTRY OF KEY OXIDATION PRODUCTS CONTROLLING TROPOSPHERIC OZONE FORMATION

    EPA Science Inventory

    Major uncertainties remain in our ability to identify the key reactions and primary oxidation products of volatile hydrocarbons that contribute to ozone formation in the troposphere. To reduce these uncertainties, computational chemistry, mechanistic and process analysis techniqu...

  17. Towards an evaluation framework for Laboratory Information Systems.

    PubMed

    Yusof, Maryati M; Arifin, Azila

    Laboratory testing and reporting are error-prone and redundant due to repeated, unnecessary requests and delayed or missed reactions to laboratory reports. These errors may negatively affect the patient treatment process and clinical decision making. Evaluation of laboratory testing and the Laboratory Information System (LIS) may explain the root causes, improve the testing process and enhance the LIS in supporting that process. This paper discusses a new evaluation framework for LIS that encompasses the laboratory testing cycle and the socio-technical aspects of LIS. A literature review was conducted on the discourses, dimensions and evaluation methods of laboratory testing and LIS. A critical appraisal of the Total Testing Process (TTP) and the human, organization, technology-fit (HOT-fit) evaluation frameworks was undertaken in order to identify error incidents, their contributing factors and preventive actions pertinent to the laboratory testing process and LIS. A new evaluation framework for LIS using a comprehensive and socio-technical approach is outlined. A positive relationship between laboratory and clinical staff resulted in a smooth laboratory testing process, reduced errors and increased process efficiency, whilst effective use of the LIS streamlined the testing processes. The TTP-LIS framework could serve as an assessment as well as a problem-solving tool for the laboratory testing process and system.

  18. Evolution of catalytic function

    NASA Technical Reports Server (NTRS)

    Joyce, G. F.

    1993-01-01

    An RNA-based evolution system was constructed in the laboratory and used to develop RNA enzymes with novel catalytic function. By controlling the nature of the catalytic task that the molecules must perform in order to survive, it is possible to direct the evolving population toward the expression of some desired catalytic behavior. More recently, this system has been coupled to an in vitro translation procedure, raising the possibility of evolving protein enzymes in the laboratory to produce novel proteins with desired catalytic properties. The aim of this line of research is to reduce Darwinian evolution, the fundamental process of biology, to a laboratory procedure that can be made to operate in the service of organic synthesis.

  19. Reverberation Chamber Uniformity Validation and Radiated Susceptibility Test Procedures for the NASA High Intensity Radiated Fields Laboratory

    NASA Technical Reports Server (NTRS)

    Koppen, Sandra V.; Nguyen, Truong X.; Mielnik, John J.

    2010-01-01

    The NASA Langley Research Center's High Intensity Radiated Fields Laboratory has developed a capability based on the RTCA/DO-160F Section 20 guidelines for radiated electromagnetic susceptibility testing in reverberation chambers. Phase 1 of the test procedure utilizes mode-tuned stirrer techniques and E-field probe measurements to validate chamber uniformity, determines chamber loading effects, and defines a radiated susceptibility test process. The test procedure is segmented into numbered operations that are largely software controlled. This document is intended as a laboratory test reference and includes diagrams of test setups, equipment lists, as well as test results and analysis. Phase 2 of development is discussed.

  20. Characterization of X-ray fields at the center for devices and radiological health

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerra, F.

    This talk summarizes the process undertaken by the Center for Devices and Radiological Health (CDRH) for establishing reference x-ray fields in its accredited calibration laboratory. The main considerations and their effects on the calibration parameters are discussed. The characterization of fields may be broken down into two parts: (1) the initial setup of the calibration beam spectra and (2) the ongoing measurements and controls which ensure consistency of the reference fields. The methods employed by CDRH for both these stages and underlying considerations are presented. Uncertainties associated with the various parameters are discussed. Finally, the laboratory's performance, as evidenced by ongoing measurement quality assurance results, is reported.

  1. Microbiological, Geochemical and Hydrologic Processes Controlling Uranium Mobility: An Integrated Field-Scale Subsurface Research Challenge Site at Rifle, Colorado, Quality Assurance Project Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fix, N. J.

    The U.S. Department of Energy (DOE) is cleaning up and/or monitoring large, dilute plumes contaminated by metals, such as uranium and chromium, whose mobility and solubility change with redox status. Field-scale experiments with acetate as the electron donor have stimulated metal-reducing bacteria to effectively remove uranium [U(VI)] from groundwater at the Uranium Mill Tailings Site in Rifle, Colorado. The Pacific Northwest National Laboratory and a multidisciplinary team of national laboratory and academic collaborators have embarked on research proposed for the Rifle site, the object of which is to gain a comprehensive and mechanistic understanding of the microbial factors and associated geochemistry controlling uranium mobility, so that DOE can confidently remediate uranium plumes as well as support stewardship of uranium-contaminated sites. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by the Rifle Integrated Field-Scale Subsurface Research Challenge Project.

  2. Flexible End2End Workflow Automation of Hit-Discovery Research.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. The combination of all relevant subprocesses, whether automated or manually performed and regardless of the organizational unit in which they take place, results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). The approach is based on the standardized process-modeling notation Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of high-throughput screening of previously synthesized compound libraries.

  3. Impact of providing fee data on laboratory test ordering: a controlled clinical trial.

    PubMed

    Feldman, Leonard S; Shihab, Hasan M; Thiemann, David; Yeh, Hsin-Chieh; Ardolino, Margaret; Mandell, Steven; Brotman, Daniel J

    2013-05-27

    Inpatient care providers often order laboratory tests without any appreciation for the costs of the tests. To determine whether we could decrease the number of laboratory tests ordered by presenting providers with test fees at the time of order entry in a tertiary care hospital, without adding extra steps to the ordering process. Controlled clinical trial. Tertiary care hospital. All providers, including physicians and nonphysicians, who ordered laboratory tests through the computerized provider order entry system at The Johns Hopkins Hospital. We randomly assigned 61 diagnostic laboratory tests to an "active" arm (fee displayed) or to a control arm (fee not displayed). During a 6-month baseline period (November 10, 2008, through May 9, 2009), we did not display any fee data. During a 6-month intervention period 1 year later (November 10, 2009, through May 9, 2010), we displayed fees, based on the Medicare allowable fee, for active tests only. We examined changes in the total number of orders placed, the frequency of ordered tests (per patient-day), and total charges associated with the orders according to the time period (baseline vs intervention period) and by study group (active test vs control). For the active arm tests, rates of test ordering were reduced from 3.72 tests per patient-day in the baseline period to 3.40 tests per patient-day in the intervention period (8.59% decrease; 95% CI, -8.99% to -8.19%). For control arm tests, ordering increased from 1.15 to 1.22 tests per patient-day from the baseline period to the intervention period (5.64% increase; 95% CI, 4.90% to 6.39%) (P < .001 for difference over time between active and control tests). Presenting fee data to providers at the time of order entry resulted in a modest decrease in test ordering. Adoption of this intervention may reduce the number of inappropriately ordered diagnostic tests.

  4. Extremely low frequency (ELF) stray magnetic fields of laboratory equipment: a possible co-exposure conducting experiments on cell cultures.

    PubMed

    Gresits, Iván; Necz, Péter Pál; Jánossy, Gábor; Thuróczy, György

    2015-09-01

    Measurements of extremely low frequency (ELF) magnetic fields were conducted in the environment of commercial laboratory equipment in order to evaluate possible co-exposure during experimental processes on cell cultures. Three types of device were evaluated: a cell culture CO2 incubator, a thermostatic water bath and a laboratory shaker table. These devices usually contain electric motors, heating wires and electronic control systems, and may therefore expose cell cultures to undesirable ELF stray magnetic fields. Spatial distributions of the magnetic field, time-domain signal waveforms and frequency spectra (FFT) were processed. Long- and short-term variations of the stray magnetic field were also evaluated under normal use of the investigated laboratory devices. The results show that the equipment under test may add a considerable ELF magnetic field to the ambient environmental magnetic field or to an intentional exposure to ELF, RF or other physical/chemical agents. The maximum stray magnetic fields were higher than 3 µT, 20 µT and 75 µT in the CO2 incubator, in the water bath and on the laboratory shaker table, respectively, with high variation in spatial distribution and over time. Our investigation highlights possible confounding factors in cell culture studies of low-level ELF-EMF exposure due to the stray magnetic fields existing in the ambient environment of laboratory equipment.

  5. The Virtual Robotics Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kress, R.L.; Love, L.J.

    The growth of the Internet has provided a unique opportunity to expand research collaborations between industry, universities, and the national laboratories. The Virtual Robotics Laboratory (VRL) is an innovative program at Oak Ridge National Laboratory (ORNL) that is focusing on the issues related to collaborative research through controlled access to laboratory equipment using the World Wide Web. The VRL will provide different levels of access to selected ORNL laboratory equipment for secondary education programs. In the past, the ORNL Robotics and Process Systems Division has developed state-of-the-art robotic systems for the Army, NASA, the Department of Energy, and the Department of Defense, as well as many other clients. After proof of concept, many of these systems sit dormant in the laboratories. This is not because all possible research topics have been completed, but because contracts have ended and new programs have been generated. In the past, a number of visiting professors have used this equipment for their own research. However, this requires that the professor, and possibly his/her students, spend extended periods at the laboratory facility. In addition, only a very exclusive group of faculty can gain access to the laboratory and hardware. The VRL is a tool that enables extended collaborative efforts without regard to geographic limitations.

  6. Automatic humidification system to support the assessment of food drying processes

    NASA Astrophysics Data System (ADS)

    Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.

    2016-07-01

    This work shows the main features of an automatic humidification system that provides drying air matching the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows control strategies for supplying drying air under specified conditions of temperature and humidity to be created and improved. The development of automatic routines to control and acquire real-time data was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory, where they are scaled and transferred to a memory unit. Using the IP address, it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is the Dynamic Data Exchange (DDE) server, which allows direct communication between the control unit and the computer used to build experimental curves.

  7. Transformation From a Conventional Clinical Microbiology Laboratory to Full Automation.

    PubMed

    Moreno-Camacho, José L; Calva-Espinosa, Diana Y; Leal-Leyva, Yoseli Y; Elizalde-Olivas, Dolores C; Campos-Romero, Abraham; Alcántar-Fernández, Jonathan

    2017-12-22

    To validate the performance, reproducibility, and reliability of BD automated instruments in order to establish a fully automated clinical microbiology laboratory. We used control strains and clinical samples to assess the accuracy, reproducibility, and reliability of the BD Kiestra WCA, BD Phoenix, and BD Bruker MALDI-Biotyper instruments and compared them with previously established conventional methods. The following processes were evaluated: sample inoculation and spreading, colony counts, sorting of cultures, antibiotic susceptibility testing, and microbial identification. The BD Kiestra recovered single colonies in less time than conventional methods (e.g., E. coli, 7 h vs 10 h), and agreement between the two methodologies was excellent for colony counts (κ=0.824) and sorting of cultures (κ=0.821). Antibiotic susceptibility tests performed with the BD Phoenix and disk diffusion showed 96.3% agreement between the two methods. Finally, we compared microbial identification by the BD Phoenix and Bruker MALDI-Biotyper and observed perfect agreement (κ=1) and identification at the species level for control strains. Together, these instruments allow clinical urine samples to be processed in 36 h (effective time). The BD automated technologies have improved performance compared with conventional methods and are suitable for implementation in very busy microbiology laboratories.
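
    The agreement figures quoted above are Cohen's kappa values. As a hedged illustration of how such a coefficient is computed from paired classifications by two methods, the short Python sketch below uses a hypothetical 2x2 confusion table; the counts are invented for illustration and are not data from the study.

    ```python
    # Illustrative computation of Cohen's kappa for agreement between two
    # methods (e.g., automated vs. conventional reading); counts are hypothetical.

    def cohens_kappa(confusion):
        """confusion[i][j] = number of samples rated category i by method A
        and category j by method B."""
        total = sum(sum(row) for row in confusion)
        n_cat = len(confusion)
        # Observed agreement: fraction of samples on the diagonal.
        p_observed = sum(confusion[i][i] for i in range(n_cat)) / total
        # Expected agreement under chance, from the marginal totals.
        p_expected = sum(
            (sum(confusion[i]) / total) * (sum(row[i] for row in confusion) / total)
            for i in range(n_cat)
        )
        return (p_observed - p_expected) / (1 - p_expected)

    # Hypothetical table: 90 concordant "growth", 5 discordant each way, 100 concordant "no growth".
    table = [[90, 5],
             [5, 100]]
    print(f"kappa = {cohens_kappa(table):.3f}")
    ```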

  8. A laboratory rainfall simulator to study the soil erosion and runoff water

    NASA Astrophysics Data System (ADS)

    Cancelo González, Javier; Rial, M. E.; Díaz-Fierros, Francisco

    2010-05-01

    Soil erosion and the composition of runoff water in areas affected by forest fires or subjected to intensive agriculture are important factors to take into account, particularly in sensitive areas like estuaries and rias, which are highly important to the socioeconomic development of some regions. An understanding of runoff production indicates the processes by which pollutants reach streams and also indicates the management techniques that might be used to minimize the discharge of these materials into surface waters. One of the most widely implemented methodologies in soil erosion studies is rainfall simulation. This method can reproduce natural soil degradation processes in field or laboratory experiments. With the aim of improving rainfall-runoff generation, a laboratory rainfall simulator incorporating a fan-like intermittent water jet system for rainfall generation was modified. The major change made to the rainfall simulator consists of a system for coupling stainless steel boxes, whose dimensions are 12 x 20 x 45 centimeters, which allows soil samples to be placed under the rainfall simulator. These boxes were previously used to take soil samples in the field to depths of more than 20 centimeters, causing minimal disturbance to their properties and structure. The new implementation also allows runoff water samples to be collected in two ways: first, the rain water that constitutes the overland flow or direct runoff, and second, the rain water that seeps into the soil by infiltration and contributes to the subsurface runoff. Among the main variables controlled in the rainfall simulations were the soil slope and the intensity and duration of rainfall. To test the prototype, six soil samples were collected at the same sampling point and subjected to rainfall simulations in the laboratory with the same intensity and duration. Two samples constituted the control test and were left fully undisturbed, while four samples were subjected to controlled burnings with different fire severities: two samples burnt at 250°C and the other two at 450°C. Preliminary laboratory data on soil erosion and surface and subsurface runoff were obtained. The water parameters analysed were: pH, electrical conductivity, temperature (at the moment of sampling), suspended sediments, ammonium, nitrates and total nitrogen (Kjeldahl method), measured within 24 hours after sampling.

  9. A Multi-mission Event-Driven Component-Based System for Support of Flight Software Development, ATLO, and Operations first used by the Mars Science Laboratory (MSL) Project

    NASA Technical Reports Server (NTRS)

    Dehghani, Navid; Tankenson, Michael

    2006-01-01

    This viewgraph presentation reviews the architectural description of the Mission Data Processing and Control System (MPCS). MPCS is an event-driven, multi-mission set of ground data processing components providing uplink, downlink, and data management capabilities, and it will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is designed around these factors: (1) an enabling plug-and-play architecture; (2) strong inheritance from GDS components that have been developed for other flight projects (MER, MRO, DAWN, MSAP) and are currently being used in operations and ATLO; and (3) components that are Java-based, platform independent, and designed to consume and produce XML-formatted data.

  10. Hybrid nanosensor for colorimetric and ultrasensitive detection of nuclease contaminations

    NASA Astrophysics Data System (ADS)

    Cecere, Paola; Valentini, Paola; Pompa, Pier Paolo

    2016-04-01

    Nucleases are ubiquitous enzymes that degrade DNA or RNA, and they can therefore jeopardize the outcome of molecular biology experiments involving nucleic acids. We propose a colorimetric test for the naked-eye detection of nuclease contaminations. The system uses a hybrid nanosensor based on gold nanoparticles functionalized with DNA probes. Our assay is rapid, instrument-free, simple and low-cost. Moreover, it reaches a sensitivity equal to or better than that of commercial kits and offers several additional advantages, making it very competitive with real market potential. This test will be relevant to routine process monitoring in scientific laboratories and to quality control in clinical laboratories and industrial processes, allowing the simultaneous detection of nucleases with different substrate specificities and large-scale screening.

  11. Assessing accuracy and precision for field and laboratory data: a perspective in ecosystem restoration

    USGS Publications Warehouse

    Stapanian, Martin A.; Lewis, Timothy E; Palmer, Craig J.; Middlebrook Amos, Molly

    2016-01-01

    Unlike most laboratory studies, rigorous quality assurance/quality control (QA/QC) procedures may be lacking in ecosystem restoration (“ecorestoration”) projects, despite legislative mandates in the United States. This is due, in part, to ecorestoration specialists making the false assumption that some types of data (e.g. discrete variables such as species identification and abundance classes) are not subject to evaluations of data quality. Moreover, emergent behavior manifested by complex, adapting, and nonlinear organizations responsible for monitoring the success of ecorestoration projects tend to unconsciously minimize disorder, QA/QC being an activity perceived as creating disorder. We discuss similarities and differences in assessing precision and accuracy for field and laboratory data. Although the concepts for assessing precision and accuracy of ecorestoration field data are conceptually the same as laboratory data, the manner in which these data quality attributes are assessed is different. From a sample analysis perspective, a field crew is comparable to a laboratory instrument that requires regular “recalibration,” with results obtained by experts at the same plot treated as laboratory calibration standards. Unlike laboratory standards and reference materials, the “true” value for many field variables is commonly unknown. In the laboratory, specific QA/QC samples assess error for each aspect of the measurement process, whereas field revisits assess precision and accuracy of the entire data collection process following initial calibration. Rigorous QA/QC data in an ecorestoration project are essential for evaluating the success of a project, and they provide the only objective “legacy” of the dataset for potential legal challenges and future uses.

  12. Laboratory space physics: Investigating the physics of space plasmas in the laboratory

    NASA Astrophysics Data System (ADS)

    Howes, Gregory G.

    2018-05-01

    Laboratory experiments provide a valuable complement to explore the fundamental physics of space plasmas without the limitations inherent to spacecraft measurements. Specifically, experiments overcome the restriction that spacecraft measurements are made at only one (or a few) points in space, enable greater control of the plasma conditions and applied perturbations, can be reproducible, and are orders of magnitude less expensive than launching spacecraft. Here, I highlight key open questions about the physics of space plasmas and identify the aspects of these problems that can potentially be tackled in laboratory experiments. Several past successes in laboratory space physics provide concrete examples of how complementary experiments can contribute to our understanding of physical processes at play in the solar corona, solar wind, planetary magnetospheres, and the outer boundary of the heliosphere. I present developments on the horizon of laboratory space physics, identifying velocity space as a key new frontier, highlighting new and enhanced experimental facilities, and showcasing anticipated developments to produce improved diagnostics and innovative analysis methods. A strategy for future laboratory space physics investigations will be outlined, with explicit connections to specific fundamental plasma phenomena of interest.

  13. The MIST /MIUS Integration and Subsystems Test/ laboratory - A testbed for the MIUS /Modular Integrated Utility System/ program

    NASA Technical Reports Server (NTRS)

    Beckham, W. S., Jr.; Keune, F. A.

    1974-01-01

    The MIUS (Modular Integrated Utility System) concept is to be an energy-conserving, economically feasible, integrated community utility system to provide five necessary services: electricity generation, space heating and air conditioning, solid waste processing, liquid waste processing, and residential water purification. The MIST (MIUS Integration and Subsystem Test) integrated system testbed constructed at the Johnson Space Center in Houston includes subsystems for power generation, heating, ventilation, and air conditioning (HVAC), wastewater management, solid waste management, and control and monitoring. The key design issues under study include thermal integration and distribution techniques, thermal storage, integration of subsystems controls and displays, incinerator performance, effluent characteristics, and odor control.

  14. SAVA 3: A testbed for integration and control of visual processes

    NASA Technical Reports Server (NTRS)

    Crowley, James L.; Christensen, Henrik

    1994-01-01

    The development of an experimental test-bed to investigate the integration and control of perception in a continuously operating vision system is described. The test-bed integrates a 12-axis robotic stereo camera head mounted on a mobile robot, dedicated computer boards for real-time image acquisition and processing, and a distributed system for image description. The architecture was designed to: (1) operate continuously, (2) integrate software contributions from geographically dispersed laboratories, (3) integrate description of the environment with 2D measurements, 3D models, and recognition of objects, (4) support diverse experiments in gaze control, visual servoing, navigation, and object surveillance, and (5) be dynamically reconfigurable.

  15. Toward a Comprehensive Understanding of Executive Cognitive Function in Implicit Racial Bias

    PubMed Central

    Ito, Tiffany A.; Friedman, Naomi P.; Bartholow, Bruce D.; Correll, Joshua; Loersch, Chris; Altamirano, Lee J.; Miyake, Akira

    2014-01-01

    Although performance on laboratory-based implicit bias tasks often is interpreted strictly in terms of the strength of automatic associations, recent evidence suggests that such tasks are influenced by higher-order cognitive control processes, so-called executive functions (EFs). However, extant work in this area has been limited by failure to account for the unity and diversity of EFs, focus on only a single measure of bias and/or EF, and relatively small sample sizes. The current study sought to comprehensively model the relation between individual differences in EFs and the expression of racial bias in three commonly used laboratory measures. Participants (N=485) completed a battery of EF tasks (session 1) and three racial bias tasks (session 2), along with numerous individual difference questionnaires. The main findings were as follows: (1) measures of implicit bias were only weakly intercorrelated; (2) EF and estimates of automatic processes both predicted implicit bias and also interacted, such that the relation between automatic processes and bias expression was reduced at higher levels of EF; (3) specific facets of EF were differentially associated with overall task performance and controlled processing estimates across different bias tasks; (4) EF did not moderate associations between implicit and explicit measures of bias; and (5) external, but not internal, motivation to control prejudice depended on EF to reduce bias expression. Findings are discussed in terms of the importance of global and specific EF abilities in determining expression of implicit racial bias. PMID:25603372

  16. Power control electronics for cryogenic instrumentation

    NASA Technical Reports Server (NTRS)

    Ray, Biswajit; Gerber, Scott S.; Patterson, Richard L.; Myers, Ira T.

    1995-01-01

    In order to achieve a high-efficiency high-density cryogenic instrumentation system, the power processing electronics should be placed in the cold environment along with the sensors and signal-processing electronics. The typical instrumentation system requires low voltage dc usually obtained from processing line frequency ac power. Switch-mode power conversion topologies such as forward, flyback, push-pull, and half-bridge are used for high-efficiency power processing using pulse-width modulation (PWM) or resonant control. This paper presents several PWM and multiresonant power control circuits, implemented using commercially available CMOS and BiCMOS integrated circuits, and their performance at liquid-nitrogen temperature (77 K) as compared to their room temperature (300 K) performance. The operation of integrated circuits at cryogenic temperatures results in an improved performance in terms of increased speed, reduced latch-up susceptibility, reduced leakage current, and reduced thermal noise. However, the switching noise increased at 77 K compared to 300 K. The power control circuits tested in the laboratory did successfully restart at 77 K.

  17. Simulating the Historical Process To Create Laboratory Exercises That Teach Research Methods.

    ERIC Educational Resources Information Center

    Alcock, James

    1994-01-01

    Explains how controlling student access to data can be used as a strategy enabling students to take the role of a research geologist. Students develop models based on limited data and conduct field tests by comparing their predictions with the additional data. (DDR)

  18. A run-time control architecture for the JPL telerobot

    NASA Technical Reports Server (NTRS)

    Balaram, J.; Lokshin, A.; Kreutz, K.; Beahan, J.

    1987-01-01

    An architecture for implementing the process-level decision making for a hierarchically structured telerobot currently being implemented at the Jet Propulsion Laboratory (JPL) is described. Constraints on the architecture design, architecture partitioning concepts, and a detailed description of the existing and proposed implementations are provided.

  19. Information systems as a quality management tool in clinical laboratories

    NASA Astrophysics Data System (ADS)

    Schmitz, Vanessa; Rosecler Bez el Boukhari, Marta

    2007-11-01

    This article describes information systems as a quality management tool in clinical laboratories. The quality of laboratory analyses is of fundamental importance for health professionals in aiding appropriate diagnosis and treatment. Information systems allow the automation of internal quality management processes, using standard sample tests, Levey-Jennings charts and Westgard multirule analysis. This simplifies evaluation and interpretation of quality tests and reduces the possibility of human error. This study proposes the development of an information system with appropriate functions and costs for the automation of internal quality control in small and medium-sized clinical laboratories. To this end, it evaluates the functions and usability of two commercial software products designed for this purpose, identifying the positive features of each, so that these can be taken into account during the development of the proposed system.
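
    As a hedged sketch of the internal quality control automation described above, the Python fragment below evaluates two of the common Westgard rules (1-3s and 2-2s) on a series of control results, the kind of check a Levey-Jennings chart visualizes; the target mean, SD and control values are placeholders, and a production system would implement the full multirule set.

    ```python
    # Minimal Westgard multirule check (1-3s and 2-2s only) on control results.
    # Target mean/SD and the control series are illustrative placeholders.

    def westgard_flags(values, mean, sd):
        """Return a list of (index, rule) violations for the 1-3s and 2-2s rules."""
        z = [(v - mean) / sd for v in values]
        flags = []
        for i, zi in enumerate(z):
            if abs(zi) > 3:                       # 1-3s: one control exceeds +/-3 SD
                flags.append((i, "1-3s"))
            if i >= 1 and z[i - 1] * zi > 0 and abs(zi) > 2 and abs(z[i - 1]) > 2:
                flags.append((i, "2-2s"))         # 2-2s: two consecutive results on the same side beyond 2 SD
        return flags

    controls = [5.1, 5.0, 5.4, 5.5, 4.1, 5.2]     # hypothetical control results (e.g., glucose, mmol/L)
    print(westgard_flags(controls, mean=5.0, sd=0.15))
    ```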

  20. Laboratory Investigation of Space and Planetary Dust Grains

    NASA Technical Reports Server (NTRS)

    Spann, James

    2005-01-01

    Dust in space is ubiquitous and impacts diverse observed phenomena in various ways. Understanding the dominant mechanisms that control dust grain properties and their impact on surrounding environments is basic to improving our understanding of the observed processes at work in space. There is a substantial body of work on the theory and modeling of dust in space and dusty plasmas. To substantiate and validate theory and models, laboratory investigations and space-borne observations have been conducted. Laboratory investigations are largely confined to an assembly of dust grains immersed in a plasma environment. Frequently, the behaviors of these complex dusty plasmas in the laboratory have raised more questions than they have verified theories. Space-borne observations have helped us characterize planetary environments. The complex behavior of dust grains in space indicates the need to understand the microphysics of individual grains immersed in a plasma or space environment.

  1. Thermal Storage Process and Components Laboratory | Energy Systems

    Science.gov Websites

    The Energy Systems Integration Facility's Thermal Systems Process and Components Laboratory supports research and development, testing, and evaluation of new thermal energy storage systems.

  2. KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities, NASA’s MESSENGER spacecraft is secure after transfer to the work stand. There employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

  3. KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities, NASA’s MESSENGER spacecraft is lifted off the pallet for transfer to a work stand. There employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

  4. KENNEDY SPACE CENTER, FLA. - In the high bay clean room at the Astrotech Space Operations processing facilities near KSC, workers remove the protective cover from NASA’s MESSENGER spacecraft. Employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

  5. KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities, workers check the placement of NASA’s MESSENGER spacecraft on a work stand. There employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

  6. KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities near KSC, workers move NASA’s MESSENGER spacecraft into a high bay clean room. Employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

  7. KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities, an overhead crane moves NASA’s MESSENGER spacecraft toward a work stand. There employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

  8. KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities, an overhead crane lowers NASA’s MESSENGER spacecraft onto a work stand. There employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

  9. KENNEDY SPACE CENTER, FLA. - In the high bay clean room at the Astrotech Space Operations processing facilities near KSC, NASA’s MESSENGER spacecraft is revealed. Employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

  10. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-11-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in C# and C++ programming languages respectively, by using IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments’ academic staff.
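
    The published SolarInsight host software is written in C# and C++; purely as a hedged, hypothetical sketch of the same data-acquisition idea in Python, the fragment below polls a microcontroller for photovoltaic I-V points over a serial link. The port name, baud rate, "SWEEP" command and CSV reply format are assumptions made for illustration and are not part of the published system.

    ```python
    # Hypothetical host-side reader for an Arduino-based PV teaching rig.
    # Assumes firmware that streams "voltage,current" CSV lines after a SWEEP command;
    # the serial port, baud rate and command are illustrative only.
    import serial

    def read_iv_sweep(port="/dev/ttyACM0", baud=9600, n_points=50):
        points = []
        with serial.Serial(port, baud, timeout=2) as ser:
            ser.write(b"SWEEP\n")                    # ask the firmware for an I-V sweep
            for _ in range(n_points):
                line = ser.readline().decode().strip()
                if not line:
                    break                            # timeout or end of sweep
                v, i = (float(x) for x in line.split(","))
                points.append((v, i))
        return points

    if __name__ == "__main__":
        iv = read_iv_sweep()
        if iv:
            p_max = max(v * i for v, i in iv)        # crude maximum-power-point estimate
            print(f"{len(iv)} points, Pmax ~ {p_max:.3f} W")
    ```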

  11. Can current analytical quality performance of UK clinical laboratories support evidence-based guidelines for diabetes and ischaemic heart disease?--A pilot study and a proposal.

    PubMed

    Jassam, Nuthar; Yundt-Pacheco, John; Jansen, Rob; Thomas, Annette; Barth, Julian H

    2013-08-01

    The implementation of national and international guidelines is beginning to standardise clinical practice. However, since many guidelines have decision limits based on laboratory tests, there is an urgent need to ensure that different laboratories obtain the same analytical result on any sample. A scientifically based quality control process will be a prerequisite to provide the level of analytical performance that supports evidence-based guidelines and the movement of patients across boundaries while maintaining standardised outcomes. We discuss the findings of a pilot study performed to assess UK clinical laboratories' readiness to work to higher-grade quality specifications, such as those based on biological variation. Internal quality control (IQC) data for HbA1c, glucose, creatinine, cholesterol and high-density lipoprotein (HDL)-cholesterol were collected from UK laboratories participating in the Bio-Rad Unity QC programme. The median coefficient of variation (CV%) of the participating laboratories was evaluated against the CV% based on biological variation. Except for creatinine, the other four analytes showed a variable degree of compliance with the biological variation-based quality specifications. More than 75% of the laboratories met the biological variation-based quality specifications for glucose, cholesterol and HDL-cholesterol. Slightly over 50% of the laboratories met the analytical goal for HbA1c. Only one analyte (cholesterol) had a performance achieving the higher quality specifications consistent with 5σ. Our data from IQC do not consistently demonstrate that the results from clinical laboratories meet evidence-based quality specifications. Therefore, we propose that a graded scale of quality specifications may be needed at this stage.
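
    A hedged sketch of the kind of comparison the pilot study describes: a laboratory's median IQC imprecision is checked against the desirable specification commonly derived from within-subject biological variation (analytical CV at or below half the within-subject CV), and a sigma metric is computed from an allowable total error. The within-subject CV, allowable total error, bias and laboratory CV below are illustrative placeholders, not the study's data.

    ```python
    # Compare laboratory imprecision against biological-variation-based goals and
    # compute a sigma metric.  All numeric inputs are illustrative placeholders.

    def desirable_cv(cv_within_subject):
        """Desirable analytical imprecision: half the within-subject biological CV."""
        return 0.5 * cv_within_subject

    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma = (allowable total error - |bias|) / imprecision, all in %."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    # Hypothetical analyte: within-subject CV 5.0%, allowable total error 10%,
    # laboratory median IQC CV 2.0% with 0.5% bias.
    cv_goal = desirable_cv(5.0)
    lab_cv = 2.0
    print(f"desirable CV <= {cv_goal:.1f}%  ->  {'meets' if lab_cv <= cv_goal else 'fails'} specification")
    print(f"sigma = {sigma_metric(10.0, 0.5, lab_cv):.1f}")
    ```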

  12. Research and development program in fiber optic sensors and distributed sensing for high temperature harsh environment energy applications (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Romanosky, Robert R.

    2017-05-01

    The National Energy Technology Laboratory (NETL), under the Department of Energy (DOE) Fossil Energy (FE) Program, is leading the effort not only to develop near-zero-emission power generation systems, but also to increase the efficiency and availability of current power systems. The overarching goal of the program is to provide clean, affordable power using domestic resources. Highly efficient, low-emission power systems can have extreme conditions of high temperatures up to 1600 °C, high pressures up to 600 psi, high particulate loadings, and corrosive atmospheres that require monitoring. Sensing in these harsh environments can provide key information that directly impacts process control and system reliability. The lack of suitable measurement technology serves as a driver for innovations in harsh-environment sensor development. Advancements in sensing using optical fibers are key efforts within NETL's sensor development program, as these approaches offer the potential to survive and provide critical information about these processes. An overview of the sensor development supported by NETL will be given, including research in the areas of sensor materials, designs, and measurement types. New approaches to intelligent sensing, sensor placement and process control using networked sensors will be discussed, as will novel approaches to fiber device design concurrent with materials development research and development in modified and coated silica and sapphire fiber-based sensors. The use of these sensors for both single-point and distributed measurements of temperature, pressure, strain, and a select suite of gases will be addressed. Additional areas of research include novel control architectures and communication frameworks, device integration for distributed sensing, and imaging and other novel approaches to monitoring and controlling advanced processes. The close coupling of the sensor program with process modeling and control will be discussed, in keeping with the overarching goal of clean power production.

  13. Adaptive process control using fuzzy logic and genetic algorithms

    NASA Technical Reports Server (NTRS)

    Karr, C. L.

    1993-01-01

    Researchers at the U.S. Bureau of Mines have developed adaptive process control systems in which genetic algorithms (GA's) are used to augment fuzzy logic controllers (FLC's). GA's are search algorithms that rapidly locate near-optimum solutions to a wide spectrum of problems by modeling the search procedures of natural genetics. FLC's are rule based systems that efficiently manipulate a problem environment by modeling the 'rule-of-thumb' strategy used in human decision making. Together, GA's and FLC's possess the capabilities necessary to produce powerful, efficient, and robust adaptive control systems. To perform efficiently, such control systems require a control element to manipulate the problem environment, and a learning element to adjust to the changes in the problem environment. Details of an overall adaptive control system are discussed. A specific laboratory acid-base pH system is used to demonstrate the ideas presented.

  14. Adaptive Process Control with Fuzzy Logic and Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Karr, C. L.

    1993-01-01

    Researchers at the U.S. Bureau of Mines have developed adaptive process control systems in which genetic algorithms (GA's) are used to augment fuzzy logic controllers (FLC's). GA's are search algorithms that rapidly locate near-optimum solutions to a wide spectrum of problems by modeling the search procedures of natural genetics. FLC's are rule based systems that efficiently manipulate a problem environment by modeling the 'rule-of-thumb' strategy used in human decision-making. Together, GA's and FLC's possess the capabilities necessary to produce powerful, efficient, and robust adaptive control systems. To perform efficiently, such control systems require a control element to manipulate the problem environment, an analysis element to recognize changes in the problem environment, and a learning element to adjust to the changes in the problem environment. Details of an overall adaptive control system are discussed. A specific laboratory acid-base pH system is used to demonstrate the ideas presented.
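
    A minimal, heavily simplified sketch of the GA-plus-FLC idea in the two records above: a three-rule fuzzy controller with triangular membership functions drives a toy pH process, and a small genetic algorithm tunes the controller's scaling parameters to minimize integrated absolute error. The plant model, rule base and GA settings are invented for illustration and are not the Bureau of Mines implementation.

    ```python
    # Toy GA-tuned fuzzy logic controller for a simplified pH process.
    # Everything here (plant, rule base, GA settings) is illustrative only.
    import random

    def tri(x, a, b, c):
        """Triangular membership function with peak at b and feet at a, c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def flc_output(error, width, gain):
        """Three rules on the pH error (Negative/Zero/Positive) -> base flow change."""
        mu_n = tri(error, -3 * width, -width, 0.0)
        mu_z = tri(error, -width, 0.0, width)
        mu_p = tri(error, 0.0, width, 3 * width)
        # Singleton rule consequents with weighted-average defuzzification.
        num = mu_n * (-gain) + mu_z * 0.0 + mu_p * gain
        den = mu_n + mu_z + mu_p
        return num / den if den > 0 else 0.0

    def simulate(params, setpoint=7.0, steps=200, dt=0.1):
        """Integrated absolute error of the toy plant under the FLC."""
        width, gain = params
        ph, iae = 4.0, 0.0
        for _ in range(steps):
            u = flc_output(setpoint - ph, max(width, 1e-3), gain)
            # Toy plant: acid load pulls pH toward 4, base addition u raises it.
            ph += dt * (u - 0.5 * (ph - 4.0))
            iae += abs(setpoint - ph) * dt
        return iae

    def genetic_tune(pop_size=20, generations=30):
        pop = [[random.uniform(0.1, 3.0), random.uniform(0.1, 5.0)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=simulate)                       # lower IAE is better
            survivors = pop[: pop_size // 2]
            children = [[max(0.05, g + random.gauss(0.0, 0.2)) for g in parent]
                        for parent in survivors]         # Gaussian mutation of survivors
            pop = survivors + children
        return min(pop, key=simulate)

    best = genetic_tune()
    print("tuned (width, gain):", best, " IAE:", round(simulate(best), 2))
    ```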

  15. A real time, FEM based optimal control algorithm and its implementation using parallel processing hardware (transputers) in a microprocessor environment

    NASA Technical Reports Server (NTRS)

    Patten, William Neff

    1989-01-01

    There is an evident need to discover a means of establishing reliable, implementable controls for systems that are plagued by nonlinear and, or uncertain, model dynamics. The development of a generic controller design tool for tough-to-control systems is reported. The method utilizes a moving grid, time infinite element based solution of the necessary conditions that describe an optimal controller for a system. The technique produces a discrete feedback controller. Real time laboratory experiments are now being conducted to demonstrate the viability of the method. The algorithm that results is being implemented in a microprocessor environment. Critical computational tasks are accomplished using a low cost, on-board, multiprocessor (INMOS T800 Transputers) and parallel processing. Progress to date validates the methodology presented. Applications of the technique to the control of highly flexible robotic appendages are suggested.

  16. Simulation Based Low-Cost Composite Process Development at the US Air Force Research Laboratory

    NASA Technical Reports Server (NTRS)

    Rice, Brian P.; Lee, C. William; Curliss, David B.

    2003-01-01

    Low-cost composite research in the US Air Force Research Laboratory, Materials and Manufacturing Directorate, Organic Matrix Composites Branch has focused on the theme of affordable performance. Practically, this means that we take a very broad view when considering the affordability of composites. Factors such as material costs, labor costs, and recurring and nonrecurring manufacturing costs are balanced against performance to arrive at the relative affordability-versus-performance measure of merit. The research efforts discussed here are two projects focused on affordable processing of composites. The first project uses a neural network scheme to model cure reaction kinetics and then utilizes the kinetics, coupled with simple heat transport models, to predict future exotherms in real time and control them. The neural network scheme is demonstrated to be very robust and a much more efficient method than the mechanistic cure modeling approach. This enables very practical, low-cost processing of thick composite parts. The second project is liquid composite molding (LCM) process simulation. LCM processing of large 3D integrated composite parts has been demonstrated to be a very cost-effective way to produce large integrated aerospace components; specific examples of LCM processes are resin transfer molding (RTM), vacuum assisted resin transfer molding (VARTM), and other similar approaches. LCM process simulation is a critical part of developing an LCM process approach. Flow simulation enables the development of the most robust approach to introducing resin into complex preforms. Furthermore, LCM simulation can be used in conjunction with flow front sensors to control the LCM process in real time to account for preform or resin variability.
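
    A hedged sketch of the exotherm-prediction idea in the first project: a fitted cure-kinetics function is coupled to a lumped heat balance and integrated forward to forecast the part temperature. The record uses a neural network for the kinetics; a simple autocatalytic rate expression stands in for it here, and all material constants are placeholders.

    ```python
    # Forecasting a cure exotherm by coupling a cure-kinetics model to a lumped
    # heat balance.  The autocatalytic rate expression stands in for the neural
    # network kinetics described in the record; all constants are placeholders.
    import math

    R = 8.314  # J/(mol K)

    def cure_rate(alpha, temp_k, A=5.0e4, E=60_000.0, m=0.5, n=1.5):
        """Placeholder autocatalytic kinetics: d(alpha)/dt as a function of state."""
        return A * math.exp(-E / (R * temp_k)) * (alpha ** m + 0.01) * (1.0 - alpha) ** n

    def forecast_exotherm(t_end=3600.0, dt=1.0, t_oven=450.0,
                          h_eff=0.002, dH=350_000.0, cp=1200.0):
        """Explicit-Euler forecast of degree of cure and part temperature."""
        alpha, temp = 0.0, 300.0
        peak = temp
        for _ in range(int(t_end / dt)):
            da = cure_rate(alpha, temp) * dt
            # Lumped balance: reaction heat release vs. exchange with the oven.
            temp += (dH / cp) * da + h_eff * (t_oven - temp) * dt
            alpha = min(1.0, alpha + da)
            peak = max(peak, temp)
        return alpha, peak

    alpha_final, peak_temp = forecast_exotherm()
    print(f"final cure {alpha_final:.2f}, predicted peak {peak_temp:.0f} K")
    ```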

  17. Biodegradation modelling of a dissolved gasoline plume applying independent laboratory and field parameters

    NASA Astrophysics Data System (ADS)

    Schirmer, Mario; Molson, John W.; Frind, Emil O.; Barker, James F.

    2000-12-01

    Biodegradation of organic contaminants in groundwater is a microscale process which is often observed on scales of 100s of metres or larger. Unfortunately, there are no known equivalent parameters for characterizing the biodegradation process at the macroscale as there are, for example, in the case of hydrodynamic dispersion. Zero- and first-order degradation rates estimated at the laboratory scale by model fitting generally overpredict the rate of biodegradation when applied to the field scale because limited electron acceptor availability and microbial growth are not considered. On the other hand, field-estimated zero- and first-order rates are often not suitable for predicting plume development because they may oversimplify or neglect several key field scale processes, phenomena and characteristics. This study uses the numerical model BIO3D to link the laboratory and field scales by applying laboratory-derived Monod kinetic degradation parameters to simulate a dissolved gasoline field experiment at the Canadian Forces Base (CFB) Borden. All input parameters were derived from independent laboratory and field measurements or taken from the literature prior to and independently of the simulations. The simulated results match the experimental results reasonably well without model calibration. A sensitivity analysis on the most uncertain input parameters showed only a minor influence on the simulation results. Furthermore, it is shown that the flow field, the amount of electron acceptor (oxygen) available, and the Monod kinetic parameters have a significant influence on the simulated results. It is concluded that laboratory-derived Monod kinetic parameters can adequately describe field scale degradation, provided all controlling factors are incorporated in the field scale model. These factors include advective-dispersive transport of multiple contaminants and electron acceptors and large-scale spatial heterogeneities.
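    As a point of reference, the following sketch integrates a batch dual-Monod system in which degradation slows as the electron acceptor (oxygen) is consumed, the qualitative behaviour that makes simple zero- or first-order laboratory rates transfer poorly to the field. The parameter values are illustrative placeholders, not the laboratory-derived Borden values, and the sketch ignores transport entirely.

```python
# Minimal sketch, not the BIO3D model: batch dual-Monod kinetics coupling a dissolved
# contaminant, oxygen as the electron acceptor, and microbial growth. All parameter
# values are illustrative assumptions.

def simulate(days=30.0, dt=0.01):
    S, O, X = 5.0, 8.0, 0.1         # substrate, oxygen, biomass (mg/L), assumed initial values
    mu_max, Ks, Ko = 1.0, 0.5, 0.2  # max growth rate (1/d) and half-saturation constants (mg/L)
    Y, F_ox, b = 0.4, 2.5, 0.05     # yield, O2 used per unit substrate, decay rate (assumed)
    t = 0.0
    while t < days:
        mu = mu_max * (S / (Ks + S)) * (O / (Ko + O))   # dual-Monod growth rate
        dS = -(mu / Y) * X * dt                         # substrate utilisation
        dO = F_ox * dS                                  # oxygen consumed in proportion
        dX = (mu - b) * X * dt                          # growth minus decay
        S, O, X = max(S + dS, 0.0), max(O + dO, 0.0), max(X + dX, 0.0)
        t += dt
    return S, O, X

if __name__ == "__main__":
    S, O, X = simulate()
    print(f"after 30 d: substrate {S:.2f} mg/L, oxygen {O:.2f} mg/L, biomass {X:.2f} mg/L")
```

    Running the sketch shows degradation stalling once oxygen is exhausted, which is exactly the limitation a fitted first-order laboratory rate cannot represent.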

  18. Multi-laboratory survey of qPCR enterococci analysis method performance

    EPA Pesticide Factsheets

    Quantitative polymerase chain reaction (qPCR) has become a frequently used technique for quantifying enterococci in recreational surface waters, but there are several methodological options. Here we evaluated how three method permutations (type of mastermix, sample extract dilution, and use of controls in results calculation) affect method reliability among multiple laboratories with respect to sample interference. Multiple samples from each of 22 sites representing an array of habitat types were analyzed using EPA Method 1611 and 1609 reagents with full-strength and five-fold diluted extracts. The presence of interference was assessed three ways: using sample processing and PCR amplification controls; consistency of results across extract dilutions; and relative recovery of target genes from spiked enterococci in water samples compared to control matrices, with acceptable recovery defined as 50 to 200%. Method 1609, which is based on an environmental mastermix, was found to be superior to Method 1611, which is based on a universal mastermix. Method 1611 had over a 40% control assay failure rate with undiluted extracts and a 6% failure rate with diluted extracts. Method 1609 failed in only 11% and 3% of undiluted and diluted extract analyses, respectively. Use of sample processing control assay results in the delta-delta Ct method for calculating relative target gene recoveries increased the number of acceptable recovery results. Delta-delta tended to bias recoveries fr
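    The delta-delta Ct arithmetic referred to above can be stated compactly: relative recovery is 2 raised to the negative difference between the sample's target-minus-SPC Ct spread and the calibrator's. The sketch below uses made-up Ct values and the 50 to 200% acceptance window mentioned in the abstract.

```python
# Hedged sketch of the delta-delta Ct calculation: relative recovery of the target gene,
# normalised to a sample-processing-control (SPC) assay and a calibrator (control-matrix)
# sample. The Ct values in the example are invented illustrations.

def ddct_recovery(ct_target_sample, ct_spc_sample, ct_target_cal, ct_spc_cal):
    """Percent relative recovery = 100 * 2^-(dCt_sample - dCt_calibrator)."""
    d_sample = ct_target_sample - ct_spc_sample
    d_cal = ct_target_cal - ct_spc_cal
    return 100.0 * 2.0 ** -(d_sample - d_cal)

def acceptable(recovery_pct, low=50.0, high=200.0):
    """Acceptance window used in the study: 50% to 200% relative recovery."""
    return low <= recovery_pct <= high

if __name__ == "__main__":
    rec = ddct_recovery(ct_target_sample=31.2, ct_spc_sample=24.8,
                        ct_target_cal=30.1, ct_spc_cal=24.5)
    print(f"relative recovery: {rec:.0f}%  acceptable: {acceptable(rec)}")
```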

  19. Implementation of Quality Management in Core Service Laboratories

    PubMed Central

    Creavalle, T.; Haque, K.; Raley, C.; Subleski, M.; Smith, M.W.; Hicks, B.

    2010-01-01

    CF-28 The Genetics and Genomics group of the Advanced Technology Program of SAIC-Frederick exists to bring innovative genomic expertise, tools and analysis to NCI and the scientific community. The Sequencing Facility (SF) provides next generation short read (Illumina) sequencing capacity to investigators using a streamlined production approach. The Laboratory of Molecular Technology (LMT) offers a wide range of genomics core services including microarray expression analysis, miRNA analysis, array comparative genome hybridization, long read (Roche) next generation sequencing, quantitative real time PCR, transgenic genotyping, Sanger sequencing, and clinical mutation detection services to investigators from across the NIH. As the technology supporting this genomic research becomes more complex, the need for basic quality processes within all aspects of the core service groups becomes critical. The Quality Management group works alongside members of these labs to establish or improve processes supporting operations control (equipment, reagent and materials management), process improvement (reengineering/optimization, automation, acceptance criteria for new technologies and tech transfer), and quality assurance and customer support (controlled documentation/SOPs, training, service deficiencies and continual improvement efforts). Implementation and expansion of quality programs within unregulated environments demonstrates SAIC-Frederick's dedication to providing the highest quality products and services to the NIH community.

  20. LACIS-T - A moist air wind tunnel for investigating the interactions between cloud microphysics and turbulence

    NASA Astrophysics Data System (ADS)

    Niedermeier, Dennis; Voigtländer, Jens; Siebert, Holger; Desai, Neel; Shaw, Raymond; Chang, Kelken; Krueger, Steven; Schumacher, Jörg; Stratmann, Frank

    2017-11-01

    Turbulence-cloud droplet interaction processes have been investigated primarily through numerical simulation and field measurements over the last ten years. However, only in the laboratory can we be confident in our knowledge of initial and boundary conditions, and measure for extended times under statistically stationary and repeatable conditions. Therefore, the newly built turbulent wind tunnel LACIS-T (Turbulent Leipzig Aerosol Cloud Interaction Simulator) is an ideal facility for pursuing mechanistic understanding of these processes. Within the tunnel we are able to adjust precisely controlled turbulent temperature and humidity fields so as to achieve supersaturation levels allowing for detailed investigations of the interactions between cloud microphysical processes (e.g., cloud droplet activation) and the turbulent flow, under well-defined and reproducible laboratory conditions. We will present the fundamental operating principle, first results from ongoing characterization efforts, numerical simulations, as well as first droplet activation experiments.

  1. Entry Guidance for the 2011 Mars Science Laboratory Mission

    NASA Technical Reports Server (NTRS)

    Mendeck, Gavin F.; Craig, Lynn E.

    2011-01-01

    The 2011 Mars Science Laboratory will be the first Mars mission to attempt a guided entry to safely deliver the rover to a touchdown ellipse of 25 km x 20 km. The Entry Terminal Point Controller guidance algorithm is derived from the final phase Apollo Command Module guidance and, like Apollo, modulates the bank angle to control the range flown. For application to Mars landers which must make use of the tenuous Martian atmosphere, it is critical to balance the lift of the vehicle to minimize the range error while still ensuring a safe deploy altitude. An overview of the process to generate optimized guidance settings is presented, discussing improvements made over the last nine years. Key dispersions driving deploy ellipse and altitude performance are identified. Performance sensitivities including attitude initialization error and the velocity of transition from range control to heading alignment are presented.
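    Purely as a toy illustration of bank-angle modulation for range control (not the MSL Entry Terminal Point Controller itself), the sketch below adjusts the commanded vertical lift fraction in proportion to a predicted range error and reverses the bank direction when crossrange exceeds a deadband; every gain, limit, and sign convention here is an assumption.

```python
# Toy illustration of bank-angle modulation for range control. The commanded vertical
# lift fraction, cos(bank), is nudged in proportion to the predicted range error, and
# the bank sign is flipped to manage crossrange (heading alignment). Gains, limits, and
# the range predictor are invented; this is not the MSL guidance algorithm.
import math

def bank_command(range_error_km, crossrange_km, cos_bank_ref=0.5,
                 k_range=0.02, bank_min_deg=15.0, bank_max_deg=85.0,
                 crossrange_deadband_km=3.0, current_sign=1.0):
    # Positive range error = predicted to fall short: command more vertical lift (shallower bank).
    cos_cmd = cos_bank_ref + k_range * range_error_km
    cos_cmd = min(max(cos_cmd, math.cos(math.radians(bank_max_deg))),
                  math.cos(math.radians(bank_min_deg)))
    bank_deg = math.degrees(math.acos(cos_cmd))
    # Reverse the bank direction when crossrange drifts outside a deadband.
    sign = -current_sign if abs(crossrange_km) > crossrange_deadband_km else current_sign
    return sign * bank_deg

if __name__ == "__main__":
    # Predicted 40 km short with 1 km crossrange: command a shallow bank (more lift up).
    print(f"bank command: {bank_command(range_error_km=40.0, crossrange_km=1.0):.1f} deg")
```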

  2. Strategy for 90% autoverification of clinical chemistry and immunoassay test results using six sigma process improvement.

    PubMed

    Randell, Edward W; Short, Garry; Lee, Natasha; Beresford, Allison; Spencer, Margaret; Kennell, Marina; Moores, Zoë; Parry, David

    2018-06-01

    Six Sigma involves a structured process improvement strategy that places processes on a pathway to continued improvement. The data presented here summarize a project that took three clinical laboratories from autoverification processes that allowed about 40% to 60% of tests to be auto-verified to more than 90% of tests and samples auto-verified. The project schedule, metrics and targets, a description of the previous system, and detailed information on the changes made to achieve greater than 90% auto-verification are presented for this Six Sigma DMAIC (Define, Measure, Analyze, Improve, Control) process improvement project.
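    The sketch below illustrates the general shape of an autoverification rule set: a result is released automatically only if it carries no instrument flags, falls inside the analytical measuring range and non-critical limits, and passes a delta check against the previous result. The analytes, limits, and flags shown are placeholders, not the rules implemented by these laboratories.

```python
# Illustrative sketch of an autoverification rule engine. Limits, flags, and delta-check
# thresholds are invented placeholders.

AUTOVERIFY_RULES = {
    # analyte: analytical measuring range, critical limits, delta-check limit
    "sodium":    {"amr": (100.0, 180.0), "critical": (120.0, 160.0), "delta": 10.0},
    "potassium": {"amr": (1.0, 10.0),    "critical": (2.8, 6.2),     "delta": 1.0},
}

def autoverify(analyte, value, previous=None, instrument_flags=()):
    """Return (True, '') to release automatically, else (False, reason for manual review)."""
    rules = AUTOVERIFY_RULES[analyte]
    lo, hi = rules["amr"]
    if instrument_flags:
        return False, f"instrument flags present: {','.join(instrument_flags)}"
    if not lo <= value <= hi:
        return False, "outside analytical measuring range"
    clo, chi = rules["critical"]
    if not clo <= value <= chi:
        return False, "critical value - requires review and call-back"
    if previous is not None and abs(value - previous) > rules["delta"]:
        return False, "delta check failure against previous result"
    return True, ""

if __name__ == "__main__":
    print(autoverify("potassium", 4.1, previous=3.8))             # released automatically
    print(autoverify("potassium", 6.8))                           # critical, held for review
    print(autoverify("sodium", 141.0, instrument_flags=("H#",)))  # flagged, held for review
```

    In a DMAIC project of this kind, the Improve and Control phases would correspond to widening the rule coverage and then monitoring the auto-verification rate so it stays above the 90% target.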

  3. Report formatting in laboratory medicine - a call for harmony.

    PubMed

    Jones, Graham R D; Legg, Michael

    2018-04-19

    The results of medical laboratory testing are only useful if they lead to appropriate actions by medical practitioners and/or patients. An underappreciated component of the medical testing process is the transfer of the information from the laboratory report into the reader's brain. The format of laboratory reports can be determined by the testing laboratory, which may issue a formatted report, or by electronic systems receiving information from laboratories and controlling the report format. As doctors can receive information from many laboratories, interpreting information from reports in a safe and rapid manner is facilitated by having similar report layouts and formats. Using Australia as an example, there is a wide variation in report formats in spite of a body of work to define standards for reporting. In addition to standardising of report formats, consideration needs to be given to optimisation of report formatting to facilitate rapid and unambiguous reading of the report and also interpretation of the data. Innovative report formats have been developed by some laboratories; however, wide adoption has not followed. The need to balance uniformity of reporting with appropriate innovation is a challenge for safe reporting of laboratory results. This paper discusses the current status and opportunity for improvement in safety and efficiency of the reading of laboratory reports, using current practise and developments in Australia as examples.

  4. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    PubMed

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

    We aim to describe Statistical Process Control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and posterior data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, median time to patient evaluation, and percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, in which 25,757 patients were reported and 6,798 patients met inclusion criteria. The median time to evaluation was 20 hours (SD, 12) and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and therefore the central line showed inflections. Statistical process control through process performance indicators allowed us to control the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.
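    For readers unfamiliar with the mechanics, the sketch below computes individuals (XmR) control chart limits for a monthly indicator such as time to evaluation and flags out-of-control points; the monthly values are fabricated and the chart type is a generic choice, not necessarily the one used by the registry.

```python
# Minimal sketch of an individuals (XmR) control chart, the generic SPC device behind
# monthly indicator charts. The monthly values are fabricated examples; 2.66 is the
# standard XmR constant (3/d2 with d2 = 1.128 for moving ranges of size 2).

def xmr_limits(values):
    """Return (centre line, lower control limit, upper control limit) for an XmR chart."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def out_of_control(values):
    """Flag points falling outside the 3-sigma-equivalent limits."""
    centre, lcl, ucl = xmr_limits(values)
    return [(i + 1, v) for i, v in enumerate(values) if v < lcl or v > ucl]

if __name__ == "__main__":
    hours_to_evaluation = [21, 19, 22, 20, 18, 23, 20, 19, 35, 21, 20, 22]  # made-up months
    centre, lcl, ucl = xmr_limits(hours_to_evaluation)
    print(f"centre {centre:.1f} h, limits [{lcl:.1f}, {ucl:.1f}] h")
    print("signals:", out_of_control(hours_to_evaluation))
```

    When a structural change or improvement cycle shifts the indicator, the limits are typically recalculated from the post-change data, which is the "inflection of the central line" the abstract describes.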

  5. Effects of task structure on category priming in patients with Parkinson's disease and in healthy individuals.

    PubMed

    Brown, Gregory G; Brown, Sandra J; Christenson, Gina; Williams, Rebecca E; Kindermann, Sandra S; Loftis, Christopher; Olsen, Ryan; Siple, Patricia; Shults, Clifford; Gorell, Jay M

    2002-05-01

    Lexical decision tasks have been used to study both shifts of attention and semantic processing in Parkinson's Disease (PD). Whereas other laboratories have reported normal levels of semantic priming among PD patients, our laboratory has reported abnormally large levels. In this study, two experiments were performed to determine the influence of task structure on the extent of semantic priming during lexical decision-making and pronunciation tasks among PD patients and neurologically healthy controls. In Experiment 1, the effect of Prime Dominance (the ratio of category to neutral trials) on lexical decision-making was studied. Although equal numbers of word and nonword trials were presented, half of the PD patients and controls were studied under Category Prime Dominance (category : neutral prime ratio of 2:1) and half were studied under Neutral Prime Dominance (category : neutral prime ratio of 1:2). In Experiment 2, PD and control participants were studied on lexical decision-making and pronunciation tasks where twice as many words as nonword trials were presented, consistent with other studies from our laboratory. In Experiment 1, we found no group differences in the magnitude of priming and no effect of Prime Dominance. Moreover, the findings were similar in pattern and magnitude to results published by Neely (1977). In Experiment 2, we observed larger priming effects among PD patients than among controls, but only on the lexical decision (LD) task. These results support the hypothesis that abnormally large category-priming effects appear in LD studies of PD patients when the number of word trials exceeds the number of nonword trials. Furthermore, increased lexical priming in PD appears to be due to processes operating during the decision-making period that follows presentation of the lexical target.

  6. Shoreline Erosion Processes: Orwell Lake, Minnesota.

    DTIC Science & Technology

    1984-12-01

    (1976) and Savat (1981) found such splash erosion to increase with increasing slope angle; the splash layer will absorb much of the impact of the rain. pp. 188-196. U.S. Army Corps of Engineers (1979) Flood control, Orwell Dam, Otter Tail River. Savat, J. (1981) Work done by splash: Laboratory...

  7. Biomechanical pulping : a mill-scale evaluation

    Treesearch

    Masood Akhtar; Gary M. Scott; Ross E. Swaney; Mike J. Lentz; Eric G. Horn; Marguerite S. Sykes; Gary C. Myers

    1999-01-01

    The mechanical pulping process is electrical-energy intensive and results in low paper strength. Biomechanical pulping, defined as the fungal treatment of lignocellulosic materials prior to mechanical pulping, has shown at least 30% savings in electrical energy consumption, and significant improvements in paper strength properties compared to the control at a laboratory...

  8. MEASUREMENT AND PREDICTION OF THE RESISTIVITY OF ASH/SORBENT MIXTURES PRODUCED BY SULFUR OXIDE CONTROL PROCESSES

    EPA Science Inventory

    The report describes the development of (1) a modified procedure for obtaining consistent and reproducible laboratory resistivity values for mixtures of coal fly ash and partially spent sorbent, and (2) an approach for predicting resistivity based on the chemical composition of t...

  9. Teaching Smartphone and Microcontroller Systems Using "Android Java"

    ERIC Educational Resources Information Center

    Tigrek, Seyitriza

    2012-01-01

    Mobile devices are becoming indispensable tools for many students and educators. Mobile technology is starting a new era in the computing methodologies in many engineering disciplines and laboratories. Microcontroller extension that communicates with mobile devices will take the data acquisition and control process into a new level in the sensing…

  10. Remote Labs and Game-Based Learning for Process Control

    ERIC Educational Resources Information Center

    Zualkernan, Imran A.; Husseini, Ghaleb A.; Loughlin, Kevin F.; Mohebzada, Jamshaid G.; El Gaml, Moataz

    2013-01-01

    Social networking platforms and computer games represent a natural informal learning environment for the current generation of learners in higher education. This paper explores the use of game-based learning in the context of an undergraduate chemical engineering remote laboratory. Specifically, students are allowed to manipulate chemical…

  11. Sandia National Laboratories: Careers: Life at Sandia

    Science.gov Websites

    Careers web page describing life at Sandia National Laboratories: culture, work-life balance, special programs, students and postdocs, benefits and perks, and the hiring process. Includes an employee quote (Daniel, Electrical and Computer): "... control my own career, and the incredible work-life balance," and notes that Sandia offers meaningful work, unparalleled work-life balance, outstanding benefits, job stability, and multiple...

  12. Gas Cylinder Safety, Course 9518

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glass, George

    2016-10-27

    This course, Gas Cylinder Safety (#9518), presents an overview of the hazards and controls associated with handling, storing, using, and transporting gas cylinders. Standard components and markings of gas cylinders are also presented, as well as the process for the procurement, delivery, and return of gas cylinders at Los Alamos National Laboratory (LANL).

  13. The Social Perceptual Salience Effect

    ERIC Educational Resources Information Center

    Inderbitzin, Martin P.; Betella, Alberto; Lanata, Antonio; Scilingo, Enzo P.; Bernardet, Ulysses; Verschure, Paul F. M. J.

    2013-01-01

    Affective processes appraise the salience of external stimuli preparing the agent for action. So far, the relationship between stimuli, affect, and action has been mainly studied in highly controlled laboratory conditions. In order to find the generalization of this relationship to social interaction, we assess the influence of the salience of…

  14. Using generic tool kits to build intelligent systems

    NASA Technical Reports Server (NTRS)

    Miller, David J.

    1994-01-01

    The Intelligent Systems and Robots Center at Sandia National Laboratories is developing technologies for the automation of processes associated with environmental remediation and information-driven manufacturing. These technologies, which focus on automated planning and programming and sensor-based and model-based control, are used to build intelligent systems which are able to generate plans of action, program the necessary devices, and use sensors to react to changes in the environment. By automating tasks through the use of programmable devices tied to computer models which are augmented by sensing, requirements for faster, safer, and cheaper systems are being satisfied. However, because of the need for rapid cost-effective prototyping and multi-laboratory teaming, it is also necessary to define a consistent approach to the construction of controllers for such systems. As a result, the Generic Intelligent System Controller (GISC) concept has been developed. This concept promotes the philosophy of producing generic tool kits which can be used and reused to build intelligent control systems.

  15. Measuring laboratory-based influenza surveillance capacity: development of the 'International Influenza Laboratory Capacity Review' Tool.

    PubMed

    Muir-Paulik, S A; Johnson, L E A; Kennedy, P; Aden, T; Villanueva, J; Reisdorf, E; Humes, R; Moen, A C

    2016-01-01

    The 2005 International Health Regulations (IHR 2005) emphasized the importance of laboratory capacity to detect emerging diseases including novel influenza viruses. To support IHR 2005 requirements and the need to enhance influenza laboratory surveillance capacity, the Association of Public Health Laboratories (APHL) and the Centers for Disease Control and Prevention (CDC) Influenza Division developed the International Influenza Laboratory Capacity Review (Tool). Data from 37 assessments were reviewed and analyzed to verify that the quantitative analysis results accurately depicted a laboratory's capacity and capabilities. Subject matter experts in influenza and laboratory practice used an iterative approach to develop the Tool, incorporating feedback and lessons learnt through piloting and implementation. To systematically analyze assessment data, a quantitative framework for analysis was added to the Tool. The review indicated that changes in scores consistently reflected enhanced or decreased capacity. The review process also validated the utility of adding a quantitative analysis component to the assessments and the benefit of establishing a baseline from which to compare future assessments in a standardized way. Use of the Tool has provided APHL, CDC and each assessed laboratory with a standardized analysis of the laboratory's capacity. The information generated is used to improve laboratory systems for laboratory testing and enhance influenza surveillance globally. We describe the development of the Tool and lessons learnt. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Controlling changes - lessons learned from waste management facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, B.M.; Koplow, A.S.; Stoll, F.E.

    This paper discusses lessons learned about change control at the Waste Reduction Operations Complex (WROC) and Waste Experimental Reduction Facility (WERF) of the Idaho National Engineering Laboratory (INEL). WROC and WERF have developed and implemented change control and an as-built drawing process and have identified structures, systems, and components (SSCs) for configuration management. The operations have also formed an Independent Review Committee to minimize costs and resources associated with changing documents. WROC and WERF perform waste management activities at the INEL. WROC activities include storage, treatment, and disposal of hazardous and mixed waste. WERF provides volume reduction of solid low-level waste through compaction, incineration, and sizing operations. WROC and WERF's efforts aim to improve change control processes that have worked inefficiently in the past.

  17. Underground Nuclear Astrophysics - from LUNA to CASPAR

    NASA Astrophysics Data System (ADS)

    Strieder, Frank; Caspar Collaboration

    2015-04-01

    It is in the nature of astrophysics that many of the processes and objects are physically inaccessible. Thus, it is important that those aspects that can be studied in the laboratory are well understood. Nuclear reactions are such quantities that can be partly measured in the laboratory. These reactions influence the nucleosynthesis of the elements in the Big Bang as well as in all objects formed thereafter, and control the associated energy generation and evolution of stars. For 20 years LUNA (Laboratory for Underground Nuclear Astrophysics) has been measuring cross sections relevant for hydrogen burning in the Gran Sasso Laboratory and has demonstrated the research potential of an underground accelerator facility. Unfortunately, the number of reactions is limited by the energy range accessible with the 400 kV LUNA accelerator. The CASPAR (Compact Accelerator System for Performing Astrophysical Research) Collaboration will implement a high intensity 1 MV accelerator at the Sanford Underground Research Facility (SURF) and overcome the current limitation at LUNA. This project will primarily focus on the neutron sources for the so-called s-process, e.g. ¹³C(α,n)¹⁶O and ²²Ne(α,n)²⁵Mg, and lead to unprecedented measurements compared to previous studies.

  18. Electrohydraulic Forming of Near-Net Shape Automotive Panels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golovaschenko, Sergey F.

    2013-09-26

    The objective of this project was to develop the electrohydraulic forming (EHF) process as a near-net shape automotive panel manufacturing technology that simultaneously reduces the energy embedded in vehicles and the energy consumed while producing automotive structures. Pulsed pressure is created via a shockwave generated by the discharge of high voltage capacitors through a pair of electrodes in a liquid-filled chamber. The shockwave in the liquid, initiated by the expansion of the plasma channel formed between two electrodes, propagates towards the blank and causes the blank to be deformed into a one-sided die cavity. The numerical model of the EHF process was validated experimentally and was successfully applied to the design of the electrode system and to a multi-electrode EHF chamber for full-scale validation of the process. The numerical model was able to predict stresses in the dies during pulsed forming and was validated by the experimental study of the die insert failure mode for corner filling operations. The electrohydraulic forming process and its major subsystems, including durable electrodes, an EHF chamber, a water/air management system, a pulse generator and integrated process controls, were validated to be capable of operating in a fully automated, computer-controlled mode for forming of a portion of a full-scale sheet metal component in laboratory conditions. Additionally, the novel processes of electrohydraulic trimming and electrohydraulic calibration were demonstrated at a reduced-scale component level. Furthermore, a hybrid process combining conventional stamping with EHF was demonstrated as a laboratory process for a full-scale automotive panel formed out of AHSS material. The economic feasibility of the developed EHF processes was defined by developing a cost model of the EHF process in comparison to the conventional stamping process.

  19. Dieting and the self-control of eating in everyday environments: An experience sampling study

    PubMed Central

    Hofmann, Wilhelm; Adriaanse, Marieke; Vohs, Kathleen D.; Baumeister, Roy F.

    2013-01-01

    Objective: The literature on dieting has sparked several debates over how restrained eaters differ from unrestrained eaters in their self-regulation of healthy and unhealthy food desires and what distinguishes successful from unsuccessful dieters. We addressed these debates using a four-component model of self-control that was tested using ecological momentary assessment, long-term weight change, and a laboratory measure of inhibitory control. Design: A large sample of adults varying in dietary restraint and inhibitory control (as measured by a Stroop task) were equipped with smartphones for a week. They were beeped on random occasions and provided information on their experience and control of healthy and unhealthy food desires in everyday environments. Main Outcome Measures: Desire strength, experienced conflict, resistance, enactment of desire, and weight change after a four-month follow-up. Results and Conclusions: Dietary restraint was unrelated to desire frequency and strength, but associated with higher conflict experiences and motivation to use self-control with regard to food desires. Most importantly, relationships between and among dietary restraint and resistance, enactment of desire, and long-term weight change were moderated by inhibitory control: Compared to dieters low in response inhibition, dieters high in response inhibition were more likely to attempt to resist food desires, not consume desired food (especially unhealthy food), and objectively lost more weight over the ensuing four months. These results highlight the combinatory effects of aspects of the self-control process in dieters and highlight the value in linking theoretical process frameworks, experience sampling, and laboratory-based assessment in health science. PMID:23751109

  20. Spatial variation in microbial processes controlling carbon mineralization within soils and sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fendorf, Scott; Kleber, Markus; Nico, Peter

    Soils have a defining role in global carbon cycling, having one of the largest dynamic stocks of C on earth—3300 Pg of C are stored in soils, which is three times the amount stored in the atmosphere and more than in terrestrial land plants. An important control on soil organic matter (SOM) quantities is the mineralization rate. It is well recognized that the rate and extent of SOM mineralization are affected by climatic factors and mineral-organic matter associations. What remained elusive is to what extent constraints on microbial metabolism induced by the respiratory pathway, and specifically the electron acceptor in respiration, control overall rates of carbon mineralization in soils. Physical factors limiting oxygen diffusion, such as soil texture and aggregate size (soil structure), may therefore be central controls on C mineralization rates. The goal of our research was therefore to determine whether variations in microbial metabolic rates induced by anaerobic microsites in soils are a major control on SOM mineralization rates and thus storage. We performed a combination of laboratory experiments and field investigations to fulfill our research objectives. We used laboratory studies to examine fundamental factors of respiratory constraints (i.e., electron acceptor) on organic matter mineralization rates. We grounded our laboratory studies with both manipulation of field samples and in-field measurements. Selection of the field sites was guided by variation in soil texture and structure while holding other environmental/soil factors constant. Our laboratory studies defined redox gradients and variations in microbial metabolism operating at the aggregate scale (cm scale) within soils using a novel constructed diffusion reactor. We further examined micro-scale variation in terminal electron accepting processes and resulting C mineralization rates within re-packed soils. A major outcome of our research is the ability to quantitatively place the importance of aggregate-based heterogeneity in microbial redox processes and the resulting lack of oxygen on the rate of carbon mineralization. Collectively, our research shows that anaerobic microsites are prevalent in soils and are important regulators of soil carbon persistence, shifting microbial metabolism to less efficient anaerobic respiration and selectively protecting otherwise bioavailable, reduced organic compounds such as lipids and waxes from decomposition. Further, shifting from anaerobic to aerobic conditions leads to a 10-fold increase in volume-specific mineralization rate, illustrating the sensitivity of anaerobically protected carbon to disturbance. Vulnerability of anaerobically protected carbon to future climate or land use change thus constitutes a yet unrecognized soil carbon-climate feedback that should be incorporated into terrestrial ecosystem models.

  1. Development of an Excel-based laboratory information management system for improving workflow efficiencies in early ADME screening.

    PubMed

    Lu, Xinyan

    2016-01-01

    There is a clear requirement for enhancing laboratory information management during early absorption, distribution, metabolism and excretion (ADME) screening. The application of a commercial laboratory information management system (LIMS) is limited by complexity, insufficient flexibility, high costs and extended timelines. An improved custom in-house LIMS for ADME screening was developed using Excel. All Excel templates were generated through macros and formulae, and information flow was streamlined as much as possible. This system has been successfully applied in task generation, process control and data management, with a reduction in both labor time and human error rates. An Excel-based LIMS can provide a simple, flexible and cost/time-saving solution for improving workflow efficiencies in early ADME screening.
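    The published system was built with Excel macros and formulae; as a rough, hypothetical stand-in, the sketch below generates a task list and a results-tracking sheet programmatically with openpyxl. The sheet layout, assay names, and file name are invented for illustration.

```python
# Hedged sketch: programmatic generation of a simple ADME task list and sample-tracking
# workbook with openpyxl. Column names, compound/assay identifiers, and the file name are
# placeholders, not the layout described in the paper.
from datetime import date
from openpyxl import Workbook

def build_task_workbook(compounds, assays, path="adme_tasks.xlsx"):
    wb = Workbook()
    tasks = wb.active
    tasks.title = "Tasks"
    tasks.append(["Task ID", "Compound", "Assay", "Requested", "Status"])
    task_id = 1
    for compound in compounds:
        for assay in assays:
            tasks.append([f"T{task_id:04d}", compound, assay,
                          date.today().isoformat(), "Pending"])
            task_id += 1
    # A second sheet tracks raw-data files and review state for each task.
    results = wb.create_sheet("Results")
    results.append(["Task ID", "Raw data file", "Result", "Reviewed by", "Review date"])
    wb.save(path)
    return path

if __name__ == "__main__":
    print(build_task_workbook(["CPD-001", "CPD-002"],
                              ["microsomal stability", "Caco-2", "PPB"]))
```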

  2. DHM and serious games: a case-study oil and gas laboratories.

    PubMed

    Santos, V; Zamberlan, M; Streit, P; Oliveira, J; Guimarães, C; Pastura, F; Cid, G

    2012-01-01

    The aim of this paper is to present research on the application of serious games for the design of laboratories in the oil and gas industries. The focus is on human virtual representation acquired from 3D scanning, human interaction, workspace layout, and equipment designed considering ergonomics standards. The laboratory studies were simulated on the Unity3D platform, which allows the users to control the DHM in a dynamic virtual scenario in order to simulate work activities. This methodology can change the design process by improving the level of interaction between final users, managers, and human factors teams. That helps to better visualize future work settings and improve the level of participation among all stakeholders.

  3. Computational analysis of fluid dynamics in pharmaceutical freeze-drying.

    PubMed

    Alexeenko, Alina A; Ganguly, Arnab; Nail, Steven L

    2009-09-01

    Analysis of water vapor flows encountered in pharmaceutical freeze-drying systems, laboratory-scale and industrial, is presented based on computational fluid dynamics (CFD) techniques. Flows under continuum gas conditions are analyzed using the solution of the Navier-Stokes equations, whereas the rarefied flow solutions are obtained by the direct simulation Monte Carlo (DSMC) method for the Boltzmann equation. Examples of the application of CFD techniques to laboratory-scale and industrial-scale freeze-drying processes are discussed, with an emphasis on the utility of CFD for improving the design and experimental characterization of pharmaceutical freeze-drying hardware and processes. The current article presents a two-dimensional simulation of a laboratory-scale dryer, with an emphasis on the importance of drying conditions and hardware design for process control, and a three-dimensional simulation of an industrial dryer, including a comparison of the obtained results with analytical viscous flow solutions. It was found that the presence of clean-in-place (CIP)/sterilize-in-place (SIP) piping in the duct led to significant changes in the flow field characteristics. The simulation results for vapor flow rates in an industrial freeze-dryer have been compared to tunable diode laser absorption spectroscopy (TDLAS) and gravimetric measurements.
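    A worked example of the regime decision implied above: the Knudsen number, the ratio of the vapor mean free path to a characteristic duct dimension, indicates whether a continuum (Navier-Stokes) or rarefied (DSMC) treatment is appropriate. The chamber pressure, temperature, and duct size below are assumed illustrative values, not measurements from the cited dryers.

```python
# Worked example: Knudsen number check for low-pressure water vapor. Input conditions
# are illustrative assumptions, not data from the dryers in the study.
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K
D_H2O = 4.6e-10         # effective molecular diameter of water vapor, m (approximate)

def mean_free_path(pressure_pa, temp_k):
    """Hard-sphere mean free path: lambda = k_B T / (sqrt(2) * pi * d^2 * p)."""
    return K_B * temp_k / (math.sqrt(2.0) * math.pi * D_H2O ** 2 * pressure_pa)

def flow_regime(pressure_pa, temp_k, length_m):
    kn = mean_free_path(pressure_pa, temp_k) / length_m
    if kn < 0.01:
        return kn, "continuum: Navier-Stokes"
    if kn < 10.0:
        return kn, "transitional/slip: DSMC (or slip corrections)"
    return kn, "free molecular: DSMC"

if __name__ == "__main__":
    # ~10 Pa chamber pressure, 250 K vapor, 10 cm duct (illustrative freeze-drying conditions)
    kn, regime = flow_regime(10.0, 250.0, 0.10)
    print(f"Kn = {kn:.3g} -> {regime}")
```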

  4. [Development of a microbiology data warehouse (Akita-ReNICS) for networking hospitals in a medical region].

    PubMed

    Ueki, Shigeharu; Kayaba, Hiroyuki; Tomita, Noriko; Kobayashi, Noriko; Takahashi, Tomoe; Obara, Toshikage; Takeda, Masahide; Moritoki, Yuki; Itoga, Masamichi; Ito, Wataru; Ohsaga, Atsushi; Kondoh, Katsuyuki; Chihara, Junichi

    2011-04-01

    The active involvement of hospital laboratories in surveillance is crucial to the success of nosocomial infection control. The recent dramatic increase of antimicrobial-resistant organisms and their spread into the community suggest that the infection control strategies of independent medical institutions are insufficient. To share clinical data and surveillance in our local medical region, we developed a microbiology data warehouse for networking hospital laboratories in Akita prefecture. This system, named Akita-ReNICS, is an easy-to-use information management system designed to compare, track, and report the occurrence of antimicrobial-resistant organisms. Participating laboratories routinely transfer their coded and formatted microbiology data from their health care system's clinical computer applications over the internet to the ReNICS server located at Akita University Hospital. We established the system to automate the statistical processes, so that the participants can access the server to monitor graphical data in the manner they prefer, using their own computer's browser. Furthermore, our system also provides a document server, a microbiology and antimicrobial database, and space for long-term storage of microbiological samples. Akita-ReNICS could be a next-generation network for quality improvement of infection control.

  5. Quality-control materials in the USDA National Food and Nutrient Analysis Program (NFNAP).

    PubMed

    Phillips, Katherine M; Patterson, Kristine Y; Rasor, Amy S; Exler, Jacob; Haytowitz, David B; Holden, Joanne M; Pehrsson, Pamela R

    2006-03-01

    The US Department of Agriculture (USDA) Nutrient Data Laboratory (NDL) develops and maintains the USDA National Nutrient Databank System (NDBS). Data are released from the NDBS for scientific and public use through the USDA National Nutrient Database for Standard Reference (SR) ( http://www.ars.usda.gov/ba/bhnrc/ndl ). In 1997 the NDL initiated the National Food and Nutrient Analysis Program (NFNAP) to update and expand its food-composition data. The program included: 1) nationwide probability-based sampling of foods; 2) central processing and archiving of food samples; 3) analysis of food components at commercial, government, and university laboratories; 4) incorporation of new analytical data into the NDBS; and 5) dissemination of these data to the scientific community. A key feature and strength of the NFNAP was a rigorous quality-control program that enabled independent verification of the accuracy and precision of analytical results. Custom-made food-control composites and/or commercially available certified reference materials were sent to the laboratories, blinded, with the samples. Data for these materials were essential for ongoing monitoring of analytical work, for identifying and resolving suspected analytical problems, and for ensuring the accuracy and precision of results for the NFNAP food samples.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sailer, S.J.

    This Quality Assurance Project Plan (QAPjP) specifies the quality of data necessary and the characterization techniques employed at the Idaho National Engineering Laboratory (INEL) to meet the objectives of the Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP) Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) requirements. This QAPjP is written to conform with the requirements and guidelines specified in the QAPP and the associated documents referenced in the QAPP. This QAPjP is one of a set of five interrelated QAPjPs that describe the INEL Transuranic Waste Characterization Program (TWCP). Each of the five facilities participating in the TWCP has a QAPjP that describes the activities applicable to that particular facility. This QAPjP describes the roles and responsibilities of the Idaho Chemical Processing Plant (ICPP) Analytical Chemistry Laboratory (ACL) in the TWCP. Data quality objectives and quality assurance objectives are explained. Sample analysis procedures and associated quality assurance measures are also addressed; these include: sample chain of custody; data validation, usability, and reporting; documentation and records; audits and assessments; laboratory QC samples; and instrument testing, inspection, maintenance, and calibration. Finally, administrative quality control measures, such as document control, control of nonconformances and variances, and QA status reporting, are described.

  7. 21 CFR 226.58 - Laboratory controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Laboratory controls. Laboratory controls shall include the establishment of adequate specifications and test... establishment of master records containing appropriate specifications and a description of the test procedures... necessary laboratory test procedures to check such specifications. (c) Assays which shall be made of...

  8. Proceedings of the Augmented VIsual Display (AVID) Research Workshop

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K. (Editor); Sweet, Barbara T. (Editor)

    1993-01-01

    The papers, abstracts, and presentations were presented at a three day workshop focused on sensor modeling and simulation, and image enhancement, processing, and fusion. The technical sessions emphasized how sensor technology can be used to create visual imagery adequate for aircraft control and operations. Participants from industry, government, and academic laboratories contributed to panels on Sensor Systems, Sensor Modeling, Sensor Fusion, Image Processing (Computer and Human Vision), and Image Evaluation and Metrics.

  9. Quality Assurance Specifications for Planetary Protection Assays

    NASA Astrophysics Data System (ADS)

    Baker, Amy

    As the European Space Agency planetary protection (PP) activities move forward to support the ExoMars and other planetary missions, it will become necessary to increase staffing of laboratories that provide analyses for these programs. Standardization of procedures, a comprehensive quality assurance program, and unilateral training of personnel will be necessary to ensure that the planetary protection goals and schedules are met. The PP Quality Assurance/Quality Control (QAQC) program is designed to regulate and monitor procedures performed by laboratory personnel to ensure that all work meets data quality objectives through the assembly and launch process. Because personnel time is at a premium and sampling schedules are often dependent on engineering schedules, it is necessary to have flexible staffing to support all sampling requirements. The most productive approach to having a competent and flexible work force is to establish well defined laboratory procedures and training programs that clearly address the needs of the program and the work force. The quality assurance specification for planetary protection assays has to ensure that laboratories and associated personnel can demonstrate the competence to perform assays according to the applicable standard AD4. Detailed subjects included in the presentation are as follows: field and laboratory control criteria; data reporting; personnel training requirements and certification; and laboratory audit criteria. Based upon RD2 for primary and secondary validation and RD3 for data quality objectives, the QAQC will provide traceable quality assurance safeguards by providing structured laboratory requirements for guidelines and oversight, including training and technical updates, standardized documentation, standardized QA/QC checks, data review, and data archiving.

  10. QA/QC in the laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hood, F.C.

    1992-05-01

    Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of resultant data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to assure against fraudulent practices, QA/QC measures must be tailored to specific analysis protocols in anticipation of intentional misapplication of those protocols. Application of specific QA/QC measures to ensure against fraudulent practices results in an increased administrative burden being placed on the analytical process; accordingly, in keeping with graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control to minimize the administrative impact.

  12. Evolution of an International External Quality Assurance Model To Support Laboratory Investigation of Streptococcus pneumoniae, Developed for the SIREVA Project in Latin America, from 1993 to 2005

    PubMed Central

    Lovgren, Marguerite; Talbot, James A.; Brandileone, Maria Cristina; Casagrande, Silvana T.; Agudelo, Clara Inés; Castañeda, Elizabeth; Regueira, Mabel; Corso, Alejandra; Heitmann, Ingrid; Maldonado, Aurora; Echániz-Avilés, Gabriela; Soto-Noguerón, Araceli; Hortal, María; Camou, Teresa; Gabastou, Jean-Marc; Fabio, José Luis Di

    2007-01-01

    In 1993 the Pan American Health Organization initiated a laboratory-based surveillance system, called the SIREVA project, to learn about Streptococcus pneumoniae invasive disease in Latin American children. In 1994, National Laboratories in six countries were trained to perform serotyping and antibiotic susceptibility testing using broth microdilution to determine the MIC for specified antibiotics. An international External Quality Assurance (EQA) program was developed to monitor and support ongoing laboratory performance. The EQA program was coordinated by the National Centre for Streptococcus (NCS), Edmonton, Canada, and included external proficiency testing (EPT) and a validation process requiring regular submission of a sample of isolates from each laboratory to the NCS for verification of the serotype and MIC. In 1999, the EQA program was decentralized to use three of the original laboratories as regional quality control centers to address operational concerns and to accommodate the growth of the laboratory network to more than 20 countries including the Caribbean region. The overall EPT serotyping accuracies for phase I (1993 to 1998) and phase II (1999 to 2005) were 88.0 and 93.8%, respectively; the MIC correlations within ±1 log2 dilution of the expected result were 83.0 and 91.0% and the interpretive category agreements were 89.1 and 95.3%. Overall, the validation process serotyping accuracies for phases I and II were 81.9 and 88.1%, respectively, 80.4 and 90.5% for MIC agreement, and 85.8 and 94.3% for category agreement. These results indicate a high level of testing accuracy in participating National Laboratories and a sustained increase in EQA participation in Latin America and the Caribbean. PMID:17687007
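    The two performance measures quoted above can be checked with very little code: MIC agreement within plus or minus one two-fold (log2) dilution of the reference value, and agreement of the interpretive category derived from breakpoints. The breakpoints and results in the sketch are invented, not SIREVA data.

```python
# Small sketch of the EQA performance measures: MIC agreement within +/- one two-fold
# dilution, and interpretive category agreement. Breakpoints and results are invented.
import math

def within_one_dilution(reported_mic, reference_mic):
    """True if the reported MIC is within +/- 1 log2 (two-fold) dilution of the reference."""
    return abs(math.log2(reported_mic) - math.log2(reference_mic)) <= 1.0

def category(mic, susceptible_bp, resistant_bp):
    """Interpretive category from illustrative breakpoints: S, I, or R."""
    if mic <= susceptible_bp:
        return "S"
    if mic >= resistant_bp:
        return "R"
    return "I"

if __name__ == "__main__":
    reported, reference = 0.5, 0.25          # ug/mL, one dilution apart
    print("MIC agreement:", within_one_dilution(reported, reference))
    print("category agreement:",
          category(reported, 2, 8) == category(reference, 2, 8))
```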

  13. LabRS: A Rosetta stone for retrospective standardization of clinical laboratory test results.

    PubMed

    Hauser, Ronald George; Quine, Douglas B; Ryder, Alex

    2018-02-01

    Clinical laboratories in the United States do not have an explicit result standard to report the 7 billion laboratory test results they produce each year. The absence of standardized test results creates inefficiencies and ambiguities for secondary data users. We developed and tested a tool to standardize the results of laboratory tests in a large, multicenter clinical data warehouse. Laboratory records, each of which consisted of a laboratory result and a test identifier, from 27 diverse facilities were captured from 2000 through 2015. Each record underwent a standardization process to convert the original result into a format amenable to secondary data analysis. The standardization process included the correction of typos, normalization of categorical results, separation of inequalities from numbers, and conversion of numbers represented by words (eg, "million") to numerals. Quality control included expert review. We obtained 1.266 × 10⁹ laboratory records and standardized 1.252 × 10⁹ records (98.9%). Of the unique unstandardized records (78.887 × 10³), most appeared <5 times (96%; eg, typos), did not have a test identifier (47%), or belonged to an esoteric test with <100 results (2%). Overall, these 3 reasons accounted for nearly all unstandardized results (98%). Current results suggest that the tool is both scalable and generalizable among diverse clinical laboratories. Based on observed trends, the tool will require ongoing maintenance to stay current with new tests and result formats. Future work to develop and implement an explicit standard for test results would reduce the need to retrospectively standardize test results. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
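    A hedged sketch of the standardization steps the abstract lists (normalizing categorical results, separating inequality signs from numbers, and converting number words to numerals) is shown below; the mapping tables and regular expression are illustrative guesses, not the LabRS rules.

```python
# Hedged sketch of result standardization of the kind described: categorical
# normalization, comparator/number separation, and number-word conversion. The
# mappings and regex are illustrative, not the LabRS implementation.
import re

CATEGORICAL = {"neg": "negative", "negative": "negative", "nonreactive": "negative",
               "pos": "positive", "positive": "positive", "reactive": "positive"}
WORD_MULTIPLIERS = {"thousand": 1e3, "million": 1e6, "billion": 1e9}

def standardize(raw):
    """Return a dict with a comparator and numeric value, or a normalized categorical result."""
    text = raw.strip().lower()
    if text in CATEGORICAL:
        return {"categorical": CATEGORICAL[text]}
    # e.g. "<5" or "1.3 million" -> comparator plus number, with word multipliers applied
    m = re.match(r"^(<=|>=|<|>)?\s*(\d+(?:\.\d+)?)\s*([a-z]+)?$", text)
    if not m:
        return {"unstandardized": raw}
    comparator, number, word = m.group(1) or "=", float(m.group(2)), m.group(3)
    if word:
        if word not in WORD_MULTIPLIERS:
            return {"unstandardized": raw}
        number *= WORD_MULTIPLIERS[word]
    return {"comparator": comparator, "value": number}

if __name__ == "__main__":
    for raw in ["NEG", "<5", "1.3 million", "Nonreactive", "see note"]:
        print(raw, "->", standardize(raw))
```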

  14. Mapping of the US Domestic Influenza Virologic Surveillance Landscape.

    PubMed

    Jester, Barbara; Schwerzmann, Joy; Mustaquim, Desiree; Aden, Tricia; Brammer, Lynnette; Humes, Rosemary; Shult, Pete; Shahangian, Shahram; Gubareva, Larisa; Xu, Xiyan; Miller, Joseph; Jernigan, Daniel

    2018-07-17

    Influenza virologic surveillance is critical each season for tracking influenza circulation, following trends in antiviral drug resistance, detecting novel influenza infections in humans, and selecting viruses for use in annual seasonal vaccine production. We developed a framework and process map for characterizing the landscape of US influenza virologic surveillance into 5 tiers of influenza testing: outpatient settings (tier 1), inpatient settings and commercial laboratories (tier 2), state public health laboratories (tier 3), National Influenza Reference Center laboratories (tier 4), and Centers for Disease Control and Prevention laboratories (tier 5). During the 2015-16 season, the numbers of influenza tests directly contributing to virologic surveillance were 804,000 in tiers 1 and 2; 78,000 in tier 3; 2,800 in tier 4; and 3,400 in tier 5. With the release of the 2017 US Pandemic Influenza Plan, the proposed framework will support public health officials in modeling, surveillance, and pandemic planning and response.

  15. Baking sunflower hulls within an aluminum envelope in a common laboratory oven yields charcoal.

    PubMed

    Arnal, Pablo Maximiliano

    2015-01-01

    Charcoals have been widely used by scientists to research the removal of contaminants from water and air. One key feature of charcoal is that it keeps macropores from the parent material - though anisotropically contracted - and can even develop meso- and micropores. However, the controlled thermochemical conversion of biomass into charcoal at laboratory scale normally requires special setups which involve either vacuum or inert gas. Those setups may not be affordable for research groups or educational institutions where research on charcoals would be highly welcome. In this work, I propose a simple and effective method to steer the thermochemical process that converts sunflower hulls (SFH) into charcoal with basic laboratory resources. The carbonization method: place the SFH in an airtight aluminum envelope; thermally treat the SFH within the envelope in a common laboratory oven; and open the envelope to obtain the carbonized sunflower hulls.

  16. Quality Assurance Program for Molecular Medicine Laboratories

    PubMed Central

    Hajia, M; Safadel, N; Samiee, S Mirab; Dahim, P; Anjarani, S; Nafisi, N; Sohrabi, A; Rafiee, M; Sabzavi, F; Entekhabi, B

    2013-01-01

    Background: Molecular diagnostic methods have played, and continue to play, a critical role in clinical laboratories in recent years. Standardization is therefore an evolutionary process that needs to be upgraded as scientific knowledge increases and instruments and techniques improve. The aim of this study was to design a quality assurance program in order to provide similar conditions for all medical laboratories performing molecular tests. Methods: We had to design a plan for all four elements: required space conditions, equipment, training, and basic guidelines. The necessary guidelines were prepared and confirmed by the specific committee launched at the Health Reference Laboratory. Results: Several workshops were also held for medical laboratory directors and staff, quality control managers of molecular companies, and directors and nominees from universities. Accreditation of equipment and molecular materials proceeded in parallel with the rest of the program. We are now going to accredit medical laboratories and to evaluate the success of the program. Conclusion: Accreditation of medical laboratories will succeed if its basic elements are provided in advance. Professional practice guidelines, training, and accreditation of molecular materials and equipment ensured that laboratories are aware of best practices, proper interpretation, limitations of techniques, and technical issues. Active external auditing can now improve the applied laboratory conditions toward the defined standard level. PMID:23865028

  18. Physiological response of invasive mussel Limnoperna fortunei (Dunker, 1857) (Bivalvia: Mytilidae) submitted to transport and experimental conditions.

    PubMed

    Cordeiro, N I S; Andrade, J T M; Montresor, L C; Luz, D M R; Araújo, J M; Martinez, C B; Pinheiro, J; Vidigal, T H D A

    2017-03-01

    Successful animal rearing under laboratory conditions, for commercial processes or laboratory experiments, is a complex chain that includes several stressors (e.g., sampling and transport) and, as a consequence, can compromise the animals' natural condition, cause economic losses, and yield inconsistent and unreliable biological results. Since the invasion of the bivalve Limnoperna fortunei (Dunker, 1857) in South America, several studies have been performed to help control and manage this fouling pest in industrial plants that use raw water. Relatively little attention has been given to the laboratory rearing procedure for L. fortunei, its condition when exposed to a stressor, or its acclimation to laboratory conditions. Considering this issue, the aims of this study are to (i) investigate the physiological responses of L. fortunei when submitted to the depuration process and subsequent air transport (without water/dry condition) at two temperatures, based on glycogen concentrations, and (ii) monitor the glycogen concentrations in the different groups when maintained for 28 days under laboratory conditions. Based on the obtained results, depuration did not affect either of the groups when they were submitted to approximately eight hours of transport. The variation in glycogen concentration among the specimens obtained from the field under depurated and non-depurated conditions was significant only in the first week of laboratory rearing for the non-depurated group and in the second week for the depurated group. In addition, the tested temperature did not affect either of the groups submitted to transport. The glycogen concentrations were similar to those of the specimens obtained from the field by the third week, which suggests that the specimens acclimated to laboratory conditions during this period of time. Thus, the results indicate that air transport and an acclimation time can be successfully incorporated into experimental studies of L. fortunei. Finally, the tolerance of L. fortunei specimens to the stressor tested herein can help us understand the invasive capacity of this mussel during the establishment process.

  19. A novel method for room temperature distribution and conservation of RNA and DNA reference materials for guaranteeing performance of molecular diagnostics in onco-hematology: A GBMHM study.

    PubMed

    Cayuela, Jean-Michel; Mauté, Carole; Fabre, Anne-Lise; Nibourel, Olivier; Dulucq, Stéphanie; Delabesse, Eric; Villarèse, Patrick; Hayette, Sandrine; Mozziconacci, Marie-Joelle; Macintyre, Elizabeth

    2015-10-01

    Performance of methods used for molecular diagnostics must be closely controlled by regular analysis of internal quality controls. However, conditioning, shipping and long lasting storage of nucleic acid controls remain problematic. Therefore, we evaluated the minicapsule-based innovative process developed by Imagene (Evry, France) for implementing DNA and RNA controls designed for clonality assessment of lymphoproliferations and BCR-ABL1 mRNA quantification, respectively. DNA samples were extracted from 12 cell lines selected for giving specific amplifications with most BIOMED-2 PCR tubes. RNA samples were extracted from 8 cell line mixtures expressing various BCR-ABL1 transcript levels. DNA and RNA were encapsulated by Imagene and shipped at room temperature to participating laboratories. Biologists were asked to report quality data of recovered nucleic acids as well as PCR results. Encapsulated nucleic acids samples were easily and efficiently recovered from minicapsules. The expected rearrangements at immunoglobulin, T-cell receptor and BCL2 loci were detected in DNA samples by all laboratories. Quality of RNA was consistent between laboratories and met the criteria requested for quantification of BCR-ABL1 transcripts. Expression levels measured by the 5 laboratories were within ±2 fold interval from the corresponding pre-encapsulation reference value. Moreover aging studies of encapsulated RNA simulating up to 100 years storage at room temperature show no bias in quantitative outcome. Therefore, Imagene minicapsules are suitable for storage and distribution at room temperature of genetic material designed for proficiency control of molecular diagnostic methods based on end point or real-time quantitative PCR. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
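
    The acceptance rule quoted above, a measured expression level within a ±2-fold interval of the pre-encapsulation reference value, is easy to restate as code. The sketch below does only that, for a handful of made-up BCR-ABL1 measurements; the function name, laboratory labels, and values are invented for illustration and are not taken from the study.

        def within_two_fold(measured: float, reference: float) -> bool:
            # True when the measured value lies within a factor of 2 of the reference.
            ratio = measured / reference
            return 0.5 <= ratio <= 2.0

        # Hypothetical BCR-ABL1 ratios reported by five laboratories for one RNA sample,
        # compared with an assumed pre-encapsulation reference value.
        reference = 0.12
        results = {"lab A": 0.10, "lab B": 0.21, "lab C": 0.05, "lab D": 0.15, "lab E": 0.26}
        for lab, measured in results.items():
            verdict = "OK" if within_two_fold(measured, reference) else "outside the ±2-fold interval"
            print(lab, verdict)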

  20. Designing Scenarios for Controller-in-the-Loop Air Traffic Simulations

    NASA Technical Reports Server (NTRS)

    Kupfer, Michael; Mercer, Joey S.; Cabrall, Christopher; Callantine, Todd

    2013-01-01

    Well-prepared traffic scenarios contribute greatly to the success of controller-in-the-loop simulations. This paper describes each stage in the design process of realistic scenarios based on real-world traffic, to be used in the Airspace Operations Laboratory for simulations within the Air Traffic Management Technology Demonstration 1 effort. The steps are all described, from the initial analysis of real-world traffic, through the editing of individual aircraft records in the scenario file, to the final testing of the scenarios before the simulations are conducted. The iterative nature of the design process and the various efforts necessary to reach the required fidelity, as well as the design strategies applied, the challenges encountered, and the tools used during this process, are also discussed.

  1. Cost effectiveness of adopted quality requirements in hospital laboratories.

    PubMed

    Hamza, Alneil; Ahmed-Abakur, Eltayib; Abugroun, Elsir; Bakhit, Siham; Holi, Mohamed

    2013-01-01

    The present study was designed as a quasi-experiment to assess adoption of the essential clauses of clinical laboratory quality management requirements based on the International Organization for Standardization standard ISO 15189 in hospital laboratories, and to evaluate the cost effectiveness of compliance with ISO 15189. The quality management intervention based on ISO 15189 was conducted in three phases: a pre-intervention phase, an intervention phase, and a post-intervention phase. In the pre-intervention phase, compliance with ISO 15189 was 49% for the study group vs. 47% for the control group (P value 0.48), while the post-intervention results showed 54% vs. 79% compliance for the study group and control group, respectively, a statistically significant difference (P value 0.00), with an effect size (Cohen's d) of 0.00 in the pre-intervention phase and 0.99 in the post-intervention phase. The annual average cost per test for the study group and control group was 1.80 ± 0.25 vs. 1.97 ± 0.39, respectively (P value 0.39), whereas the post-intervention results showed that the annual average total cost per test for the study group and control group was 1.57 ± 0.23 vs. 2.08 ± 0.38, respectively (P value 0.019), with a cost-effectiveness ratio of 0.88 in the pre-intervention phase and 0.52 in the post-intervention phase. The planned adoption of quality management system (QMS) requirements in clinical laboratories had a great effect in increasing the percentage of compliance with the quality management system requirements, raising the average total cost effectiveness, and improving the analytical process capability of the testing procedures.
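
    As a rough illustration of the summary statistics reported above, the sketch below computes Cohen's d from two group means and standard deviations, plus a simple cost-per-compliance-point ratio. The function name, the group sizes, the pairing of costs with compliance percentages, and the cost-effectiveness definition are all assumptions made for the example; the code is not a reproduction of the study's published figures.

        import math

        def cohens_d(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
            # Pooled-standard-deviation form of Cohen's d for two independent groups.
            pooled_sd = math.sqrt(((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2) / (n_a + n_b - 2))
            return (mean_a - mean_b) / pooled_sd

        # Hypothetical example: post-intervention cost per test (study vs. control),
        # assuming 30 laboratories per group.
        d = cohens_d(1.57, 0.23, 30, 2.08, 0.38, 30)

        # One simple (assumed) cost-effectiveness ratio: average cost per test
        # divided by the compliance percentage achieved by that group.
        cer = 1.57 / 79.0
        print(f"Cohen's d = {d:.2f}, cost per compliance point = {cer:.3f}")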

  2. New algorithm for controlling electric arc furnaces using their vibrational and acoustic characteristics

    NASA Astrophysics Data System (ADS)

    Cherednichenko, V. S.; Bikeev, R. A.; Serikov, V. A.; Rechkalov, A. V.; Cherednichenko, A. V.

    2016-12-01

    The processes occurring in arc discharges are analyzed as the sources of acoustic radiation in an electric arc furnace (EAF). Acoustic vibrations are shown to transform into mechanical vibrations in the furnace laboratory. The shielding of the acoustic energy fluxes onto water-cooled wall panels by a charge is experimentally studied. It is shown that the rate of charge melting and the depth of submergence of arc discharges in the slag and metal melt can be monitored by measuring the vibrational characteristics of furnaces and using them in a universal industrial process-control system, which was developed for EAFs.
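
    The abstract does not give the monitoring algorithm itself, but the idea it describes, inferring charge melting from furnace vibration, typically reduces to extracting simple features from an accelerometer signal and tracking them against thresholds. The sketch below is only an illustrative guess at such a feature-extraction step: the sampling rate, frequency band, threshold, and synthetic signal are invented for the example and are not from the paper.

        import numpy as np

        def band_rms(signal, fs, f_lo, f_hi):
            # RMS amplitude of the signal restricted to a frequency band,
            # computed by zeroing out-of-band bins of a one-sided FFT.
            spectrum = np.fft.rfft(signal)
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
            mask = (freqs >= f_lo) & (freqs <= f_hi)
            band = np.zeros_like(spectrum)
            band[mask] = spectrum[mask]
            filtered = np.fft.irfft(band, n=len(signal))
            return np.sqrt(np.mean(filtered**2))

        # Hypothetical wall-panel accelerometer trace sampled at 10 kHz.
        fs = 10_000
        t = np.arange(0, 1.0, 1.0 / fs)
        signal = np.sin(2 * np.pi * 300 * t) + 0.2 * np.random.randn(t.size)

        level = band_rms(signal, fs, 100, 1_000)
        MELTDOWN_THRESHOLD = 0.5   # invented value for illustration only
        print("charge still shielding the arcs" if level < MELTDOWN_THRESHOLD
              else "flat-bath stage suspected")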

  3. The Role of Inhibitory Control in Behavioral and Physiological Expressions of Toddler Executive Function

    PubMed Central

    Morasch, Katherine C.; Bell, Martha Ann

    2010-01-01

    Eighty-one toddlers (ranging from 24 to 27 months) participated in a biobehavioral investigation of inhibitory control. Maternal-report measures of inhibitory control were related to laboratory tasks assessing inhibitory abilities under conditions of conflict, delay, and compliance challenge as well as toddler verbal ability. Additionally, unique variance in inhibitory control was explained by task-related changes in brain electrical activity at lateral frontal scalp sites as well as concurrent inhibitory task performance. Implications regarding neural correlates of executive function in early development and a central, organizing role of inhibitory processing in toddlerhood are discussed. PMID:20719337

  4. Programmable calculator as a data system controller

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barth, A.W.; Strasburg, A.C.

    Digital data techniques are in common use for analysis of analog information obtained in various tests, and systems have been developed which use a minicomputer as the central controller and data processor. Now, microprocessors allow new design approaches at considerably less cost. This report outlines an approach to system design based on the use of a programmable calculator as the data system controller. A block diagram of the calculator-controlled data system is shown. It was found that the programmable calculator provides a viable alternative to minicomputers or microprocessors for the development laboratory requiring digital data processing. 3 figures. (RWR)

  5. Experiential learning in control systems laboratories and engineering project management

    NASA Astrophysics Data System (ADS)

    Reck, Rebecca Marie

    Experiential learning is a process by which a student creates knowledge through the insights gained from an experience. Kolb's model of experiential learning is a cycle of four modes: (1) concrete experience, (2) reflective observation, (3) abstract conceptualization, and (4) active experimentation. His model is used in each of the three studies presented in this dissertation. Laboratories are a popular way to apply the experiential learning modes in STEM courses. Laboratory kits allow students to take home laboratory equipment to complete experiments on their own time. Although students like laboratory kits, no previous studies compared student learning outcomes on assignments using laboratory kits with existing laboratory equipment. In this study, we examined the similarities and differences between the experiences of students who used a portable laboratory kit and students who used the traditional equipment. During the 2014- 2015 academic year, we conducted a quasi-experiment to compare students' achievement of learning outcomes and their experiences in the instructional laboratory for an introductory control systems course. Half of the laboratory sections in each semester used the existing equipment, while the other sections used a new kit. We collected both quantitative data and qualitative data. We did not identify any major differences in the student experience based on the equipment they used. Course objectives, like research objectives and product requirements, help provide clarity and direction for faculty and students. Unfortunately, course and laboratory objectives are not always clearly stated. Without a clear set of objectives, it can be hard to design a learning experience and determine whether students are achieving the intended outcomes of the course or laboratory. In this study, I identified a common set of laboratory objectives, concepts, and components of a laboratory apparatus for undergraduate control systems laboratories. During the summer of 2015, a panel of 40 control systems faculty members, from a variety of institutions, completed a multi-round Delphi survey in order to bring them toward consensus on the common aspects of their laboratories. The following winter, 45 additional faculty members and practitioners from the control systems community completed a follow-up survey to gather feedback on the results of the Delphi survey. During the Delphi study, the panelists identified 15 laboratory objectives, 26 concepts, and 15 components that were common in their laboratories. Then in both the Delphi survey and follow-up survey each participant rated the importance of each of these items. While the average ratings differed slightly between the two groups, the order of each set of items was compared with two different tests and the order was found to be similar. Some of the common and important learning objectives include connecting theory to what is implemented and observed in the laboratory, designing controllers, and modeling and simulating systems. The most common component in both groups was Math-Works software. Some of the common concepts include block diagrams, stability, and PID control. Defining common aspects of undergraduate control systems laboratories enables common development, detailed comparisons, and simplified adaptation of equipment and experiments between campuses and programs. Throughout an undergraduate program in engineering, there are multiple opportunities for hands-on laboratory experiences that are related to course content. 
However, a similarly immersive experience for project management graduate students is harder to incorporate for all students in a course at once. This study explores an experiential learning opportunity for graduate students in engineering management or project management programs. The project management students enroll in a project management course. Undergraduate students interested in working on a project with a real customer enroll in a different projects course. Two students from the project management course function as project managers and lead a team of undergraduate students in the second course through a project. I studied how closely the project management experience in these courses aligns with engineering project management in industry. In the spring of 2015, I enrolled in the project management course at a large Midwestern university. I used analytic autoethnography to compare my experiences in the course with my experiences as a project engineer at a large aerospace company. I found that the experience in the course provided an authentic and comprehensive opportunity to practice most of the skills listed in the Project Management Book of Knowledge (an industry standard) as necessary for project managers. Some components of the course that made it successful: I was the project manager for the whole term, I worked with a real client, and the team defined and delivered the project before the end of the semester.
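
    Since PID control is listed above as one of the concepts the Delphi panel found common to undergraduate control systems laboratories, a minimal simulation may help make the idea concrete. The sketch below closes a discrete-time PID loop around a simple first-order plant; the plant model, gains, and time step are arbitrary choices for illustration, not values taken from the study.

        # Minimal discrete-time PID loop around a first-order plant
        # x_dot = (-x + u) / tau, integrated with forward Euler.
        dt, tau = 0.01, 0.5
        kp, ki, kd = 2.0, 1.0, 0.05          # illustrative gains
        setpoint, x = 1.0, 0.0
        integral, prev_error = 0.0, 0.0

        for step in range(500):
            error = setpoint - x
            integral += error * dt
            derivative = (error - prev_error) / dt
            u = kp * error + ki * integral + kd * derivative
            prev_error = error
            x += dt * (-x + u) / tau          # plant update

        print(f"output after 5 s: {x:.3f} (setpoint {setpoint})")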

  6. An Integrated Tiered Service Delivery Model (ITSDM) Based on Local CD4 Testing Demands Can Improve Turn-Around Times and Save Costs whilst Ensuring Accessible and Scalable CD4 Services across a National Programme

    PubMed Central

    Glencross, Deborah K.; Coetzee, Lindi M.; Cassim, Naseem

    2014-01-01

    Background The South African National Health Laboratory Service (NHLS) responded to HIV treatment initiatives with two-tiered CD4 laboratory services in 2004. Increasing programmatic burden, as more patients access anti-retroviral therapy (ART), has demanded extending CD4 services to meet increasing clinical needs. The aim of this study was to review existing services and develop a service model that integrated laboratory-based and point-of-care testing (POCT), to extend national coverage, improve local turnaround time (TAT) and contain programmatic costs. Methods NHLS Corporate Data Warehouse CD4 data, from 60–70 laboratories and 4756 referring health facilities, were reviewed for referral laboratory workload, respective referring facility volumes and related TAT, from 2009–2012. Results An integrated tiered service delivery model (ITSDM) is proposed. Tier-1/POCT delivers CD4 testing at single health-clinics providing ART in hard-to-reach areas (<5 samples/day). Laboratory-based testing is extended with Tier-2/POC-Hubs (processing ≤30–40 CD4 samples/day), consolidating POCT across 8–10 health-clinics with other HIV-related testing, and Tier-3/‘community’ laboratories, serving ≤40 health-clinics and processing ≤150 samples/day. Existing Tier-4/‘regional’ laboratories serve ≤100 facilities and process <350 samples/day; Tier-5 comprises high-volume ‘metro’/centralized laboratories (>350–1500 tests/day, serving ≥200 health-clinics). Tier-6 provides national support for standardisation, harmonisation and quality across the organization. Conclusion The ITSDM offers improved local TAT by extending CD4 services into rural/remote areas with new Tier-3 or Tier-2/POC-Hub services installed in existing community laboratories, most with developed infrastructure. The advantage of lower laboratory CD4 costs and the use of existing infrastructure enables subsidization of the delivery of more expensive POC services into hard-to-reach districts without reasonable access to a local CD4 laboratory. Full ITSDM implementation across five service tiers (as opposed to widespread implementation of POC testing to extend services) can facilitate sustainable ‘full service coverage’ across South Africa and save more than R125 million in HIV/AIDS programmatic costs. ITSDM hierarchical parental support also assures laboratory/POC management, equipment maintenance, quality control and ongoing training between tiers. PMID:25490718
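
    The tier thresholds quoted in the abstract lend themselves to a simple volume-based classification rule. The sketch below is a simplified reading of those cut-offs; the function name and the exact boundary handling are assumptions for illustration, since the model as described also weighs geography and existing infrastructure, not sample volume alone.

        def itsdm_tier(samples_per_day: int) -> str:
            # Simplified mapping of daily CD4 volumes to ITSDM service tiers,
            # based on the cut-offs quoted in the abstract.
            if samples_per_day < 5:
                return "Tier-1: point-of-care testing at the clinic"
            if samples_per_day <= 40:
                return "Tier-2: POC-Hub consolidating 8-10 clinics"
            if samples_per_day <= 150:
                return "Tier-3: 'community' laboratory"
            if samples_per_day < 350:
                return "Tier-4: 'regional' laboratory"
            return "Tier-5: high-volume 'metro'/centralized laboratory"

        for volume in (3, 25, 120, 300, 900):
            print(volume, "->", itsdm_tier(volume))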

  7. Toward a comprehensive understanding of executive cognitive function in implicit racial bias.

    PubMed

    Ito, Tiffany A; Friedman, Naomi P; Bartholow, Bruce D; Correll, Joshua; Loersch, Chris; Altamirano, Lee J; Miyake, Akira

    2015-02-01

    Although performance on laboratory-based implicit bias tasks often is interpreted strictly in terms of the strength of automatic associations, recent evidence suggests that such tasks are influenced by higher-order cognitive control processes, so-called executive functions (EFs). However, extant work in this area has been limited by failure to account for the unity and diversity of EFs, focus on only a single measure of bias and/or EF, and relatively small sample sizes. The current study sought to comprehensively model the relation between individual differences in EFs and the expression of racial bias in 3 commonly used laboratory measures. Participants (N = 485) completed a battery of EF tasks (Session 1) and 3 racial bias tasks (Session 2), along with numerous individual difference questionnaires. The main findings were as follows: (a) measures of implicit bias were only weakly intercorrelated; (b) EF and estimates of automatic processes both predicted implicit bias and also interacted, such that the relation between automatic processes and bias expression was reduced at higher levels of EF; (c) specific facets of EF were differentially associated with overall task performance and controlled processing estimates across different bias tasks; (d) EF did not moderate associations between implicit and explicit measures of bias; and (e) external, but not internal, motivation to control prejudice depended on EF to reduce bias expression. Findings are discussed in terms of the importance of global and specific EF abilities in determining expression of implicit racial bias. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  8. [Measures to prevent patient identification errors in blood collection/physiological function testing utilizing a laboratory information system].

    PubMed

    Shimazu, Chisato; Hoshino, Satoshi; Furukawa, Taiji

    2013-08-01

    We constructed an integrated personal identification workflow chart using both bar code reading and an all in-one laboratory information system. The information system not only handles test data but also the information needed for patient guidance in the laboratory department. The reception terminals at the entrance, displays for patient guidance and patient identification tools at blood-sampling booths are all controlled by the information system. The number of patient identification errors was greatly reduced by the system. However, identification errors have not been abolished in the ultrasound department. After re-evaluation of the patient identification process in this department, we recognized that the major reason for the errors came from excessive identification workflow. Ordinarily, an ultrasound test requires patient identification 3 times, because 3 different systems are required during the entire test process, i.e. ultrasound modality system, laboratory information system and a system for producing reports. We are trying to connect the 3 different systems to develop a one-time identification workflow, but it is not a simple task and has not been completed yet. Utilization of the laboratory information system is effective, but is not yet perfect for patient identification. The most fundamental procedure for patient identification is to ask a person's name even today. Everyday checks in the ordinary workflow and everyone's participation in safety-management activity are important for the prevention of patient identification errors.

  9. Standard Operating Procedure Utilization for Tuberculosis Microscopy in Mekelle City, North Ethiopia.

    PubMed

    Weldu, Yemane; Gebru, Hagos; Kahsay, Getahun; Teweldemedhn, Gebremichael; Hagos, Yifter; Kahsay, Amlsha

    2017-01-01

    The aim of this study was to assess the utilization of standard operating procedures for acid-fast bacilli (AFB) smear microscopy. A facility-based cross-sectional study was conducted in select health institutions in Mekelle City, Ethiopia, from July 1, 2015, through August 30, 2015. Using a simple random sampling technique, 18 health facilities were included in the study. Data were collected using a standard checklist and entered into Epi Info version 3.5.4 (Centers for Disease Control and Prevention, Atlanta, GA) for editing. Analysis was done using SPSS version 20 (SPSS, Chicago, IL). Of the 18 laboratory facilities, only seven (38.9%) had a legible AFB registration book. In three (16.7%) of the laboratories, heat fixation was not applied before adding primary staining reagent. In 12 (66.7%), the staining reagents had precipitates. Two laboratories had microscopes with mechanical stages that could not move freely on both axes. Seven (38.9%) of the laboratories reported samples to be negative before examining all required fields. Most laboratories, 16 (88.9%) and 17 (94.4%), respectively, did not run positive and negative controls after new batch reagent preparation. Tuberculosis microscopy was found to be substandard with clear gaps in documentation, sample collection, and processing. © American Society for Clinical Pathology, 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  10. Developing a lean culture in the laboratory.

    PubMed

    Napoles, Leyda; Quintana, Maria

    2006-07-25

    The Director of Pathology at Jackson Memorial Hospital was interested in improving the operational efficiencies of the department in order to enhance the department's level of service in conjunction with the expansion of the overall health system. The decision was made to implement proven Lean practices in the laboratory under the direction of a major consulting firm. This article details the scope of the initial project as well as the operating principles of Lean manufacturing practices as applied to the clinical laboratory. The goals of the project were to improve turnaround times of laboratory results, reduce inventory and supply costs, improve staff productivity, maximize workflow, and eliminate waste. Extensive data gathering and analysis guided the work process by highlighting the areas of highest opportunity. This systematic approach resulted in recommendations for the workflow and physical layout of the laboratory. It also included the introduction of "standard workflow" and "visual controls" as critical items that streamlined operational efficiencies. The authors provide actual photographs and schematics of the reorganization and improvements to the physical layout of the laboratory. In conclusion, this project resulted in decreased turnaround times and increased productivity, as well as significant savings in the overall laboratory operations.

  11. Flexible distributed architecture for semiconductor process control and experimentation

    NASA Astrophysics Data System (ADS)

    Gower, Aaron E.; Boning, Duane S.; McIlrath, Michael B.

    1997-01-01

    Semiconductor fabrication requires an increasingly expensive and integrated set of tightly controlled processes, driving the need for a fabrication facility with fully computerized, networked processing equipment. We describe an integrated, open system architecture enabling distributed experimentation and process control for plasma etching. The system was developed at MIT's Microsystems Technology Laboratories and employs in-situ CCD interferometry based analysis in the sensor-feedback control of an Applied Materials Precision 5000 Plasma Etcher (AME5000). Our system supports accelerated, advanced research involving feedback control algorithms, and includes a distributed interface that utilizes the internet to make these fabrication capabilities available to remote users. The system architecture is both distributed and modular: specific implementation of any one task does not restrict the implementation of another. The low-level architectural components include a host controller that communicates with the AME5000 equipment via SECS-II, and a host controller for the acquisition and analysis of the CCD sensor images. A cell controller (CC) manages communications between these equipment and sensor controllers. The CC is also responsible for process control decisions; algorithmic controllers may be integrated locally or via remote communications. Finally, a system server manages connections from internet/intranet (web) based clients and uses a direct link with the CC to access the system. Each component communicates via a predefined set of TCP/IP socket-based messages. This flexible architecture makes integration easier and more robust, and enables separate software components to run on the same or different computers independent of hardware or software platform.
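
    The abstract says only that the components exchange a predefined set of TCP/IP socket-based messages and does not document the message format, so the sketch below is a generic illustration of that pattern rather than the MIT system's actual protocol: a client opens a socket to a hypothetical cell controller, sends a newline-terminated request, and reads one reply. The host name, port, and message strings are invented.

        import socket

        CELL_CONTROLLER = ("cell-controller.example", 5000)   # hypothetical address

        def send_request(message: str) -> str:
            # Open a TCP connection, send one newline-terminated request,
            # and return the single-line reply.
            with socket.create_connection(CELL_CONTROLLER, timeout=5) as conn:
                conn.sendall((message + "\n").encode("ascii"))
                reply = b""
                while not reply.endswith(b"\n"):
                    chunk = conn.recv(1024)
                    if not chunk:
                        break
                    reply += chunk
                return reply.decode("ascii").strip()

        # Example (invented) messages in the spirit of a predefined command set.
        print(send_request("GET_ETCH_STATUS"))
        print(send_request("START_RUN recipe=poly_etch_01"))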

  12. PROCESS WATER BUILDING, TRA605, INTERIOR. FIRST FLOOR. CAMERA IS IN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605, INTERIOR. FIRST FLOOR. CAMERA IS IN SOUTHEAST CORNER AND FACES NORTHWEST. CONTROL ROOM AT RIGHT. CRANE MONORAIL IS OVER FLOOR HATCHES AND FLOOR OPENINGS. SIX VALVE HANDWHEELS ALONG FAR WALL IN LEFT CENTER VIEW. SEAL TANK IS ON OTHER SIDE OF WALL; PROCESS WATER PIPES ARE BELOW VALVE WHEELS. NOTE CURBS AROUND FLOOR OPENINGS. INL NEGATIVE NO. HD46-26-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, R.S.

    A simple in-line americium assay instrument (AAI) was installed at an americium recovery process area at LASL for use in process development and for providing process control information. The AAI counts 59.5-keV ²⁴¹Am gamma rays, using a NaI(Tl) detector and Eberline SAM II electronics. It has a useful range of 3 × 10⁻⁵ to 10 g Am/l and does not suffer from plutonium interference. Comparative analyses of samples assayed in the AAI and samples assayed by the LASL Analytical Laboratory show a combined relative standard deviation of 14%.
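
    The report gives the instrument's useful range but not its calibration, so the sketch below only illustrates the generic conversion such an in-line gamma counter performs: background-subtract a count rate, divide by a calibration factor, and check the result against the quoted 3 × 10⁻⁵ to 10 g Am/l range. The calibration factor and count rates here are invented, not values from the report.

        USEFUL_RANGE_G_PER_L = (3e-5, 10.0)      # from the abstract
        COUNTS_PER_SEC_PER_G_PER_L = 1.2e4       # invented calibration factor

        def assay_concentration(gross_cps: float, background_cps: float) -> float:
            # Convert a net 59.5-keV count rate into g Am/l using the assumed calibration.
            net = max(gross_cps - background_cps, 0.0)
            return net / COUNTS_PER_SEC_PER_G_PER_L

        conc = assay_concentration(gross_cps=850.0, background_cps=40.0)
        lo, hi = USEFUL_RANGE_G_PER_L
        in_range = lo <= conc <= hi
        print(f"estimated concentration: {conc:.4f} g Am/l (within useful range: {in_range})")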

  14. In-situ measurement of processing properties during fabrication in a production tool

    NASA Technical Reports Server (NTRS)

    Kranbuehl, D. E.; Haverty, P.; Hoff, M.; Loos, A. C.

    1988-01-01

    Progress is reported on the use of frequency-dependent electromagnetic measurements (FDEMS) as a single, convenient technique for continuous in situ monitoring of polyester cure during fabrication in a laboratory and manufacturing environment. Preliminary FDEMS sensor and modeling work using the Loos-Springer model to develop an intelligent closed-loop, sensor-controlled cure process is described. FDEMS using impedance bridges in the Hz to MHz region is found to be ideal for automatically monitoring polyester processing properties continuously throughout the cure cycle.

  15. 1979 bibliography of atomic and molecular processes. [Bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1980-08-01

    This annotated bibliography lists 2146 works on atomic and molecular processes reported in publications dated 1979. Sources include scientific journals, conference proceedings, and books. Each entry is designated by one or more of the 114 categories of atomic and molecular processes used by the Controlled Fusion Atomic Data Center, Oak Ridge National Laboratory, to classify data. Also indicated is whether the work was experimental or theoretical, what energy range was covered, what reactants were investigated, and the country of origin of the first author. Following the bibliographical listing are indexes of reactants and authors.

  16. Software development to support sensor control of robot arc welding

    NASA Technical Reports Server (NTRS)

    Silas, F. R., Jr.

    1986-01-01

    The development of software for a Digital Equipment Corporation MINC-23 Laboratory Computer to provide the functions of a workcell host computer for Space Shuttle Main Engine (SSME) robotic welding is documented. Routines were written to transfer robot programs between the MINC and an Advanced Robotic Cyro 750 welding robot. Other routines provide advanced program editing features, while additional software allows communication with a remote computer-aided design system. Access to special robot functions was provided to allow advanced control of weld seam tracking and process control for future development programs.

  17. Panel summary of recommendations

    NASA Technical Reports Server (NTRS)

    Dunbar, Bonnie J.; Coleman, Martin E.; Mitchell, Kenneth L.

    1990-01-01

    The following Space Station internal contamination topics were addressed: past flight experience (Skylab and Spacelab missions); present flight activities (Spacelabs and Soviet Space Station Mir); future activities (materials science and life science experiments); Space Station capabilities (PPMS, FMS, ECLSS, and U.S. Laboratory overview); manned systems/crew safety; internal contamination detection; contamination control - stowage and handling; and contamination control - waste gas processing. Space Station design assumptions are discussed. Issues and concerns are discussed as they relate to (1) policy and management, (2) subsystem design, (3) experiment design, and (4) internal contamination detection and control. The recommendations generated are summarized.

  18. Innovation Expo

    NASA Image and Video Library

    2017-11-01

    Ed Rosenthal, founder and chairman of Florikan controlled release fertilizers, displays his plant growth material during the 2017 Innovation Expo showcase at NASA's Kennedy Space Center in Florida. The controlled release fertilizer is used in NASA's Veggie plant growth system on the International Space Station and in the Veggie control unit in a laboratory in the Space Station Processing Facility. The purpose of the annual two-day event is to help foster innovation and creativity among the Kennedy workforce. The event included several keynote speakers, training opportunities, an innovation showcase and the KSC Kickstart competition.

  19. Conduction and Narrow Escape in Dense, Disordered, Particulate-based Heterogeneous Materials

    NASA Astrophysics Data System (ADS)

    Lechman, Jeremy

    For optimal and reliable performance, many technological devices rely on complex, disordered heterogeneous or composite materials and their associated manufacturing processes. Examples include many powder and particulate-based materials found in pyrotechnic devices for car airbags, electrodes in energy storage devices, and various advanced composite materials. Due to their technological importance and complex structure, these materials have been the subject of much research in a number of fields. Moreover, the advent of new manufacturing techniques based on powder bed and particulate process routes, the potential of functional nano-structured materials, and the additional recognition of persistent shortcomings in predicting the reliable performance of high-consequence applications, which leads to ballooning costs of fielding and maintaining advanced technologies, should motivate renewed efforts in understanding, predicting and controlling these materials' fabrication and behavior. Our particular effort seeks to understand the link between the top-down control exercised in specific non-equilibrium process routes (i.e., manufacturing processes) and the variability and uncertainty of the end-product performance. Our ultimate aim is to quantify the variability inherent in these constrained dynamical or random processes and to use it to optimize and predict the resulting material properties/performance and to inform component design with precise margins. In fact, this raises a set of deep and broad-ranging issues that have been recognized as touching the core of a major research challenge at Sandia National Laboratories. In this talk, we will give an overview of recent efforts to address aspects of this vision. In particular, the case of the conductive properties of packed particulate materials will be highlighted. Combining a number of existing approaches, we will discuss new insights and potential directions for further development toward the stated goal. Sandia National Laboratories is a multiprogram laboratory managed and operated by Sandia Corporation, a Lockheed-Martin Company, for the U. S. Department of Energy's National Nuclear Security Administration under Contract No. DE-AC04-94AL85000.

  20. Proceedings of the drug testing laboratory managers symposium, 28 January--1 February 1974. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noe, E.R.; Romanchick, W.A.; Ainsworth, C.A. III

    1975-06-01

    This report deals with broad concepts of managing mass screening programs for drugs of abuse; e.g., morphine, barbiturate, amphetamine, cocaine, and methaqualone. The interactions of the screening process and of the rehabilitation program were covered. Psychotherapy and group therapy are both utilized in rehabilitation programs. The semiautomated radioimmunoassay (RIA) screening procedures are both sensitive and specific at nanogram quantities. Future evaluations of a wafer disk transferral system and of a latex test for morphine are presented. The unique quality control system employed by military drug abuse testing laboratories is discussed. (Author) (GRA)

  1. Laboratory Study of Magnetorotational Instability and Hydrodynamic Stability at Large Reynolds Numbers

    NASA Technical Reports Server (NTRS)

    Ji, H.; Burin, M.; Schartman, E.; Goodman, J.; Liu, W.

    2006-01-01

    Two plausible mechanisms have been proposed to explain rapid angular momentum transport during accretion processes in astrophysical disks: nonlinear hydrodynamic instabilities and magnetorotational instability (MRI). A laboratory experiment in a short Taylor-Couette flow geometry has been constructed in Princeton to study both mechanisms, with novel features for better controls of the boundary-driven secondary flows (Ekman circulation). Initial results on hydrodynamic stability have shown negligible angular momentum transport in Keplerian-like flows with Reynolds numbers approaching one million, casting strong doubt on the viability of nonlinear hydrodynamic instability as a source for accretion disk turbulence.
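
    For readers unfamiliar with how the Reynolds number is defined for this geometry, one common convention for Taylor-Couette flow, based on the inner-cylinder speed and the gap width, is

        \mathrm{Re} = \frac{\Omega_1 \, r_1 \, (r_2 - r_1)}{\nu}

    where \Omega_1 is the inner-cylinder angular velocity, r_1 and r_2 are the inner and outer cylinder radii, and \nu is the kinematic viscosity of the working fluid. The precise definition used by the Princeton experiment may differ in detail, so this should be read as a representative form rather than the paper's own expression.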

  2. Early Temperamental and Family Predictors of Shyness and Anxiety

    ERIC Educational Resources Information Center

    Volbrecht, Michele M.; Goldsmith, H. Hill

    2010-01-01

    With a sample of 242 twins (135 girls, 107 boys) studied longitudinally, behavioral inhibition (BI) and inhibitory control (IC) measured at 3 years, as well as early and concurrent family process variables, were examined as predictors of shyness and of anxiety symptoms approximately 4 years later. Structured observational data from laboratory and…

  3. Testing and evaluation of light ablation decontamination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demmer, R.L.; Ferguson, R.L.

    1994-10-01

    This report details the testing and evaluation of light ablation decontamination. It details WINCO contracted research and application of light ablation efforts by Ames Laboratory. Tests were conducted with SIMCON (simulated contamination) coupons and REALCON (actual radioactive metal coupons) under controlled conditions to compare cleaning effectiveness, speed and application to plant process type equipment.

  4. Stratospheric aerosols and precursor gases

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Measurements were made of the aerosol size, height and geographical distribution, the aerosols' composition and optical properties, and their temporal variation with season and following large volcanic eruptions. Sulfur-bearing gases were measured in situ in the stratosphere, and studies of the chemical and physical processes which control gas-to-particle conversion were carried out in the laboratory.

  5. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...

  6. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...

  7. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...

  8. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...

  9. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...

  10. Robotic Surgery

    ERIC Educational Resources Information Center

    Childress, Vincent W.

    2007-01-01

    The medical field has many uses for automated and remote-controlled technology. For example, if a tissue sample is only handled in the laboratory by a robotic handling system, then it will never come into contact with a human. Such a system not only helps to automate the medical testing process, but it also helps to reduce the chances of…

  11. Excavation/Fill/Soil Disturbance, Self-Study #31419

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grogin, Phillip W.

    This course, Excavation/Fill/Soil Disturbance Self-Study (#31419), presents an overview of the hazards, controls, and requirements that affect safe excavations at Los Alamos National Laboratory (LANL). An overview of the LANL excavation/fill/soil disturbance permit (EXID permit) approval process is also presented, along with potholing requirements for planning and performing excavations at LANL.

  12. Polio Eradication Initiative (PEI) contribution in strengthening public health laboratories systems in the African region.

    PubMed

    Gumede, Nicksy; Coulibaly, Sheick Oumar; Yahaya, Ali Ahmed; Ndihokubwayo, Jean-Bosco; Nsubuga, Peter; Okeibunor, Joseph; Dosseh, Annick; Salla, Mbaye; Mihigo, Richard; Mkanda, Pascal; Byabamazima, Charles

    2016-10-10

    The laboratory has always played a critical role in the diagnosis of disease. The success of any disease programme is based on a functional laboratory network. Health laboratory services are an integral component of the health system. The efficiency and effectiveness of both clinical and public health functions, including surveillance, diagnosis, prevention, treatment, research and health promotion, are influenced by reliable laboratory services. The establishment of the African Regional polio laboratory for the Polio Eradication Initiative (PEI) has contributed to supporting countries in their efforts to strengthen laboratory capacity. On the eve of the closing of the programme, we show in this article examples of this contribution in two countries of the African region: Côte d'Ivoire and the Democratic Republic of Congo. Descriptive studies were carried out in Côte d'Ivoire (RCI) and the Democratic Republic of Congo (DRC) from October to December 2014. Questionnaires, self-administered and in-depth interviews and group discussions, as well as records and observation, were used to collect information during laboratory visits and assessments. The PEI financial support made it possible to maintain the majority of the 14 (DRC) and 12 (RCI) staff involved in the polio laboratories as full- or part-time members. Through laboratory technical staff training supported by the PEI, skills and knowledge were gained that reinforced laboratory capacity and performance in quality laboratory functioning, processes and techniques such as cell culture. In the same way, infrastructure was improved and equipment provided. General laboratory quality standards, covering all the key laboratory elements, were improved through the PEI accreditation process. The Polio Eradication Initiative (PEI) is a good example of a contribution to strengthening public health laboratory systems in the African region. It has established a strong polio laboratory network that has contributed to the strengthening of capacities and to its expansion to the surveillance of other priority viral diseases such as measles, yellow fever, influenza, MERS-CoV and Ebola. This could serve as a lesson and a good example of laboratory-based surveillance for improving disease prevention, detection and control in middle- and low-income countries as WHO and partners head toward polio eradication in the world. Copyright © 2016. Published by Elsevier Ltd.

  13. A model for the statistical description of analytical errors occurring in clinical chemical laboratories with time.

    PubMed

    Hyvärinen, A

    1985-01-01

    The main purpose of the present study was to describe the statistical behaviour of daily analytical errors in the dimensions of place and time, providing a statistical basis for realistic estimates of the analytical error, and hence allowing the importance of the error and the relative contributions of its different sources to be re-evaluated. The observation material consists of creatinine and glucose results for control sera measured in daily routine quality control in five laboratories for a period of one year. The observation data were processed and computed by means of an automated data processing system. Graphic representations of time series of daily observations, as well as their means and dispersion limits when grouped over various time intervals, were investigated. For partition of the total variation several two-way analyses of variance were done with laboratory and various time classifications as factors. Pooled sets of observations were tested for normality of distribution and for consistency of variances, and the distribution characteristics of error variation in different categories of place and time were compared. Errors were found from the time series to vary typically between days. Due to irregular fluctuations in general and particular seasonal effects in creatinine, stable estimates of means or of dispersions for errors in individual laboratories could not be easily obtained over short periods of time but only from data sets pooled over long intervals (preferably at least one year). Pooled estimates of proportions of intralaboratory variation were relatively low (less than 33%) when the variation was pooled within days. However, when the variation was pooled over longer intervals this proportion increased considerably, even to a maximum of 89-98% (95-98% in each method category) when an outlying laboratory in glucose was omitted, with a concomitant decrease in the interaction component (representing laboratory-dependent variation with time). This indicates that a substantial part of the variation comes from intralaboratory variation with time rather than from constant interlaboratory differences. Normality and consistency of statistical distributions were best achieved in the long-term intralaboratory sets of the data, under which conditions the statistical estimates of error variability were also most characteristic of the individual laboratories rather than necessarily being similar to one another. Mixing of data from different laboratories may give heterogeneous and nonparametric distributions and hence is not advisable.(ABSTRACT TRUNCATED AT 400 WORDS)
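
    To make the variance-partitioning approach described above concrete, the sketch below runs a two-way analysis of variance (laboratory × month, with interaction) on synthetic control-serum results using the statsmodels package. The data, factor levels, and effect sizes are simulated for illustration and have no connection to the study's actual creatinine and glucose observations.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        rng = np.random.default_rng(0)
        labs = [f"lab{i}" for i in range(1, 6)]
        months = list(range(1, 13))

        rows = []
        for lab in labs:
            lab_bias = rng.normal(0, 1.0)              # constant interlaboratory difference
            for month in months:
                month_effect = rng.normal(0, 1.5)      # laboratory-dependent variation with time
                for _ in range(20):                    # daily control results within the month
                    rows.append({"lab": lab, "month": month,
                                 "value": 100 + lab_bias + month_effect + rng.normal(0, 2.0)})
        df = pd.DataFrame(rows)

        model = ols("value ~ C(lab) + C(month) + C(lab):C(month)", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))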

  14. Use of radiation in the production of hydrogels

    NASA Astrophysics Data System (ADS)

    Lugao, Ademar B.; Malmonge, Sônia Maria

    2001-12-01

    The first hydrogel for wound dressing processed by radiation left the laboratories in Poland in 1986 at the hands of its inventor, Janusz M. Rosiak, and soon, after formal tests, arrived on the local market (1992). It was a technological breakthrough due to its product characteristics as a pain reliever with enhanced healing properties, besides its clever production process combining sterilization and crosslinking in a simultaneous operation. The IAEA invited Professor Rosiak to support the transfer of his technology to many laboratories around the world. The laboratories of developing countries, which face all kinds of restrictions, were attracted by the simplicity of the process and the low cost of its raw materials. This was the seed of the flourishing activities in hydrogel dressings in Brazil and other developing countries. The technology transfer of the radiation production of hydrogel dressings and other hydrogels to the Brazilian industry is under way. The usual issues associated with radiation processing arise from this experience, i.e. capital costs, misinformation about radiation and lack of expertise in radiation processing. Some other issues concerning the local market and social peculiarities also add to the problem. Notwithstanding, many different opportunities arise from those challenges. These technical and commercial issues are roughly: (i) there are plenty of new hydrogels on the market and all make the same claims; what else can radiation-processed hydrogels offer? (ii) regarding hydrogels and their industrial production as a market product, what are the unique characteristics of radiation processing? It was shown that radiation is a powerful tool for producing hydrogels from the same basic formula with improved flexibility, control and purity.

  15. [Application of laboratory information system in the management of the key indicators of quality inspection].

    PubMed

    Guo, Ye; Chen, Qian; Wu, Wei; Cui, Wei

    2015-03-31

    To establish a system for monitoring the key indicators of quality for inspection (KIQI) on a laboratory information system (LIS), and to achieve better management of KIQI. Clinical samples processed at PUMCH were collected throughout 2014. Next, an interactive input program was designed to collect data on the disqualification rate of samples, the mistake rate of samples, the occasions of losing samples, etc. Then, a series of time points (sample collection, laboratory sample arrival, sample testing, sample checking and response to critical values), namely the trajectory information left on the LIS, was recorded, and the qualification rate of TAT and the notification rate of endangering results were calculated. Finally, quality control information was collected to build an internal quality control database, and KIQI such as the out-of-control rate of quality control and the total error of test items were monitored. The inspection of sample management shows that the disqualification rates in 2014 were all below the target, but the rates in January and February were a little high and the rates of four wards were above 2%. The mistake rate of samples was 0.47 cases/10 000 cases, attaining the target (< 2 cases/10 000 cases). Also, there was no occasion of losing samples in 2014, attaining the target too. The inspection of laboratory reports shows that the qualification rate of TAT was within the acceptable range (> 95%); however, the rate for blood routine in November (94.75%) was out of range. We solved the problem by optimizing the processes. The notification rate of endangering results attained the target (≥ 98%), while the rate of timely notification still needs to improve. Quality inspection shows that the CV of APTT in August (5.02%) rose significantly, beyond the accepted CV (5.0%). We solved the problem by changing the reagent. The CVs of TT in 2014 were all below the allowable CV, so the allowable CV for the next year was lowered to 10%. It is an objective and effective method to manage KIQI with the powerful database management and information processing capability of the LIS.
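
    As an illustration of how two of the indicators above might be computed from LIS records, the sketch below derives a TAT qualification rate and a coefficient of variation for quality control results and compares them with the targets quoted in the abstract (> 95% and an allowable CV of 5.0%). The record structure, field names, TAT limit, and numbers are invented and are not taken from the PUMCH system.

        from statistics import mean, stdev

        # Hypothetical LIS extracts.
        tat_minutes = [35, 42, 55, 61, 38, 90, 47, 52]       # report turnaround times
        tat_limit = 60                                        # assumed TAT target, minutes
        qc_results = [30.1, 29.8, 30.5, 31.2, 29.4, 30.0]     # APTT QC values, seconds

        tat_qualification_rate = sum(t <= tat_limit for t in tat_minutes) / len(tat_minutes) * 100
        cv_percent = stdev(qc_results) / mean(qc_results) * 100

        print(f"TAT qualification rate: {tat_qualification_rate:.1f}% (target > 95%)")
        print(f"QC coefficient of variation: {cv_percent:.2f}% (allowable 5.0%)")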

  16. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems.

    PubMed

    Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E

    2011-12-22

    Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system that is a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and showed usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end user interfaces.

  17. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems

    PubMed Central

    2011-01-01

    Background Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. Conclusions This paper presents the CEGH information system that is a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and showed usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end user interfaces. PMID:22369688

  18. Submarine Propulsion Shaft Life: Probabilistic Prediction and Extension through Prevention of Water Ingress

    DTIC Science & Technology

    2014-06-01

    Even well-controlled laboratory testing saw a range of COV from less than 10 percent to over 500 percent for different steels (Tryon & Cruse ... identified. This 6-year limit is driven by concerns about corrosion fatigue, a process initiated by water gaining access to the carbon steel of the shaft ... physics and is controlled by different parameters and interactions of the many variables involved. Figure 2 depicts the corrosion fatigue sequence of

  19. Network design and quality checks in automatic orientation of close-range photogrammetric blocks.

    PubMed

    Dall'Asta, Elisa; Thoeni, Klaus; Santise, Marina; Forlani, Gianfranco; Giacomini, Anna; Roncella, Riccardo

    2015-04-03

    Due to the recent improvements of automatic measurement procedures in photogrammetry, multi-view 3D reconstruction technologies are becoming a favourite survey tool. Rapidly widening structure-from-motion (SfM) software packages offer significantly easier image processing workflows than traditional photogrammetry packages. However, while most orientation and surface reconstruction strategies will almost always succeed in any given task, estimating the quality of the result is, to some extent, still an open issue. An assessment of the precision and reliability of block orientation is necessary and should be included in every processing pipeline. Such a need was clearly felt from the results of close-range photogrammetric surveys of in situ full-scale and laboratory-scale experiments. In order to study the impact of the block control and the camera network design on the block orientation accuracy, a series of Monte Carlo simulations was performed. Two image block configurations were investigated: a single pseudo-normal strip and a circular highly-convergent block. The influence of surveying and data processing choices, such as the number and accuracy of the ground control points, autofocus and camera calibration was investigated. The research highlights the most significant aspects and processes to be taken into account for adequate in situ and laboratory surveys, when modern SfM software packages are used, and evaluates their effect on the quality of the results of the surface reconstruction.
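
    The Monte Carlo approach mentioned above can be illustrated with a much simpler estimation problem than a full photogrammetric bundle adjustment: repeatedly perturb a set of ground control points with Gaussian noise, re-estimate a 2D similarity transform by least squares, and examine the spread of the recovered parameters. The geometry, noise level, and number of trials below are arbitrary choices, and the simplification to a planar transform is an assumption for illustration, not part of the authors' method.

        import numpy as np

        rng = np.random.default_rng(42)
        gcp = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0], [0.0, 10.0], [5.0, 5.0]])
        true_params = np.array([1.0, 0.02, 0.5, -0.3])   # a, b, tx, ty of x'=ax-by+tx, y'=bx+ay+ty

        def apply(params, pts):
            a, b, tx, ty = params
            return np.column_stack([a * pts[:, 0] - b * pts[:, 1] + tx,
                                    b * pts[:, 0] + a * pts[:, 1] + ty])

        def fit(pts, obs):
            # Linear least-squares estimate of the similarity-transform parameters.
            n = len(pts)
            A = np.zeros((2 * n, 4))
            A[0::2, 0], A[0::2, 1], A[0::2, 2] = pts[:, 0], -pts[:, 1], 1.0
            A[1::2, 0], A[1::2, 1], A[1::2, 3] = pts[:, 1],  pts[:, 0], 1.0
            return np.linalg.lstsq(A, obs.reshape(-1), rcond=None)[0]

        sigma = 0.05                                      # assumed control-point noise
        estimates = np.array([fit(gcp, apply(true_params, gcp) + rng.normal(0, sigma, gcp.shape))
                              for _ in range(2000)])
        print("parameter standard deviations:", estimates.std(axis=0))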

  20. KENNEDY SPACE CENTER, FLA. - In the high bay clean room at the Astrotech Space Operations processing facilities near KSC, workers prepare NASA’s MESSENGER spacecraft for transfer to a work stand. There employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

    KENNEDY SPACE CENTER, FLA. - In the high bay clean room at the Astrotech Space Operations processing facilities near KSC, workers prepare NASA’s MESSENGER spacecraft for transfer to a work stand. There employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

  1. KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities near KSC, workers begin moving NASA’s MESSENGER spacecraft into the building MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - is being taken into a high bay clean room where employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

    KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities near KSC, workers begin moving NASA’s MESSENGER spacecraft into the building MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - is being taken into a high bay clean room where employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

  2. KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities near KSC, a lift begins lowering NASA’s MESSENGER spacecraft onto the ground. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be taken into a high bay clean room and employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

    KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities near KSC, a lift begins lowering NASA’s MESSENGER spacecraft onto the ground. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be taken into a high bay clean room and employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

  3. KENNEDY SPACE CENTER, FLA. - In the high bay clean room at the Astrotech Space Operations processing facilities near KSC, workers get ready to remove the protective cover from NASA’s MESSENGER spacecraft. Employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

    KENNEDY SPACE CENTER, FLA. - In the high bay clean room at the Astrotech Space Operations processing facilities near KSC, workers get ready to remove the protective cover from NASA’s MESSENGER spacecraft. Employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

  4. KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities near KSC, workers check the moveable pallet holding NASA’s MESSENGER spacecraft. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be taken into a high bay clean room and employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

    NASA Image and Video Library

    2004-03-10

    KENNEDY SPACE CENTER, FLA. - At the Astrotech Space Operations processing facilities near KSC, workers check the moveable pallet holding NASA’s MESSENGER spacecraft. MESSENGER - short for MErcury Surface, Space ENvironment, GEochemistry and Ranging - will be taken into a high bay clean room and employees of the Johns Hopkins University Applied Physics Laboratory, builders of the spacecraft, will perform an initial state-of-health check. Then processing for launch can begin, including checkout of the power systems, communications systems and control systems. The thermal blankets will also be attached for flight. MESSENGER will be launched May 11 on a six-year mission aboard a Boeing Delta II rocket. Liftoff is targeted for 2:26 a.m. EDT on Tuesday, May 11.

  5. Event-triggered logical flow control for comprehensive process integration of multi-step assays on centrifugal microfluidic platforms.

    PubMed

    Kinahan, David J; Kearney, Sinéad M; Dimov, Nikolay; Glynn, Macdara T; Ducrée, Jens

    2014-07-07

    The centrifugal "lab-on-a-disc" concept has proven to have great potential for process integration of bioanalytical assays, in particular where ease-of-use, ruggedness, portability, fast turn-around time and cost efficiency are of paramount importance. Yet, as all liquids residing on the disc are exposed to the same centrifugal field, an inherent challenge of these systems remains the automation of multi-step, multi-liquid sample processing and subsequent detection. In order to orchestrate the underlying bioanalytical protocols, an ample palette of rotationally and externally actuated valving schemes has been developed. While excelling with the level of flow control, externally actuated valves require interaction with peripheral instrumentation, thus compromising the conceptual simplicity of the centrifugal platform. In turn, for rotationally controlled schemes, such as common capillary burst valves, typical manufacturing tolerances tend to limit the number of consecutive laboratory unit operations (LUOs) that can be automated on a single disc. In this paper, a major advancement on recently established dissolvable film (DF) valving is presented; for the very first time, a liquid handling sequence can be controlled in response to completion of preceding liquid transfer event, i.e. completely independent of external stimulus or changes in speed of disc rotation. The basic, event-triggered valve configuration is further adapted to leverage conditional, large-scale process integration. First, we demonstrate a fluidic network on a disc encompassing 10 discrete valving steps including logical relationships such as an AND-conditional as well as serial and parallel flow control. Then we present a disc which is capable of implementing common laboratory unit operations such as metering and selective routing of flows. Finally, as a pilot study, these functions are integrated on a single disc to automate a common, multi-step lab protocol for the extraction of total RNA from mammalian cell homogenate.

  6. Launching a Laboratory Testing Process Quality Improvement Toolkit: From the Shared Networks of Colorado Ambulatory Practices and Partners (SNOCAP).

    PubMed

    Fernald, Douglas; Hamer, Mika; James, Kathy; Tutt, Brandon; West, David

    2015-01-01

    Family medicine and internal medicine physicians order diagnostic laboratory tests for nearly one-third of patient encounters in an average week, yet among medical errors in primary care, an estimated 15% to 54% are attributed to laboratory testing processes. From a practice improvement perspective, we (1) describe the need for laboratory testing process quality improvements from the perspective of primary care practices, and (2) describe the approaches and resources needed to implement laboratory testing process quality improvements in practice. We applied practice observations, process mapping, and interviews with primary care practices in the Shared Networks of Colorado Ambulatory Practices and Partners (SNOCAP)-affiliated practice-based research networks that field-tested in 2013 a laboratory testing process improvement toolkit. From the data collected in each of the 22 participating practices, common testing quality issues included, but were not limited to, 3 main testing process steps: laboratory test preparation, test tracking, and patient notification. Three overarching qualitative themes emerged: practices readily acknowledge multiple laboratory testing process problems; practices know that they need help addressing the issues; and practices face challenges with finding patient-centered solutions compatible with practice priorities and available resources. While practices were able to get started with guidance and a toolkit to improve laboratory testing processes, most did not seem able to achieve their quality improvement aims unassisted. Providing specific guidance tools with practice facilitation or other rapid-cycle quality improvement support may be an effective approach to improve common laboratory testing issues in primary care. © Copyright 2015 by the American Board of Family Medicine.

  7. Information Quality in Regulatory Decision Making: Peer Review versus Good Laboratory Practice.

    PubMed

    McCarty, Lynn S; Borgert, Christopher J; Mihaich, Ellen M

    2012-07-01

    There is an ongoing discussion on the provenance of toxicity testing data regarding how best to ensure its validity and credibility. A central argument is whether journal peer-review procedures are superior to Good Laboratory Practice (GLP) standards employed for compliance with regulatory mandates. We sought to evaluate the rationale for regulatory decision making based on peer-review procedures versus GLP standards. We examined pertinent published literature regarding how scientific data quality and validity are evaluated for peer review, GLP compliance, and development of regulations. Some contend that peer review is a coherent, consistent evaluative procedure providing quality control for experimental data generation, analysis, and reporting sufficient to reliably establish relative merit, whereas GLP is seen as merely a tracking process designed to thwart investigator corruption. This view is not supported by published analyses pointing to subjectivity and variability in peer-review processes. Although GLP is not designed to establish relative merit, it is an internationally accepted quality assurance, quality control method for documenting experimental conduct and data. Neither process is completely sufficient for establishing relative scientific soundness. However, changes occurring both in peer-review processes and in regulatory guidance resulting in clearer, more transparent communication of scientific information point to an emerging convergence in ensuring information quality. The solution to determining relative merit lies in developing a well-documented, generally accepted weight-of-evidence scheme to evaluate both peer-reviewed and GLP information used in regulatory decision making where both merit and specific relevance inform the process.

  8. TA-55 change control manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blum, T.W.; Selvage, R.D.; Courtney, K.H.

    This manual is the guide for initiating change at the Plutonium Facility, which handles the processing of plutonium as well as research on plutonium metallurgy. It describes the change and work control processes employed at TA-55 to ensure that all proposed changes are properly identified, reviewed, approved, implemented, tested, and documented so that operations are maintained within the approved safety envelope. All Laboratory groups, their contractors, and subcontractors doing work at TA-55 follow requirements set forth herein. This manual applies to all new and modified processes and experiments inside the TA-55 Plutonium Facility; general plant project (GPP) and line item funded construction projects at TA-55; temporary and permanent changes that directly or indirectly affect structures, systems, or components (SSCs) as described in the safety analysis, including Facility Control System (FCS) software; and major modifications to procedures. This manual does not apply to maintenance performed on process equipment or facility SSCs or the replacement of SSCs or equipment with documented approved equivalents.

  9. Proportional and Integral Thermal Control System for Large Scale Heating Tests

    NASA Technical Reports Server (NTRS)

    Fleischer, Van Tran

    2015-01-01

    The National Aeronautics and Space Administration Armstrong Flight Research Center (Edwards, California) Flight Loads Laboratory is a unique national laboratory that supports thermal, mechanical, thermal/mechanical, and structural dynamics research and testing. A Proportional Integral thermal control system was designed and implemented to support thermal tests. A thermal control algorithm supporting a quartz lamp heater was developed based on the Proportional Integral control concept and a linearized heating process. The thermal control equations were derived and expressed in terms of power levels, integral gain, proportional gain, and differences between thermal setpoints and skin temperatures. Besides the derived equations, user's predefined thermal test information generated in the form of thermal maps was used to implement the thermal control system capabilities. Graphite heater closed-loop thermal control and graphite heater open-loop power level were added later to fulfill the demand for higher temperature tests. Verification and validation tests were performed to ensure that the thermal control system requirements were achieved. This thermal control system has successfully supported many milestone thermal and thermal/mechanical tests for almost a decade with temperatures ranging from 50 F to 3000 F and temperature rise rates from -10 F/s to 70 F/s for a variety of test articles having unique thermal profiles and test setups.
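    The control law summarized above lends itself to a compact illustration. The sketch below is a minimal Proportional-Integral power computation driving a toy first-order "skin" model; the gains, plant constants and ramp rate are invented for the example and are not the Flight Loads Laboratory values.

    ```python
    # Minimal sketch of a Proportional-Integral control law: the commanded heater
    # power is computed from the error between the thermal setpoint and the
    # measured skin temperature. All numbers are illustrative assumptions.
    def pi_power(setpoint, skin_temp, integral, kp, ki, dt):
        """One PI update step; returns (power_level, updated_integral)."""
        error = setpoint - skin_temp
        integral += error * dt
        return kp * error + ki * integral, integral

    # Toy first-order "skin" that warms with applied power and loses heat to ambient.
    temp, integral, dt = 75.0, 0.0, 0.1        # deg F, integrated error, seconds
    for step in range(2000):
        setpoint = min(75.0 + 10.0 * (step * dt), 500.0)     # ramp to 500 F at 10 F/s
        power, integral = pi_power(setpoint, temp, integral, kp=0.8, ki=0.05, dt=dt)
        power = max(0.0, min(power, 100.0))                   # clamp to 0-100 % lamp power
        temp += dt * (0.9 * power - 0.02 * (temp - 75.0))     # simplistic plant response

    print(f"final skin temperature: {temp:.1f} F (setpoint 500.0 F)")
    ```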

  10. Sediment and erosion control laboratory facility expansion.

    DOT National Transportation Integrated Search

    2016-08-01

    The Sediment and Erosion Control Laboratory (SEC Lab), formerly the Hydraulics, Sedimentation, and Erosion Control Laboratory, is operated by the Texas A&M Transportation Institute's Environment and Planning Program. Performance evaluation prog...

  11. Increased instrument intelligence--can it reduce laboratory error?

    PubMed

    Jekelis, Albert W

    2005-01-01

    Recent literature has focused on the reduction of laboratory errors and the potential impact on patient management. This study assessed the intelligent, automated preanalytical process-control abilities in newer generation analyzers as compared with older analyzers and the impact on error reduction. Three generations of immuno-chemistry analyzers were challenged with pooled human serum samples for a 3-week period. One of the three analyzers had an intelligent fluidics-check process, including bubble detection. Bubbles can cause erroneous results due to incomplete sample aspiration. This variable was chosen because it is the most easily controlled sample defect that can be introduced. Traditionally, lab technicians have had to visually inspect each sample for the presence of bubbles. This is time-consuming and introduces the possibility of human error. Instruments with bubble detection may be able to eliminate the human factor and reduce errors associated with the presence of bubbles. Specific samples were vortexed daily to introduce a visible quantity of bubbles, then immediately placed in the daily run. Errors were defined as a reported result greater than three standard deviations below the mean and associated with incomplete sample aspiration of the analyte on the individual analyzer; three standard deviations represented the target limits of proficiency testing. The results of the assays were examined for accuracy and precision. Efficiency, measured as process throughput, was also measured to associate a cost factor and the potential impact of error detection on the overall process. Analyzer performance stratified according to the level of internal process control. The older analyzers without bubble detection reported 23 erroneous results; the newest analyzer with bubble detection reported one specimen incorrectly. The precision and accuracy of the nonvortexed specimens were excellent and acceptable for all three analyzers, and no errors were found in the nonvortexed specimens. There were no significant differences in overall process time for any of the analyzers when tests were arranged in an optimal configuration. The analyzer with advanced fluidic intelligence demonstrated the greatest ability to deal appropriately with an incomplete aspiration by not processing and reporting a result for the sample. This study suggests that preanalytical process-control capabilities could reduce errors. By association, it implies that similar intelligent process controls could favorably impact the error rate and, in the case of this instrument, do so without negatively impacting process throughput. Other improvements may be realized as a result of having an intelligent error-detection process, including further reduction in misreported results, fewer repeats, less operator intervention, and less reagent waste.
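    The error criterion used in the study (a reported result more than three standard deviations below the analyte mean) can be expressed in a few lines. The sketch below is purely illustrative; the baseline and daily-run values are hypothetical.

    ```python
    # Flag results that fall more than three standard deviations below the
    # analyte's baseline mean, mimicking a short-sample (bubble) aspiration.
    import statistics

    def flag_aspiration_errors(results, baseline_mean, baseline_sd):
        """Return indices of results more than 3 SD below the baseline mean."""
        cutoff = baseline_mean - 3 * baseline_sd
        return [i for i, value in enumerate(results) if value < cutoff]

    # Example: baseline from non-vortexed pooled-serum runs, then a daily run.
    baseline = [4.9, 5.1, 5.0, 5.2, 4.8, 5.0]
    daily_run = [5.0, 4.9, 3.1, 5.1]   # 3.1 mimics an incomplete aspiration

    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    print(flag_aspiration_errors(daily_run, mean, sd))   # -> [2]
    ```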

  12. Managing the Pre- and Post-analytical Phases of the Total Testing Process

    PubMed Central

    2012-01-01

    For many years, the clinical laboratory's focus on analytical quality has resulted in an error rate of 4-5 sigma, which surpasses most other areas in healthcare. However, greater appreciation of the prevalence of errors in the pre- and post-analytical phases and their potential for patient harm has led to increasing requirements for laboratories to take greater responsibility for activities outside their immediate control. Accreditation bodies such as the Joint Commission International (JCI) and the College of American Pathologists (CAP) now require clear and effective procedures for patient/sample identification and communication of critical results. There are a variety of free on-line resources available to aid in managing the extra-analytical phase and the recent publication of quality indicators and proposed performance levels by the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) working group on laboratory errors and patient safety provides particularly useful benchmarking data. Managing the extra-laboratory phase of the total testing cycle is the next challenge for laboratory medicine. By building on its existing quality management expertise, quantitative scientific background and familiarity with information technology, the clinical laboratory is well suited to play a greater role in reducing errors and improving patient safety outside the confines of the laboratory. PMID:22259773

  13. Implementation of 5S Method for Ergonomic Laboratory

    NASA Astrophysics Data System (ADS)

    Dila Sari, Amarria; Ilma Rahmillah, Fety; Prabowo Aji, Bagus

    2017-06-01

    This article discusses 5S implementation in the Work System Design and Ergonomics Laboratory (DSK & E), Department of Industrial Engineering, Islamic University of Indonesia. The laboratory had several problems related to equipment arrangement for student activities, such as files accumulated from previous years' practicums and time wasted because items were not stored in suitable locations. This study therefore applies the 5S method in the DSK & E laboratory to streamline work processes and reduce waste. The project was carried out by laboratory management using 5S methods in the spirit of continuous improvement (Kaizen), and strategies and suggestions are proposed for embedding the 5S system within the laboratory. As a result, tidiness and cleanliness were achieved, leading to better performance by laboratory users. The audit score before implementing 5S in the DSK & E laboratory was 64 (2.56), while the score after implementation was 32 (1.28), an improvement of 50%. This translates into better use of the laboratory area, time saved when looking for tools and materials thanks to consistent placement and good visual control, and an improved '5S' culture and spirit among staff regarding the working environment.

  14. Contributions of CCLM to advances in quality control.

    PubMed

    Kazmierczak, Steven C

    2013-01-01

    The discipline of laboratory medicine is relatively young when considered in the context of the history of medicine itself. The history of quality control, within the context of laboratory medicine, also enjoys a relatively brief, but rich history. Laboratory quality control continues to evolve along with advances in automation, measurement techniques and information technology. Clinical Chemistry and Laboratory Medicine (CCLM) has played a key role in helping disseminate information about the proper use and utility of quality control. Publication of important advances in quality control techniques and dissemination of guidelines concerned with laboratory quality control has undoubtedly helped readers of this journal keep up to date on the most recent developments in this field.

  15. Internal quality control indicators of cervical cytopathology exams performed in laboratories monitored by the External Quality Control Laboratory.

    PubMed

    Ázara, Cinara Zago Silveira; Manrique, Edna Joana Cláudio; Tavares, Suelene Brito do Nascimento; de Souza, Nadja Lindany Alves; Amaral, Rita Goreti

    2014-09-01

    To evaluate the impact of continued education provided by an external quality control laboratory on the indicators of internal quality control of cytopathology exams. The internal quality assurance indicators for cytopathology exams from 12 laboratories monitored by the External Quality Control Laboratory were evaluated. Overall, 185,194 exams were included, 98,133 of which referred to the period preceding implementation of a continued education program, while 87,061 referred to the period following this intervention. Data were obtained from the Cervical Cancer Database of the Brazilian National Health Service. Following implementation of the continued education program, the positivity index (PI) remained within recommended limits in four laboratories. In another four laboratories, the PI progressed from below the limits to within the recommended standards. In one laboratory, the PI remained low, in two laboratories, it remained very low, and in one, it increased from very low to low. The percentage of exams compatible with a high-grade squamous intraepithelial lesion (HSIL) remained within the recommended limits in five laboratories, while in three laboratories it progressed from below the recommended levels to >0.4% of the total number of satisfactory exams, and in four laboratories it remained below the standard limit. Both the percentage of atypical squamous cells of undetermined significance (ASC-US) in relation to abnormal exams, and the ratio between ASC-US and intraepithelial lesions remained within recommended levels in all the laboratories investigated. An improvement was found in the indicators represented by the positivity index and the percentage of exams compatible with a high-grade squamous intraepithelial lesion, showing that the role played by the external quality control laboratory in providing continued education contributed towards improving laboratory staff skills in detecting cervical cancer precursor lesions.
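    For readers unfamiliar with these monitoring indicators, the sketch below shows how they can be computed from annual exam counts. The counts, and the exact indicator definitions used here, are illustrative assumptions rather than data or formulas from the monitored laboratories.

    ```python
    # Illustrative computation of common cervical cytology monitoring indicators.
    def cytology_indicators(n_satisfactory, n_abnormal, n_ascus, n_hsil, n_sil):
        """Return indicators as percentages/ratios from hypothetical annual counts."""
        positivity_index = 100.0 * n_abnormal / n_satisfactory   # % abnormal among satisfactory exams
        hsil_percent = 100.0 * n_hsil / n_satisfactory           # % HSIL among satisfactory exams
        ascus_of_abnormal = 100.0 * n_ascus / n_abnormal         # % ASC-US among abnormal exams
        ascus_sil_ratio = n_ascus / n_sil                        # ASC-US : SIL ratio
        return positivity_index, hsil_percent, ascus_of_abnormal, ascus_sil_ratio

    # Hypothetical laboratory year: 20,000 satisfactory exams, 600 abnormal,
    # 250 ASC-US, 90 HSIL, 300 squamous intraepithelial lesions (SIL).
    pi, hsil, ascus_abn, ratio = cytology_indicators(20000, 600, 250, 90, 300)
    print(f"PI={pi:.2f}%  HSIL={hsil:.2f}%  ASC-US/abnormal={ascus_abn:.1f}%  ASC-US:SIL={ratio:.2f}")
    ```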

  16. A Web-Based Remote Access Laboratory Using SCADA

    ERIC Educational Resources Information Center

    Aydogmus, Z.; Aydogmus, O.

    2009-01-01

    The Internet provides an opportunity for students to access laboratories from outside the campus. This paper presents a Web-based remote access real-time laboratory using SCADA (supervisory control and data acquisition) control. The control of an induction motor is used as an example to demonstrate the effectiveness of this remote laboratory,…

  17. Selecting automation for the clinical chemistry laboratory.

    PubMed

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  18. Internal quality control: best practice.

    PubMed

    Kinns, Helen; Pitkin, Sarah; Housley, David; Freedman, Danielle B

    2013-12-01

    There is a wide variation in laboratory practice with regard to implementation and review of internal quality control (IQC). A poor approach can lead to a spectrum of scenarios from validation of incorrect patient results to over investigation of falsely rejected analytical runs. This article will provide a practical approach for the routine clinical biochemistry laboratory to introduce an efficient quality control system that will optimise error detection and reduce the rate of false rejection. Each stage of the IQC system is considered, from selection of IQC material to selection of IQC rules, and finally the appropriate action to follow when a rejection signal has been obtained. The main objective of IQC is to ensure day-to-day consistency of an analytical process and thus help to determine whether patient results are reliable enough to be released. The required quality and assay performance varies between analytes as does the definition of a clinically significant error. Unfortunately many laboratories currently decide what is clinically significant at the troubleshooting stage. Assay-specific IQC systems will reduce the number of inappropriate sample-run rejections compared with the blanket use of one IQC rule. In practice, only three or four different IQC rules are required for the whole of the routine biochemistry repertoire as assays are assigned into groups based on performance. The tools to categorise performance and assign IQC rules based on that performance are presented. Although significant investment of time and education is required prior to implementation, laboratories have shown that such systems achieve considerable reductions in cost and labour.
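    As an illustration of rule-based IQC of the kind discussed above, the sketch below evaluates one analytical run against two common control rules (a 1-3s and a 2-2s rule, Westgard-style). The rule selection and the QC targets are assumptions for the example, not the article's recommended configuration.

    ```python
    # Evaluate one run's QC results against a simple two-rule set.
    def z_scores(values, target_mean, target_sd):
        return [(v - target_mean) / target_sd for v in values]

    def evaluate_run(qc_values, target_mean, target_sd):
        """Return (rejected, reason) for one analytical run's QC results."""
        z = z_scores(qc_values, target_mean, target_sd)
        if any(abs(x) > 3 for x in z):                       # 1-3s: one control beyond 3 SD
            return True, "1-3s violation"
        for a, b in zip(z, z[1:]):                           # 2-2s: two consecutive beyond 2 SD, same side
            if abs(a) > 2 and abs(b) > 2 and a * b > 0:
                return True, "2-2s violation"
        return False, "in control"

    print(evaluate_run([101.0, 104.5], target_mean=100.0, target_sd=2.0))   # (False, 'in control')
    print(evaluate_run([105.0, 104.8], target_mean=100.0, target_sd=2.0))   # (True, '2-2s violation')
    ```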

  19. Technology for On-Chip Qubit Control with Microfabricated Surface Ion Traps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Highstrete, Clark; Scott, Sean Michael; Nordquist, Christopher D.

    2013-11-01

    Trapped atomic ions are a leading physical system for quantum information processing. However, scalability and operational fidelity remain limiting technical issues often associated with optical qubit control. One promising approach is to develop on-chip microwave electronic control of ion qubits based on the atomic hyperfine interaction. This project developed expertise and capabilities at Sandia toward on-chip electronic qubit control in a scalable architecture. The project developed a foundation of laboratory capabilities, including trapping the 171Yb+ hyperfine ion qubit and developing an experimental microwave coherent control capability. Additionally, the project investigated the integration of microwave device elements with surface ion traps utilizing Sandia’s state-of-the-art MEMS microfabrication processing. This effort culminated in a device design for a multi-purpose ion trap experimental platform for investigating on-chip microwave qubit control, laying the groundwork for further funded R&D to develop on-chip microwave qubit control in an architecture that is suitable to engineering development.

  20. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    PubMed

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that were beyond what would be expected due to chance alone. Patterns of test results suggested that variations were systematic. We conclude that laboratories performing the BeBLPT or other similar biological assays of immunological response could benefit from a statistical approach such as SPC to improve quality management.
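    A minimal sketch of the SPC approach referred to above is given below: an individuals control chart whose limits are estimated from the average moving range of stimulation-index values. The data are invented for illustration and do not come from the surveillance program.

    ```python
    # Shewhart individuals control chart with moving-range-based 3-sigma limits.
    import numpy as np

    def individuals_chart_limits(values):
        """Center line and 3-sigma limits estimated from the average moving range."""
        x = np.asarray(values, dtype=float)
        moving_range = np.abs(np.diff(x))
        sigma_hat = moving_range.mean() / 1.128        # d2 constant for subgroups of 2
        center = x.mean()
        return center, center - 3 * sigma_hat, center + 3 * sigma_hat

    stimulation_index = [1.1, 0.9, 1.0, 1.1, 1.0, 0.9, 1.0, 4.0]   # hypothetical values
    cl, lcl, ucl = individuals_chart_limits(stimulation_index)
    out_of_control = [i for i, v in enumerate(stimulation_index) if not lcl <= v <= ucl]
    print(f"CL={cl:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, out-of-control points: {out_of_control}")
    ```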

  1. Analysis of hygienic critical control points in boar semen production.

    PubMed

    Schulze, M; Ammon, C; Rüdiger, K; Jung, M; Grobbel, M

    2015-02-01

    The present study addresses the microbiological results of a quality control audit in artificial insemination (AI) boar studs in Germany and Austria. The raw and processed semen of 344 boars in 24 AI boar studs were analyzed. Bacteria were found in 26% (88 of 344) of the extended ejaculates and 66.7% (18 of 24) of the boar studs. The bacterial species found in the AI dose were not cultured from the respective raw semen in 95.5% (84 of 88) of the positive samples. These data, together with the fact that in most cases all the samples from one stud were contaminated with identical bacteria (species and resistance profile), indicate contamination during processing. Microbiological investigations of the equipment and the laboratory environment during semen processing in 21 AI boar studs revealed nine hygienic critical control points (HCCP), which were addressed after the first audit. On the basis of the analysis of the contamination rates of the ejaculate samples, improvements in the hygiene status were already present in the second audit (P = 0.0343, F-test). Significant differences were observed for heating cabinets (improvement, P = 0.0388) and manual operating elements (improvement, P = 0.0002). The odds ratio of finding contaminated ejaculates in the first and second audit was 1.68 (with the 95% confidence interval ranging from 1.04 to 2.69). Furthermore, an overall good hygienic status was shown for extenders, the inner face of dilution tank lids, dyes, and ultrapure water treatment plants. Among the nine HCCP considered, the most heavily contaminated samples, as assessed by the median scores throughout all the studs, were found in the sinks and/or drains. High numbers (>10(3) colony-forming units/cm(2)) of bacteria were found in the heating cabinets, ejaculate transfer, manual operating elements, and laboratory surfaces. In conclusion, the present study emphasizes the need for both training of the laboratory staff in monitoring HCCP in routine semen production and audits in such AI centers for the external control of hygiene parameters. Copyright © 2015 Elsevier Inc. All rights reserved.
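    The odds ratio and confidence interval quoted above can be reproduced in form (not in exact value) from a 2x2 table of contaminated versus clean ejaculates in the two audits. The counts in the sketch below are hypothetical; only the calculation itself is shown.

    ```python
    # Odds ratio with a Woolf (log-scale) 95% confidence interval from a 2x2 table.
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """OR and CI for a 2x2 table: a/b = contaminated/clean (audit 1), c/d = (audit 2)."""
        or_ = (a * d) / (b * c)
        se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo = math.exp(math.log(or_) - z * se_log)
        hi = math.exp(math.log(or_) + z * se_log)
        return or_, lo, hi

    # Hypothetical counts: 88 of 344 ejaculates contaminated in audit 1, 60 of 344 in audit 2.
    print(odds_ratio_ci(88, 344 - 88, 60, 344 - 60))
    ```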

  2. Research and the planned Space Experiment Research and Processing Laboratory

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Original photo and caption dated August 14, 1995: 'KSC plant physiologist Dr. Gary Stutte harvests a potato grown in the Biomass Production Chamber of the Controlled Environment Life Support System (CELSS) in Hangar L at Cape Canaveral Air Station. During a 418-day 'human rated' experiment, potato crops grown in the chamber provided the equivalent of a continuous supply of oxygen for one astronaut, along with 55 percent of that long-duration space flight crew member's caloric food requirements and enough purified water for four astronauts, while absorbing their expelled carbon dioxide. The experiment provided data that will help demonstrate the feasibility of the CELSS operating as a bioregenerative life support system for lunar and deep-space missions that can operate independently without the need to carry consumables such as air, water and food, while not requiring the expendable air and water system filters necessary on today's human-piloted spacecraft.' His work is an example of the type of life sciences research that will be conducted at the Space Experiment Research and Processing Laboratory (SERPL). The SERPL is a planned 100,000-square-foot laboratory that will provide expanded and upgraded facilities for hosting International Space Station experiment processing. In addition, it will provide better support for other biological and life sciences payload processing at KSC. It will serve as a magnet facility for a planned 400-acre Space Station Commerce Park.

  3. Research and the planned Space Experiment Research and Processing Laboratory

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Original photo and caption dated August 14, 1995: 'KSC plant physiologist Dr. Gary Stutte (right) and Cheryl Mackowiak harvest potatoes grown in the Biomass Production Chamber of the Controlled Environment Life Support System (CELSS) in Hangar L at Cape Canaveral Air Station. During a 418-day 'human rated' experiment, potato crops grown in the chamber provided the equivalent of a continuous supply of oxygen for one astronaut, along with 55 percent of that long-duration space flight crew member's caloric food requirements and enough purified water for four astronauts, while absorbing their expelled carbon dioxide. The experiment provided data that will help demonstrate the feasibility of the CELSS operating as a bioregenerative life support system for lunar and deep-space missions that can operate independently without the need to carry consumables such as air, water and food, while not requiring the expendable air and water system filters necessary on today's human-piloted spacecraft.' Their work is an example of the type of life sciences research that will be conducted at the Space Experiment Research and Processing Laboratory (SERPL). The SERPL is a planned 100,000-square-foot laboratory that will provide expanded and upgraded facilities for hosting International Space Station experiment processing. In addition, it will provide better support for other biological and life sciences payload processing at KSC. It will serve as a magnet facility for a planned 400-acre Space Station Commerce Park.

  4. Variety of Sedimentary Process and Distribution of Tsunami Deposits in Laboratory Experiments

    NASA Astrophysics Data System (ADS)

    Yamaguchi, N.; Sekiguchi, T.

    2017-12-01

    As an indicator of the history and magnitude of paleotsunami events, tsunami deposits have received considerable attention. To improve the identification and interpretation of paleotsunami deposits, an understanding of sedimentary process and distribution of tsunami deposits is crucial. Recent detailed surveys of onshore tsunami deposits including the 2004 Indian Ocean tsunami and the 2011 Tohoku-oki tsunami have revealed that terrestrial topography causes a variety of their features and distributions. Therefore, a better understanding of possible sedimentary process and distribution on such influential topographies is required. Flume experiments, in which sedimentary conditions can be easily controlled, can provide insights into the effects of terrestrial topography as well as tsunami magnitude on the feature of tsunami deposits. In this presentation, we report laboratory experiments that focused on terrestrial topography including a water body (e.g. coastal lake) on a coastal lowland and a cliff. In both cases, the results suggested relationship between the distribution of tsunami deposits and the hydraulic condition of the tsunami flow associated with the terrestrial topography. These experiments suggest that influential topography would enhance the variability in thickness of tsunami deposits, and thus, in reconstructions of paleotsunami events using sedimentary records, we should take into account such anomalous distribution of tsunami deposits. Further examination of the temporal sequence of sedimentary process in laboratory tsunamis may improve interpretation and estimation of paleotsunami events.

  5. Are we drunk yet? Motor versus cognitive cues of subjective intoxication.

    PubMed

    Celio, Mark A; Usala, Julie M; Lisman, Stephen A; Johansen, Gerard E; Vetter-O'Hagen, Courtney S; Spear, Linda P

    2014-02-01

    Perception of alcohol intoxication presumably plays an important role in guiding behavior during a current drinking episode. Yet, there has been surprisingly little investigation of what aspects associated with intoxication are used by individuals to attribute their level of intoxication. Building on recent laboratory-based findings, this study employed a complex field-based design to explore the relative contributions of motor performance versus cognitive performance-specifically executive control-on self-attributions of intoxication. Individuals recruited outside of bars (N = 280; mean age = 22; range: 18 to 32) completed a structured interview, self-report questionnaire, and neuropsychological testing battery, and provided a breath alcohol concentration (BrAC) sample. Results of a multiple linear regression analysis demonstrated that current level of subjective intoxication was associated with current alcohol-related stimulant effects, current sedative effects, and current BrAC. After controlling for the unique variance accounted for by these factors, subjective intoxication was better predicted by simple motor speed, as indexed by performance on the Finger Tapping Test, than by executive control, as indexed by performance on the Trail Making Test. These results-generated from data collected in a naturally occurring setting-support previous findings from a more traditional laboratory-based investigation, thus illustrating the iterative process of linking field methodology and controlled laboratory experimentation. Copyright © 2013 by the Research Society on Alcoholism.

  6. 77 FR 14805 - Clinical Laboratory Improvement Advisory Committee, Centers for Disease Control and Prevention...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-13

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Clinical Laboratory Improvement Advisory Committee, Centers for Disease Control and Prevention: Notice of Charter..., that the Clinical Laboratory Improvement Advisory Committee, Centers for Disease Control and Prevention...

  7. Description of the Spacecraft Control Laboratory Experiment (SCOLE) facility

    NASA Technical Reports Server (NTRS)

    Williams, Jeffrey P.; Rallo, Rosemary A.

    1987-01-01

    A laboratory facility for the study of control laws for large flexible spacecraft is described. The facility fulfills the requirements of the Spacecraft Control Laboratory Experiment (SCOLE) design challenge for a laboratory experiment, which will allow slew maneuvers and pointing operations. The structural apparatus is described in detail sufficient for modelling purposes. The sensor and actuator types and characteristics are described so that identification and control algorithms may be designed. The control implementation computer and real-time subroutines are also described.

  8. Description of the Spacecraft Control Laboratory Experiment (SCOLE) facility

    NASA Technical Reports Server (NTRS)

    Williams, Jeffrey P.; Rallo, Rosemary A.

    1987-01-01

    A laboratory facility for the study of control laws for large flexible spacecraft is described. The facility fulfills the requirements of the Spacecraft Control Laboratory Experiment (SCOLE) design challenge for laboratory experiments, which will allow slew maneuvers and pointing operations. The structural apparatus is described in detail sufficient for modelling purposes. The sensor and actuator types and characteristics are described so that identification and control algorithms may be designed. The control implementation computer and real-time subroutines are also described.

  9. Gemas: issues from the comparison of aqua regia and X-ray fluorescence results

    NASA Astrophysics Data System (ADS)

    Dinelli, Enrico; Birke, Manfred; Reimann, Clemens; Demetriades, Alecos; DeVivo, Benedetto; Flight, Dee; Ladenberger, Anna; Albanese, Stefano; Cicchella, Domenico; Lima, Annamaria

    2014-05-01

    The comparison of analytical results from aqua regia (AR) and X-ray fluorescence spectroscopy (XRF) can provide information on the soil processes controlling element distribution in soil. The GEMAS (GEochemical Mapping of Agricultural and grazing land Soils) agricultural soil database, consisting of 2 x ca. 2100 samples spread evenly over 33 European countries, is used for this comparison. Analyses for the same suite of elements and parameters were carried out in the same laboratory under strict quality control procedures. Sample preparation was conducted at the laboratory of the Geological Survey of the Slovak Republic, AR analyses were carried out at ACME Labs, and XRF analyses at the Federal Institute for Geosciences and Natural Resources, Germany. Element recovery by AR varies widely, ranging from <1% (e.g. Na, Zr) to >80% (e.g. Mn, P, Co). Recovery is controlled by the mineralogy of the parent material, but geographic and climatic factors and the weathering history of the soils are also important. Nonetheless, even the very low-recovery elements show wide ranges of variation and spatial patterns that are affected by factors other than soil parent material. For many elements, soil pH has a clear influence on AR extractability: under acidic soil conditions almost all elements tend to be leached and their extractability is generally low; extractability progressively increases with increasing pH and is highest in the pH range 7-8. The clay content of the soil is also critical, with higher clay abundance corresponding to higher extractability for almost all elements. Other factors, such as the organic matter content of the soil and the occurrence of Fe and Mn, are important for certain elements or in selected areas. This work illustrates that there are significant differences in the extractability of elements from soils and addresses important influencing factors related to soil properties, geology and climate.
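    The recovery figures quoted above follow from a simple ratio of the aqua regia result to the near-total XRF result. The sketch below shows that calculation with invented concentrations chosen only to mimic the contrast between Mn and the low-recovery elements Na and Zr.

    ```python
    # Aqua regia (AR) recovery expressed as a percentage of the XRF near-total
    # concentration; element values are hypothetical (units: mg/kg).
    samples = [
        {"element": "Mn", "ar": 450.0, "xrf": 520.0},
        {"element": "Na", "ar": 90.0, "xrf": 14000.0},
        {"element": "Zr", "ar": 2.0, "xrf": 250.0},
    ]

    for s in samples:
        recovery = 100.0 * s["ar"] / s["xrf"]
        print(f'{s["element"]}: AR recovery {recovery:.1f}% of XRF total')
    ```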

  10. [Construction and operation status of management system of laboratories of schistosomiasis control institutions in Hubei Province].

    PubMed

    Zhao-Hui, Zheng; Jun, Qin; Li, Chen; Hong, Zhu; Li, Tang; Zu-Wu, Tu; Ming-Xing, Zeng; Qian, Sun; Shun-Xiang, Cai

    2016-10-09

    To analyze the construction and operation status of the management systems of laboratories in schistosomiasis control institutions in Hubei Province, so as to provide a reference for the standardized testing and management of schistosomiasis laboratories. According to the schistosomiasis laboratory standards at the provincial, municipal and county levels, the construction and operation status of the management systems of 60 schistosomiasis control institutions was assessed by the acceptance examination method from 2013 to 2015. A management system had been established in all the laboratories of the schistosomiasis control institutions and was officially running. There were 588 non-conformities, giving a non-conformity rate of 19.60%. The non-conformity rate for laboratory quality control within the management system was 38.10% (224 cases), and the non-conformity rate for instrument and equipment requirements was 23.81% (140 cases). The management system has played an important role in the standardized management of schistosomiasis laboratories.

  11. Errors in the Extra-Analytical Phases of Clinical Chemistry Laboratory Testing.

    PubMed

    Zemlin, Annalise E

    2018-04-01

    The total testing process consists of various phases from the pre-preanalytical to the post-postanalytical phase, the so-called brain-to-brain loop. With improvements in analytical techniques and efficient quality control programmes, most laboratory errors now occur in the extra-analytical phases. There has been recent interest in these errors with numerous publications highlighting their effect on service delivery, patient care and cost. This interest has led to the formation of various working groups whose mission is to develop standardized quality indicators which can be used to measure the performance of service of these phases. This will eventually lead to the development of external quality assessment schemes to monitor these phases in agreement with ISO15189:2012 recommendations. This review focuses on potential errors in the extra-analytical phases of clinical chemistry laboratory testing, some of the studies performed to assess the severity and impact of these errors and processes that are in place to address these errors. The aim of this review is to highlight the importance of these errors for the requesting clinician.

  12. [Validation of measurement methods and estimation of uncertainty of measurement of chemical agents in the air at workstations].

    PubMed

    Dobecki, Marek

    2012-01-01

    This paper reviews the requirements for measurement methods for chemical agents in the air at workstations. European standards, which have the status of Polish standards, comprise requirements and information on sampling strategy, measuring techniques, types of samplers, sampling pumps and methods of evaluating occupational exposure for a given technological process. Measurement methods, including air sampling and the analytical procedure in the laboratory, should be appropriately validated before their intended use. In the validation process, selected methods are tested and an uncertainty budget is set up. The validation procedure that should be implemented in the laboratory, together with suitable statistical tools and the major components of uncertainty to be taken into consideration, is presented in this paper. Methods of quality control, including sampling and laboratory analyses, are discussed. The relative expanded uncertainty of each measurement, expressed as a percentage, should not exceed the limit values set depending on the type of occupational exposure (short-term or long-term) and the magnitude of exposure to chemical agents in the work environment.
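    The uncertainty budget mentioned above can be illustrated with a simple quadrature combination of relative standard uncertainties followed by expansion with a coverage factor. The component names, their sizes and the choice of k = 2 are assumptions for the example only.

    ```python
    # Combine independent relative standard uncertainties (%) in quadrature and
    # report the relative expanded uncertainty U = k * u_c.
    import math

    def relative_expanded_uncertainty(components_percent, coverage_factor=2.0):
        """Combine relative standard uncertainties (%) and expand with coverage factor k."""
        u_combined = math.sqrt(sum(u ** 2 for u in components_percent))
        return coverage_factor * u_combined

    components = [3.0,   # sampling (pump flow rate), %
                  2.5,   # sample recovery / desorption efficiency, %
                  4.0]   # analytical (calibration, repeatability), %

    U_rel = relative_expanded_uncertainty(components)
    print(f"Relative expanded uncertainty (k=2): {U_rel:.1f}%")
    ```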

  13. Engine Research Building’s Central Control Room

    NASA Image and Video Library

    1948-07-21

    Operators in the Engine Research Building’s Central Control Room at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory. The massive 4.25-acre Engine Research Building contains dozens of test cells, test stands, and altitude chambers. A powerful collection of compressors and exhausters located in the central portion of the basement provided process air and exhaust for these test areas. This system is connected to similar process air systems in the laboratory’s other large test facilities. The Central Control Room coordinates this activity and communicates with the local utilities. This photograph was taken just after a major upgrade to the control room in 1948. The panels on the wall contain rudimentary floor plans of the different Engine Research Building sections with indicator lights and instrumentation for each test cell. The process air equipment included 12 exhausters, four compressors, a refrigeration system, cooling water, and an exhaust system. The operators in the control room kept in contact with engineers running the process air system and those conducting the tests in the test cells. The operators also coordinated with the local power companies to make sure enough electricity was available to operate the powerful compressors and exhausters.

  14. Temperature management during semen processing: Impact on boar sperm quality under laboratory and field conditions.

    PubMed

    Schulze, M; Henning, H; Rüdiger, K; Wallner, U; Waberski, D

    2013-12-01

    Freshly collected boar spermatozoa are sensitive to a fast reduction in temperature because of lipid phase transition and phase separation processes. Temperature management during semen processing may determine the quality of stored samples. The aim of this study was to evaluate the influence of isothermic and hypothermic semen processing protocols on boar sperm quality under laboratory and field conditions. In the laboratory study, ejaculates (n = 12) were first diluted (1:1) with Beltsville Thawing Solution (BTS) at 32 °C, then processed either with isothermic (32 °C) or hypothermic (21 °C) BTS, stored at 17 °C, and assessed on days 1, 3, and 6. Temperature curves showed that 150 minutes after the first dilution, semen doses of both groups reached the same temperature. Two-step hypothermic processing resulted in lower sperm motility on days 1 and 6 (P < 0.05). Concomitantly, hypothermally processed samples contained less membrane intact sperm on days 3 and 6 (P < 0.05). Using AndroStar Plus extender instead of BTS reduced the negative effect of hypothermic processing. In the field study, 15 semen samples from each of 23 European artificial insemination studs were evaluated as part of an external quality control program. Semen quality based on motility, membrane integrity, mitochondrial activity, and a thermoresistance test was higher for stations using one-step isothermic dilutions (n = 7) compared with artificial insemination centers using two-step hypothermic protocols (n = 16). Both studies show that chilling injury associated with hypothermic dilution results in lower quality of stored boar semen compared with isothermic dilution and that the type of semen extender affects the outcomes. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Observational Studies of Parameters Influencing Air-sea Gas Exchange

    NASA Astrophysics Data System (ADS)

    Schimpf, U.; Frew, N. M.; Bock, E. J.; Hara, T.; Garbe, C. S.; Jaehne, B.

    A physically based model of air-sea gas transfer that can predict gas transfer rates with sufficient accuracy as a function of micrometeorological parameters is still lacking; the state of the art remains simple gas transfer rate/wind speed relationships. Previous measurements from the Coastal Ocean Experiment in the Atlantic revealed positive correlations between mean square slope, near-surface turbulent dissipation, and wind stress, and demonstrated a strong negative correlation between mean square slope and the fluorescence of surface-enriched colored dissolved organic matter. Using heat as a proxy tracer for gases, the exchange process at the air/water interface and the micro-turbulence at the water surface can be investigated. The analysis of infrared image sequences allows the determination of the net heat flux at the ocean surface and the temperature gradient across the air/sea interface, and thus the heat transfer velocity and the gas transfer velocity, respectively. Laboratory studies were carried out in the new Heidelberg wind-wave facility AELOTRON. Direct measurements of the Schmidt number exponent were made in conjunction with classical mass balance methods to estimate the transfer velocity. The laboratory results allowed the basic assumptions of the so-called controlled flux technique to be validated by applying different tracers for gas exchange over a large Schmidt number range. Modeling of the Schmidt number exponent is thus able to fill the gap between laboratory and field measurements. Both the laboratory results and the field measurements should further the understanding of the mechanisms controlling the transport processes across the aqueous boundary layer and help relate the forcing functions to parameters measured by remote sensing.
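    The use of heat as a proxy tracer rests on Schmidt-number scaling of the transfer velocity, k2 = k1 (Sc2/Sc1)^(-n). The sketch below applies that relation with illustrative values; the Schmidt numbers, the heat transfer velocity and the exponent are assumptions, not results from the AELOTRON experiments.

    ```python
    # Scale a heat transfer velocity to a gas transfer velocity via Schmidt numbers.
    # n is the Schmidt number exponent (commonly ~2/3 for a smooth surface and
    # ~1/2 for a wavy surface); all numbers here are approximate assumptions.
    def convert_transfer_velocity(k_heat, sc_heat, sc_gas, n=0.5):
        return k_heat * (sc_gas / sc_heat) ** (-n)

    k_heat = 25.0     # heat transfer velocity from infrared imagery [cm/h], assumed
    sc_heat = 7.0     # Prandtl number of heat in water near 20 degC, approximate
    sc_co2 = 600.0    # Schmidt number of CO2 in water near 20 degC, approximate

    print(f"k(CO2) ~ {convert_transfer_velocity(k_heat, sc_heat, sc_co2):.1f} cm/h")
    ```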

  16. Analysis of STAT laboratory turnaround times before and after conversion of the hospital information system.

    PubMed

    Lowe, Gary R; Griffin, Yolanda; Hart, Michael D

    2014-08-01

    Modern electronic health record systems (EHRS) reportedly offer advantages including improved quality, error prevention, cost reduction, and increased efficiency. This project reviewed the impact on specimen turnaround times (TAT) and percent compliance for specimens processed in a STAT laboratory after implementation of an upgraded EHRS. Before EHRS implementation, laboratory personnel received instruction and training for specimen processing. One laboratory member per shift received additional training. TAT and percent compliance data sampling occurred 4 times monthly for 13 months post-conversion and were compared with the mean of data collected for 3 months pre-conversion. Percent compliance was gauged using a benchmark of reporting 95% of all specimens within 7 min from receipt. Control charts were constructed for TAT and percent compliance with control limits set at 2 SD and applied continuously through the data collection period. TAT recovered to pre-conversion levels by the 6th month post-conversion. Percent compliance consistently returned to pre-conversion levels by the 10th month post-conversion. Statistical analyses revealed the TAT were significantly longer for 3 months post-conversion (P < .001) compared with pre-conversion levels. Statistical significance was not observed for subsequent groups. Percent compliance results were significantly lower for 6 months post-conversion (P < .001). Statistical significance was not observed for subsequent groups. Extensive efforts were made to train and prepare personnel for challenges expected after the EHRS upgrade. Specific causes identified with the upgraded EHRS included multiple issues involving personnel and the EHRS. These data suggest that system and user issues contributed to delays in returning to pre-conversion TAT and percent compliance levels following the upgrade in the EHRS.
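
    The monitoring described above rests on two simple calculations: control limits at ±2 SD around the mean turnaround time, and the percentage of specimens reported within the 7-minute benchmark. A minimal sketch of both, with hypothetical turnaround-time data and helper names:

    ```python
    # Minimal sketch of the monitoring described above: compute +/-2 SD control
    # limits for turnaround time (TAT) and the percent of specimens meeting a
    # 7-minute benchmark. Data and function names are hypothetical.
    from statistics import mean, stdev

    def control_limits(samples, k=2.0):
        """Return (center line, lower control limit, upper control limit)."""
        m, s = mean(samples), stdev(samples)
        return m, m - k * s, m + k * s

    def percent_compliance(tat_minutes, benchmark_min=7.0):
        """Percent of specimens reported within the benchmark time."""
        within = sum(1 for t in tat_minutes if t <= benchmark_min)
        return 100.0 * within / len(tat_minutes)

    if __name__ == "__main__":
        tat = [5.2, 6.8, 7.4, 4.9, 6.1, 8.3, 5.7, 6.0, 6.6, 7.1]  # example TATs, minutes
        cl, lcl, ucl = control_limits(tat)
        print(f"center {cl:.2f} min, limits [{lcl:.2f}, {ucl:.2f}] min")
        print(f"compliance: {percent_compliance(tat):.1f}% (target >= 95%)")
    ```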

  17. Chemistry and haematology sample rejection and clinical impact in a tertiary laboratory in Cape Town.

    PubMed

    Jacobsz, Lourens A; Zemlin, Annalise E; Roos, Mark J; Erasmus, Rajiv T

    2011-10-14

    Recent publications report that up to 70% of total laboratory errors occur in the pre-analytical phase. Identification of specific problems highlights pre-analytical processes susceptible to errors. The rejection of unsuitable samples can lead to delayed turnaround time and affect patient care. A retrospective audit was conducted investigating the rejection rate of routine blood specimens received at the chemistry and haematology laboratories over a 2-week period. The reasons for rejection and the potential clinical impact of these rejections were investigated. Thirty patient files were randomly selected and examined to assess the impact of these rejections on clinical care. A total of 32,910 specimens were received during the study period, of which 481 were rejected, giving a rejection rate of 1.46%. The main reasons for rejection were inappropriate clotting (30%) and inadequate sample volume (22%). Only 51.7% of rejected samples were repeated, and the average time for a repeat sample to reach the laboratory was about 5 days (121 h). Of the repeated samples, 5.1% had results within critical values. Examination of patient folders showed that in 40% of cases the rejection of samples had an impact on patient care. The evaluation of pre-analytical processes in the laboratory with regard to sample rejection allowed identification of problem areas where improvement is necessary. Samples rejected because of factors outside the laboratory's control had a definite impact on patient care and can thus affect customer satisfaction. Clinicians should be aware of these factors to prevent such rejections.

  18. To other worlds via the laboratory (Invited)

    NASA Astrophysics Data System (ADS)

    Lorenz, R. D.

    2009-12-01

    Planetary science is fun, largely by virtue of the wide range of disciplines and techniques it embraces. Progress relies not only on spacecraft observation and models, but also on laboratory work to provide reference data with which to interpret observations and to provide quantitative constraints on model parameters. An important distinction should be drawn between two classes of investigation. The most familiar, pursued by those who make laboratory studies the focus of their careers, is the construction of well-controlled experiments, typically to determine the functional dependence of some desired physical property upon one or two controlled parameters such as temperature, pressure or concentration. Another class of experiment is more exploratory - to 'see what happens'. This exercise often reveals that models may be based on entirely false assumptions. In some cases laboratory results also have value as persuasive tools in providing graphic support for unfamiliar properties or processes - the iconic image of 'flaming ice' makes the exotic notion of methane clathrate immediately accessible. This talk will review the role of laboratory work in planetary science and especially the outer solar system. A few of the author's personal forays into laboratory measurements will be discussed in the talk; these include the physical properties of desiccated icy loess in the US Army Permafrost Tunnel in Alaska (as a Mars analog), the use of a domestic microwave oven to measure radar absorptivity (in particular of ammonia-rich water ice), and the generation of waves - and ice - on the surface of a liquid by wind, with fluid and air parameters appropriate to Mars and Titan rather than Earth, using the MARSWIT wind tunnel at NASA Ames.

  19. The pilot plant for electron beam food processing

    NASA Astrophysics Data System (ADS)

    Migdal, W.; Walis, L.; Chmielewski, A. G.

    1993-07-01

    In the framework of the national programme on the application of irradiation for food preservation and hygienization, an experimental plant for electron beam processing has been established at INCT. The pilot plant was constructed inside an old fort, which significantly decreased the cost of the investment. The pilot plant is equipped with a small research accelerator, Pilot (10 MeV, 1 kW), and an industrial unit, Elektronika (10 MeV, 10 kW). This allows both laboratory-scale and full technological-scale testing of the developed processes. The industrial unit is being equipped with an e-/X conversion target for irradiation of high-density products. On the basis of research performed at different scientific institutions in Poland, health authorities have issued permanent permissions for the treatment of spices, garlic, and onions, and temporary permissions for mushrooms and potatoes. Dosimetric methods have been developed for routine use at the plant. In the INCT laboratory, methods for the control of e-/X-treated food have been established.

  20. Ethanolic carbon-11 chemistry: the introduction of green radiochemistry.

    PubMed

    Shao, Xia; Fawaz, Maria V; Jang, Keunsam; Scott, Peter J H

    2014-07-01

    The principles of green chemistry have been applied to a radiochemistry setting. Eleven carbon-11 labeled radiopharmaceuticals have been prepared using ethanol as the only organic solvent throughout the entire manufacturing process. The removal of all other organic solvents from the process simplifies production and quality control (QC) testing, moving our PET Center towards the first example of a green radiochemistry laboratory. All radiopharmaceutical doses prepared are suitable for clinical use. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. KSC Technical Capabilities Website

    NASA Technical Reports Server (NTRS)

    Nufer, Brian; Bursian, Henry; Brown, Laurette L.

    2010-01-01

    This document is the website pages that review the technical capabilities that the Kennedy Space Center (KSC) has for partnership opportunities. The purpose of this information is to make prospective customers aware of the capabilities and provide an opportunity to form relationships with the experts at KSC. The technical capabilities fall into these areas: (1) Ground Operations and Processing Services, (2) Design and Analysis Solutions, (3) Command and Control Systems / Services, (4) Materials and Processes, (5) Research and Technology Development and (6) Laboratories, Shops and Test Facilities.

  2. Creative revision - From rough draft to published paper

    NASA Technical Reports Server (NTRS)

    Buehler, M. F.

    1976-01-01

    The process of revising a technical or scientific paper can be performed more efficiently by the people involved (author, co-author, supervisor, editor) when the revision is controlled by breaking it into a series of steps. The revision process recommended here is based on the levels-of-edit concept that resulted from a study of the technical editorial function at the Jet Propulsion Laboratory of the California Institute of Technology. Types of revision discussed are Substantive, Policy, Language, Mechanical Style, Format, Integrity, and Copy Clarification.

  3. Avionics test bed development plan

    NASA Technical Reports Server (NTRS)

    Harris, L. H.; Parks, J. M.; Murdock, C. R.

    1981-01-01

    A development plan for a proposed avionics test bed facility for the early investigation and evaluation of new concepts for the control of large space structures, orbiter attached flex body experiments, and orbiter enhancements is presented. A distributed data processing facility that utilizes the current laboratory resources for the test bed development is outlined. Future studies required for implementation, the management system for project control, and the baseline system configuration are defined. A background analysis of the specific hardware system for the preliminary baseline avionics test bed system is included.

  4. Inertial Confinement Fusion as an Extreme Example of Dynamic Compression

    NASA Astrophysics Data System (ADS)

    Moses, E.

    2013-06-01

    Initiating and controlling thermonuclear burn at the National Ignition Facility (NIF) will require the manipulation of matter to extreme energy densities. We will discuss recent advances in both controlling the dynamic compression of ignition targets and our understanding of the physical states and processes leading to ignition. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory in part under Contract W-7405-Eng-48 and in part under Contract DE-AC52-07NA27344.

  5. Veg-03 Ground Harvest

    NASA Image and Video Library

    2016-12-05

    Inside the Veggie flight laboratory in the Space Station Processing Facility at NASA’s Kennedy Space Center in Florida, a research scientist harvests a portion of the 'Outredgeous' red romaine lettuce from the Veg-03 ground control unit. The purpose of the ground Veggie system is to provide a control group to compare against the lettuce grown in orbit on the International Space Station. Veg-03 will continue NASA’s deep space plant growth research to benefit the Earth and the agency’s journey to Mars.

  6. Mineback Stimulation Research Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warpinski, N.R.

    The objective of the Mineback Stimulation Research Experiments is to improve hydraulic fracture stimulation technology by providing an in situ laboratory where basic processes and mechanisms that control and influence fracture propagation can be observed, measured and understood. While previous tests have been instrumental in providing an understanding of the mechanisms controlling fracture height, current experiments are focused on fluid flow through the created fracture and the associated pressure drops and crack widths. Work performed, accomplishments and future plans are presented. 7 refs., 2 figs.

  7. 7 CFR 58.442 - Laboratory and quality control tests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Laboratory and quality control tests. 58.442 Section... Service 1 Operations and Operating Procedures § 58.442 Laboratory and quality control tests. (a) Chemical... Methods or by other methods giving equivalent results. (b) Weight or volume control. Representative...

  8. 7 CFR 58.442 - Laboratory and quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Laboratory and quality control tests. 58.442 Section... Service 1 Operations and Operating Procedures § 58.442 Laboratory and quality control tests. (a) Chemical... Methods or by other methods giving equivalent results. (b) Weight or volume control. Representative...

  9. 21 CFR 225.58 - Laboratory controls.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 4 2013-04-01 2013-04-01 false Laboratory controls. 225.58 Section 225.58 Food...: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR MEDICATED FEEDS Product Quality Control § 225.58 Laboratory controls. (a) The periodic assay of medicated feeds for drug components provides a measure of...

  10. 21 CFR 225.58 - Laboratory controls.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 4 2012-04-01 2012-04-01 false Laboratory controls. 225.58 Section 225.58 Food...: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR MEDICATED FEEDS Product Quality Control § 225.58 Laboratory controls. (a) The periodic assay of medicated feeds for drug components provides a measure of...

  11. 21 CFR 225.58 - Laboratory controls.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Laboratory controls. 225.58 Section 225.58 Food...: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR MEDICATED FEEDS Product Quality Control § 225.58 Laboratory controls. (a) The periodic assay of medicated feeds for drug components provides a measure of...

  12. Coordination and standardization of federal sedimentation activities

    USGS Publications Warehouse

    Glysson, G. Douglas; Gray, John R.

    1997-01-01

    - precipitation information critical to water resources management. Memorandum M-92-01 covers primarily freshwater bodies and includes activities, such as "development and distribution of consensus standards, field-data collection and laboratory analytical methods, data processing and interpretation, data-base management, quality control and quality assurance, and water- resources appraisals, assessments, and investigations." Research activities are not included.

  13. This view of Jupiter was taken by Voyager 1

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This view of Jupiter was taken by Voyager 1 through color filters and recombined to produce the color image. The photo was assembled from three black-and-white negatives by the Image Processing Lab at the Jet Propulsion Laboratory. JPL manages and controls the Voyager project for NASA's Office of Space Science.

  14. Learner's Guide: Water Quality Monitoring. An Instructional Guide for the Two-Year Water Quality Monitoring Curriculum.

    ERIC Educational Resources Information Center

    Glazer, Richard B.; And Others

    This learner's guide is designed to meet the training needs for technicians involved in monitoring activities related to the Federal Water Pollution Act and the Safe Drinking Water Act. In addition it will assist technicians in learning how to perform process control laboratory procedures for drinking water and wastewater treatment plant…

  15. Hypothesis Testing of Edge Organizations: Laboratory Experimentation Using the ELICIT Multiplayer Intelligence Game

    DTIC Science & Technology

    2007-06-01


  16. Virtual laboratory for the study of transport processes in surface waterflows

    NASA Astrophysics Data System (ADS)

    Aguilar, C.; Egüen, M.; Contreras, E.; Polo, M. J.

    2012-04-01

    The equations involved in the study of transport processes depend on the spatial and temporal scale of the study and, according to the required level of detail, can become very difficult to solve analytically. Moreover, experimentation with processes involving transport phenomena is complex because of their natural or forced occurrence in the environment (e.g., rainfall-runoff, sediment yield, controlled and uncontrolled pollutant loadings) and the great diversity of substances and components, each with a specific chemical behavior. At the same time, because of the numerous fields of application of transport phenomena (basic and applied research, hydrology and associated fluxes, sediment transport, pollutant loadings to water flows, industrial processes, soil and water quality, atmospheric emissions, legislation, etc.), realistic studies of transport processes are required. In this context, case-study application, an active methodology in line with the structural implications of the European Higher Education Area (EHEA), supported by computer tools, constitutes an interactive, immediate, and flexible method with a new interplay between students and lecturers. Case studies allow the lecturer to design meaningful activities that generate knowledge in the students and motivate them to look for information, discuss, and be autonomous. This work presents the development of a graphical interface for the solution of different case studies for the acquisition of capacities and abilities through autonomous learning in courses related to transport processes in Environmental Hydraulics. The interactive tool helps to develop and improve abilities in courses related to mixing and transport in surface waters. Thus, students clarify theoretical concepts and visualize processes that have negative effects on the environment and can therefore only be reproduced in the laboratory or in the field under very controlled conditions, commonly with tracers instead of the real substances. The tool can be used for different case studies in terms of the processes involved, governing variables, initial conditions, etc. (e.g., an accidental spill of a conservative pollutant from a factory into a river stretch that constitutes a source of drinking water for a town downstream) and can be used as a virtual laboratory for the analysis of the influence of the different variables and parameters of the process. Autonomous learning is thus fostered, stimulating the development of personal abilities and the analysis and synthesis of information related to the case study.

  17. A laboratory investigation of interactions between denitrifying anaerobic methane oxidation (DAMO) and anammox processes in anoxic environments

    PubMed Central

    Hu, Shihu; Zeng, Raymond J.; Haroon, Mohamed F.; Keller, Jurg; Lant, Paul A.; Tyson, Gene W.; Yuan, Zhiguo

    2015-01-01

    This study investigates interactions between the recently identified denitrifying anaerobic methane oxidation (DAMO) and anaerobic ammonium oxidation (anammox) processes in controlled anoxic laboratory reactors. Two reactors were seeded with the same inocula containing the DAMO organisms Candidatus Methanoperedens nitroreducens and Candidatus Methylomirabilis oxyfera, and the anammox organism Candidatus Kuenenia stuttgartiensis. Both were fed with ammonium and methane, but one was also fed with nitrate and the other with nitrite, providing anoxic environments with different electron acceptors. After steady state was reached over several months, the DAMO process became solely/primarily responsible for nitrate reduction while the anammox process became solely responsible for nitrite reduction in both reactors. 16S rRNA gene amplicon sequencing showed that the nitrate-driven DAMO organism M. nitroreducens dominated both the nitrate-fed (~70%) and the nitrite-fed (~26%) reactors, while the nitrite-driven DAMO organism M. oxyfera disappeared from both communities. The elimination of M. oxyfera from both reactors was likely the result of this organism being outcompeted by anammox bacteria for nitrite. K. stuttgartiensis was detected at relatively low levels (1–3%) in both reactors. PMID:25732131

  18. Vacuum Plasma Spray Forming of Tungsten Lorentz Force Accelerator Components

    NASA Technical Reports Server (NTRS)

    Zimmerman, Frank R.

    2004-01-01

    The Vacuum Plasma Spray (VPS) Laboratory at NASA's Marshall Space Flight Center, working with the Jet Propulsion Laboratory, has developed and demonstrated a fabrication technique using the VPS process to form anode and cathode sections for a Lorentz force accelerator made from tungsten. Lorentz force accelerators are an attractive form of electric propulsion that provides continuous, high-efficiency propulsion at useful power levels for such applications as orbit transfers or deep space missions. The VPS process is used to deposit refractory metals such as tungsten onto a graphite mandrel of the desired shape. Because tungsten is reactive at high temperatures, it is thermally sprayed in an inert environment where the plasma gun melts and deposits the molten metal powder onto a mandrel. A three-axis robot inside the chamber controls the motion of the plasma spray torch. A graphite mandrel acts as a male mold, forming the required contour and dimensions for the inside surface of the anode or cathode of the accelerator. This paper describes the processing techniques, design considerations, and process development associated with the VPS forming of Lorentz force accelerator components.

  19. ANALYTICAL PLANS SUPPORTING THE SWPF GAP ANALYSIS BEING CONDUCTED WITH ENERGYSOLUTIONS AND THE VITREOUS STATE LABORATORY AT THE CUA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, T.; Peeler, D.

    2014-10-28

    EnergySolutions (ES) and its partner, the Vitreous State Laboratory (VSL) of The Catholic University of America (CUA), are to provide engineering and technical services support to Savannah River Remediation, LLC (SRR) for ongoing operation of the Defense Waste Processing Facility (DWPF) flowsheet as well as for modifications to improve overall plant performance. SRR has requested that the glass formulation team of Savannah River National Laboratory (SRNL) and ES-VSL develop a technical basis that validates the current Product Composition Control System models for use during the processing of the coupled flowsheet or that leads to the refinements of or modifications to the models that are needed so that they may be used during the processing of the coupled flowsheet. SRNL has developed a matrix of test glasses that are to be batched and fabricated by ES-VSL as part of this effort. This document provides two analytical plans for use by ES-VSL: one plan is to guide the measurement of the chemical composition of the study glasses while the second is to guide the measurement of the durability of the study glasses based upon the results of testing by ASTM's Product Consistency Test (PCT) Method A.

  20. JSC Materials Laboratory Reproduction and Failure Analysis of Cracked Orbiter Reaction Control System Niobium Thruster Injectors

    NASA Technical Reports Server (NTRS)

    Castner, Willard L.; Jacobs, Jeremy B.

    2006-01-01

    In April 2004 a Space Shuttle Orbiter Reaction Control System (RCS) thruster was found to be cracked while undergoing a nozzle (niobium/C103 alloy) retrofit. Because a failure resulting from an in-flight RCS thruster burn-through (initiated from a crack) could be catastrophic, an official Space Shuttle Program flight constraint was issued until flight safety could be adequately demonstrated. This paper describes the laboratory test program that was undertaken to reproduce the cracking in order to fully understand and bound the driving environments. The associated rationale developed to justify continued safe flight of the Orbiter RCS system is also described. The laboratory testing successfully reproduced the niobium cracking and established specific bounding conditions necessary to cause cracking in the C103 thruster injectors. All of the following conditions are necessary in combination: 1) a mechanically disturbed/cold-worked free surface, 2) an externally applied sustained tensile stress near the yield strength, 3) the presence of fluorine-containing fluids on exposed tensile/cold-worked free surfaces, and 4) sustained exposure to temperatures greater than 400 °F. As a result of this work, it was concluded that fluorine-containing materials (e.g., HF acid, Krytox, Brayco, etc.) should be carefully controlled or altogether eliminated during processing of niobium and its alloys.

  1. EDITORIAL: Interrelationship between plasma phenomena in the laboratory and in space

    NASA Astrophysics Data System (ADS)

    Koepke, Mark

    2008-07-01

    The premise of investigating basic plasma phenomena relevant to space is that an alliance exists between both basic plasma physicists, using theory, computer modelling and laboratory experiments, and space science experimenters, using different instruments, either flown on different spacecraft in various orbits or stationed on the ground. The intent of this special issue on interrelated phenomena in laboratory and space plasmas is to promote the interpretation of scientific results in a broader context by sharing data, methods, knowledge, perspectives, and reasoning within this alliance. The desired outcomes are practical theories, predictive models, and credible interpretations based on the findings and expertise available. Laboratory-experiment papers that explicitly address a specific space mission or a specific manifestation of a space-plasma phenomenon, space-observation papers that explicitly address a specific laboratory experiment or a specific laboratory result, and theory or modelling papers that explicitly address a connection between both laboratory and space investigations were encouraged. Attention was given to the utility of the references for readers who seek further background, examples, and details. With the advent of instrumented spacecraft, the observation of waves (fluctuations), wind (flows), and weather (dynamics) in space plasmas was approached within the framework provided by theory with intuition provided by the laboratory experiments. Ideas on parallel electric field, magnetic topology, inhomogeneity, and anisotropy have been refined substantially by laboratory experiments. Satellite and rocket observations, theory and simulations, and laboratory experiments have contributed to the revelation of a complex set of processes affecting the accelerations of electrons and ions in the geospace plasma. The processes range from meso-scale of several thousands of kilometers to micro-scale of a few meters to kilometers. Papers included in this special issue serve to synthesise our current understanding of processes related to the coupling and feedback at disparate scales. Categories of topics included here are (1) ionospheric physics and (2) Alfvén-wave physics, both of which are related to the particle acceleration responsible for auroral displays, (3) whistler-mode triggering mechanism, which is relevant to radiation-belt dynamics, (4) plasmoid encountering a barrier, which has applications throughout the realm of space and astrophysical plasmas, and (5) laboratory investigations of the entire magnetosphere or the plasma surrounding the magnetosphere. The papers are ordered from processes that take place nearest the Earth to processes that take place at increasing distances from Earth. Many advances in understanding space plasma phenomena have been linked to insight derived from theoretical modeling and/or laboratory experiments. Observations from space-borne instruments are typically interpreted using theoretical models developed to predict the properties and dynamics of space and astrophysical plasmas. The usefulness of customized laboratory experiments for providing confirmation of theory by identifying, isolating, and studying physical phenomena efficiently, quickly, and economically has been demonstrated in the past. The benefits of laboratory experiments to investigating space-plasma physics are their reproducibility, controllability, diagnosability, reconfigurability, and affordability compared to a satellite mission or rocket campaign. 
Certainly, the plasma being investigated in a laboratory device is quite different from that being measured by a spaceborne instrument; nevertheless, laboratory experiments discover unexpected phenomena, benchmark theoretical models, develop physical insight, establish observational signatures, and pioneer diagnostic techniques. Explicit reference to such beneficial laboratory contributions is occasionally left out of the citations in the space-physics literature in favor of theory-paper counterparts and, thus, the scientific support that laboratory results can provide to the development of space-relevant theoretical models is often under-recognized. It is unrealistic to expect the dimensional parameters corresponding to space plasma to be matchable in the laboratory. However, a laboratory experiment is considered well designed if the subset of parameters relevant to a specific process shares the same phenomenological regime as the subset of analogous space parameters, even if less important parameters are mismatched. Regime boundaries are assigned by normalizing a dimensional parameter to an appropriate reference or scale value to make it dimensionless and noting the values at which transitions occur in the physical behavior or approximations. An example of matching regimes for cold-plasma waves is finding a 45° diagonal line on the log--log CMA diagram along which lie both a laboratory-observed wave and a space-observed wave. In such a circumstance, a space plasma and a lab plasma will support the same kind of modes if the dimensionless parameters are scaled properly (Bellan 2006 Fundamentals of Plasma Physics (Cambridge: Cambridge University Press) p 227). The plasma source, configuration geometry, and boundary conditions associated with a specific laboratory experiment are characteristic elements that affect the plasma and plasma processes that are being investigated. Space plasma is not exempt from an analogous set of constraining factors that likewise influence the phenomena that occur. Typically, each morphologically distinct region of space has associated with it plasma that is unique by virtue of the various mechanisms responsible for the plasma's presence there, as if the plasma were produced by a unique source. Boundary effects that typically constrain the possible parameter values to lie within one or more restricted ranges are inescapable in laboratory plasma. The goal of a laboratory experiment is to examine the relevant physics within these ranges and extrapolate the results to space conditions that may or may not be subject to any restrictions on the values of the plasma parameters. The interrelationship between laboratory and space plasma experiments has been cultivated at a low level and the potential scientific benefit in this area has yet to be realized. The few but excellent examples of joint papers, joint experiments, and directly relevant cross-disciplinary citations are a direct result of the emphasis placed on this interrelationship two decades ago. Building on this special issue Plasma Physics and Controlled Fusion plans to create a dedicated webpage to highlight papers directly relevant to this field published either in the recent past or in the future. It is hoped that this resource will appeal to the readership in the laboratory-experiment and space-plasma communities and improve the cross-fertilization between them.

  2. [Study of quality of a branch laboratory--an opinion of a laboratory manager].

    PubMed

    Yazawa, Naoyuki

    2006-11-01

    At the stage of establishing a branch laboratory, quality evaluation is extremely difficult; even the results of a control survey by the headquarters of the branch laboratory are unhelpful. For a clinical laboratory, the most important function is to provide reliable data at all times and to maintain the confidence of clinical doctors through informed responses. We mostly refer to control surveys and daily quality control data to evaluate a clinical laboratory, but we rarely check its fundamental abilities, such as planning events, preserving statistical data about the standard range, and using the right methods for quality control, among others. This is generally disregarded, and it is taken for granted that these will be correct from the start. From my six years of experience working with X's branch laboratory, I realized that there might be some relation between the quality of a branch laboratory and the fundamental abilities of the company itself. I would never argue that all branch laboratories are ineffective, but they should be conscious of these fundamental activities. The referring laboratory, not the referral laboratory, should be responsible for ensuring that the referral laboratory's examination results and findings are correct.

  3. Evaluation of clinically significant adverse events in patients discharged from a tertiary-care emergency department in Taiwan

    PubMed Central

    Wang, Lee-Min; How, Chorng-Kuang; Yang, Ming-Chin; Su, Syi

    2013-01-01

    Objective To investigate the reasons for the occurrence of clinically significant adverse events (CSAEs) in emergency department-discharged patients through emergency physicians' (EPs) subjective reasoning and senior EPs' objective evaluation. Design This was a combined prospective follow-up and retrospective review of cases of consecutive adult non-traumatic patients who presented to a tertiary-care emergency department in Taiwan between 1 September 2005 and 31 July 2006. Data were extracted from the on-duty EPs' subjective reasoning for discharging patients with CSAEs (study group) and without CSAEs (control group), and from senior EPs' objective evaluation of CSAEs, which used clinical evidence (history taking, physical examinations, laboratory/radiological examinations, and observation) as the guide to identify inadequacies in the basic management process. Subjective reasons for discharge, improvement of symptoms, and the certainty of the safety of the discharge were compared between the two groups using χ2 statistics or the t test. Results Of the 20,512 discharged cases, there were 1370 return visits (6.7%, 95% CI 6.3% to 7%) and 165 CSAEs due to physicians' factors (0.82%, 95% CI 0.75% to 0.95%). In comparisons between the study group and the control group, only some components of discharge reasoning showed a significant difference (p<0.001). Inadequacies in the basic management process were the main cause of CSAEs (164/165). Conclusion The authors recommend that EP follow-up of the basic management processes (including history recording, physical examination, laboratory and radiological examinations, clinical symptoms/signs, and treatment) using clinical evidence as a guideline be made mandatory. PMID:22433586

  4. Cost Effectiveness of Adopted Quality Requirements in Hospital Laboratories

    PubMed Central

    HAMZA, Alneil; AHMED-ABAKUR, Eltayib; ABUGROUN, Elsir; BAKHIT, Siham; HOLI, Mohamed

    2013-01-01

    Background The present study was designed as a quasi-experiment to assess adoption of the essential clauses of particular clinical laboratory quality management requirements based on the International Organization for Standardization standard ISO 15189 in hospital laboratories and to evaluate the cost effectiveness of compliance with ISO 15189. Methods: The quality management intervention based on ISO 15189 was conducted in three phases: a pre-intervention phase, an intervention phase, and a post-intervention phase. Results: In the pre-intervention phase, compliance with ISO 15189 was 49% for the study group vs. 47% for the control group (P value 0.48), while the post-intervention results showed 54% vs. 79% compliance with ISO 15189 for the study group and control group, respectively, a statistically significant difference (P value 0.00), with an effect size (Cohen's d) of 0.00 in the pre-intervention phase and 0.99 in the post-intervention phase. The annual average cost per test for the study group and control group was 1.80 ± 0.25 vs. 1.97 ± 0.39, respectively (P value 0.39), whereas the post-intervention results showed that the annual average total cost per test for the study group and control group was 1.57 ± 0.23 vs. 2.08 ± 0.38, respectively (P value 0.019), with a cost-effectiveness ratio of 0.88 in the pre-intervention phase and 0.52 in the post-intervention phase. Conclusion: The planned adoption of quality management system (QMS) requirements in clinical laboratories had a great effect in increasing the percentage of compliance with quality management system requirements, raising the average total cost effectiveness, and improving the analytical process capability of the testing procedure. PMID:23967422
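
    The effect sizes and cost-effectiveness ratios reported above can be reproduced in form, though not with the study's raw data, from group means and standard deviations. The sketch below shows a generic Cohen's d calculation and a simple cost-per-unit-of-effect ratio; all numbers, and the exact definition of the study's ratio, are assumptions for illustration.

    ```python
    # Sketch of the two summary statistics used above: Cohen's d from group means
    # and SDs, and a simple cost-per-unit-of-effect ratio. All numbers are
    # hypothetical placeholders, not data from the study.
    import math

    def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
        """Effect size using the pooled standard deviation."""
        pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
        return (mean1 - mean2) / pooled_sd

    def cost_effectiveness_ratio(avg_cost_per_test, effect):
        """Average cost divided by the achieved effect (e.g., compliance gain)."""
        return avg_cost_per_test / effect

    if __name__ == "__main__":
        d = cohens_d(mean1=79.0, sd1=8.0, n1=10, mean2=54.0, sd2=9.0, n2=10)
        print(f"Cohen's d ~ {d:.2f}")
        print(f"CER ~ {cost_effectiveness_ratio(1.57, d):.2f} cost units per unit effect")
    ```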

  5. Intelligent control system for continuous technological process of alkylation

    NASA Astrophysics Data System (ADS)

    Gebel, E. S.; Hakimov, R. A.

    2018-01-01

    The relevance of intelligent control for complex dynamic objects and processes is shown in this paper. A model of a virtual analyzer based on a neural network is proposed. Comparative analysis of mathematical models implemented in MATLAB software showed that the most effective model, from the point of view of reproducibility of the result, is the one with seven neurons in the hidden layer, trained using the scaled conjugate gradient method. Comparison of the data from the laboratory analysis with the theoretical model showed that the root-mean-square error does not exceed 3.5 and that the calculated correlation coefficient corresponds to a "strong" relationship between the values.
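
    A soft sensor of the kind described, a single hidden layer of seven neurons predicting a laboratory-measured quality variable from routine process measurements, can be sketched as follows. scikit-learn does not offer a scaled conjugate gradient solver, so L-BFGS is used here as a stand-in, and the process data are synthetic placeholders rather than alkylation plant data.

    ```python
    # Minimal sketch of a neural-network "virtual analyzer" (soft sensor) with one
    # hidden layer of seven neurons, mirroring the structure described above.
    # scikit-learn has no scaled-conjugate-gradient solver, so L-BFGS is used as a
    # stand-in; the process data are synthetic placeholders.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(500, 4))   # e.g., temperatures, flows, pressure (synthetic)
    y = 2.0 * X[:, 0] - 1.5 * X[:, 1] ** 2 + 0.5 * X[:, 2] * X[:, 3] + rng.normal(0, 0.05, 500)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(7,), solver="lbfgs", max_iter=5000, random_state=0)
    model.fit(X_train, y_train)

    pred = model.predict(X_test)
    rmse = mean_squared_error(y_test, pred) ** 0.5
    corr = np.corrcoef(y_test, pred)[0, 1]
    print(f"RMSE = {rmse:.3f}, correlation = {corr:.3f}")
    ```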

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plemons, R.E.; Hopwood, W.H. Jr.; Hamilton, J.H.

    For a number of years the Oak Ridge Y-12 Plant Laboratory has been analyzing coal predominately for the utilities department of the Y-12 Plant. All laboratory procedures, except a Leco sulfur method which used the Leco Instruction Manual as a reference, were written based on the ASTM coal analyses. Sulfur is analyzed at the present time by two methods, gravimetric and Leco. The laboratory has two major endeavors for monitoring the quality of its coal analyses. (1) A control program by the Plant Statistical Quality Control Department. Quality Control submits one sample for every nine samples submitted by the utilities departments and the laboratory analyzes a control sample along with the utilities samples. (2) An exchange program with the DOE Coal Analysis Laboratory in Bruceton, Pennsylvania. The Y-12 Laboratory submits to the DOE Coal Laboratory, on even numbered months, a sample that Y-12 has analyzed. The DOE Coal Laboratory submits, on odd numbered months, one of their analyzed samples to the Y-12 Plant Laboratory to be analyzed. The results of these control and exchange programs are monitored not only by laboratory personnel, but also by Statistical Quality Control personnel who provide statistical evaluations. After analysis and reporting of results, all utilities samples are retained by the laboratory until the coal contracts have been settled. The utilities departments have responsibility for the initiation and preparation of the coal samples. The samples normally received by the laboratory have been ground to 4-mesh, reduced to 0.5-gallon quantities, and sealed in air-tight containers. Sample identification numbers and a Request for Analysis are generated by the utilities departments.

  7. Adaptive GSA-based optimal tuning of PI controlled servo systems with reduced process parametric sensitivity, robust stability and controller robustness.

    PubMed

    Precup, Radu-Emil; David, Radu-Codrut; Petriu, Emil M; Radac, Mircea-Bogdan; Preitl, Stefan

    2014-11-01

    This paper suggests a new generation of optimal PI controllers for a class of servo systems characterized by saturation and dead zone static nonlinearities and second-order models with an integral component. The objective functions are expressed as the integral of time multiplied by absolute error plus the weighted sum of the integrals of output sensitivity functions of the state sensitivity models with respect to two process parametric variations. The PI controller tuning conditions applied to a simplified linear process model involve a single design parameter specific to the extended symmetrical optimum (ESO) method which offers the desired tradeoff to several control system performance indices. An original back-calculation and tracking anti-windup scheme is proposed in order to prevent the integrator wind-up and to compensate for the dead zone nonlinearity of the process. The minimization of the objective functions is carried out in the framework of optimization problems with inequality constraints which guarantee the robust stability with respect to the process parametric variations and the controller robustness. An adaptive gravitational search algorithm (GSA) solves the optimization problems focused on the optimal tuning of the design parameter specific to the ESO method and of the anti-windup tracking gain. A tuning method for PI controllers is proposed as an efficient approach to the design of resilient control systems. The tuning method and the PI controllers are experimentally validated by the adaptive GSA-based tuning of PI controllers for the angular position control of a laboratory servo system.
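
    The back-calculation and tracking anti-windup scheme mentioned above can be illustrated with a generic discrete-time PI loop: the difference between the saturated and unsaturated control signals is fed back into the integrator through a tracking gain. The gains, saturation limits, and first-order plant below are hypothetical, not the paper's ESO-tuned servo design.

    ```python
    # Generic discrete-time PI controller with back-calculation (tracking) anti-windup,
    # illustrating the scheme mentioned above. Gains, the tracking gain kt, the
    # saturation limits, and the toy first-order plant are hypothetical.
    def simulate(kp=2.0, ki=1.5, kt=1.0, u_min=-1.0, u_max=1.0, dt=0.01, t_end=5.0):
        y, integ = 0.0, 0.0                 # plant output and integrator state
        setpoint = 1.0
        for _ in range(int(t_end / dt)):
            e = setpoint - y
            u_unsat = kp * e + integ        # PI law before saturation
            u = min(max(u_unsat, u_min), u_max)
            # Back-calculation: feed the saturation error back into the integrator
            integ += dt * (ki * e + kt * (u - u_unsat))
            # Toy first-order plant: tau * dy/dt = -y + u, with tau = 0.5 s
            y += dt * (-y + u) / 0.5
        return y

    if __name__ == "__main__":
        print(f"output after 5 s: {simulate():.3f}")
    ```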

  8. Avoiding biohazards in medical, veterinary and research laboratories.

    PubMed

    Grizzle, W E; Fredenburgh, J

    2001-07-01

    Personnel in medical, veterinary or research laboratories may be exposed to a wide variety of pathogens that range from deadly to debilitating. For some of these pathogens, no treatment is available, and in other cases the treatment does not fully control the disease. It is important that personnel in laboratories that process human or microbiological specimens follow universal precautions when handling tissues, cells, or microbiological specimens owing to the increasing numbers of individuals infected with hepatitis C and HIV in the US and the possibility that an individual may be asymptomatic when a specimen is obtained. Similar precautions must be followed in laboratories that use animal tissues owing to the possibility of exposure to agents that are pathogenic in humans. Personnel with conditions associated with immunosuppression should evaluate carefully whether or not specific laboratory environments put them at increased risk of disease. We offer here some general approaches to identifying biohazards and to minimizing the potential risk of exposure. The issues discussed can be used to develop a general safety program as required by regulatory or accrediting agencies, including the Occupational Safety and Health Administration.

  9. Manufacturing information system

    NASA Astrophysics Data System (ADS)

    Allen, D. K.; Smith, P. R.; Smart, M. J.

    1983-12-01

    The size and cost of manufacturing equipment have made it extremely difficult to perform realistic modeling and simulation of the manufacturing process in university research laboratories. Likewise, the size and cost factors, coupled with the many uncontrolled variables of the production situation, have made it difficult to perform adequate manufacturing research in the industrial setting. Only the largest companies can afford manufacturing research laboratories; research results are often held proprietary and seldom find their way into the university classroom to aid in the education and training of new manufacturing engineers. The purpose of this research is to continue the development of miniature prototype equipment suitable for use in an integrated CAD/CAM laboratory. The equipment being developed is capable of actually performing production operations (e.g., drilling, milling, turning, punching) on metallic and non-metallic workpieces. The integrated CAD/CAM Mini-Lab integrates high-resolution computer graphics, parametric design, parametric N/C parts programming, CNC machine control, and automated storage and retrieval with robotic materials handling. The availability of miniature CAD/CAM laboratory equipment will provide the basis for intensive laboratory research on manufacturing information systems.

  10. Integrated intelligent systems in advanced reactor control rooms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckmeyer, R.R.

    1989-01-01

    An intelligent reactor control room information system is designed to be an integral part of an advanced control room and will assist the reactor operator's decision-making process by continuously monitoring the current plant state and providing recommended operator actions to improve that state. This intelligent system is an integral part of, as well as an extension to, the plant protection and control systems. This paper describes the interaction of several functional components (intelligent information data display, technical specifications monitoring, and dynamic procedures) of the overall system and the artificial intelligence laboratory environment assembled for testing the prototype. 10 refs., 5 figs.

  11. Off-Line Quality Control In Integrated Circuit Fabrication Using Experimental Design

    NASA Astrophysics Data System (ADS)

    Phadke, M. S.; Kackar, R. N.; Speeney, D. V.; Grieco, M. J.

    1987-04-01

    Off-line quality control is a systematic method of optimizing production processes and product designs. It is widely used in Japan to produce high quality products at low cost. The method was introduced to us by Professor Genichi Taguchi who is a Deming-award winner and a former Director of the Japanese Academy of Quality. In this paper we will i) describe the off-line quality control method, and ii) document our efforts to optimize the process for forming contact windows in 3.5 μm CMOS circuits fabricated in the Murray Hill Integrated Circuit Design Capability Laboratory. In the fabrication of integrated circuits it is critically important to produce contact windows of size very near the target dimension. Windows which are too small or too large lead to loss of yield. The off-line quality control method has improved both the process quality and productivity. The variance of the window size has been reduced by a factor of four. Also, processing time for window photolithography has been substantially reduced. The key steps of off-line quality control are: i) Identify important manipulatable process factors and their potential working levels. ii) Perform fractional factorial experiments on the process using orthogonal array designs. iii) Analyze the resulting data to determine the optimum operating levels of the factors. Both the process mean and the process variance are considered in this analysis. iv) Conduct an additional experiment to verify that the new factor levels indeed give an improvement.
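
    The analysis step in such an orthogonal-array experiment typically computes, for each run, a signal-to-noise ratio that combines the process mean and variance, and then averages it by factor level to pick the best settings. The sketch below assumes a small L4(2^3) array, a nominal-is-best S/N ratio, and made-up window-size replicates; none of these are the actual experiment's factors or data.

    ```python
    # Sketch of the orthogonal-array analysis step: compute Taguchi's
    # "nominal-is-best" signal-to-noise ratio for each run of a small L4(2^3)
    # orthogonal array, then average S/N by factor level. The array assignment,
    # factor columns, and window-size replicates are hypothetical.
    import math
    from statistics import mean, variance

    # L4 orthogonal array: 4 runs x 3 two-level factors (levels coded 1 and 2).
    L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]
    # Replicate window-size measurements (arbitrary units) for each run.
    runs = [[3.1, 3.0, 3.2], [2.8, 2.9, 2.7], [3.4, 3.5, 3.6], [3.0, 3.1, 2.9]]

    def sn_nominal_is_best(values):
        """S/N = 10*log10(mean^2 / variance); larger means less relative spread."""
        return 10.0 * math.log10(mean(values) ** 2 / variance(values))

    sn = [sn_nominal_is_best(r) for r in runs]
    for factor in range(3):
        for level in (1, 2):
            avg = mean(s for s, row in zip(sn, L4) if row[factor] == level)
            print(f"factor {factor + 1}, level {level}: mean S/N = {avg:.2f} dB")
    ```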

  12. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.
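
    The kind of heuristic prioritization described can be sketched as a small rule base: alarms that are expected consequences of another active alarm are suppressed, and the remainder are ranked by priority. The alarm names and rules below are invented examples, not the AFS knowledge base.

    ```python
    # Generic sketch of knowledge-based alarm filtering: heuristic rules suppress
    # alarms that are expected consequences of another active alarm, and the rest
    # are ranked by priority. The alarm names and rules are invented examples,
    # not the actual AFS knowledge base.
    from dataclasses import dataclass

    @dataclass
    class Alarm:
        name: str
        priority: int          # 1 = highest

    # Rule: if the key alarm is active, the listed alarms are expected consequences
    # and can be filtered out to reduce operator load.
    CONSEQUENCE_RULES = {
        "reactor_trip": {"low_turbine_load", "low_generator_output"},
        "pump_A_trip": {"low_flow_loop_A"},
    }

    def filter_and_prioritize(active):
        names = {a.name for a in active}
        suppressed = set()
        for key, consequences in CONSEQUENCE_RULES.items():
            if key in names:
                suppressed |= consequences & names
        kept = [a for a in active if a.name not in suppressed]
        return sorted(kept, key=lambda a: a.priority)

    if __name__ == "__main__":
        active = [Alarm("low_turbine_load", 3), Alarm("reactor_trip", 1),
                  Alarm("low_flow_loop_A", 2)]
        for a in filter_and_prioritize(active):
            print(a.priority, a.name)
    ```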

  13. Electromagnetic containerless processing requirements and recommended facility concept and capabilities for space lab

    NASA Technical Reports Server (NTRS)

    Frost, R. T.; Bloom, H. L.; Napaluch, L. J.; Stockhoff, E. H.; Wouch, G.

    1974-01-01

    Containerless melting, reaction, and solidification experiments and processes which potentially can lead to new understanding of material science and production of new or improved materials in the weightless space environment are reviewed in terms of planning for spacelab. Most of the experiments and processes discussed are amenable to the employment of electromagnetic position control and electromagnetic induction or electron beam heating and melting. The spectrum of relevant properties of materials, which determine requirements for a space laboratory electromagnetic containerless processing facility are reviewed. Appropriate distributions and associated coil structures are analyzed and compared on the basis of efficiency, for providing the functions of position sensing, control, and induction heating. Several coil systems are found capable of providing these functions. Exchangeable modular coils in appropriate sizes are recommended to achieve the maximum power efficiencies, for a wide range of specimen sizes and resistivities, in order to conserve total facility power.

  14. NREL and Sandia National Laboratories to Sharpen Wind Farm Turbine Controls

    Science.gov Websites

    April 1, 2016. Researchers at NREL and Sandia National Laboratories are collaborating on wind turbine modeling, and the NREL controls team has been evaluating its control theory in simulations.

  15. Economic Education Laboratory: Initiating a Meaningful Economic Learning through Laboratory

    ERIC Educational Resources Information Center

    Noviani, Leny; Soetjipto, Budi Eko; Sabandi, Muhammad

    2015-01-01

    Laboratory is considered as one of the resources in supporting the learning process. The laboratory can be used as facilities to deepen the concepts, learning methods and enriching students' knowledge and skills. Learning process by utilizing the laboratory facilities can help lecturers and students in grasping the concept easily, constructing the…

  16. Gas-phase advanced oxidation for effective, efficient in situ control of pollution.

    PubMed

    Johnson, Matthew S; Nilsson, Elna J K; Svensson, Erik A; Langer, Sarka

    2014-01-01

    In this article, gas-phase advanced oxidation, a new method for pollution control building on the photo-oxidation and particle formation chemistry occurring in the atmosphere, is introduced and characterized. The process uses ozone and UV-C light to produce radicals in situ to oxidize pollution, generating particles that are removed by a filter; ozone is removed using a MnO2 honeycomb catalyst. This combination of in situ processes removes a wide range of pollutants with a comparatively low specific energy input. Two proof-of-concept devices were built to test and optimize the process. The laboratory prototype was built of standard ventilation duct and could treat up to 850 m^3/h. A portable continuous-flow prototype built in an aluminum flight case was able to treat 46 m^3/h. Removal efficiencies of >95% were observed for propane, cyclohexane, benzene, isoprene, aerosol particle mass, and ozone for concentrations in the range of 0.4-6 ppm and exposure times up to 0.5 min. The laboratory prototype generated an OH(•) concentration, derived from the propane reaction, of (2.5 ± 0.3) × 10^10 cm^-3 at a specific energy input of 3 kJ/m^3, and the portable device generated (4.6 ± 0.4) × 10^9 cm^-3 at 10 kJ/m^3. Based on these results, in situ gas-phase advanced oxidation is a viable control strategy for most volatile organic compounds, specifically those with an OH(•) reaction rate constant higher than ca. 5 × 10^-13 cm^3/s. Gas-phase advanced oxidation is able to remove compounds that react with OH and to control ozone and total particulate mass. Secondary pollution, including formaldehyde and ultrafine particles, might be generated, depending on the composition of the primary pollution.
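
    The viability threshold quoted above follows from first-order kinetics: the fraction of a VOC removed is approximately 1 - exp(-k_OH [OH] t) for an exposure time t at OH concentration [OH]. The sketch below evaluates that expression; the exposure time is a hypothetical assumption and the rate constants are typical literature values, so the output is not a reproduction of the article's measured removal efficiencies (which also reflect particle capture and device design).

    ```python
    # First-order estimate of VOC removal by in-situ OH radicals:
    #   fraction removed = 1 - exp(-k_OH * [OH] * t)
    # The exposure time and the example rate constants are illustrative assumptions.
    import math

    def fraction_removed(k_oh_cm3_per_s, oh_conc_per_cm3, exposure_s):
        return 1.0 - math.exp(-k_oh_cm3_per_s * oh_conc_per_cm3 * exposure_s)

    if __name__ == "__main__":
        oh = 2.5e10            # OH concentration, cm^-3 (order of the reported value)
        t = 30.0               # hypothetical exposure time, s
        examples = {
            "slow VOC (k ~ 5e-13 cm^3/s)": 5e-13,
            "propane (k ~ 1.1e-12 cm^3/s)": 1.1e-12,
            "isoprene (k ~ 1e-10 cm^3/s)": 1.0e-10,
        }
        for name, k in examples.items():
            print(f"{name}: {100 * fraction_removed(k, oh, t):.1f}% removed")
    ```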

  17. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  18. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  19. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  20. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...
