ERIC Educational Resources Information Center
Miller, John
1994-01-01
Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)
2007-02-01
The Food and Drug Administration (FDA) is classifying a cord blood processing system and storage container into class II (special controls). The special control that will apply to this device is the guidance document entitled "Class II Special Controls Guidance Document: Cord Blood Processing System and Storage Container." FDA is classifying this device into class II (special controls) in order to provide a reasonable assurance of safety and effectiveness of this device. Elsewhere in this issue of the Federal Register, FDA is announcing the availability of the guidance document that will serve as the special control for this device.
Operational Control Procedures for the Activated Sludge Process: Appendix.
ERIC Educational Resources Information Center
West, Alfred W.
This document is the appendix for a series of documents developed by the National Training and Operational Technology Center describing operational control procedures for the activated sludge process used in wastewater treatment. Categories discussed include: control test data, trend charts, moving averages, semi-logarithmic plots, probability…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-10
... Boilers and Process Heaters at Petroleum Refineries; Correction. In rule document 2010-13377 beginning on ... limitations for any industrial boiler or process ... Control Requirements ... [Insert page number where the document ...]
Tank Monitoring and Document Control System (TMACS) As-Built Software Design Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
GLASSCOCK, J.A.
This document describes the software design for the Tank Monitor and Control System (TMACS). This document captures the existing as-built design of TMACS as of November 1999. It will be used as a reference document for the system maintainers who will be maintaining and modifying the TMACS functions as necessary. The heart of the TMACS system is the "point-processing" functionality, where a sample value is received from the field sensors and the value is analyzed, logged, or alarmed as required. This Software Design Document focuses on the point-processing functions.
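As a rough illustration of the point-processing idea described in the abstract (a sample arrives from a field sensor and is analyzed, logged, or alarmed), here is a minimal Python sketch. The point names, alarm limits, and logging policy are invented for illustration and are not taken from the TMACS design.

```python
# Minimal sketch of a "point-processing" step: a sampled value arrives from a
# field sensor and is analyzed, logged, and alarmed if out of limits.
# The point name and limits below are hypothetical, not from TMACS.
from dataclasses import dataclass

@dataclass
class Point:
    name: str
    low_alarm: float
    high_alarm: float

def process_sample(point: Point, value: float, log, alarm) -> None:
    """Analyze one sample, log it, and raise an alarm if it is out of limits."""
    log(f"{point.name}={value:.2f}")                        # every sample is logged
    if value < point.low_alarm or value > point.high_alarm:
        alarm(f"{point.name} out of limits: {value:.2f}")   # alarm only when needed

# Example usage with print() standing in for the real logging/alarm services.
tank_temp = Point("TK-101-TEMP", low_alarm=10.0, high_alarm=60.0)
process_sample(tank_temp, 72.3, log=print, alarm=print)
```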
ERIC Educational Resources Information Center
West, Alfred W.
This is the first in a series of documents developed by the National Training and Operational Technology Center describing operational control procedures for the activated sludge process used in wastewater treatment. Part I of this document deals with physical observations which should be performed during each routine control test. Part II…
An overview of selected information storage and retrieval issues in computerized document processing
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Ihebuzor, Valentine U.
1984-01-01
The rapid development of computerized information storage and retrieval techniques has introduced the possibility of extending the word processing concept to document processing. A major advantage of computerized document processing is the relief of the tedious task of manual editing and composition usually encountered by traditional publishers through the immense speed and storage capacity of computers. Furthermore, computerized document processing provides an author with centralized control, the lack of which is a handicap of the traditional publishing operation. A survey of some computerized document processing techniques is presented with emphasis on related information storage and retrieval issues. String matching algorithms are considered central to document information storage and retrieval and are also discussed.
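Since the survey above treats string matching as central to document storage and retrieval, a brief sketch of the baseline algorithm may help fix the idea. This is only the naive exact-match search; the survey's specific algorithms (and practical systems, which use methods such as KMP, Boyer-Moore, or inverted indexes) are not reproduced here.

```python
# Illustrative only: the naive substring search that more sophisticated
# string-matching algorithms for document retrieval are measured against.
def naive_find(pattern: str, text: str) -> list[int]:
    """Return every offset in `text` where `pattern` occurs."""
    hits = []
    m, n = len(pattern), len(text)
    for i in range(n - m + 1):
        if text[i:i + m] == pattern:
            hits.append(i)
    return hits

print(naive_find("document", "computerized document processing of document collections"))
```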
ERIC Educational Resources Information Center
West, Alfred W.
This is the third in a series of documents developed by the National Training and Operational Technology Center describing operational control procedures for the activated sludge process used in wastewater treatment. This document deals with the calculation procedures associated with a step-feed process. Illustrations and examples are included to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
ROOT, R.W.
1999-05-18
This guide provides the Tank Waste Remediation System Privatization Infrastructure Program management with processes and requirements to appropriately control information and documents in accordance with the Tank Waste Remediation System Configuration Management Plan (Vann 1998b). This includes documents and information created by the program, as well as non-program generated materials submitted to the project. It provides appropriate approval/control, distribution and filing systems.
21 CFR 820.70 - Production and process controls.
Code of Federal Regulations, 2011 CFR
2011-04-01
... process control procedures that describe any process controls necessary to ensure conformance to specifications. Where process controls are needed they shall include: (1) Documented instructions, standard operating procedures (SOP's), and methods that define and control the manner of production; (2) Monitoring...
21 CFR 820.70 - Production and process controls.
Code of Federal Regulations, 2013 CFR
2013-04-01
... process control procedures that describe any process controls necessary to ensure conformance to specifications. Where process controls are needed they shall include: (1) Documented instructions, standard operating procedures (SOP's), and methods that define and control the manner of production; (2) Monitoring...
Operational Control Procedures for the Activated Sludge Process, Part III-A: Calculation Procedures.
ERIC Educational Resources Information Center
West, Alfred W.
This is the second in a series of documents developed by the National Training and Operational Technology Center describing operational control procedures for the activated sludge process used in wastewater treatment. This document deals exclusively with the calculation procedures, including simplified mixing formulas, aeration tank…
Phase II Report: Design Study for Automated Document Location and Control System.
ERIC Educational Resources Information Center
Booz, Allen Applied Research, Inc., Bethesda, MD.
The scope of Phase II is the design of a system for document control within the National Agricultural Library (NAL) that will facilitate the processing of the documents selected, ordered, or received; that will avoid backlogs; and that will provide rapid document location reports. The results are set forth as follows: Chapter I, Introduction,…
1 CFR 21.35 - OMB control numbers.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 1 General Provisions 1 2012-01-01 2012-01-01 false OMB control numbers. 21.35 Section 21.35... PROCESSING OF DOCUMENTS PREPARATION OF DOCUMENTS SUBJECT TO CODIFICATION General Omb Control Numbers § 21.35 OMB control numbers. To display OMB control numbers in agency regulations, those numbers shall be...
1 CFR 21.35 - OMB control numbers.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 1 General Provisions 1 2010-01-01 2010-01-01 false OMB control numbers. 21.35 Section 21.35... PROCESSING OF DOCUMENTS PREPARATION OF DOCUMENTS SUBJECT TO CODIFICATION General Omb Control Numbers § 21.35 OMB control numbers. To display OMB control numbers in agency regulations, those numbers shall be...
Lee, Eunjoo; Noh, Hyun Kyung
2016-01-01
To examine the effects of a web-based nursing process documentation system on the stress and anxiety of nursing students during their clinical practice. A quasi-experimental design was employed. The experimental group (n = 110) used a web-based nursing process documentation program for their case reports as part of assignments for a clinical practicum, whereas the control group (n = 106) used traditional paper-based case reports. Stress and anxiety levels were measured with a numeric rating scale before, 2 weeks after, and 4 weeks after using the web-based nursing process documentation program during a clinical practicum. The data were analyzed using descriptive statistics, t tests, chi-square tests, and repeated-measures analyses of variance. Nursing students who used the web-based nursing process documentation program showed significantly lower levels of stress and anxiety than the control group. A web-based nursing process documentation program could be used to reduce the stress and anxiety of nursing students during clinical practicum, which ultimately would benefit nursing students by increasing satisfaction with and effectiveness of clinical practicum. © 2015 NANDA International, Inc.
7 CFR 274.5 - Record retention and forms security.
Code of Federal Regulations, 2014 CFR
2014-01-01
... reconciliation process. (c) Accountable documents. (1) EBT cards shall be considered accountable documents. The... validation of inventory controls and records by parties not otherwise involved in maintaining control records...
7 CFR 274.5 - Record retention and forms security.
Code of Federal Regulations, 2013 CFR
2013-01-01
... reconciliation process. (c) Accountable documents. (1) EBT cards shall be considered accountable documents. The... validation of inventory controls and records by parties not otherwise involved in maintaining control records...
7 CFR 274.5 - Record retention and forms security.
Code of Federal Regulations, 2012 CFR
2012-01-01
... reconciliation process. (c) Accountable documents. (1) EBT cards shall be considered accountable documents. The... validation of inventory controls and records by parties not otherwise involved in maintaining control records...
NASA Technical Reports Server (NTRS)
1979-01-01
This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.
Red Plague Control Plan (RPCP)
NASA Technical Reports Server (NTRS)
Cooke, Robert W.
2010-01-01
SCOPE: Prescribes the minimum requirements for the control of cuprous / cupric oxide corrosion (a.k.a. Red Plague) of silver-coated copper wire, cable, and harness assemblies. PURPOSE: Targeted for applications where exposure to assembly processes, environmental conditions, and contamination may promote the development of cuprous / cupric oxide corrosion (a.k.a. Red Plague) in silver-coated copper wire, cable, and harness assemblies. Does not exclude any alternate or contractor-proprietary documents or processes that meet or exceed the baseline of requirements established by this document. Use of alternate or contractor-proprietary documents or processes shall require review and prior approval of the procuring NASA activity.
Means of storage and automated monitoring of versions of text technical documentation
NASA Astrophysics Data System (ADS)
Leonovets, S. A.; Shukalov, A. V.; Zharinov, I. O.
2018-03-01
The paper considers the automation of the preparation, storage, and version-control monitoring of textual design and program documentation by means of specialized software. Automated preparation of documentation is based on processing the engineering data contained in the specifications and technical documentation. Data handling assumes strictly structured electronic documents, prepared in widespread formats according to templates based on industry standards, from which the program or design text document is generated automatically. The subsequent life cycle of the document and of the engineering data it contains is then controlled, with archival data storage carried out at each life-cycle stage. Results on the processing speed of different widespread document formats under automated monitoring and storage are given. The newly developed software and the workbenches available to the developer of instrumentation equipment are described.
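One simple way to picture the automated version monitoring and archival storage described above is change detection by content hash: a revision is archived only if its content has not been stored before. The sketch below is an assumption-laden illustration (file naming, archive layout, and hashing scheme are all invented), not the mechanism from the paper.

```python
# Hypothetical sketch: detect and archive new revisions of a generated text
# document by content hash. Archive layout and naming are illustrative only.
import hashlib
from pathlib import Path

def content_hash(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def archive_if_changed(doc: Path, archive_dir: Path) -> bool:
    """Archive `doc` under its hash if this revision has not been stored yet."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    digest = content_hash(doc)
    target = archive_dir / f"{doc.stem}.{digest[:12]}{doc.suffix}"
    if target.exists():
        return False            # unchanged revision, nothing to do
    target.write_bytes(doc.read_bytes())
    return True                 # new revision archived
```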
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the fifth of five volumes on Information System Life-Cycle and Documentation Standards. This volume provides a well-organized, easily used standard for management control and status reports used in monitoring and controlling the management, development, and assurance of information systems and software, hardware, and operational procedures components, and related processes.
Scheuner, Maren T; Peredo, Jane; Tangney, Kelly; Schoeff, Diane; Sale, Taylor; Lubick-Goldzweig, Caroline; Hamilton, Alison; Hilborne, Lee; Lee, Martin; Mittman, Brian; Yano, Elizabeth M; Lubin, Ira M
2017-01-01
To determine whether electronic health record (EHR) tools improve documentation of pre- and postanalytic care processes for genetic tests ordered by nongeneticists. We conducted a nonrandomized, controlled, pre-/postintervention study of EHR point-of-care tools (informational messages and template report) for three genetic tests. Chart review assessed documentation of genetic testing processes of care, with points assigned for each documented item. Multiple linear and logistic regressions assessed factors associated with documentation. Preimplementation, there were no significant site differences (P > 0.05). Postimplementation, mean documentation scores increased (5.9 (2.1) vs. 5.0 (2.2); P = 0.0001) and records with clinically meaningful documentation increased (score >5: 59 vs. 47%; P = 0.02) at the intervention versus the control site. Pre- and postimplementation, a score >5 was positively associated with abnormal test results (OR = 4.0; 95% CI: 1.8-9.2) and trainee provider (OR = 2.3; 95% CI: 1.2-4.6). Postimplementation, a score >5 was also positively associated with intervention site (OR = 2.3; 95% CI: 1.1-5.1) and specialty clinic (OR = 2.0; 95% CI: 1.1-3.6). There were also significantly fewer tests ordered after implementation (264/100,000 vs. 204/100,000; P = 0.03), with no significant change at the control site (280/100,000 vs. 257/100,000; P = 0.50). EHR point-of-care tools improved documentation of genetic testing processes and decreased utilization of genetic tests commonly ordered by nongeneticists.Genet Med 19 1, 112-120.
Issues in Commercial Document Delivery.
ERIC Educational Resources Information Center
Marcinko, Randall Wayne
1997-01-01
Discusses (1) the history of document delivery; (2) the delivery process--end-user request, intermediary request, vendor reference, citation verification, obtaining document and source relations, quality control, transferring document to client, customer service and status, invoicing and billing, research and development, and copyright; and (3)…
NASA Technical Reports Server (NTRS)
Nagaraja, K. S.; Kraft, R. H.
1999-01-01
The HSCT Flight Controls Group has developed longitudinal control laws, utilizing PTC aeroelastic flexible models to minimize aeroservoelastic interaction effects, for a number of flight conditions. The control law design process resulted in a higher-order controller and utilized a large number of sensors distributed along the body to minimize the flexibility effects. Processes were developed to implement these higher-order control laws for performing the dynamic gust loads and flutter analyses. The processes and their validation were documented in Reference 2 for a selected flight condition. The analytical results for additional flight conditions are presented in this document for further validation.
FLAMMABLE GAS TECHNICAL BASIS DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
KRIPPS, L.J.
2005-02-18
This document describes the qualitative evaluation of frequency and consequences for double-shell tank (DST) and single-shell tank (SST) representative flammable gas accidents and associated hazardous conditions without controls. The evaluation indicated that safety-significant SSCs and/or TSRs were required to prevent or mitigate flammable gas accidents. Discussion of the resulting control decisions is included. This technical basis document was developed to support the Tank Farms Documented Safety Analysis (DSA) and describes the risk binning process for the flammable gas representative accidents and associated represented hazardous conditions. The purpose of the risk binning process is to determine the need for safety-significant structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls for a given representative accident or represented hazardous condition based on an evaluation of the event frequency and consequence.
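The risk binning described above is, at its core, a lookup of a frequency category and a consequence category into a bin that drives the control decision. The following sketch shows that general shape only; the category names and the bin assignments in the matrix are placeholders, not the values used in the Tank Farms DSA.

```python
# Illustrative frequency/consequence risk-binning lookup. The matrix below is a
# placeholder, not the binning used in the Tank Farms safety analysis.
RISK_BIN = {
    "anticipated":               {"low": "III", "moderate": "II",  "high": "I"},
    "unlikely":                  {"low": "III", "moderate": "II",  "high": "I"},
    "extremely unlikely":        {"low": "IV",  "moderate": "III", "high": "II"},
    "beyond extremely unlikely": {"low": "IV",  "moderate": "IV",  "high": "III"},
}

def risk_bin(frequency: str, consequence: str) -> str:
    """Map an event frequency and consequence category to a risk bin label."""
    return RISK_BIN[frequency][consequence]

print(risk_bin("unlikely", "high"))   # -> "I": a candidate for TSR-level controls
```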
Instrumentation and Control for Fossil-Energy Processes
NASA Technical Reports Server (NTRS)
Mark, A., Jr.
1984-01-01
Instrumentation and control requirements for fossil-energy processes discussed in working document. Published to foster advancement of instrumentation and control technology by making equipment suppliers and others aware of specifications, needs, and potential markets.
Version control system of CAD documents and PLC projects
NASA Astrophysics Data System (ADS)
Khudyakov, P. Yu; Kisel’nikov, A. Yu; Startcev, I. M.; Kovalev, A. A.
2018-05-01
The paper presents the process of developing a version control system for CAD documents and PLC projects. The software was tested and the optimal composition of the modules was selected. The introduction of the system has made it possible to increase the safety and stability of the process control systems and to reduce the number of version conflicts for CAD files. The number of incidents at the enterprise related to the use of incorrect versions of PLC projects has been reduced to zero.
The application of intelligent process control to space based systems
NASA Technical Reports Server (NTRS)
Wakefield, G. Steve
1990-01-01
The application of Artificial Intelligence to electronic and process control can help attain the autonomy and safety requirements of manned space systems. An overview of documented applications within various industries is presented. The development process is discussed along with associated issues for implementing an intelligent process control system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ermi, A.M.
1997-05-01
Description of the Proposed Activity/REPORTABLE OCCURRENCE or PIAB: This ECN changes the computer system design description support document describing the computer system used to control, monitor, and archive the processes and outputs associated with the Hydrogen Mitigation Test Pump installed in SY-101. There is no new activity or procedure associated with the updating of this reference document. The updating of this computer system design description maintains an agreed-upon documentation program initiated within the test program and carried into operations at the time of turnover to maintain configuration control as outlined by design authority practicing guidelines. There are no new credible failure modes associated with the updating of information in a support description document. The failure analysis of each change was reviewed at the time of implementation of the Systems Change Request for all the processes changed. This document simply provides a history of implementation and current system status.
Configuration Management Plan for the Tank Farm Contractor
DOE Office of Scientific and Technical Information (OSTI.GOV)
WEIR, W.R.
The Configuration Management Plan for the Tank Farm Contractor describes configuration management the contractor uses to manage and integrate its technical baseline with the programmatic and functional operations to perform work. The Configuration Management Plan for the Tank Farm Contractor supports the management of the project baseline by providing the mechanisms to identify, document, and control the technical characteristics of the products, processes, and structures, systems, and components (SSC). This plan is one of the tools used to identify and provide controls for the technical baseline of the Tank Farm Contractor (TFC). The configuration management plan is listed in the management process documents for TFC as depicted in Attachment 1, TFC Document Structure. The configuration management plan is an integrated approach for control of technical, schedule, cost, and administrative processes necessary to manage the mission of the TFC. Configuration management encompasses the five functional elements of: (1) configuration management administration, (2) configuration identification, (3) configuration status accounting, (4) change control, and (5) configuration management assessments.
Requirements Specification Document
DOT National Transportation Integrated Search
1996-04-26
The System Definition Document identifies the top level processes, data flows, and system controls for the Gary-Chicago-Milwaukee (GCM) Corridor Transportation Information Center (C-TIC). This Requirements Specification establishes the requirements...
21 CFR 820.70 - Production and process controls.
Code of Federal Regulations, 2014 CFR
2014-04-01
... to ensure that a device conforms to its specifications. Where deviations from device specifications... specifications. Where process controls are needed they shall include: (1) Documented instructions, standard... establish and maintain procedures for changes to a specification, method, process, or procedure. Such...
21 CFR 820.70 - Production and process controls.
Code of Federal Regulations, 2012 CFR
2012-04-01
... to ensure that a device conforms to its specifications. Where deviations from device specifications... specifications. Where process controls are needed they shall include: (1) Documented instructions, standard... establish and maintain procedures for changes to a specification, method, process, or procedure. Such...
Guideline for Software Documentation Management.
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC.
Designed as a basic reference for federal personnel concerned with the development, maintenance, enhancement, control, and management of computer-based systems, this manual provides a general overview of the software development process and software documentation issues so that managers can assess their own documentation requirements. Reference is…
Review of SDDOT's highway access control process
DOT National Transportation Integrated Search
2000-02-01
This report presents the results and recommendations of a review of the South Dakota Department of Transportation's (SDDOT's) highway access control process. This document presents recommendations that improve South Dakota's access policy. The docume...
NASA Technical Reports Server (NTRS)
1988-01-01
The Johnson Space Center (JSC) document index is intended to provide a single source listing of all published JSC-numbered documents their authors, and the designated offices of prime responsibility (OPR's) by mail code at the time of publication. The index contains documents which have been received and processed by the JSC Technical Library as of January 13, 1988. Other JSC-numbered documents which are controlled but not available through the JSC Library are also listed.
Application of parameter estimation to aircraft stability and control: The output-error approach
NASA Technical Reports Server (NTRS)
Maine, Richard E.; Iliff, Kenneth W.
1986-01-01
The practical application of parameter estimation methodology to the problem of estimating aircraft stability and control derivatives from flight test data is examined. The primary purpose of the document is to present a comprehensive and unified picture of the entire parameter estimation process and its integration into a flight test program. The document concentrates on the output-error method to provide a focus for detailed examination and to allow us to give specific examples of situations that have arisen. The document first derives the aircraft equations of motion in a form suitable for application to estimation of stability and control derivatives. It then discusses the issues that arise in adapting the equations to the limitations of analysis programs, using a specific program for an example. The roles and issues relating to mass distribution data, preflight predictions, maneuver design, flight scheduling, instrumentation sensors, data acquisition systems, and data processing are then addressed. Finally, the document discusses evaluation and the use of the analysis results.
NASA Technical Reports Server (NTRS)
1989-01-01
This document establishes electrical, electronic, and electromechanical (EEE) parts management and control requirements for contractors providing and maintaining space flight and mission-essential or critical ground support equipment for NASA space flight programs. Although the text is worded 'the contractor shall,' the requirements are also to be used by NASA Headquarters and field installations for developing program/project parts management and control requirements for in-house and contracted efforts. This document places increased emphasis on parts programs to ensure that reliability and quality are considered through adequate consideration of the selection, control, and application of parts. It is the intent of this document to identify disciplines that can be implemented to obtain reliable parts which meet mission needs. The parts management and control requirements described in this document are to be selectively applied, based on equipment class and mission needs. Individual equipment needs should be evaluated to determine the extent to which each requirement should be implemented on a procurement. Utilization of this document does not preclude the usage of other documents. The entire process of developing and implementing requirements is referred to as 'tailoring' the program for a specific project. Some factors that should be considered in this tailoring process include program phase, equipment category and criticality, equipment complexity, and mission requirements. Parts management and control requirements advocated by this document directly support the concept of 'reliability by design' and are an integral part of system reliability and maintainability. Achieving the required availability and mission success objectives during operation depends on the attention given reliability and maintainability in the design phase. Consequently, it is intended that the requirements described in this document are consistent with those of NASA publications, 'Reliability Program Requirements for Aeronautical and Space System Contractors,' NHB 5300.4(1A-l); 'Maintainability Program Requirements for Space Systems,' NHB 5300.4(1E); and 'Quality Program Provisions for Aeronautical and Space System Contractors,' NHB 5300.4(1B).
Document Preparation (for Filming). ERIC Processing Manual, Appendix B.
ERIC Educational Resources Information Center
Brandhorst, Ted, Ed.; And Others
The technical report or "fugitive" literature collected by ERIC is produced using a wide variety of printing techniques, many formats, and variable degrees of quality control. Since the documents processed by ERIC go on to be microfilmed and reproduced in microfiche and paper copy for sale to users, it is essential that the ERIC document…
ISO 9002 as Literacy Practice: Coping with Quality-Control Documents in a High-Tech Company
ERIC Educational Resources Information Center
Kleifgen, Jo Anne
2005-01-01
This study describes the process by which a circuit board manufacturing company became certified in an international quality control program known as ISO 9002. Particular attention is paid to how quality documents were made and used in actual practice and to the relationship between these standardized procedures (official literacies) and…
Waste receiving and processing plant control system; system design description
DOE Office of Scientific and Technical Information (OSTI.GOV)
LANE, M.P.
1999-02-24
The Plant Control System (PCS) is a heterogeneous computer system composed of numerous sub-systems. The PCS represents every major computer system that is used to support operation of the Waste Receiving and Processing (WRAP) facility. This document, the System Design Description (PCS SDD), includes several chapters and appendices. Each chapter is devoted to a separate PCS sub-system. Typically, each chapter includes an overview description of the system, a list of associated documents related to operation of that system, and a detailed description of relevant system features. Each appendix provides configuration information for selected PCS sub-systems. The appendices are designed as separate sections to assist in maintaining this document due to frequent changes in system configurations. This document is intended to serve as the primary reference for configuration of PCS computer systems. The use of this document is further described in the WRAP System Configuration Management Plan, WMH-350, Section 4.1.
Improved compliance by BPM-driven workflow automation.
Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin
2014-12-01
Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of the model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the Business Process Model and Notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. The model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs of quality assurance. As an automation language able to control every kind of activity or subprocess, BPMN 2.0 is directed toward complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). That means that, with the BPM standard, a method for sharing process knowledge among laboratories is also available. © 2014 Society for Laboratory Automation and Screening.
SME Acceptability Determination For DWPF Process Control (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, T.
2017-06-12
The statistical system described in this document is called the Product Composition Control System (PCCS). K. G. Brown and R. L. Postles were the originators and developers of this system as well as the authors of the first three versions of this technical basis document for PCCS. PCCS has guided acceptability decisions for the processing at the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) since the start of radioactive operations in 1996. The author of this revision to the document gratefully acknowledges the firm technical foundation that Brown and Postles established to support the ongoing successful operation at the DWPF. Their integration of the glass property-composition models, developed under the direction of C. M. Jantzen, into a coherent and robust control system, has served the DWPF well over the last 20+ years, even as new challenges, such as the introduction into the DWPF flowsheet of auxiliary streams from the Actinide Removal Process (ARP) and other processes, were met. The purpose of this revision is to provide a technical basis for modifications to PCCS required to support the introduction of waste streams from the Salt Waste Processing Facility (SWPF) into the DWPF flowsheet. An expanded glass composition region is anticipated by the introduction of waste streams from SWPF, and property-composition studies of that glass region have been conducted. Jantzen, once again, directed the development of glass property-composition models applicable for this expanded composition region. The author gratefully acknowledges the technical contributions of C.M. Jantzen leading to the development of these glass property-composition models. The integration of these models into the PCCS constraints necessary to administer future acceptability decisions for the processing at DWPF is provided by this sixth revision of this document.
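To make the idea of a property-composition control constraint concrete, here is a toy sketch of the kind of check such a system performs: predict a glass property from composition and compare it to an acceptability limit. The oxide list, coefficients, and limit are made up for illustration; PCCS itself uses the validated DWPF models and a statistical uncertainty treatment that this sketch omits.

```python
# Toy property-composition acceptability check. Coefficients, composition, and
# the limit are hypothetical placeholders, not DWPF model values.
def predicted_property(composition: dict[str, float], coeffs: dict[str, float]) -> float:
    """Linear-in-composition property estimate: sum of coefficient * mass fraction."""
    return sum(coeffs.get(oxide, 0.0) * frac for oxide, frac in composition.items())

coeffs = {"SiO2": 2.0, "B2O3": -1.5, "Na2O": 3.0}       # hypothetical coefficients
limit = 1.8                                              # hypothetical upper limit

batch = {"SiO2": 0.50, "B2O3": 0.08, "Na2O": 0.12}       # hypothetical mass fractions
value = predicted_property(batch, coeffs)
print(f"predicted property = {value:.3f}; acceptable = {value <= limit}")
```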
TQM (Total Quality Management) SPARC (Special Process Action Review Committees) Handbook
1989-08-01
This document describes the techniques used to support and guide the Special Process Action Review Committees for accomplishing their goals for Total Quality Management (TQM). It includes concepts and definitions, checklists, sample formats, and assessment criteria. Keywords: Continuous process improvement; Logistics information; Process analysis; Quality control; Quality assurance; Total Quality Management; Statistical processes; Management planning and control; Management training; Management information systems.
Aeropropulsion facilities configuration control: Procedures manual
NASA Technical Reports Server (NTRS)
Lavelle, James J.
1990-01-01
Lewis Research Center senior management directed that the aeropropulsion facilities be put under configuration control. A Configuration Management (CM) program was established by the Facilities Management Branch of the Aeropropulsion Facilities and Experiments Division. Under the CM program, a support service contractor was engaged to staff and implement the program. The Aeronautics Directorate has over 30 facilities at Lewis of various sizes and complexities. Under the program, a Facility Baseline List (FBL) was established for each facility, listing which systems and their documents were to be placed under configuration control. A Change Control System (CCS) was established requiring that any proposed changes to FBL systems or their documents be processed as per the CCS. Limited access control of the FBL master drawings was implemented and an audit system established to ensure all facility changes are properly processed. This procedures manual sets forth the policy and responsibilities to ensure all key documents constituting a facility's configuration are kept current, modified as needed, and verified to reflect any proposed change. This is the essence of the CM program.
PROCESS DOCUMENTATION: A MODEL FOR KNOWLEDGE MANAGEMENT IN ORGANIZATIONS.
Haddadpoor, Asefeh; Taheri, Behjat; Nasri, Mehran; Heydari, Kamal; Bahrami, Gholamreza
2015-10-01
Continuous and interconnected processes are a chain of activities that turn the inputs of an organization into its outputs and help achieve partial and overall goals of the organization. These activities are carried out by two types of knowledge in the organization, called explicit and implicit knowledge. Among these, implicit knowledge is the knowledge that controls a major part of the activities of an organization, controls these activities internally, and will not be transferred to the process owners unless they are present during the organization's work. Therefore the goal of this study is identification of implicit knowledge and its integration with explicit knowledge in order to improve human resources management, physical resource management, information resource management, training of new employees and other activities of Isfahan University of Medical Science. The project for documentation of activities in the department of health of Isfahan University of Medical Science was carried out in several stages. First, the main processes and related sub-processes were identified and categorized with the help of a planning expert. The categorization was carried out from smaller processes to larger ones. In this stage the experts of each process wrote down all their daily activities and organized them into general categories based on logical and physical relations between different activities. Then each activity was assigned a specific code. The computer software was designed after understanding the different parts of the processes, including main and sub-processes, and categorization, which will be explained in the following sections. The findings of this study showed that documentation of activities can help expose implicit knowledge because all of the inputs and outputs of a process, along with the length, location, tools and different stages of the process, exchanged information, storage location of the information and information flow, can be identified using proper documentation. A documentation program can create a complete identifier for every process of an organization and also acts as the main tool for establishment of information technology as the basis of the organization and helps achieve the goal of having electronic and information technology based organizations. In other words, documentation is the starting step in creating an organizational architecture. Afterwards, in order to reach the desired goal of documentation, computer software containing all tools, methods, instructions and guidelines and implicit knowledge of the organization was designed. This software links all relevant knowledge to the main text of the documentation and identification of a process and provides the users with electronic versions of all documentation and helps use the explicit and implicit knowledge of the organization to facilitate the reengineering of the processes in the organization.
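The abstract describes a coded hierarchy of main processes, sub-processes, and activities. A minimal sketch of such a data structure follows; the codes and names are invented examples, not the ones used at Isfahan University of Medical Sciences.

```python
# Sketch of a coded process / sub-process / activity hierarchy for process
# documentation. All codes and descriptions below are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Activity:
    code: str
    description: str

@dataclass
class SubProcess:
    code: str
    name: str
    activities: list[Activity] = field(default_factory=list)

@dataclass
class Process:
    code: str
    name: str
    subprocesses: list[SubProcess] = field(default_factory=list)

hr = Process("P01", "Human resources management", [
    SubProcess("P01-S01", "New employee training", [
        Activity("P01-S01-A01", "Prepare orientation documents"),
        Activity("P01-S01-A02", "Record completed training sessions"),
    ]),
])

# Flat listing of activity codes, e.g. for a documentation software index.
for sp in hr.subprocesses:
    for act in sp.activities:
        print(act.code, act.description)
```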
NASA Technical Reports Server (NTRS)
Isenberg, J. M.; Southall, J. W.
1979-01-01
The Integrated Programs for Aerospace Vehicle Design (IPAD) is a computing system to support company-wide design information processing. This document presents a brief description of the management system used to direct and control a product-oriented program. This document, together with the reference design process (CR 2981) and the manufacture interactions with the design process (CR 2982), comprises the reference information that forms the basis for specifying IPAD system requirements.
Moffitt, Christine M.
2017-01-01
This project tested and revised a risk assessment/management tool authored by Moffitt and Stockton designed to provide hatchery biologists and others a structure to measure risk and provide tools to control, prevent or eliminate invasive New Zealand mudsnails (NZMS) and other invasive mollusks in fish hatcheries and hatchery operations. The document has two parts: the risk assessment tool, and an appendix that summarizes options for control or management. The framework of the guidance document for risk assessment/hatchery tool combines approaches used by the Hazard Analysis and Critical Control Points (HACCP) process with those developed by the Commission for Environmental Cooperation (CEC), of Canada, Mexico, and the United States, in the Tri-National Risk Assessment Guidelines for Aquatic Alien Invasive Species. The framework approach for this attached first document assesses risk potential with two activities: probability of infestation and consequences of infestation. Each activity is treated equally to determine the risk potential. These two activities are divided into seven basic elements that utilize scientific, technical, and other relevant information in the process of the risk assessment. To determine the probability of infestation, four steps are used, with scores reported or determined and then averaged. This assessment follows a familiar HACCP process to assess pathways of entry, entry potential, colonization potential, and spread potential. The economic, environmental and social consequences are considered as economic impact, environmental impact, and social and cultural influences. To test this document, the Principal Investigator worked to identify interested hatchery managers through contacts at regional aquaculture meetings, fish health meetings, and through the network of invasive species managers and scientists participating in the Western Regional Panel on Aquatic Nuisance Species and the 100th Meridian Initiative's Columbia River Basin Team, and the Western New Zealand Mudsnail Conference in Seattle. Targeted hatchery workshops were conducted with staff at Dworshak National Fish Hatchery Complex (ID), Similkameen Pond, Oroville WA, and Ringold Springs State Hatchery (WA). As a result of communications with hatchery staff, invasive species managers, and on-site assessments of hatchery facilities, the document was modified and enhanced. Additional resources were added to keep it up to date. The result is a simplified tool that can lead hatchery or management personnel through the process of risk assessment and provide an introduction to the risk management and communication process. In addition to the typical HACCP processes, this tool adds steps to rate and consider uncertainty and the weight of evidence regarding options and monitoring results. Uncertainty of outcome exists in most tools that can be used to control or prevent NZMS or other invasive mollusks from infesting an area. In addition, this document emphasizes that specific control tools and plans must be tailored to each specific setting to consider the economic, environmental and social influences.
From the testing and evaluation process, there was a strong recognition that a number of control and prevention tools previously suggested and reported in the literature from laboratory and small-scale trials may not be compatible with regional and national regulations, economic constraints, social or cultural constraints, or the engineering or water chemistry characteristics of each facility. The options for control are summarized in the second document, Review of Control Measures for Hatcheries Infested with NZMS (Appendix A), which provides sources for additional resources and specific tools, and guidance regarding the feasibility and success of each approach. This tool also emphasizes that management plans need to be adaptive and incorporate oversight from professionals familiar with measuring risks of fish diseases and treatments (e.g., the fish health practitioners and water quality and effluent management teams). Finally, with such a team, the adaptive management approach must be ongoing and become a regular component of hatchery operations. Although it was the intent that this two-part document would be included as part of the revised National Management and Control Plan for the NZMS proposed by the U.S. Fish and Wildlife Service (USFWS) and others, it is provided as a stand-alone document.
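The risk-potential arithmetic described in this record (probability of infestation averaged over four step scores, consequences over three impact scores, with the two activities weighted equally) can be illustrated with a short sketch. The 1-5 scoring scale and the simple equal-weight average used to combine the two halves are illustrative assumptions, not prescriptions from the tool itself.

```python
# Sketch of the risk-potential calculation: average of four probability steps,
# average of three consequence scores, equal weighting of the two activities.
# The scale (1 = low, 5 = high) and example scores are hypothetical.
def average(scores: list[float]) -> float:
    return sum(scores) / len(scores)

def risk_potential(pathway, entry, colonization, spread,
                   economic, environmental, social) -> float:
    probability = average([pathway, entry, colonization, spread])
    consequence = average([economic, environmental, social])
    return (probability + consequence) / 2.0   # the two activities weighted equally

# Example facility scored on a hypothetical 1-5 scale.
print(risk_potential(4, 3, 5, 4, 2, 4, 3))     # -> 3.5
```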
12 CFR 217.122 - Qualification requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... related processes; (ii) Have and document a process (which must capture business environment and internal... current business activities, risk profile, technological processes, and risk management processes; and (ii... assessment systems. (D) Business environment and internal control factors. The Board-regulated institution...
12 CFR 324.122 - Qualification requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... related processes; (ii) Have and document a process (which must capture business environment and internal... current business activities, risk profile, technological processes, and risk management processes; and (ii... assessment systems. (D) Business environment and internal control factors. The FDIC-supervised institution...
Alternative control technology document for bakery oven emissions. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanford, C.W.
The document was produced in response to a request by the baking industry for Federal guidance to assist in providing a more uniform information base for State decision-making with regard to control of bakery oven emissions. The information in the document pertains to bakeries that produce yeast-leavened bread, rolls, buns, and similar products but not crackers, sweet goods, or baked foodstuffs that are not yeast leavened. Information on the baking processes, equipment, operating parameters, potential emissions from baking, and potential emission control options is presented. Catalytic and regenerative oxidation are identified as the most appropriate existing control technologies applicable to VOC emissions from bakery ovens. Cost analyses for catalytic and regenerative oxidation are included. A predictive formula for use in estimating oven emissions has been derived from source tests done in conjunction with the development of the document. Its use and applicability are described.
Rosen, Michael A; Chima, Adaora M; Sampson, John B; Jackson, Eric V; Koka, Rahul; Marx, Megan K; Kamara, Thaim B; Ogbuagu, Onyebuchi U; Lee, Benjamin H
2015-08-01
Inadequate observance of basic processes in patient care, such as patient monitoring and documentation practices, is a potential impediment to the timely diagnosis and management of patients. These gaps exist in low-resource settings such as Sierra Leone and can be attributed to a myriad of factors such as workforce and technology deficiencies. In the study site, only 12.4% of four critical vital signs were documented in the pre-intervention period. Implement a failure mode and effects analysis (FMEA) to improve documentation of four patient vital signs: temperature, blood pressure, pulse rate and respiratory rate. FMEA was implemented among a subpopulation of health workers who are involved in monitoring and documenting patient vital signs. Pre- and post-FMEA monitoring and documentation practices were compared with a control site. Participants identified a four-step process for monitoring and documenting vital signs, three categories of failure modes, and four potential solutions. Based on 2100 patient days of documentation compliance data from 147 patients between July and November 2012, staff members at the study site were 1.79 times more likely to document all four patient vital signs in the post-implementation period (95% CI [1.35, 2.38]). FMEA is a feasible and effective strategy for improving quality and safety in an austere medical environment. Documentation compliance improved at the intervention facility. To evaluate the scalability and sustainability of this approach, programs targeting the development of these types of process improvement skills in local staff should be evaluated. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
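A common way FMEA findings are prioritized, though not reported in this abstract, is by risk priority number (RPN = severity x occurrence x detectability). The sketch below is a generic illustration of that construct; the failure modes and scores are invented and are not the study's actual failure modes.

```python
# Generic FMEA illustration: rank failure modes by risk priority number.
# The failure modes and 1-10 scores below are hypothetical examples.
from typing import NamedTuple

class FailureMode(NamedTuple):
    description: str
    severity: int      # 1-10
    occurrence: int    # 1-10
    detection: int     # 1-10 (higher = harder to detect)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("Vital signs measured but not written in chart", 7, 8, 6),
    FailureMode("Thermometer unavailable on ward", 6, 5, 3),
    FailureMode("Observation chart missing from bedside", 5, 6, 4),
]

for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {fm.rpn:4d}  {fm.description}")
```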
A novel process control method for a TT-300 E-Beam/X-Ray system
NASA Astrophysics Data System (ADS)
Mittendorfer, Josef; Gallnböck-Wagner, Bernhard
2018-02-01
This paper presents some aspects of the process control method for a TT-300 E-Beam/X-Ray system at Mediscan, Austria. The novelty of the approach is the seamless integration of routine monitoring dosimetry with process data. This makes it possible to calculate a parametric dose for each production unit and consequently enables fine-grained, holistic process performance monitoring. Process performance is documented in process control charts for the analysis of individual runs as well as historic trending of runs of specific process categories over a specified time range.
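As a hedged sketch of the control-chart trending mentioned above, the snippet below computes an individuals chart (mean plus or minus three sigma, estimated from an in-control baseline) and flags new units that fall outside the limits. The dose values are synthetic; the real parametric dose is derived from routine dosimetry combined with machine process data in a way this sketch does not reproduce.

```python
# Individuals control chart for per-unit parametric dose (synthetic data).
import statistics

def individuals_limits(baseline: list[float]) -> tuple[float, float, float]:
    """Center line and +/- 3 sigma limits estimated from an in-control baseline."""
    center = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return center - 3 * sigma, center, center + 3 * sigma

baseline = [25.1, 24.8, 25.4, 25.0, 24.9, 25.2, 25.3, 24.7]   # kGy, synthetic history
lcl, center, ucl = individuals_limits(baseline)

new_runs = [25.0, 26.3, 24.9]                                  # kGy, synthetic new units
for i, dose in enumerate(new_runs, start=1):
    status = "in control" if lcl <= dose <= ucl else "OUT OF CONTROL"
    print(f"unit {i}: {dose:.1f} kGy -> {status}")
```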
REDUCED PROTECTIVE CLOTHING DETERMINATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
BROWN, R.L.
2003-06-13
This technical basis document defines conditions where reduced protective clothing can be allowed, defines reduced protective clothing, and documents the regulatory review that determines the process is compliant with the Tank Farm Radiological Control Manual (TFRCM) and Title 10, Part 835, of the Code of Federal Regulations (10CFR835). The criteria, standards, and requirements contained in this document apply only to Tank Farm Contractor (TFC) facilities.
Knowledge enabled plan of care and documentation prototype.
DaDamio, Rebecca; Gugerty, Brian; Kennedy, Rosemary
2006-01-01
There exist significant challenges in integrating the plan of care into documentation and point of care operational processes. A plan of care is often a static artifact that meets regulatory standards with limited influence on supporting goal-directed care delivery processes. Although this prototype is applicable to many clinical disciplines, we will highlight nursing processes in demonstrating a knowledge-driven computerized solution that fully integrates the plan of care within documentation. The knowledge-driven solution reflects evidenced-based practice; is an effective tool for managing problems, orders/interventions, and the patient's progress towards expected outcomes; meets regulatory standards; and drives quality and process improvement. The knowledge infrastructure consists of fully represented terminology, structured clinical expressions utilizing the controlled terminology and clinical knowledge representing evidence-based practice.
A diagnostic prototype of the potable water subsystem of the Space Station Freedom ECLSS
NASA Technical Reports Server (NTRS)
Lukefahr, Brenda D.; Rochowiak, Daniel M.; Benson, Brian L.; Rogers, John S.; Mckee, James W.
1989-01-01
In analyzing the baseline Environmental Control and Life Support System (ECLSS) command and control architecture, various processes are found which would be enhanced by the use of knowledge-based system methods of implementation. The processes most suitable for prototyping using rule-based methods are documented, while domain knowledge resources and other practical considerations are examined. Requirements for a prototype rule-based software system are documented. These requirements reflect Space Station Freedom ECLSS software and hardware development efforts and knowledge-based system requirements. A knowledge-based system environment for rapid prototyping is researched and developed.
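To give a flavor of the rule-based diagnostic approach described above, here is a toy sketch of forward-chaining rules applied to sensor readings. The sensor names, limits, and fault messages are invented for illustration and are not taken from the Space Station Freedom potable water subsystem design.

```python
# Toy rule-based diagnosis: evaluate simple condition/message rules against
# a set of sensor readings. All names and limits are hypothetical.
readings = {"iodine_ppm": 0.8, "conductivity_uS": 45.0, "delta_p_kPa": 62.0}

RULES = [
    (lambda r: r["iodine_ppm"] < 1.0,
     "Low iodine residual: check disinfection bed"),
    (lambda r: r["conductivity_uS"] > 50.0,
     "High conductivity: possible ion-exchange bed breakthrough"),
    (lambda r: r["delta_p_kPa"] > 60.0,
     "High filter differential pressure: filter may be loading up"),
]

findings = [msg for condition, msg in RULES if condition(readings)]
for f in findings or ["No anomalies detected"]:
    print(f)
```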
RELEASE OF DRIED RADIOACTIVE WASTE MATERIALS TECHNICAL BASIS DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
KOZLOWSKI, S.D.
2007-05-30
This technical basis document was developed to support RPP-23429, Preliminary Documented Safety Analysis for the Demonstration Bulk Vitrification System (PDSA) and RPP-23479, Preliminary Documented Safety Analysis for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Facility. The main document describes the risk binning process and the technical basis for assigning risk bins to the representative accidents involving the release of dried radioactive waste materials from the Demonstration Bulk Vitrification System (DBVS) and to the associated represented hazardous conditions. Appendices D through F provide the technical basis for assigning risk bins to the representative dried waste release accident and associated represented hazardous conditions for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Packaging Unit (WPU). The risk binning process uses an evaluation of the frequency and consequence of a given representative accident or represented hazardous condition to determine the need for safety structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls. A representative accident or a represented hazardous condition is assigned to a risk bin based on the potential radiological and toxicological consequences to the public and the collocated worker. Note that the risk binning process is not applied to facility workers because credible hazardous conditions with the potential for significant facility worker consequences are considered for safety-significant SSCs and/or TSR-level controls regardless of their estimated frequency. The controls for protection of the facility workers are described in RPP-23429 and RPP-23479. Determination of the need for safety-class SSCs was performed in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses, as described below.
1980-03-06
performing the present NPFC tasks. Potential automation technologies may include order processing mechanization, demand printing from micrographic or ... effort and documented in this volume included the following: a. Functional description of the order processing activities as they currently operate. b. ... covered under each analysis area. It is obvious from the exhibit that the functional description of order processing operations was to include COG I
Criteria for Determining whether Equipment is Air Pollution Control Equipment or Process Equipment
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database.
NASA Technical Reports Server (NTRS)
Romine, Peter L.
1991-01-01
This final report documents the development and installation of software and hardware for Robotic Welding Process Control. Primary emphasis is on serial communications between the CYRO 750 robotic welder, Heurikon minicomputer running Hunter & Ready VRTX, and an IBM PC/AT, for offline programming and control and closed-loop welding control. The requirements for completion of the implementation of the Rocketdyne weld tracking control are discussed. The procedure for downloading programs from the Intergraph, over the network, is discussed. Conclusions are made on the results of this task, and recommendations are made for efficient implementation of communications, weld process control development, and advanced process control procedures using the Heurikon.
300 Area treated effluent disposal facility sampling schedule
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loll, C.M.
1994-10-11
This document is the interface between the 300 Area Liquid Effluent Process Engineering (LEPE) group and the Waste Sampling and Characterization Facility (WSCF), concerning process control samples. It contains a schedule for process control samples at the 300 Area TEDF which describes the parameters to be measured, the frequency of sampling and analysis, the sampling point, and the purpose for each parameter.
Export Control Requirements for Tritium Processing Design and R&D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollis, William Kirk; Maynard, Sarah-Jane Wadsworth
2015-10-30
This document will address requirements of export control associated with tritium plant design and processes. Los Alamos National Laboratory has been working in the area of tritium plant system design and research and development (R&D) since the early 1970s at the Tritium Systems Test Assembly (TSTA). This work has continued to the current date with projects associated with the ITER project and other Office of Science Fusion Energy Science (OS-FES) funded programs. ITER is currently the highest funding area for the DOE OS-FES. Although export control issues have been integrated into these projects in the past, a general guidance document has not been available for reference in this area. To address concerns with currently funded tritium plant programs and assist future projects for FES, this document will identify the key reference documents and the specific sections within them related to tritium research. Guidance as to the application of these sections will be discussed with specific detail to publications and work with foreign nationals.
Hediger, Hannele; Müller-Staub, Maria; Petry, Heidi
2016-01-01
Electronic nursing documentation systems with standardized nursing terminology are IT-based systems for recording the nursing process. These systems have the potential to improve documentation of the nursing process and to support nurses in care delivery. This article describes the development and initial validation of an instrument (known by its German acronym UEPD) to measure the subjectively perceived benefits of an electronic nursing documentation system in care delivery. The validity of the UEPD was examined in an evaluation study carried out in an acute care hospital (n = 94 nurses) in German-speaking Switzerland. Construct validity was analyzed by principal components analysis. Initial evidence of the validity of the UEPD was obtained: the analysis showed a stable four-factor model (FS = 0.89) comprising 25 items. All factors loaded ≥ 0.50, and the scales demonstrated high internal consistency (Cronbach's α = 0.73 – 0.90). Principal components analysis revealed four dimensions of support: establishing nursing diagnoses and goals; recording a case history/assessment and documenting the nursing process; implementation and evaluation; and information exchange. Further testing with larger control samples and with different electronic documentation systems is needed. Another potential direction would be to employ the UEPD in a comparison of various electronic documentation systems.
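As a worked illustration of the statistics named above (random placeholder data, not the UEPD study data), the following sketch computes Cronbach's alpha and the variance explained by the leading principal components for a 94 x 25 item matrix using numpy:

```python
# Illustrative computation of Cronbach's alpha and a principal components
# decomposition for a questionnaire-style item matrix. The data are random
# placeholders, not the UEPD study data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(94, 25)).astype(float)  # 94 nurses, 25 items, 1-5 scale

print("alpha =", round(cronbach_alpha(scores), 3))

# Principal components via eigendecomposition of the covariance matrix.
centered = scores - scores.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)[::-1]          # descending eigenvalues
explained = eigvals / eigvals.sum()
print("variance explained by first 4 components:", explained[:4].round(3))
```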
12 CFR 3.122 - Qualification requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... related processes; (ii) Have and document a process (which must capture business environment and internal... association's current business activities, risk profile, technological processes, and risk management...) Business environment and internal control factors. The national bank or Federal savings association must...
Analysis And Control System For Automated Welding
NASA Technical Reports Server (NTRS)
Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne
1994-01-01
Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.
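As a schematic illustration of closed-loop parameter adjustment of the kind described (not the flight-qualified algorithm; the sensor reading, gain, and limits are invented), a proportional correction of weld current toward a target weld-pool width might look like:

```python
# Minimal proportional feedback sketch: adjust weld current toward a target
# pool width. Gains, limits, and the sensor function are illustrative only.
def read_pool_width_mm() -> float:
    """Placeholder for an optoelectronic sensor reading."""
    return 6.2

def control_step(current_a: float, target_width_mm: float = 6.0,
                 gain_a_per_mm: float = 15.0) -> float:
    error = target_width_mm - read_pool_width_mm()
    new_current = current_a + gain_a_per_mm * error
    return min(max(new_current, 50.0), 300.0)   # clamp to a plausible range

amps = 180.0
for step in range(5):
    amps = control_step(amps)
    print(f"step {step}: weld current = {amps:.1f} A")
```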
Clean Air Act Section 112(d)(6) Technology Review for Pulping and Papermaking Processes Memorandum
The purpose of this November 2011 document is to present the results of a review of available information on developments in practices, processes, and control technologies that apply to pulping and papermaking processes.
NASA-STD-(I)-6016, Standard Materials and Processes Requirements for Spacecraft
NASA Technical Reports Server (NTRS)
Pedley, Michael; Griffin, Dennis
2006-01-01
This document is directed toward Materials and Processes (M&P) used in the design, fabrication, and testing of flight components for all NASA manned, unmanned, robotic, launch vehicle, lander, in-space and surface systems, and spacecraft program/project hardware elements. All flight hardware is covered by the M&P requirements of this document, including vendor designed, off-the-shelf, and vendor furnished items. Materials and processes used in interfacing ground support equipment (GSE); test equipment; hardware processing equipment; hardware packaging; and hardware shipment shall be controlled to prevent damage to or contamination of flight hardware.
Apollo 16 photographic standards documentation
NASA Technical Reports Server (NTRS)
Bourque, P. F.
1972-01-01
The activities of the Photographic Technology Division, and particularly the Photo Science Office, the Precision Processing Laboratory, and the Motion Picture Laboratory, in connection with the scientific photography of the Apollo 16 manned space mission are documented. Described are the preflight activities involved in establishing a standard process for each of the flight films, the manner in which flight films were handled upon arrival at the Manned Spacecraft Center in Houston, Texas, and how the flight films were processed and duplicated. The tone reproduction method of duplication is described. The specific sensitometric and chemical process controls are not included.
Manufacture and quality control of interconnecting wire harnesses, Volume 3
NASA Technical Reports Server (NTRS)
1972-01-01
The document covers interconnecting wire harnesses defined in the design standard, including type 6, enclosed in TFE heat shrink tubing; and type 7, flexible armored. Knowledge gained through experience on the Saturn 5 program coupled with recent advances in techniques, materials, and processes was incorporated into this document.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blum, T.W.; Selvage, R.D.; Courtney, K.H.
This manual is the guide for initiating change at the Plutonium Facility, which handles the processing of plutonium as well as research on plutonium metallurgy. It describes the change and work control processes employed at TA-55 to ensure that all proposed changes are properly identified, reviewed, approved, implemented, tested, and documented so that operations are maintained within the approved safety envelope. All Laboratory groups, their contractors, and subcontractors doing work at TA-55 follow requirements set forth herein. This manual applies to all new and modified processes and experiments inside the TA-55 Plutonium Facility; general plant project (GPP) and line item funded construction projects at TA-55; temporary and permanent changes that directly or indirectly affect structures, systems, or components (SSCs) as described in the safety analysis, including Facility Control System (FCS) software; and major modifications to procedures. This manual does not apply to maintenance performed on process equipment or facility SSCs or the replacement of SSCs or equipment with documented approved equivalents.
Hartman, Victoria; Castillo-Pelayo, Tania; Babinszky, Sindy; Dee, Simon; Leblanc, Jodi; Matzke, Lise; O'Donoghue, Sheila; Carpenter, Jane; Carter, Candace; Rush, Amanda; Byrne, Jennifer; Barnes, Rebecca; Mes-Messons, Anne-Marie; Watson, Peter
2018-02-01
Ongoing quality management is an essential part of biobank operations and the creation of high quality biospecimen resources. Adhering to the standards of a national biobanking network is a way to reduce variability between individual biobank processes, resulting in cross biobank compatibility and more consistent support for health researchers. The Canadian Tissue Repository Network (CTRNet) implemented a set of required operational practices (ROPs) in 2011 and these serve as the standards and basis for the CTRNet biobank certification program. A review of these 13 ROPs covering 314 directives was conducted after 5 years to identify areas for revision and update, leading to changes to 7/314 directives (2.3%). A review of all internal controlled documents (including policies, standard operating procedures and guides, and forms for actions and processes) used by the BC Cancer Agency's Tumor Tissue Repository (BCCA-TTR) to conform to these ROPs was then conducted. Changes were made to 20/106 (19%) of BCCA-TTR documents. We conclude that a substantial fraction of internal controlled documents require updates at regular intervals to accommodate changes in best practices. Reviewing documentation is an essential aspect of keeping up to date with best practices and ensuring the quality of biospecimens and data managed by biobanks.
300 Area treated effluent disposal facility sampling schedule. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loll, C.M.
1995-03-28
This document is the interface between the 300 Area liquid effluent process engineering (LEPE) group and the waste sampling and characterization facility (WSCF), concerning process control samples. It contains a schedule for process control samples at the 300 Area TEDF which describes the parameters to be measured, the frequency of sampling and analysis, the sampling point, and the purpose for each parameter.
Quality Control in Clinical Laboratory Samples
2015-01-01
is able to find and correct flaws in the analytical processes of a lab before potentially incorrect patient results are released. According to...verifies that the results produced are accurate and precise. Clinical labs use management of documentation as well as incorporation of a continuous...improvement process to streamline the overall quality control process. QC samples are expected to be identical and tested identically to patient
1984-06-29
effort that requires hard copy documentation. As a result, there are generally numerous delays in providing current quality information. In the FoF...process have had fixed controls or were based on "hard-coded" information. A template, for example, is hard-coded information defining the shape of a...represents soft-coded control information. (Although manual handling of punch tapes still possess some of the limitations of "hard-coded" controls
2010-11-05
The Food and Drug Administration (FDA) is announcing the reclassification of the full-field digital mammography (FFDM) system from class III (premarket approval) to class II (special controls). The device type is intended to produce planar digital x-ray images of the entire breast; this generic type of device may include digital mammography acquisition software, full-field digital image receptor, acquisition workstation, automatic exposure control, image processing and reconstruction programs, patient and equipment supports, component parts, and accessories. The special control that will apply to the device is the guidance document entitled "Class II Special Controls Guidance Document: Full-Field Digital Mammography System." FDA is reclassifying the device into class II (special controls) because general controls along with special controls will provide a reasonable assurance of safety and effectiveness of the device. Elsewhere in this issue of the Federal Register, FDA is announcing the availability of the guidance document that will serve as the special control for this device.
The IRMIS object model and services API.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saunders, C.; Dohan, D. A.; Arnold, N. D.
2005-01-01
The relational model developed for the Integrated Relational Model of Installed Systems (IRMIS) toolkit has been successfully used to capture the Advanced Photon Source (APS) control system software (EPICS process variables and their definitions). The relational tables are populated by a crawler script that parses each Input/Output Controller (IOC) start-up file when an IOC reboot is detected. User interaction is provided by a Java Swing application that acts as a desktop for viewing the process variable information. Mapping between the display objects and the relational tables was carried out with the Hibernate Object Relational Modeling (ORM) framework. Work is well underway at the APS to extend the relational modeling to include control system hardware. For this work, due in part to the complex user interaction required, the primary application development environment has shifted from the relational database view to the object oriented (Java) perspective. With this approach, the business logic is executed in Java rather than in SQL stored procedures. This paper describes the object model used to represent control system software, hardware, and interconnects in IRMIS. We also describe the services API used to encapsulate the required behaviors for creating and maintaining the complex data. In addition to the core schema and object model, many important concepts in IRMIS are captured by the services API. IRMIS is an ambitious collaborative effort for defining and developing a relational database and associated applications to comprehensively document the large and complex EPICS-based control systems of today's accelerators. The documentation effort includes process variables, control system hardware, and interconnections. The approach could also be used to document all components of the accelerator, including mechanical, vacuum, power supplies, etc. One key aspect of IRMIS is that it is a documentation framework, not a design and development tool. We do not generate EPICS control system configurations from IRMIS, and hence do not impose any additional requirements on EPICS developers.
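As a loose sketch of what such a crawler does (the real IRMIS scripts parse full IOC start-up files and are far more thorough; the regular expression and file name here are assumptions), record names and types could be pulled from an EPICS-style database file like this:

```python
# Simplified sketch of crawling an EPICS .db file for process variable
# record definitions, e.g. lines such as:  record(ai, "SR:BPM1:X")
# The regex and file handling are illustrative; the real IRMIS crawler
# parses IOC start-up files and is considerably more complete.
import re
from pathlib import Path

RECORD_RE = re.compile(r'record\s*\(\s*(\w+)\s*,\s*"([^"]+)"\s*\)')

def crawl_db(path: str) -> list[tuple[str, str]]:
    """Return (record_type, pv_name) pairs found in one database file."""
    text = Path(path).read_text(errors="replace")
    return RECORD_RE.findall(text)

if __name__ == "__main__":
    for rtype, name in crawl_db("example.db"):   # hypothetical file name
        print(f"{name}  ({rtype})")
```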
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Update on Controlling Herds of Cooperative Robots
NASA Technical Reports Server (NTRS)
Quadrelli, Marco; Chang, Johnny
2007-01-01
A document presents further information on the subject matter of "Controlling Herds of Cooperative Robots". The document describes the results of the computational simulations of a one-blimp, three-surface-sonde herd in various operational scenarios, including sensitivity studies as a function of distributed communication and processing delays between the sondes and the blimp. From results of the simulations, it is concluded that the methodology is feasible, even if there are significant uncertainties in the dynamical models.
[Quality control of laser imagers].
Winkelbauer, F; Ammann, M; Gerstner, N; Imhof, H
1992-11-01
Multiformat imagers based on laser systems are used for documentation in an increasing number of investigations. The specific problems of quality control are explained, and the constancy of film processing is investigated for imager systems of different configuration, with (Machine 1: 3M Laser Imager Plus M952 with connected 3M film processor, 3M IRB film, 3M XPM X-ray chemical mixer, 3M developer and fixer) or without (Machine 2: 3M Laser Imager Plus M952 with separate DuPont Cronex film processor, Kodak IR film, Kodak automixer, Kodak developer and fixer) a directly connected film-processing unit. In our checks, based on DIN 6868 and ONORM S 5240, the equipment with the directly adapted film-processing unit showed processing constancy in accordance with DIN and ONORM; the constancy checks demanded by DIN 6868 could therefore be performed at longer intervals on this equipment. Systems with conventional darkroom processing showed clearly increased fluctuation, and the required daily control is therefore essential to guarantee appropriate reaction and constant documentation quality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malkoske, Kyle; Nielsen, Michelle; Brown, Erika
The Canadian Partnership for Quality Radiotherapy (CPQR) and the Canadian Organization of Medical Physicists' (COMP) Quality Assurance and Radiation Safety Advisory Committee (QARSAC) have worked together on the development of a suite of Technical Quality Control (TQC) Guidelines for radiation treatment equipment and technologies, which outline specific performance objectives and criteria that equipment should meet in order to assure an acceptable level of radiation treatment quality. Early community engagement and uptake survey data showed that 70% of Canadian centres are part of this process and that the data in the guideline documents reflect, and are influencing, the way Canadian radiation treatment centres run their technical quality control programs. As the TQC development framework matured as a cross-country initiative, guidance documents have been developed for many clinical technologies. Recently, new TQC documents have been initiated for Gamma Knife and CyberKnife technologies, for which the entire communities within Canada are involved in the review process. At the same time, QARSAC reviewed the suite as a whole for the first time and found that some tests and tolerances overlapped across multiple documents, since single tests could pertain to multiple quality control areas. The work to streamline the entire suite has improved its usability while keeping the integrity of the individual quality control areas. The suite will be published by the JACMP in the coming year.
NASA Technical Reports Server (NTRS)
Lucord, Steve A.; Gully, Sylvain
2009-01-01
The purpose of the PROTOTYPE INTEROPERABILITY DOCUMENT is to document the design and interfaces for the service providers and consumers of a Mission Operations prototype between JSC-OTF and DLR-GSOC. The primary goal is to test the interoperability sections of the CCSDS Spacecraft Monitor & Control (SM&C) Mission Operations (MO) specifications between both control centers. An additional goal is to provide feedback to the Spacecraft Monitor and Control (SM&C) working group through the Review Item Disposition (RID) process. This Prototype is considered a proof of concept and should increase the knowledge base of the CCSDS SM&C Mission Operations standards. No operational capabilities will be provided. The CCSDS Mission Operations (MO) initiative was previously called Spacecraft Monitor and Control (SM&C). The specifications have been renamed to better reflect the scope and overall objectives. The working group retains the name Spacecraft Monitor and Control working group and is under the Mission Operations and Information Services Area (MOIMS) of CCSDS. This document will refer to the specifications as SM&C Mission Operations, Mission Operations or just MO.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-25
... establishment's process control plans, that is, its Hazard Analysis and Critical Control Point plans. DATES... control plans, i.e., its Hazard Analysis and Critical Control Point (HACCP) plans; and (3) make the recall... systematic prevention of biological, chemical, and physical hazards. HACCP plans are establishment-developed...
Training Manual for Elements of Interface Definition and Control
NASA Technical Reports Server (NTRS)
Lalli, Vincent R. (Editor); Kastner, Robert E. (Editor); Hartt, Henry N. (Editor)
1997-01-01
The primary thrust of this manual is to ensure that the format and information needed to control interfaces between equipment are clear and understandable. The emphasis is on controlling the engineering design of the interface and not on the functional performance requirements of the system or the internal workings of the interfacing equipment. Interface control should take place, with rare exception, at the interfacing elements and no further. There are two essential sections of the manual. Chapter 2, Principles of Interface Control, discusses how interfaces are defined. It describes different types of interfaces to be considered and recommends a format for the documentation necessary for adequate interface control. Chapter 3, The Process: Through the Design Phases, provides tailored guidance for interface definition and control. This manual can be used to improve planned or existing interface control processes during system design and development. It can also be used to refresh and update the corporate knowledge base. The information presented herein will reduce the amount of paper and data required in interface definition and control processes by as much as 50 percent and will shorten the time required to prepare an interface control document. It also highlights the essential technical parameters that ensure that flight subsystems will indeed fit together and function as intended after assembly and checkout.
Serials Control System Procedures and Policies.
ERIC Educational Resources Information Center
Schlembach, Mary C.
This document includes procedures and policies for a networked serials control system originally developed at the Grainger Engineering Library Information Center at the University of Illinois at Urbana-Champaign (UIUC). The serials control systems encompass serials processing, public service, and end-user functions. The system employs a…
Tracing And Control Of Engineering Requirements
NASA Technical Reports Server (NTRS)
Turner, Philip R.; Stoller, Richard L.; Neville, Ted; Boyle, Karen A.
1991-01-01
TRACER (Tracing and Control of Engineering Requirements) is a database/word-processing software system created to document and maintain the order of both requirements and descriptions associated with an engineering project. Implemented on IBM PC under PC-DOS. Written with CLIPPER.
NASA Technical Reports Server (NTRS)
1981-01-01
The Kennedy Space Center (KSC) management system for Inertial Upper Stage (IUS)-spacecraft processing, from KSC arrival through launch, is described. The roles and responsibilities of the agencies and test team organizations involved in IUS-S/C processing at KSC for non-Department of Defense missions are described. Working relationships are defined with respect to documentation preparation, coordination, and approval; schedule development and maintenance; test conduct and control; configuration management; quality control; and safety. The policy regarding the use of spacecraft contractor test procedures, IUS contractor detailed operating procedures, and KSC operations and maintenance instructions is defined. Review and approval requirements for each documentation system are described.
Analytical control test plan and microbiological methods for the water recovery test
NASA Technical Reports Server (NTRS)
Traweek, M. S. (Editor); Tatara, J. D. (Editor)
1994-01-01
Qualitative and quantitative laboratory results are important to the decision-making process; in some cases, they may represent the only basis for deciding between two or more options or processes. It is therefore essential that the handling of laboratory samples and the analytical operations employed be performed with a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and misinformed decisions. This document provides the analytical control specifications that will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). It addresses the process that will be used to verify analytical data generated throughout the test period, identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed, and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.
Seismology software: state of the practice
NASA Astrophysics Data System (ADS)
Smith, W. Spencer; Zeng, Zheng; Carette, Jacques
2018-05-01
We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.
[Development of a medical equipment support information system based on PDF portable document].
Cheng, Jiangbo; Wang, Weidong
2010-07-01
Based on the organizational structure and management system of hospital medical engineering support, the medical engineering support workflow was integrated to ensure that medical engineering data are collected effectively, accurately, and comprehensively and are kept in electronic archives. The workflow of medical equipment support work was analyzed, and all work processes are recorded in portable electronic documents. Using XML middleware technology and an SQL Server database, the system provides process management, data calculation, submission, storage, and other functions. Practical application shows that the medical equipment support information system optimizes the existing work process, making it standardized, digital, automatic, efficient, orderly, and controllable. A medical equipment support information system based on portable electronic documents can effectively optimize and improve hospital medical engineering support work, improve performance, reduce costs, and provide complete and accurate digital data.
Using Simulation for Launch Team Training and Evaluation
NASA Technical Reports Server (NTRS)
Peaden, Cary J.
2005-01-01
This document describes some of the history and uses of simulation systems and processes for the training and evaluation of Launch Processing, Mission Control, and Mission Management teams. It documents some of the types of simulations that are used at Kennedy Space Center (KSC) today and that could be utilized (and possibly enhanced) for future launch vehicles. This article is intended to provide an initial baseline for further research into simulation for launch team training in the near future.
1989-08-14
General Information: MANPGB is a Resource Control Center under the MANPG section of the Industrial Products Division (MAN) at WR-ALC; MANPGB is located... [The remainder of this excerpt is OCR residue from a scanned Work Control Document form; recoverable fields include date, job order number, quantity, and process steps such as cleaning, functional analysis, deseal, and internal cleaning and visual inspection.]
ERIC Educational Resources Information Center
Dade County Public Schools, Miami, FL.
This document presents an outline for a 135-hour course designed to familiarize the beginning student with the basic concepts common to aircraft materials and processes, together with the requirements of proper cleaning and corrosion control as outlined by the Federal Aviation Agency. The aviation airframe and powerplant maintenance technician is…
[IMPLEMENTATION OF A QUALITY MANAGEMENT SYSTEM IN A NUTRITION UNIT ACCORDING TO ISO 9001:2008].
Velasco Gimeno, Cristina; Cuerda Compés, Cristina; Alonso Puerta, Alba; Frías Soriano, Laura; Camblor Álvarez, Miguel; Bretón Lesmes, Irene; Plá Mestre, Rosa; Izquierdo Membrilla, Isabel; García-Peris, Pilar
2015-09-01
The implementation of quality management systems (QMS) in the health sector has made great progress in recent years and remains a key tool for managing and improving the services provided to patients. This study describes the process of implementing a QMS according to the ISO 9001:2008 standard in a Nutrition Unit. The implementation began in October 2012, with the Nutrition Unit supported by the hospital's Preventive Medicine and Quality Management Service (PMQM). Initially, training sessions on QMS and ISO standards were held for staff. A Quality Committee (QC) was established with representation from the medical and nursing staff. Every week, members of the QC and PMQM met to define processes, procedures, and quality indicators, and a two-month follow-up of these documents was carried out after their validation. A total of 4 processes were identified and documented (nutritional status assessment, nutritional treatment, monitoring of nutritional treatment, and planning and control of oral feeding), along with 13 operating procedures describing all the activity of the Unit. The interactions among them were defined in the process map. Each process has associated quality indicators for measuring the state of the QMS and identifying opportunities for improvement. All the documents required by ISO 9001:2008 were developed: quality policy, quality objectives, quality manual, control of documents and records, internal audit, and nonconformities and corrective and preventive actions. The Unit was certified by AENOR in April 2013. The implementation of a QMS leads to a reorganization of the Unit's activities in order to meet customers' expectations. Documenting these activities ensures a better understanding of the organization, defines the responsibilities of all staff, and brings better management of time and resources. A QMS also improves internal communication and is a motivational element. Exploring the satisfaction and expectations of patients makes it possible to include their views in the design of care processes. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
Proposed methods for testing and selecting the ERCC external RNA controls
2005-01-01
The External RNA Control Consortium (ERCC) is an ad-hoc group with approximately 70 members from private, public, and academic organizations. The group is developing a set of external RNA control transcripts that can be used to assess technical performance in gene expression assays. The ERCC is now initiating the Testing Phase of the project, during which candidate external RNA controls will be evaluated in both microarray and QRT-PCR gene expression platforms. This document describes the proposed experiments and informatics process that will be followed to test and qualify individual controls. The ERCC is distributing this description of the proposed testing process in an effort to gain consensus and to encourage feedback from the scientific community. On October 4–5, 2005, the ERCC met to further review the document, clarify ambiguities, and plan next steps. A summary of this meeting and changes to the test plan are provided as an appendix to this manuscript. PMID:16266432
Quality Control Study of the GSL Reinsurance System. Final Report.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
A quality control plan for the U.S. Department of Education's Guaranteed Student Loan (GSL) reinsurance process was developed. To identify existing errors, systems documentation and past analyses of the reinsurance system were analyzed, and interviews were conducted. Corrective actions were proposed, and a quality control checklist was developed…
Renard, P; Van Breusegem, V; Nguyen, M T; Naveau, H; Nyns, E J
1991-10-20
An adaptive control algorithm has been implemented on a biomethanation process to maintain propionate concentration, a stable variable, at a given low value, by steering the dilution rate. It was thereby expected to ensure the stability of the process during the startup and during steady-state running with an acceptable performance. The methane pilot reactor was operated in the completely mixed, once-through mode and computer-controlled during 161 days. The results yielded the real-life validation of the adaptive control algorithm, and documented the stability and acceptable performance expected.
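The published control law is not given in this abstract; purely as a toy illustration of the idea of steering dilution rate to hold propionate at a low setpoint (the reactor response, adaptation rule, and all numbers are invented), consider:

```python
# Toy illustration of steering dilution rate to hold a propionate setpoint.
# The one-line "reactor model", the adaptation rule, and all numbers are
# invented for this sketch; they are not the published algorithm.
def simulate(hours=48, setpoint=0.5):
    propionate, dilution, gain = 2.0, 0.10, 0.02   # g/L, 1/h, gain units assumed
    for t in range(hours):
        # crude plant response: higher loading (dilution) raises propionate,
        # which otherwise decays as the biomass consumes it
        propionate += 0.4 * dilution - 0.3 * propionate
        error = propionate - setpoint
        gain += 0.001 * abs(error)                    # naive gain adaptation
        dilution = max(0.0, dilution - gain * error)  # back off when propionate is high
        if t % 8 == 0:
            print(f"t={t:2d} h  propionate={propionate:.2f} g/L  D={dilution:.3f} 1/h")

simulate()
```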
Ethylene Production Maximum Achievable Control Technology (MACT) Compliance Manual
This July 2006 document is intended to help owners and operators of ethylene processes understand and comply with EPA's maximum achievable control technology standards promulgated on July 12, 2002, as amended on April 13, 2005 and April 20, 2006.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-04-01
The document is one of six technical handbooks prepared by EPA to help government officials granting permits to build synfuels facilities, synfuels process developers, and other interested parties. They provide technical data on waste streams from synfuels facilities and technologies capable of controlling them. Process technologies covered in the manuals include coal gasification, coal liquefaction by direct and indirect processing, and the extraction of oil from shale. The manuals offer no regulatory guidance, allowing the industry flexibility in deciding how best to comply with environmental regulations.
Dionne, Shannon G.; Granato, Gregory E.; Tana, Cameron K.
1999-01-01
A readily accessible archive of information that is valid, current, and technically defensible is needed to make informed highway-planning, design, and management decisions. The National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS) is a cataloging and assessment of the documentation of information relevant to highway-runoff water quality available in published reports. The report-review process is based on the NDAMS review sheet, which was designed by the USGS with input from the FHWA, State transportation agencies, and the regulatory community. The report-review process is designed to determine the technical merit of the existing literature in terms of current requirements for data documentation, data quality, quality assurance and quality control (QA/QC), and technical issues that may affect the use of historical data. To facilitate the review process, the NDAMS review sheet is divided into 12 sections: (1) administrative review information, (2) investigation and report information, (3) temporal information, (4) location information, (5) water-quality-monitoring information, (6) sample-handling methods, (7) constituent information, (8) sampling focus and matrix, (9) flow-monitoring methods, (10) field QA/QC, (11) laboratory, and (12) uncertainty/error analysis. This report describes the NDAMS report reviews and metadata documentation methods and provides an overview of the approach and of the quality-assurance and quality-control program used to implement the review process. Detailed information, including a glossary of relevant terms, a copy of the report-review sheets, and report-review instructions, is completely documented in a series of three appendixes included with this report. The reviews are therefore repeatable, and the methods can be used by transportation research organizations to catalog new reports as they are published.
Research subject privacy protection in otolaryngology.
Noone, Michael C; Walters, K Christian; Gillespie, M Boyd
2004-03-01
Health Insurance Portability and Accountability Act regulations, which took effect on April 14, 2003, placed new constraints on the use of protected health information for research purposes. To review practices of research subject privacy protection in otolaryngology in order to determine steps necessary to achieve compliance with Health Insurance Portability and Accountability Act regulations. Literature review. Articles appearing in 2001 in 3 widely circulated otolaryngology journals were classified according to study design. The "Methods" section of each article was reviewed to determine whether the informed consent and institutional review board processes were clearly documented. Descriptive studies involving case reports and case series were more common than observational studies that include a control group (66% vs 11%). Few case series documented the consent process (18%) and institutional review board process (19%). Observational designs demonstrated better documentation of the consent process (P<.001) and the institutional review board exemption and approval process (P<.001). Methods used to protect subject privacy are not commonly documented in case series in otolaryngology. More attention needs to be given to research subject privacy concerns in the otolaryngology literature in order to comply with Health Insurance Portability and Accountability Act regulations.
CRITICALITY SAFETY CONTROLS AND THE SAFETY BASIS AT PFP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kessler, S
2009-04-21
With the implementation of DOE Order 420.1B, Facility Safety, and DOE-STD-3007-2007, 'Guidelines for Preparing Criticality Safety Evaluations at Department of Energy Non-Reactor Nuclear Facilities', a new requirement was imposed that all criticality safety controls be evaluated for inclusion in the facility Documented Safety Analysis (DSA) and that the evaluation process be documented in the site Criticality Safety Program Description Document (CSPDD). At the Hanford site in Washington State the CSPDD, HNF-31695, 'General Description of the FH Criticality Safety Program', requires each facility develop a linking document called a Criticality Control Review (CCR) to document performance of these evaluations. Chapter 5, Appendix 5B of HNF-7098, Criticality Safety Program, provided an example of a format for a CCR that could be used in lieu of each facility developing its own CCR. Since the Plutonium Finishing Plant (PFP) is presently undergoing Deactivation and Decommissioning (D&D), new procedures are being developed for cleanout of equipment and systems that have not been operated in years. Existing Criticality Safety Evaluations (CSE) are revised, or new ones written, to develop the controls required to support D&D activities. Other Hanford facilities, including PFP, had difficulty using the basic CCR out of HNF-7098 when first implemented. Interpretation of the new guidelines indicated that many of the controls needed to be elevated to TSR level controls. Criterion 2 of the standard, requiring that the consequence of a criticality be examined for establishing the classification of a control, was not addressed. Upon in-depth review by PFP Criticality Safety staff, it was not clear that the programmatic interpretation of criterion 8C could be applied at PFP. Therefore, the PFP Criticality Safety staff decided to write their own CCR. The PFP CCR provides additional guidance for the evaluation team to use by clarifying the evaluation criteria in DOE-STD-3007-2007. In reviewing documents used in classifying controls for Nuclear Safety, it was noted that DOE-HDBK-1188, 'Glossary of Environment, Health, and Safety Terms', defines an Administrative Control (AC) in terms that are different than typically used in Criticality Safety. As part of this CCR, a new term, Criticality Administrative Control (CAC), was defined to clarify the difference between an AC used for criticality safety and an AC used for nuclear safety. In Nuclear Safety terms, an AC is a provision relating to organization and management, procedures, recordkeeping, assessment, and reporting necessary to ensure safe operation of a facility. A CAC was defined as an administrative control derived in a criticality safety analysis that is implemented to ensure double contingency. According to criterion 2 of Section IV, 'Linkage to the Documented Safety Analysis', of DOE-STD-3007-2007, the consequence of a criticality should be examined for the purposes of classifying the significance of a control or component. HNF-PRO-700, 'Safety Basis Development', provides control selection criteria based on consequence and risk that may be used in the development of a Criticality Safety Evaluation (CSE) to establish the classification of a component as a design feature, as safety class or safety significant, i.e., an Engineered Safety Feature (ESF), or as equipment important to safety; or merely provides defense-in-depth. Similar logic is applied to the CACs.
Criterion 8C of DOE-STD-3007-2007, as written, added to the confusion of using the basic CCR from HNF-7098. The PFP CCR attempts to clarify this criterion by revising it to say 'Programmatic commitments or general references to control philosophy (e.g., mass control or spacing control or concentration control as an overall control strategy for the process without specific quantification of individual limits) is included in the PFP DSA'. Table 1 shows the PFP methodology for evaluating CACs. This evaluation process has been in use since February of 2008 and has proven to be simple and effective. Each control identified in the applicable new/revised CSE is evaluated via the table. The results of this evaluation are documented in tables attached to the CCR as an appendix, for each CSE, to the base document.
Code of Federal Regulations, 2013 CFR
2013-07-01
... performance test for those control techniques in accordance with paragraph (b)(6) of this section. The design..., immediately preceding the use of the control technique. A design evaluation shall also address other vent... paragraph (f)(1)(i) of this section, the design evaluation shall document the control efficiency and address...
Electronic reminders improve procedure documentation compliance and professional fee reimbursement.
Kheterpal, Sachin; Gupta, Ruchika; Blum, James M; Tremper, Kevin K; O'Reilly, Michael; Kazanjian, Paul E
2007-03-01
Medicolegal, clinical, and reimbursement needs warrant complete and accurate documentation. We sought to identify and improve our compliance rate for the documentation of arterial catheterization in the perioperative setting. We first reviewed 12 mo of electronic anesthesia records to establish a baseline compliance rate for arterial catheter documentation. Residents and Certified Registered Nurse Anesthetists were randomly assigned to a control group and experimental group. When surgical incision and anesthesia end were documented in the electronic record keeper, a reminder routine checked for an invasive arterial blood pressure tracing. If a case used an arterial catheter, but no procedure note was observed, the resident or Certified Registered Nurse Anesthetist assigned to the case was sent an automated alphanumeric pager and e-mail reminder. Providers in the control group received no pager or e-mail message. After 2 mo, all staff received the reminders. A baseline compliance rate of 80% was observed (1963 of 2459 catheters documented). During the 2-mo study period, providers in the control group documented 152 of 202 (75%) arterial catheters, and the experimental group documented 177 of 201 (88%) arterial lines (P < 0.001). After all staff began receiving reminders, 309 of 314 arterial lines were documented in a subsequent 2 mo period (98%). Extrapolating this compliance rate to 12 mo of expected arterial catheter placement would result in an annual incremental $40,500 of professional fee reimbursement. The complexity of the tertiary care process results in documentation deficiencies. Inexpensive automated reminders can drastically improve compliance without the need for complicated negative or positive feedback.
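The paper does not reproduce its reminder code; a plain sketch of the decision logic described (field names and the notification call are assumptions) might look like:

```python
# Sketch of the reminder check described above: if a case shows an invasive
# arterial pressure tracing but no arterial-line procedure note, notify the
# assigned provider. Field names and send_page() are assumptions.
from dataclasses import dataclass

@dataclass
class CaseRecord:
    case_id: str
    provider_address: str
    has_arterial_tracing: bool
    has_arterial_line_note: bool
    anesthesia_ended: bool

def send_page(address: str, message: str) -> None:
    """Placeholder for the alphanumeric pager / e-mail gateway."""
    print(f"notify {address}: {message}")

def check_case(case: CaseRecord) -> None:
    if not case.anesthesia_ended:
        return
    if case.has_arterial_tracing and not case.has_arterial_line_note:
        send_page(case.provider_address,
                  f"Case {case.case_id}: arterial line in use but no procedure note filed.")

check_case(CaseRecord("A-1234", "resident@example.org", True, False, True))
```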
Fine scale variations of surface water chemistry in an ephemeral to perennial drainage network
Margaret A. Zimmer; Scott W. Bailey; Kevin J. McGuire; Thomas D. Bullen
2013-01-01
Although temporal variation in headwater stream chemistry has long been used to document baseline conditions and response to environmental drivers, less attention is paid to fine scale spatial variations that could yield clues to processes controlling stream water sources. We documented spatial and temporal variation in water composition in a headwater catchment (41 ha...
Manufacturing and quality control of interconnecting wire harnesses, Volume 4
NASA Technical Reports Server (NTRS)
1972-01-01
The document covers interconnecting wire harnesses defined in the design standard, including type 8, flat conductor cable. The volume also covers the installation of groups of harnesses in a major assembly and the associated post-installation inspections and electrical tests. Knowledge gained through experience on the Saturn 5 program coupled with recent advances in techniques, materials, and processes was incorporated into this document.
Computer software configuration description, 241-AY and 241-AZ tank farm MICON automation system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winkelman, W.D.
This document describes the configuration process, choices and conventions used during the configuration activities, and issues involved in making changes to the configuration. Includes the master listings of the Tag definitions, which should be revised to authorize any changes. Revision 2 incorporates minor changes to ensure the document setpoints accurately reflect limits (including exhaust stack flow of 800 scfm) established in OSD-T-151-00019. The MICON DCS software controls and monitors the instrumentation and equipment associated with plant systems and processes.
A database for TMT interface control documents
NASA Astrophysics Data System (ADS)
Gillies, Kim; Roberts, Scott; Brighton, Allan; Rogers, John
2016-08-01
The TMT Software System consists of software components that interact with one another through a software infrastructure called TMT Common Software (CSW). CSW consists of software services and library code that is used by developers to create the subsystems and components that participate in the software system. CSW also defines the types of components that can be constructed and their roles. The use of common component types and shared middleware services allows standardized software interfaces for the components. A software system called the TMT Interface Database System was constructed to support the documentation of the interfaces for components based on CSW. The programmer describes a subsystem and each of its components using JSON-style text files. A command interface file describes each command a component can receive and any commands a component sends. The event interface files describe status, alarms, and events a component publishes and status and events subscribed to by a component. A web application was created to provide a user interface for the required features. Files are ingested into the software system's database. The user interface allows browsing subsystem interfaces, publishing versions of subsystem interfaces, and constructing and publishing interface control documents that consist of the intersection of two subsystem interfaces. All published subsystem interfaces and interface control documents are versioned for configuration control and follow the standard TMT change control processes. Subsystem interfaces and interface control documents can be visualized in the browser or exported as PDF files.
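As a loose illustration of JSON-style interface descriptions and of an interface control document as the intersection of two subsystem interfaces (the field names here are invented and do not follow the actual TMT model files), consider:

```python
# Illustrative sketch: describe what one subsystem publishes and another
# subscribes to, then take the intersection as the shared interface. The
# JSON field names are invented and do not match the real TMT schema.
import json

publisher_json = """
{ "subsystem": "tcs",
  "publishes": { "events": ["tcs.pointing.demand", "tcs.tracking.state"] } }
"""

subscriber_json = """
{ "subsystem": "m1cs",
  "subscribes": { "events": ["tcs.pointing.demand", "tcs.offload.forces"] } }
"""

pub = json.loads(publisher_json)
sub = json.loads(subscriber_json)

shared = sorted(set(pub["publishes"]["events"]) & set(sub["subscribes"]["events"]))
print(f'ICD {pub["subsystem"]} -> {sub["subsystem"]}:')
for event in shared:
    print("  ", event)
```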
Managing computer-controlled operations
NASA Technical Reports Server (NTRS)
Plowden, J. B.
1985-01-01
A detailed discussion of Launch Processing System Ground Software Production is presented to establish the interrelationships of firing room resource utilization, configuration control, system build operations, and Shuttle data bank management. The production of a test configuration identifier is traced from requirement generation to program development. The challenge of the operational era is to implement fully automated utilities to interface with a resident system build requirements document to eliminate all manual intervention in the system build operations. Automatic update/processing of Shuttle data tapes will enhance operations during multi-flow processing.
2011-02-18
Fragment of a flattened control-plan table (columns include control limit, lower control limit, and reaction plan): (1) Complaints from other suppliers (synopsis, award); SCG; during award process; identify sole-source...parts; limits 0.0 / 1.0 / 0.0; reaction: evaluate complaint and, if valid, remove item from contract. (2) Tracking timeline for procurement/reviews; SCG; during pre-award process...review solicitation; 100.0; reaction: determine where the document stands in the approval process, adjust milestones and follow up. (3) FAR/DPAP guidance; SCG
2015-06-01
adequate documentation to substantiate transactions, and effective internal controls surrounding business processes along with the verification that...organization, such as its personnel, processes, and objectives. The internal auditing profession brings a composite of in-depth knowledge and best business...with internal auditors. Organizations should keep internal auditors abreast of changes in expectations as the business evolves. Doing so helps
NASA Technical Reports Server (NTRS)
Lowman, Douglas S.; Withers, B. Edward; Shagnea, Anita M.; Dent, Leslie A.; Hayhurst, Kelly J.
1990-01-01
A variety of instructions to be used in the development of software implementations for the Guidance and Control Software (GCS) project is described. This document fulfills the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines ('Software Considerations in Airborne Systems and Equipment Certification') requirements for document No. 4, which specifies the information necessary for understanding and programming the host computer, and document No. 12, which specifies the software design and implementation standards applicable to the software development and testing process. Information is included on the following subjects: activity recording, communication protocol, coding standards, change management, error handling, design standards, problem reporting, module testing logs, documentation formats, accuracy requirements, and programmer responsibilities.
Sensors for process control Focus Team report
NASA Astrophysics Data System (ADS)
At the Semiconductor Technology Workshop, held in November 1992, the Semiconductor Industry Association (SIA) convened 179 semiconductor technology experts to assess the 15-year outlook for the semiconductor manufacturing industry. The output of the Workshop, a document entitled 'Semiconductor Technology: Workshop Working Group Reports,' contained an overall roadmap for the technology characteristics envisioned in integrated circuits (IC's) for the period 1992-2007. In addition, the document contained individual roadmaps for numerous key areas in IC manufacturing, such as film deposition, thermal processing, manufacturing systems, exposure technology, etc. The SIA Report did not contain a separate roadmap for contamination free manufacturing (CFM). A key component of CFM for the next 15 years is the use of sensors for (1) defect reduction, (2) improved product quality, (3) improved yield, (4) improved tool utilization through contamination reduction, and (5) real time process control in semiconductor fabrication. The objective of this Focus Team is to generate a Sensors for Process Control Roadmap. Implicit in this objective is the identification of gaps in current sensor technology so that research and development activity in the sensor industry can be stimulated to develop sensor systems capable of meeting the projected roadmap needs. Sensor performance features of interest include detection limit, specificity, sensitivity, ease of installation and maintenance, range, response time, accuracy, precision, ease and frequency of calibration, degree of automation, and adaptability to in-line process control applications.
Double shell tanks (DST) chemistry control data quality objectives
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING, D.L.
2001-10-09
One of the main functions of the River Protection Project is to store the Hanford Site tank waste until the Waste Treatment Plant (WTP) is ready to receive and process the waste. Waste from the older single-shell tanks is being transferred to the newer double-shell tanks (DSTs). Therefore, the integrity of the DSTs must be maintained until the waste from all tanks has been retrieved and transferred to the WTP. To help maintain the integrity of the DSTs over the life of the project, specific chemistry limits have been established to control corrosion of the DSTs. These waste chemistry limits are presented in the Technical Safety Requirements (TSR) document HNF-SD-WM-TSR-006, Sec. 5.15, Rev 2B (CHG 2001). In order to control the chemistry in the DSTs, the Chemistry Control Program will require analyses of the tank waste. This document describes the Data Quality Objectives (DQO) process undertaken to ensure appropriate data will be collected to control the waste chemistry in the DSTs. The DQO process was implemented in accordance with Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1b, Vol. IV, Section 4.16 (Banning 2001), and the U.S. Environmental Protection Agency's EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994), with some modifications to accommodate project- or tank-specific requirements and constraints.
Auditing of chromatographic data.
Mabie, J T
1998-01-01
During a data audit, it is important to ensure that there is clear documentation and an audit trail. The Quality Assurance Unit should review all areas, including the laboratory, during the conduct of the sample analyses. The analytical methodology that is developed should be documented prior to sample analyses. This is an important document for the auditor, as it is the instrumental piece used by the laboratory personnel to maintain integrity throughout the process. It is expected that this document will give insight into the sample analysis, run controls, run sequencing, instrument parameters, and acceptance criteria for the samples. The sample analysis and all supporting documentation should be audited in conjunction with this written analytical method and any supporting Standard Operating Procedures to ensure the quality and integrity of the data.
HARV ANSER Flight Test Data Retrieval and Processing Procedures
NASA Technical Reports Server (NTRS)
Yeager, Jessie C.
1997-01-01
Under the NASA High-Alpha Technology Program the High Alpha Research Vehicle (HARV) was used to conduct flight tests of advanced control effectors, advanced control laws, and high-alpha design guidelines for future super-maneuverable fighters. The High-Alpha Research Vehicle is a pre-production F/A-18 airplane modified with a multi-axis thrust-vectoring system for augmented pitch and yaw control power and Actuated Nose Strakes for Enhanced Rolling (ANSER) to augment body-axis yaw control power. Flight testing at the Dryden Flight Research Center (DFRC) began in July 1995 and continued until May 1996. Flight data will be utilized to evaluate control law performance and aircraft dynamics, determine aircraft control and stability derivatives using parameter identification techniques, and validate design guidelines. To accomplish these purposes, essential flight data parameters were retrieved from the DFRC data system and stored on the Dynamics and Control Branch (DCB) computer complex at Langley. This report describes the multi-step task used to retrieve and process this data and documents the results of these tasks. Documentation includes software listings, flight information, maneuver information, time intervals for which data were retrieved, lists of data parameters and definitions, and example data plots.
Inspection of the Department's export licensing process for dual-use and munitions commodities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-08-10
The purpose of our inspection was to review the Department of Energy's (Energy) export licensing process for dual-use and military (munitions) commodities subject to nuclear nonproliferation controls. Specifically, we reviewed Energy's authorities, procedures, and policies pertaining to the export licensing process and examined procedures for safeguarding data transmitted between Energy and other agencies involved in the export licensing process. We also reviewed Energy's role as a member of the Subgroup on Nuclear Export Coordination. Our review of the sample of 60 export cases did not find evidence to lead us to believe that Energy's recommendations for these cases were inappropriate or incorrect. We identified, however, problems regarding management systems associated with the export license review process. We found that without documentation supporting export licensing decisions by the Export Control Operations Division (ECOD), we could not determine whether ECOD analysts considered all required criteria in their review of export cases referred to Energy. For example, we found that the ECOD did not retain records documenting the bases for its advice, recommendations, or decisions regarding its reviews of export license cases or revisions to lists of controlled commodities and, therefore, was not in compliance with certain provisions of the Export Administration Act, as amended, and Energy records management directives. Additionally, we found that the degree of compliance by Energy with the export licensing review criteria contained in the Export Administration Regulations and the Nuclear Non-Proliferation Act of 1978 could not be determined because ECOD did not retain records documenting the bases for its advice and recommendations on export cases.
Computer-Assisted Instruction: Authoring Languages. ERIC Digest.
ERIC Educational Resources Information Center
Reeves, Thomas C.
One of the most perplexing tasks in producing computer-assisted instruction (CAI) is the authoring process. Authoring is generally defined as the process of turning the flowcharts, control algorithms, format sheets, and other documentation of a CAI program's design into computer code that will operationalize the simulation on the delivery system.…
PROCEEDINGS: SEMINAR ON IN-STACK PARTICLE SIZING FOR PARTICULATE CONTROL DEVICE EVALUATION
The proceedings document discussions during an EPA/IERL-RTP-sponsored seminar on In-stack Particle Sizing for Particulate Control Device Evaluation. The seminar, organized by IERL-RTP's Process Measurements Branch, was held at IERL-RTP in North Carolina on December 3 and 4, 1975....
High-level waste tank farm set point document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony, J.A. III
1995-01-15
Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements need to be established for assuring that setpoints are established and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREA listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18) which includes the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope.
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.
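As an illustration of the executive-plus-three-functions structure described in the abstract above, here is a minimal Python sketch; the class, method, and parameter names are hypothetical and are not taken from the documented software.

class RoboticSimulationExecutive:
    """Hypothetical program executive dispatching to the three major functions."""

    def __init__(self):
        self.system = {}    # system parameters and manipulator configuration
        self.results = []   # outputs of analysis-tool executions

    def define_system(self, parameters):
        # System definition: handle user input of system parameters
        self.system.update(parameters)

    def run_analysis(self, tool):
        # Analysis tools: perform the computational work on the defined system
        self.results.append(tool(self.system))

    def post_process(self):
        # Post processing: examine the results of analysis-tool executions
        for result in self.results:
            print(result)

# Example usage with a trivial stand-in analysis tool
executive = RoboticSimulationExecutive()
executive.define_system({"joints": 6, "payload_kg": 10.0})
executive.run_analysis(lambda system: {"dof": system["joints"], "converged": True})
executive.post_process()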
Intelligent Software for System Design and Documentation
NASA Technical Reports Server (NTRS)
2002-01-01
In an effort to develop a real-time, on-line database system that tracks documentation changes in NASA's propulsion test facilities, engineers at Stennis Space Center teamed with ECT International of Brookfield, WI, through the NASA Dual-Use Development Program to create the External Data Program and Hyperlink Add-on Modules for the promis*e software. Promis*e is ECT's top-of-the-line intelligent software for control system design and documentation. With promis*e the user can make use of the automated design process to quickly generate control system schematics, panel layouts, bills of material, wire lists, terminal plans and more. NASA and its testing contractors currently use promis*e to create the drawings and schematics at the E2 Cell 2 test stand located at Stennis Space Center.
NASA Astrophysics Data System (ADS)
Beretta, Giordano
2007-01-01
The words in a document are often supported, illustrated, and enriched by visuals. When color is used, some of it is used to define the document's identity and is therefore strictly controlled in the design process. The result of this design process is a "color specification sheet," which must be created for every background color. While in traditional publishing there are only a few backgrounds, in variable data publishing a larger number of backgrounds can be used. We present an algorithm that nudges the colors in a visual to be distinct from a background while preserving the visual's general color character.
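A minimal sketch of the nudging idea in Python, using plain RGB Euclidean distance; the distance threshold and the color space are illustrative assumptions and do not reproduce the paper's algorithm, which also preserves the visual's overall color character.

import math

def nudge_color(color, background, min_distance=60.0):
    # Push a color away from the background until it is at least min_distance
    # away in RGB space (illustrative only; clamping may shorten the step).
    dist = math.dist(color, background)
    if dist >= min_distance:
        return color
    if dist == 0:
        direction = (1 / math.sqrt(3),) * 3   # identical colors: arbitrary direction
    else:
        direction = tuple((c - b) / dist for c, b in zip(color, background))
    nudged = (b + d * min_distance for b, d in zip(background, direction))
    return tuple(min(255.0, max(0.0, channel)) for channel in nudged)

print(nudge_color((120, 120, 130), (128, 128, 128)))   # nudged away from the gray background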
2013-04-12
statement of work. This document may be tailored by the acquisition activity for the specific application or program prior to contract award. Definitions and Acronyms: the following definitions describe terms used throughout this document. Acquisition Activity: the acquisition activity is the Government, contractor, or subcontractor acquiring the equipment, system, subsystem, part, or material for which this...
ISO 9000 and/or Systems Engineering Capability Maturity Model?
NASA Technical Reports Server (NTRS)
Gholston, Sampson E.
2002-01-01
For businesses and organizations to remain competitive today they must have processes and systems in place that will allow them to first identify customer needs and then develop products/processes that will meet or exceed the customers' needs and expectations. Customer needs, once identified, are normally stated as requirements. Designers can then develop products/processes that will meet these requirements. Several functions, such as quality management and systems engineering management, are used to assist product development teams in the development process. Both functions exist in all organizations and both have a similar objective, which is to ensure that developed processes will meet customer requirements. Are efforts in these organizations being duplicated? Are both functions needed by organizations? What are the similarities and differences between the functions listed above? ISO 9000 is an international quality standard for goods and services. It sets broad requirements for the assurance of quality and for management's involvement. It requires organizations to document their processes and to follow these documented processes. ISO 9000 gives customers assurance that suppliers have control of the process for product development. Systems engineering can broadly be defined as a discipline that seeks to ensure that all requirements for a system are satisfied throughout the life of the system by preserving their interrelationship. The key activities of systems engineering include requirements analysis, functional analysis/allocation, design synthesis and verification, and system analysis and control. The systems engineering process, when followed properly, will lead to higher quality products, lower cost products, and shorter development cycles. The Systems Engineering Capability Maturity Model (SE-CMM) will allow companies to measure their systems engineering capability and continuously improve those capabilities. ISO 9000 and SE-CMM seem to have a similar objective, which is to document the organization's processes and certify to potential customers the capability of a supplier to control the processes that determine the quality of the product or services being produced. The remaining sections of this report examine the differences and similarities between ISO 9000 and SE-CMM and make recommendations for implementation.
The potential of artificial aging for modelling of natural aging processes of ballpoint ink.
Weyermann, Céline; Spengler, Bernhard
2008-08-25
Artificial aging has been used to reproduce natural aging processes at an accelerated pace. Questioned documents were exposed to light or high temperature in a well-defined manner in order to simulate an increased age. This may be used to study the aging processes or to date documents by reproducing their aging curve. Ink was studied especially because it is deposited on the paper when a document, such as a contract, is produced. Once on the paper, aging processes start through degradation of dyes, drying of solvents, and polymerisation of resins. Modelling of dye and solvent aging was attempted. These processes, however, follow complex pathways, influenced by many factors which can be classified into three major groups: ink composition, paper type and storage conditions. The influence of these factors is such that different aging states can be obtained for an identical point in time. Storage conditions in particular are difficult to simulate, as they are dependent on environmental conditions (e.g. intensity and dose of light, temperature, air flow, humidity) and cannot be controlled in the natural aging of questioned documents. The problem therefore lies more in the variety of different conditions a questioned document might be exposed to during its natural aging, rather than in the simulation of such conditions in the laboratory. Nevertheless, a precise modelling of natural aging curves based on artificial aging curves is obtained when performed on the same paper and ink. A standard model for aging processes of ink on paper is therefore presented that is based on a fit of aging curves to a power law of solvent concentrations as a function of time. A mathematical transformation of artificial aging curves into modelled natural aging curves results in excellent overlap with data from real natural aging processes.
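To illustrate the kind of power-law description referred to above, a solvent-depletion curve and a time rescaling between artificial and natural aging might be written as follows (LaTeX); the symbols and the simple rescaling are assumptions made for exposition, not the authors' fitted model.

% Illustrative power-law aging model (assumed form)
\[
  C(t) = C_0 \, t^{-b}, \qquad b > 0,
\]
% where C(t) is the residual solvent concentration at age t and C_0, b are fit parameters.
% A transformation mapping an artificial aging curve onto a modelled natural one could
% then take the form of a time rescaling with an acceleration factor k:
\[
  C_{\mathrm{nat}}(t) = C_{\mathrm{art}}(k\,t).
\]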
Speech Recognition as a Transcription Aid: A Randomized Comparison With Standard Transcription
Mohr, David N.; Turner, David W.; Pond, Gregory R.; Kamath, Joseph S.; De Vos, Cathy B.; Carpenter, Paul C.
2003-01-01
Objective. Speech recognition promises to reduce information entry costs for clinical information systems. It is most likely to be accepted across an organization if physicians can dictate without concerning themselves with real-time recognition and editing; assistants can then edit and process the computer-generated document. Our objective was to evaluate the use of speech-recognition technology in a randomized controlled trial using our institutional infrastructure. Design. Clinical note dictations from physicians in two specialty divisions were randomized to either a standard transcription process or a speech-recognition process. Secretaries and transcriptionists also were assigned randomly to each of these processes. Measurements. The duration of each dictation was measured. The amount of time spent processing a dictation to yield a finished document also was measured. Secretarial and transcriptionist productivity, defined as hours of secretary work per minute of dictation processed, was determined for speech recognition and standard transcription. Results. Secretaries in the endocrinology division were 87.3% (confidence interval, 83.3%, 92.3%) as productive with the speech-recognition technology as implemented in this study as they were using standard transcription. Psychiatry transcriptionists and secretaries were similarly less productive. Author, secretary, and type of clinical note were significant (p < 0.05) predictors of productivity. Conclusion. When implemented in an organization with an existing document-processing infrastructure (which included training and interfaces of the speech-recognition editor with the existing document entry application), speech recognition did not improve the productivity of secretaries or transcriptionists. PMID:12509359
Lessons from New Zealand's introduction of pictorial health warnings on tobacco packaging.
Hoek, Janet; Wilson, Nick; Allen, Matthew; Edwards, Richard; Thomson, George; Li, Judy
2010-11-01
While international evidence suggests that featuring pictorial health warnings on tobacco packaging is an effective tobacco control intervention, the process used to introduce these new warnings has not been well documented. We examined relevant documents and interviewed officials responsible for this process in New Zealand. We found that, despite tobacco companies' opposition to pictorial health warnings and the resource constraints facing health authorities, the implementation process was generally robust and successful. Potential lessons for other countries planning to introduce or refresh existing pictorial health warnings include: (i) strengthening the link between image research and policy; (ii) requiring frequent image development and refreshment; (iii) using larger pictures (e.g. 80% of the front of the packet); (iv) developing themes that recognize concerns held by different smoker sub-groups; and (v) running integrated mass media campaigns when the warnings are introduced. All countries could also support moves by the World Health Organization Framework Convention on Tobacco Control's Secretariat to develop an international bank of copyright-free warnings.
Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti
2013-01-01
Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as fishbone diagrams and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates that support documentation and help follow current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. Specifically, the article shows how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.
This paper documents the process used by the United States Environmental Protection Agency (USEPA) to estimate the mean and standard deviation of data reported by in-control drinking water laboratories during Water Supply (WS) studies. This process is then applied to the data re...
This paper documents the process used by the United States Environmental Protection Agency (USEPA) to estimate the mean and standard deviation of data reported by in-control wastewater laboratories during Water Pollution (WP) studies. This process is then applied to the data rep...
Processes, Procedures, and Methods to Control Pollution Resulting from Silvicultural Activities.
ERIC Educational Resources Information Center
Environmental Protection Agency, Washington, DC. Office of Water Programs.
This report presents brief documentation of silvicultural practices, both those now in use and those in stages of research and development. A majority of the text is concerned with the specific aspects of silvicultural activities which relate to nonpoint source pollution control methods. Analyzed are existing and near future pollution control…
The planning and control of NASA programs and resources
NASA Technical Reports Server (NTRS)
1983-01-01
The major management systems used to plan and control NASA programs and resources are described as well as their integration to form the agency's general management approach in carrying out its mission. Documents containing more detailed descriptions of the processes and techniques involved in the agency's major management systems are listed.
Math Problems for Water Quality Control Personnel, Instructor's Manual. Second Edition.
ERIC Educational Resources Information Center
Delvecchio, Fred; Brutsch, Gloria
This document is the instructor's manual for a course in mathematics for water quality control personnel. It is structured so that a program may be designed for a specific facility. The problem structures are arranged alphabetically by treatment process. Charts, graphs and/or drawings representing familiar data forms contain the necessary information to…
Math Problems for Water Quality Control Personnel, Student Workbook. Second Edition.
ERIC Educational Resources Information Center
Delvecchio, Fred; Brutsch, Gloria
This document is the student workbook for a course in mathematics for water quality control personnel. This version contains complete problems, answers and references. Problems are arranged alphabetically by treatment process. Charts, graphs, and drawings represent data forms an operator might see in a plant containing information necessary for…
40 CFR 725.975 - EPA approval of alternative control measures.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) TOXIC SUBSTANCES CONTROL ACT REPORTING REQUIREMENTS AND REVIEW PROCESSES FOR MICROORGANISMS Additional... equivalency to EPA under this part must submit the request to EPA (via CDX) using e-PMN software. See 40 CFR 720.40(a)(2)(ii) for information on how to obtain e-PMN software. Support documents related to these...
Electrical Ground Support Equipment Fabrication, Specification for
NASA Technical Reports Server (NTRS)
Denson, Erik C.
2014-01-01
This document specifies parts, materials, and processes used in the fabrication, maintenance, repair, and procurement of electrical and electronic control and monitoring equipment associated with ground support equipment (GSE) at the Kennedy Space Center (KSC).
Railroad Classification Yard Technology Manual: Volume II : Yard Computer Systems
DOT National Transportation Integrated Search
1981-08-01
This volume (Volume II) of the Railroad Classification Yard Technology Manual documents the railroad classification yard computer systems methodology. The subjects covered are: functional description of process control and inventory computer systems,...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, Alan E.; Crow, Vernon L.; Payne, Deborah A.
Data visualization methods, data visualization devices, data visualization apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a data visualization method includes accessing a plurality of initial documents at a first moment in time, first processing the initial documents providing processed initial documents, first identifying a plurality of first associations of the initial documents using the processed initial documents, generating a first visualization depicting the first associations, accessing a plurality of additional documents at a second moment in time after the first moment in time, second processing the additional documents providing processed additional documents, second identifying a plurality of second associations of the additional documents and at least some of the initial documents, wherein the second identifying comprises identifying using the processed initial documents and the processed additional documents, and generating a second visualization depicting the second associations.
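A small Python sketch of the two-pass flow in the claim above (process initial documents, visualize their associations, then repeat with additional documents added); the tokenization and the shared-term notion of association are illustrative assumptions, not the patented method.

from collections import Counter
from itertools import combinations

def process(documents):
    # Illustrative "processing" step: reduce each document to a set of lowercase terms
    return [set(doc.lower().split()) for doc in documents]

def associations(processed):
    # Count shared terms between document pairs as a simple stand-in for associations
    pairs = Counter()
    for (i, a), (j, b) in combinations(enumerate(processed), 2):
        shared = len(a & b)
        if shared:
            pairs[(i, j)] = shared
    return pairs

def visualize(assoc, label):
    # Stand-in for generating a visualization: report the strongest associations
    print(label, assoc.most_common(3))

initial = ["process control documents", "document control procedures"]     # first moment in time
processed_initial = process(initial)
visualize(associations(processed_initial), "t1:")

additional = ["statistical process control charts"]                        # second moment in time
processed_all = processed_initial + process(additional)
visualize(associations(processed_all), "t2:")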
242A Distributed Control System Year 2000 Acceptance Test Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
TEATS, M.C.
1999-08-31
This report documents acceptance test results for the 242-A Evaporator distributed control system upgrade to D/3 version 9.0-2 for year 2000 compliance, obtained by acceptance testing as directed by procedure HNF-2695. This verification procedure will document the initial testing and evaluation of potential 242-A Distributed Control System (DCS) operating difficulties across the year 2000 boundary and the calendar adjustments needed for the leap year. Baseline system performance data will be recorded using current, as-is operating system software. Data will also be collected for operating system software that has been modified to correct year 2000 problems. This verification procedure is intended to be generic such that it may be performed on any D/3 (GSE Process Solutions, Inc.) distributed control system that runs with the VMS (Digital Equipment Corporation) operating system. This test may be run on simulation or production systems depending upon facility status. On production systems, DCS outages will occur nine times throughout performance of the test. These outages are expected to last about 10 minutes each.
Harnessing the Potential of Additive Manufacturing
2016-12-01
manufacturing age, which is dominated by standards for materials, processes, and process control. Conventional manufacturing is based upon a design that is documented either in a drawing or a computer-aided design (CAD) file. The manufacturing team then develops a documented public or private process for... (Bill Decker, Defense AT&L, November-December 2016)
Controlled electrostatic methodology for imaging indentations in documents.
Yaraskavitch, Luke; Graydon, Matthew; Tanaka, Tobin; Ng, Lay-Keow
2008-05-20
The electrostatic process for imaging indentations on documents using the ESDA device is investigated under controlled experimental settings. An in-house modified commercial xerographic developer housing is used to control the uniformity and volume of toner deposition, allowing for reproducible image development. Along with this novel development tool, an electrostatic voltmeter and fixed environmental conditions facilitate an optimization process. Sample documents are preconditioned in a humidity cabinet with microprocessor control, and the significant benefit of humidification above 70% RH on image quality is verified. Improving on the subjective methods of previous studies, image quality analysis is carried out in an objective and reproducible manner using the PIAS-II. For the seven commercial paper types tested, the optimum ESDA operating point is found to be at an electric potential near -400V at the Mylar surface; however, for most paper types, the optimum operating regime is found to be quite broad, spanning relatively small electric potentials between -200 and -550V. At -400V, the film right above an indented area generally carries a voltage which is 30-50V less negative than the non-indented background. In contrast with Seward's findings [G.H. Seward, Model for electrostatic imaging of forensic evidence via discharge through Mylar-paper path, J. Appl. Phys. 83 (3) (1998) 1450-1456; G.H. Seward, Practical implications of the charge transport model for electrostatic detection apparatus (ESDA), J. Forensic Sci. 44 (4) (1999) 832-836], a period of charge decay before image development is not required when operating in this optimal regime. A brief investigation of the role played by paper-to-paper friction during the indentation process is conducted using our optimized development method.
VLTI auxiliary telescopes: a full object-oriented approach
NASA Astrophysics Data System (ADS)
Chiozzi, Gianluca; Duhoux, Philippe; Karban, Robert
2000-06-01
The Very Large Telescope (VLT) Telescope Control Software (TCS) is a portable system. It is now in use or will be used in a whole family of ESO telescopes (VLT Unit Telescopes, VLTI Auxiliary Telescopes, NTT, La Silla 3.6, VLT Survey Telescope, and Astronomical Site Monitors in Paranal and La Silla). Although it has been developed making extensive usage of Object Oriented (OO) methodologies, the overall development process chosen at the beginning of the project used traditional methods. In order to guarantee a longer lifetime for the system (improving documentation and maintainability) and to prepare for future projects, we have introduced a full OO process. We have taken as a basis the Unified Software Development Process with the Unified Modeling Language (UML) and we have adapted the process to our specific needs. This paper describes how the process has been applied to the VLTI Auxiliary Telescopes Control Software (ATCS). The ATCS is based on the portable VLT TCS, but some subsystems are new or have specific characteristics. The complete process has been applied to the new subsystems, while reused code has been integrated in the UML models. We have used the ATCS on one side to tune the process and train the team members and on the other side to provide a UML and WWW based documentation for the portable VLT TCS.
[Computerized system validation of clinical researches].
Yan, Charles; Chen, Feng; Xia, Jia-lai; Zheng, Qing-shan; Liu, Daniel
2015-11-01
Validation is a documented process that provides a high degree of assurance that the computer system does exactly and consistently what it is designed to do, in a controlled manner, throughout its life cycle. The validation process begins with the system proposal/requirements definition, and continues through application and maintenance until system retirement and retention of the e-records based on regulatory rules. The objective is to clearly specify that each application of information technology fulfills its purpose. Computer system validation (CSV) is essential in clinical studies according to the GCP standard, meeting the product's pre-determined attributes of specifications, quality, safety and traceability. This paper describes how to perform the validation process and determine relevant stakeholders within an organization in the light of validation SOPs. Although a specific accountability in the implementation of the validation process might be outsourced, the ultimate responsibility for CSV remains on the shoulders of the business process owner-sponsor. In order to show that the compliance of the system validation has been properly attained, it is essential to set up comprehensive validation procedures and maintain adequate documentation as well as training records. Quality of the system validation should be controlled using both QC and QA means.
Supervised Gamma Process Poisson Factorization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Dylan Zachary
This thesis develops the supervised gamma process Poisson factorization (S-GPPF) framework, a novel supervised topic model for joint modeling of count matrices and document labels. S-GPPF is fully generative and nonparametric: document labels and count matrices are modeled under a unified probabilistic framework and the number of latent topics is controlled automatically via a gamma process prior. The framework provides for multi-class classification of documents using a generative max-margin classifier. Several recent data augmentation techniques are leveraged to provide for exact inference using a Gibbs sampling scheme. The first portion of this thesis reviews supervised topic modeling and several key mathematical devices used in the formulation of S-GPPF. The thesis then introduces the S-GPPF generative model and derives the conditional posterior distributions of the latent variables for posterior inference via Gibbs sampling. The S-GPPF is shown to exhibit state-of-the-art performance for joint topic modeling and document classification on a dataset of conference abstracts, beating out competing supervised topic models. The unique properties of S-GPPF along with its competitive performance make it a novel contribution to supervised topic modeling.
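To make the count-matrix factorization concrete, here is a small generative sketch of (unsupervised) gamma-Poisson factorization in Python with NumPy; the fixed truncation level, the hyperparameter values, and the omission of the gamma-process prior and the max-margin classifier are simplifying assumptions and do not reproduce S-GPPF itself.

import numpy as np

rng = np.random.default_rng(0)
D, V, K = 100, 500, 10          # documents, vocabulary size, truncated number of topics

# Gamma priors on document loadings and topic rates (hyperparameter values assumed)
theta = rng.gamma(shape=0.5, scale=1.0, size=(D, K))   # document-topic activations
beta = rng.gamma(shape=0.5, scale=1.0, size=(K, V))    # topic-word rates

# Each count y[d, v] is Poisson with rate sum_k theta[d, k] * beta[k, v]
rate = theta @ beta
counts = rng.poisson(rate)
print(counts.shape, counts.sum())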
Software control and system configuration management - A process that works
NASA Technical Reports Server (NTRS)
Petersen, K. L.; Flores, C., Jr.
1983-01-01
A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.
Poissant, Lise; Pereira, Jennifer; Tamblyn, Robyn; Kawasumi, Yuko
2005-01-01
A systematic review of the literature was performed to examine the impact of electronic health records (EHRs) on documentation time of physicians and nurses and to identify factors that may explain efficiency differences across studies. In total, 23 papers met our inclusion criteria; five were randomized controlled trials, six were posttest control studies, and 12 were one-group pretest-posttest designs. Most studies (58%) collected data using a time and motion methodology in comparison to work sampling (33%) and self-report/survey methods (8%). A weighted average approach was used to combine results from the studies. The use of bedside terminals and central station desktops saved nurses, respectively, 24.5% and 23.5% of their overall time spent documenting during a shift. Using bedside or point-of-care systems increased documentation time of physicians by 17.5%. In comparison, the use of central station desktops for computerized provider order entry (CPOE) was found to be inefficient, increasing the work time from 98.1% to 328.6% of physician's time per working shift (weighted average of CPOE-oriented studies, 238.4%). Studies that conducted their evaluation process relatively soon after implementation of the EHR tended to demonstrate a reduction in documentation time in comparison to the increases observed with those that had a longer time period between implementation and the evaluation process. This review highlighted that a goal of decreased documentation time in an EHR project is not likely to be realized. It also identified how the selection of bedside or central station desktop EHRs may influence documentation time for the two main user groups, physicians and nurses.
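The pooling step ("a weighted average approach was used to combine results") amounts to a calculation of the following kind; the ratios and weights below are invented for illustration and are not data from the review.

# Hypothetical study-level documentation-time ratios pooled by a weighted average
studies = [
    {"ratio": 0.755, "weight": 12},   # e.g. relative documentation time, weighted by sample size
    {"ratio": 0.765, "weight": 20},
    {"ratio": 1.175, "weight": 8},
]
pooled = sum(s["ratio"] * s["weight"] for s in studies) / sum(s["weight"] for s in studies)
print(f"weighted average ratio: {pooled:.3f}")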
TRACER - TRACING AND CONTROL OF ENGINEERING REQUIREMENTS
NASA Technical Reports Server (NTRS)
Turner, P. R.
1994-01-01
TRACER (Tracing and Control of Engineering Requirements) is a database/word processing system created to document and maintain the order of both requirements and descriptive material associated with an engineering project. A set of hierarchical documents are normally generated for a project whereby the requirements of the higher level documents levy requirements on the same level or lower level documents. Traditionally, the requirements are handled almost entirely by manual paper methods. The problem with a typical paper system, however, is that requirements written and changed continuously in different areas lead to misunderstandings and noncompliance. The purpose of TRACER is to automate the capture, tracing, reviewing, and managing of requirements for an engineering project. The engineering project still requires communications, negotiations, interactions, and iterations among people and organizations, but TRACER promotes succinct and precise identification and treatment of real requirements separate from the descriptive prose in a document. TRACER permits the documentation of an engineering project's requirements and progress in a logical, controllable, traceable manner. TRACER's attributes include the presentation of current requirements and status from any linked computer terminal and the ability to differentiate headers and descriptive material from the requirements. Related requirements can be linked and traced. The program also enables portions of documents to be printed, individual approval and release of requirements, and the tracing of requirements down into the equipment specification. Requirement "links" can be made "pending" and invisible to others until the pending link is made "binding". Individuals affected by linked requirements can be notified of significant changes with acknowledgement of the changes required. An unlimited number of documents can be created for a project and an ASCII import feature permits existing documents to be incorporated. TRACER can automatically renumber section headers when inserting or deleting sections of a document and generate sign-off forms for any approval process as well as a table of contents. TRACER was implemented on an IBM PC under PC-DOS. The program requires 640K RAM, a hard disk, and PC-DOS version 3.3 or higher. It was written in CLIPPER (Summer '87). TRACER is available on two 5.25 inch 1.2Mb MS-DOS format diskettes. The executable program is also provided with the distribution. TRACER is a copyrighted work with all copyright vested in the National Aeronautics and Space Administration. IBM PC and PC-DOS are registered trademarks of International Business Machines. CLIPPER is a trademark of Nantucket Corporation.
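The pending/binding link behavior described above can be sketched with a small Python example; the class and field names are hypothetical and do not represent TRACER's actual data model, which was a CLIPPER database application.

class Requirement:
    # Hypothetical requirement record with "pending" and "binding" links
    def __init__(self, req_id, text):
        self.req_id = req_id
        self.text = text
        self.links = {}                 # target requirement id -> "pending" or "binding"

    def link(self, target_id, binding=False):
        # Pending links remain invisible to others until they are made binding
        self.links[target_id] = "binding" if binding else "pending"

    def make_binding(self, target_id):
        self.links[target_id] = "binding"

    def visible_links(self):
        # Only binding links are traced and shown to affected individuals
        return [target for target, state in self.links.items() if state == "binding"]

parent = Requirement("SYS-001", "The vehicle shall provide yaw control power.")
child = Requirement("EQ-014", "The vanes shall deflect +/- 25 degrees.")
parent.link(child.req_id)              # pending, not yet visible
print(parent.visible_links())          # []
parent.make_binding(child.req_id)
print(parent.visible_links())          # ['EQ-014']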
Computer model of one-dimensional equilibrium controlled sorption processes
Grove, D.B.; Stollenwerk, K.G.
1984-01-01
A numerical solution to the one-dimensional solute-transport equation with equilibrium-controlled sorption and a first-order irreversible-rate reaction is presented. The computer code is written in the FORTRAN language, with a variety of options for input and output for user ease. Sorption reactions include Langmuir, Freundlich, and ion-exchange, with or without equal valence. General equations describing transport and reaction processes are solved by finite-difference methods, with nonlinearities accounted for by iteration. Complete documentation of the code, with examples, is included. (USGS)
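For reference, a commonly used form of the governing equation described in the abstract, written here assuming a linear equilibrium isotherm so that sorption appears as a retardation factor; the Langmuir, Freundlich, and ion-exchange options handled by the code replace this linear term with the corresponding nonlinear isotherm.

% One-dimensional advective-dispersive transport with equilibrium sorption and a
% first-order irreversible reaction (linear-isotherm form, for illustration):
\[
  R \,\frac{\partial C}{\partial t}
    = D \,\frac{\partial^{2} C}{\partial x^{2}}
    - v \,\frac{\partial C}{\partial x}
    - \lambda C,
  \qquad
  R = 1 + \frac{\rho_b K_d}{\theta},
\]
% where C is solute concentration, D the dispersion coefficient, v the pore-water velocity,
% lambda the first-order rate constant, rho_b the bulk density, K_d the distribution
% coefficient, and theta the porosity.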
National Historic Preservation Act (NHPA) Section 106 Process: Florence Copper, Inc. (FCI)
Collection of National Historic Preservation Act (NHPA) documents relating to Public Notice of Intent to Issue a Class III Underground Injection Control Area Permit for Florence Copper, Inc. (FCI) - Florence, AZ (closed)
The hormonal control of ejaculation.
Corona, Giovanni; Jannini, Emmanuele A; Vignozzi, Linda; Rastrelli, Giulia; Maggi, Mario
2012-09-01
Hormones regulate all aspects of male reproduction, from sperm production to sexual drive. Although emerging evidence from animal models and small clinical studies in humans clearly points to a role for several hormones in controlling the ejaculatory process, the exact endocrine mechanisms are unclear. Evidence shows that oxytocin is actively involved in regulating orgasm and ejaculation via peripheral, central and spinal mechanisms. Associations of delayed and premature ejaculation with hypothyroidism and hyperthyroidism, respectively, have also been extensively documented. Some models suggest that glucocorticoids are involved in the regulation of the ejaculatory reflex, but corresponding data from human studies are scant. Oestrogens regulate epididymal motility, whereas testosterone can affect the central and peripheral aspects of the ejaculatory process. Overall, the data on the endocrine system's role in regulating the ejaculatory reflex suggest that widely available endocrine therapies might be effective in treating sexual disorders in these men. Indeed, substantial evidence has documented that treatments of thyroid diseases are able to improve some ejaculatory difficulties.
21 CFR 820.40 - Document controls.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Document controls. 820.40 Section 820.40 Food and... QUALITY SYSTEM REGULATION Document Controls § 820.40 Document controls. Each manufacturer shall establish and maintain procedures to control all documents that are required by this part. The procedures shall...
21 CFR 820.40 - Document controls.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Document controls. 820.40 Section 820.40 Food and... QUALITY SYSTEM REGULATION Document Controls § 820.40 Document controls. Each manufacturer shall establish and maintain procedures to control all documents that are required by this part. The procedures shall...
21 CFR 820.40 - Document controls.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Document controls. 820.40 Section 820.40 Food and... QUALITY SYSTEM REGULATION Document Controls § 820.40 Document controls. Each manufacturer shall establish and maintain procedures to control all documents that are required by this part. The procedures shall...
21 CFR 820.40 - Document controls.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Document controls. 820.40 Section 820.40 Food and... QUALITY SYSTEM REGULATION Document Controls § 820.40 Document controls. Each manufacturer shall establish and maintain procedures to control all documents that are required by this part. The procedures shall...
21 CFR 820.40 - Document controls.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Document controls. 820.40 Section 820.40 Food and... QUALITY SYSTEM REGULATION Document Controls § 820.40 Document controls. Each manufacturer shall establish and maintain procedures to control all documents that are required by this part. The procedures shall...
NASA Technical Reports Server (NTRS)
1982-01-01
The format of the HDT-AM product which contains partially processed LANDSAT D and D Prime multispectral scanner image data is defined. Recorded-data formats, tape format, and major frame types are described.
Considerations for the Systematic Analysis and Use of Single-Case Research
ERIC Educational Resources Information Center
Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith
2012-01-01
Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…
Robot welding process control development task
NASA Technical Reports Server (NTRS)
Romine, Peter L.
1992-01-01
This report documents the completion of, and improvements made to, the software developed during 1990 for program maintenance on the PC and HEURIKON and for transfer to the CYRO, as well as the integration of the Rocketdyne vision software with the CYRO. The new programs were used successfully by NASA, Rocketdyne, and UAH technicians and engineers to create, modify, upload, download, and control CYRO NC programs.
Manufacture and quality control of interconnecting wire hardnesses, Volume 1
NASA Technical Reports Server (NTRS)
1972-01-01
A standard is presented for manufacture, installation, and quality control of eight types of interconnecting wire harnesses. The processes, process controls, and inspection and test requirements reflected are based on acknowledgment of harness design requirements, acknowledgment of harness installation requirements, identification of the various parts, materials, etc., utilized in harness manufacture, and formulation of a typical manufacturing flow diagram for identification of each manufacturing and quality control process, operation, inspection, and test. The document covers interconnecting wire harnesses defined in the design standard, including type 1, enclosed in fluorocarbon elastomer convolute tubing; type 2, enclosed in TFE convolute tubing lined with fiberglass braid; type 3, enclosed in TFE convolute tubing; and type 5, a combination of types 3 and 4. Knowledge gained through experience on the Saturn 5 program coupled with recent advances in techniques, materials, and processes was incorporated.
[Interpersonal relationship management in the nursing work process].
Urbanetto, Janete de Souza; Capella, Beatriz Beduschi
2004-01-01
This study addresses the problem of interpersonal relationships in the nurse's work process and is grounded in the theoretical framework of the work process and of the evolutionary stages of group relationships. Data were collected through an action-research and documentary method at two university hospitals in the South Region. Weaknesses faced by the controlling nurse were detected at all stages of the relationship process: ineffective mechanisms for including these professionals at work, no differentiation between the controlling position and other functions, relations emphasizing two-person contacts, and work-process evaluation mechanisms that were as inefficient as they were unsatisfactory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This document presents guidance for implementing the process that the U.S. Department of Energy (DOE) Office of Legacy Management (LM) will use for assuming perpetual responsibility for a closed uranium mill tailings site. The transition process specifically addresses sites regulated under Title II of the Uranium Mill Tailings Radiation Control Act (UMTRCA) but is applicable in principle to the transition of sites under other regulatory structures, such as the Formerly Utilized Sites Remedial Action Program.
Development of Evidence-Based Health Policy Documents in Developing Countries: A Case of Iran
Imani-Nasab, Mohammad Hasan; Seyedin, Hesam; Majdzadeh, Reza; Yazdizadeh, Bahareh; Salehi, Masoud
2014-01-01
Background: Evidence-based policy documents that are well developed by senior civil servants and are timely available can reduce the barriers to evidence utilization by health policy makers. This study examined the barriers and facilitators in developing evidence-based health policy documents from the perspective of their producers in a developing country. Methods: In a qualitative study with a framework analysis approach, we conducted semi-structured interviews using purposive and snowball sampling. A qualitative analysis software (MAXQDA-10) was used to apply the codes and manage the data. This study was theory-based and the results were compared to exploratory studies about the factors influencing evidence-based health policymaking. Results: 18 codes and three main themes of behavioral, normative, and control beliefs were identified. Factors that influence the development of evidence-based policy documents were identified by the participants: behavioral beliefs included quality of policy documents, use of resources, knowledge and innovation, being time-consuming and contextualization; normative beliefs included policy authorities, policymakers, policy administrators, and co-workers; and control beliefs included recruitment policy, performance management, empowerment, management stability, physical environment, access to evidence, policy making process, and effect of other factors. Conclusion: Most of the cited barriers to the development of evidence-based policy were related to control beliefs, i.e. barriers at the organizational and health system levels. This study identified the factors that influence the development of evidence-based policy documents based on the components of the theory of planned behavior. But in exploratory studies on evidence utilization by health policymakers, the identified factors were only related to control behaviors. This suggests that the theoretical approach may be preferable to the exploratory approach in identifying the barriers and facilitators of a behavior. PMID:24762343
Development of evidence-based health policy documents in developing countries: a case of Iran.
Imani-Nasab, Mohammad Hasan; Seyedin, Hesam; Majdzadeh, Reza; Yazdizadeh, Bahareh; Salehi, Masoud
2014-02-07
Evidence-based policy documents that are well developed by senior civil servants and are timely available can reduce the barriers to evidence utilization by health policy makers. This study examined the barriers and facilitators in developing evidence-based health policy documents from the perspective of their producers in a developing country. In a qualitative study with a framework analysis approach, we conducted semi-structured interviews using purposive and snowball sampling. A qualitative analysis software (MAXQDA-10) was used to apply the codes and manage the data. This study was theory-based and the results were compared to exploratory studies about the factors influencing evidence-based health policy-making. 18 codes and three main themes of behavioral, normative, and control beliefs were identified. Factors that influence the development of evidence-based policy documents were identified by the participants: behavioral beliefs included quality of policy documents, use of resources, knowledge and innovation, being time-consuming and contextualization; normative beliefs included policy authorities, policymakers, policy administrators, and co-workers; and control beliefs included recruitment policy, performance management, empowerment, management stability, physical environment, access to evidence, policy making process, and effect of other factors. Most of the cited barriers to the development of evidence-based policy were related to control beliefs, i.e. barriers at the organizational and health system levels. This study identified the factors that influence the development of evidence-based policy documents based on the components of the theory of planned behavior. But in exploratory studies on evidence utilization by health policymakers, the identified factors were only related to control behaviors. This suggests that the theoretical approach may be preferable to the exploratory approach in identifying the barriers and facilitators of a behavior.
NASA Technical Reports Server (NTRS)
2003-01-01
When NASA needed a real-time, online database system capable of tracking documentation changes in its propulsion test facilities, engineers at Stennis Space Center joined with ECT International, of Brookfield, Wisconsin, to create a solution. Through NASA's Dual-Use Program, ECT developed Exdata, a software program that works within the company's existing promis*e software. Exdata not only satisfied NASA's requirements, but also expanded ECT's commercial product line. Promis*e, ECT's primary product, is an intelligent software program with specialized functions for designing and documenting electrical control systems. An add-on to AutoCAD software, promis*e generates control system schematics, panel layouts, bills of material, wire lists, and terminal plans. The drawing functions include symbol libraries, macros, and automatic line breaking. Primary promis*e customers include manufacturing companies, utilities, and other organizations with complex processes to control.
HALE UAS Command and Control Communications: Step 1 - Functional Requirements Document. Version 4.0
NASA Technical Reports Server (NTRS)
2006-01-01
The High Altitude Long Endurance (HALE) unmanned aircraft system (UAS) communicates with an off-board pilot-in-command in all flight phases via the C2 data link, making it a critical component for the UA to fly in the NAS safely and routinely. This is a new requirement in current FAA communications planning and monitoring processes. This document provides a set of comprehensive C2 communications functional requirements and performance guidelines to help facilitate the future FAA certification process for civil UAS to operate in the NAS. The objective of the guidelines is to provide the ability to validate the functional requirements and, in the future, to serve as a basis for developing performance-level requirements.
Qualification Procedures for VHSIC/VLSI
1990-12-01
alternative approach for qualification of complex microcircuits. To address the technical issues related to a process-oriented qualification approach, the ... methodology of microcircuit process control to promote the United States to a position of supplying the highest quality and most reliable ... available resources. Coordinate document reviews with weekly and monthly status reviews on progress. Summarize results and collate into four basic...
PRE-QUALITY ASSURANCE PROJECT PLAN (QAPP) AGREEMENT (PQA) (HANDOUT)
The Land Remediation and Pollution Control Division (LRPCD) QA Manager strives to assist LRPCD researchers in developing functional planning documents for their research projects. As part of the planning process, several pieces of information are needed, including information re...
APPROACHING ZERO DISCHARGE IN SURFACE FINISHING
This document provides guidance to surface finishing manufacturers on control technologies and process changes for approaching zero discharge (AZD). AZD is a key theme underlying the Strategic Goals Program (SGP). The SGP is a cooperative effort between the EPA and the American El...
TECHNIQUES AND APPROACHES TO EVALUATE THE NATURAL ATTENUATION OF MTBE
Natural anaerobic biodegradation is the most important process controlling natural attenuation of MTBE along a flow path. However, natural biological degradation has been particularly difficult to document at field scale. Biodegradation of the BTEX compounds produces the same ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Petrie, L.M.; Westfall, R.M.
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.
CANISTER HANDLING FACILITY DESCRIPTION DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.F. Beesley
The purpose of this facility description document (FDD) is to establish requirements and associated bases that drive the design of the Canister Handling Facility (CHF), which will allow the design effort to proceed to license application. This FDD will be revised at strategic points as the design matures. This FDD identifies the requirements and describes the facility design, as it currently exists, with emphasis on attributes of the design provided to meet the requirements. This FDD is an engineering tool for design control; accordingly, the primary audience and users are design engineers. This FDD is part of an iterative design process. It leads the design process with regard to the flowdown of upper tier requirements onto the facility. Knowledge of these requirements is essential in performing the design process. The FDD follows the design with regard to the description of the facility. The description provided in this FDD reflects the current results of the design process.
Vits, Sabine; Dissemond, Joachim; Schadendorf, Dirk; Kriegler, Lisa; Körber, Andreas; Schedlowski, Manfred; Cesko, Elvir
2015-12-01
Placebo responses have been shown to affect the symptomatology of skin diseases. However, expectation-induced placebo effects on wound healing processes have not been investigated yet. We analysed whether subjects' expectation of receiving an active drug accelerates the healing process of experimentally induced wounds. In 22 healthy men (experimental group, n = 11; control group, n = 11) wounds were induced by ablative laser on both thighs. Using a deceptive paradigm, participants in the experimental group were informed that an innovative 'wound gel' was applied on one of the two wounds, whereas a 'non-active gel' was applied on the wound of the other thigh. In fact, both gels were identical hydrogels without any active components. A control group was informed to receive a non-active gel on both wounds. Progress in wound healing was documented via planimetry on days 1, 4 and 7 after wound induction. From day 9 onwards wound inspections were performed daily accompanied by a change of the dressing and a new application of the gel. No significant differences could be observed with regard to duration or process of wound healing, either by intraindividual or by interindividual comparisons. These data document no expectation-induced placebo effect on the healing process of experimentally induced wounds in healthy volunteers. © 2013 The Authors. International Wound Journal © 2013 Medicalhelplines.com Inc and John Wiley & Sons Ltd.
Payload/cargo processing at the launch site
NASA Technical Reports Server (NTRS)
Ragusa, J. M.
1983-01-01
Payload processing at Kennedy Space Center is described, with emphasis on payload contamination control. Support requirements are established after documentation of the payload. The processing facilities feature enclosed, environmentally controlled conditions, with account taken of the weather conditions, door openings, accessing the payload, industrial activities, and energy conservation. Apparatus are also available for purges after Orbiter landing. The payloads are divided into horizontal, vertical, mixed, and life sciences and Getaway Special categories, which determines the processing route through the facilities. A canister/transport system features sealed containers for moving payloads from one facility building to another. All payloads are exposed to complete Orbiter bay interface checkouts in a simulator before actually being mounted in the bay.
SLUDGE TREATMENT PROJECT KOP CONCEPTUAL DESIGN CONTROL DECISION REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
CARRO CA
2010-03-09
This control decision addresses the Knock-Out Pot (KOP) Disposition KOP Processing System (KPS) conceptual design. The KPS functions to (1) retrieve KOP material from canisters, (2) remove particles less than 600 µm in size and low density materials from the KOP material, (3) load the KOP material into Multi-Canister Overpack (MCO) baskets, and (4) stage the MCO baskets for subsequent loading into MCOs. Hazard and accident analyses of the KPS conceptual design have been performed to incorporate safety into the design process. The hazard analysis is documented in PRC-STP-00098, Knock-Out Pot Disposition Project Conceptual Design Hazard Analysis. The accident analysis is documented in PRC-STP-CN-N-00167, Knock-Out Pot Disposition Sub-Project Canister Over Lift Accident Analysis. Based on the results of these analyses, and analyses performed in support of MCO transportation and MCO processing and storage activities at the Cold Vacuum Drying Facility (CVDF) and Canister Storage Building (CSB), control decision meetings were held to determine the controls required to protect onsite and offsite receptors and facility workers. At the conceptual design stage, these controls are primarily defined by their safety functions. Safety significant structures, systems, and components (SSCs) that could provide the identified safety functions have been selected for the conceptual design. It is anticipated that some safety SSCs identified herein will be reclassified based on hazard and accident analyses performed in support of preliminary and detailed design.
NASA Astrophysics Data System (ADS)
Boehnlein, Thomas R.; Kramb, Victoria
2018-04-01
Proper formal documentation of computer-acquired NDE experimental data generated during research is critical to the longevity and usefulness of the data. Without documentation describing how and why the data was acquired, NDE research teams lose capabilities such as the ability to generate new information from previously collected data or to provide adequate information so that their work can be replicated by others seeking to validate their research. Despite the critical nature of this issue, NDE data is still being generated in research labs without appropriate documentation. By generating documentation in series with data, equal priority is given to both activities during the research process. One way to achieve this is to use a reactive documentation system (RDS). RDS prompts an operator to document the data as it is generated rather than relying on the operator to decide when and what to document. This paper discusses how such a system can be implemented in a dynamic environment made up of in-house and third party NDE data acquisition systems without creating additional burden on the operator. The reactive documentation approach presented here is agnostic enough that the principles can be applied to any operator-controlled, computer-based data acquisition system.
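To make the idea of documenting "in series with data" concrete, the sketch below wraps a data-acquisition call so that the operator is prompted for context the moment data is produced, and the answers are written to a metadata sidecar file. This is an illustrative Python sketch, not the RDS implementation described in the paper; the function names, prompts, and file layout are assumptions.

import json
import time
from pathlib import Path

def reactive_doc(acquire):
    """Wrap a data-acquisition call so documentation is captured in series with the data."""
    def wrapper(*args, **kwargs):
        data = acquire(*args, **kwargs)          # acquire the NDE data as usual
        # Prompt the operator immediately, while the context is still fresh.
        record = {
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
            "acquisition_kwargs": {k: repr(v) for k, v in kwargs.items()},
            "purpose": input("Why was this scan acquired? "),
            "setup_notes": input("Probe / fixture / settings notes: "),
        }
        sidecar = Path(f"scan_{int(time.time())}.meta.json")
        sidecar.write_text(json.dumps(record, indent=2))   # metadata stored next to the data
        return data, sidecar
    return wrapper

@reactive_doc
def acquire_ascan(gain_db=20):
    # Placeholder for a real instrument call; returns fake waveform samples here.
    return [0.0, 0.1, 0.3, 0.2]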
Apollo experience report: Systems and flight procedures development
NASA Technical Reports Server (NTRS)
Kramer, P. C.
1973-01-01
This report describes the process of crew procedures development used in the Apollo Program. The two major categories, Systems Procedures and Flight Procedures, are defined, as are the forms of documentation required. A description is provided of the operation of the procedures change control process, which includes the roles of man-in-the-loop simulations and the Crew Procedures Change Board. Brief discussions of significant aspects of the attitude control, computer, electrical power, environmental control, and propulsion subsystems procedures development are presented. Flight procedures are subdivided by mission phase: launch and translunar injection, rendezvous, lunar descent and ascent, and entry. Procedures used for each mission phase are summarized.
Zheng, Shuai; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A
2017-01-01
Background: Extracting structured data from narrated medical reports is challenged by the complexity of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to take user feedback for improving the extraction algorithm in real time. Objective: Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enables a dynamic interaction between a human and a machine that produces highly accurate results. Methods: A clinical information extraction system IDEAL-X has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedback to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Results: Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports—each combines history and physical report, discharge summary, outpatient clinic notes, outpatient clinic letter, and inpatient discharge medication report. Data extraction was performed by 3 methods: online machine learning, controlled vocabularies, and a combination of these. The system delivers results with F1 scores greater than 95%. Conclusions: IDEAL-X adopts a unique online machine learning–based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, thus it is highly adaptable. PMID:28487265
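The abstract describes an online-learning loop in which each user correction immediately updates the extraction model. The Python sketch below illustrates that general pattern with an incrementally trained classifier that scores report lines for a target field; it is not the IDEAL-X code, and the feature set, target field, and feedback interface are assumptions.

from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2**16, alternate_sign=False)
model = SGDClassifier()          # incremental learner; supports partial_fit
classes = [0, 1]                 # 1 = line contains the target field, 0 = it does not
trained = False

def predict_field_line(report_lines):
    """Return the line most likely to contain the target field (e.g. ejection fraction)."""
    if not trained:
        return report_lines[0]                   # no feedback yet: naive fallback
    scores = model.decision_function(vectorizer.transform(report_lines))
    return report_lines[int(scores.argmax())]

def record_feedback(line, contains_field):
    """Fold the user's confirmation or correction back into the model immediately."""
    global trained
    X = vectorizer.transform([line])
    model.partial_fit(X, [1 if contains_field else 0], classes=classes)
    trained = True

# Usage: predict on one report, let the user confirm or correct, then continue.
report = ["Patient: John Doe", "LVEF: 55%", "No wall motion abnormality"]
print(predict_field_line(report))          # naive guess before any feedback
record_feedback("LVEF: 55%", True)         # user marks the correct line
print(predict_field_line(report))          # subsequent predictions use the update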
ANALYTICAL METHOD CHECKLIST FOR VOLATILE ORGANIC COMPOUNDS BY GC/MS (HANDOUT)
The Land Remediation and Pollution Control Division (LRPCD) QA Manager strives to assist LRPCD researchers in developing functional planning documents for their research projects. As part of the planning process, several pieces of information are needed, including information re...
HYDRAULIC REDISTRIBUTION IN THE PACIFIC NORTHWEST: TWEAKING THE SYSTEM
Hydraulic redistribution (HR) has recently been documented in Pacific Northwest forests, but the controls governing this process and its importance to shallow-rooted species are poorly understood. Our objective in this study was to manipulate the soil-root system to tease apart ...
Manufacturing and quality control of interconnecting wire harnesses, Volume 2
NASA Technical Reports Server (NTRS)
1972-01-01
Interconnecting wire harnesses defined in the design standard are considered, including type 4, open bundle (not enclosed). Knowledge gained through experience on the Saturn 5 program coupled with recent advances in techniques, materials, and processes was incorporated into the document.
Microcomputers in the Anesthesia Library.
ERIC Educational Resources Information Center
Wright, A. J.
The combination of computer technology and library operation is helping to alleviate such library problems as escalating costs, increasing collection size, deteriorating materials, unwieldy arrangement schemes, poor subject control, and the acquisition and processing of large numbers of rarely used documents. Small special libraries such as…
FDA 2011 process validation guidance: lifecycle compliance model.
Campbell, Cliff
2014-01-01
This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.
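The third lifecycle stage, ongoing verification, amounts to routine statistical monitoring of commercial batches. As a minimal illustration (not taken from the guidance or from the article), the Python sketch below applies a simple Shewhart individuals chart to hypothetical assay results and flags a batch that drifts outside three-sigma limits; the data and the choice of seven baseline batches are assumptions.

import statistics

# Hypothetical assay results (% label claim) from consecutive commercial batches
batches = [99.2, 100.1, 99.8, 100.4, 99.6, 100.0, 99.9, 101.9, 100.2, 99.7]

baseline = batches[:7]                      # batches used to establish control limits
mean = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma

for i, value in enumerate(batches, start=1):
    state = "out of control" if not (lcl <= value <= ucl) else "in control"
    print(f"batch {i:2d}: {value:6.1f}  ({state})")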
Knowledge-based processing for aircraft flight control
NASA Technical Reports Server (NTRS)
Painter, John H.; Glass, Emily; Economides, Gregory; Russell, Paul
1994-01-01
This Contractor Report documents research in Intelligent Control using knowledge-based processing in a manner dual to methods found in the classic stochastic decision, estimation, and control discipline. Such knowledge-based control has also been called Declarative and Hybrid. Software architectures were sought, employing the parallelism inherent in modern object-oriented modeling and programming. The viewpoint adopted was that Intelligent Control employs a class of domain-specific software architectures having features common over a broad variety of implementations, such as management of aircraft flight, power distribution, etc. As much attention was paid to software engineering issues as to artificial intelligence and control issues. This research considered that particular processing methods from the stochastic and knowledge-based worlds are duals, that is, similar in a broad context. They provide architectural design concepts which serve as bridges between the disparate disciplines of decision, estimation, control, and artificial intelligence. This research was applied to the control of a subsonic transport aircraft in the airport terminal area.
Improving the Product Documentation Process of a Small Software Company
NASA Astrophysics Data System (ADS)
Valtanen, Anu; Ahonen, Jarmo J.; Savolainen, Paula
Documentation is an important part of the software process, even though it is often neglected in software companies. The eternal question is how much documentation is enough. In this article, we present a practical implementation of a lightweight product documentation process resulting from SPI efforts in a small company. Small companies’ financial and human resources are often limited. The documentation process described here offers a template for creating adequate documentation while consuming a minimal amount of resources. The key element of the documentation process is an open source web-based bug-tracking system that was customized to be used as a documentation tool. The use of the tool enables iterative and well-structured documentation. The solution best serves the needs of a small company with off-the-shelf software products that is striving for SPI.
Lightning Protection System for HE Facilities at LLNL - Certification Template
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clancy, T J; Ong, M M; Brown, C G
2005-12-08
This document is meant as a template to assist in the development of your own lightning certification process. Aside from this introduction and the mock representative name of the building (Building A), this document is nearly identical to a lightning certification report issued by the Engineering Directorate at Lawrence Livermore National Laboratory. At the date of this release, we have certified over 70 HE processing and storage cells at our Site 300 facilities. In Chapters 1 and 2 respectively, we address the need and methods of lightning certification for HE processing and storage facilities at LLNL. We present the preferred method of lightning protection in Chapter 3, as well as the likely building modifications that are needed to comply with this method. In Chapter 4, we present the threat assessment and resulting safe work areas within a cell. After certification, there may be changes to operations during a lightning alert, and this is discussed in Chapter 5. Chapter 6 lists the maintenance requirements for the continuation of lightning certification status. Appendices of this document are meant as an aid in developing your own certification process, and they include a bonding list, an inventory of measurement equipment, surge suppressors in use at LLNL, an Integrated Work and Safety form (IWS), and a template certification sign-off sheet. The lightning certification process involves more than what is spelled out in this document. The first steps involve considerable planning, the securing of funds, and management and explosives safety buy-in. Permits must be obtained, measurement equipment must be assembled and tested, and engineers and technicians must be trained in their use. Cursory building inspections are also recommended, and surge suppression for power systems must be addressed. Upon completion of a certification report and its sign-off by management, additional work is required. Training will be needed in order to educate workers and facility managers of the requirements of lightning certification. Operating procedures will need to be generated and/or modified with additional controls. Engineering controls may also be implemented requiring the modification of cells. Careful planning should bring most of these issues to light, making it clear where this document is helpful and where additional assistance may be necessary.
Authorized limits for Fernald copper ingots
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frink, N.; Kamboj, S.; Hensley, J.
This development document contains data and analysis to support the approval of authorized limits for the unrestricted release of 59 t of copper ingots containing residual radioactive material from the U.S. Department of Energy (DOE) Fernald Environmental Management Project (FEMP). The analysis presented in this document complies with the requirements of DOE Order 5400.5, "Radiation Protection of the Public and the Environment," as well as the requirements of the proposed promulgation of this order as 10 CFR Part 834. The document was developed following the step-by-step process described in the Draft Handbook for Controlling Release for Reuse or Recycle of Property Containing Residual Radioactive Material.
Praxis language reference manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, J.H.
1981-01-01
This document is a language reference manual for the programming language Praxis. The document contains the specifications that must be met by any compiler for the language. The Praxis language was designed for systems programming in real-time process applications. Goals for the language and its implementations are: (1) highly efficient code generated by the compiler; (2) program portability; (3) completeness, that is, all programming requirements can be met by the language without needing an assembler; and (4) separate compilation to aid in design and management of large systems. The language does not provide any facilities for input/output, stack and queue handling, string operations, parallel processing, or coroutine processing. These features can be implemented as routines in the language, using machine-dependent code to take advantage of facilities in the control environment on different machines.
LCS Content Document Application
NASA Technical Reports Server (NTRS)
Hochstadt, Jake
2011-01-01
My project at KSC during my spring 2011 internship was to develop a Ruby on Rails application to manage Content Documents. A Content Document is a collection of documents and information that describes what software is installed on a Launch Control System Computer. It's important for us to make sure the tools we use every day are secure, up-to-date, and properly licensed. Previously, keeping track of the information was done with Excel and Word files passed between different personnel. The goal of the new application is to be able to manage and access the Content Documents through a single database-backed web application. Our LCS team will benefit greatly from this app. Admins will be able to log in securely to keep track of and update the software installed on each computer in a timely manner. We also included export features, such as attaching additional documents that can be downloaded from the web application. The finished application will ease the process of managing Content Documents while streamlining the procedure. Ruby on Rails is a very powerful framework and I am grateful to have the opportunity to build this application.
Process Based on SysML for New Launchers System and Software Developments
NASA Astrophysics Data System (ADS)
Hiron, Emmanuel; Miramont, Philippe
2010-08-01
The purpose of this paper is to present the Astrium-ST engineering process based on SysML. This process is currently being set up in the frame of common CNES/Astrium-ST R&T studies related to the Ariane 5 electrical system and flight software modelling. The tool used to set up this process is Rhapsody release 7.3 from IBM [1]. This process focuses on the system engineering phase dedicated to software, with the objective of generating both system documents (sequential system design and flight control) and software specifications.
Canister Storage Building (CSB) Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
POWERS, T.B.
2000-03-16
This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each. The first list is a complete list of the hazard analysis team members that have been involved over the two-year long process. The second list is a subset of the first list and consists of those hazard analysis team members that reviewed and agreed to the final hazard analysis documentation. The material included in this report documents the final state of a nearly two-year long process involving formal facilitated group sessions and independent hazard and accident analysis work. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other controls required to protect the public, workers, and environment.
Software Development Of XML Parser Based On Algebraic Tools
NASA Astrophysics Data System (ADS)
Georgiev, Bozhidar; Georgieva, Adriana
2011-12-01
This paper presents the development and implementation of software based on an algebraic method for XML data processing, which accelerates the XML parsing process. The nontraditional approach proposed here for fast XML navigation with algebraic tools contributes to ongoing efforts to provide an easier, user-friendly API for XML transformations. The proposed software for XML document processing (a parser) is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach for searching and restructuring hierarchical XML data. This approach permits fast XML document processing, using an algebraic model developed in detail in previous works by the same authors. The proposed parsing mechanism is easily accessible to web consumers, who can control XML file processing, search for different elements (tags), and delete or add XML content. The various tests presented show higher speed and lower resource consumption in comparison with some existing commercial parsers.
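The user-facing operations the abstract describes, searching for elements, deleting them, and adding new content, can be illustrated with the Python standard library, as in the sketch below. This is not the authors' algebraic implementation; it only shows the kind of search, delete, and add operations on a small hypothetical document.

import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<catalog>"
    "  <book id='b1'><title>XML Basics</title></book>"
    "  <book id='b2'><title>Algebraic Parsing</title></book>"
    "</catalog>"
)

# Search: find every <title> element anywhere in the tree.
titles = [t.text for t in doc.iter("title")]

# Delete: remove the book with id='b1'.
for book in list(doc.findall("book")):
    if book.get("id") == "b1":
        doc.remove(book)

# Add: append a new book element with its own title child.
new_book = ET.SubElement(doc, "book", {"id": "b3"})
ET.SubElement(new_book, "title").text = "Hierarchical Restructuring"

print(titles)
print(ET.tostring(doc, encoding="unicode"))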
Role and interest of new technologies in data processing for space control centers
NASA Astrophysics Data System (ADS)
Denier, Jean-Paul; Caspar, Raoul; Borillo, Mario; Soubie, Jean-Luc
1990-10-01
The ways in which a multidisciplinary approach can improve space control centers are discussed. Electronic documentation, ergonomics of human-computer interfaces, natural language, intelligent tutoring systems and artificial intelligence systems are considered and applied in the study of the Hermes flight control center. It is concluded that such technologies are best integrated into a classical operational environment rather than taking a revolutionary approach which would involve a global modification of the system.
10 CFR 71.113 - Document control.
Code of Federal Regulations, 2013 CFR
2013-01-01
§ 71.113 Document control (Title 10, Energy). The licensee, certificate holder, and applicant for a CoC shall establish measures to control the issuance of documents such as instructions, procedures, and drawings, including...
10 CFR 71.113 - Document control.
Code of Federal Regulations, 2014 CFR
2014-01-01
§ 71.113 Document control (Title 10, Energy). The licensee, certificate holder, and applicant for a CoC shall establish measures to control the issuance of documents such as instructions, procedures, and drawings, including...
10 CFR 71.113 - Document control.
Code of Federal Regulations, 2010 CFR
2010-01-01
§ 71.113 Document control (Title 10, Energy). The licensee, certificate holder, and applicant for a CoC shall establish measures to control the issuance of documents such as instructions, procedures, and drawings, including...
Insect Pests of Field Crops. MP-28.
ERIC Educational Resources Information Center
Burkhardt, Chris C.
This document addresses the principles of field crop insect control through biological, mechanical, and chemical processes. Identification, life history, damage, pesticides, pesticide use and environmental considerations are presented for the major pests of corn, alfalfa, beans, small grains, sugar beets, and potatoes. Each section is accompanied…
Operational readiness review phase-1 final report for WRAP-1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowen, W., Westinghouse Hanford
1996-12-27
This report documents the Operational Readiness Review for WRAP-1 Phase-1 operations. The report includes all criteria and lines of inquiry, with the resulting Findings and Observations. The review included assessing the operational capability of the organization and the computer-controlled process and facility systems.
TRANSPORT AND FATE OF CONTAMINANTS IN THE SUBSURFACE
This publication is based on a series of Technology Transfer seminars that were conducted in 1987 and 1988. The document provides an overview of many of the issues associated with the physical, chemical and biological processes that control contaminant transport in the subsurfa...
DOT National Transportation Integrated Search
2002-04-01
This document catalogs the symbols presented with the various interfaces used by Federal Aviation Administration Airway Facilities specialists. It includes a high-level overview of each system and the symbols and coding conventions used. These data w...
Mduma, Estomih R; Ersdal, Hege; Kvaloy, Jan Terje; Svensen, Erling; Mdoe, Paschal; Perlman, Jeffrey; Kidanto, Hussein Lessio; Soreide, Eldar
2018-05-01
Objective: To trace and document smaller changes in perinatal survival over time. Design: Prospective observational study, with retrospective analysis. Setting: Labor ward and operating theater at Haydom Lutheran Hospital in rural north-central Tanzania. Participants: All women giving birth and birth attendants. Interventions: Helping Babies Breathe (HBB) simulation training on newborn care and resuscitation and some other efforts to improve perinatal outcome. Main outcome measures: Perinatal survival, including fresh stillbirths and early (24-h) newborn survival. Results: The variable life-adjusted plot and cumulative sum chart revealed a steady improvement in survival over time, after the baseline period. There were some variations throughout the study period, and some of these could be linked to different interventions and events. Conclusions: To our knowledge, this is the first time statistical process control methods have been used to document changes in perinatal mortality over time in a rural Sub-Saharan hospital, showing a steady increase in survival. These methods can be utilized to continuously monitor and describe changes in patient outcomes.
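For readers unfamiliar with these charts, the Python sketch below shows how a variable life-adjusted trajectory and a one-sided CUSUM can be accumulated from a chronological series of outcomes against a baseline mortality rate. The outcome series and baseline rate are hypothetical, and the calculations are unadjusted simplifications of the methods the study applied.

# outcomes: 1 = perinatal death, 0 = survival, in chronological order (hypothetical data)
outcomes = [0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]
baseline_mortality = 0.10   # expected death rate from the baseline period (assumed)

vlad, cusum = [], []
v = c = 0.0
for died in outcomes:
    v += baseline_mortality - died                 # VLAD: lives "saved" relative to expectation
    c = max(0.0, c + died - baseline_mortality)    # one-sided CUSUM: detects worsening mortality
    vlad.append(v)
    cusum.append(c)

print("VLAD trajectory:", [round(x, 2) for x in vlad])
print("CUSUM trajectory:", [round(x, 2) for x in cusum])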
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Farrell, T.; Hund, F.
1986-12-01
The document presents the technical rationale for best conventional technology (BCT) effluent limitations guidelines for the pharmaceutical manufacturing point-source category as required by the Clean Water Act of 1977 (P.L. 95-217, the Act). The document describes the technologies considered as the bases for BCT limitations. Section II of this document summarizes the rulemaking process. Sections III through V describe the technical data and engineering analyses used to develop the regulatory technology options. The costs and removals associated with each technology option for each plant and the application of the BCT cost test methodology are presented in Section VI. BCT limitations based on the best conventional pollutant control technology are to be achieved by existing direct-discharging facilities.
Reference Model for an Open Archival Information System
NASA Technical Reports Server (NTRS)
1997-01-01
This document is a technical report for use in developing a consensus on what is required to operate a permanent, or indefinite long-term, archive of digital information. It may be useful as a starting point for a similar document addressing the indefinite long-term preservation of non-digital information. This report establishes a common framework of terms and concepts which comprise an Open Archival Information System (OAIS). It allows existing and future archives to be more meaningfully compared and contrasted. It provides a basis for further standardization within an archival context, and it should promote greater vendor awareness of, and support of, archival requirements. Through the process of normal evolution, it is expected that expansion, deletion, or modification to this document may occur. This report is therefore subject to CCSDS document management and change control procedures.
Kaufman, David R; Sheehan, Barbara; Stetson, Peter; Bhatt, Ashish R; Field, Adele I; Patel, Chirag; Maisel, James Mark
2016-10-28
The process of documentation in electronic health records (EHRs) is known to be time-consuming, inefficient, and cumbersome. The use of dictation coupled with manual transcription has become an increasingly common practice. In recent years, natural language processing (NLP)-enabled data capture has become a viable alternative for data entry. It enables the clinician to maintain control of the process and potentially reduce the documentation burden. The question remains how this NLP-enabled workflow will impact EHR usability and whether it can meet the structured data and other EHR requirements while enhancing the user's experience. The objective of this study is to evaluate the comparative effectiveness of an NLP-enabled data capture method using dictation and data extraction from transcribed documents (NLP Entry) in terms of documentation time, documentation quality, and usability versus standard EHR keyboard-and-mouse data entry. This formative study investigated the results of using 4 combinations of NLP Entry and Standard Entry methods ("protocols") of EHR data capture. We compared a novel dictation-based protocol using MediSapien NLP (NLP-NLP) for structured data capture against a standard structured data capture protocol (Standard-Standard) as well as 2 novel hybrid protocols (NLP-Standard and Standard-NLP). The 31 participants included neurologists, cardiologists, and nephrologists. Participants generated 4 consultation or admission notes using 4 documentation protocols. We recorded the time on task, documentation quality (using the Physician Documentation Quality Instrument, PDQI-9), and usability of the documentation processes. A total of 118 notes were documented across the 3 subject areas. The NLP-NLP protocol required a median of 5.2 minutes per cardiology note, 7.3 minutes per nephrology note, and 8.5 minutes per neurology note compared with 16.9, 20.7, and 21.2 minutes, respectively, using the Standard-Standard protocol and 13.8, 21.3, and 18.7 minutes using the Standard-NLP protocol (1 of 2 hybrid methods). Using 8 out of 9 characteristics measured by the PDQI-9 instrument, the NLP-NLP protocol received a median quality score sum of 24.5; the Standard-Standard protocol received a median sum of 29; and the Standard-NLP protocol received a median sum of 29.5. The mean total score of the usability measure was 36.7 when the participants used the NLP-NLP protocol compared with 30.3 when they used the Standard-Standard protocol. In this study, the feasibility of an approach to EHR data capture involving the application of NLP to transcribed dictation was demonstrated. This novel dictation-based approach has the potential to reduce the time required for documentation and improve usability while maintaining documentation quality. Future research will evaluate the NLP-based EHR data capture approach in a clinical setting. It is reasonable to assert that EHRs will increasingly use NLP-enabled data entry tools such as MediSapien NLP because they hold promise for enhancing the documentation process and end-user experience. ©David R. Kaufman, Barbara Sheehan, Peter Stetson, Ashish R. Bhatt, Adele I. Field, Chirag Patel, James Mark Maisel. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 28.10.2016.
NASA Technical Reports Server (NTRS)
Plante, Jeannete
2010-01-01
GEIA-STD-0005-1 defines the objectives of, and requirements for, documenting processes that assure customers and regulatory agencies that AHP electronic systems containing lead-free solder, piece parts, and boards will satisfy the applicable requirements for performance, reliability, airworthiness, safety, and certifiability throughout the specified life of performance. It communicates requirements for a Lead-Free Control Plan (LFCP) to assist suppliers in the development of their own Plans. The Plan documents the Plan Owner's (supplier's) processes that assure their customer and all other stakeholders that the Plan Owner's products will continue to meet their requirements. The presentation reviews quality assurance requirements traceability and LFCP template instructions.
Enhancing diabetes management while teaching quality improvement methods.
Sievers, Beth A; Negley, Kristin D F; Carlson, Marny L; Nelson, Joyce L; Pearson, Kristina K
2014-01-01
Six medical units realized that they were having issues with accurate timing of bedtime blood glucose measurement for their patients with diabetes. They decided to investigate the issues by using their current staff nurse committee structure. The clinical nurse specialists and nurse education specialists decided to address the issue by educating and engaging the staff in the define, measure, analyze, improve, control (DMAIC) framework process. They found that two issues needed improvement: the timing of bedtime blood glucose measurement, and snack administration and documentation. Several educational interventions were completed and resulted in improved timing of bedtime glucose measurement and bedtime snack documentation. The nurses understood the DMAIC process, and collaboration and cohesion among the medical units was enhanced. Copyright 2014, SLACK Incorporated.
Fisher, Arielle M; Herbert, Mary I; Douglas, Gerald P
2016-02-19
The Birmingham Free Clinic (BFC) in Pittsburgh, Pennsylvania, USA is a free, walk-in clinic that serves medically uninsured populations through the use of volunteer health care providers and an on-site medication dispensary. The introduction of an electronic medical record (EMR) has improved several aspects of clinic workflow. However, pharmacists' tasks involving medication management and dispensing have become more challenging since EMR implementation due to its inability to support workflows between the medical and pharmaceutical services. To inform the design of a systematic intervention, we conducted a needs assessment study to identify workflow challenges and process inefficiencies in the dispensary. We used contextual inquiry to document the dispensary workflow and facilitate identification of critical aspects of intervention design specific to the user. Pharmacists were observed according to contextual inquiry guidelines. Graphical models were produced to aid data and process visualization. We created a list of themes describing workflow challenges and asked the pharmacists to rank them in order of significance to narrow the scope of intervention design. Three pharmacists were observed at the BFC. Observer notes were documented and analyzed to produce 13 themes outlining the primary challenges pharmacists encounter during dispensation at the BFC. The dispensary workflow is labor-intensive, redundant, and inefficient when integrated with the clinical service. Observations identified inefficiencies that may benefit from the introduction of informatics interventions, including medication labeling, insufficient process notification, triple documentation, and inventory control. We propose a system for Prescription Management and General Inventory Control (RxMAGIC). RxMAGIC is a framework designed to mitigate workflow challenges and improve the processes of medication management and inventory control. While RxMAGIC is described in the context of the BFC dispensary, we believe it will be generalizable to pharmacies in other low-resource settings, both domestically and internationally.
10 CFR 72.152 - Document control.
Code of Federal Regulations, 2013 CFR
2013-01-01
§ 72.152 Document control (Title 10, Energy). The licensee, applicant for a license, certificate holder, and applicant for a CoC shall establish measures to control the issuance of documents such as instructions, procedures, and drawings...
10 CFR 72.152 - Document control.
Code of Federal Regulations, 2014 CFR
2014-01-01
§ 72.152 Document control (Title 10, Energy). The licensee, applicant for a license, certificate holder, and applicant for a CoC shall establish measures to control the issuance of documents such as instructions, procedures, and drawings...
10 CFR 72.152 - Document control.
Code of Federal Regulations, 2010 CFR
2010-01-01
§ 72.152 Document control (Title 10, Energy). The licensee, applicant for a license, certificate holder, and applicant for a CoC shall establish measures to control the issuance of documents such as instructions, procedures, and drawings...
NASA Astrophysics Data System (ADS)
Dwisatyadini, M.; Hariyati, R. T. S.; Afifah, E.
2018-03-01
Nursing documentation is clinical information that has a vital role in nursing services. The nursing process includes assessment, diagnosis, intervention, implementation, and evaluation. The purpose of this study was to determine the effects of the application of SIMPRO on the completeness and the efficiency of nursing documentation in the outpatient installation at Dompet Dhuafa Hospital Parung. This study used a quantitative method with a pre-experimental (pre- and posttest without control group) design. The mean documentation completeness score before the application of SIMPRO was 1.87 (SD 0.922) and increased to 3.61 (SD 0.588) after SIMPRO was applied. This increase indicates an improvement in the completeness of nursing documentation after the implementation of SIMPRO. The mean time needed by nurses to document nursing care before the application of SIMPRO was 476.13 seconds (SD 78.896). The mean documentation time decreased by more than half after the application of SIMPRO, to 202.52 seconds (SD 196.723). SIMPRO made it easier for nurses to use decision analysis and decision support for the nursing care plan and documentation.
GSC configuration management plan
NASA Technical Reports Server (NTRS)
Withers, B. Edward
1990-01-01
The tools and methods used for the configuration management of the artifacts (including software and documentation) associated with the Guidance and Control Software (GCS) project are described. The GCS project is part of a software error studies research program. Three implementations of GCS are being produced in order to study the fundamental characteristics of the software failure process. The Code Management System (CMS) is used to track and retrieve versions of the documentation and software. Application of the CMS for this project is described and the numbering scheme is delineated for the versions of the project artifacts.
Flynn, Marilyn E.; Hart, Robert J.; Marzolf, G. Richard; Bowser, Carl J.
2001-01-01
The productivity of the trout fishery in the tailwater reach of the Colorado River downstream from Glen Canyon Dam depends on the productivity of lower trophic levels. Photosynthesis and respiration are basic biological processes that control productivity and alter pH and oxygen concentration. During 1998–99, data were collected to aid in the documentation of short- and long-term trends in these basic ecosystem processes in the Glen Canyon reach. Dissolved-oxygen, temperature, and specific-conductance profile data were collected monthly in the forebay of Glen Canyon Dam to document the status of water chemistry in the reservoir. In addition, pH, dissolved-oxygen, temperature, and specific-conductance data were collected at five sites in the Colorado River tailwater of Glen Canyon Dam to document the daily, seasonal, and longitudinal range of variation in water chemistry that could occur annually within the Glen Canyon reach.
Code of Federal Regulations, 2010 CFR
2010-07-01
§ 725.50 EPA review (Title 40, Protection of Environment; ...Requirements and Review Processes for Microorganisms, Administrative Procedures). (a) MCANs... Document Control Officer receives a complete submission, or the date EPA determines the submission is...
Code of Federal Regulations, 2011 CFR
2011-07-01
§ 725.50 EPA review (Title 40, Protection of Environment; ...Requirements and Review Processes for Microorganisms, Administrative Procedures). (a) MCANs... Document Control Officer receives a complete submission, or the date EPA determines the submission is...
Code of Federal Regulations, 2013 CFR
2013-07-01
§ 725.50 EPA review (Title 40, Protection of Environment; ...Requirements and Review Processes for Microorganisms, Administrative Procedures). (a) MCANs... Document Control Officer receives a complete submission, or the date EPA determines the submission is...
Code of Federal Regulations, 2014 CFR
2014-07-01
§ 725.50 EPA review (Title 40, Protection of Environment; ...Requirements and Review Processes for Microorganisms, Administrative Procedures). (a) MCANs... Document Control Officer receives a complete submission, or the date EPA determines the submission is...
Code of Federal Regulations, 2012 CFR
2012-07-01
§ 725.50 EPA review (Title 40, Protection of Environment; ...Requirements and Review Processes for Microorganisms, Administrative Procedures). (a) MCANs... Document Control Officer receives a complete submission, or the date EPA determines the submission is...
Software control and system configuration management: A systems-wide approach
NASA Technical Reports Server (NTRS)
Petersen, K. L.; Flores, C., Jr.
1984-01-01
A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.
E-documentation as a process management tool for nursing care in hospitals.
Rajkovic, Uros; Sustersic, Olga; Rajkovic, Vladislav
2009-01-01
Appropriate documentation plays a key role in process management in nursing care. It includes holistic data management based on the patient's data along the clinical path with regard to nursing care. We developed an e-documentation model that follows the process method of work in nursing care. It assesses the patient's status on the basis of Henderson's theoretical model of 14 basic living activities and is aligned with internationally recognized nursing classifications. E-documentation development requires reengineering of existing documentation and facilitates process reengineering. A prototype e-nursing documentation solution, which is already being tested at the university medical centres in Ljubljana and Maribor, is described.
User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org
Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.
2013-01-01
Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing including normalization have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating and harmonizing functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will allow improving data quality and reuse and adoption of standards. PMID:23620278
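ArrayAnalysis.org itself builds on R/Bioconductor, but one underlying QC idea, flagging arrays whose overall intensity distribution deviates from the rest of the experiment, can be sketched in a few lines of Python, as below. The intensities and the 3-MAD cutoff are illustrative assumptions, not the portal's actual algorithm.

import statistics

# Hypothetical log2 probe intensities per array (rows = arrays)
arrays = {
    "chip_1": [6.1, 7.0, 8.2, 6.8, 7.4],
    "chip_2": [6.0, 6.9, 8.1, 6.7, 7.3],
    "chip_3": [9.5, 10.2, 11.0, 9.8, 10.4],   # suspiciously bright array
}

medians = {name: statistics.median(vals) for name, vals in arrays.items()}
grand_median = statistics.median(medians.values())
mad = statistics.median(abs(m - grand_median) for m in medians.values())

for name, m in medians.items():
    flag = "FLAG" if mad > 0 and abs(m - grand_median) > 3 * mad else "ok"
    print(f"{name}: median intensity {m:.2f} -> {flag}")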
MODIS Information, Data, and Control System (MIDACS) system specifications and conceptual design
NASA Technical Reports Server (NTRS)
Han, D.; Salomonson, V.; Ormsby, J.; Ardanuy, P.; Mckay, A.; Hoyt, D.; Jaffin, S.; Vallette, B.; Sharts, B.; Folta, D.
1988-01-01
The MODIS Information, Data, and Control System (MIDACS) Specifications and Conceptual Design Document discusses system level requirements, the overall operating environment in which requirements must be met, and a breakdown of MIDACS into component subsystems, which include the Instrument Support Terminal, the Instrument Control Center, the Team Member Computing Facility, the Central Data Handling Facility, and the Data Archive and Distribution System. The specifications include sizing estimates for the processing and storage capacities of each data system element, as well as traffic analyses of data flows between the elements internally, and also externally across the data system interfaces. The specifications for the data system, as well as for the individual planning and scheduling, control and monitoring, data acquisition and processing, calibration and validation, and data archive and distribution components, do not yet fully specify the data system in the complete manner needed to achieve the scientific objectives of the MODIS instruments and science teams. The teams have not yet been formed; however, it was possible to develop the specifications and conceptual design based on the present concept of EosDIS, the Level-1 and Level-2 Functional Requirements Documents, the Operations Concept, and through interviews and meetings with key members of the scientific community.
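Sizing estimates of the kind mentioned above typically start from an instrument's average data rate and duty cycle. The short Python sketch below shows such a back-of-the-envelope calculation; the rate, duty cycle, and overhead factor are hypothetical placeholders, not MODIS or MIDACS figures.

# Hypothetical instrument parameters (not the actual MODIS figures)
data_rate_mbps = 10.0          # average science data rate, megabits per second
duty_cycle = 0.5               # fraction of each orbit the instrument is collecting
overhead = 1.15                # packetization / ancillary data overhead factor

seconds_per_day = 86_400
daily_volume_gbytes = data_rate_mbps * duty_cycle * overhead * seconds_per_day / 8 / 1000
archive_per_year_tbytes = daily_volume_gbytes * 365 / 1000

print(f"Daily raw volume ≈ {daily_volume_gbytes:.1f} GB")
print(f"Yearly archive growth ≈ {archive_per_year_tbytes:.2f} TB")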
Fosness, Ryan L.; Dietsch, Benjamin J.
2015-10-21
This report presents the surveying techniques and data-processing methods used to collect, process, and disseminate topographic and hydrographic data. All standard and non-standard data-collection methods, techniques, and data-processing methods were documented. Additional discussion describes the quality-assurance and quality-control elements used in this study, along with the limitations for the Torrinha-Itacoatiara study reach data. The topographic and hydrographic geospatial data are published along with associated metadata.
Gonzalez, Mariaelena; Green, Lawrence W; Glantz, Stanton A
2011-01-01
Objective: To analyse the models Philip Morris (PM) and British American Tobacco (BAT) used internally to understand tobacco control non-governmental organizations (NGOs) and their relationship to the global tobacco control policy-making process that resulted in the Framework Convention for Tobacco Control (FCTC). Methods: Analysis of internal tobacco industry documents in the Legacy Tobacco Document Library. Results: PM contracted with Mongoven, Biscoe, and Duchin, Inc. (MBD, a consulting firm specialising in NGO surveillance) as advisors. MBD argued that because NGOs are increasingly linked to epistemic communities, NGOs could insert themselves into the global policy-making process and influence the discourse surrounding the treaty-making process. MBD advised PM to insert itself into the policy-making process, mimicking NGO behaviour. BAT’s Consumer and Regulatory Affairs (CORA) department argued that global regulation emerged from the perception (by NGOs and governments) that the industry could not regulate itself, leading to BAT advocating social alignment and self-regulation to minimise the impact of the FCTC. Most efforts to block or redirect the FCTC failed. Conclusions: PM and BAT articulated a global policy-making environment in which NGOs are key, non-state stakeholders, and as a result, internationalised some of their previous national-level strategies. After both companies failed to prevent the FCTC, their strategies began to align. Multinational corporations have continued to successfully employ some of the strategies outlined in this paper at the local and national level while being formally excluded from ongoing FCTC negotiations at the global level. PMID:21636611
NASA Technical Reports Server (NTRS)
1981-01-01
The software package evaluation was designed to analyze commercially available, field-proven production control or manufacturing resource planning management technology and software packages. The analysis was conducted by comparing SRB production control software requirements and the conceptual system design to software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; request for information (RFI) document; RFI response rate and quality; RFI evaluation process; and capabilities versus requirements.
Analysis and control of the METC fluid bed gasifier. Quarterly progress report, January–March 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-03-01
This document summarizes work performed for the period 10/1/94 to 3/31/95. In this work, three components will form the basis for design of a control scheme for the Fluidized Bed Gasifier (FBG) at METC: (1) a control systems analysis based on simple linear models derived from process data, (2) a review of the literature on fluid bed gasifier operation and control, and (3) an understanding of present FBG operation and real-world considerations. Below we summarize work accomplished to date in each of these areas.
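Component (1), fitting simple linear models to process data, can be illustrated with an ordinary least-squares fit of a first-order ARX model to step-test data, as in the Python sketch below. The input/output series are fabricated for illustration and do not come from the FBG.

import numpy as np

# Hypothetical step-test data: u = air flow setpoint change, y = bed temperature deviation
u = np.array([0, 0, 1, 1, 1, 1, 1, 1, 1, 1], dtype=float)
y = np.array([0.0, 0.0, 0.0, 0.4, 0.68, 0.88, 1.02, 1.11, 1.18, 1.23])

# First-order ARX model: y[k] = a*y[k-1] + b*u[k-1], fitted by least squares
Phi = np.column_stack([y[:-1], u[:-1]])
a, b = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]

gain = b / (1 - a)                  # steady-state gain implied by the fitted model
print(f"a = {a:.3f}, b = {b:.3f}, steady-state gain ≈ {gain:.2f}")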
DOE Office of Scientific and Technical Information (OSTI.GOV)
Izmaylov, Alexandr V.; Babkin, Vladimir; Kurov, Valeriy
2009-10-07
The development of new or the upgrade of existing physical protection systems (PPS) for nuclear facilities involves a multi-step and multidimensional process. The process consists of conceptual design, design, and commissioning stages. The activities associated with each of these stages are governed by Russian government and agency regulations. To ensure a uniform approach to development or upgrading of PPS at Russian nuclear facilities, the development of a range of regulatory and methodological documents is necessary. Some issues of PPS development are covered by the regulatory documents developed by Rosatom, as well as other Russian agencies with nuclear facilities under their control. This regulatory development has been accomplished as part of the U.S.-Russian MPC&A cooperation or independently by the Russian Federation. While regulatory coverage is extensive, there are a number of issues such as vulnerability analysis, effectiveness assessment, upgrading PPS, and protection of information systems for PPS that require additional regulations be developed. This paper reports on the status of regulatory coverage for PPS development or upgrade, and outlines a new approach to regulatory document development. It describes the evolutionary process of regulatory development through experience gained in the design, development and implementation of PPS, as well as experience gained through the cooperative efforts of Russian and U.S. experts involved in the development of MPC&A regulations.
Regulatory constraints as seen from the pharmaceutical industry.
Galligani, G; David-Andersen, I; Fossum, B
2005-01-01
In Chile, Canada, Europe, Japan, and the USA, which are the main geographical areas for fish farming of high value fish such as salmonids, sea bass, sea bream, yellowtail and catfish, vaccination has been established as an important method for the prevention of infectious diseases. To make new vaccines available to the fish farming industry, pharmaceutical companies must comply with the regulatory framework for licensing of fish vaccines, which in recent years has become more regulated. Considerable scientific and regulatory skills are thus required to develop, document and license vaccines in accordance with the requirements in the different geographical areas. International co-operation to harmonise requirements for the licensing documentation is ongoing. Even though there are obvious benefits to the pharmaceutical industry from the harmonisation process, it may sometimes impose unreasonable requirements. The regulatory framework for fish vaccines clearly has an impact on the time for bringing a new fish vaccine to the market. Several hurdles need to be passed to complete the regulatory process, i.e. obtain a licence. Fulfilment of the rather detailed and extensive requirements for documentation of the production and controls, as well as safety and efficacy of the vaccine, represent a challenge to the pharmaceutical industry, as do the different national and regional licensing procedures. This paper describes regulatory constraints related to the documentation, the licensing process, the site of production and the continuing international harmonisation work, with emphasis on inactivated conventional fish vaccines.
Vijayakrishnan, Rajakrishnan; Steinhubl, Steven R.; Ng, Kenney; Sun, Jimeng; Byrd, Roy J.; Daar, Zahra; Williams, Brent A.; deFilippi, Christopher; Ebadollahi, Shahram; Stewart, Walter F.
2014-01-01
Background: The electronic health record contains a tremendous amount of data that if appropriately detected can lead to earlier identification of disease states such as heart failure (HF). Using a novel text and data analytic tool we explored the longitudinal EHR of over 50,000 primary care patients to identify the documentation of the signs and symptoms of HF in the years preceding its diagnosis. Methods and Results: Retrospective analysis consisting of 4,644 incident HF cases and 45,981 group-matched controls. Documentation of Framingham HF signs and symptoms within encounter notes was carried out using a previously validated natural language processing procedure. A total of 892,805 affirmed criteria were documented over an average observation period of 3.4 years. Among eventual HF cases, 85% had at least one criterion within a year prior to their HF diagnosis (as did 55% of controls). Substantial variability in the prevalence of individual signs and symptoms was found in both cases and controls. Conclusions: HF signs and symptoms are frequently documented in a primary care population as identified through automated text and data mining of EHRs. Their frequent identification demonstrates the rich data available within EHRs that will allow for future work on automated criterion identification to help develop predictive models for HF. PMID:24709663
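The criterion-detection step can be pictured as pattern matching with simple negation handling over encounter notes. The Python sketch below is only a toy illustration of that idea; the validated NLP procedure cited in the abstract is considerably more sophisticated, and the patterns, negation window, and example note here are assumptions.

import re

# A few illustrative Framingham criteria mapped to simple trigger patterns (not the validated pipeline)
CRITERIA_PATTERNS = {
    "paroxysmal nocturnal dyspnea": r"\bparoxysmal nocturnal dyspnea\b|\bPND\b",
    "ankle edema": r"\bankle (edema|swelling)\b",
    "rales": r"\brales\b|\bcrackles\b",
}
NEGATION = r"\b(no|denies|without)\b[^.]{0,40}"

def affirmed_criteria(note_text):
    """Return Framingham criteria affirmed (mentioned and not negated) in one encounter note."""
    found = []
    for name, pattern in CRITERIA_PATTERNS.items():
        for match in re.finditer(pattern, note_text, flags=re.IGNORECASE):
            preceding = note_text[max(0, match.start() - 40):match.start()]
            if not re.search(NEGATION + r"$", preceding, flags=re.IGNORECASE):
                found.append(name)
                break
    return found

note = "Patient reports PND and 2+ ankle swelling. No rales on exam."
print(affirmed_criteria(note))   # -> ['paroxysmal nocturnal dyspnea', 'ankle edema']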
Transitioning from Distributed and Traditional to Distributed and Agile: An Experience Report
NASA Astrophysics Data System (ADS)
Wildt, Daniel; Prikladnicki, Rafael
Global companies that have experienced extensive waterfall phased plans are trying to improve their existing processes to expedite team engagement. Agile methodologies have become an acceptable path to follow because they incorporate project management as part of their practices. Agile practices have been used with the objective of simplifying project control through simple processes, easy-to-update documentation, and greater team interaction over exhaustive documentation, focusing on continuous team improvement and aiming to add value to business processes. The purpose of this chapter is to describe the experience of a global multinational company in transitioning from distributed and traditional to distributed and agile. This company has development centers across North America, South America and Asia. This chapter covers challenges faced by the project teams of two pilot projects, including strengths of using agile practices in a globally distributed environment and practical recommendations for similar endeavors.
Schnabel, M; Mann, D; Efe, T; Schrappe, M; V Garrel, T; Gotzen, L; Schaeg, M
2004-10-01
The introduction of the German Diagnostic Related Groups (D-DRG) system requires redesigning administrative patient management strategies. Wrong coding leads to inaccurate grouping and endangers the reimbursement of treatment costs. This situation emphasizes the roles of documentation and coding as factors of economic success. The aims of this study were to assess the quantity and quality of initial documentation and coding (ICD-10 and OPS-301) and to find operative strategies to improve efficiency as well as strategic means to ensure optimal documentation and coding quality. In a prospective study, documentation and coding quality were evaluated in a standardized way by weekly assessment. Clinical data from 1385 inpatients were processed for initial correctness and quality of documentation and coding. Principal diagnoses were found to be accurate in 82.7% of cases, inexact in 7.1%, and wrong in 10.1%. Effects on financial returns occurred in 16%. Based on these findings, an optimized, interdisciplinary, and multiprofessional workflow for medical documentation, coding, and data control was developed. A workflow incorporating regular assessment of documentation and coding quality is required by the DRG system to ensure efficient accounting of hospital services. Interdisciplinary and multiprofessional cooperation is recognized to be an important factor in establishing an efficient workflow in medical documentation and coding.
19 CFR 201.16 - Service of process and other documents.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 3 2010-04-01 2010-04-01 false Service of process and other documents. 201.16... APPLICATION Initiation and Conduct of Investigations § 201.16 Service of process and other documents. (a) By..., the service of a process or other document of the Commission shall be served by anyone duly authorized...
19 CFR 210.7 - Service of process and other documents; publication of notices.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 3 2010-04-01 2010-04-01 false Service of process and other documents... § 210.7 Service of process and other documents; publication of notices. (a) Manner of service. (1) The service of process and all documents issued by or on behalf of the Commission or the administrative law...
Zheng, Shuai; Lu, James J; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A; Wang, Fusheng
2017-05-09
Extracting structured data from narrated medical reports is challenged by the complexity of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to take user feedback for improving the extraction algorithm in real time. Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enables a dynamic interaction between a human and a machine that produces highly accurate results. A clinical information extraction system, IDEAL-X, has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedback to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports, each combining a history and physical report, discharge summary, outpatient clinic notes, outpatient clinic letter, and inpatient discharge medication report. Data extraction was performed by 3 methods: online machine learning, controlled vocabularies, and a combination of these. The system delivers results with F1 scores greater than 95%. IDEAL-X adopts a unique online machine learning-based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, thus it is highly adaptable. ©Shuai Zheng, James J Lu, Nima Ghasemzadeh, Salim S Hayek, Arshed A Quyyumi, Fusheng Wang. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 09.05.2017.
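The published system is not available for inspection here, so the Python sketch below is only a schematic of the review-then-batch workflow the abstract describes: an online model predicts field values, the reviewer's corrections are fed back immediately, and once recent accuracy stays above a threshold the remaining reports are batch processed. The OnlineExtractor class and the accuracy bookkeeping are assumptions, not IDEAL-X code.

# Schematic sketch of an online-learning extraction workflow (not the IDEAL-X implementation).

class OnlineExtractor:
    """Hypothetical stand-in for an online-learning extraction model."""
    def predict(self, report_text):
        return {}                     # field -> value guesses
    def update(self, report_text, corrected_fields):
        pass                          # incorporate reviewer feedback immediately

def run_workflow(reports, reviewer, threshold=0.95, window=20):
    model, recent_hits = OnlineExtractor(), []
    remaining = list(reports)
    while remaining:
        report = remaining.pop(0)
        guess = model.predict(report)
        corrected = reviewer(report, guess)          # human confirms or fixes values
        model.update(report, corrected)              # model learns in real time
        recent_hits.append(guess == corrected)
        recent_hits = recent_hits[-window:]
        accuracy = sum(recent_hits) / len(recent_hits)
        if accuracy >= threshold and len(recent_hits) == window:
            # Reviewer-acceptable accuracy reached: batch process the rest unattended.
            return [model.predict(r) for r in remaining]
    return []

reports = ["CATH REPORT: LAD stenosis 70%", "CATH REPORT: normal coronaries", "CATH REPORT: RCA stenosis 40%"]
print(run_workflow(reports, reviewer=lambda report, guess: guess, threshold=0.5, window=2))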
Vogel, Markus; Kaisers, Wolfgang; Wassmuth, Ralf; Mayatepek, Ertan
2015-11-03
Clinical documentation has undergone a change due to the usage of electronic health records. The core element is to capture clinical findings and document therapy electronically. Health care personnel spend a significant portion of their time on the computer. Alternatives to self-typing, such as speech recognition, are currently believed to increase documentation efficiency and quality, as well as satisfaction of health professionals while accomplishing clinical documentation, but few studies in this area have been published to date. This study describes the effects of using a Web-based medical speech recognition system for clinical documentation in a university hospital on (1) documentation speed, (2) document length, and (3) physician satisfaction. Reports of 28 physicians were randomized to be created with (intervention) or without (control) the assistance of a Web-based system of medical automatic speech recognition (ASR) in the German language. The documentation was entered into a browser's text area and the time to complete the documentation including all necessary corrections, correction effort, number of characters, and mood of participant were stored in a database. The underlying time comprised text entering, text correction, and finalization of the documentation event. Participants self-assessed their moods on a scale of 1-3 (1=good, 2=moderate, 3=bad). Statistical analysis was done using permutation tests. The number of clinical reports eligible for further analysis stood at 1455. Out of 1455 reports, 718 (49.35%) were assisted by ASR and 737 (50.65%) were not assisted by ASR. Average documentation speed without ASR was 173 (SD 101) characters per minute, while it was 217 (SD 120) characters per minute using ASR. The overall increase in documentation speed through Web-based ASR assistance was 26% (P=.04). Participants documented an average of 356 (SD 388) characters per report when not assisted by ASR and 649 (SD 561) characters per report when assisted by ASR. Participants' average mood rating was 1.3 (SD 0.6) using ASR assistance compared to 1.6 (SD 0.7) without ASR assistance (P<.001). We conclude that medical documentation with the assistance of Web-based speech recognition leads to an increase in documentation speed, document length, and participant mood when compared to self-typing. Speech recognition is a meaningful and effective tool for the clinical documentation process.
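The abstract states that statistical analysis was done using permutation tests; a generic two-sample permutation test on the difference of group means, of the kind that could underlie such a comparison, is sketched below in Python. The characters-per-minute samples are invented for illustration, and the exact test used in the study may differ.

import random

def permutation_test(group_a, group_b, n_permutations=10000, seed=0):
    """Two-sided permutation test on the difference of group means."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    n_a, extreme = len(group_a), 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)                       # random relabeling of observations
        perm_a, perm_b = pooled[:n_a], pooled[n_a:]
        diff = abs(sum(perm_a) / len(perm_a) - sum(perm_b) / len(perm_b))
        if diff >= observed:
            extreme += 1
    return extreme / n_permutations               # approximate p-value

# Illustrative characters-per-minute samples (invented numbers, not study data).
asr = [210, 230, 195, 240, 220, 205]
typing = [160, 175, 180, 150, 170, 185]
print(permutation_test(asr, typing))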
Engineering and Safety Partnership Enhances Safety of the Space Shuttle Program (SSP)
NASA Technical Reports Server (NTRS)
Duarte, Alberto
2007-01-01
Project Management must use the risk assessment documents (RADs) as tools to support its decision-making process. Therefore, these documents have to be initiated, developed, and evolved parallel to the life of the project. Technical preparation and safety compliance of these documents require a great deal of resources. Updating these documents after the fact not only requires a substantial increase in resources (project cost), but this task is also not useful and perhaps an unnecessary expense. Hazard Reports (HRs), Failure Modes and Effects Analyses (FMEAs), Critical Items Lists (CILs), and the Risk Management process are, among others, within this category. A positive action resulting from a strong partnership between interested parties is one way to get these documents and the related processes and requirements released and updated in useful time. The Space Shuttle Program (SSP) at the Marshall Space Flight Center has implemented a process which is having positive results and gaining acceptance within the Agency. A hybrid panel, with equal interest and responsibilities for the two larger organizations, Safety and Engineering, is the focal point of this process. Called the Marshall Safety and Engineering Review Panel (MSERP), its charter (Space Shuttle Program Directive 110 F, April 15, 2005) and its Operating Control Plan emphasize the technical and safety responsibilities over the program risk documents: HRs; FMEA/CILs; engineering changes; anomaly/problem resolutions and corrective action implementations; and trend analysis. The MSERP has undertaken its responsibilities with objectivity, assertiveness, and dedication; has operated with focus; and has shown significant results and promising perspectives. The MSERP has been deeply involved in propulsion systems and integration, real-time technical issues, and other relevant reviews since its conception. These activities have transformed the propulsion MSERP into a truly participative and value-added panel, making a difference for the safety of the Space Shuttle Vehicle, its crew, and personnel. Because of the MSERP's valuable contribution to the assessment of safety risk for the SSP, this paper also proposes an enhanced panel concept that takes this successful partnership concept to a higher level of 'true partnership'. The proposed panel would be responsible for the review and assessment of all safety-related risk for new and future aerospace and related programs.
USDA-ARS?s Scientific Manuscript database
Soil temperature (Ts) exerts critical controls on hydrologic and biogeochemical processes but magnitude and nature of Ts variability in a landscape setting are rarely documented. Fiber optic distributed temperature sensing systems (FO-DTS) potentially measure Ts at high density over a large extent. ...
1990-01-01
This document contains summaries of fifteen of the well-known books which underlie the Total Quality Management philosophy. Members of the DCASR St Louis staff offer comments and opinions on how the authors have presented the quality concept in today's business environment. Keywords: TQM (Total Quality Management), Quality concepts, Statistical process control.
40 CFR 63.1036 - Alternative means of emission limitation: Batch processes.
Code of Federal Regulations, 2010 CFR
2010-07-01
... SOURCE CATEGORIES (CONTINUED) National Emission Standards for Equipment Leaks-Control Level 2 Standards... leaks. The owner or operator may switch among the alternatives provided the change is documented as... shall be pressure-tested for leaks before regulated material is first fed to the equipment and the...
40 CFR 63.775 - Reporting requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... operator of a glycol dehydration unit subject to this subpart that is exempt from the control requirements for glycol dehydration unit process vents in § 63.765, is exempt from all reporting requirements for....0 or higher and documentation stating why the TEG dehydration unit must operate using the alternate...
Breaking up the transcription logjam can improve cash flow.
Paulik, Dennis
2004-06-01
Using more than 20 transcription companies to handle its annual volume of 36 million lines, Health Midwest knew it had to gain control of the document transcription and delivery process. By centralizing its transcription service, the organization saved $600,000 and reduced days in accounts receivable by 10 days.
ERIC Educational Resources Information Center
Abbott, Anthony
1992-01-01
Discusses the electronic publishing activities of Meckler Publishing on the Internet, including a publications catalog, an electronic journal, and tables of contents databases. Broader issues of commercial network publishing are also addressed, including changes in the research process, changes in publishing, bibliographic control,…
The Lincoln Training System: A Summary Report.
ERIC Educational Resources Information Center
Butman, Robert C.; Frick, Frederick C.
The current status of the Lincoln Training System (LTS) is reported. This document describes LTS as a computer supported microfiche system which: 1) provides random access to voice quality audio and to graphics; 2) supports student-controlled interactive processes; and 3) functions in a variety of environments. The report offers a detailed…
12 CFR 7.5007 - Correspondent services.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Item processing services and related software; (f) Document control and record keeping through the use... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Correspondent services. 7.5007 Section 7.5007... Electronic Activities § 7.5007 Correspondent services. It is part of the business of banking for a national...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, D.E.
1996-09-01
This report provides a collection of annotated bibliographies for documents prepared under the Hanford High-Level Waste Vitrification (Plant) Program. The bibliographies are for documents from Fiscal Year 1983 through Fiscal Year 1995, and include work conducted at or under the direction of the Pacific Northwest National Laboratory. The bibliographies included focus on the technology developed over the specified time period for vitrifying Hanford pretreated high-level waste. The following subject areas are included: General Documentation; Program Documentation; High-Level Waste Characterization; Glass Formulation and Characterization; Feed Preparation; Radioactive Feed Preparation and Glass Properties Testing; Full-Scale Feed Preparation Testing; Equipment Materials Testing; Melter Performance Assessment and Evaluations; Liquid-Fed Ceramic Melter; Cold Crucible Melter; Stirred Melter; High-Temperature Melter; Melter Off-Gas Treatment; Vitrification Waste Treatment; Process, Product Control and Modeling; Analytical; and Canister Closure, Decontamination, and Handling.
Basic Electronic Design for Proposed NMSU Hitchhiker Payload
NASA Technical Reports Server (NTRS)
Horan, Stephen
2000-01-01
This document presents the basic hardware design developed by the EE 499 class during the spring semester of the 1999-2000 academic year. This design covers the electrical components to supply power to the experiments, the computer software and interfaces to control the experiments, and the ground data processing to provide an operator interface. This document is a follow-on to the Payload Mission description document and the System Requirements document developed during the EE 498 class during the fall semester. The design activities are broken down by functional area within the structure. For each area, we give the requirements that need to be met and the design to meet the requirements. For each of these areas, a prototype selection of hardware and/or software was done by the class and the components were assembled as part of the class to verify that they worked as intended.
NASA Technical Reports Server (NTRS)
Saiidi, M. J.; Duffy, R. E.; Mclaughlin, T. D.
1986-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis/Critical Items List (FMEA/CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The independent analysis results corresponding to the Orbiter Atmospheric Revitalization and Pressure Control Subsystem (ARPCS) are documented. The ARPCS hardware was categorized into the following subdivisions: (1) Atmospheric Make-up and Control (including the Auxiliary Oxygen Assembly, Oxygen Assembly, and Nitrogen Assembly); and (2) Atmospheric Vent and Control (including the Positive Relief Vent Assembly, Negative Relief Vent Assembly, and Cabin Vent Assembly). The IOA analysis process utilized available ARPCS hardware drawings and schematics for defining hardware assemblies, components, and hardware items. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode.
An Action-Based Fine-Grained Access Control Mechanism for Structured Documents and Its Application
Su, Mang; Li, Fenghua; Tang, Zhi; Yu, Yinyan; Zhou, Bo
2014-01-01
This paper presents an action-based fine-grained access control mechanism for structured documents. Firstly, we define a describing model for structured documents and analyze the application scenarios. The describing model could support the permission management on chapters, pages, sections, words, and pictures of structured documents. Secondly, based on the action-based access control (ABAC) model, we propose a fine-grained control protocol for structured documents by introducing temporal state and environmental state. The protocol, covering different stages from document creation to permission specification and usage control, is given using the Z-notation. Finally, we give the implementation of our mechanism and make comparisons between the existing methods and our mechanism. The result shows that our mechanism could provide a better solution for fine-grained access control of structured documents in complicated networks. Moreover, it is more flexible and practical. PMID:25136651
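The protocol itself is specified in Z-notation in the paper and is not reproduced in the abstract; the Python sketch below only illustrates the general idea of an action-based permission check that also consults temporal and environmental state. The Permission fields, the time-window and network checks, and the element naming scheme are hypothetical, not the authors' model.

from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class Permission:
    """Hypothetical fine-grained permission on a structured-document element."""
    subject: str          # user or role
    action: str           # e.g. "read", "annotate", "print"
    element: str          # e.g. "doc42/chapter3/section2"
    valid_from: time      # temporal state: allowed time window
    valid_until: time
    allowed_networks: tuple  # environmental state: where access may occur

def is_permitted(perms, subject, action, element, now=None, network="intranet"):
    now = (now or datetime.now()).time()
    for p in perms:
        if (p.subject == subject and p.action == action
                and element.startswith(p.element)          # permission covers nested parts
                and p.valid_from <= now <= p.valid_until
                and network in p.allowed_networks):
            return True
    return False

perms = [Permission("editor", "annotate", "doc42/chapter3",
                    time(8, 0), time(18, 0), ("intranet",))]
print(is_permitted(perms, "editor", "annotate", "doc42/chapter3/section2",
                   now=datetime(2024, 1, 1, 10, 30)))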
NASA Technical Reports Server (NTRS)
Balakrishna, S.; Kilgore, W. Allen
1992-01-01
The NASA Langley 0.3-m Transonic Cryogenic Tunnel is to be modified to operate with sulfur hexafluoride gas while retaining its present capability to operate with nitrogen. The modified tunnel will provide high Reynolds number flow on aerodynamic models with two different test gases. The document details a study of the SF6 tunnel performance boundaries, thermodynamic modeling of the tunnel process, nonlinear dynamic simulation of the math model to yield tunnel responses, the closed-loop control requirements, control laws, and mechanization of the control laws on the microprocessor-based controller.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sailer, S.J.
This Quality Assurance Project Plan (QAPJP) specifies the quality of data necessary and the characterization techniques employed at the Idaho National Engineering Laboratory (INEL) to meet the objectives of the Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP) Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) requirements. This QAPJP is written to conform with the requirements and guidelines specified in the QAPP and the associated documents referenced in the QAPP. This QAPJP is one of a set of five interrelated QAPJPs that describe the INEL Transuranic Waste Characterization Program (TWCP). Each of the five facilities participating in the TWCP has a QAPJP that describes the activities applicable to that particular facility. This QAPJP describes the roles and responsibilities of the Idaho Chemical Processing Plant (ICPP) Analytical Chemistry Laboratory (ACL) in the TWCP. Data quality objectives and quality assurance objectives are explained. Sample analysis procedures and associated quality assurance measures are also addressed; these include: sample chain of custody; data validation; usability and reporting; documentation and records; audits and assessments; laboratory QC samples; and instrument testing, inspection, maintenance and calibration. Finally, administrative quality control measures, such as document control, control of nonconformances, variances and QA status reporting are described.
Web-based X-ray quality control documentation.
David, George; Burnett, Lou Ann; Schenkel, Robert
2003-01-01
The department of radiology at the Medical College of Georgia Hospital and Clinics has developed an equipment quality control web site. Our goal is to provide immediate access to virtually all medical physics survey data. The web site is designed to assist equipment engineers, department management and technologists. By improving communications and access to equipment documentation, we believe productivity is enhanced. The creation of the quality control web site was accomplished in three distinct steps. First, survey data had to be placed in a computer format. The second step was to convert these various computer files to a format supported by commercial web browsers. Third, a comprehensive home page had to be designed to provide convenient access to the multitude of surveys done in the various x-ray rooms. Because we had spent years previously fine-tuning the computerization of the medical physics quality control program, most survey documentation was already in spreadsheet or database format. A major technical decision was the method of conversion of survey spreadsheet and database files into documentation appropriate for the web. After an unsatisfactory experience with a HyperText Markup Language (HTML) converter (packaged with spreadsheet and database software), we tried creating Portable Document Format (PDF) files using Adobe Acrobat software. This process preserves the original formatting of the document and takes no longer than conventional printing; therefore, it has been very successful. Although the PDF file generated by Adobe Acrobat is a proprietary format, it can be displayed through a conventional web browser using the freely distributed Adobe Acrobat Reader program that is available for virtually all platforms. Once a user installs the software, it is automatically invoked by the web browser whenever the user follows a link to a file with a PDF extension. Although no confidential patient information is available on the web site, our legal department recommended that we secure the site in order to keep out those wishing to make mischief. Our interim solution has been not to password-protect the page, which we feared would hinder access for occasional legitimate users, but simply not to provide links to it from other hospital and department pages. Utility and productivity were improved and time and money were saved by making radiological equipment quality control documentation instantly available on-line.
Sánchez, José; Guarnes, Miguel Ángel; Dormido, Sebastián
2009-01-01
This paper is an experimental study of the utilization of different event-based strategies for the automatic control of a simple but very representative industrial process: the level control of a tank. In an event-based control approach it is the triggering of a specific event, and not the time, that instructs the sensor to send the current state of the process to the controller, and the controller to compute a new control action and send it to the actuator. In the document, five control strategies based on different event-based sampling techniques are described, compared, and contrasted with a classical time-based control approach and a hybrid one. The common denominator in the time, the hybrid, and the event-based control approaches is the controller: a proportional-integral algorithm with adaptations depending on the selected control approach. To compare and contrast each one of the hybrid and the pure event-based control algorithms with the time-based counterpart, the two tasks that a control strategy must achieve (set-point following and disturbance rejection) are independently analyzed. The experimental study provides new proof concerning the ability of event-based control strategies to minimize the data exchange among the control agents (sensors, controllers, actuators) when an error-free control of the process is not a hard requirement. PMID:22399975
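As a rough illustration of the event-based idea described above, the Python sketch below simulates a tank level under a proportional-integral controller in which the sensor transmits, and the controller recomputes, only when the level has drifted by more than a deadband since the last transmission (a send-on-delta scheme). The tank model, gains, and deadband are invented and do not correspond to any of the paper's five specific strategies.

# Minimal send-on-delta event-based PI level control simulation (illustrative only).

def simulate(setpoint=1.0, steps=600, dt=0.1, delta=0.02, kp=2.0, ki=0.5):
    level, integral, u = 0.0, 0.0, 0.0
    last_sent = None
    events = 0
    for _ in range(steps):
        # Sensor side: transmit only when the level changed enough (send-on-delta).
        if last_sent is None or abs(level - last_sent) > delta:
            last_sent = level
            events += 1
            # Controller side: recompute the control action only on an event.
            error = setpoint - last_sent
            integral += error * dt
            u = max(0.0, kp * error + ki * integral)
        # Simple tank dynamics: inflow u, outflow proportional to level.
        level += dt * (u - 0.5 * level)
    return level, events

final_level, n_events = simulate()
print(f"final level {final_level:.3f} after {n_events} transmissions")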
Peres, Heloísa; Cruz, Diná; Tellez, Michelle; de Cássia Gengo E Silva, Rita; Ortiz, Diley; Diogo, Regina; Ortiz, Dóris R
2016-01-01
The aim of this study was to present the experience of a teaching hospital with the implementation of improvements to an electronic documentation system of the nursing process (PROCEnf-USP®). The improvements were based on the functional performance and technical quality of the system. The Scrum™ method was adopted for version control of PROCEnf-USP®, enabling agility, flexibility, and integration between development and users. PROCEnf-USP® has been used since 2009 in professional and academic environments. The current version generates reports and supports decisions about diagnoses, outcomes, and interventions. It provides indicators to monitor results and registration at the point of care. The establishment of important.
Design and Data Management System
NASA Technical Reports Server (NTRS)
Messer, Elizabeth; Messer, Brad; Carter, Judy; Singletary, Todd; Albasini, Colby; Smith, Tammy
2007-01-01
The Design and Data Management System (DDMS) was developed to automate the NASA Engineering Order (EO) and Engineering Change Request (ECR) processes at the Propulsion Test Facilities at Stennis Space Center for efficient and effective Configuration Management (CM). Prior to the development of DDMS, the CM system was a manual, paper-based system that required an EO or ECR submitter to walk the changes through the acceptance process to obtain the necessary approval signatures. This approval process could take up to two weeks and was subject to a variety of human errors. The process also required that the CM office make copies and distribute them to the Configuration Control Board members for review prior to meetings. At any point, there was a potential for an error or loss of the change records, meaning the configuration of record might not be accurate. The new Web-based DDMS eliminates unnecessary copies, reduces the time needed to distribute the paperwork, reduces the time to gain the necessary signatures, and prevents the variety of errors inherent in the previous manual system. After implementation of the DDMS, all EOs and ECRs can be automatically checked prior to submittal to ensure that the documentation is complete and accurate. Much of the configuration information can be documented in the DDMS through pull-down forms to ensure consistent entries by the engineers and technicians in the field. The software also can electronically route the documents through the signature process to obtain the necessary approvals needed for work authorization. The workflow of the system allows for backups and timestamps that determine the correct routing and completion of all required authorizations in a more timely manner, as well as assuring the quality and accuracy of the configuration documents.
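Purely as a conceptual illustration of the electronic signature routing described above (not the DDMS implementation), a minimal approval-routing loop might look like the Python sketch below; the approval chain, role names, and record fields are invented.

from datetime import datetime, timezone

# Hypothetical approval chain for an engineering change request (not DDMS itself).
APPROVAL_CHAIN = ["originator", "design engineer", "CM office", "configuration control board"]

def route_for_signatures(ecr, sign_off):
    """Route the ECR through each approver in order, recording timestamped signatures."""
    ecr.setdefault("signatures", [])
    for role in APPROVAL_CHAIN:
        approved = sign_off(role, ecr)      # callback: True to approve, False to reject
        ecr["signatures"].append({
            "role": role,
            "approved": approved,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
        if not approved:
            ecr["status"] = f"rejected by {role}"
            return ecr
    ecr["status"] = "approved for work authorization"
    return ecr

result = route_for_signatures({"id": "ECR-0001", "summary": "update valve torque spec"},
                              sign_off=lambda role, ecr: True)
print(result["status"])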
Sheehan, Barbara; Stetson, Peter; Bhatt, Ashish R; Field, Adele I; Patel, Chirag; Maisel, James Mark
2016-01-01
Background The process of documentation in electronic health records (EHRs) is known to be time consuming, inefficient, and cumbersome. The use of dictation coupled with manual transcription has become an increasingly common practice. In recent years, natural language processing (NLP)–enabled data capture has become a viable alternative for data entry. It enables the clinician to maintain control of the process and potentially reduce the documentation burden. The question remains how this NLP-enabled workflow will impact EHR usability and whether it can meet the structured data and other EHR requirements while enhancing the user’s experience. Objective The objective of this study is to evaluate the comparative effectiveness of an NLP-enabled data capture method using dictation and data extraction from transcribed documents (NLP Entry) in terms of documentation time, documentation quality, and usability versus standard EHR keyboard-and-mouse data entry. Methods This formative study investigated the results of using 4 combinations of NLP Entry and Standard Entry methods (“protocols”) of EHR data capture. We compared a novel dictation-based protocol using MediSapien NLP (NLP-NLP) for structured data capture against a standard structured data capture protocol (Standard-Standard) as well as 2 novel hybrid protocols (NLP-Standard and Standard-NLP). The 31 participants included neurologists, cardiologists, and nephrologists. Participants generated 4 consultation or admission notes using 4 documentation protocols. We recorded the time on task, documentation quality (using the Physician Documentation Quality Instrument, PDQI-9), and usability of the documentation processes. Results A total of 118 notes were documented across the 3 subject areas. The NLP-NLP protocol required a median of 5.2 minutes per cardiology note, 7.3 minutes per nephrology note, and 8.5 minutes per neurology note compared with 16.9, 20.7, and 21.2 minutes, respectively, using the Standard-Standard protocol and 13.8, 21.3, and 18.7 minutes using the Standard-NLP protocol (1 of 2 hybrid methods). Using 8 out of 9 characteristics measured by the PDQI-9 instrument, the NLP-NLP protocol received a median quality score sum of 24.5; the Standard-Standard protocol received a median sum of 29; and the Standard-NLP protocol received a median sum of 29.5. The mean total score of the usability measure was 36.7 when the participants used the NLP-NLP protocol compared with 30.3 when they used the Standard-Standard protocol. Conclusions In this study, the feasibility of an approach to EHR data capture involving the application of NLP to transcribed dictation was demonstrated. This novel dictation-based approach has the potential to reduce the time required for documentation and improve usability while maintaining documentation quality. Future research will evaluate the NLP-based EHR data capture approach in a clinical setting. It is reasonable to assert that EHRs will increasingly use NLP-enabled data entry tools such as MediSapien NLP because they hold promise for enhancing the documentation process and end-user experience. PMID:27793791
NASA Technical Reports Server (NTRS)
Watson, R. T.; Geller, M. A.; Stolarski, R. S.; Hampson, R. F.
1986-01-01
The state of knowledge of the upper atmosphere was assessed as of January 1986. The physical, chemical, and radiative processes which control the spatial and temporal distribution of ozone in the atmosphere; the predicted magnitude of ozone perturbations and climate changes for a variety of trace gas scenarios; and the ozone and temperature data used to detect the presence or absence of a long term trend were discussed. This assessment report was written by a small group of NASA scientists, was peer reviewed, and is based primarily on the comprehensive international assessment document entitled Atmospheric Ozone 1985: Assessment of Our Understanding of the Processes Controlling Its Present Distribution and Change, to be published as the World Meteorological Organization Global Ozone Research and Monitoring Project Report No. 16.
Lean Six Sigma Challenges and Opportunities
2008-02-13
five key stages: Define, Measure, Analyze, Improve, and Control (DMAIC). During the ‘Define’ stage, participants identify the process that will be...that wastes time and effort. The ‘Measure’ stage of DMAIC documents the measure of time or quantity of activities that occur at this stage of...chosen to be implemented. In the final ‘Control’ stage of the DMAIC methodology, the purpose is to ensure that benefits from the improved process
Controlling changes - lessons learned from waste management facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, B.M.; Koplow, A.S.; Stoll, F.E.
This paper discusses lessons learned about change control at the Waste Reduction Operations Complex (WROC) and Waste Experimental Reduction Facility (WERF) of the Idaho National Engineering Laboratory (INEL). WROC and WERF have developed and implemented change control and an as-built drawing process and have identified structures, systems, and components (SSCs) for configuration management. The operations have also formed an Independent Review Committee to minimize costs and resources associated with changing documents. WROC and WERF perform waste management activities at the INEL. WROC activities include storage, treatment, and disposal of hazardous and mixed waste. WERF provides volume reduction of solid low-level waste through compaction, incineration, and sizing operations. WROC and WERF's efforts aim to improve change control processes that have worked inefficiently in the past.
NASA Technical Reports Server (NTRS)
Braslow, A. L.
1999-01-01
The paper contains the following sections: Foreword; Preface; Laminar-Flow Control Concepts and Scope of Monograph; Early Research on Suction-Type Laminar-Flow Control (Research from the 1930s through the War Years; Research from after World War II to the Mid-1960s); Post X-21 Research on Suction-Type Laminar-Flow Control; Status of Laminar-Flow Control Technology in the Mid-1990s; Glossary; Document 1-Aeronautics Panel, AACB, R&D Review, Report of the Subpanel on Aeronautic Energy Conservation/Fuels; Document 2-Report of Review Group on X-21A Laminar Flow Control Program; Document 3-Langley Research Center Announcement, Establishment of Laminar Flow Control Working Group; Document 4-Intercenter Agreement for Laminar Flow Control Leading Edge Glove Flights, LaRC and DFRC; Document 5-Flight Report NLF-144 of the AFTI/F-111 Aircraft with the TACT Wing Modified by a Natural Laminar Flow Glove; Document 6-Flight Record, F-16XL Supersonic Laminar Flow Control Aircraft; Index; and About the Author.
Adaptive Algorithms for Automated Processing of Document Images
2011-01-01
Title of dissertation: Adaptive Algorithms for Automated Processing of Document Images. Mudit Agrawal, Doctor of Philosophy, 2011. Dissertation submitted to the Faculty of the Graduate School of the University
Saltcedar and Russian Olive Control Demonstration Act Science Assessment
Shafroth, Patrick B.; Brown, Curtis A.; Merritt, David M.
2010-01-01
The primary intent of this document is to provide the science assessment called for under The Saltcedar and Russian Olive Control Demonstration Act of 2006 (Public Law 109-320; the Act). A secondary purpose is to provide a common background for applicants for prospective demonstration projects, should funds be appropriated for this second phase of the Act. This document synthesizes the state-of-the-science on the following topics: the distribution and abundance (extent) of saltcedar (Tamarix spp.) and Russian olive (Elaeagnus angustifolia) in the Western United States, potential for water savings associated with controlling saltcedar and Russian olive and the associated restoration of occupied sites, considerations related to wildlife use of saltcedar and Russian olive habitat or restored habitats, methods to control saltcedar and Russian olive, possible utilization of dead biomass following removal of saltcedar and Russian olive, and approaches and challenges associated with revegetation or restoration following control efforts. A concluding chapter discusses possible long-term management strategies, needs for additional study, potentially useful field demonstration projects, and a planning process for on-the-ground projects involving removal of saltcedar and Russian olive.
Management: A continuing literature survey with indexes, March 1976
NASA Technical Reports Server (NTRS)
1976-01-01
Management is a compilation of references to selected reports, journal articles, and other documents on the subject of management. This publication lists 368 documents originally announced in the 1975 issues of Scientific and Technical Aerospace Reports (STAR) or International Aerospace Abstracts (IAA). It includes references on the management of research and development, contracts, production, logistics, personnel, safety, reliability and quality control. It also includes references on: program, project and systems management; management policy, philosophy, tools, and techniques; decisionmaking processes for managers; technology assessment; management of urban problems; and information for managers on Federal resources, expenditures, financing, and budgeting.
TOPEX Software Document Series. Volume 5; Rev. 1; TOPEX GDR Processing
NASA Technical Reports Server (NTRS)
Lee, Jeffrey; Lockwood, Dennis; Hancock, David W., III
2003-01-01
This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Geophysical Data Record (GDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.
NASA Technical Reports Server (NTRS)
Bartelson, D.
1984-01-01
The payload bay (PLB), its cargo, and the payload canister must satisfy the cleanliness requirements of visual clean (VC) level 1, 2, 3, or special as stated in NASA document SN-C-0005A. The specific level of cleanliness is chosen by the payload bay customer for their mission. During orbiter turnaround processing at KSC, the payload bay is exposed to the environments of the Orbiter Processing Facility (OPF) and the Payload Changeout Room (PCR). To support the orbiter payload bay/facility interface, the facility environment must be controlled and monitored to protect the cleanliness and environmental integrity of the payload bay and its cargo. Techniques used to meet environmental requirements during orbiter processing are introduced.
Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications
NASA Technical Reports Server (NTRS)
Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.
1990-01-01
The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure the system meets the design specification, has become a reasonably understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques which are being used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed, and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.
Kappen, Claudia
2016-01-01
The process of patterning along the anterior-posterior axis in vertebrates is highly conserved. The function of Hox genes in the axis patterning process is particularly well documented for bone development in the vertebral column and the limbs. We here show that Hoxb6, in skeletal elements at the cervico-thoracic junction, controls multiple independent aspects of skeletal pattern, implicating discrete developmental pathways as substrates for this transcription factor. In addition, we demonstrate that Hoxb6 function is subject to modulation by genetic factors. These results establish Hox-controlled skeletal pattern as a quantitative trait modulated by gene-gene interactions, and provide evidence that distinct modifiers influence the function of conserved developmental genes in fundamental patterning processes. PMID:26800342
Hung, Yu-Ting; Liu, Chi-Te; Peng, I-Chen; Hsu, Chin; Yu, Roch-Chui; Cheng, Kuan-Chen
2015-09-01
To ensure the safety of peanut butter ice cream manufacturing, a Hazard Analysis and Critical Control Point (HACCP) plan was designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined to be the pasteurization and freezing processes. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping were followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving production management. Copyright © 2015. Published by Elsevier B.V.
DOCU-TEXT: A tool before the data dictionary
NASA Technical Reports Server (NTRS)
Carter, B.
1983-01-01
DOCU-TEXT, a proprietary software package that aids in the production of documentation for a data processing organization and can be installed and operated only on IBM computers, is discussed. In organizing information that ultimately will reside in a data dictionary, DOCU-TEXT proved to be a useful documentation tool in extracting information from existing production jobs, procedure libraries, system catalogs, control data sets, and related files. DOCU-TEXT reads these files to derive data that is useful at the system level. The output of DOCU-TEXT is a series of user-selectable reports. These reports can reflect the interactions within a single job stream, a complete system, or all the systems in an installation. Any single report, or group of reports, can be generated in an independent documentation pass.
Spent Nuclear Fuel Project Configuration Management Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reilly, M.A.
This document is a rewrite of the draft "C" that was agreed to "in principle" by SNF Project level 2 managers on EDT 609835, dated March 1995 (not released). The implementation process philosophy was changed in keeping with the ongoing reengineering of the WHC Controlled Manuals to achieve configuration management within the SNF Project.
Saturn S-2 quality assurance techniques, critical process control. Volume 7: Metallic materials
NASA Technical Reports Server (NTRS)
Ross, W. D., Jr.
1970-01-01
The special skills developed during the Saturn S-2 Program are documented to enable qualified personnel to carry out efficient operations in future S-2 production. Skills covered include: acceptance testing of fusion-welding equipment, weld operator and inspector certification, machine certification, preweld operations, and repair weld certification.
SPC-Prep 1. Participant's Manual. Workplace Education. Project ALERT.
ERIC Educational Resources Information Center
Ruetz, Nancy
This is the participant's manual companion to the instructor's guide for a course designed to prepare employees for statistical process control (SPC) training given at their workplace by refreshing math skills and building the concepts and vocabulary necessary to understand SPC in manufacturing environments. SPC-Prep 1 addresses the math skills necessary to perform…
Aerodynamic Indices of Velopharyngeal Function in Childhood Apraxia of Speech
ERIC Educational Resources Information Center
Sealey, Linda R.; Giddens, Cheryl L.
2010-01-01
Childhood apraxia of speech (CAS) is characterized as a deficit in the motor processes of speech for the volitional control of the articulators, including the velum. One of the many characteristics attributed to children with CAS is intermittent or inconsistent hypernasality. The purpose of this study was to document differences in velopharyngeal…
Normalising the Breast: Early Childhood Services Battling the Bottle and the Breast
ERIC Educational Resources Information Center
Duncan, Judith; Bartle, Carol
2014-01-01
Normalising practices as a tool for controlling the body and bodily processes have been well-documented using Foucault's theories, including debates around breastfeeding. In this article we explore how the ideas of "normalisation" of the bottle-feeding culture of infants in New Zealand early childhood settings has become the accepted…
Application of Preventative Legal Considerations to the Alumni Affairs Administrator.
ERIC Educational Resources Information Center
Miles, Albert S.; Miller, Michael T.
Colleges and universities increasingly rely on fund raising activities as a major source of operating revenue, a process which is fraught with legal pitfalls. This document provides an overview of the legal considerations of fund raising for the alumni and development officer, focusing particularly on span of control considerations between alumni…
Demonstration for Social Change: An Experiment in Local Control.
ERIC Educational Resources Information Center
Gittell, Marilyn; And Others
This evaluative history of New York City's recent experiments in public education documents three projects: Ocean Hill-Brownsville, IS 201, and Two Bridges. This study employs a process-oriented and qualitative methodology which relies mostly on field observation and participant-observers. The Institute study observed the new community boards,…
SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE
NASA Technical Reports Server (NTRS)
Kleine, H.
1994-01-01
Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing is attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology. The SDDL syntax consists of keywords to invoke design structures and a collection of directives which control processor actions. The designer has complete control over the choice of keywords, commanding the capabilities of the processor in a way which is best suited to communicating the intent of the design. The SDDL processor translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, thereby freeing the designer's energy for the creative effort. Document formatting includes graphical highlighting of structure logic, accentuation of structure escapes and module invocations, logic error detection, and special handling of title pages and text segments. The SDDL generated document contains software design summary information including module invocation hierarchy, module cross reference, and cross reference tables of user selected words or phrases appearing in the document. The basic forms of the methodology are module and block structures and the module invocation statement. A design is stated in terms of modules that represent problem abstractions which are complete and independent enough to be treated as separate problem entities. Blocks are lower-level structures used to build the modules. Both kinds of structures may have an initiator part, a terminator part, an escape segment, or a substructure. The SDDL processor is written in PASCAL for batch execution on a DEC VAX series computer under VMS. SDDL was developed in 1981 and last updated in 1984.
NASA Technical Reports Server (NTRS)
Cecil, R. W.; White, R. A.; Szczur, M. R.
1972-01-01
The IDAMS Processor is a package of task routines and support software that performs convolution filtering, image expansion, fast Fourier transformation, and other operations on a digital image tape. A unique task control card for that program, together with any necessary parameter cards, selects each processing technique to be applied to the input image. A variable number of tasks can be selected for execution by including the proper task and parameter cards in the input deck. An executive maintains control of the run; it initiates execution of each task in turn and handles any necessary error processing.
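The IDAMS control-card syntax is not reproduced in the abstract, so the Python sketch below only illustrates the executive pattern it describes: an ordered list of selected tasks with parameters is dispatched one at a time, with error handling in the executive. The task names, parameter fields, and placeholder operations are hypothetical.

# Illustrative executive loop in the spirit of a task-card-driven image processor.

def convolution_filter(image, **params):
    return image            # placeholder for the real operation

def image_expansion(image, **params):
    return image

TASKS = {"FILTER": convolution_filter, "EXPAND": image_expansion}

def run_job(image, task_cards):
    """Execute each selected task in turn; report and skip tasks that fail."""
    for card in task_cards:
        task = TASKS.get(card["task"])
        if task is None:
            print(f"unknown task {card['task']!r}, skipped")
            continue
        try:
            image = task(image, **card.get("params", {}))
        except Exception as err:            # executive handles error processing
            print(f"task {card['task']} failed: {err}")
    return image

job = [{"task": "FILTER", "params": {"kernel": "3x3"}},
       {"task": "EXPAND", "params": {"factor": 2}}]
run_job(image=[[0, 1], [1, 0]], task_cards=job)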
GeoDeepDive: Towards a Machine Reading-Ready Digital Library and Information Integration Resource
NASA Astrophysics Data System (ADS)
Husson, J. M.; Peters, S. E.; Livny, M.; Ross, I.
2015-12-01
Recent developments in machine reading and learning approaches to text and data mining hold considerable promise for accelerating the pace and quality of literature-based data synthesis, but these advances have outpaced even basic levels of access to the published literature. For many geoscience domains, particularly those based on physical samples and field-based descriptions, this limitation is significant. Here we describe a general infrastructure to support published literature-based machine reading and learning approaches to information integration and knowledge base creation. This infrastructure supports rate-controlled automated fetching of original documents, along with full bibliographic citation metadata, from remote servers, the secure storage of original documents, and the utilization of considerable high-throughput computing resources for the pre-processing of these documents by optical character recognition, natural language parsing, and other document annotation and parsing software tools. New tools and versions of existing tools can be automatically deployed against original documents when they are made available. The products of these tools (text/XML files) are managed by MongoDB and are available for use in data extraction applications. Basic search and discovery functionality is provided by ElasticSearch, which is used to identify documents of potential relevance to a given data extraction task. Relevant files derived from the original documents are then combined into basic starting points for application building; these starting points are kept up-to-date as new relevant documents are incorporated into the digital library. Currently, our digital library contains more than 360K documents supplied by Elsevier and the USGS and we are actively seeking additional content providers. By focusing on building a dependable infrastructure to support the retrieval, storage, and pre-processing of published content, we are establishing a foundation for complex, and continually improving, information integration and data extraction applications. We have developed one such application, which we present as an example, and invite new collaborations to develop other such applications.
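As a purely hypothetical illustration of the discovery step described above, the Python snippet below posts a term query to an Elasticsearch _search endpoint and collects candidate document identifiers for a downstream extraction application; the host, index name, and field names are invented and do not describe the actual GeoDeepDive API.

import requests

# Hypothetical discovery query against an Elasticsearch index of pre-processed articles.
ES_URL = "http://localhost:9200/articles/_search"     # invented host and index name

def find_candidate_documents(term, size=50):
    """Return (doc_id, title) pairs for documents whose text mentions the term."""
    query = {"size": size, "query": {"match": {"text": term}}}
    response = requests.post(ES_URL, json=query, timeout=30)
    response.raise_for_status()
    hits = response.json()["hits"]["hits"]
    return [(h["_id"], h["_source"].get("title", "")) for h in hits]

for doc_id, title in find_candidate_documents("stromatolite"):
    print(doc_id, title)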
Consequences of Early Conductive Hearing Loss on Long-Term Binaural Processing.
Graydon, Kelley; Rance, Gary; Dowell, Richard; Van Dun, Bram
The aim of the study was to investigate the long-term effects of early conductive hearing loss on binaural processing in school-age children. One hundred and eighteen children participated in the study, 82 children with a documented history of conductive hearing loss associated with otitis media and 36 controls who had documented histories showing no evidence of otitis media or conductive hearing loss. All children were demonstrated to have normal hearing acuity and middle ear function at the time of assessment. The Listening in Spatialized Noise Sentence (LiSN-S) task and the masking level difference (MLD) task were used as the two different measures of binaural interaction ability. Children with a history of conductive hearing loss performed significantly poorer than controls on all LiSN-S conditions relying on binaural cues (DV90, p < 0.001, and SV90, p = 0.003). No significant difference was found between the groups in listening conditions without binaural cues. Fifteen children with a conductive hearing loss history (18%) showed results consistent with a spatial processing disorder. No significant difference was observed between the conductive hearing loss group and the controls on the MLD task. Furthermore, no correlations were found between LiSN-S and MLD. Results show a relationship between early conductive hearing loss and listening deficits that persist once hearing has returned to normal. Results also suggest that the two binaural interaction tasks (LiSN-S and MLD) may be measuring binaural processing at different levels. Findings highlight the need for a screening measure of functional listening ability in children with a history of early otitis media.
Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice
2009-02-01
To apply the Hazard Analysis and Critical Control Points method to the preparation of anti-cancer drugs. To identify critical control points in our cancer chemotherapy process and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of the process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity and their detectability. The team defined monitoring, control measures and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each nonconformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high-risk index. Over 10 months, 16,647 preparations were performed; 1225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps, which can have a critical influence on product quality, and led us to improve our process.
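The abstract says risk indexes were calculated from frequency of occurrence, severity, and detectability; one common way to combine these is a multiplicative score, sketched below in Python. The 1-to-5 scales, the example control points, and the cut-off for flagging higher-importance points are invented for illustration and are not the team's actual scoring.

# Illustrative risk-index ranking for critical control points (invented scores).

def risk_index(frequency, severity, detectability):
    """Multiplicative risk index on hypothetical 1 (best) to 5 (worst) scales."""
    for score in (frequency, severity, detectability):
        if not 1 <= score <= 5:
            raise ValueError("scores must be between 1 and 5")
    return frequency * severity * detectability

control_points = {
    "dose calculation transcription": (3, 5, 4),
    "aseptic transfer into final container": (2, 5, 3),
    "label check before release": (4, 4, 2),
}

ranked = sorted(control_points.items(),
                key=lambda item: risk_index(*item[1]), reverse=True)
for name, scores in ranked:
    idx = risk_index(*scores)
    flag = "HIGH" if idx >= 40 else "routine"
    print(f"{name}: risk index {idx} ({flag})")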
Software development to support sensor control of robot arc welding
NASA Technical Reports Server (NTRS)
Silas, F. R., Jr.
1986-01-01
The development of software for a Digital Equipment Corporation MINC-23 Laboratory Computer to provide functions of a workcell host computer for Space Shuttle Main Engine (SSME) robotic welding is documented. Routines were written to transfer robot programs between the MINC and an Advanced Robotic Cyro 750 welding robot. Other routines provide advanced program editing features while additional software allows communication with a remote computer aided design system. Access to special robot functions was provided to allow advanced control of weld seam tracking and process control for future development programs.
Review of nuclear pharmacy practice in hospitals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kawada, T.K.; Tubis, M.; Ebenkamp, T.
1982-02-01
An operational profile for nuclear pharmacy practice is presented, and the technical and professional role of nuclear pharmacists is reviewed. Key aspects of nuclear pharmacy practice in hospitals discussed are the basic facilities and equipment for the preparation, quality control, and distribution of radioactive drug products. Standards for receiving, storing, and processing radioactive material are described. The elements of a radiopharmaceutical quality assurance program, including the working procedures, documentation systems, data analysis, and specific control tests, are presented. Details of dose preparation and administration and systems of inventory control for radioactive products are outlined.
WFF TOPEX Software Documentation Altimeter Instrument File (AIF) Processing, October 1998. Volume 3
NASA Technical Reports Server (NTRS)
Lee, Jeffrey; Lockwood, Dennis
2003-01-01
This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Sensor Data Record (SDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.
Conceptual-level workflow modeling of scientific experiments using NMR as a case study
Verdi, Kacy K; Ellis, Heidi JC; Gryk, Michael R
2007-01-01
Background Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. Results We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Conclusion Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting a biomolecular analysis experiment using NMR spectroscopy. PMID:17263870
Ghorbani, Nima; Watson, P J; Farhadi, Mehran; Chen, Zhuo
2014-04-01
Self-regulation presumably rests upon multiple processes that include an awareness of ongoing self-experience, enduring self-knowledge, and self-control. The present investigation tested this multi-process model using the Five-Facet Mindfulness Questionnaire (FFMQ) and the Integrative Self-Knowledge and Brief Self-Control Scales. Using a sample of 1,162 Iranian university students, we confirmed the five-factor structure of the FFMQ in Iran and documented its factorial invariance across males and females. Self-regulatory variables correlated negatively with Perceived Stress, Depression, and Anxiety and positively with Self-Esteem and Satisfaction with Life. Partial mediation effects confirmed that self-regulatory measures ameliorated the disturbing effects of Perceived Stress. Integrative Self-Knowledge and Self-Control interacted to partially mediate the association of Perceived Stress with lower levels of Satisfaction with Life. Integrative Self-Knowledge, alone or in interaction with Self-Control, was the only self-regulation variable to display the expected mediation of Perceived Stress associations with all other measures. The only mediation analysis in which Self-Control was not implicated was that for Anxiety. These data confirmed the need to further examine this multi-process model of self-regulation. © 2014 International Union of Psychological Science.
Guidance on validation and qualification of processes and operations involving radiopharmaceuticals.
Todde, S; Peitl, P Kolenc; Elsinga, P; Koziorowski, J; Ferrari, V; Ocak, E M; Hjelstuen, O; Patt, M; Mindt, T L; Behe, M
2017-01-01
Validation and qualification activities are nowadays an integral part of day-to-day routine work in a radiopharmacy. This document is meant as an Appendix of Part B of the EANM "Guidelines on Good Radiopharmacy Practice (GRPP)" issued by the Radiopharmacy Committee of the EANM, covering the qualification and validation aspects related to the small-scale "in house" preparation of radiopharmaceuticals. The aim is to provide more detailed and practice-oriented guidance to those who are involved in the small-scale preparation of radiopharmaceuticals that are not intended for commercial purposes or distribution. The present guideline covers the validation and qualification activities following the well-known "validation chain", which begins with editing the general Validation Master Plan document, includes all the required documentation (e.g. User Requirement Specification, qualification protocols, etc.), and leads to the qualification of the equipment used in the preparation and quality control of radiopharmaceuticals, up to the final step of Process Validation. Specific guidance on the qualification and validation activities of small-scale hospital/academia radiopharmacies is provided here. Additional information, including practical examples, is also provided.
"Key to the future": British American tobacco and cigarette smuggling in China.
Lee, Kelley; Collin, Jeff
2006-07-01
Cigarette smuggling is a major public health issue, stimulating increased tobacco consumption and undermining tobacco control measures. China is the ultimate prize among tobacco's emerging markets, and is also believed to have the world's largest cigarette smuggling problem. Previous work has demonstrated the complicity of British American Tobacco (BAT) in this illicit trade within Asia and the former Soviet Union. This paper analyses internal documents of BAT available on site from the Guildford Depository and online from the BAT Document Archive. Documents dating from the early 1900s to 2003 were searched and indexed on a specially designed project database to enable the construction of an historical narrative. Document analysis incorporated several validation techniques within a hermeneutic process. This paper describes the huge scale of this illicit trade in China, amounting to billions of (United States) dollars in sales, and the key supply routes by which it has been conducted. It examines BAT's efforts to optimise earnings by restructuring operations, and controlling the supply chain and pricing of smuggled cigarettes. Our research shows that smuggling has been strategically critical to BAT's ongoing efforts to penetrate the Chinese market, and to its overall goal to become the leading company within an increasingly global industry. These findings support the need for concerted efforts to strengthen global collaboration to combat cigarette smuggling.
Definition and documentation of engineering processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, G.W.
1997-11-01
This tutorial is an extract of a two-day workshop developed under the auspices of the Quality Engineering Department at Sandia National Laboratories. The presentation starts with basic definitions and addresses why processes should be defined and documented. It covers three primary topics: (1) process considerations and rationale, (2) an approach to defining and documenting engineering processes, and (3) an IDEF0 model of the process for defining engineering processes.
Systems engineering implementation in the preliminary design phase of the Giant Magellan Telescope
NASA Astrophysics Data System (ADS)
Maiten, J.; Johns, M.; Trancho, G.; Sawyer, D.; Mady, P.
2012-09-01
Like many telescope projects today, the 24.5-meter Giant Magellan Telescope (GMT) is truly a complex system. The primary and secondary mirrors of the GMT are segmented and actuated to support two operating modes: natural seeing and adaptive optics. GMT is a general-purpose telescope supporting multiple science instruments operated in those modes. GMT is a large, diverse collaboration and development includes geographically distributed teams. The need to implement good systems engineering processes for managing the development of systems like GMT becomes imperative. The management of the requirements flow down from the science requirements to the component level requirements is an inherently difficult task in itself. The interfaces must also be negotiated so that the interactions between subsystems and assemblies are well defined and controlled. This paper will provide an overview of the systems engineering processes and tools implemented for the GMT project during the preliminary design phase. This will include requirements management, documentation and configuration control, interface development and technical risk management. Because of the complexity of the GMT system and the distributed team, using web-accessible tools for collaboration is vital. To accomplish this GMTO has selected three tools: Cognition Cockpit, Xerox Docushare, and Solidworks Enterprise Product Data Management (EPDM). Key to this is the use of Cockpit for managing and documenting the product tree, architecture, error budget, requirements, interfaces, and risks. Additionally, drawing management is accomplished using an EPDM vault. Docushare, a documentation and configuration management tool is used to manage workflow of documents and drawings for the GMT project. These tools electronically facilitate collaboration in real time, enabling the GMT team to track, trace and report on key project metrics and design parameters.
Information Quality in Regulatory Decision Making: Peer Review versus Good Laboratory Practice.
McCarty, Lynn S; Borgert, Christopher J; Mihaich, Ellen M
2012-07-01
There is an ongoing discussion on the provenance of toxicity testing data regarding how best to ensure its validity and credibility. A central argument is whether journal peer-review procedures are superior to Good Laboratory Practice (GLP) standards employed for compliance with regulatory mandates. We sought to evaluate the rationale for regulatory decision making based on peer-review procedures versus GLP standards. We examined pertinent published literature regarding how scientific data quality and validity are evaluated for peer review, GLP compliance, and development of regulations. Some contend that peer review is a coherent, consistent evaluative procedure providing quality control for experimental data generation, analysis, and reporting sufficient to reliably establish relative merit, whereas GLP is seen as merely a tracking process designed to thwart investigator corruption. This view is not supported by published analyses pointing to subjectivity and variability in peer-review processes. Although GLP is not designed to establish relative merit, it is an internationally accepted quality assurance, quality control method for documenting experimental conduct and data. Neither process is completely sufficient for establishing relative scientific soundness. However, changes occurring both in peer-review processes and in regulatory guidance resulting in clearer, more transparent communication of scientific information point to an emerging convergence in ensuring information quality. The solution to determining relative merit lies in developing a well-documented, generally accepted weight-of-evidence scheme to evaluate both peer-reviewed and GLP information used in regulatory decision making where both merit and specific relevance inform the process.
TOPEX SDR Processing, October 1998. Volume 4
NASA Technical Reports Server (NTRS)
Lee, Jeffrey E.; Lockwood, Dennis W.
2003-01-01
This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Sensor Data Record (SDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.
Metrology: Measurement Assurance Program Guidelines
NASA Technical Reports Server (NTRS)
Eicke, W. G.; Riley, J. P.; Riley, K. J.
1995-01-01
The 5300.4 series of NASA Handbooks for Reliability and Quality Assurance Programs have provisions for the establishment and utilization of a documented metrology system to control measurement processes and to provide objective evidence of quality conformance. The intent of these provisions is to assure consistency and conformance to specifications and tolerances of equipment, systems, materials, and processes procured and/or used by NASA, its international partners, contractors, subcontractors, and suppliers. This Measurement Assurance Program (MAP) guideline has the specific objectives to: (1) ensure the quality of measurements made within NASA programs; (2) establish realistic measurement process uncertainties; (3) maintain continuous control over the measurement processes; and (4) ensure measurement compatibility among NASA facilities. The publication addresses MAP methods as applied within and among NASA installations and serves as a guide to: control measurement processes at the local level (one facility); conduct measurement assurance programs in which a number of field installations are joint participants; and conduct measurement integrity (round robin) experiments in which a number of field installations participate to assess the overall quality of particular measurement processes at a point in time.
Fast title extraction method for business documents
NASA Astrophysics Data System (ADS)
Katsuyama, Yutaka; Naoi, Satoshi
1997-04-01
Conventional electronic document filing systems are inconvenient because the user must specify the keywords in each document for later searches. To solve this problem, automatic keyword extraction methods using natural language processing and character recognition have been developed. However, these methods are slow, especially for Japanese documents. To develop a practical electronic document filing system, we focused on the extraction of keyword areas from a document by image processing. Our fast title extraction method can automatically extract titles as keywords from business documents. All character strings are evaluated by rating points associated with title similarity. We classified these points into four items: character string size, position of character strings, relative position among character strings, and string attribution. Finally, the character string that has the highest rating is selected as the title area. The character recognition process is carried out on the selected area only. It is fast because this process must recognize a small number of patterns in the restricted area, not throughout the entire document. The mean performance of this method is an accuracy of about 91 percent and a processing time of 1.8 seconds in an examination of 100 Japanese business documents.
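As a rough illustration of the rating scheme described above (character size, position, relative position, and string attribution), the sketch below scores candidate text regions and picks the highest-rated one as the title area. The weights and feature encodings are invented for illustration and are not those of the authors' system.

# Illustrative scoring of candidate title regions; weights and feature scales are assumptions.
def title_score(region, weights=(0.4, 0.3, 0.2, 0.1)):
    """region: dict with normalized features in [0, 1]:
       'char_size'   - relative character height (larger -> more title-like)
       'position'    - closeness to the top of the page
       'relative'    - separation from neighbouring strings (isolation)
       'attribution' - string attributes such as underlining or centering
    """
    w_size, w_pos, w_rel, w_attr = weights
    return (w_size * region["char_size"] +
            w_pos * region["position"] +
            w_rel * region["relative"] +
            w_attr * region["attribution"])

def pick_title(regions):
    # The highest-rated region is the one passed on to character recognition.
    return max(regions, key=title_score)

candidates = [
    {"name": "header", "char_size": 0.9, "position": 0.95, "relative": 0.7, "attribution": 0.8},
    {"name": "body",   "char_size": 0.3, "position": 0.50, "relative": 0.2, "attribution": 0.1},
]
print(pick_title(candidates)["name"])  # -> "header"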
Gill, Preetinder S; Gill, Tejkaran S; Kamath, Ashwini; Whisnant, Billy
2012-01-01
Health literacy is associated with a person's capacity to find, access, contextualize, and understand information needed for health care-related decisions. The level of health literacy thus has an influence on an individual's health status. It can be argued that low health literacy is associated with poor health status. Health care literature (e.g., pamphlets, brochures, postcards, posters, forms) is published by public and private organizations worldwide to provide information to the general public. The ability to read, use, and understand this material is critical to the successful application of the knowledge it disseminates. This study assessed the readability, suitability, and usability of health care literature associated with concussion and traumatic brain injury published by the United States Centers for Disease Control and Prevention. The Flesch–Kincaid Grade Level, Flesch Reading Ease, Gunning Fog, Simple Measure of Gobbledygook, and Suitability Assessment of Materials indices were used to assess 40 documents obtained from the Centers for Disease Control and Prevention website. The documents analyzed were targeted towards the general public. It was found that, in order to be read properly, these documents needed on average more than an eleventh grade/high school level education. This was consistent with the findings of other similar studies. However, the qualitative Suitability Assessment of Materials index showed that, on average, the usability and suitability of these documents was superior. Hence, it was concluded that formatting, illustrations, layout, and graphics play a pivotal role in improving health care-related literature and, in turn, promoting health literacy. Based on the comprehensive literature review and assessment of the 40 documents associated with concussion and traumatic brain injury, recommendations have been made for improving the readability, suitability, and usability of health care-related documents. The recommendations are presented in the form of an incremental improvement process cycle and a list of dos and don'ts. PMID:23204856
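For reference, the grade-level indices named above can be computed from simple text counts. The formulas in the sketch below are the standard published ones; the counting of syllables and polysyllabic ("complex") words is left to the caller, and the example counts are invented.

import math

def readability_indices(words, sentences, syllables, polysyllabic_words):
    """Compute the standard readability formulas from raw counts.
    'polysyllabic_words' = number of words with three or more syllables."""
    wps = words / sentences   # words per sentence
    spw = syllables / words   # syllables per word

    flesch_reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    flesch_kincaid_grade = 0.39 * wps + 11.8 * spw - 15.59
    gunning_fog = 0.4 * (wps + 100.0 * polysyllabic_words / words)
    smog = 1.043 * math.sqrt(polysyllabic_words * 30.0 / sentences) + 3.1291

    return {
        "Flesch Reading Ease": flesch_reading_ease,
        "Flesch-Kincaid Grade": flesch_kincaid_grade,
        "Gunning Fog": gunning_fog,
        "SMOG": smog,
    }

# Example: a 300-word brochure excerpt with 20 sentences, 450 syllables, 40 polysyllabic words
print(readability_indices(300, 20, 450, 40))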
Creation of structured documentation templates using Natural Language Processing techniques.
Kashyap, Vipul; Turchin, Alexander; Morin, Laura; Chang, Frank; Li, Qi; Hongsermeier, Tonya
2006-01-01
Structured Clinical Documentation is a fundamental component of the healthcare enterprise, linking both clinical (e.g., electronic health record, clinical decision support) and administrative functions (e.g., evaluation and management coding, billing). One of the challenges in creating good quality documentation templates has been the inability to address specialized clinical disciplines and adapt to local clinical practices. A one-size-fits-all approach leads to poor adoption and inefficiencies in the documentation process. On the other hand, the cost associated with manual generation of documentation templates is significant. Consequently there is a need for at least partial automation of the template generation process. We propose an approach and methodology for the creation of structured documentation templates for diabetes using Natural Language Processing (NLP).
ERIC Educational Resources Information Center
Hendley, Tom
1995-01-01
Discussion of digital document image processing focuses on issues and options associated with greyscale and color image processing. Topics include speed; size of original document; scanning resolution; markets for different categories of scanners, including photographic libraries, publishing, and office applications; hybrid systems; data…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-14
... an Interface Control Working Group (ICWG) Meeting for Document ICD-GPS-870 AGENCY: Interface Control Working Group (ICWG) meeting for document ICD-GPS-870. ACTION: Meeting Notice. SUMMARY: This notice... Working Group (ICWG) meeting for document ICD-GPS-870, Navstar Next Generation GPS Operational Control...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-19
...; Class II Special Controls Guidance Document: Nucleic Acid-Based In Vitro Diagnostic Devices for the... Guidance for Industry and Food and Drug Administration Staff; Class II Special Controls Guidance Document... II Special Controls Guidance Document: Nucleic Acid-Based In Vitro Diagnostic Devices for the...
Representation-based user interfaces for the audiovisual library of the year 2000
NASA Astrophysics Data System (ADS)
Aigrain, Philippe; Joly, Philippe; Lepain, Philippe; Longueville, Veronique
1995-03-01
The audiovisual library of the future will be based on computerized access to digitized documents. In this communication, we address the user interface issues which will arise from this new situation. One cannot simply transfer a user interface designed for the piece-by-piece production of some audiovisual presentation and make it a tool for accessing full-length movies in an electronic library. One cannot take a digital sound editing tool and propose it as a means to listen to a musical recording. In our opinion, when computers are used to mediate access to existing contents, document representation-based user interfaces are needed. With such user interfaces, a structured visual representation of the document's contents is presented to the user, who can then manipulate it to control perception and analysis of these contents. In order to build such manipulable visual representations of audiovisual documents, one needs to automatically extract structural information from the documents' contents. In this communication, we describe possible visual interfaces for various temporal media, and we propose methods for the economically feasible large-scale processing of documents. The work presented is sponsored by the Bibliotheque Nationale de France: it is part of the program aimed at developing, for image and sound documents, an experimental counterpart to this library's digitized text reading workstation.
TkPl_SU: An Open-source Perl Script Builder for Seismic Unix
NASA Astrophysics Data System (ADS)
Lorenzo, J. M.
2017-12-01
TkPl_SU (beta) is a graphical user interface (GUI) for selecting parameters for Seismic Unix (SU) modules. Seismic Unix (Stockwell, 1999) is a widely distributed free software package for seismic reflection processing and signal processing. Perl/Tk is a mature, well-documented, and free graphical user interface toolkit for Perl. In a classroom environment, shell scripting of SU modules engages students and helps focus on the theoretical limitations and strengths of signal processing. However, complex interactive processing stages, e.g., selection of optimal stacking velocities, killing bad data traces, or spectral analysis, require advanced flows beyond the scope of introductory classes. In a research setting, special functionality from other free seismic processing software such as SioSeis (UCSD-NSF) can be incorporated readily via an object-oriented style of programming. An object-oriented approach is a first step toward efficient, extensible programming of multi-step processes, and a simple GUI simplifies parameter selection and decision making. Currently, in TkPl_SU, Perl 5 packages wrap 19 of the most common SU modules that are used in teaching undergraduate and first-year graduate student classes (e.g., filtering, display, velocity analysis and stacking). Perl packages (classes) can advantageously add new functionality around each module and clarify parameter names for easier usage. For example, through the use of methods, packages can isolate the user from repetitive control structures, as well as replace the names of abbreviated parameters with self-describing names. Moose, an extension of the Perl 5 object system, greatly facilitates an object-oriented style. Perl wrappers are self-documenting via Perl's plain old documentation (POD) markup.
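The wrapper idea described above, one package per SU module that replaces terse parameter names with self-describing ones and assembles the command line, can be sketched in Python for illustration; the actual TkPl_SU packages are written in Perl 5 with Moose and Tk. The class name and parameter mapping below are examples only, and the usage line assumes Seismic Unix is installed and on the PATH.

# Illustrative Python sketch of a wrapper around the SU 'sufilter' module.
import subprocess

class SuFilter:
    """Wrapper exposing self-describing names for sufilter's f= and amps= parameters."""
    def __init__(self):
        self._params = {}

    def frequencies(self, *hz):
        self._params["f"] = ",".join(str(v) for v in hz)      # corner frequencies (Hz)
        return self

    def amplitudes(self, *amps):
        self._params["amps"] = ",".join(str(v) for v in amps)  # amplitudes at the corners
        return self

    def command(self):
        return ["sufilter"] + [f"{k}={v}" for k, v in self._params.items()]

    def run(self, infile, outfile):
        # SU modules read traces from stdin and write them to stdout
        with open(infile, "rb") as src, open(outfile, "wb") as dst:
            subprocess.run(self.command(), stdin=src, stdout=dst, check=True)

# Usage (assumes Seismic Unix is installed):
# SuFilter().frequencies(5, 10, 40, 50).amplitudes(0, 1, 1, 0).run("shot.su", "shot_filt.su")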
Tobacco industry strategy to undermine tobacco control in Finland
Hiilamo, H
2003-01-01
Objective: To identify and explain tobacco industry strategy in undermining tobacco control measures in Finland and results of these interferences in tobacco policy development during the 1980s and early 1990s. Methods: Tobacco industry documents, which have been publicly available on the internet as a result of litigation in the USA, were analysed. Documents were sought by Finland and by names of organisations and tobacco control activists. Documents were accessed and assessed between September 2000 and November 2002. Tactics of the tobacco industry activities were categorised as presented by Saloojee and Dagli. Results: The international tobacco companies utilised similar strategies in Finland as in other industrial markets to fight tobacco control and legislation, the health advocacy movement, and litigation. These activities slowed down the development and implementation of the Tobacco Act in Finland. However, despite the extensive pressure, the industry was not able to prevent the most progressive tobacco legislation in Europe from being passed and coming into force in Finland in 1977 and in 1995. Conclusion: Denying the health hazards caused by tobacco—despite indisputable scientific evidence—decreased the credibility of the tobacco industry. Strategy of denial was falsely chosen, as health advocacy groups were active both in society and the parliamentary system. The strong influence of the tobacco industry may have in fact increased the visibility of tobacco control in Finland as the litigation process was also drawing attention to negative health effects of tobacco. Therefore the tobacco industry did not manage to convince public opinion. However, the tobacco industry did obtain experience in Finland in how to object to tobacco control measures. PMID:14660780
Independent Orbiter Assessment (IOA): Analysis of the guidance, navigation, and control subsystem
NASA Technical Reports Server (NTRS)
Trahan, W. H.; Odonnell, R. A.; Pietz, K. C.; Hiott, J. M.
1986-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The independent analysis results corresponding to the Orbiter Guidance, Navigation, and Control (GNC) Subsystem hardware are documented. The function of the GNC hardware is to respond to guidance, navigation, and control software commands to effect vehicle control and to provide sensor and controller data to GNC software. Some of the GNC hardware for which failure modes analysis was performed includes: hand controllers; Rudder Pedal Transducer Assembly (RPTA); Speed Brake Thrust Controller (SBTC); Inertial Measurement Unit (IMU); Star Tracker (ST); Crew Optical Alignment Site (COAS); Air Data Transducer Assembly (ADTA); Rate Gyro Assemblies; Accelerometer Assembly (AA); Aerosurface Servo Amplifier (ASA); and Ascent Thrust Vector Control (ATVC). The IOA analysis process utilized available GNC hardware drawings, workbooks, specifications, schematics, and systems briefs for defining hardware assemblies, components, and circuits. Each hardware item was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode.
ERIC Educational Resources Information Center
Urcelay, Gonzalo P.; Lipatova, Olga; Miller, Ralph R.
2009-01-01
Three Pavlovian fear conditioning experiments with rats as subjects explored the effect of extinction in the presence of a concurrent excitor. Our aim was to explore this particular treatment, documented in previous studies to deepen extinction, with novel control groups to shed light on the processes involved in extinction. Relative to subjects…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-14
... the following address: Roberto Morales, OAQPS Document Control Officer (C404-02), U.S. EPA, Research... well as to conduct a clearly defined, time-limited process by which any similarly justified revisions... docket for this rulemaking for a quantitative demonstration of this proposed revision, as well as for the...
Interpreting and Reporting Radiological Water-Quality Data
McCurdy, David E.; Garbarino, John R.; Mullin, Ann H.
2008-01-01
This document provides information to U.S. Geological Survey (USGS) Water Science Centers on interpreting and reporting radiological results for samples of environmental matrices, most notably water. The information provided is intended to be broadly useful throughout the United States, but it is recommended that scientists who work at sites containing radioactive hazardous wastes consult additional sources for more detailed information. The document is largely based on recognized national standards and guidance documents for radioanalytical sample processing, most notably the Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP), and on documents published by the U.S. Environmental Protection Agency and the American National Standards Institute. It does not include discussion of standard USGS practices, including field quality-control sample analysis, interpretive report policies, and related issues, all of which shall always be included in any effort by the Water Science Centers. The use of 'shall' in this report signifies a policy requirement of the USGS Office of Water Quality.
NASA Technical Reports Server (NTRS)
Boyd, D. Douglas, Jr.; Brooks, Thomas F.; Burley, Casey L.; Jolly, J. Ralph, Jr.
1998-01-01
This document details the methodology and use of the CAMRAD.Mod1/HIRES codes, which were developed at NASA Langley Research Center for the prediction of helicopter harmonic and Blade-Vortex Interaction (BVI) noise. CAMRAD.Mod1 is a substantially modified version of the performance/trim/wake code CAMRAD. High-resolution blade loading is determined in post-processing by HIRES and an associated indicial aerodynamics code. Extensive capabilities of importance to noise prediction accuracy are documented, including a new multi-core tip vortex roll-up wake model, higher harmonic and individual blade control, tunnel and fuselage correction input, diagnostic blade motion input, and interfaces for acoustic and CFD aerodynamics codes. Modifications and new code capabilities are documented with examples. A users' job preparation guide and listings of variables and namelists are given.
KSC Technical Capabilities Website
NASA Technical Reports Server (NTRS)
Nufer, Brian; Bursian, Henry; Brown, Laurette L.
2010-01-01
This document is the website pages that review the technical capabilities that the Kennedy Space Center (KSC) has for partnership opportunities. The purpose of this information is to make prospective customers aware of the capabilities and provide an opportunity to form relationships with the experts at KSC. The technical capabilities fall into these areas: (1) Ground Operations and Processing Services, (2) Design and Analysis Solutions, (3) Command and Control Systems / Services, (4) Materials and Processes, (5) Research and Technology Development and (6) Laboratories, Shops and Test Facilities.
Issues and Techniques of CASE Integration With Configuration Management
1992-03-01
all four!) process architecture classes. For example, Frame Technology’s FrameMaker is a client/server tool because it provides server functions for... FrameMaker clients; it is a parent/child tool since a top-level control panel is used to "fork" child FrameMaker sessions; the "forked" FrameMaker ...sessions are persistent tools since they may be reused to create and modify any number of FrameMaker documents. Despite this, however, these process
Cargo Movement Operations System (CMOS). Increment II System Design Document, Final
1990-08-02
Excerpt from a Data Item Discrepancy Worksheet (CDRL number A002-06, dated 08/02/90; originator control number SSDD-0002), originated by John J. Brassil (SAIC, 272-2999). Substantive comment on page 35, paragraph 4.2.1.3.1: add subordinate paragraphs to describe the 3-digit System Capabilities that are listed under System Administration in Appendix G. Rationale: Both Process Outbound Cargo
ATV Engineering Support Team Safety Console Preparation for the Johannes Kepler Mission
NASA Astrophysics Data System (ADS)
Chase, R.; Oliefka, L.
2010-09-01
This paper describes the improvements to be implemented in the Safety console position of the Engineering Support Team (EST) at the Automated Transfer Vehicle (ATV) Control Centre (ATV-CC) for the upcoming ATV Johannes Kepler mission. The ATV missions to the International Space Station are monitored and controlled from the ATV-CC in Toulouse, France. The commanding of ATV is performed by the Vehicle Engineering Team (VET) in the main control room under the authority of the Flight Director. The EST performs a monitoring function in a room beside the main control room. One of the EST positions is the Safety console, which is staffed by safety engineers from ESA and the industrial prime contractor, Astrium. The function of the Safety console is to check whether the hazard controls are available throughout the mission as required by the Hazard Reports approved by the ISS Safety Review Panel. Safety console preparation activities were limited prior to the first ATV mission due to schedule constraints, and the safety engineers involved have been working to improve the readiness for ATV 2. The following steps have been taken or are in process, and are described in this paper: • review of the implementation of Operations Control Agreement Documents (OCADs) that record the way operational hazard controls are performed to meet the needs of the Hazard Reports (typically in Flight Rules and Crew Procedures), • crosscheck of operational control needs and implementations with respect to ATV's first flight observations and post-flight evaluations, with a view to identifying additional, obsolete, or revised operational hazard controls, • participation in the Flight Rule review and update process carried out between missions, • participation in the assessment of anomalies observed during the first ATV mission, to ensure that any impacts are addressed in the ATV 2 safety documentation, • preparation of a Safety console handbook providing lists of important safety aspects to be monitored at various stages of the mission, including links to relevant Hazard Reports, Flight Rules, and supporting documentation, • participation in training courses conducted in the framework of the ATV Training Academy (ATAC), and provision of safety-related courses for the other members of the VET and EST, • participation in simulations conducted at ATV-CC, including off-nominal cases. The result of these activities will be an improved level of readiness for the ATV 2 mission.
Campione-Barr, Nicole; Lindell, Anna K; Greer, Kelly Bassett; Rose, Amanda J
2014-08-01
The association between mothers' psychological control and their children's emotional adjustment problems is well documented. However, processes that may explain this association are not well understood. The present study tested the idea that relational aggression and psychological control within the context of the sibling relationship may help to account for the relation between mothers' psychological control and adolescents' internalizing symptoms. Older (M = 16.46, SD = 1.35 years) and younger (M = 13.67, SD = 1.56 years) siblings from 101 dyads rated the psychological control they received from mothers and siblings, and the relational aggression they received from siblings. Despite some similarities between psychological control and relational aggression, confirmatory factor analyses provided evidence that the two sibling processes are distinct. Maternal psychological control was related to psychological control and relational aggression within the sibling relationship, which were related to adolescents' anxiety and depressed mood. In addition, sibling relational aggression was a more powerful mediator of the relationship between maternal psychological control and adolescent adjustment than sibling psychological control.
1 CFR 18.4 - Form of document.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER PREPARATION, TRANSMITTAL, AND PROCESSING OF DOCUMENTS PREPARATION AND TRANSMITTAL OF DOCUMENTS GENERALLY § 18.4 Form of document. (a) A printed or... processed data are urged to consult with the Office of the Federal Register staff about possible use of the...
1 CFR 18.4 - Form of document.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Provisions ADMINISTRATIVE COMMITTEE OF THE FEDERAL REGISTER PREPARATION, TRANSMITTAL, AND PROCESSING OF DOCUMENTS PREPARATION AND TRANSMITTAL OF DOCUMENTS GENERALLY § 18.4 Form of document. (a) A printed or... processed data are urged to consult with the Office of the Federal Register staff about possible use of the...
NASA Astrophysics Data System (ADS)
Rzonca, A.
2013-12-01
The paper presents the state of the art of quality control of photogrammetric and laser scanning data captured by airborne sensors. The subject is very important for the execution of photogrammetric and LiDAR projects, because the quality of the captured data determines, a priori, the quality of the final product. On the other hand, a precise and effective quality control process allows missions to be flown without a wide margin of safety, especially in the case of projects over mountain areas. As an introduction, the author presents the theoretical background of quality control, based on his own experience, instructions, and technical documentation. He describes several variants of organizational solutions. Basically, there are two main approaches: quality control of the captured data, and control of the discrepancies between the flight plan and the results of its execution. Both of them can use control tests and analysis of the data. The test is an automatic algorithm that controls the data and generates a control report. Analysis is a less complicated process based on manual checks of documentation, data, and metadata. An example of a quality control system for a large-area project is presented. The project is realized periodically for the whole territory of Spain and is named the National Plan of Aerial Orthophotography (Plan Nacional de Ortofotografía Aérea, PNOA). The system of internal control guarantees results soon after the flight and informs the company's flight team. This allows all errors to be corrected shortly after the flight, and it can stop the data from being transferred to another team or company for further processing. The described system of data quality control contains geometrical and radiometrical control of photogrammetric data and geometrical control of LiDAR data. It checks all the specified parameters and generates reports, which are very helpful in the case of errors or low-quality data. The paper draws on the author's experience in the field of data quality control and presents conclusions and suggestions on organizational and technical aspects, together with a short definition of the necessary control software.
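A minimal sketch of the "flight plan versus execution" control idea described above: compare planned and achieved acquisition parameters and emit a report entry for each item, flagging any that fall outside tolerance. The parameter names, values, and tolerances are illustrative assumptions, not those of the PNOA system.

# Hypothetical check of planned vs. achieved flight parameters; thresholds are assumptions.
def check_flight_block(planned, achieved, tolerances):
    """planned/achieved: dicts of parameters (e.g. forward overlap in %, GSD in cm,
    LiDAR point density in pts/m2). Returns a list of report lines."""
    report = []
    for key, tol in tolerances.items():
        diff = achieved[key] - planned[key]
        status = "OK" if abs(diff) <= tol else "OUT OF TOLERANCE"
        report.append(f"{key}: planned={planned[key]}, achieved={achieved[key]}, "
                      f"diff={diff:+.2f} -> {status}")
    return report

planned = {"forward_overlap_pct": 60.0, "gsd_cm": 25.0, "lidar_density_pts_m2": 0.5}
achieved = {"forward_overlap_pct": 57.5, "gsd_cm": 26.1, "lidar_density_pts_m2": 0.46}
tolerances = {"forward_overlap_pct": 3.0, "gsd_cm": 2.0, "lidar_density_pts_m2": 0.05}

for line in check_flight_block(planned, achieved, tolerances):
    print(line)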
78 FR 15056 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-08
... establish, document, and maintain a system of internal risk management controls. The Rule sets forth the..., documenting, and reviewing its internal risk management control system, which are designed to, among other... documenting its risk management control system is 2,000 hours and that, on average, a registered OTC...
48 CFR 247.370 - DD Form 1384, Transportation Control and Movement Document.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false DD Form 1384, Transportation Control and Movement Document. 247.370 Section 247.370 Federal Acquisition Regulations System... Transportation in Supply Contracts 247.370 DD Form 1384, Transportation Control and Movement Document. The...
Phase IIIA Crew Interface Specifications Development for Inflight Maintenance and Stowage Functions
NASA Technical Reports Server (NTRS)
Carl, John G.
1973-01-01
This report presents the findings and data products developed during the Phase IIIA Crew Interface Specification Study for Inflight Maintenance and Stowage Functions, performed by General Electric to provide the NASA Johnson Space Center with a set of documentation that can be used as definitive guidelines to improve the present process of defining, controlling, and managing flight crew interface requirements related to inflight maintenance (including assembly and servicing) and stowage functions. During the Phase IIIA contract period, the following data products were developed: 1) Projected NASA Crew Procedures/Flight Data File Development Process; 2) Inflight Maintenance Management Process Description; 3) Preliminary Draft, General Specification, Inflight Maintenance Management Requirements; 4) Inflight Maintenance Operational Process Description; 5) Preliminary Draft, General Specification, Inflight Maintenance Task and Support Requirements Analysis; 6) Suggested IFM Data Processing Reports for Logistics Management. The above Inflight Maintenance data products were developed during the Phase IIIA study after review of Space Shuttle Program documentation, including the Level II Integrated Logistics Requirements and other DOD and NASA data relative to Payloads Accommodations and Satellite On-Orbit Servicing. These Inflight Maintenance data products were developed to be in consonance with Space Shuttle Program technical and management requirements.
Document control and Conduct of Operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, S.K.; Meltzer, F.L.
1993-01-01
Department of Energy (DOE) Order 5480.19, Conduct of Operations, places stringent requirements on a wide range of activities at DOE facilities. These requirements directly affect personnel at the Advanced Test Reactor (ATR), which is located in the Test Reactor Area of the Idaho National Engineering Laboratory and operated for DOE by EG&G Idaho, Inc. In order for the ATR to comply with 5480.19, the very personality of the reactor facility's document control unit has had to undergo a major change. The Facility and Administrative Support Unit (FAS) is charged with maintenance of ATR's controlled documents: thousands of operating and administrative procedures. Prior to the imposition of 5480.19, FAS was content to operate in a clerical support mode, seldom questioning or seeking to improve. This manner of doing business is inappropriate within the framework of DOE 5480.19 and is also at odds with the approach to Total Quality Management (TQM) promulgated by EG&G Idaho. To comply with the requirements of 5480.19, FAS has actively applied TQM principles, empowered personnel to improve operations through the establishment of a team approach, and begun an automation process that is already paying large dividends in terms of improved procedure accuracy and compliance. A state-of-the-art text retrieval system is already in place, and fully automated document tracking and document management are being vigorously pursued. This paper describes in detail the steps taken to date, the improvements, and the lessons learned. It also discusses plans for the future that will enable FAS to support the ATR in increasing its responsiveness to the Conduct of Operations Order.
Perez, Cristina de Abreu; Silva, Vera Luiza da Costa E; Bialous, Stella Aguinaga
2017-10-19
This article aims to analyze the relationship between the Brazilian government's adoption of a regulatory measure with a strong impact on the population and the opposition by invested interest groups. The methodology involves the analysis of official documents on the enforcement of health warnings on tobacco products sold in Brazil. In parallel, a search was conducted for publicly available tobacco industry documents resulting from lawsuits, with the aim of identifying the industry's reactions to this process. The findings suggest that various government acts were affected by direct interference from the tobacco industry. In some cases the interventions were explicit and in others they were indirect or difficult to identify. In light of the study's theoretical framework, the article provides original information on the Brazilian process that can be useful for government policymakers in the strategic identification of tobacco control policies.
Automatic remote-integration metering center. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Philippidis, P.A.; Weinreb, M.; de Gil, B.F.
1988-11-01
The report documents a multi-phase program for the development and demonstration of a unique automatic and remote metering system. The system consists of a solid-state meter module to provide electrical consumption data, tamper detection, and load control functions; a central master station to interrogate the meter modules for their data and also to transmit load control signals; and a data display module to be accessible to tenants wishing to obtain their meter readings. The system has the capability to measure and allocate demand and to process time-of-use rates. It also has a meter accuracy self-test feature. The system is suitable for both direct metering of multi-family buildings and the sub-metering of master-metered apartment buildings. In addition to describing the system, the report documents the results of a 371-point field trial at Scott Tower, a cooperative apartment building in the Bronx, New York.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-09-01
This document presents a modeling and control study of the Fluid Bed Gasification (FBG) unit at the Morgantown Energy Technology Center (METC). The work is performed under contract no. DE-FG21-94MC31384. The purpose of this study is to generate a simple FBG model from process data, and then use the model to suggest an improved control scheme that will improve operation of the gasifier. The work first develops a simple linear model of the gasifier, then suggests an improved gasifier pressure and MGCR control configuration, and finally suggests the use of a multivariable control strategy for the gasifier.
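A simple linear model of the kind mentioned above can be fitted to logged process data by least squares. The sketch below fits a first-order ARX-type model; the signal names, model order, and synthetic data are chosen purely for illustration and do not reflect the actual FBG identification.

# Minimal sketch: fit y[k] = a*y[k-1] + b*u[k-1] + c to logged data by least squares.
import numpy as np

def fit_arx1(u, y):
    """u: input record (e.g. a valve position), y: output record (e.g. gasifier pressure)."""
    u, y = np.asarray(u, float), np.asarray(y, float)
    X = np.column_stack([y[:-1], u[:-1], np.ones(len(y) - 1)])
    theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return theta  # (a, b, c)

# Synthetic demonstration data generated from a known first-order response plus noise
rng = np.random.default_rng(0)
u = rng.uniform(0.0, 1.0, 200)
y = np.zeros(200)
for k in range(1, 200):
    y[k] = 0.9 * y[k - 1] + 0.3 * u[k - 1] + 0.05 + 0.01 * rng.standard_normal()

theta = fit_arx1(u, y)
print("estimated (a, b, c):", np.round(theta, 3))  # should be close to (0.9, 0.3, 0.05)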
NASA Astrophysics Data System (ADS)
Chartosias, Marios
Acceptance of Carbon Fiber Reinforced Polymer (CFRP) structures requires a robust surface preparation method with improved process controls capable of ensuring high bond quality. Surface preparation in a production clean room environment prior to applying adhesive for bonding would minimize the risk of contamination and reduce cost. Plasma treatment is a robust surface preparation process capable of being applied in a production clean room environment with process parameters that are easily controlled and documented. Repeatable and consistent processing is enabled through the development of a process parameter window using techniques such as Design of Experiments (DOE) tailored to specific adhesive and substrate bonding applications. Insight from the respective plasma treatment Original Equipment Manufacturers (OEMs), together with screening tests, separated critical process factors from non-factors and set the associated factor levels prior to execution of the DOE. Results from mode I Double Cantilever Beam (DCB) testing per the ASTM D 5528 [1] standard and DOE statistical analysis software are used to produce a regression model and determine appropriate optimum settings for each factor.
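As an illustration of the DOE analysis described above (screening critical factors and then choosing settings from the measured responses), the sketch below computes main effects for a two-level full factorial. The factor names, levels, and fracture toughness values are invented for illustration and are not data from the study.

# Hypothetical main-effects analysis for a 2-level factorial plasma-treatment DOE.
import itertools
import statistics

factors = ["power", "speed", "working_distance"]
# Full 2^3 design: -1 = low level, +1 = high level, one mode I toughness response per run
runs = list(itertools.product([-1, 1], repeat=3))
g_ic = [310, 355, 320, 360, 290, 400, 305, 415]  # invented G_IC responses (J/m^2), one per run

def main_effect(factor_index):
    # Difference between the mean response at the high and low levels of one factor
    high = [y for levels, y in zip(runs, g_ic) if levels[factor_index] == 1]
    low = [y for levels, y in zip(runs, g_ic) if levels[factor_index] == -1]
    return statistics.mean(high) - statistics.mean(low)

for i, name in enumerate(factors):
    print(f"{name}: main effect on G_IC = {main_effect(i):+.1f} J/m^2")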
Infection prevention and control in the design of healthcare facilities.
Farrow, Tye S; Black, Stephen M
2009-01-01
The lead paper, "Healthcare-Associated Infections as Patient Safety Indicators," written by Gardam, Lemieux, Reason, van Dijk and Goel, puts forward the design of healthcare facilities as one of many strategies to improve patient safety with respect to healthcare-associated infections. This commentary explores some of the issues in balancing infection prevention and control priorities with other needs and values brought to the design process. This balance is challenged not only by a lack of supporting evidence but also by the superficial nature in which infection prevention and control are often discussed within a design context. For the physical environment to support any patient safety initiative, the design of the processes must be developed in conjunction with that of the physical environment so that compliance can be natural and convenient. Finally, consideration is given to the value of documenting decision-making related to infection prevention and control in facility design and ongoing assessments of existing facilities.
McCarty, L Kelsey; Saddawi-Konefka, Daniel; Gargan, Lauren M; Driscoll, William D; Walsh, John L; Peterfreund, Robert A
2014-12-01
Process improvement in healthcare delivery settings can be difficult, even when there is consensus among clinicians about a clinical practice or desired outcome. Airway management is a medical intervention fundamental to the delivery of anesthesia care. Like other medical interventions, a detailed description of the management methods should be documented. Despite this expectation, airway documentation is often insufficient. The authors hypothesized that formal adoption of process improvement methods could be used to increase the rate of "complete" airway management documentation. The authors defined a set of criteria as a local practice standard of "complete" airway management documentation. The authors then employed selected process improvement methodologies over 13 months in three iterative and escalating phases to increase the percentage of records with complete documentation. The criteria were applied retrospectively to determine the baseline frequency of complete records, and prospectively to measure the impact of process improvements efforts over the three phases of implementation. Immediately before the initial intervention, a retrospective review of 23,011 general anesthesia cases over 6 months showed that 13.2% of patient records included complete documentation. At the conclusion of the 13-month improvement effort, documentation improved to a completion rate of 91.6% (P<0.0001). During the subsequent 21 months, the completion rate was sustained at an average of 90.7% (SD, 0.9%) across 82,571 general anesthetic records. Systematic application of process improvement methodologies can improve airway documentation and may be similarly effective in improving other areas of anesthesia clinical practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jantzen, C. M.; Edwards, T. B.; Trivelpiece, C. L.
Radioactive high level waste (HLW) at the Savannah River Site (SRS) has successfully been vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints since the glass cannot be reworked once it is poured into ten-foot-tall by two-foot-diameter canisters. A unique "feed forward" statistical process control (SPC) was developed for this control rather than statistical quality control (SQC). In SPC, the feed composition to the DWPF melter is controlled prior to vitrification. In SQC, the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the "feed forward" SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository. This report documents the development of revised TiO2, Na2O, Li2O and Fe2O3 coefficients in the SWPF liquidus model and revised coefficients (a, b, c, and d).
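The "feed forward" idea, predicting a glass property from the feed composition with a property-composition model and accepting the feed only if the prediction satisfies the property limit at 95% confidence, can be sketched as below. The coefficients, composition, uncertainty, and limit are placeholders for illustration, not the actual DWPF model values.

# Hedged sketch of a feed-forward SPC acceptance check; all numbers are placeholders.
Z_95_ONE_SIDED = 1.645  # one-sided 95% normal quantile

def property_prediction(composition, coefficients, intercept=0.0):
    """composition/coefficients: dicts keyed by oxide (mass fractions, linear model terms)."""
    return intercept + sum(coefficients[o] * composition.get(o, 0.0) for o in coefficients)

def feed_is_acceptable(composition, coefficients, model_sigma, upper_limit, intercept=0.0):
    predicted = property_prediction(composition, coefficients, intercept)
    upper_bound_95 = predicted + Z_95_ONE_SIDED * model_sigma
    return upper_bound_95 <= upper_limit, predicted, upper_bound_95

coefficients = {"TiO2": 12.0, "Na2O": 8.5, "Li2O": 15.0, "Fe2O3": -3.0}  # placeholder terms
composition = {"TiO2": 0.02, "Na2O": 0.12, "Li2O": 0.05, "Fe2O3": 0.10}  # mass fractions
ok, pred, ub = feed_is_acceptable(composition, coefficients, model_sigma=0.08,
                                  upper_limit=2.5, intercept=0.3)
print(f"predicted={pred:.3f}, 95% upper bound={ub:.3f}, acceptable={ok}")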
Using Inspections to Improve the Quality of Product Documentation and Code.
ERIC Educational Resources Information Center
Zuchero, John
1995-01-01
Describes how, by adapting software inspections to assess documentation and code, technical writers can collaborate with development personnel, editors, and customers to dramatically improve both the quality of documentation and the very process of inspecting that documentation. Notes that the five steps involved in the inspection process are:…
Document boundary determination using structural and lexical analysis
NASA Astrophysics Data System (ADS)
Taghva, Kazem; Cartright, Marc-Allen
2009-01-01
The document boundary determination problem is the process of identifying individual documents in a stack of papers. In this paper, we report on a classification system for automation of this process. The system employs features based on document structure and lexical content. We also report on experimental results to support the effectiveness of this system.
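A minimal sketch of combining structural and lexical features to decide whether a page starts a new document is shown below; the feature choices, weights, and threshold are illustrative assumptions, not the authors' classifier.

# Illustrative "new document?" decision combining structural and lexical cues.
import re

def lexical_overlap(prev_text, curr_text):
    """Jaccard overlap of word sets; low overlap hints at a document boundary."""
    prev_words = set(re.findall(r"\w+", prev_text.lower()))
    curr_words = set(re.findall(r"\w+", curr_text.lower()))
    if not prev_words or not curr_words:
        return 0.0
    return len(prev_words & curr_words) / len(prev_words | curr_words)

def is_new_document(prev_page, curr_page, threshold=0.5):
    score = 0.0
    if curr_page.get("page_number") == 1:   # structural cue: page numbering restarts
        score += 0.5
    if curr_page.get("has_letterhead"):     # structural cue: letterhead or title block present
        score += 0.3
    score += 0.4 * (1.0 - lexical_overlap(prev_page["text"], curr_page["text"]))
    return score >= threshold

prev_page = {"text": "invoice total amount due net thirty days", "page_number": 2}
curr_page = {"text": "memorandum subject quarterly safety review", "page_number": 1,
             "has_letterhead": True}
print(is_new_document(prev_page, curr_page))  # True under these assumptions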
RBAC-Matrix-based EMR right management system to improve HIPAA compliance.
Lee, Hung-Chang; Chang, Shih-Hsin
2012-10-01
Security control of the Electronic Medical Record (EMR) is a mechanism used to manage electronic medical record files and protect sensitive medical record documents from information leakage. Researchers have proposed Role-Based Access Control (RBAC). However, with the increasing scale of medical institutions, it is difficult to declare access control behavior in detail among roles in RBAC. Furthermore, under stringent specifications such as the U.S. HIPAA and Canada's PIPEDA, patients are encouraged to have the right to regulate access control of their EMR. In response to these problems, we propose an EMR digital rights management system, which is an RBAC-based extension to the matrix organization of medical institutions, known as RBAC-Matrix. In addition to authorizing the EMR among roles in the organization, RBAC-Matrix also allows patients to be involved in defining access rights to their records. RBAC-Matrix authorizes access control declarations among the matrix organizations of medical institutions by using an XrML file associated with each EMR. It processes XrML rights-declaration-based authorization of behavior in a two-stage design, called the master and servant stages, thus making the associated EMR better protected. RBAC-Matrix also assigns the medical record file and its associated XrML declaration to two different EMR Authorization (EMRA) roles, namely, the medical records Document Creator (DC) and the medical records Document Right Setting (DRS). The access right setting, determined by the DRS, is cosigned by the patient, thus making the declaration of rights and the use of the EMR compliant with HIPAA specifications.
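The two-stage idea above, a rights declaration attached to each EMR that is set by the Document Right Setting role, cosigned by the patient, and then checked against the requester's role at access time, can be illustrated with a small sketch. The role names, rights vocabulary, and cosignature flag are assumptions for illustration; the actual system carries the declaration in an XrML file rather than an in-memory object.

# Hedged sketch of the access-check idea; role and right names are assumptions.
from dataclasses import dataclass, field

@dataclass
class RightsDeclaration:
    grants: dict = field(default_factory=dict)  # role -> set of rights, e.g. {"attending": {"read"}}
    patient_cosigned: bool = False              # set when the patient approves the declaration

@dataclass
class ElectronicMedicalRecord:
    record_id: str
    declaration: RightsDeclaration

def can_access(emr: ElectronicMedicalRecord, role: str, right: str) -> bool:
    """Deny everything until the declaration is cosigned; then check the role's granted rights."""
    if not emr.declaration.patient_cosigned:
        return False
    return right in emr.declaration.grants.get(role, set())

decl = RightsDeclaration(grants={"attending_physician": {"read", "annotate"}, "billing_clerk": {"read"}})
emr = ElectronicMedicalRecord("EMR-001", decl)
print(can_access(emr, "attending_physician", "read"))   # False: not yet cosigned
decl.patient_cosigned = True
print(can_access(emr, "attending_physician", "read"))   # True
print(can_access(emr, "billing_clerk", "annotate"))     # False: right not granted to this role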
The evolving story of information assurance at the DoD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, Philip LaRoche
2007-01-01
This document is a review of five documents on information assurance from the Department of Defense (DoD), namely 5200.40, 8510.1-M, 8500.1, 8500.2, and an ''interim'' document on DIACAP [9]. The five documents divide into three sets: (1) 5200.40 & 8510.1-M, (2) 8500.1 & 8500.2, and (3) the interim DIACAP document. The first two sets describe the certification and accreditation process known as ''DITSCAP''; the last two sets describe the certification and accreditation process known as ''DIACAP'' (the second set applies to both processes). Each set of documents describes (1) a process, (2) a systems classification, and (3) a measurement standard. Appendices in this report (a) list the Phases, Activities, and Tasks of DITSCAP, (b) note the discrepancies between 5200.40 and 8510.1-M concerning DITSCAP Tasks and the System Security Authorization Agreement (SSAA), (c) analyze the DIACAP constraints on role fusion and on reporting, (d) map terms shared across the documents, and (e) review three additional documents on information assurance, namely DCID 6/3, NIST 800-37, and COBIT®.
The coupling of fluids, dynamics, and controls on advanced architecture computers
NASA Technical Reports Server (NTRS)
Atwood, Christopher
1995-01-01
This grant provided for the demonstration of coupled controls, body dynamics, and fluids computations in a workstation cluster environment; and an investigation of the impact of peer-peer communication on flow solver performance and robustness. The findings of these investigations were documented in the conference articles. The attached publication, 'Towards Distributed Fluids/Controls Simulations', documents the solution and scaling of the coupled Navier-Stokes, Euler rigid-body dynamics, and state feedback control equations for a two-dimensional canard-wing. The poor scaling shown was due to serialized grid connectivity computation and Ethernet bandwidth limits. The scaling of a peer-to-peer communication flow code on an IBM SP-2 was also shown. The scaling of the code on the switched fabric-linked nodes was good, with a 2.4 percent loss due to communication of intergrid boundary point information. The code performance on 30 worker nodes was 1.7 μs/point/iteration, or a factor of three over a Cray C-90 head. The attached paper, 'Nonlinear Fluid Computations in a Distributed Environment', documents the effect of several computational rate enhancing methods on convergence. For the cases shown, the highest throughput was achieved using boundary updates at each step, with the manager process performing communication tasks only. Constrained domain decomposition of the implicit fluid equations did not degrade the convergence rate or final solution. The scaling of a coupled body/fluid dynamics problem on an Ethernet-linked cluster was also shown.
Off-Line Quality Control In Integrated Circuit Fabrication Using Experimental Design
NASA Astrophysics Data System (ADS)
Phadke, M. S.; Kackar, R. N.; Speeney, D. V.; Grieco, M. J.
1987-04-01
Off-line quality control is a systematic method of optimizing production processes and product designs. It is widely used in Japan to produce high quality products at low cost. The method was introduced to us by Professor Genichi Taguchi who is a Deming-award winner and a former Director of the Japanese Academy of Quality. In this paper we will i) describe the off-line quality control method, and ii) document our efforts to optimize the process for forming contact windows in 3.5 μm CMOS circuits fabricated in the Murray Hill Integrated Circuit Design Capability Laboratory. In the fabrication of integrated circuits it is critically important to produce contact windows of size very near the target dimension. Windows which are too small or too large lead to loss of yield. The off-line quality control method has improved both the process quality and productivity. The variance of the window size has been reduced by a factor of four. Also, processing time for window photolithography has been substantially reduced. The key steps of off-line quality control are: i) Identify important manipulatable process factors and their potential working levels. ii) Perform fractional factorial experiments on the process using orthogonal array designs. iii) Analyze the resulting data to determine the optimum operating levels of the factors. Both the process mean and the process variance are considered in this analysis. iv) Conduct an additional experiment to verify that the new factor levels indeed give an improvement.
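Step (iii) of the procedure, analyzing the orthogonal-array results to pick operating levels, can be illustrated with a small sketch. The L4(2^3) array below is a standard two-level design for three factors; the response values are invented for illustration, and a full Taguchi analysis would also examine a signal-to-noise ratio that accounts for variance, not only the mean.

```python
# A minimal sketch of orthogonal-array analysis: compute the mean response at
# each level of each factor and compare, as a basis for choosing levels.
from statistics import mean

# Columns: levels (1 or 2) of factors A, B, C for each of the four runs.
L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]
responses = [3.2, 3.8, 2.9, 3.1]   # hypothetical measured window sizes
factors = ["A", "B", "C"]

for j, name in enumerate(factors):
    for level in (1, 2):
        ys = [y for run, y in zip(L4, responses) if run[j] == level]
        print(f"factor {name} level {level}: mean response {mean(ys):.3f}")
```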
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J. (Editor)
2008-01-01
The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.
Selecting a clinical intervention documentation system for an academic setting.
Fox, Brent I; Andrus, Miranda; Hester, E Kelly; Byrd, Debbie C
2011-03-10
Pharmacists' clinical interventions have been the subject of a substantial body of literature that focuses on the process and outcomes of establishing an intervention documentation program within the acute care setting. Few reports describe intervention documentation as a component of doctor of pharmacy (PharmD) programs; none describe the process of selecting an intervention documentation application to support the complete array of pharmacy practice and experiential sites. The process that a school of pharmacy followed to select and implement a school-wide intervention system to document the clinical and financial impact of an experiential program is described. Goals included finding a tool that allowed documentation from all experiential sites and the ability to assign dollar savings (hard and soft) to all documented interventions. The paper provides guidance for other colleges and schools of pharmacy in selecting a clinical intervention documentation system for program-wide use.
Script identification from images using cluster-based templates
Hochberg, Judith G.; Kelly, Patrick M.; Thomas, Timothy R.
1998-01-01
A computer-implemented method identifies a script used to create a document. A set of training documents for each script to be identified is scanned into the computer to store a series of exemplary images representing each script. Pixels forming the exemplary images are electronically processed to define a set of textual symbols corresponding to the exemplary images. Each textual symbol is assigned to a cluster of textual symbols that most closely represents the textual symbol. The cluster of textual symbols is processed to form a representative electronic template for each cluster. A document having a script to be identified is scanned into the computer to form one or more document images representing the script to be identified. Pixels forming the document images are electronically processed to define a set of document textual symbols corresponding to the document images. The set of document textual symbols is compared to the electronic templates to identify the script.
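A simplified sketch of the cluster-template idea follows: feature vectors of training symbols for each script are grouped and averaged into templates, and a new document is assigned to the script whose templates its symbols match most often. The two-dimensional features, the crude chunk-and-average stand-in for clustering, and the voting rule are all illustrative assumptions, not the patented method's actual feature set or clustering algorithm.

```python
# A minimal sketch of script identification with cluster-based templates.
from math import dist
from collections import Counter

def centroids(vectors: list[tuple[float, ...]], k: int) -> list[tuple[float, ...]]:
    """Crude stand-in for clustering: split the data into k chunks and average."""
    step = max(1, len(vectors) // k)
    chunks = [vectors[i:i + step] for i in range(0, len(vectors), step)][:k]
    return [tuple(sum(col) / len(chunk) for col in zip(*chunk)) for chunk in chunks]

def identify_script(doc_symbols, templates_by_script):
    votes = Counter()
    for symbol in doc_symbols:
        best = min(((dist(symbol, t), script)
                    for script, temps in templates_by_script.items() for t in temps))
        votes[best[1]] += 1
    return votes.most_common(1)[0][0]

# Hypothetical 2-D symbol features for two scripts.
templates = {
    "latin": centroids([(0.1, 0.9), (0.2, 0.8), (0.15, 0.85)], k=2),
    "cyrillic": centroids([(0.8, 0.2), (0.7, 0.3), (0.75, 0.25)], k=2),
}
print(identify_script([(0.12, 0.88), (0.7, 0.28)], templates))
```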
Documentation Panels Enhance Teacher Education Programs
ERIC Educational Resources Information Center
Warash, Bobbie Gibson
2005-01-01
Documentation of children's projects is advantageous to their learning process and is also a good method for student teachers to observe the process of learning. Documentation panels are a unique way to help student teachers understand how children learn. Completing a panel requires a student teacher to think through a process. Teachers must learn…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-24
... feedwater valve isolation times to the Licensee Controlled Document that is referenced in the Bases. The... Controlled Document that is referenced in the Bases and replacing the isolation time with the phase, ``within... isolation valve times to the Licensee Controlled Document that is referenced in the Bases. The requirements...
Harold S.J. Zald; Thomas A. Spies; Manuela Huso; Demetrios Gatziolis
2012-01-01
Tree invasions have been documented throughout Northern Hemisphere high elevation meadows, as well as globally in many grass and forb-dominated ecosystems. Tree invasions are often associated with large-scale changes in climate or disturbance regimes, but are fundamentally driven by regeneration processes influenced by interactions between climatic, topographic, and...
This document is a project plan for a pilot study at the United Chrome NPL site, Corvallis, Oregon and includes the health and safety and quality assurance/quality control plans. The plan reports results of a bench-scale study of the treatment process as measured by the ...
ERIC Educational Resources Information Center
Miller, Melvin D.; Wilkins, Sandra
A third-party evaluation was designed to document the processes undertaken to implement a Memphis, Tennessee, experience-based career education (EBCE) program. It also intended to assess project effects on student outcomes. Evaluation included pre- and post-testing of a control group and experimental group of tenth grade students enrolled in the…
ERIC Educational Resources Information Center
Ahmad, Ikhlas; Vansteenkiste, Maarten; Soenens, Bart
2013-01-01
Although the effects of important parenting dimensions, such as responsiveness and psychological control, are well documented among Western populations, research has only recently begun to systematically identify psychological processes that may account for the cross-cultural generalization of these effects. A first aim of this study was to…
User's Guide for ERB 7 SEFDT. Volume 1: User's Guide. Volume 2: Quality Control Report, Year 1
NASA Technical Reports Server (NTRS)
Ray, S. N.; Tighe, R. J.; Scherrer, S. A.
1984-01-01
The Nimbus-7 ERB SEFDT Data User's Guide is presented. The guide consists of four subsections which describe: (1) the scope of the data User's Guide; (2) the background on Nimbus-7 Spacecraft and the ERB experiment; (3) the SEFDT data product and processing scenario; and (4) other related products and documents.
Cleanup Verification Package for the 118-C-1, 105-C Solid Waste Burial Ground
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. J. Appel and J. M. Capron
2007-07-25
This cleanup verification package documents completion of remedial action for the 118-C-1, 105-C Solid Waste Burial Ground. This waste site was the primary burial ground for general wastes from the operation of the 105-C Reactor and received process tubes, aluminum fuel spacers, control rods, reactor hardware, spent nuclear fuel and soft wastes.
NASA Technical Reports Server (NTRS)
Yanosy, James L.
1988-01-01
A Model Description Document for the Emulation Simulation Computer Model was already published. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem which operated with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from test. Slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is to create two system simulations using these models. The first system presented consists of one air and one water processing system. The second consists of a potential air revitalization system.
“Key to the Future”: British American Tobacco and Cigarette Smuggling in China
Lee, Kelley; Collin, Jeff
2006-01-01
Background Cigarette smuggling is a major public health issue, stimulating increased tobacco consumption and undermining tobacco control measures. China is the ultimate prize among tobacco's emerging markets, and is also believed to have the world's largest cigarette smuggling problem. Previous work has demonstrated the complicity of British American Tobacco (BAT) in this illicit trade within Asia and the former Soviet Union. Methods and Findings This paper analyses internal documents of BAT available on site from the Guildford Depository and online from the BAT Document Archive. Documents dating from the early 1900s to 2003 were searched and indexed on a specially designed project database to enable the construction of an historical narrative. Document analysis incorporated several validation techniques within a hermeneutic process. This paper describes the huge scale of this illicit trade in China, amounting to billions of (United States) dollars in sales, and the key supply routes by which it has been conducted. It examines BAT's efforts to optimise earnings by restructuring operations, and controlling the supply chain and pricing of smuggled cigarettes. Conclusions Our research shows that smuggling has been strategically critical to BAT's ongoing efforts to penetrate the Chinese market, and to its overall goal to become the leading company within an increasingly global industry. These findings support the need for concerted efforts to strengthen global collaboration to combat cigarette smuggling. PMID:16834455
Sefton, Gerri; Lane, Steven; Killen, Roger; Black, Stuart; Lyon, Max; Ampah, Pearl; Sproule, Cathryn; Loren-Gosling, Dominic; Richards, Caitlin; Spinty, Jean; Holloway, Colette; Davies, Coral; Wilson, April; Chean, Chung Shen; Carter, Bernie; Carrol, E D
2017-05-01
Pediatric Early Warning Scores are advocated to assist health professionals to identify early signs of serious illness or deterioration in hospitalized children. Scores are derived from the weighting applied to recorded vital signs and clinical observations reflecting deviation from a predetermined "norm." Higher aggregate scores trigger an escalation in care aimed at preventing critical deterioration. Process errors made while recording these data, including plotting or calculation errors, have the potential to impede the reliability of the score. To test this hypothesis, we conducted a controlled study of documentation using five clinical vignettes. We measured the accuracy of vital sign recording, score calculation, and time taken to complete documentation using a handheld electronic physiological surveillance system, VitalPAC Pediatric, compared with traditional paper-based charts. We explored the user acceptability of both methods using a Web-based survey. Twenty-three staff participated in the controlled study. The electronic physiological surveillance system improved the accuracy of vital sign recording, 98.5% versus 85.6%, P < .02, Pediatric Early Warning Score calculation, 94.6% versus 55.7%, P < .02, and saved time, 68 versus 98 seconds, compared with paper-based documentation, P < .002. Twenty-nine staff completed the Web-based survey. They perceived that the electronic physiological surveillance system offered safety benefits by reducing human error while providing instant visibility of recorded data to the entire clinical team.
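To make the scoring and escalation logic concrete, the sketch below sums per-vital-sign deviation scores into an aggregate and compares it with a trigger threshold. The normal bands, weighting rule, and trigger value are invented placeholders and do not correspond to VitalPAC Pediatric or to any validated Pediatric Early Warning Score chart.

```python
# A minimal sketch of an aggregate early-warning score derived from vital signs:
# each observation is scored by how far it deviates from a predefined "normal"
# band, and the scores are summed against an escalation trigger.

# (low, high) normal bands per vital sign for one hypothetical age group.
NORMAL_BANDS = {"heart_rate": (80, 140), "resp_rate": (20, 40), "sats": (94, 100)}

def score_observation(name: str, value: float) -> int:
    low, high = NORMAL_BANDS[name]
    if low <= value <= high:
        return 0
    # One point per "band width" outside the normal range, capped at 3.
    width = (high - low) / 2
    deviation = (low - value) if value < low else (value - high)
    return min(3, 1 + int(deviation // width))

def pews(observations: dict[str, float], trigger: int = 4) -> tuple[int, bool]:
    total = sum(score_observation(k, v) for k, v in observations.items())
    return total, total >= trigger

print(pews({"heart_rate": 172, "resp_rate": 52, "sats": 91}))
```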
Methods and apparatuses for information analysis on shared and distributed computing systems
Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA
2011-02-22
Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
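The local-to-global pattern described above maps naturally onto a pool of worker processes: each worker counts terms over its own set of documents, and the local counts are then merged into global statistics from which a major term set is drawn. The sketch below, with simplified whitespace tokenization and a made-up notion of "major term" as the most frequent terms, illustrates that pattern rather than the patented apparatus.

```python
# A minimal sketch of local term statistics computed per process, merged into
# a global set of statistics, from which a small "major term set" is selected.
from collections import Counter
from multiprocessing import Pool

def local_term_stats(documents: list[str]) -> Counter:
    counts = Counter()
    for doc in documents:
        counts.update(doc.lower().split())
    return counts

def major_terms(doc_sets: list[list[str]], top_n: int = 5) -> list[str]:
    with Pool() as pool:
        local_sets = pool.map(local_term_stats, doc_sets)   # one Counter per worker
    global_stats = Counter()
    for local in local_sets:
        global_stats += local                               # contribute to global stats
    return [term for term, _ in global_stats.most_common(top_n)]

if __name__ == "__main__":
    sets = [["control of the process", "process control charts"],
            ["document control and process documentation"]]
    print(major_terms(sets))
```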
The centrality of meta-programming in the ES-DOC eco-system
NASA Astrophysics Data System (ADS)
Greenslade, Mark
2017-04-01
The Earth System Documentation (ES-DOC) project is an international effort aiming to deliver a robust earth system model inter-comparison project documentation infrastructure. Such infrastructure both simplifies & standardizes the process of documenting (in detail) projects, experiments, models, forcings & simulations. In support of CMIP6, ES-DOC has upgraded its eco-system of tools, web-services & web-sites. The upgrade consolidates the existing infrastructure (built for CMIP5) and extends it with the introduction of new capabilities. The strategic focus of the upgrade is improvements in the documentation experience and broadening the range of scientific use-cases that the archived documentation may help deliver. Whether it is highlighting dataset errors, exploring experimental protocols, comparing forcings across ensemble runs, understanding MIP objectives, reviewing citations, exploring component properties of configured models, visualising inter-model relationships, scientists involved in CMIP6 will find the ES-DOC infrastructure helpful. This presentation underlines the centrality of meta-programming within the ES-DOC eco-system. We will demonstrate how agility is greatly enhanced by taking a meta-programming approach to representing data models and controlled vocabularies. Such an approach nicely decouples representations from encodings. Meta-models will be presented along with the associated tooling chain that forward engineers artefacts as diverse as: class hierarchies, IPython notebooks, mindmaps, configuration files, OWL & SKOS documents, spreadsheets …etc.
Content-based retrieval of historical Ottoman documents stored as textual images.
Saykol, Ediz; Sinop, Ali Kemal; Güdükbay, Ugur; Ulusoy, Ozgür; Cetin, A Enis
2004-03-01
There is an accelerating demand to access the visual content of documents stored in historical and cultural archives. The availability of electronic imaging tools and effective image processing techniques makes it feasible to process the multimedia data in large databases. In this paper, a framework for content-based retrieval of historical documents in the Ottoman Empire archives is presented. The documents are stored as textual images, which are compressed by constructing a library of symbols occurring in a document; the symbols in the original image are then replaced with pointers into the codebook to obtain a compressed representation of the image. Features in the wavelet and spatial domains, based on the angular and distance span of shapes, are used to extract the symbols. To enable content-based retrieval in historical archives, a query is specified as a rectangular region in an input image and the same symbol-extraction process is applied to the query region. The queries are processed on the codebook of documents, and the query images are identified in the resulting documents using the pointers in the textual images. The querying process does not require decompression of the images. The new content-based retrieval framework is also applicable to many other document archives using different scripts.
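The pointer-into-codebook representation can be sketched very simply: each stored document is a list of indices into a shared codebook of symbol features, and a query region is answered by mapping its symbols to their nearest codebook entries and returning documents that contain those indices. The feature vectors, the nearest-neighbour rule, and the containment test below are illustrative assumptions, not the wavelet/spatial features or matching criteria of the actual framework.

```python
# A minimal sketch of querying a symbol codebook without decompressing images.
from math import dist

codebook = [(0.1, 0.2), (0.8, 0.9), (0.4, 0.5)]          # symbol feature vectors
documents = {"doc1": [0, 2, 0, 1], "doc2": [1, 1, 2]}     # pointers into the codebook

def nearest_symbol(feature: tuple[float, float]) -> int:
    return min(range(len(codebook)), key=lambda i: dist(codebook[i], feature))

def query(region_features: list[tuple[float, float]]) -> list[str]:
    wanted = {nearest_symbol(f) for f in region_features}
    return [name for name, pointers in documents.items() if wanted <= set(pointers)]

# A query region containing two symbols similar to codebook entries 0 and 2.
print(query([(0.12, 0.18), (0.41, 0.52)]))   # -> ['doc1']
```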
Krebs, Erin E; Bair, Matthew J; Carey, Timothy S; Weinberger, Morris
2010-03-01
Researchers and quality improvement advocates sometimes use review of chart-documented pain care processes to assess the quality of pain management. Studies have found that primary care providers frequently fail to document pain assessment and management. The objective of this study was to assess documentation of pain care processes in an academic primary care clinic and to evaluate the validity of this documentation as a measure of pain care delivered. This prospective observational study included 237 adult patients at a university-affiliated internal medicine clinic who reported any pain in the last week. Immediately after a visit, we asked patients to report the pain treatment they received. Patients completed the Brief Pain Inventory (BPI) to assess pain severity at baseline and 1 month later. We extracted documentation of pain care processes from the medical record and used kappa statistics to assess agreement between documentation and patient report of pain treatment. Using multivariable linear regression, we modeled whether documented or patient-reported pain care predicted change in pain at 1 month. Participants' mean age was 53.7 years, 66% were female, and 74% had chronic pain. Physicians documented pain assessment for 83% of visits. Patients reported receiving pain treatment more often (67%) than was documented by physicians (54%). Agreement between documentation and patient report was moderate for receiving a new pain medication (k = 0.50) and slight for receiving pain management advice (k = 0.13). In multivariable models, documentation of new pain treatment was not associated with change in pain (p = 0.134). In contrast, patient-reported receipt of new pain treatment predicted pain improvement (p = 0.005). Chart documentation underestimated pain care delivered, compared with patient report. Documented pain care processes had no relationship with pain outcomes at 1 month, but patient report of receiving care predicted clinically significant improvement. Chart review measures may not accurately reflect the pain management patients receive in primary care.
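The agreement measure used above, the kappa statistic, corrects raw agreement between two binary ratings (here, chart documentation versus patient report of a new pain treatment) for agreement expected by chance. A minimal sketch follows; the example vectors are invented and are not the study data.

```python
# A minimal sketch of Cohen's kappa for two binary raters.
def cohens_kappa(rater_a: list[int], rater_b: list[int]) -> float:
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from the marginal proportions of "yes" and "no".
    p_a_yes, p_b_yes = sum(rater_a) / n, sum(rater_b) / n
    expected = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)
    return (observed - expected) / (1 - expected)

chart   = [1, 0, 1, 1, 0, 0, 1, 0]   # documented new pain treatment
patient = [1, 0, 0, 1, 1, 0, 1, 0]   # patient-reported new pain treatment
print(round(cohens_kappa(chart, patient), 2))
```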
NASA information resources management handbook
NASA Technical Reports Server (NTRS)
1992-01-01
This National Aeronautics and Space Administration (NASA) Handbook (NHB) implements recent changes to Federal laws and regulations involving the acquisition, management, and use of Federal Information Processing (FIP) resources. This document defines NASA's Information Resources Management (IRM) practices and procedures and is applicable to all NASA personnel. The dynamic nature of the IRM environment requires that the controlling management practices and procedures for an Agency at the leading edge of technology, such as NASA, must be periodically updated to reflect the changes in this environment. This revision has been undertaken to accommodate changes in the technology and the impact of new laws and regulations dealing with IRM. The contents of this document will be subject to a complete review annually to determine its continued applicability to the acquisition, management, and use of FIP resources by NASA. Updates to this document will be accomplished by page changes. This revision cancels NHB 2410.1D, dated April 1985.
NASA Technical Reports Server (NTRS)
Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell
1991-01-01
The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.
Beall, Robert M.
1974-01-01
Urban water planning, development, and management are many sectored, costly efforts, subject to a multitude of controls and demands including those imposed by nature. One primary concern in development is for providing a dependable and safe water supply. In spite of a bountiful natural availability, the process of satisfying consumer needs involves the resolution of a variety of problems, not the least of which are cooperation and coordination among suppliers. One of the fundamental requisites in seeking sound solutions to developmental and environmental problems is inventory documentation. This map is one facet of documentation; the data listing, given on sheet 2, is the companion inventory. These supplement State, regional, and local efforts directed toward both long-range planning and current evaluation programs. Such documentation also assists the assessment of the effect of one water-management subsystem on hydrologic characteristics.
PIMS-Universal Payload Information Management
NASA Technical Reports Server (NTRS)
Elmore, Ralph; McNair, Ann R. (Technical Monitor)
2002-01-01
As the overall manager and integrator of International Space Station (ISS) science payloads and experiments, the Payload Operations Integration Center (POIC) at Marshall Space Flight Center had a critical need to provide an information management system for exchange and management of ISS payload files as well as to coordinate ISS payload related operational changes. The POIC's information management system has a fundamental requirement to provide secure operational access not only to users physically located at the POIC, but also to provide collaborative access to remote experimenters and International Partners. The Payload Information Management System (PIMS) is a ground based electronic document configuration management and workflow system that was built to service that need. Functionally, PIMS provides the following document management related capabilities: 1. File access control, storage and retrieval from a central repository vault. 2. Collection of supplemental data about files in the vault. 3. File exchange with a PIMS GUI client, or any FTP connection. 4. Placement of files into an FTP-accessible dropbox for pickup by interfacing facilities, including files transmitted for spacecraft uplink. 5. Transmission of email messages to users notifying them of new version availability. 6. Polling of intermediate facility dropboxes for files that will automatically be processed by PIMS. 7. An API that allows other POIC applications to access PIMS information. Functionally, PIMS provides the following Change Request processing capabilities: 1. The ability to create, view, manipulate, and query information about Operations Change Requests (OCRs). 2. An adaptable workflow approval of OCRs with routing through developers, facility leads, POIC leads, reviewers, and implementers. Email messages can be sent to users either involving them in the workflow process or simply notifying them of OCR approval progress. All PIMS document management and OCR workflow controls are coordinated through and routed to individual users' "to do" list tasks. A user is given a task when it is their turn to perform some action relating to the approval of the document or OCR. The user's available actions are restricted to only the functions available for the assigned task. Certain actions, such as review or action implementation by non-PIMS users, can also be coordinated through automated emails.
British American Tobacco's tactics during China's accession to the World Trade Organization
Zhong, Fei; Yano, Eiji
2007-01-01
Background China entered the World Trade Organization (WTO) in 2001 after years of negotiations. As a WTO member, China had to reduce tariffs on imported cigarettes and remove non‐tariff barriers to allow foreign cigarettes to be more competitive in the Chinese market. Among foreign tobacco companies, British American Tobacco (BAT) was the most active lobbyist during China's WTO negotiations. Objective To review and analyse BAT's tactics and activities relating to China's entry into the WTO. Methods Internal tobacco industry documents were reviewed and are featured here. Industry documents were searched mainly on the website of BAT's Guildford Depository and other documents' websites. 528 documents were evaluated and 142 were determined to be relevant to China's entry into the WTO. Results BAT was extremely active during the progress of China's entry into the WTO. The company focused its lobbying efforts on two main players in the negotiations: the European Union (EU) and the US. Because of the negative moral and health issues related to tobacco, BAT did not seek public support from officials associated with the WTO negotiations. Instead, BAT lobbyists suggested that officials protect the interests of BAT by presenting the company's needs as similar to those of all European companies. During the negotiation process, BAT officials repeatedly spoke favourably of China's accession into the WTO, with the aim of presenting BAT as a facilitator in this process and of gaining preferential treatment from their Chinese competitor. Conclusions BAT's activities clearly suggest that tobacco companies place their own interests above public health interests. Today, China struggles with issues of tobacco control that are aggravated by the aggressive practices of transnational tobacco companies, tobacco‐tariff reductions and the huge number of smokers. For the tobacco‐control movement to progress in China, health advocates must understand how foreign tobacco companies have undermined anti‐tobacco activities by taking advantage of trade liberalisation policies. China should attach importance to public health and comprehensive tobacco‐control policies and guarantee strong protection measures from national and international tobacco interests supported by international trade agreements. PMID:17400952
Simplifying operations with an uplink/downlink integration toolkit
NASA Technical Reports Server (NTRS)
Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine
1994-01-01
The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.
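The pairing of predicted uplink events with their downlinked telemetry, independent of light-time delay, can be sketched as an index from telemetry verification points to SOE entries that accumulates observed values as they arrive. The field names, mnemonics, and values below are invented; they do not reflect the actual SOE format or telemetry channel definitions.

```python
# A minimal sketch of "closing the loop": attach downlinked telemetry values
# to the SOE entries that predicted them, whenever those values arrive.
from dataclasses import dataclass, field

@dataclass
class SoeEntry:
    time: str
    command: str
    verify_point: str                 # telemetry mnemonic expected to confirm it
    observed: list[float] = field(default_factory=list)

soe = [SoeEntry("001T12:00", "HTR_ON", "TMP_01"),
       SoeEntry("001T12:30", "CAM_PWR", "CAM_V")]
index = {e.verify_point: e for e in soe}

def on_telemetry(mnemonic: str, value: float) -> None:
    """Attach a downlinked value to the SOE entry that predicted it (if any)."""
    entry = index.get(mnemonic)
    if entry is not None:
        entry.observed.append(value)

for mnemonic, value in [("TMP_01", 21.4), ("CAM_V", 27.9), ("TMP_01", 23.0)]:
    on_telemetry(mnemonic, value)
print([(e.command, e.observed) for e in soe])
```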
Independent Orbiter Assessment (IOA): Analysis of the hydraulics/water spray boiler subsystem
NASA Technical Reports Server (NTRS)
Duval, J. D.; Davidson, W. R.; Parkman, William E.
1986-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items (PCIs). To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results for the Orbiter Hydraulics/Water Spray Boiler Subsystem. The hydraulic system provides hydraulic power to gimbal the main engines, actuate the main engine propellant control valves, move the aerodynamic flight control surfaces, lower the landing gear, apply wheel brakes, steer the nosewheel, and dampen the external tank (ET) separation. Each hydraulic system has an associated water spray boiler which is used to cool the hydraulic fluid and APU lubricating oil. The IOA analysis process utilized available HYD/WSB hardware drawings, schematics and documents for defining hardware assemblies, components, and hardware items. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode. Of the 430 failure modes analyzed, 166 were determined to be PCIs.
Independent Orbiter Assessment (IOA): Analysis of the remote manipulator system
NASA Technical Reports Server (NTRS)
Tangorra, F.; Grasmeder, R. F.; Montgomery, A. D.
1987-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items (PCIs). To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The independent analysis results for the Orbiter Remote Manipulator System (RMS) are documented. The RMS hardware and software are primarily required for deploying and/or retrieving up to five payloads during a single mission, capture and retrieve free-flying payloads, and for performing Manipulator Foot Restraint operations. Specifically, the RMS hardware consists of the following components: end effector; displays and controls; manipulator controller interface unit; arm based electronics; and the arm. The IOA analysis process utilized available RMS hardware drawings, schematics and documents for defining hardware assemblies, components and hardware items. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode. Of the 574 failure modes analyzed, 413 were determined to be PCIs.
NASA Technical Reports Server (NTRS)
Robinson, W. W.
1987-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the Electrical Power Distribution and Control (EPD and C)/Remote Manipulator System (RMS) hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained in the NASA FMEA/CIL documentation. This report documents the results of the independent analysis of the EPD and C/RMS (both port and starboard) hardware. The EPD and C/RMS subsystem hardware provides the electrical power and power control circuitry required to safely deploy, operate, control, and stow or guillotine and jettison two (one port and one starboard) RMSs. The EPD and C/RMS subsystem is subdivided into the following functional divisions: Remote Manipulator Arm; Manipulator Deploy Control; Manipulator Latch Control; Manipulator Arm Shoulder Jettison; and Retention Arm Jettison. The IOA analysis process utilized available EPD and C/RMS hardware drawings and schematics for defining hardware assemblies, components, and hardware items. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based on the severity of the effect for each failure mode.
Takeda, Toshihiro; Ueda, Kanayo; Nakagawa, Akito; Manabe, Shirou; Okada, Katsuki; Mihara, Naoki; Matsumura, Yasushi
2017-01-01
Electronic health record (EHR) systems are necessary for the sharing of medical information between care delivery organizations (CDOs). We developed a document-based EHR system in which all of the PDF documents that are stored in our electronic medical record system can be disclosed to selected target CDOs. An access control list (ACL) file was designed based on the HL7 CDA header to manage the information that is disclosed.
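A minimal sketch of the disclosure decision follows: an access-control list keyed on fields that a CDA header can carry (a patient identifier and a document type) determines whether a target care delivery organization may view a stored document. The field names and layout are assumptions for illustration, not the HL7 CDA schema or the system's actual ACL file format.

```python
# A minimal sketch of an ACL check for disclosing a stored PDF to a target CDO.
from dataclasses import dataclass

@dataclass(frozen=True)
class AclEntry:
    patient_id: str
    document_type: str     # e.g. discharge summary, lab report
    allowed_org: str       # care delivery organization permitted to view

acl = [
    AclEntry("P001", "discharge_summary", "city_clinic"),
    AclEntry("P001", "lab_report", "regional_hospital"),
]

def may_disclose(patient_id: str, document_type: str, target_org: str) -> bool:
    return any(e.patient_id == patient_id and e.document_type == document_type
               and e.allowed_org == target_org for e in acl)

print(may_disclose("P001", "discharge_summary", "city_clinic"))   # True
print(may_disclose("P001", "discharge_summary", "other_org"))     # False
```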
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Osborne, Richard N.
2005-01-01
The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.
Bar-code medication administration system for anesthetics: effects on documentation and billing.
Nolen, Agatha L; Rodes, W Dyer
2008-04-01
The effects of using a new bar-code medication administration (BCMA) system for anesthetics to automate documentation of drug administration by anesthesiologists were studied. From October 1, 2004, to September 15, 2005, all medications administered to patients undergoing cardiac surgery were documented with a BCMA system at a large acute care facility. Drug claims data for 12 targeted anesthetics in diagnosis-related groups (DRGs) 104-111 were analyzed to determine the quantity of drugs charged and the revenue generated. Those data were compared with claims data for a historical case-control group (October 1, 2003, to September 15, 2004, for the same DRGs) for which medication use was documented manually. From October 1, 2005, to October 1, 2006, anesthesiologists for cardiac surgeries either voluntarily used the automated system or completed anesthesia records manually. A total of 870 cardiac surgery cases for which the BCMA system was used were evaluated. There were 961 cardiac surgery cases in the historical control group. The BCMA system increased the quantity of drugs documented per case by 21.7% and drug revenue captured per case by 18.8%. The time needed by operating-room pharmacy staff to process an anesthesia record for billing decreased by eight minutes per case. After two years, anesthesiologists voluntarily used the new technology on 100% of cardiac surgery patients. Implementation of a BCMA system for anesthetic use in cardiac surgery increased the quantity of drugs charged by 21.7% per case and drug revenue per case by 18.8%. Anesthesiologists continued to use the automated system on a voluntary basis after conclusion of the initial study.
Feng, Haixia; Li, Guohong; Xu, Cuirong; Ju, Changping; Suo, Peiheng
2017-12-01
The aim of the study was to analyse the influence of prevention measures on pressure injuries for high-risk patients and to establish the most appropriate methods of implementation. Nurses assessed patients using a checklist, and factors influencing the prevention of a pressure injury were determined by brainstorming. A specific series of measures was drawn up, and the risk of pressure injury (Braden Scale), nursing documentation, implementation of prevention measures for pressure sores, and awareness of the system were evaluated both before and after carrying out a quality control circle (QCC) process. The overall scores of implementation of prevention measures ranged from 74.86 ± 14.24 to 87.06 ± 17.04, a result that was statistically significant (P < 0.0025). The Braden Scale scores ranged from 8.53 ± 3.21 to 13.48 ± 3.57. The nursing document scores ranged from 7.67 ± 3.98 to 10.12 ± 1.63; prevention measure scores ranged from 11.48 ± 4.18 to 13.96 ± 3.92. Differences in all of the above results are statistically significant (P < 0.05). Implementation of a QCC can standardise and improve the prevention measures for patients who are vulnerable to pressure sores and is of practical importance to their prevention and control.
Cross-reference identification within a PDF document
NASA Astrophysics Data System (ADS)
Li, Sida; Gao, Liangcai; Tang, Zhi; Yu, Yinyan
2015-01-01
Cross-references, such as footnotes, endnotes, figure/table captions, and bibliographic references, are a common and useful type of page element that further explains a corresponding entity in the target document. In this paper, we focus on cross-reference identification in a PDF document and present a robust method as a case study of identifying footnotes and figure references. The proposed method first extracts footnotes and figure captions, and then matches them with their corresponding references within a document. A number of novel features within a PDF document, i.e., page layout, font information, and lexical and linguistic features of cross-references, are utilized for the task. Clustering is adopted to handle features that are stable within one document but vary across different kinds of documents, so that the identification process adapts to document types. In addition, the method leverages results from the matching process to provide feedback to the identification process and further improve accuracy. Preliminary experiments on real document sets show that the proposed method is promising for identifying cross-references in a PDF document.
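One sub-task of the method, matching extracted footnotes to their in-text markers, can be illustrated with a short sketch that pairs numeric markers in the body text with numbered footnote lines. Real systems would also use font size, page position, and linguistic cues; the marker syntax, regular expressions, and sample page text here are invented for illustration.

```python
# A minimal sketch of matching footnote markers in body text to footnote lines.
import re

def match_footnotes(body_text: str, footnote_lines: list[str]) -> dict[str, str]:
    markers = re.findall(r"\[(\d+)\]", body_text)             # e.g. "... results[3] ..."
    footnotes = {}
    for line in footnote_lines:
        m = re.match(r"\s*(\d+)[.)]\s+(.*)", line)            # e.g. "3. See appendix B."
        if m:
            footnotes[m.group(1)] = m.group(2)
    return {marker: footnotes[marker] for marker in markers if marker in footnotes}

body = "The method improves recall[1] and was later extended[2]."
notes = ["1. First reported in the 2009 evaluation.", "2) Extension to color scans."]
print(match_footnotes(body, notes))
```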
Using color management in color document processing
NASA Astrophysics Data System (ADS)
Nehab, Smadar
1995-04-01
Color Management Systems have been used for several years in Desktop Publishing (DTP) environments. While this development hasn't matured yet, we are already experiencing the next generation of the color imaging revolution: Device Independent Color for the small office/home office (SOHO) environment. Though there are still open technical issues with device independent color matching, they are not the focal point of this paper. This paper discusses two new and crucial aspects of using color management in color document processing: the management of color objects and their associated color rendering methods, and a proposal for a precedence order and handshaking protocol among the various software components involved in color document processing. As color peripherals become affordable to the SOHO market, color management also becomes a prerequisite for common document authoring applications such as word processors. The first color management solutions were oriented towards DTP environments, whose requirements are largely different. For example, DTP documents are image-centric, as opposed to SOHO documents, which are text- and chart-centric. To achieve optimal reproduction on low-cost SOHO peripherals, it is critical that different color rendering methods are used for the different document object types. The first challenge in using color management in color document processing is the association of rendering methods with object types. As a result of an evolutionary process, color matching solutions are now available as application software, as driver-embedded software, and as operating system extensions. Consequently, document processing faces a new challenge: the correct selection of the color matching solution while avoiding duplicate color corrections.
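The association of rendering methods with object types can be made concrete with a small sketch that maps each document object type to a color-rendering intent. The intent names follow common ICC usage, but the mapping, the fallback choice, and the dispatch mechanism are illustrative design choices rather than a specific color management API.

```python
# A minimal sketch of associating color-rendering methods with object types.
from enum import Enum

class Intent(Enum):
    PERCEPTUAL = "perceptual"            # photographic images
    SATURATION = "saturation"            # charts and business graphics
    RELATIVE_COLORIMETRIC = "relative"   # text and spot colors

RENDERING_BY_TYPE = {
    "image": Intent.PERCEPTUAL,
    "chart": Intent.SATURATION,
    "text": Intent.RELATIVE_COLORIMETRIC,
}

def rendering_intent(object_type: str) -> Intent:
    # Fall back to perceptual rendering for unknown object types.
    return RENDERING_BY_TYPE.get(object_type, Intent.PERCEPTUAL)

for obj in ("text", "image", "chart", "unknown"):
    print(obj, "->", rendering_intent(obj).value)
```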
10 CFR 850 Implementation of Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S
2012-01-05
10 CFR 850 defines a contractor as any entity, including affiliated entities, such as a parent corporation, under contract with DOE, including a subcontractor at any tier, with responsibility for performing work at a DOE site in furtherance of a DOE mission. The Chronic Beryllium Disease Prevention Program (CBDPP) applies to beryllium-related activities that are performed at the Lawrence Livermore National Laboratory (LLNL). The CBDPP or Beryllium Safety Program is integrated into the LLNL Worker Safety and Health Program and, thus, implementation documents and responsibilities are integrated in various documents and organizational structures. Program development and management of the CBDPP is delegated to the Environment, Safety and Health (ES&H) Directorate, Worker Safety and Health Functional Area. As per 10 CFR 850, Lawrence Livermore National Security, LLC (LLNS) periodically submits a CBDPP to the National Nuclear Security Administration/Livermore Site Office (NNSA/LSO). The requirements of this plan are communicated to LLNS workers through ES&H Manual Document 14.4, 'Working Safely with Beryllium.' 10 CFR 850 is implemented by the LLNL CBDPP, which integrates the safety and health standards required by the regulation and components of the LLNL Integrated Safety Management System (ISMS), and incorporates other components of the LLNL ES&H Program. As described in the regulation, and to fully comply with the regulation, specific portions of existing programs and additional requirements are identified in the CBDPP. The CBDPP is implemented by documents that interface with the workers, principally through ES&H Manual Document 14.4. This document contains information on how the management practices prescribed by the LLNL ISMS are implemented, how beryllium hazards that are associated with LLNL work activities are controlled, and who is responsible for implementing the controls. Adherence to the requirements and processes described in the ES&H Manual ensures that ES&H practices across LLNL are developed in a consistent manner. Other implementing documents, such as the ES&H Manual, are integral in effectively implementing 10 CFR 850.
Design of a modular digital computer system, CDRL no. D001, final design plan
NASA Technical Reports Server (NTRS)
Easton, R. A.
1975-01-01
The engineering breadboard implementation for the CDRL no. D001 modular digital computer system developed during design of the logic system was documented. This effort followed the architecture study completed and documented previously, and was intended to verify the concepts of a fault tolerant, automatically reconfigurable, modular version of the computer system conceived during the architecture study. The system has a microprogrammed 32 bit word length, general register architecture and an instruction set consisting of a subset of the IBM System 360 instruction set plus additional fault tolerance firmware. The following areas were covered: breadboard packaging, central control element, central processing element, memory, input/output processor, and maintenance/status panel and electronics.
The TMIS life-cycle process document, revision A
NASA Technical Reports Server (NTRS)
1991-01-01
The Technical and Management Information System (TMIS) Life-Cycle Process Document describes the processes that shall be followed in the definition, design, development, test, deployment, and operation of all TMIS products and data base applications. This document is a roll out of TMIS Standards Document (SSP 30546). The purpose of this document is to define the life cycle methodology that the developers of all products and data base applications and any subsequent modifications shall follow. Included in this methodology are descriptions of the tasks, deliverables, reviews, and approvals that are required before a product or data base application is accepted in the TMIS environment.
Automated data acquisition technology development: Automated modeling and control development
NASA Technical Reports Server (NTRS)
Romine, Peter L.
1995-01-01
This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rackmounted PCs. This research was initiated because a need was identified by the Metal Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and data analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.
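As one hedged illustration of the kind of analysis described above (deriving a voltage-versus-arc-length relationship from collected samples), a least-squares line fit could be computed as follows; the numbers are placeholders rather than VPPA welding measurements, and the linear form is an assumption.

# Hedged sketch: fit a linear voltage-vs-arc-length model from sampled data.
# The values below are made-up placeholders, not measurements from the WMS.
import numpy as np

arc_length_mm = np.array([2.0, 2.5, 3.0, 3.5, 4.0])       # hypothetical arc lengths
arc_voltage_v = np.array([21.3, 22.1, 23.0, 23.8, 24.7])   # hypothetical voltages

slope, intercept = np.polyfit(arc_length_mm, arc_voltage_v, 1)
print(f"V ~= {slope:.2f} * L + {intercept:.2f}")  # linear model usable by a controller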
Building new hospitals: a UK infection control perspective.
Stockley, J M; Constantine, C E; Orr, K E
2006-03-01
Infection control input is vital throughout the planning, design and building stages of a new hospital project, and must continue through the commissioning (and decommissioning) process, evaluation and putting the facility into full clinical service. Many hospitals continue to experience problems months or years after occupying the new premises; some of these could have been avoided by infection control involvement earlier in the project. The importance of infection control must be recognized by the chief executive of the hospital trust and project teams overseeing the development. Clinical user groups and contractors must also be made aware of infection control issues. It is vital that good working relationships are built up between the infection control team (ICT) and all these parties. ICTs need the authority to influence the process. This may require their specific recognition by the Private Finance Initiative National Unit, the Department of Health or other relevant authorities. ICTs need training in how to read design plans, how to write effective specifications, and in other areas with which they may be unfamiliar. The importance of documentation and record keeping is paramount. External or independent validation of processes should be available, particularly in commissioning processes. Building design in relation to infection control needs stricter national regulations, allowing ICTs to focus on more local usage issues. Further research is needed to provide evidence regarding the relationship between building design and the prevalence of infection.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les
1991-01-01
The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.
A Dedicated Microprocessor Controller for a Bound Document Scanner.
1984-06-01
...focused onto the CCD, which converts the image into 2048 pixels. After the pixel data are processed by the scanner hardware, they are sent to the display... in real time after the data on each of the 2048 pixel elements has been transferred out of the device. Display-control commands and... Fig. 4.9: 2716 EPROM block diagram and pin assignment.
Development of a Software Safety Process and a Case Study of Its Use
NASA Technical Reports Server (NTRS)
Knight, J. C.
1996-01-01
Research in the year covered by this reporting period has been primarily directed toward: continued development of mock-ups of computer screens for the operator of a digital reactor control system; development of a reactor simulation to permit testing of various elements of the control system; formal specification of user interfaces; fault-tree analysis including software; evaluation of formal verification techniques; and continued development of a software documentation system. Technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers.
The control of float zone interfaces by the use of selected boundary conditions
NASA Technical Reports Server (NTRS)
Foster, L. M.; Mcintosh, J.
1983-01-01
The main goal of the float zone crystal growth project of NASA's Materials Processing in Space Program is to thoroughly understand the molten zone/freezing crystal system and all the mechanisms that govern this system. The surface boundary conditions required to give flat float zone solid melt interfaces were studied and computed. The results provide float zone furnace designers with better methods for controlling solid melt interface shapes and for computing thermal profiles and gradients. Documentation and a user's guide were provided for the computer software.
Documenting an ISO 9000 Quality System.
ERIC Educational Resources Information Center
Fisher, Barry
1995-01-01
Discusses six steps to follow when documenting an ISO 9000 quality system: using ISO 9000 to develop a quality system, identifying a company's business processes, analyzing the business processes, describing the procedures, writing the quality manual, and working to the documented procedures. (SR)
TECHNICAL GUIDANCE DOCUMENT: QUALITY ASSURANCE AND QUALITY CONTROL FOR WASTE CONTAINMENT FACILITIES
This Technical Guidance Document provides comprehensive guidance on procedures for quality assurance and quality control for waste containment facilities. The document includes a discussion of principles and concepts, compacted soil liners, soil drainage systems, geosynthetic drai...
Step-and-Repeat Nanoimprint-, Photo- and Laser Lithography from One Customised CNC Machine.
Greer, Andrew Im; Della-Rosa, Benoit; Khokhar, Ali Z; Gadegaard, Nikolaj
2016-12-01
The conversion of a computer numerical control machine into a nanoimprint step-and-repeat tool with additional laser- and photolithography capacity is documented here. All three processes, each demonstrated on a variety of photoresists, are performed successfully and analysed so as to enable the reader to relate their known lithography process(es) to the findings. Using the converted tool, 1 cm(2) of nanopattern may be exposed in 6 s, over 3300 times faster than the electron beam equivalent. Nanoimprint tools are commercially available, but these can cost around 1000 times more than this customised computer numerical control (CNC) machine. The converted equipment facilitates rapid production and large area micro- and nanoscale research on small grants, ultimately enabling faster and more diverse growth in this field of science. In comparison to commercial tools, this converted CNC also boasts capacity to handle larger substrates, temperature control and active force control, up to ten times more curing dose and compactness. Actual devices are fabricated using the machine including an expanded nanotopographic array and microfluidic PDMS Y-channel mixers.
Step-and-Repeat Nanoimprint-, Photo- and Laser Lithography from One Customised CNC Machine
NASA Astrophysics Data System (ADS)
Greer, Andrew IM; Della-Rosa, Benoit; Khokhar, Ali Z.; Gadegaard, Nikolaj
2016-03-01
The conversion of a computer numerical control machine into a nanoimprint step-and-repeat tool with additional laser- and photolithography capacity is documented here. All three processes, each demonstrated on a variety of photoresists, are performed successfully and analysed so as to enable the reader to relate their known lithography process(es) to the findings. Using the converted tool, 1 cm2 of nanopattern may be exposed in 6 s, over 3300 times faster than the electron beam equivalent. Nanoimprint tools are commercially available, but these can cost around 1000 times more than this customised computer numerical control (CNC) machine. The converted equipment facilitates rapid production and large area micro- and nanoscale research on small grants, ultimately enabling faster and more diverse growth in this field of science. In comparison to commercial tools, this converted CNC also boasts capacity to handle larger substrates, temperature control and active force control, up to ten times more curing dose and compactness. Actual devices are fabricated using the machine including an expanded nanotopographic array and microfluidic PDMS Y-channel mixers.
Dynamic reduction of dimensions of a document vector in a document search and retrieval system
Jiao, Yu; Potok, Thomas E.
2011-05-03
The method and system of the invention involves processing each new document (20) coming into the system into a document vector (16), and creating a document vector with reduced dimensionality (17) for comparison with the data model (15) without recomputing the data model (15). These operations are carried out by a first computer (11) while a second computer (12) updates the data model (18), which can be comprised of an initial large group of documents (19) and is premised on computing an initial data model (13, 14, 15) to provide a reference point for determining document vectors from documents processed from the data stream (20).
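The general idea of reducing a new document's vector with a fixed projection, so that the existing data model need not be recomputed, can be sketched as below; this is a generic random-projection illustration under assumed vocabulary and dimensions, not the patented method itself.

# Generic illustration: project a new document's term vector into a fixed
# low-dimensional space so it can be compared against an existing model
# without recomputing that model. Vocabulary and dimensions are assumptions.
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["document", "vector", "search", "retrieval", "model", "data"]
PROJECTION = rng.standard_normal((len(VOCAB), 3))  # fixed once, reused for every new document

def document_vector(text):
    counts = np.array([text.lower().split().count(term) for term in VOCAB], dtype=float)
    return counts

def reduced_vector(text):
    return document_vector(text) @ PROJECTION  # reduced dimensionality, model untouched

print(reduced_vector("document search and retrieval of a document vector"))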
Code of Federal Regulations, 2011 CFR
2011-04-01
... of proceeding and scoping document, or of approval to use traditional licensing process or... PROCESS § 5.8 Notice of commencement of proceeding and scoping document, or of approval to use traditional... required under § 5.5, filing of the pre-application document pursuant to § 5.6, and filing of any request...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kunkle, Paige Elizabeth; Zhang, Ning
Nuclear Criticality Safety (NCS) has reviewed the fissionable material small sample preparation and NDA operations in Wing 7 Basement of the CMR Facility. This is a Level-1 evaluation conducted in accordance with NCS-AP-004 [Reference 1], formerly NCS-GUIDE-01, and the guidance set forth on use of the Standard Criticality Safety Requirements (SCSRs) [Reference 2]. As stated in Reference 2, the criticality safety evaluation consists of both the SCSR CSED and the SCSR Application CSED. The SCSR CSED is a Level-3 CSED [Reference 3]. This Level-1 CSED is the SCSR Application CSED. This SCSR Application (Level-1) evaluation does not derive controls; it simply applies controls derived from the SCSR CSED (Level-3) for the application of operations conducted here. The controls derived in the SCSR CSED (Level-3) were evaluated via the process described in Section 6.6.5 of SD-130 (also reproduced in Section 4.3.5 of NCS-AP-004 [Reference 1]) and were determined to not meet the requirements for consideration of elevation into the safety basis documentation for CMR. According to the guidance set forth on use of the SCSRs [Reference 2], the SCSR CSED (Level-3) is also applicable to the CMR Facility because the process and the normal and credible abnormal conditions in question are bounded by those that are described in the SCSR CSED. The controls derived in the SCSR CSED include allowances for solid materials and solution operations. Based on the operations conducted at this location, there are less-than-accountable (LTA) amounts of 233U. Based on the evaluation documented herein, the normal and credible abnormal conditions that might arise during the execution of this process will remain subcritical with the following recommended controls.
Methods, media, and systems for detecting attack on a digital processing device
Stolfo, Salvatore J.; Li, Wei-Jen; Keromytis, Angelos D.; Androulaki, Elli
2014-07-22
Methods, media, and systems for detecting attack are provided. In some embodiments, the methods include: comparing at least part of a document to a static detection model; determining whether attacking code is included in the document based on the comparison of the document to the static detection model; executing at least part of the document; determining whether attacking code is included in the document based on the execution of the at least part of the document; and if attacking code is determined to be included in the document based on at least one of the comparison of the document to the static detection model and the execution of the at least part of the document, reporting the presence of an attack. In some embodiments, the methods include: selecting a data segment in at least one portion of an electronic document; determining whether the arbitrarily selected data segment can be altered without causing the electronic document to result in an error when processed by a corresponding program; in response to determining that the arbitrarily selected data segment can be altered, arbitrarily altering the data segment in the at least one portion of the electronic document to produce an altered electronic document; and determining whether the corresponding program produces an error state when the altered electronic document is processed by the corresponding program.
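The second technique described above (arbitrarily altering a selected data segment and checking whether the corresponding program produces an error) can be sketched generically as follows; using JSON parsing as the "corresponding program" is purely an assumption for illustration, not the patented embodiment.

# Generic alter-and-test sketch: flip bytes in a selected segment of a document
# and check whether the consuming program raises an error when processing it.
# JSON parsing stands in for the "corresponding program"; illustrative only.
import json

def can_alter_segment(doc_bytes, start, length):
    altered = bytearray(doc_bytes)
    for i in range(start, min(start + length, len(altered))):
        altered[i] ^= 0xFF  # arbitrary alteration of the selected segment
    try:
        json.loads(bytes(altered).decode("utf-8", errors="strict"))
        return True    # processed without error: segment tolerates alteration
    except (ValueError, UnicodeDecodeError):
        return False   # error state: segment is load-bearing for the format

doc = b'{"title": "report", "body": "quarterly figures"}'
print(can_alter_segment(doc, start=12, length=6))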
Methods, media, and systems for detecting attack on a digital processing device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stolfo, Salvatore J.; Li, Wei-Jen; Keromytis, Angelos D.
Methods, media, and systems for detecting attack are provided. In some embodiments, the methods include: comparing at least part of a document to a static detection model; determining whether attacking code is included in the document based on the comparison of the document to the static detection model; executing at least part of the document; determining whether attacking code is included in the document based on the execution of the at least part of the document; and if attacking code is determined to be included in the document based on at least one of the comparison of the document to the static detection model and the execution of the at least part of the document, reporting the presence of an attack. In some embodiments, the methods include: selecting a data segment in at least one portion of an electronic document; determining whether the arbitrarily selected data segment can be altered without causing the electronic document to result in an error when processed by a corresponding program; in response to determining that the arbitrarily selected data segment can be altered, arbitrarily altering the data segment in the at least one portion of the electronic document to produce an altered electronic document; and determining whether the corresponding program produces an error state when the altered electronic document is processed by the corresponding program.
Semi-Automated Methods for Refining a Domain-Specific Terminology Base
2011-02-01
...not only as a resource for written and oral translation, but also for Natural Language Processing (NLP) applications, text retrieval, document indexing, and other knowledge management tasks. The objective of this...
Ensuring Cross-Cultural Equivalence in Translation of Research Consents and Clinical Documents
Lee, Cheng-Chih; Li, Denise; Arai, Shoshana; Puntillo, Kathleen
2010-01-01
The aim of this article is to describe a formal process used to translate research study materials from English into traditional Chinese characters. This process may be useful for translating documents for use by both research participants and clinical patients. A modified Brislin model was used as the systematic translation process. Four bilingual translators were involved, and a Flaherty 3-point scale was used to evaluate the translated documents. The linguistic discrepancies that arise in the process of ensuring cross-cultural congruency or equivalency between the two languages are presented to promote the development of patient-accessible cross-cultural documents. PMID:18948451
Skyttberg, Niclas; Vicente, Joana; Chen, Rong; Blomqvist, Hans; Koch, Sabine
2016-06-04
Vital sign data are important for clinical decision making in emergency care. Clinical Decision Support Systems (CDSS) have been advocated to increase patient safety and quality of care. However, the efficiency of CDSS depends on the quality of the underlying vital sign data. Therefore, possible factors affecting vital sign data quality need to be understood. This study aims to explore the factors affecting vital sign data quality in Swedish emergency departments and to determine to what extent clinicians perceive vital sign data to be fit for use in clinical decision support systems. A further aim of the study is to provide recommendations on how to improve vital sign data quality in emergency departments. Semi-structured interviews were conducted with sixteen physicians and nurses from nine hospitals, and vital sign documentation templates were collected and analysed. Follow-up interviews and process observations were done at three of the hospitals to verify the results. Content analysis with constant comparison of the data was used to analyse and categorize the collected data. Factors related to care process and information technology were perceived to affect vital sign data quality. Despite electronic health records (EHRs) being available in all hospitals, these were not always used for vital sign documentation. Only four out of nine sites had a completely digitalized vital sign documentation flow, and paper-based triage records were perceived to provide better mobile workflow support than EHRs. Observed documentation practices resulted in low currency, completeness, and interoperability of the vital signs. To improve vital sign data quality, we propose to standardize the care process, improve the digital documentation support, provide workflow support, ensure interoperability and perform quality control. Vital sign data quality in Swedish emergency departments is currently not fit for use by CDSS. To address both technical and organisational challenges, we propose five steps for vital sign data quality improvement to be implemented in emergency care settings.
NASA Technical Reports Server (NTRS)
Patton, Jeff A.
1986-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C)/Electrical Power Generation (EPG) hardware. The EPD and C/EPG hardware is required for performing critical functions of cryogenic reactant storage, electrical power generation and product water distribution in the Orbiter. Specifically, the EPD and C/EPG hardware consists of the following components: Power Section Assembly (PSA); Reactant Control Subsystem (RCS); Thermal Control Subsystem (TCS); Water Removal Subsystem (WRS); and Power Reactant Storage and Distribution System (PRSDS). The IOA analysis process utilized available EPD and C/EPG hardware drawings and schematics for defining hardware assemblies, components, and hardware items. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode.
MODIS information, data and control system (MIDACS) operations concepts
NASA Technical Reports Server (NTRS)
Han, D.; Salomonson, V.; Ormsby, J.; Ardanuy, P.; Mckay, A.; Hoyt, D.; Jaffin, S.; Vallette, B.; Sharts, B.; Folta, D.
1988-01-01
The MODIS Information, Data, and Control System (MIDACS) Operations Concepts Document provides a basis for the mutual understanding between the users and the designers of the MIDACS, including the requirements, operating environment, external interfaces, and development plan. In defining the concepts and scope of the system, how the MIDACS will operate as an element of the Earth Observing System (EOS) within the EosDIS environment is described. This version follows an earlier release of a preliminary draft version. The individual operations concepts for planning and scheduling, control and monitoring, data acquisition and processing, calibration and validation, data archive and distribution, and user access do not yet fully represent the requirements of the data system needed to achieve the scientific objectives of the MODIS instruments and science teams. The teams are not yet formed; however, it is possible to develop the operations concepts based on the present concept of EosDIS, the level 1 and level 2 Functional Requirements Documents, and through interviews and meetings with key members of the scientific community. The operations concepts were exercised through the application of representative scenarios.
NASA Astrophysics Data System (ADS)
A'diat, Arkan Addien Al; Liquiddanu, Eko; Laksono, Pringgo Widyo; Sutopo, Wahyudi; Suletra, I. Wayan
2018-02-01
Along with the increasing number of modern retail businesses in Indonesia, small and medium enterprises (SMEs) have an opportunity to sell their products through modern retailers. SMEs face several obstacles, one of which concerns product standards. The standards that SMEs must hold are the GMP standard and the halal standard. This research was conducted to determine how well the beef floss enterprise in Jagalan fulfills the GMP and halal standards. In addition, the Hazard Analysis and Critical Control Points (HACCP) system was applied to analyze the process. The HACCP approach used in this research was based on the seven principles in SNI (Indonesian National Standard) 01-4852-1998. The seven principles include hazard analysis, critical control point (CCP) determination, critical limit establishment, CCP monitoring system establishment, corrective action establishment, verification, and documentation establishment, all of which must be applied in preparing an HACCP plan. Based on this case study, it is concluded that there were 5 CCPs: the boiling process, the roasting process, the frying process, the beef floss draining process, and the packaging process.
Quality assurance and quality control in mammography: a review of available guidance worldwide.
Reis, Cláudia; Pascoal, Ana; Sakellaris, Taxiarchis; Koutalonis, Manthos
2013-10-01
Review available guidance for quality assurance (QA) in mammography and discuss its contribution to harmonise practices worldwide. Literature search was performed on different sources to identify guidance documents for QA in mammography available worldwide in international bodies, healthcare providers, professional/scientific associations. The guidance documents identified were reviewed and a selection was compared for type of guidance (clinical/technical), technology and proposed QA methodologies focusing on dose and image quality (IQ) performance assessment. Fourteen protocols (targeted at conventional and digital mammography) were reviewed. All included recommendations for testing acquisition, processing and display systems associated with mammographic equipment. All guidance reviewed highlighted the importance of dose assessment and testing the Automatic Exposure Control (AEC) system. Recommended tests for assessment of IQ showed variations in the proposed methodologies. Recommended testing focused on assessment of low-contrast detection, spatial resolution and noise. QC of image display is recommended following the American Association of Physicists in Medicine guidelines. The existing QA guidance for mammography is derived from key documents (American College of Radiology and European Union guidelines) and proposes similar tests despite the variations in detail and methodologies. Studies reported on QA data should provide detail on experimental technique to allow robust data comparison. Countries aiming to implement a mammography/QA program may select/prioritise the tests depending on available technology and resources. • An effective QA program should be practical to implement in a clinical setting. • QA should address the various stages of the imaging chain: acquisition, processing and display. • AEC system QC testing is simple to implement and provides information on equipment performance.
Results-driven approach to improving quality and productivity
John Dramm
2000-01-01
Quality control (QC) programs do not often realize their full potential. Elaborate and expensive QC programs can easily get sidetracked by the process of building a program with promises of "Someday, this will all pay off." Training employees in QC methods is no guarantee that quality will improve. Several documented cases show that such activity-centered efforts...
Voice/Natural Language Interfacing for Robotic Control.
1987-11-01
...until major computing power can be profitably allocated to the speech recognition process, off-the-shelf units will never have sufficient intelligence to... coordinate transformation for a location, and opening or closing the gripper's toggles. External to world operations, each joint may be rotated...
ERIC Educational Resources Information Center
Aspira, Inc., New York, NY.
School desegregation did not lead to greater understanding of the Hispanic community by white educational personnel in two school districts analyzed to document the desegregation process and the impact of school desegregation on the Hispanic community. Each district was in a white-controlled, tri-ethnic community in its second year of successful…
NASA Technical Reports Server (NTRS)
1972-01-01
The shuttle GN&C software functions for horizontal flight operations are defined. Software functional requirements are grouped into two categories: first horizontal flight requirements and full mission horizontal flight requirements. The document provides the initial step in the shuttle GN&C software design process. It also serves as a management tool to identify analyses which are required to define requirements.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... Submissions (E-Filing) All documents filed in NRC adjudicatory proceedings, including a request for hearing, a... entities participating under 10 CFR 2.315(c), must be filed in accordance with the NRC E-Filing rule (72 FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all adjudicatory...
Best Practices and Controls for Mitigating Insider Threats
2013-08-08
...for plagiarism in academic papers, the process is virtually identical... How do we test document similarity? Cosine similarity algorithms (in layman's terms, plagiarism detection), even though we're not checking...
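The cosine-similarity approach mentioned in this excerpt can be sketched in a few lines; the raw term counting below is a simplification of whatever feature extraction the cited work actually used.

# Minimal cosine similarity between two documents using raw term counts.
# A real plagiarism/DLP pipeline would use richer features; this is a sketch.
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

print(cosine_similarity("insider threat best practices and controls",
                        "best practices for mitigating the insider threat"))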
AeroValve Experimental Test Data Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noakes, Mark W.
This report documents the collection of experimental test data and presents performance characteristics for the AeroValve brand prototype pneumatic bidirectional solenoid valves tested at the Oak Ridge National Laboratory (ORNL) in July/August 2014 as part of a validation of AeroValve energy efficiency claims. The test stand and control programs were provided by AeroValve. All raw data and processing are included in the report attachments.
Ammenwerth, Elske; Mansmann, Ulrich; Iller, Carola; Eichstädter, Ronald
2003-01-01
The documentation of the nursing process is an important but often neglected part of clinical documentation. Paper-based systems have been introduced to support nursing process documentation. Frequently, however, problems such as low quality of documentation are reported. It is unclear whether computer-based documentation systems can reduce these problems and which factors influence their acceptance by users. We introduced a computer-based nursing documentation system on four wards of the University Hospitals of Heidelberg and systematically evaluated its preconditions and its effects in a pretest-posttest intervention study. For the analysis of user acceptance, we concentrated on subjective data drawn from questionnaires and interviews. A questionnaire was developed using items from published questionnaires and items that had to be developed for the special purpose of this study. The quantitative results point to two factors influencing the acceptance of a new computer-based documentation system: the previous acceptance of the nursing process and the previous amount of self-confidence when using computers. On one ward, the various acceptance scores declined sharply after the introduction of the nursing documentation system. Explorative qualitative analysis on this ward points to further success factors of computer-based nursing documentation systems. Our results can be used to assist the planning and introduction of computer-based nursing documentation systems. They demonstrate the importance of computer experience and acceptance of the nursing process on a ward but also point to other factors such as the fit between nursing workflow and the functionality of a nursing documentation system.
Applying language technology to nursing documents: pros and cons with a focus on ethics.
Suominen, Hanna; Lehtikunnas, Tuija; Back, Barbro; Karsten, Helena; Salakoski, Tapio; Salanterä, Sanna
2007-10-01
The present study discusses ethics in building and using applications based on natural language processing in electronic nursing documentation. Specifically, we first focus on the question of how patient confidentiality can be ensured in developing language technology for the nursing documentation domain. Then, we identify and theoretically analyze the ethical outcomes which arise when using natural language processing to support clinical judgement and decision-making. In total, we put forward and justify 10 claims related to ethics in applying language technology to nursing documents. A review of recent scientific articles related to ethics in electronic patient records or in the utilization of large databases was conducted. Then, the results were compared with ethical guidelines for nurses and the Finnish legislation covering health care and processing of personal data. Finally, the practical experiences of the authors in applying the methods of natural language processing to nursing documents were appended. Patient records supplemented with natural language processing capabilities may help nurses give better, more efficient and more individualized care for their patients. In addition, language technology may improve patients' ability to receive truthful information about their health and improve the nature of narratives. Because of these benefits, research about the use of language technology in narratives should be encouraged. In contrast, privacy-sensitive health care documentation brings specific ethical concerns and difficulties to the natural language processing of nursing documents. Therefore, when developing natural language processing tools, patient confidentiality must be ensured. While using the tools, health care personnel should always be responsible for the clinical judgement and decision-making. One should also consider that the use of language technology in nursing narratives may threaten patients' rights if documentation collected for other purposes is reused. Applying language technology to nursing documents may, on the one hand, contribute to the quality of care, but, on the other hand, threaten patient confidentiality. As an overall conclusion, natural language processing of nursing documents holds the promise of great benefits if the potential risks are taken into consideration.
Webster, Joan; Bucknall, Tracey; Wallis, Marianne; McInnes, Elizabeth; Roberts, Shelley; Chaboyer, Wendy
2017-06-01
Participation in a clinical trial is believed to benefit patients, but little is known about the post-trial effects on routine hospital-based care. To describe (1) hospital-based pressure ulcer care processes after patients were discharged from a pressure ulcer prevention, cluster randomised controlled trial; and (2) to investigate whether the trial intervention had any impact on subsequent hospital-based care. We conducted a retrospective analysis of 133 trial participants who developed a pressure ulcer during the clinical trial. We compared outcomes and care processes between participants who received the pressure ulcer prevention intervention and those in the usual care, control group. We also compared care processes according to the pressure ulcer stage. A repositioning schedule was reported for 19 (14.3%) patients; 33 (24.8%) had a dressing applied to the pressure ulcer; 17 (12.8%) patients were assessed by a wound care team; and 20 (15.0%) were seen by an occupational therapist. Patients in the trial's intervention group were more likely to have the presence of a pressure ulcer documented in their chart (odds ratio (OR) 8.18, 95% confidence intervals (CI) 3.64-18.36); to be referred to an occupational therapist OR 0.92 (95% CI 0.07; 0.54); to receive a pressure relieving device OR 0.31 (95% CI 0.14; 0.69); or a pressure relieving mattress OR 0.44 (95% CI 0.20; 0.96). Participants with Stage 2 or unstageable ulcers were more likely than others to have dressings applied to their wounds (p<0.001) and to be referred to an occupational therapist for protective devices (p=0.022). Participants in the intervention group of a clinical trial were more likely to receive additional post-trial care and improved documentation compared with those in the control group, but documentation of pressure ulcer status and care is poor. Copyright © 2017 Elsevier Ltd. All rights reserved.
Development of Algorithms for Control of Humidity in Plant Growth Chambers
NASA Technical Reports Server (NTRS)
Costello, Thomas A.
2003-01-01
Algorithms were developed to control humidity in plant growth chambers used for research on bioregenerative life support at Kennedy Space Center. The algorithms used the computed water vapor pressure (based on measured air temperature and relative humidity) as the process variable, with time-proportioned outputs to operate the humidifier and de-humidifier. Algorithms were based upon proportional-integral-differential (PID) and Fuzzy Logic schemes and were implemented using I/O Control software (OPTO-22) to define and download the control logic to an autonomous programmable logic controller (PLC, ultimate ethernet brain and assorted input-output modules, OPTO-22), which performed the monitoring and control logic processing, as well as the physical control of the devices that effected the targeted environment in the chamber. During limited testing, the PLCs successfully implemented the intended control schemes and attained a control resolution for humidity of less than 1%. The algorithms have potential to be used not only with autonomous PLCs but could also be implemented within network-based supervisory control programs. This report documents unique control features that were implemented within the OPTO-22 framework and makes recommendations regarding future uses of the hardware and software for biological research by NASA.
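A minimal sketch of the PID scheme described above, using computed vapor pressure as the process variable with a time-proportioned output, is shown below; the gains, setpoint, and the Magnus-type saturation formula are assumptions, not the values or formulation used at Kennedy Space Center.

# Hedged PID sketch for humidity control on computed vapor pressure.
# Gains and setpoint are illustrative; the Magnus approximation constants are
# standard textbook values, not the chamber's calibrated coefficients.
import math

def vapor_pressure_kpa(temp_c, rh_percent):
    saturation = 0.6112 * math.exp(17.62 * temp_c / (243.12 + temp_c))  # Magnus approximation
    return saturation * rh_percent / 100.0

class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd, self.setpoint = kp, ki, kd, setpoint
        self.integral, self.prev_error = 0.0, 0.0

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=50.0, ki=5.0, kd=1.0, setpoint=vapor_pressure_kpa(23.0, 70.0))
output = pid.update(vapor_pressure_kpa(23.0, 62.0), dt=10.0)
# Positive output -> drive the humidifier for a proportional slice of the cycle;
# negative output -> drive the de-humidifier instead.
duty = max(0.0, min(1.0, abs(output)))
print(f"control output={output:.3f}, time-proportioned duty={duty:.2f}")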
U-10Mo Baseline Fuel Fabrication Process Description
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hubbard, Lance R.; Arendt, Christina L.; Dye, Daniel F.
This document provides a description of the U.S. High Power Research Reactor (USHPRR) low-enriched uranium (LEU) fuel fabrication process. This document is intended to be used in conjunction with the baseline process flow diagram (PFD) presented in Appendix A. The baseline PFD is used to document the fabrication process, communicate gaps in technology or manufacturing capabilities, convey alternatives under consideration, and as the basis for a dynamic simulation model of the fabrication process. The simulation model allows for the assessment of production rates, costs, and manufacturing requirements (manpower, fabrication space, numbers and types of equipment, etc.) throughout the lifecycle of the USHPRR program. This document, along with the accompanying PFD, is updated regularly.
Use of software engineering techniques in the design of the ALEPH data acquisition system
NASA Astrophysics Data System (ADS)
Charity, T.; McClatchey, R.; Harvey, J.
1987-08-01
The SASD methodology is being used to provide a rigorous design framework for various components of the ALEPH data acquisition system. The Entity-Relationship data model is used to describe the layout and configuration of the control and acquisition systems and detector components. State Transition Diagrams are used to specify control applications such as run control and resource management, and Data Flow Diagrams assist in decomposing software tasks and defining interfaces between processes. These techniques encourage rigorous software design leading to enhanced functionality and reliability. Improved documentation and communication ensure continuity over the system life-cycle and simplify project management.
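The state-transition style used for run control can be illustrated with a small table-driven state machine; the states and commands below are hypothetical placeholders, not the actual ALEPH run-control design.

# Table-driven run-control sketch in the spirit of a State Transition Diagram.
# States and commands are hypothetical placeholders.
TRANSITIONS = {
    ("idle", "configure"): "configured",
    ("configured", "start"): "running",
    ("running", "pause"): "paused",
    ("paused", "resume"): "running",
    ("running", "stop"): "idle",
}

def step(state, command):
    next_state = TRANSITIONS.get((state, command))
    if next_state is None:
        raise ValueError(f"command '{command}' not allowed in state '{state}'")
    return next_state

state = "idle"
for cmd in ["configure", "start", "pause", "resume", "stop"]:
    state = step(state, cmd)
    print(cmd, "->", state)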
[Design of an HACCP program for a cocoa processing facility].
López D'Sola, Patrizia; Sandia, María Gabriela; Bou Rached, Lizet; Hernández Serrano, Pilar
2012-12-01
The HACCP plan is a food safety management tool used to control physical, chemical and biological hazards associated with food processing throughout the processing chain. The aim of this work is to design an HACCP plan for a Venezuelan cocoa processing facility. The production of safe food products requires that the HACCP system be built upon a solid foundation of prerequisite programs such as Good Manufacturing Practices (GMP) and Sanitation Standard Operating Procedures (SSOP). The existence and effectiveness of these prerequisite programs were previously assessed. Good Agriculture Practices (GAP) audits of cocoa nib suppliers were performed. To develop the HACCP plan, the five preliminary tasks and the seven HACCP principles were accomplished according to Codex Alimentarius procedures. Three Critical Control Points (CCP) were identified using a decision tree: winnowing (control of ochratoxin A), roasting (Salmonella control) and metallic particle detection. For each CCP, critical limits, monitoring procedures, corrective actions, and verification procedures were established, and documentation concerning all procedures and records appropriate to these principles and their application was developed. Implementation and maintenance of an HACCP plan for this processing plant is suggested. Recently, ochratoxin A (OTA) has been associated with cocoa beans. Although separation of the shell from the nib has been reported as an effective measure to control this chemical hazard, a study of ochratoxin prevalence in cocoa beans produced in the country is recommended, as well as validation of the winnowing step.
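The Codex-style decision tree used to identify CCPs can be sketched as a sequence of yes/no questions; the wording below is an abbreviation of the published tree, and the example answers for the roasting step are hypothetical.

# Simplified sketch of the Codex CCP decision-tree logic (abbreviated wording).
# Answers for each step are supplied by the HACCP team; the example is hypothetical.
def is_ccp(control_measure_exists, designed_to_eliminate_hazard,
           contamination_may_exceed_limits, later_step_eliminates_hazard):
    if not control_measure_exists:
        return False            # hazard must be managed elsewhere (or the step modified)
    if designed_to_eliminate_hazard:
        return True             # Q2: step itself removes/reduces the hazard -> CCP
    if not contamination_may_exceed_limits:
        return False            # Q3: hazard unlikely to reach unacceptable levels
    return not later_step_eliminates_hazard  # Q4: no later safeguard -> CCP

# Roasting as a Salmonella control step (hypothetical answers):
print(is_ccp(True, True, True, False))   # True -> critical control point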
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul L. Wichlacz; Robert C. Starr; Brennon Orr
2003-09-01
This document summarizes previous descriptions of geochemical system conceptual models for the vadose zone and groundwater zone (aquifer) beneath the Idaho National Engineering and Environmental Laboratory (INEEL). The primary focus is on groundwater because contaminants derived from wastes disposed at INEEL are present in groundwater, groundwater provides a pathway for potential migration to receptors, and because geochemical characteristics and processes in the aquifer can substantially affect the movement, attenuation, and toxicity of contaminants. The secondary emphasis is perched water bodies in the vadose zone. Perched water eventually reaches the regional groundwater system, and thus processes that affect contaminants in the perched water bodies are important relative to the migration of contaminants into groundwater. Similarly, processes that affect solutes during transport from near-surface disposal facilities downward through the vadose zone to the aquifer are relevant. Sediments in the vadose zone can affect both water and solute transport by restricting the downward migration of water sufficiently that a perched water body forms, and by retarding solute migration via ion exchange. Geochemical conceptual models have been prepared by a variety of researchers for different purposes. They have been published in documents prepared by INEEL contractors, the United States Geological Survey (USGS), academic researchers, and others. The documents themselves are INEEL and USGS reports, and articles in technical journals. The documents reviewed were selected from citation lists generated by searching the INEEL Technical Library, the INEEL Environmental Restoration Optical Imaging System, and the ISI Web of Science databases. The citation lists were generated using the keywords ground water, groundwater, chemistry, geochemistry, contaminant, INEL, INEEL, and Idaho. In addition, a list of USGS documents that pertain to the INEEL was obtained and manually searched. The documents that appeared to be the most pertinent were selected for further review. These documents are tabulated in the citation list. This report summarizes existing geochemical conceptual models, but does not attempt to generate a new conceptual model or select the "right" model. This document is organized as follows. Geochemical models are described in general in Section 2. Geochemical processes that control the transport and fate of contaminants introduced into groundwater are described in Section 3. The natural geochemistry of the Eastern Snake River Plain Aquifer (SRPA) is described in Section 4. The effect of waste disposal on the INEEL subsurface is described in Section 5. The geochemical behavior of the major contaminants is described in Section 6. Section 7 describes the site-specific geochemical models developed for various INEEL facilities.
The Electronic Documentation Project in the NASA mission control center environment
NASA Technical Reports Server (NTRS)
Wang, Lui; Leigh, Albert
1994-01-01
NASA's space programs, like many other technical programs of this magnitude, are supported by a large volume of technical documents. These documents are not only diverse but also abundant. Management, maintenance, and retrieval of these documents is a challenging problem by itself; but relating and cross-referencing this wealth of information when it is all on a medium of paper is an even greater challenge. The Electronic Documentation Project (EDP) is to provide an electronic system capable of developing, distributing and controlling changes for crew/ground controller procedures and related documents. There are two primary motives for the solution. The first motive is to reduce the cost of maintaining the current paper-based method of operations by replacing paper documents with electronic information storage and retrieval. The other is to improve the efficiency and provide enhanced flexibility in document usage. Initially, the current paper-based system will be faithfully reproduced in an electronic format to be used in the document viewing system. In addition, this metaphor will have hypertext extensions. Hypertext features support basic functions such as full text searches, key word searches, data retrieval, and traversal between nodes of information, as well as speeding up the data access rate. They enable related but separate documents to have relationships, and allow the user to explore information naturally through non-linear link traversals. The basic operational requirements of the document viewing system are to: provide an electronic corollary to the current method of paper-based document usage; supplement and ultimately replace paper-based documents; remain focused on control center operations such as Flight Data File, Flight Rules and Console Handbook viewing; and be available NASA wide.
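The hypertext features described (keyword search plus traversal between related documents) can be sketched with a tiny in-memory document graph; the document names, text, and links below are hypothetical and not the EDP implementation.

# Tiny hypothetical document graph with keyword search and link traversal,
# illustrating the hypertext features described (not the EDP system itself).
DOCS = {
    "flight_rules": {"text": "constraints on crew procedures", "links": ["console_handbook"]},
    "console_handbook": {"text": "ground controller console procedures", "links": ["flight_data_file"]},
    "flight_data_file": {"text": "crew procedures and checklists", "links": []},
}

def keyword_search(term):
    return [name for name, doc in DOCS.items() if term in doc["text"]]

def traverse(start, depth=2):
    seen, frontier = [start], [start]
    for _ in range(depth):
        frontier = [link for name in frontier for link in DOCS[name]["links"] if link not in seen]
        seen.extend(frontier)
    return seen

print(keyword_search("procedures"))
print(traverse("flight_rules"))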
Secure Control Systems for the Energy Sector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Rhett; Campbell, Jack; Hadley, Mark
2012-03-31
Schweitzer Engineering Laboratories (SEL) will conduct the Hallmark Project to address the need to reduce the risk of energy disruptions because of cyber incidents on control systems. The goal is to develop solutions that can be both applied to existing control systems and designed into new control systems to add the security measures needed to mitigate energy network vulnerabilities. The scope of the Hallmark Project contains four primary elements: 1. Technology transfer of the Secure Supervisory Control and Data Acquisition (SCADA) Communications Protocol (SSCP) from Pacific Northwest National Laboratories (PNNL) to Schweitzer Engineering Laboratories (SEL). The project shall use this technology to develop a Federal Information Processing Standard (FIPS) 140-2 compliant original equipment manufacturer (OEM) module to be called a Cryptographic Daughter Card (CDC) with the ability to directly connect to any PC, enabling that computer to securely communicate across serial to field devices. Validate the OEM capabilities with another vendor. 2. Development of a Link Authenticator Module (LAM) using the FIPS 140-2 validated Secure SCADA Communications Protocol (SSCP) CDC module with a central management software kit. 3. Validation of the CDC and Link Authenticator modules via laboratory and field tests. 4. Creation of documents that record the impact of the Link Authenticator to the operators of control systems and on the control system itself. The information in the documents can assist others with technology deployment and maintenance.
IDC Re-Engineering Phase 2 System Specification Document Version 1.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Satpathi, Meara Allena; Burns, John F.; Harris, James M.
This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Re-Engineering Phase 2 project. This System Specification Document (SSD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data but does include requirements for the dissemination of radionuclide data and products.
33 CFR Appendix A to Part 230 - Processing Corps NEPA Documents
Code of Federal Regulations, 2010 CFR
2010-07-01
... Corps NEPA Documents NEPA documents for Civil Works activities other than permits will be processed in... Preconstruction Engineering, and Design, Construction, and Completed Projects in an Operations and Maintenance... reconnaissance phase, the district commander should undertake environmental studies along with engineering...
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paula S.; Crump, John W.; Ackley, Keith A.
1992-01-01
In the second volume of the Demonstration Framework Document, the graphical representation of the demonstration framework is given. This second document was created to facilitate the reading and comprehension of the demonstration framework. It is designed to be viewed in parallel with Section 4.2 of the first volume to help give a picture of the relationships between the UOBs (Units of Behavior) of the model. The model is quite large and the design team felt that this form of presentation would make it easier for the reader to get a feel for the processes described in this document. The IDEF3 (Process Description Capture Method) diagrams of the processes of an Information System Development are presented. Volume 1 describes the processes and the agents involved with each process, while this volume graphically shows the precedence relationships among the processes.
Risk Quantification of Systems Engineering Documents Improves Probability of DOD Project Success
2009-09-01
...comprehensive risk model for DoD milestone review documentation as well as recommended changes to the Capability Maturity Model Integration (CMMI) Project Planning and Risk Management process areas. The intent is to... Keywords: Milestone Documentation, Project Planning, Rational Frame, Political Frame, CMMI Project Planning Process Area, CMMI Risk Management Process Area.
Registration and Marking Requirements for UAS. Unmanned Aircraft System (UAS) Registration
NASA Technical Reports Server (NTRS)
2005-01-01
The registration of an aircraft is a prerequisite for issuance of a U.S. certificate of airworthiness by the FAA. The procedures and requirements for aircraft registration, and the subsequent issuance of registration numbers, are contained in FAR Part 47. However, the process/method(s) for applying the requirements of Parts 45 & 47 to Unmanned Aircraft Systems (UAS) has not been defined. This task resolved the application of 14 CFR Parts 45 and 47 to UAS. Key Findings: UAS are aircraft systems and as such the recommended approach to registration is to follow the same process for registration as manned aircraft. This will require manufacturers to comply with the requirements for 14 CFR 47, Aircraft Registration and 14 CFR 45, Identification and Registration Marking. In addition, only the UA should be identified with the N number registration markings. There should also be a documentation link showing the applicability of the control station and communication link to the UA. The documentation link can be in the form of a Type Certificate Data Sheet (TCDS) entry or a UAS logbook entry. The recommended process for the registration of UAS is similar to the manned aircraft process and is outlined in a 6-step process in the paper.
Information system life-cycle and documentation standards, volume 1
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
The Software Management and Assurance Program (SMAP) Information System Life-Cycle and Documentation Standards Document describes the Version 4 standard information system life-cycle in terms of processes, products, and reviews. The description of the products includes detailed documentation standards. The standards in this document set can be applied to the life-cycle, i.e., to each phase in the system's development, and to the documentation of all NASA information systems. This provides consistency across the agency as well as visibility into the completeness of the information recorded. An information system is software-intensive, but consists of any combination of software, hardware, and operational procedures required to process, store, or transmit data. This document defines a standard life-cycle model and content for associated documentation.
Uneke, Chigozie Jesse; Sombie, Issiaka; Keita, Namoudou; Lokossou, Virgil; Johnson, Ermel; Ongolo-Zogo, Pierre; Uro-Chukwu, Henry Chukwuemeka
2017-01-01
Background: There is increasing recognition worldwide that the health policymaking process should be informed by the best available evidence. The purpose of this study was to review the policy documents on maternal, newborn and child health (MNCH) in Nigeria to assess the extent to which an evidence-informed policymaking mechanism was employed in the policy formulation process. Methods: A comprehensive literature search of websites of the Federal Ministry of Health (FMOH) Nigeria and other related ministries and agencies for relevant health policy documents related to MNCH from year 2000 to 2015 was undertaken. The following terms were used interchangeably for the literature search: maternal, child, newborn, health, policy, strategy, framework, guidelines, Nigeria. Results: Of the 108 policy documents found, 19 (17.6%) fulfilled the study inclusion criteria. The policy documents focused on the major aspects of maternal health improvement in Nigeria such as reproductive health, anti-malaria treatment, development of adolescent and young people's health, the midwives service scheme, prevention of mother-to-child transmission of HIV, and family planning. All the policy documents indicated that a consultative process of collecting input from multiple stakeholders was employed, but no rigorous scientific process of assessing, adapting, synthesizing and applying scientific evidence was reported in the policy development process. Conclusion: It is recommended that future health policy development on MNCH should follow an evidence-informed policymaking process and clearly document the process of incorporating evidence in policy development. PMID:29085794
Civil society and the negotiation of the Framework Convention on Tobacco Control
MAMUDU, H. M.
2008-01-01
Tobacco control civil society organisations mobilised to influence countries during the negotiation of the World Health Organization (WHO) Framework Convention on Tobacco Control (FCTC) between 1999 and 2003. Tobacco control civil society organisations and coalitions around the world embraced the idea of an international tobacco control treaty and came together as the Framework Convention Alliance (FCA), becoming an important non-state actor within the international system of tobacco control. Archival documents and interviews demonstrate that the FCA successfully used strategies including publication of a newsletter, shaming, symbolism and media advocacy to influence policy positions of countries during the FCTC negotiation. The FCA became influential in the negotiation process by mobilising tobacco control civil society organisations and resources with the help of the Internet and framing the tobacco control discussion around global public health. PMID:19333806
[Study on the change of semantic perspective of schistosomiasis control in China].
Zhou, Li-ying; Liu, Si-yuan; Li, Yu-ye; Deng, Yao; Yang, Kun
2015-12-01
To analyze the evolution process, discourse and semantic meaning of schistosomiasis prevention and control, so as to provide suggestions for control work. The official documents and mainstream media reports on schistosomiasis prevention and control at different periods were selected as discourse samples, and the deep social reasons behind the strategy changes and the semantic meaning of the discourse at each period were analyzed. The discourse of schistosomiasis prevention and control evolved from political discourse through pluralistic discourse to public discourse, and its semantic connotations initially showed features of authority conflict and later shifted toward semantic cooperation. The prevention and control of schistosomiasis carry different semantic meanings at different periods; control work should therefore be grounded in social practice, seek truth from facts, correctly understand the actual situation, and then establish an effective control policy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynn Kidman
This document constitutes an addendum to the August 2001 Corrective Action Decision Document / Closure Report for Corrective Action Unit 321: Area 22 Weather Station Fuel Storage as described in the document Recommendations and Justifications for Modifications for Use Restrictions Established under the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office Federal Facility Agreement and Consent Order (UR Modification document) dated February 2008. The UR Modification document was approved by NDEP on February 26, 2008. The approval of the UR Modification document constituted approval of each of the recommended UR modifications. In conformance with the UR Modification document, this addendum consists of: • This cover page that refers the reader to the UR Modification document for additional information • The cover and signature pages of the UR Modification document • The NDEP approval letter • The corresponding section of the UR Modification document. This addendum provides the documentation justifying the cancellation of the UR for CAS 22-99-05, Fuel Storage Area. This UR was established as part of a Federal Facility Agreement and Consent Order (FFACO) corrective action and is based on the presence of contaminants at concentrations greater than the action levels established at the time of the initial investigation (FFACO, 1996; as amended August 2006). Since this UR was established, practices and procedures relating to the implementation of risk-based corrective actions (RBCA) have changed. Therefore, this UR was re-evaluated against the current RBCA criteria as defined in the Industrial Sites Project Establishment of Final Action Levels (NNSA/NSO, 2006c). This re-evaluation consisted of comparing the original data (used to define the need for the UR) to risk-based final action levels (FALs) developed using the current Industrial Sites RBCA process. The re-evaluation resulted in a recommendation to remove the UR because contamination is not present at the site above the risk-based FALs. Requirements for inspecting and maintaining this UR will be canceled, and the postings and signage at this site will be removed. Fencing and posting may be present at this site that are unrelated to the FFACO UR, such as for radiological control purposes as required by the NV/YMP Radiological Control Manual (NNSA/NSO, 2004f). This modification will not affect or modify any non-FFACO requirements for fencing, posting, or monitoring at this site.
PACS quality control and automatic problem notifier
NASA Astrophysics Data System (ADS)
Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.
1997-05-01
One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. The nature of system failures ranges from slow deterioration of function, as seen in the loss of monitor luminance, through sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self and between-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, to be used as a cross check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment functions correctly. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established as are expected of other equipment used in the diagnostic process.
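As a rough illustration of the kind of automated checks described above, the Python sketch below polls file-system capacity and archive activity and sends a diagnostic code to a pager when a threshold is crossed. The diagnostic codes, thresholds, and the send_page() gateway are hypothetical stand-ins, not the University of Florida implementation.

```python
import shutil
import time

# Hypothetical diagnostic codes and paging gateway; the site's actual codes
# and notification mechanism are not described in detail in the abstract.
DIAG_DISK_FULL = "D01"
DIAG_STALE_ARCHIVE = "D02"

def send_page(code: str, detail: str) -> None:
    """Stand-in for a pager gateway (assumption, not a real API)."""
    print(f"PAGE -> code={code} detail={detail}")

def check_filesystem(path: str, min_free_fraction: float = 0.10) -> None:
    # Page if free space on the archive volume falls below the threshold.
    usage = shutil.disk_usage(path)
    if usage.free / usage.total < min_free_fraction:
        send_page(DIAG_DISK_FULL, f"{path} below {min_free_fraction:.0%} free")

def check_archive_heartbeat(last_archive_epoch: float, max_age_s: float = 3600) -> None:
    # Page if no study has been archived within the allowed window.
    if time.time() - last_archive_epoch > max_age_s:
        send_page(DIAG_STALE_ARCHIVE, "no studies archived in the last hour")

if __name__ == "__main__":
    check_filesystem("/")
    check_archive_heartbeat(last_archive_epoch=time.time() - 7200)
```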
Steeply dipping heaving bedrock, Colorado: Part 3 - Environmental controls and heaving processes
Noe, D.C.; Higgins, J.D.; Olsen, H.W.
2007-01-01
This paper examines the environmental processes and mechanisms that govern differential heaving in steeply dipping claystone bedrock near Denver, Colorado. Three potential heave mechanisms and causal processes were evaluated: (1) rebound expansion, from reduced overburden stress; (2) expansive gypsum-crystal precipitation, from oxidation of pyrite; and (3) swelling of clay minerals, from increased ground moisture. First, we documented the effect of short-term changes in overburden stress, atmospheric exposure, and ground moisture on bedrock at various field sites and in laboratory samples. Second, we documented differential heaving episodes in outcrops and at construction and developed sites. We found that unloading and exposure of the bedrock in construction-cut areas are essentially one-time processes that result in drying and desiccation of the near-surface bedrock, with no visible heaving response. In contrast, wetting produces a distinct swelling response in the claystone strata, and it may occur repeatedly from natural precipitation or lawn irrigation. We documented 2.5 to 7.5 cm (1 to 3 in.) of differential heaving in 24 hours triggered by sudden infiltration of water at the exposed ground surface in outcrops and at construction sites. From these results, we interpret that rebound and pyrite weathering, both of which figure strongly in the long-term evolution of the geologic framework, do not appear to be major heave mechanisms at these excavation depths. Heaving of the claystone takes two forms: (1) hydration swelling of dipping bentonitic beds or zones, and (2) hydration swelling within bedrock blocks accommodated by lateral, thrust-shear movements along pre-existing bedding and fracture planes.
Bova, G Steven; Eltoum, Isam A; Kiernan, John A; Siegal, Gene P; Frost, Andra R; Best, Carolyn J M; Gillespie, John W; Emmert-Buck, Michael R
2005-01-01
Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of pancreatic malignancy and other biological phenomena. This chapter reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification. High-quality tissue microdissection does not necessarily mean high-quality samples to analyze. The quality of biomaterials obtained for analysis is highly dependent on steps upstream and downstream from tissue microdissection. We provide protocols for each of these steps, and encourage you to improve upon these. It is worth the effort of every laboratory to optimize and document its technique at each stage of the process, and we provide a starting point for those willing to spend the time to optimize. In our view, poor documentation of tissue and cell type of origin and the use of nonoptimized protocols are a source of inefficiency in current life science research. Even incremental improvement in this area will increase productivity significantly.
TU-B-19A-01: Image Registration II: TG132-Quality Assurance for Image Registration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brock, K; Mutic, S
2014-06-15
AAPM Task Group 132 was charged with reviewing the current approaches and solutions for image registration in radiotherapy and with providing recommendations for quality assurance and quality control of these clinical processes. As the results of image registration are always used as the input to another process for planning or delivery, it is important for the user to understand and document the uncertainty associated with the algorithm in general and with the result of a specific registration. The recommendations of this task group, which at the time of abstract submission are being reviewed by the AAPM, include the following components. The user should understand the basic image registration techniques and methods of visualizing image fusion. The disclosure of basic components of the image registration by commercial vendors is critical in this respect. The physicists should perform end-to-end tests of imaging, registration, and planning/treatment systems if image registration is performed on a stand-alone system. A comprehensive commissioning process should be performed and documented by the physicist prior to clinical use of the system. As documentation is important to the safe implementation of this process, a request and report system should be integrated into the clinical workflow. Finally, a patient-specific QA practice should be established for efficient evaluation of image registration results. The implementation of these recommendations will be described and illustrated during this educational session. Learning Objectives: Highlight the importance of understanding the image registration techniques used in the clinic. Describe the end-to-end tests needed for stand-alone registration systems. Illustrate a comprehensive commissioning program using both phantom data and clinical images. Describe a request and report system to ensure communication and documentation. Demonstrate a clinically efficient patient QA practice for efficient evaluation of image registration.
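To make the evaluation step concrete, the sketch below (Python/NumPy) computes a landmark-based target registration error for a rigid registration result and compares it with a tolerance. The 2 mm tolerance and the landmark coordinates are illustrative assumptions for the sketch, not TG-132 values.

```python
import numpy as np

def target_registration_error(moving_pts, fixed_pts, R, t):
    """Mean Euclidean distance between transformed moving landmarks and
    their corresponding fixed landmarks (in the same units, e.g. mm)."""
    moving_pts = np.asarray(moving_pts, dtype=float)
    fixed_pts = np.asarray(fixed_pts, dtype=float)
    mapped = moving_pts @ np.asarray(R, dtype=float).T + np.asarray(t, dtype=float)
    return float(np.mean(np.linalg.norm(mapped - fixed_pts, axis=1)))

# Synthetic example: identity transform applied to slightly perturbed landmarks.
fixed = [[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50]]
moving = [[0.5, 0, 0], [50, 0.5, 0], [0, 50, 0.5], [0, 0, 50.5]]
tre = target_registration_error(moving, fixed, R=np.eye(3), t=[0, 0, 0])
print(f"mean TRE = {tre:.2f} mm", "PASS" if tre <= 2.0 else "REVIEW")
```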
10 CFR 1017.16 - Unclassified Controlled Nuclear Information markings on documents or material.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Unclassified Controlled Nuclear Information markings on...) IDENTIFICATION AND PROTECTION OF UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Review of a Document or Material for Unclassified Controlled Nuclear Information § 1017.16 Unclassified Controlled Nuclear Information markings on...
10 CFR 1017.16 - Unclassified Controlled Nuclear Information markings on documents or material.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 4 2011-01-01 2011-01-01 false Unclassified Controlled Nuclear Information markings on...) IDENTIFICATION AND PROTECTION OF UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Review of a Document or Material for Unclassified Controlled Nuclear Information § 1017.16 Unclassified Controlled Nuclear Information markings on...
10 CFR 1017.16 - Unclassified Controlled Nuclear Information markings on documents or material.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 4 2012-01-01 2012-01-01 false Unclassified Controlled Nuclear Information markings on...) IDENTIFICATION AND PROTECTION OF UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Review of a Document or Material for Unclassified Controlled Nuclear Information § 1017.16 Unclassified Controlled Nuclear Information markings on...
10 CFR 1017.16 - Unclassified Controlled Nuclear Information markings on documents or material.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 4 2013-01-01 2013-01-01 false Unclassified Controlled Nuclear Information markings on...) IDENTIFICATION AND PROTECTION OF UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Review of a Document or Material for Unclassified Controlled Nuclear Information § 1017.16 Unclassified Controlled Nuclear Information markings on...
10 CFR 1017.16 - Unclassified Controlled Nuclear Information markings on documents or material.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 4 2014-01-01 2014-01-01 false Unclassified Controlled Nuclear Information markings on...) IDENTIFICATION AND PROTECTION OF UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Review of a Document or Material for Unclassified Controlled Nuclear Information § 1017.16 Unclassified Controlled Nuclear Information markings on...
1 CFR 21.35 - OMB control numbers.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 1 General Provisions 1 2014-01-01 2012-01-01 true OMB control numbers. 21.35 Section 21.35 General... DOCUMENTS PREPARATION OF DOCUMENTS SUBJECT TO CODIFICATION General Omb Control Numbers § 21.35 OMB control numbers. To display OMB control numbers in agency regulations, those numbers shall be placed...
1 CFR 21.35 - OMB control numbers.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 1 General Provisions 1 2013-01-01 2012-01-01 true OMB control numbers. 21.35 Section 21.35 General... DOCUMENTS PREPARATION OF DOCUMENTS SUBJECT TO CODIFICATION General Omb Control Numbers § 21.35 OMB control numbers. To display OMB control numbers in agency regulations, those numbers shall be placed...
49 CFR 1104.2 - Document specifications.
Code of Federal Regulations, 2014 CFR
2014-10-01
... to facilitate automated processing in document sheet feeders, original documents of more than one... textual submissions. Use of color in filings is limited to images such as graphs, maps and photographs. To facilitate automated processing of color pages, color pages may not be inserted among pages containing text...
49 CFR 1104.2 - Document specifications.
Code of Federal Regulations, 2010 CFR
2010-10-01
... to facilitate automated processing in document sheet feeders, original documents of more than one... textual submissions. Use of color in filings is limited to images such as graphs, maps and photographs. To facilitate automated processing of color pages, color pages may not be inserted among pages containing text...
49 CFR 1104.2 - Document specifications.
Code of Federal Regulations, 2012 CFR
2012-10-01
... to facilitate automated processing in document sheet feeders, original documents of more than one... textual submissions. Use of color in filings is limited to images such as graphs, maps and photographs. To facilitate automated processing of color pages, color pages may not be inserted among pages containing text...
49 CFR 1104.2 - Document specifications.
Code of Federal Regulations, 2011 CFR
2011-10-01
... to facilitate automated processing in document sheet feeders, original documents of more than one... textual submissions. Use of color in filings is limited to images such as graphs, maps and photographs. To facilitate automated processing of color pages, color pages may not be inserted among pages containing text...
49 CFR 1104.2 - Document specifications.
Code of Federal Regulations, 2013 CFR
2013-10-01
... to facilitate automated processing in document sheet feeders, original documents of more than one... textual submissions. Use of color in filings is limited to images such as graphs, maps and photographs. To facilitate automated processing of color pages, color pages may not be inserted among pages containing text...
Gaia DR2 documentation Chapter 8: Astrophysical Parameters
NASA Astrophysics Data System (ADS)
Manteiga, M.; Andrae, R.; Fouesneau, M.; Creevey, O.; Ordenovic, C.; Mary, N.; Jean-Antoine-Piccolo, A.; Bailer-Jones, C. A. L.
2018-04-01
This chapter of the Gaia DR2 documentation describes Apsis, the Astrophysical Parameters Inference System used for processing Gaia DR2 data. Beyond this documentation, a complete description of the processing and the results, as well as additional validations, have been published in Andrae et al. (2018).
Spatial Reorientation of Sensorimotor Balance Control in Altered Gravity
NASA Technical Reports Server (NTRS)
Paloski, W. H.; Black, F. L.; Kaufman, G. D.; Reschke, M. F.; Wood, S. J.
2007-01-01
Changes in sensorimotor coordination of body segments following space flight are more pronounced after landing when the head is actively tilted with respect to the trunk. This suggests that central vestibular processing shifts from a gravitational frame of reference to a head frame of reference in microgravity. A major effect of such changes is a significant postural instability documented by standard head-erect Sensory Organization Tests. Decrements in functional performance may still be underestimated when head and gravity reference frames remain aligned. The purpose of this study was to examine adaptive changes in spatial processing for balance control following space flight by incorporating static and dynamic tilts that dissociate head and gravity reference frames. A second aim of this study was to examine the feasibility of altering the re-adaptation process following space flight by providing discordant visual-vestibular-somatosensory stimuli using short-radius pitch centrifugation.
Raveis, Victoria H; Conway, Laurie J; Uchida, Mayuko; Pogorzelska-Maziarz, Monika; Larson, Elaine L; Stone, Patricia W
2014-04-01
Health-care-associated infections (HAIs) remain a major patient safety problem even as policy and programmatic efforts designed to reduce HAIs have increased. Although information on implementing effective infection control (IC) efforts has steadily grown, knowledge gaps remain regarding the organizational elements that improve bedside practice and accommodate variations in clinical care settings. We conducted in-depth, semistructured interviews in 11 hospitals across the United States with a range of hospital personnel involved in IC (n = 116). We examined the collective nature of IC and the organizational elements that can enable disparate groups to work together to prevent HAIs. Our content analysis of participants' narratives yielded a rich description of the organizational process of implementing adherence to IC. Findings document the dynamic, fluid, interactional, and reactive nature of this process. Three themes emerged: implementing adherence efforts institution-wide, promoting an institutional culture to sustain adherence, and contending with opposition to the IC mandate.
Acioli, M D; de Carvalho, E F
1998-01-01
This study analyzes and compares several concepts of social participation in health education processes with the practical experience of schistosomiasis prevention measures under the Northeast Endemic Disease Control Program (Brazilian Ministry of Health/World Bank, 1987). Using qualitative methods, institutional documents and discourses were interpreted (Sucam, FNS, and Ministry of Health). A field study was also performed (using interviews with community-based health agents and the general population) in the Zona da Mata region of Pernambuco (a historically endemic area for schistosomiasis), focusing on the county of Amaraji. Comparing discourses and educational practices, we found factors that explain their respective points of convergence and divergence, as well as elements linked to the social and historical process of the target population which systematically limit the efficacy of such educational measures.
Dual-Use Space Technology Transfer Conference and Exhibition. Volume 1
NASA Technical Reports Server (NTRS)
Krishen, Kumar (Compiler)
1994-01-01
This document contains papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; new ways of doing business; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies. More than 77 papers, 20 presentations, and 20 exhibits covering various disciplines were presented by experts from NASA, universities, and industry.
[Nursing service certification. Norm UNE-EN-ISO 9001-2008].
Salazar de la Guerra, R; Ferrer Arnedo, C; Labrador Domínguez, M J; Sangregorio Matesanz, A
2014-01-01
To certify the nursing services using a quality management system, taking an international standard as a reference, and based on a continuous improvement process. The standard was revised, and the Quality Management System documentation was updated, consisting of a Quality Manual and 7 control procedures. All the existing procedures were coded in accordance with the documentation control process. Each operational procedure was associated with a set of indicators which made it possible to track the results obtained, analyze deviations and implement further improvements. The system was implemented successfully. Twenty-eight care procedures and eleven procedures concerning techniques were incorporated into the management system. Thirty indicators were established that allowed the whole process to be monitored. All patients were assigned to a nurse in their clinical notes and all of them had a personalized Care Plan according to planning methodology using the North American Nursing Diagnosis Association (NANDA), Nursing Interventions Classification (NIC) and Nursing Outcomes Classification (NOC) international classifications. The incidence of falls, as well as the incidence of chronic skin wounds, was low, taking into account the characteristics of the patients and the duration of the stay (mean=35.87 days). The safety indicators had a high level of compliance, with 90% of patients clearly identified and 100% with the hygiene protocol. The confidence rating given to the nurses was 91%. The certification enabled the quality of the service to be improved using a structured process, analyzing the results, dealing with non-conformities and introducing improvements. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.
[Local approval procedures act as a brake on RCTs].
van der Stok, E P; Huiskens, J; Hemmes, B; Grünhagen, D J; van Gulik, T M; Verhoef, C; Punt, C J A
2016-01-01
Large multicentre randomised controlled trials (RCTs) in the Netherlands are increasingly being impeded by major differences between local approval procedures. However, no national agenda exists as yet to improve this situation. The existence of major local differences in processing time and documentation required has been reported previously, but little is known about the costs incurred and whether or not specific certifications and research contracts are mandatory. The current study evaluated these aspects of the local procedures for obtaining approval of two oncological multicentre RCTs. Retrospective, descriptive. All local procedures for obtaining approval of two randomised clinical trials were evaluated: the CAIRO5 and CHARISMA trials initiated by the Dutch Colorectal Cancer Group (DCCG). We measured the time between approval by the Medical Ethics Review Committee (METC) and final approval by the Board of Directors (RvB), the type and number of documents needed, and the costs charged. The median time interval between approval by the Medical Ethics Review Committee and approval by the Board of Directors was 90 days (range 4-312). The number of documents required per centre ranged from 6 to 20. The costs charged ranged from €0 to €1750, and amounted to €8575 for all procedures combined. No costs were charged by the majority of the centres. The approval procedures for multicentre clinical trials in the Netherlands demonstrate major differences. Processing times, documentation required and costs are unpredictable; greater uniformity is highly desirable in this context.
Sefton, Gerri; Lane, Steven; Killen, Roger; Black, Stuart; Lyon, Max; Ampah, Pearl; Sproule, Cathryn; Loren-Gosling, Dominic; Richards, Caitlin; Spinty, Jean; Holloway, Colette; Davies, Coral; Wilson, April; Chean, Chung Shen; Carter, Bernie; Carrol, E.D.
2017-01-01
Pediatric Early Warning Scores are advocated to assist health professionals to identify early signs of serious illness or deterioration in hospitalized children. Scores are derived from the weighting applied to recorded vital signs and clinical observations reflecting deviation from a predetermined “norm.” Higher aggregate scores trigger an escalation in care aimed at preventing critical deterioration. Process errors made while recording these data, including plotting or calculation errors, have the potential to impede the reliability of the score. To test this hypothesis, we conducted a controlled study of documentation using five clinical vignettes. We measured the accuracy of vital sign recording, score calculation, and time taken to complete documentation using a handheld electronic physiological surveillance system, VitalPAC Pediatric, compared with traditional paper-based charts. We explored the user acceptability of both methods using a Web-based survey. Twenty-three staff participated in the controlled study. The electronic physiological surveillance system improved the accuracy of vital sign recording, 98.5% versus 85.6%, P < .02, Pediatric Early Warning Score calculation, 94.6% versus 55.7%, P < .02, and saved time, 68 versus 98 seconds, compared with paper-based documentation, P < .002. Twenty-nine staff completed the Web-based survey. They perceived that the electronic physiological surveillance system offered safety benefits by reducing human error while providing instant visibility of recorded data to the entire clinical team. PMID:27832032
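To illustrate how an aggregate score is derived from weighted vital-sign bands, a minimal Python sketch follows; the bands, weights, and escalation threshold are invented for illustration and are not a validated Pediatric Early Warning Score chart.

```python
# Illustrative only: how an aggregate early-warning score is formed from
# weighted vital-sign bands. These bands and weights are NOT a clinical tool.
ILLUSTRATIVE_BANDS = {
    "respiratory_rate":   [(0, 20, 0), (20, 30, 1), (30, 40, 2), (40, 999, 3)],
    "heart_rate":         [(0, 100, 0), (100, 130, 1), (130, 160, 2), (160, 999, 3)],
    "capillary_refill_s": [(0, 2, 0), (2, 3, 1), (3, 4, 2), (4, 99, 3)],
}

def band_score(value, bands):
    # Return the weight of the band the observation falls into.
    for low, high, score in bands:
        if low <= value < high:
            return score
    return max(s for _, _, s in bands)

def aggregate_score(observations: dict) -> int:
    # Sum the per-parameter weights to form the aggregate score.
    return sum(band_score(observations[k], b)
               for k, b in ILLUSTRATIVE_BANDS.items() if k in observations)

obs = {"respiratory_rate": 34, "heart_rate": 125, "capillary_refill_s": 3.5}
score = aggregate_score(obs)
print(score, "escalate" if score >= 5 else "continue routine observations")
```

A plotting or calculation error at any of these steps changes the aggregate, which is why the study above compares electronic against paper-based calculation accuracy.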
[Recommendations for the control of documents and the establishment of a documentary system].
Vinner, E
2013-06-01
The quality management system that must be implemented in a medical biology laboratory (MBL) to meet the requirements of the standard NF EN ISO 15189 is based, among other things, on the creation and use by staff of an approved and up-to-date documentary system. This documentary system consists of external documents (standards, suppliers' documents...) and internal documents (quality manual, procedures, instructions, technical and quality records...). A procedure for controlling the documentary system must be formalized. The documentary system should be modeled in order to identify the various procedures to be drafted and the risks incurred should a document be missing from the system. Each document must be indexed in a unique way and document management must be carried out rigorously. The use of document management software is a great help in managing the life cycle of documents.
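A minimal sketch of such rigorous document management is shown below: each document carries a unique index, a version, and a life-cycle status. The index format and life-cycle states are assumptions made for the sketch, not requirements taken from NF EN ISO 15189.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, Optional

@dataclass
class ControlledDocument:
    index: str                       # unique identifier, e.g. "PROC-QUAL-007" (illustrative format)
    title: str
    version: int = 1
    status: str = "draft"            # draft -> approved -> back to draft on revision
    approved_on: Optional[date] = None

@dataclass
class DocumentRegistry:
    documents: Dict[str, ControlledDocument] = field(default_factory=dict)

    def register(self, doc: ControlledDocument) -> None:
        # Enforce unique indexing of every document.
        if doc.index in self.documents:
            raise ValueError(f"duplicate index {doc.index}")
        self.documents[doc.index] = doc

    def approve(self, index: str) -> None:
        doc = self.documents[index]
        doc.status, doc.approved_on = "approved", date.today()

    def revise(self, index: str) -> None:
        # A revision increments the version and returns the document to draft.
        doc = self.documents[index]
        doc.version += 1
        doc.status, doc.approved_on = "draft", None

registry = DocumentRegistry()
registry.register(ControlledDocument("PROC-QUAL-007", "Document control procedure"))
registry.approve("PROC-QUAL-007")
registry.revise("PROC-QUAL-007")
```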
Documenting the Engineering Design Process
ERIC Educational Resources Information Center
Hollers, Brent
2017-01-01
Documentation of ideas and the engineering design process is a critical, daily component of a professional engineer's job. While patent protection is often cited as the primary rationale for documentation, it can also benefit the engineer, the team, company, and stakeholders through creating a more rigorously designed and purposeful solution.…
1 CFR 18.1 - Original and copies required.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 1 General Provisions 1 2014-01-01 2012-01-01 true Original and copies required. 18.1 Section 18.1... PROCESSING OF DOCUMENTS PREPARATION AND TRANSMITTAL OF DOCUMENTS GENERALLY § 18.1 Original and copies... two duplicate originals or certified copies. 1 However, if the document is printed or processed on...
1 CFR 18.1 - Original and copies required.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 1 General Provisions 1 2013-01-01 2012-01-01 true Original and copies required. 18.1 Section 18.1... PROCESSING OF DOCUMENTS PREPARATION AND TRANSMITTAL OF DOCUMENTS GENERALLY § 18.1 Original and copies... two duplicate originals or certified copies. 1 However, if the document is printed or processed on...
1 CFR 18.1 - Original and copies required.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 1 General Provisions 1 2012-01-01 2012-01-01 false Original and copies required. 18.1 Section 18.1... PROCESSING OF DOCUMENTS PREPARATION AND TRANSMITTAL OF DOCUMENTS GENERALLY § 18.1 Original and copies... two duplicate originals or certified copies. 1 However, if the document is printed or processed on...
Guidance and Control Software Project Data - Volume 1: Planning Documents
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J. (Editor)
2008-01-01
The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.
Guidance and Control Software Project Data - Volume 3: Verification Documents
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J. (Editor)
2008-01-01
The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.
Friedman, Carol; Hripcsak, George; Shagina, Lyuda; Liu, Hongfang
1999-01-01
Objective: To design a document model that provides reliable and efficient access to clinical information in patient reports for a broad range of clinical applications, and to implement an automated method using natural language processing that maps textual reports to a form consistent with the model. Methods: A document model that encodes structured clinical information in patient reports while retaining the original contents was designed using the extensible markup language (XML), and a document type definition (DTD) was created. An existing natural language processor (NLP) was modified to generate output consistent with the model. Two hundred reports were processed using the modified NLP system, and the XML output that was generated was validated using an XML validating parser. Results: The modified NLP system successfully processed all 200 reports. The output of one report was invalid, and 199 reports were valid XML forms consistent with the DTD. Conclusions: Natural language processing can be used to automatically create an enriched document that contains a structured component whose elements are linked to portions of the original textual report. This integrated document model provides a representation where documents containing specific information can be accurately and efficiently retrieved by querying the structured components. If manual review of the documents is desired, the salient information in the original reports can also be identified and highlighted. Using an XML model of tagging provides an additional benefit in that software tools that manipulate XML documents are readily available. PMID:9925230
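The validation step can be illustrated with a short Python sketch using lxml: an XML fragment is checked against a DTD, mirroring the use of a validating parser described above. The toy DTD and report below are stand-ins, not the cited clinical document model.

```python
from io import StringIO
from lxml import etree

# Toy DTD standing in for a structured-report document type definition.
dtd = etree.DTD(StringIO("""
<!ELEMENT report (finding*, text)>
<!ELEMENT finding EMPTY>
<!ATTLIST finding code CDATA #REQUIRED>
<!ELEMENT text (#PCDATA)>
"""))

# Toy document: a structured component plus the original narrative text.
doc = etree.fromstring(
    b"<report><finding code='cough'/><text>patient reports cough</text></report>"
)

if dtd.validate(doc):
    print("valid XML form consistent with the DTD")
else:
    print(dtd.error_log.filter_from_errors())
```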
Documentation of new mission control center White Flight Control Room (FLCR)
1995-06-06
Documentation of the new mission control center White Flight Control Room (FLCR). Excellent overall view of White FLCR with personnel manning console workstations (11221). Fisheye lens perspective from Flight Director station with Brian Austin (11222). Environmental (EECOM) workstation and personnel (11223).
Citty, Sandra W.; Kamel, Amir; Garvan, Cynthia; Marlowe, Lee; Westhoff, Lynn
2017-01-01
Malnutrition in hospitalized patients is a major cause for hospital re-admission, pressure ulcers and increased hospital costs. Methods to improve the administration and documentation of nutritional supplements for hospitalized patients are needed to improve patient care, outcomes and resource utilization. Staff at a medium-sized academic health science center hospital in the southeastern United States noted that nutritional supplements ordered for patients at high risk for malnutrition were not offered or administered to patients in a standardized manner and/or not documented clearly in the electronic health record as per prescription. This paper reports on a process improvement project that redesigned the ordering, administration and documentation process of oral nutritional supplements (ONS) in the electronic health record. By adding nutritional products to the medication order sets and adding an electronic nutrition administration record (ENAR) tab, the multidisciplinary team sought to standardize nutritional supplement ordering, documentation and administration at prescribed intervals. This process improvement project used a triangulated approach to evaluating pre- and post-process change including: medical record reviews, patient interviews, and nutrition formula room log reports. Staff education and training was carried out prior to initiation of the system changes. This process change resulted in an average decrease in the return of unused nutritional formula from 76% returned at baseline to 54% post-process change. The process change resulted in 100% of nutritional supplement orders having documentation about nutritional medication administration and/or reason for non-administration. Documentation in the ENAR showed that 41% of ONS orders were given and 59% were not given. Significantly more patients reported being offered the ONS product (p=0.0001) after process redesign, and more patients (5% before ENAR and 86% after ENAR) reported being offered the correct type, amount and frequency of nutritional products (p=0.0001). ENAR represented an effective strategy to improve administration and documentation of nutritional supplements for hospitalized patients. PMID:28243439
NASA Astrophysics Data System (ADS)
Rahman, Fuad; Tarnikova, Yuliya; Hartono, Rachmat; Alam, Hassan
2006-01-01
This paper presents a novel automatic web publishing solution, PageView (R). PageView (R) is a complete working solution for document processing and management. The principal aim of this tool is to allow workgroups to share, access and publish documents on-line on a regular basis. For example, suppose a person is working on some documents. The user will, in some fashion, organize his work either in his own local directory or in a shared network drive. Now extend that concept to a workgroup. Within a workgroup, some users are working together on some documents, and they are saving them in a directory structure somewhere on a document repository. The next stage of this reasoning is that a workgroup working on some documents wants to publish them routinely on-line. It may happen that the members are using different editing tools, different software, and different graphics tools. The resulting documents may be in PDF, Microsoft Office (R), HTML, or WordPerfect format, just to name a few. In general, this process requires the documents to be converted to HTML, after which a web designer needs to work on that collection to make it available on-line. PageView (R) takes care of this whole process automatically, making the document workflow clean and easy to follow. The PageView (R) Server publishes documents, complete with the directory structure, for on-line use. The documents are automatically converted to HTML and PDF so that users can view the content without downloading the original files, or having to download browser plug-ins. Once published, other users can access the documents as if they were accessing them from their local folders. The paper describes the complete working system and discusses possible applications within document management research.
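A minimal sketch of such a publishing pass is shown below: it walks a workgroup directory, mirrors its structure, and converts supported files to HTML. The convert_to_html() helper and the supported-format list are hypothetical placeholders, since PageView (R)'s actual converters are proprietary.

```python
from pathlib import Path

# Hypothetical set of source formats accepted for publishing.
SUPPORTED = {".pdf", ".doc", ".docx", ".html", ".wpd"}

def convert_to_html(src: Path, dst: Path) -> None:
    # Placeholder conversion: a real system would plug in per-format converters here.
    dst.write_text(f"<html><body>Converted preview of {src.name}</body></html>")

def publish_tree(source_root: Path, publish_root: Path) -> None:
    if not source_root.is_dir():
        return
    for src in source_root.rglob("*"):
        if src.is_file() and src.suffix.lower() in SUPPORTED:
            # Preserve the workgroup's directory structure in the published site.
            dst = (publish_root / src.relative_to(source_root)).with_suffix(".html")
            dst.parent.mkdir(parents=True, exist_ok=True)
            convert_to_html(src, dst)

publish_tree(Path("workgroup_docs"), Path("published_site"))
```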
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the third of five volumes on Information System Life-Cycle and Documentation Standards which present a well organized, easily used standard for providing technical information needed for developing information systems, components, and related processes. This volume states the Software Management and Assurance Program documentation standard for a product specification document and for data item descriptions. The framework can be applied to any NASA information system, software, hardware, operational procedures components, and related processes.
An automated system for generating program documentation
NASA Technical Reports Server (NTRS)
Hanney, R. J.
1970-01-01
A documentation program was developed in which the emphasis is placed on text content rather than flowcharting. It is keyword oriented, with 26 keywords that control the program. Seventeen of those keywords are recognized by the flowchart generator, three are related to text generation, and three have to do with control card and deck displays. The strongest advantage offered by the documentation program is that it produces the entire document. The document is prepared on 35mm microfilm, which is easy to store, and letter-size reproductions can be made inexpensively on bond paper.
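The keyword-driven control can be illustrated with a small dispatch sketch in Python; the keyword names and handlers below are invented examples, not the program's actual 26 keywords.

```python
# Invented keywords standing in for the three groups mentioned above:
# flowchart generation, text generation, and control-card/deck display.
def handle_flowchart(arg): print(f"[flowchart] {arg}")
def handle_text(arg):      print(f"[text] {arg}")
def handle_deck(arg):      print(f"[deck display] {arg}")

KEYWORDS = {
    "$BOX":  handle_flowchart,
    "$TEXT": handle_text,
    "$DECK": handle_deck,
}

def process(lines):
    for line in lines:
        word, _, rest = line.partition(" ")
        handler = KEYWORDS.get(word)
        if handler:
            handler(rest)
        else:
            print(line)        # ordinary text content passes straight through

process(["$TEXT Introduction", "This routine reads the input deck.", "$BOX Read input"])
```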
NASA Astrophysics Data System (ADS)
Contos, Adam R.; Acton, D. Scott; Atcheson, Paul D.; Barto, Allison A.; Lightsey, Paul A.; Shields, Duncan M.
2006-06-01
The opto-mechanical design of the 6.6 meter James Webb Space Telescope (JWST), with its actively-controlled secondary and 18-segment primary mirror, presents unique challenges from a system engineering perspective. To maintain the optical alignment of the telescope on-orbit, a process called wavefront sensing and control (WFS&C) is employed to determine the current state of the mirrors and calculate the optimal mirror move updates. The needed imagery is downloaded to the ground, where the WFS&C algorithms to process the images reside, and the appropriate commands are uploaded to the observatory. Rather than use a dedicated wavefront sensor for the imagery as is done in most other applications, a science camera is used instead. For the success of the mission, WFS&C needs to perform flawlessly using the assets available among the combination of separate elements (ground operations, spacecraft, science instruments, optical telescope, etc.) that cross institutional as well as geographic borders. Rather than be yet another distinct element with its own set of requirements to flow to the other elements as was originally planned, a novel approach was selected. This approach entails reviewing and auditing other documents for the requirements needed to satisfy the needs of WFS&C. Three actions are taken: (1) when appropriate requirements exist, they are tracked by WFS&C; (2) when an existing requirement is insufficient to meet the need, a requirement change is initiated; and finally (3) when a needed requirement is missing, a new requirement is established in the corresponding document. This approach, deemed a "best practice" at the customer's independent audit, allows for program confidence that the necessary requirements are complete, while still maintaining the responsibility for the requirement with the most appropriate entity. This paper describes the details and execution of the approach; the associated WFS&C requirements and verification documentation; and the implementation of the primary database tool for the project, DOORS (Dynamic Object-Oriented Requirements System).
Guidance and Control Software Project Data - Volume 2: Development Documents
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J. (Editor)
2008-01-01
The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the development documents from the GCS project. Volume 2 contains three appendices: A. Guidance and Control Software Development Specification; B. Design Description for the Pluto Implementation of the Guidance and Control Software; and C. Source Code for the Pluto Implementation of the Guidance and Control Software
Securing Provenance of Distributed Processes in an Untrusted Environment
NASA Astrophysics Data System (ADS)
Syalim, Amril; Nishide, Takashi; Sakurai, Kouichi
Recently, there is much concern about the provenance of distributed processes, that is about the documentation of the origin and the processes to produce an object in a distributed system. The provenance has many applications in the forms of medical records, documentation of processes in the computer systems, recording the origin of data in the cloud, and also documentation of human-executed processes. The provenance of distributed processes can be modeled by a directed acyclic graph (DAG) where each node represents an entity, and an edge represents the origin and causal relationship between entities. Without sufficient security mechanisms, the provenance graph suffers from integrity and confidentiality problems, for example changes or deletions of the correct nodes, additions of fake nodes and edges, and unauthorized accesses to the sensitive nodes and edges. In this paper, we propose an integrity mechanism for provenance graph using the digital signature involving three parties: the process executors who are responsible in the nodes' creation, a provenance owner that records the nodes to the provenance store, and a trusted party that we call the Trusted Counter Server (TCS) that records the number of nodes stored by the provenance owner. We show that the mechanism can detect the integrity problem in the provenance graph, namely unauthorized and malicious “authorized” updates even if all the parties, except the TCS, collude to update the provenance. In this scheme, the TCS only needs a very minimal storage (linear with the number of the provenance owners). To protect the confidentiality and for an efficient access control administration, we propose a method to encrypt the provenance graph that allows access by paths and compartments in the provenance graph. We argue that encryption is important as a mechanism to protect the provenance data stored in an untrusted environment. We analyze the security of the integrity mechanism, and perform experiments to measure the performance of both mechanisms.
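A highly simplified Python sketch of the integrity idea follows: a process executor signs each node it creates, the provenance owner stores the node with its signature, and a trusted counter server records how many nodes have been stored. It omits the DAG edges, the collusion-resistance details, and the path/compartment encryption of the full scheme, and it uses the third-party cryptography package for Ed25519 signatures.

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

class TrustedCounterServer:
    """Minimal stand-in for the TCS: it only counts stored nodes per owner."""
    def __init__(self):
        self.counts = {}

    def record(self, owner: str) -> int:
        self.counts[owner] = self.counts.get(owner, 0) + 1
        return self.counts[owner]

executor_key = Ed25519PrivateKey.generate()   # the process executor's signing key
tcs = TrustedCounterServer()
store = []                                    # the provenance owner's node store

node = {"id": "n1", "process": "align-reads", "inputs": ["n0"]}
payload = json.dumps(node, sort_keys=True).encode()
signature = executor_key.sign(payload)        # executor signs the node it created
store.append((node, signature))               # provenance owner records node + signature
count = tcs.record(owner="lab-A")             # TCS notes that one more node exists

# Later, an auditor verifies each stored node and checks the count matches;
# a deleted or replaced node shows up as a signature failure or a count mismatch.
executor_key.public_key().verify(signature, json.dumps(store[0][0], sort_keys=True).encode())
assert count == len(store)
```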
jTraML: an open source Java API for TraML, the PSI standard for sharing SRM transitions.
Helsens, Kenny; Brusniak, Mi-Youn; Deutsch, Eric; Moritz, Robert L; Martens, Lennart
2011-11-04
We here present jTraML, a Java API for the Proteomics Standards Initiative TraML data standard. The library provides fully functional classes for all elements specified in the TraML XSD document, as well as convenient methods to construct controlled vocabulary-based instances required to define SRM transitions. The use of jTraML is demonstrated via a two-way conversion tool between TraML documents and vendor specific files, facilitating the adoption process of this new community standard. The library is released as open source under the permissive Apache2 license and can be downloaded from http://jtraml.googlecode.com . TraML files can also be converted online at http://iomics.ugent.be/jtraml .
Software engineering for ESO's VLT project
NASA Astrophysics Data System (ADS)
Filippi, G.
1994-12-01
This paper reports on the experience at the European Southern Observatory on the application of software engineering techniques to a 200 man-year control software project for the Very Large Telescope (VLT). This shall provide astronomers, before the end of the century, with one of the most powerful telescopes in the world. From the definition of the general model, described in the software management plan, specific activities have been and will be defined: standards for documents and for code development, design approach using a CASE tool, the process of reviewing both documentation and code, quality assurance, test strategy, etc. The initial choices, the current implementation and the future planned activities are presented and, where feedback is already available, pros and cons are discussed.
NASA Technical Reports Server (NTRS)
McCubbin, Francis M.; Zeigler, Ryan A.
2017-01-01
The Astromaterials Acquisition and Curation Office (henceforth referred to as the NASA Curation Office) at NASA Johnson Space Center (JSC) is responsible for curating all of NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10F, JSC is charged with curation of all extraterrestrial material under NASA control, including that from future NASA missions. The Directive goes on to define curation as including documentation, preservation, preparation, and distribution of samples for research, education, and public outreach. Here we briefly describe NASA's astromaterials collections and our ongoing efforts related to enhancing the utility of our current collections as well as our efforts to prepare for future sample return missions. We collectively refer to these efforts as advanced curation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Q.; Xie, S.
This report describes the Atmospheric Radiation Measurement (ARM) Best Estimate (ARMBE) station-based surface data (ARMBESTNS) value-added product. It is a twin data product of the ARMBE 2-Dimensional gridded (ARMBE2DGRID) data set. Unlike the ARMBE2DGRID data set, ARMBESTNS data are reported at the original site locations and retain the original information (except for the interpolation over time). Therefore, users have the flexibility to process the data with the approach most suitable for their applications. This document provides information about the input data, quality control (QC) method, and output format of this data set. As much of the information is identical to that of the ARMBE2DGRID data, this document emphasizes the aspects in which the two data sets differ.
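As a toy illustration of station-based quality control combined with interpolation over time, the Python sketch below masks QC-flagged samples and fills short gaps; the flag convention (0 = good), the missing-value sentinel, and the gap limit are assumptions made for the sketch, not the ARMBE specification.

```python
import numpy as np
import pandas as pd

# One hypothetical station time series of surface temperature with QC flags.
times = pd.date_range("2010-06-01", periods=6, freq="h")
temp = pd.Series([24.1, 24.3, -9999.0, 25.0, 25.4, 25.9], index=times)
qc = pd.Series([0, 0, 2, 0, 0, 0], index=times)          # nonzero -> suspect/bad (assumed)

# Mask flagged or sentinel values, then interpolate over time across short gaps only.
temp = temp.mask((qc != 0) | (temp <= -9990), np.nan)
temp_filled = temp.interpolate(method="time", limit=2)

print(temp_filled)
```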
Document segmentation for high-quality printing
NASA Astrophysics Data System (ADS)
Ancin, Hakan
1997-04-01
A technique to segment dark text on the light background of mixed-mode color documents is presented. This process does not perceptually change graphics and photo regions. Color documents are scanned and printed from various media which usually do not have a clean background. This is especially the case for printouts generated from thin magazine pages: these printouts often include text and figures from the back of the page, which is called bleeding. Removal of bleeding artifacts improves the perceptual quality of the printed document and reduces color ink usage. By detecting the light background of the document, these artifacts are removed from background regions. Detection of dark text regions also enables the halftoning algorithms to use true black ink for the black text pixels instead of composite black. The processed document contains sharp black text on a white background, resulting in improved perceptual quality and better ink utilization. The described method is memory efficient and requires only a small number of scan lines of the high-resolution color document during processing.
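A toy sketch of the two decisions described above follows: estimate whether the page background is light, then mark dark-text pixels so they can be rendered in true black while bleed-through and other light pixels are cleaned to white. The thresholds are illustrative, not the tuned values of the described method.

```python
import numpy as np

def segment_dark_text(gray: np.ndarray, bg_thresh: int = 200, text_thresh: int = 80):
    """gray: 2-D uint8 image, 0 = black, 255 = white. Thresholds are illustrative."""
    light_background = np.median(gray) >= bg_thresh   # is the page background light?
    text_mask = gray <= text_thresh                   # candidate dark-text pixels
    if light_background:
        # Text pixels -> true black; everything else -> clean white background.
        cleaned = np.where(text_mask, 0, 255)
    else:
        cleaned = gray.copy()                         # leave photo/graphic regions untouched
    return light_background, text_mask, cleaned

page = np.full((8, 8), 235, dtype=np.uint8)   # mostly light page
page[2, 1:6] = 40                             # a dark text stroke
page[5, 1:6] = 215                            # faint bleed-through from the reverse side
_, mask, cleaned = segment_dark_text(page)
print(mask.sum(), "text pixels; bleed-through removed:", bool((cleaned[5, 1:6] == 255).all()))
```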
Proceedings of the Second Joint Technology Workshop on Neural Networks and Fuzzy Logic, volume 2
NASA Technical Reports Server (NTRS)
Lea, Robert N. (Editor); Villarreal, James A. (Editor)
1991-01-01
Documented here are papers presented at the Neural Networks and Fuzzy Logic Workshop sponsored by NASA and the University of Texas, Houston. Topics addressed included adaptive systems, learning algorithms, network architectures, vision, robotics, neurobiological connections, speech recognition and synthesis, fuzzy set theory and application, control and dynamics processing, space applications, fuzzy logic and neural network computers, approximate reasoning, and multiobject decision making.
Army General Fund Adjustments Not Adequately Documented or Supported
2016-07-26
compilation process. Finding The Office of the Assistant Secretary of the Army (Financial Management & Comptroller) (OASA[FM&C]) and the Defense Finance and...statements were unreliable and lacked an adequate audit trail. Furthermore, DoD and Army managers could not rely on the data in their accounting...systems when making management and resource decisions. Until the Army and DFAS Indianapolis correct these control deficiencies, there is considerable
Asiki, Gershim; Shao, Shuai; Wainana, Carol; Khayeka-Wandabwa, Christopher; Haregu, Tilahun N; Juma, Pamela A; Mohammed, Shukri; Wambui, David; Gong, Enying; Yan, Lijing L; Kyobutungi, Catherine
2018-05-09
In Kenya, cardiovascular diseases (CVDs) accounted for more than 10% of total deaths and 4% of total Disability-Adjusted Life Years (DALYs) in 2015, with a steady increase over the past decade. The main objective of this paper was to review the existing policies and their content in relation to prevention, control and management of CVDs at primary health care (PHC) level in Kenya. A targeted document search in the Google search engine using the keywords "Kenya national policy on cardiovascular diseases" and "Kenya national policy on non-communicable diseases (NCDs)" was conducted, in addition to key informant interviews with Kenyan policy makers. Relevant regional and international policy documents were also included. The contents of the documents identified were reviewed to assess how well they aligned with global health policies on CVD prevention, control and management. Thematic content analysis of the key informant interviews was also conducted to supplement the document reviews. A total of 17 documents were reviewed and three key informants interviewed. Besides the Tobacco Control Act (2007), all policy documents for CVD prevention, control and management were developed after 2013. The national policies were preceded by global initiatives and guidelines and were similar in content to the global policies. The Kenya Health Policy (2014-2030), the Kenya Health Sector Strategic and Investment Plan (2014-2018) and the Kenya National Strategy for the Prevention and Control of Non-communicable Diseases (2015-2020) had strategies on NCDs, including CVDs. Other policy documents for behavioral risk factors (the Tobacco Control Act 2007, the Alcoholic Drinks Control (Licensing) Regulations (2010)) were available. The National Nutrition Action Plan (2012-2017) was available as a draft. Although Kenya has a tiered health care system comprising primary health care, integration of CVD prevention and control at PHC level was not explicitly mentioned in the policy documents. This review revealed important gaps in the policy environment for prevention, control and management of CVDs in PHC settings in Kenya. There is a need to continuously engage the Ministry of Health and other sectors to prioritize inclusion of CVD services in PHC.
Engineering Documentation and Data Control
NASA Technical Reports Server (NTRS)
Matteson, Michael J.; Bramley, Craig; Ciaruffoli, Veronica
2001-01-01
Mississippi Space Services (MSS), the facility services contractor for NASA's John C. Stennis Space Center (SSC), is utilizing technology to improve engineering documentation and data control. Two identified improvement areas, labor-intensive documentation research and outdated drafting standards, were targeted as top priority. MSS selected AutoManager(R) WorkFlow from Cyco software to manage engineering documentation. The software is currently installed on over 150 desktops. The outdated SSC drafting standard was written for pre-CADD drafting methods, in other words, board drafting. Implementation of COTS software solutions to manage engineering documentation and update the drafting standard resulted in significant increases in productivity by reducing the time spent searching for documents.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dingell, J.D.
1991-02-01
The Department of Energy's (DOE) Lawrence Livermore National Laboratory, located in Livermore, California, generates and controls large numbers of classified documents associated with the research and testing of nuclear weapons. Concern has been raised about the potential for espionage at the laboratory and the national security implications of classified documents being stolen. This paper determines the extent of missing classified documents at the laboratory and assesses the adequacy of accountability over classified documents in the laboratory's custody. Audit coverage was limited to the approximately 600,000 secret documents in the laboratory's custody. The adequacy of DOE's oversight of the laboratory's secret document control program was also assessed.
Software for Better Documentation of Other Software
NASA Technical Reports Server (NTRS)
Pinedo, John
2003-01-01
The Literate Programming Extraction Engine is a Practical Extraction and Reporting Language- (PERL-)based computer program that facilitates and simplifies the implementation of a concept of self-documented literate programming in a fashion tailored to the typical needs of scientists. The advantage for the programmer is that documentation and source code are written side-by-side in the same file, reducing the likelihood that the documentation will be inconsistent with the code and improving the verification that the code performs its intended functions. The advantage for the user is the knowledge that the documentation matches the software because they come from the same file. This program unifies the documentation process for a variety of programming languages, including C, C++, and several versions of FORTRAN. This program can process the documentation in any markup language, and incorporates the LaTeX typesetting software. The program includes sample Makefile scripts for automating both the code-compilation (when appropriate) and documentation-generation processes into a single command-line statement. Also included are macro instructions for the Emacs display-editor software, making it easy for a programmer to toggle between editing in a code or a documentation mode.
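The core operation such an engine performs, pulling interleaved code and documentation apart from a single literate source file, can be sketched briefly. The '@code'/'@doc' markers below are purely hypothetical stand-ins; the actual tool's markup handling and LaTeX integration are considerably richer.

```python
def extract(literate_path):
    """Split a literate source file into a code stream and a documentation stream.

    Lines after an '@code' marker go to the code stream, lines after an
    '@doc' marker go to the documentation stream (hypothetical markers,
    used here only to illustrate the side-by-side idea).
    """
    code, docs = [], []
    mode = "doc"                         # start in documentation mode
    with open(literate_path, encoding="utf-8") as fh:
        for line in fh:
            stripped = line.strip()
            if stripped == "@code":
                mode = "code"
            elif stripped == "@doc":
                mode = "doc"
            else:
                (code if mode == "code" else docs).append(line)
    return "".join(code), "".join(docs)

# Usage: write the two streams to separate files, e.g. feeding the code to a
# compiler and the documentation to LaTeX, mirroring the single-command
# Makefile workflow the abstract describes.
```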
Moving research to practice through partnership: a case study in Asphalt Paving.
Chang, Charlotte; Nixon, Laura; Baker, Robin
2015-08-01
Multi-stakeholder partnerships play a critical role in dissemination and implementation in health and safety. To better document and understand construction partnerships that have successfully scaled up effective interventions to protect workers, this case study focused on the collaborative processes of the Asphalt Paving Partnership. In the 1990s, this partnership developed, evaluated, disseminated, and achieved near universal, voluntary adoption of paver engineering controls to reduce exposure to asphalt fumes. We used in-depth interviews (n = 15) and document review in the case study. We describe contextual factors that both facilitated and challenged the formation of the collaboration, central themes and group processes, and research to practice (r2p) outcomes. The Asphalt Paving Partnership offers insight into how multi-stakeholder partnerships in construction can draw upon the strengths of diverse members to improve the dissemination and adoption of health and safety innovations and build a collaborative infrastructure to sustain momentum over time. © 2015 Wiley Periodicals, Inc.
Graham, Denise H
2004-11-01
The quality improvement plan relies on controlling quality of care through improving the process or system as a whole. Your ongoing data collection is paramount to system-wide improvement in performance, including financial performance, operational performance, and overall service performance and satisfaction. The threat of litigation and having to defend yourself from a claim of wrongdoing still looms every time your wheels turn. Your runsheet must serve and protect you. Look at the NFPA 1710 standard, which was enacted to serve and protect firefighters. That standard was developed with their personal safety and well-being as the principle behind staffing requirements. At what stage of draft do you suppose the NFPA 1710 standard would be today if the relevant data were collected sporadically or were not tracked for each service-related death? It may have taken many more service-related deaths to effect change for a system-wide improvement in operational performance. Every call merits documentation and data collection. Your data are catalysts for change.
LANL Safeguards and Security Assurance Program. Revision 6
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-04-03
The Safeguards and Security (S and S) Assurance Program provides a continuous quality improvement approach to ensure effective, compliant S and S program implementation throughout the Los Alamos National Laboratory. Any issues identified through the various internal and external assessments are documented, tracked and closed using the Safeguards and Security Issue Management Program. The Laboratory utilizes an integrated S and S systems approach to protect US Department of Energy (DOE) interests from theft or diversion of special nuclear material (SNM), sabotage, espionage, loss or theft of classified/controlled matter or government property, and other hostile acts that may cause unacceptable impacts on national security, the health and safety of employees and the public, and the environment. This document explains the basis, scope, and conduct of the S and S process to include: self-assessments, issue management, risk assessment, and root cause analysis. It also provides a discussion of S and S topical areas, roles and responsibilities, process flow charts, minimum requirements, methodology, terms, and forms.
An adaptable product for material processing and life science missions
NASA Technical Reports Server (NTRS)
Wassick, Gregory; Dobbs, Michael
1995-01-01
The Experiment Control System II (ECS-II) is designed to make available to the microgravity research community the same tools and mode of automated experimentation that their ground-based counterparts have enjoyed for the last two decades. The design goal was accomplished by combining commercial automation tools familiar to the experimenter community with system control components that interface with the on-orbit platform in a distributed architecture. The architecture insulates the tools necessary for managing a payload. By using commercial software and hardware components whenever possible, development costs were greatly reduced compared with traditional space development projects. Using commercial-off-the-shelf (COTS) components also improved usability by providing familiar user interfaces, providing a wealth of readily available documentation, and reducing the need for training on system-specific details. The modularity of the distributed architecture makes it very amenable to modification for different on-orbit experiments requiring robotics-based automation.
Measuring the quality of therapeutic apheresis care in the pediatric intensive care unit.
Sussmane, Jeffrey B; Torbati, Dan; Gitlow, Howard S
2012-01-01
Our goal was to measure the quality of care provided in the Pediatric Intensive Care Unit (PICU) during Therapeutic Apheresis (TA). We described the care as a step-by-step process. We designed a flow chart to carefully document each step of the process. We then defined each step with a unique clinical indicator (CI) that represented the exact task we felt provided quality care. These CIs were studied and modified for 1 year. We measured our performance in this process by the number of times we accomplished the CI vs. the total number of CIs that were to be performed. The degree of compliance with these clinical indicators was analyzed and used as a metric for quality by calculating how closely the process was running as planned, or "in control." The Apheresis Process was in control (compliance) for 47% of the indicators, as measured in the aggregate for the first observational year. We then applied the theory of Total Quality Management (TQM) through our Design, Measure, Analyze, Improve, and Control (DMAIC) model. We were able to improve the process and bring it into control by increasing the compliance to > 99.74%, in the aggregate, for the third and fourth quarters of the second year. We have implemented TQM to increase compliance, thus control, of a highly complex and multidisciplinary Pediatric Intensive Care therapy. We have shown a reproducible and scalable measure of quality for a complex clinical process in the PICU, without additional capital expenditure. Copyright © 2011 Wiley-Liss, Inc.
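The compliance metric described here is simply the number of clinical indicators accomplished divided by the number expected, monitored against control limits. A minimal sketch of that calculation, using a generic 3-sigma p-chart formula rather than the authors' exact control criterion (which the abstract does not spell out), is shown below; the numbers in the usage line are illustrative, not study data.

```python
import math

def compliance(completed, expected):
    """Fraction of clinical indicators actually accomplished."""
    return completed / expected

def p_chart_limits(p_bar, n):
    """Three-sigma control limits for a monitored proportion.

    p_bar: average compliance across periods; n: indicators per period.
    Standard p-chart formula; the study's own control criterion may differ.
    """
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)

# Illustrative only: check whether a period's compliance stays inside limits.
lcl, ucl = p_chart_limits(p_bar=0.9974, n=200)
period_ok = lcl <= compliance(completed=199, expected=200) <= ucl
```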
Holderried, Martin; Bökel, Ann-Catrin; Ochsmann, Elke
2018-05-01
In order to safeguard and control the processes and quality of medical services, a suitable system for steering all relevant documents is essential from the point of view of clinical quality management. Systems that support automated document steering are called document management systems (DMS), and they are also entering the healthcare sector. The use of DMS in the German healthcare sector has hardly been investigated so far. To close this knowledge gap, interviews were carried out with German university hospitals over a six-month period and subjected to a qualitative content analysis according to Mayring. In total, 25 university hospitals agreed to participate in this study, 19 of which have been working with a digital DMS for about six years on average. There was great variety among the IT systems used. Document management and usability of the DMS, as well as its integration into existing IT structures, were key decision-making criteria for the selection of a digital DMS. In general, the long-term usability of a DMS is supported by regular evaluation of one's own requirements for the system, its administration, and training programs. In addition, DMS have a positive effect on patient safety and the quality of medical care. Copyright © 2018. Published by Elsevier GmbH.
Intelligent Document Gateway: A Service System Case Study and Analysis
NASA Astrophysics Data System (ADS)
Krishna, Vikas; Lelescu, Ana
In today's fast-paced world, it is necessary to process business documents expediently, accurately, and diligently. In other words, processing has to be fast, errors must be prevented (or caught and corrected quickly), and documents cannot be lost or misplaced. Failure to meet these criteria, depending on the type and purpose of the documents, can have serious business, legal, or safety consequences. In this paper, we evaluate a B2B order placement service system that allows clients to place orders for products and services over a network. We describe the order placement service before and after deploying the Intelligent Document Gateway (IDG), a document-centric business process automation technology from IBM Research. Using a service science perspective and service systems frameworks, we provide an analysis of how IDG improved the value proposition for both the service providers and the service clients.
Multiple benefits of personal FM system use by children with auditory processing disorder (APD).
Johnston, Kristin N; John, Andrew B; Kreisman, Nicole V; Hall, James W; Crandell, Carl C
2009-01-01
Children with auditory processing disorders (APD) were fitted with Phonak EduLink FM devices for home and classroom use. Baseline measures of the children with APD, prior to FM use, documented significantly lower speech-perception scores, evidence of decreased academic performance, and psychosocial problems in comparison to an age- and gender-matched control group. Repeated measures during the school year demonstrated speech-perception improvement in noisy classroom environments as well as significant academic and psychosocial benefits. Compared with the control group, the children with APD showed greater speech-perception advantage with FM technology. Notably, after prolonged FM use, even unaided (no FM device) speech-perception performance was improved in the children with APD, suggesting the possibility of fundamentally enhanced auditory system function.
The role of aluminum in slow sand filtration.
Weber-Shirk, Monroe L; Chan, Kwok Loon
2007-03-01
Engineering enhancement of slow sand filtration has been an enigma in large part because the mechanisms responsible for particle removal have not been well characterized. The presumed role of biological processes in the filter ripening process nearly precluded the possibility of enhancing filter performance since interventions to enhance biological activity would have required decreasing the quality of the influent water. In previous work, we documented that an acid soluble polymer controls filter performance. The new understanding that particle removal is controlled in large part by physical chemical mechanisms has expanded the possibilities of engineering slow sand filter performance. Herein, we explore the role of naturally occurring aluminum as a ripening agent for slow sand filters and the possibility of using a low dose of alum to improve filter performance or to ripen slow sand filters.
Management of infection control in dental practice.
Smith, A; Creanor, S; Hurrell, D; Bagg, J; McCowan, M
2009-04-01
This was an observational study in which the management policies and procedures associated with infection control and instrument decontamination were examined in 179 dental surgeries by a team of trained surveyors. Information relating to the management of a wide range of infection control procedures, in particular the decontamination of dental instruments, was collected by interview and by examination of practice documentation. This study found that although the majority of surgeries (70%) claimed to have a management policy on infection control, only 50% of these were documented. For infection control policies, 79% of surgeries had access to the British Dental Association Advice Sheet A12. Infection control policies were claimed to be present in 89% of surgeries, of which 62% were documented. Seventy-seven per cent of staff claimed to have received specific infection control training, but for instrument decontamination this was provided mainly by demonstration (97%) or observed practice (88%). Many dental nurses (74%) and dental practitioners (57%) did not recognise the symbol used to designate a single-use device. Audit of infection control or decontamination activities was undertaken in 11% of surgeries. The majority of surgeries have policies and procedures for the management of infection control in dental practice, but in many instances these are not documented. The training of staff in infection control and its documentation is poorly managed and consideration should be given to development of quality management systems for use in dental practice.
Interim Basis for PCB Sampling and Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING, D.L.
2001-01-18
This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for "Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met." Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1A, Vol. IV, Section 4.16 (Banning 1999).
Interim Basis for PCB Sampling and Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING, D.L.
2001-03-20
This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for "Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met." Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1A, Vol. IV, Section 4.16 (Banning 1999).
The purpose of this presentation is to provide an overview of the quality control (QC) sections of a draft EPA document entitled, "Quality Assurance/Quality Control Guidance for Laboratories Performing PCR Analyses on Environmental Samples." This document has been prepared by th...
SGML and HTML: The Merging of Document Management and Electronic Document Publishing.
ERIC Educational Resources Information Center
Dixon, Ross
1996-01-01
Document control is an issue for organizations that use SGML/HTML. The prevalent approach is to apply the same techniques to document elements that are applied to full documents, a practice that has led to an overlap of electronic publishing and document management. Lists requirements for the management of SGML/HTML documents. (PEN)
18 CFR 5.25 - Applications requiring a draft NEPA document.
Code of Federal Regulations, 2012 CFR
2012-04-01
... a draft NEPA document. 5.25 Section 5.25 Conservation of Power and Water Resources FEDERAL ENERGY... APPLICATION PROCESS § 5.25 Applications requiring a draft NEPA document. (a) If the Commission determines that a license application will be processed with an environmental impact statement, or a draft and final...
18 CFR 5.25 - Applications requiring a draft NEPA document.
Code of Federal Regulations, 2013 CFR
2013-04-01
... a draft NEPA document. 5.25 Section 5.25 Conservation of Power and Water Resources FEDERAL ENERGY... APPLICATION PROCESS § 5.25 Applications requiring a draft NEPA document. (a) If the Commission determines that a license application will be processed with an environmental impact statement, or a draft and final...
18 CFR 5.25 - Applications requiring a draft NEPA document.
Code of Federal Regulations, 2014 CFR
2014-04-01
... a draft NEPA document. 5.25 Section 5.25 Conservation of Power and Water Resources FEDERAL ENERGY... APPLICATION PROCESS § 5.25 Applications requiring a draft NEPA document. (a) If the Commission determines that a license application will be processed with an environmental impact statement, or a draft and final...
Cognitive Process as a Basis for Intelligent Retrieval Systems Design.
ERIC Educational Resources Information Center
Chen, Hsinchun; Dhar, Vasant
1991-01-01
Two studies of the cognitive processes involved in online document-based information retrieval were conducted. These studies led to the development of five computational models of online document retrieval which were incorporated into the design of an "intelligent" document-based retrieval system. Both the system and the broader implications of…