Müller-Staub, Maria; de Graaf-Waar, Helen; Paans, Wolter
2016-11-01
Nurses are accountable for applying the nursing process, which is key for patient care: it is a problem-solving process providing the structure for care plans and documentation. The state-of-the-art nursing process is based on classifications that contain standardized concepts, and it is therefore named the Advanced Nursing Process. It contains valid assessments, nursing diagnoses, interventions, and nursing-sensitive patient outcomes. Electronic decision support systems can assist nurses in applying the Advanced Nursing Process. However, nursing decision support systems are missing, and no "gold standard" is available. The study aim is to develop a valid Nursing Process-Clinical Decision Support System Standard to guide future developments of clinical decision support systems. In a multistep approach, a Nursing Process-Clinical Decision Support System Standard with 28 criteria was developed. After pilot testing (N = 29 nurses), the criteria were reduced to 25. The Nursing Process-Clinical Decision Support System Standard was then presented to eight internationally known experts in qualitative interviews conducted and analyzed according to Mayring. Fourteen categories demonstrate expert consensus on the Nursing Process-Clinical Decision Support System Standard and its content validity. All experts agreed the Advanced Nursing Process should be the centerpiece for the Nursing Process-Clinical Decision Support System and should suggest research-based, predefined nursing diagnoses and correct linkages between diagnoses, evidence-based interventions, and patient outcomes.
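The "correct linkages" the experts call for can be pictured as a lookup from a standardized diagnosis to its predefined interventions and outcomes. The sketch below is illustrative only: the diagnosis, intervention, and outcome labels are invented placeholders, not the validated content of the standard described above.

```python
# Minimal sketch of the diagnosis-intervention-outcome linkage an Advanced
# Nursing Process decision support system might encode. All labels below are
# illustrative placeholders, not validated nursing terminology content.

LINKAGES = {
    "impaired skin integrity": {
        "interventions": ["pressure ulcer prevention", "wound care"],
        "outcomes": ["tissue integrity: skin and mucous membranes"],
    },
    "acute pain": {
        "interventions": ["pain management", "analgesic administration"],
        "outcomes": ["pain level", "pain control"],
    },
}

def suggest(diagnosis: str) -> dict:
    """Return the interventions and outcomes linked to a predefined diagnosis."""
    try:
        return LINKAGES[diagnosis.lower()]
    except KeyError:
        raise ValueError(f"No predefined linkage for diagnosis: {diagnosis!r}")

if __name__ == "__main__":
    print(suggest("Acute pain"))
```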
40 CFR 63.444 - Standards for the pulping system at sulfite processes.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Standards for the pulping system at sulfite processes. (a) The owner or operator of each sulfite process... 40 Protection of Environment 9 2010-07-01 2010-07-01 false Standards for the pulping system at sulfite processes. 63.444 Section 63.444 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY...
Dong, Ling; Sun, Yu; Pei, Wen-Xuan; Dai, Jun-Dong; Wang, Zi-Yu; Pan, Meng; Chen, Jiang-Peng; Wang, Yun
2017-12-01
The concept of "Quality by design" indicates that good design for the whole life cycle of pharmaceutical production enables the drug to meet the expected quality requirements. Aiming at the existing problems of the traditional Chinese medicine (TCM) industry, the TCM standardization system was put forward in this paper from the national strategic level, under the guidance by the idea of quality control in international manufacturing industry and with considerations of TCM industry's own characteristics and development status. The connotation of this strategy was to establish five interrelated systems: multi-indicators system based on tri-indicators system, quality standard and specification system of TCM herbal materials and decoction pieces, quality traceability system, data monitoring system based on whole-process quality control, and whole-process quality management system of TCM, and achieve the whole process systematic and scientific study in TCM industry through "top-level design-implement in steps-system integration" workflow. This article analyzed the correlation between the quality standards of all links, established standard operating procedures of each link and whole process, and constructed a high standard overall quality management system for TCM industry chains, in order to provide a demonstration for the establishment of TCM whole-process quality control system and provide systematic reference and basis for standardization strategy in TCM industry. Copyright© by the Chinese Pharmaceutical Association.
The standards process: X3 information processing systems
NASA Technical Reports Server (NTRS)
Emard, Jean-Paul
1993-01-01
The topics are presented in viewgraph form and include the following: International Organization for Standards (ISO); International Electrotechnical Committee (IEC); ISO/IEC Joint Technical Committee 1 (JTC-1); U.S. interface to JTC-1; ANSI; national organizations; U.S. standards development processes; national and international standards developing organizations; regional organizations; and X3 information processing systems.
46 CFR 154.500 - Cargo and process piping standards.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Equipment Cargo and Process Piping Systems § 154.500 Cargo and process piping standards. The cargo liquid and vapor piping and process piping systems must meet the requirements in §§ 154.503 through 154.562... 46 Shipping 5 2010-10-01 2010-10-01 false Cargo and process piping standards. 154.500 Section 154...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Standards for coal processing and conveying equipment, coal storage systems, transfer and loading systems, and open storage piles. 60.254... (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Coal Preparation...
NASA's Earth Science Data Systems Standards Process Experiences
NASA Technical Reports Server (NTRS)
Ullman, Richard E.; Enloe, Yonsook
2007-01-01
NASA has impaneled several internal working groups to provide recommendations to NASA management on ways to evolve and improve Earth Science Data Systems. One of these working groups is the Standards Process Group (SPG). The SPG is drawn from NASA-funded Earth Science Data Systems stakeholders, and it directs a process of community review and evaluation of proposed NASA standards. The working group's goal is to promote interoperability and interuse of NASA Earth Science data through broader use of standards that have proven implementation and operational benefit to NASA Earth science, by facilitating NASA management endorsement of proposed standards. The SPG now has two years of experience with this approach to the identification of standards. We will discuss real examples of the different types of candidate standards that have been proposed to NASA's Standards Process Group, such as OPeNDAP's Data Access Protocol, the Hierarchical Data Format, and the Open Geospatial Consortium's Web Map Server. Each of the three types of proposals requires a different sort of criteria for understanding the broad concepts of "proven implementation" and "operational benefit" in the context of NASA Earth Science data systems. We will discuss how our Standards Process has evolved with our experiences with the three candidate standards.
Software Safety Risk in Legacy Safety-Critical Computer Systems
NASA Technical Reports Server (NTRS)
Hill, Janice L.; Baggs, Rhoda
2007-01-01
Safety standards contain technical and process-oriented safety requirements. Technical requirements cover "must work" and "must not work" functions in the system; process-oriented requirements cover software engineering and safety management processes. Some standards address the system perspective, and some cover just the software in the system. NASA-STD-8719.13B, Software Safety Standard, is the current standard of interest; NASA programs/projects will have their own set of safety requirements derived from the standard. Safety cases are (a) a documented demonstration that a system complies with the specified safety requirements, in which (b) evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]. (c) Problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.
Standard development at the Human Variome Project.
Smith, Timothy D; Vihinen, Mauno
2015-01-01
The Human Variome Project (HVP) is a world organization working towards facilitating the collection, curation, interpretation and free and open sharing of genetic variation information. A key component of HVP activities is the development of standards and guidelines. HVP Standards are systems, procedures and technologies that the HVP Consortium has determined must be used by HVP-affiliated data sharing infrastructure and should be used by the broader community. HVP guidelines are considered to be beneficial for HVP affiliated data sharing infrastructure and the broader community to adopt. The HVP also maintains a process for assessing systems, processes and tools that implement HVP Standards and Guidelines. Recommended System Status is an accreditation process designed to encourage the adoption of HVP Standards and Guidelines. Here, we describe the HVP standards development process and discuss the accepted standards, guidelines and recommended systems as well as those under acceptance. Certain HVP Standards and Guidelines are already widely adopted by the community and there are committed users for the others. © The Author(s) 2015. Published by Oxford University Press.
76 FR 16277 - System Restoration Reliability Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-23
... system restoration process. The Commission also approves the NERC's proposal to retire four existing EOP... prepare personnel to enable effective coordination of the system restoration process. The Commission also..., through the Reliability Standard development process, a modification to EOP-005-1 that identifies time...
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the third of five volumes on Information System Life-Cycle and Documentation Standards which present a well organized, easily used standard for providing technical information needed for developing information systems, components, and related processes. This volume states the Software Management and Assurance Program documentation standard for a product specification document and for data item descriptions. The framework can be applied to any NASA information system, software, hardware, operational procedures components, and related processes.
Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung
2014-08-01
Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity is dependent on the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats for achieving data interoperability. We propose an adaptive AdapteR Interoperability ENgine mediation system called ARIEN, which arbitrates between HISs compliant with different healthcare standards for accurate and seamless information exchange to achieve data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of mappings. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process among different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in conversion between the CDA and vMR standards using a pattern-oriented approach from the MBO. The proposed mediation system improves the overall communication process between HISs. It provides an accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
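The mediation idea can be reduced to a bridge mapping that translates field names between two standards' representations. In the sketch below a plain dict stands in for the Mediation Bridge Ontology; the CDA/vMR element names are invented for illustration and are not the actual mappings stored in the MBO.

```python
# Illustrative sketch of standards mediation: a bridge mapping (standing in
# for the MBO) renames fields from a CDA-style record into vMR-style names.
# The element names below are hypothetical, not real CDA/vMR paths.

CDA_TO_VMR = {
    "patientRole.id": "evaluatedPerson.id",
    "observation.code": "observationResult.observationFocus",
    "observation.value": "observationResult.observationValue",
}

def transform(cda_record: dict, mapping: dict = CDA_TO_VMR) -> dict:
    """Rename fields of a CDA-style record into vMR-style names.

    Unmapped fields are dropped here; a production mediator would instead
    flag them for review rather than silently discard data.
    """
    return {mapping[k]: v for k, v in cda_record.items() if k in mapping}

record = {"patientRole.id": "12345", "observation.code": "8480-6",
          "observation.value": "120 mmHg"}
print(transform(record))
```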
76 FR 16263 - Revision to Electric Reliability Organization Definition of Bulk Electric System
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-23
...'s Reliability Standards Development Process, to revise its definition of the term ``bulk electric... definition of ``bulk electric system'' through the NERC Standards Development Process to address the... undertake the process of revising the bulk electric system definition to address the Commission's concerns...
Information system life-cycle and documentation standards, volume 1
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
The Software Management and Assurance Program (SMAP) Information System Life-Cycle and Documentation Standards Document describes the Version 4 standard information system life-cycle in terms of processes, products, and reviews. The description of the products includes detailed documentation standards. The standards in this document set can be applied to the life-cycle, i.e., to each phase in the system's development, and to the documentation of all NASA information systems. This provides consistency across the agency as well as visibility into the completeness of the information recorded. An information system is software-intensive, but consists of any combination of software, hardware, and operational procedures required to process, store, or transmit data. This document defines a standard life-cycle model and content for associated documentation.
[Status and suggestions for adjuvant standard for Chinese materia medica processing in China].
Yang, Chun-Yu; Cao, Hui; Wang, Xiao-Tao; Tu, Jia-Sheng; Qian, Zhong-Zhi; Yu, Zhi-Ling; Shang, Yue; Zhang, Bao-Xian
2017-04-01
In this paper, the status of adjuvant standards for Chinese materia medica processing in the Chinese Pharmacopoeia 2015 edition, the National Specification of Chinese Materia Medica Processing, and the 29 provincial specifications of Chinese materia medica was summarized, and the status of general requirements, specific requirements, and quality standards across the three grades of official specifications was collected and analyzed according to the "medicine-adjuvant homology" and "food-adjuvant homology" features of adjuvants. This paper also introduced the state of research on adjuvant standards for Chinese materia medica processing in China. In addition, it analyzed and discussed problems existing in the standard system of adjuvants for Chinese materia medica processing, such as the lack of general requirements, the low level of standards, inconsistent standard references, and the lack of research on the standards, and provided suggestions for the further establishment of a national standards system of adjuvants for Chinese materia medica processing. Copyright© by the Chinese Pharmaceutical Association.
Space Generic Open Avionics Architecture (SGOAA) standard specification
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1994-01-01
This standard establishes the Space Generic Open Avionics Architecture (SGOAA). The SGOAA includes a generic functional model, a processing structural model, and an architecture interface model. This standard defines the requirements for applying these models to the development of spacecraft core avionics systems. The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture models to the design of a specific avionics hardware/software processing system. This standard defines a generic set of system interface points to facilitate identification of critical services and interfaces. It establishes the requirement for applying appropriate low-level detailed implementation standards to those interface points. The generic core avionics functions and processing structural models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
Project evaluation process manual
DOT National Transportation Integrated Search
1997-07-01
Describes the process for evaluating airport environments, safety standards, airport infrastructure, licensing standards and multitransportational systems. The project rating system is intended to be used for determining state and federal funding.
NASA Astrophysics Data System (ADS)
Kuehl, C. Stephen
1996-06-01
Video signal system performance can be compromised in a military aircraft cockpit management system (CMS) by the tailoring of vintage Electronics Industries Association (EIA) RS170 and RS343A video interface standards. Video analog interfaces degrade when induced system noise is present. Further signal degradation has traditionally been associated with signal data conversions between avionics sensor outputs and the cockpit display system. If the CMS engineering process is not carefully applied during the avionics video and computing architecture development, extensive and costly redesign will occur when visual sensor technology upgrades are incorporated. Close monitoring of and technical involvement in video standards groups provides the knowledge base necessary for avionics systems engineering organizations to architect adaptable and extendible cockpit management systems. With the Federal Communications Commission (FCC) in the process of adopting the Digital HDTV Grand Alliance System standard proposed by the Advanced Television Systems Committee (ATSC), the entertainment and telecommunications industries are adopting and supporting the emergence of new serial/parallel digital video interfaces and data compression standards that will drastically alter present NTSC-M video processing architectures. The re-engineering of the U.S. broadcasting system must initially preserve the electronic equipment wiring networks within broadcast facilities to make the transition to HDTV affordable. International committee activities in technical forums like ITU-R (formerly CCIR), ANSI/SMPTE, IEEE, and ISO/IEC are establishing global consensus on video signal parameterizations that support a smooth transition from existing analog-based broadcasting facilities to fully digital computerized systems. An opportunity exists for implementing these new video interface standards over existing video coax/triax cabling in military aircraft cockpit management systems. Reductions in signal conversion processing steps, major improvement in video noise reduction, and an added capability to pass audio/embedded digital data within the digital video signal stream are the significant performance increases associated with the incorporation of digital video interface standards. By analyzing the historical progression of military CMS developments, establishing a systems engineering process for CMS design, tracing the commercial evolution of video signal standardization, adopting commercial video signal terminology/definitions, and comparing/contrasting CMS architecture modifications using digital video interfaces, this paper provides a technical explanation of how a systems engineering process approach to video interface standardization can result in extendible and affordable cockpit management systems.
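As a concrete point of reference for the serial digital interfaces discussed above, the arithmetic below reproduces the published ITU-R BT.601 / SMPTE 259M figures showing how 4:2:2 component sampling yields the 270 Mb/s SD-SDI serial rate, with blanking intervals left over for embedded audio and ancillary data. This is a worked illustration, not material from the paper itself.

```python
# Worked arithmetic for one standardized serial digital video interface:
# ITU-R BT.601 4:2:2 component sampling carried over SMPTE 259M SD-SDI.
# All figures are the published sampling parameters, not measurements.

luma_rate = 13.5e6           # BT.601 luminance sample rate, Hz
chroma_rate = 6.75e6         # each of the two color-difference channels, Hz
bits_per_sample = 10

total_sample_rate = luma_rate + 2 * chroma_rate        # 27 Msamples/s
serial_bit_rate = total_sample_rate * bits_per_sample  # 270 Mb/s

print(f"Multiplexed sample rate: {total_sample_rate / 1e6:.1f} Msamples/s")
print(f"SD-SDI serial bit rate:  {serial_bit_rate / 1e6:.0f} Mb/s")
# The blanking intervals within that stream are where ancillary packets
# (e.g., embedded AES audio) ride along with the video samples.
```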
IEC 61511 and the capital project process--a protective management system approach.
Summers, Angela E
2006-03-17
This year, the process industry has reached an important milestone in process safety: the acceptance of an internationally recognized standard for safety instrumented systems (SIS). This standard, IEC 61511, documents good engineering practice for the assessment, design, operation, maintenance, and management of SISs. The foundation of the standard is established by several requirements in Part 1, Clauses 5-7, which cover the development of a management system aimed at ensuring that functional safety is achieved. The management system includes a quality assurance process for the entire SIS lifecycle, requiring the development of procedures, identification of resources, and acquisition of tools. For maximum benefit, the deliverables and quality control checks required by the standard should be integrated into the capital project process, addressing safety, environmental, plant productivity, and asset protection. Industry has become inundated with a multitude of programs focusing on safety, quality, and cost performance. This paper introduces a protective management system, which builds upon the work process identified in IEC 61511. Typical capital project phases are integrated with the management system to yield one comprehensive program to efficiently manage process risk. Finally, the paper highlights areas where internal practices or guidelines should be developed to improve program performance and cost effectiveness.
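One quantitative deliverable in an SIS lifecycle of this kind is verifying that each safety instrumented function meets its target safety integrity level (SIL). The sketch below uses the simplified single-channel (1oo1) approximation PFDavg = lambda_DU * TI / 2 and the standard low-demand SIL bands; the failure rate and proof-test interval are assumed values for illustration, not figures from the paper.

```python
# Minimal sketch of SIL verification for a low-demand safety instrumented
# function, using the simplified 1oo1 approximation. Input values are assumed.

def pfd_avg_1oo1(lambda_du_per_hr: float, test_interval_hr: float) -> float:
    """Average probability of failure on demand for a single-channel function."""
    return lambda_du_per_hr * test_interval_hr / 2

def sil_band(pfd: float) -> str:
    """Map PFDavg to a low-demand SIL band per IEC 61508/61511."""
    for sil, low in ((4, 1e-5), (3, 1e-4), (2, 1e-3), (1, 1e-2)):
        if low <= pfd < low * 10:
            return f"SIL {sil}"
    return "outside SIL 1-4 bands"

lambda_du = 2e-6   # dangerous undetected failure rate, per hour (assumed)
ti = 8760          # proof-test interval: one year, in hours (assumed)
pfd = pfd_avg_1oo1(lambda_du, ti)
print(f"PFDavg = {pfd:.2e} -> {sil_band(pfd)}")   # 8.76e-03 -> SIL 2
```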
NASA Astrophysics Data System (ADS)
A'diat, Arkan Addien Al; Liquiddanu, Eko; Laksono, Pringgo Widyo; Sutopo, Wahyudi; Suletra, I. Wayan
2018-02-01
The increasing number of modern retail businesses in Indonesia gives small and medium enterprises (SMEs) an opportunity to sell their products through modern retailers. There are some obstacles faced by SMEs, one of which concerns product standards. The product standards that SMEs must hold are the GMP standard and the halal standard. This research was conducted to examine how a beef floss enterprise in Jagalan fulfills the GMP and halal standards. In addition, a Hazard Analysis and Critical Control Points (HACCP) system was applied to analyze the process. The HACCP approach used in this research was based on the seven principles in SNI (Indonesian National Standard) 01-4852-1998. The seven principles include hazard analysis, critical control point (CCP) determination, critical limit establishment, CCP monitoring system establishment, corrective action establishment, verification, and documentation establishment, all of which must be applied in preparing a HACCP plan. Based on this case study, it is concluded that there are 5 CCPs: the boiling process, the roasting process, the frying process, the beef floss draining process, and the packaging process.
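Principles three through five of a HACCP plan reduce to comparing monitored values against critical limits at each CCP and triggering a corrective action on any deviation. The sketch below illustrates that loop for three of the CCPs named above; the measures and limit values are invented for illustration and would in practice come from the hazard analysis under SNI 01-4852-1998.

```python
# Sketch of the CCP monitoring / corrective-action loop in a HACCP plan.
# The measures and critical limits below are illustrative assumptions.

CCPS = {
    "boiling":   {"measure": "core temperature C", "min": 90.0},
    "frying":    {"measure": "oil temperature C",  "min": 160.0, "max": 190.0},
    "packaging": {"measure": "water activity",     "max": 0.60},
}

def monitor(ccp: str, value: float) -> str:
    """Compare a monitored value against the CCP's critical limits."""
    limits = CCPS[ccp]
    low = limits.get("min", float("-inf"))
    high = limits.get("max", float("inf"))
    if not low <= value <= high:
        return f"{ccp}: {value} out of limits -> corrective action, document deviation"
    return f"{ccp}: {value} within critical limits"

print(monitor("frying", 150.0))     # deviation -> corrective action
print(monitor("packaging", 0.55))   # within limits
```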
Architecture for Survivable System Processing (ASSP)
NASA Astrophysics Data System (ADS)
Wood, Richard J.
1991-11-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground-based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture, and the program is applying new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, this program provides for regular interactions with standardization working groups, e.g., the International Organization for Standardization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
Solar industrial process heat systems: An assessment of standards for materials and components
NASA Astrophysics Data System (ADS)
Rossiter, W. J.; Shipp, W. E.
1981-09-01
A study was conducted to obtain information on the performance of materials and components in operational solar industrial process heat (IPH) systems, and to provide recommendations for the development of standards, including evaluative test procedures, for materials and components. An assessment was made of the need for standards for evaluating the long-term performance of materials and components of IPH systems. The assessment was based on the availability of existing standards and on information obtained from a field survey of operational systems, the literature, and discussions with individuals in the industry. Field inspections of 10 operational IPH systems were performed.
Government Open Systems Interconnection Profile (GOSIP) Transition Strategy
1993-09-01
It explores some of the GOSIP protocol details and discusses the process by which standards organizations have developed their recommendations, covering GOSIP versions 1 and 2. Chapter II describes the process whereby standards are developed and adopted by the ISO.
Karch, Debra L; Chen, Mi; Tang, Tian
2014-01-01
In 2009, the Centers for Disease Control and Prevention completed migration of all 59 surveillance project areas (PAs) from the case-based HIV/AIDS Reporting System to the document-based Enhanced HIV/AIDS Reporting System. We conducted a PA-level assessment of Enhanced HIV/AIDS Reporting System process and outcome standards for HIV infection cases. Process standards were reported by PAs and outcome standards were calculated using standardized Centers for Disease Control and Prevention SAS code. A total of 59 PAs including 50 US states, the District of Columbia, 6 separately funded cities (Chicago, Houston, Los Angeles County, New York City, Philadelphia, and San Francisco), and 2 territories (Puerto Rico and the Virgin Islands). Cases diagnosed or reported to the PA surveillance system between January 1, 2011, and December 31, 2011, using data collected through December 2012. Process standards for death ascertainment and intra- and interstate case de-duplication; outcome standards for completeness and timeliness of case reporting, data quality, intrastate duplication rate, risk factor ascertainment, and completeness of initial CD4 and viral load reporting. Fifty-five of 59 PAs (93%) reported linking cases to state vital records death certificates during 2012, 76% to the Social Security Death Master File, and 59% to the National Death Index. Seventy percent completed monthly intrastate, and 63% completed semiannual interstate de-duplication. Eighty-three percent met the 85% or more case ascertainment standard, and 92% met the 66% or more timeliness standard; 75% met the 97% or more data quality standard; all PAs met the 5% or less intrastate duplication rate; 41% met the 85% or more risk factor ascertainment standard; 90% met the 50% or more standard for initial CD4; and 93% met the same standard for viral load reporting. Overall, 7% of PAs met all 11 process and outcome standards. Findings support the need for continued improvement in HIV surveillance activities and monitoring of system outcomes.
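Each outcome standard above is a simple threshold test on a percentage computed for a project area. The sketch below encodes the thresholds quoted in the abstract and checks an invented example PA against them; the PA values are illustrative, not real surveillance data.

```python
# Sketch of checking a project area against the outcome standards quoted in
# the abstract. Thresholds come from the abstract; the PA values are made up.

STANDARDS = {
    "case_ascertainment_pct":   ("min", 85.0),
    "timeliness_pct":           ("min", 66.0),
    "data_quality_pct":         ("min", 97.0),
    "intrastate_duplicate_pct": ("max", 5.0),
    "risk_factor_pct":          ("min", 85.0),
    "initial_cd4_pct":          ("min", 50.0),
    "initial_viral_load_pct":   ("min", 50.0),
}

def meets(standard: str, value: float) -> bool:
    """True if the value satisfies the standard's minimum or maximum threshold."""
    direction, threshold = STANDARDS[standard]
    return value >= threshold if direction == "min" else value <= threshold

pa = {"case_ascertainment_pct": 88.2, "timeliness_pct": 71.0,
      "data_quality_pct": 96.5, "intrastate_duplicate_pct": 1.2,
      "risk_factor_pct": 80.0, "initial_cd4_pct": 62.0,
      "initial_viral_load_pct": 64.0}

results = {name: meets(name, value) for name, value in pa.items()}
print(results)
print("meets all standards:", all(results.values()))
```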
Advancements in internationally accepted standards for radiation processing
NASA Astrophysics Data System (ADS)
Farrar, Harry; Derr, Donald D.; Vehar, David W.
1993-10-01
Three subcommittees of the American Society for Testing and Materials (ASTM) are developing standards on various aspects of radiation processing. Subcommittee E10.01 "Dosimetry for Radiation Processing" has published 9 standards on how to select and calibrate dosimeters, where to put them, how many to use, and how to use individual types of dosimeter systems. The group is also developing standards on how to use gamma, electron beam, and x-ray facilities for radiation processing, and a standard on how to treat dose uncertainties. Efforts are underway to promote inclusion of these standards into procedures now being developed by government agencies and by international groups such as the United Nations' International Consultative Group on Food Irradiation (ICGFI) in order to harmonize regulations and help avoid trade barriers. Subcommittee F10.10 "Food Processing and Packaging" has completed standards on good irradiation practices for meat and poultry and for fresh fruits, and is developing similar standards for the irradiation of seafood and spices. These food-related standards are based on practices previously published by ICGFI. Subcommittee E10.07 on "Radiation Dosimetry for Radiation Effects on Materials and Devices" principally develops standards for determining doses for radiation hardness testing of electronics. Some, including their standards on the Fricke and TLD dosimetry systems are equally useful in other radiation processing applications.
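The "treatment of dose uncertainties" standard mentioned above follows the ISO approach of combining Type A (statistical) and Type B (other-means) standard uncertainties in quadrature and then applying a coverage factor. The component values in the sketch below are illustrative, not values from any standard.

```python
# Worked example of the ISO Type A / Type B uncertainty treatment: combine
# standard uncertainty components in quadrature, then expand with a coverage
# factor. Component values are illustrative.
import math

def combined_uncertainty(type_a: list, type_b: list) -> float:
    """Root-sum-square combination of standard uncertainty components."""
    return math.sqrt(sum(u ** 2 for u in type_a + type_b))

u_a = [0.8]        # %; repeatability of replicate dosimeter readings (Type A)
u_b = [1.5, 0.6]   # %; calibration and calibration-fit components (Type B)

u_c = combined_uncertainty(u_a, u_b)
k = 2              # coverage factor for roughly 95% confidence
print(f"combined standard uncertainty: {u_c:.2f}%")   # 1.80%
print(f"expanded uncertainty (k={k}):  {k * u_c:.2f}%")  # 3.61%
```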
Integrated flexible manufacturing program for manufacturing automation and rapid prototyping
NASA Technical Reports Server (NTRS)
Brooks, S. L.; Brown, C. W.; King, M. S.; Simons, W. R.; Zimmerman, J. J.
1993-01-01
The Kansas City Division of Allied Signal Inc., as part of the Integrated Flexible Manufacturing Program (IFMP), is developing an integrated manufacturing environment. Several systems are being developed to produce standards and automation tools for specific activities within the manufacturing environment. The Advanced Manufacturing Development System (AMDS) is concentrating on information standards (STEP) and product data transfer; the Expert Cut Planner system (XCUT) is concentrating on machining operation process planning standards and automation capabilities; the Advanced Numerical Control system (ANC) is concentrating on NC data preparation standards and NC data generation tools; the Inspection Planning and Programming Expert system (IPPEX) is concentrating on inspection process planning, coordinate measuring machine (CMM) inspection standards and CMM part program generation tools; and the Intelligent Scheduling and Planning System (ISAPS) is concentrating on planning and scheduling tools for a flexible manufacturing system environment. All of these projects are working together to address information exchange, standardization, and information sharing to support rapid prototyping in a Flexible Manufacturing System (FMS) environment.
NASA Astrophysics Data System (ADS)
Barber, Jeffrey; Greca, Joseph; Yam, Kevin; Weatherall, James C.; Smith, Peter R.; Smith, Barry T.
2017-05-01
In 2016, the millimeter wave (MMW) imaging community initiated the formation of a standard for millimeter wave image quality metrics. This new standard, American National Standards Institute (ANSI) N42.59, will apply to active MMW systems for security screening of humans. The Electromagnetic Signatures of Explosives Laboratory at the Transportation Security Laboratory is supporting the ANSI standards process via the creation of initial prototypes for round-robin testing with MMW imaging system manufacturers and experts. Results obtained for these prototypes will be used to inform the community and lead to consensus objective standards amongst stakeholders. Images collected with laboratory systems are presented along with results of preliminary image analysis. Future directions for object design, data collection and image processing are discussed.
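A round-robin comparison of this kind ultimately reduces images of a common test object to numeric quality scores. The sketch below computes a generic contrast-to-noise ratio on a synthetic image as an illustration of such a metric; it is not the metric adopted by ANSI N42.59.

```python
# Illustrative image quality computation: contrast-to-noise ratio (CNR)
# between an object region and background, on a synthetic image. This is a
# generic metric for illustration, not the ANSI N42.59 metric.
import numpy as np

rng = np.random.default_rng(0)
image = rng.normal(10.0, 1.0, (64, 64))   # synthetic noisy background
image[24:40, 24:40] += 5.0                # synthetic bright test object

obj = image[24:40, 24:40]                 # object region
bkg = image[:16, :16]                     # background region
cnr = (obj.mean() - bkg.mean()) / bkg.std()
print(f"CNR = {cnr:.1f}")
```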
Sauer, Vernon B.
2002-01-01
Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U.S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system easily can monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods are interactive processes between the user and the electronic processing system, with much of this processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, control structures, and unsteady-flow models, such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stage-discharge, velocity-index, slope, rate-of-change in stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs. Documentation, review, finalization, and publication of records are provided for with the electronic processing system, as well as archiving, quality assurance, and quality control.
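At the heart of the stage-discharge computation the report standardizes is the rating curve, with a shift adjustment applied to the gage height before lookup. The sketch below shows that basic transformation using log-linear interpolation between rating points; the rating table and shift value are invented for illustration, not USGS data.

```python
# Sketch of converting a unit-value stage reading to discharge through a
# rating curve, with a shift applied first. Rating points are illustrative.
import math

RATING = [(1.0, 5.0), (2.0, 40.0), (4.0, 300.0), (8.0, 2400.0)]  # (stage ft, Q cfs)

def discharge(stage: float, shift: float = 0.0) -> float:
    """Interpolate discharge log-linearly between adjacent rating points."""
    h = stage + shift                       # shifted gage height
    if not RATING[0][0] <= h <= RATING[-1][0]:
        raise ValueError("stage outside the defined rating curve")
    for (h1, q1), (h2, q2) in zip(RATING, RATING[1:]):
        if h1 <= h <= h2:
            f = (math.log(h) - math.log(h1)) / (math.log(h2) - math.log(h1))
            return math.exp(math.log(q1) + f * (math.log(q2) - math.log(q1)))

print(f"{discharge(3.0, shift=-0.05):.1f} cfs")
```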
2015-12-04
This final rule will extend enhanced funding for Medicaid eligibility systems as part of a state's mechanized claims processing system, and will update conditions and standards for such systems, including adding to and updating current Medicaid Management Information Systems (MMIS) conditions and standards. These changes will allow states to improve customer service and support the dynamic nature of Medicaid eligibility, enrollment, and delivery systems.
Code of Federal Regulations, 2010 CFR
2010-07-01
... being used will be based on information available to the Administrator, which may include, but is not... techniques, or the control system and process monitoring equipment during a malfunction in a manner... the process and control system monitoring equipment, and shall include a standardized checklist to...
Code of Federal Regulations, 2011 CFR
2011-07-01
... being used will be based on information available to the Administrator, which may include, but is not... techniques, or the control system and process monitoring equipment during a malfunction in a manner... the process and control system monitoring equipment, and shall include a standardized checklist to...
Template for success: using a resident-designed sign-out template in the handover of patient care.
Clark, Clancy J; Sindell, Sarah L; Koehler, Richard P
2011-01-01
Report our implementation of a standardized handover process in a general surgery residency program. The standardized handover process, sign-out template, method of implementation, and continuous quality improvement process were designed by general surgery residents with support of faculty and senior hospital administration using standard work principles and business models of the Virginia Mason Production System and the Toyota Production System. Nonprofit, tertiary referral teaching hospital. General surgery residents, residency faculty, patient care providers, and hospital administration. After instruction in quality improvement initiatives, a team of general surgery residents designed a sign-out process using an electronic template and standard procedures. The initial implementation phase resulted in 73% compliance. Using resident-driven continuous quality improvement processes, real-time feedback enabled residents to modify and improve this process, eventually attaining 100% compliance and acceptance by residents. The creation of a standardized template and protocol for patient handovers might eliminate communication failures. Encouraging residents to participate in this process can establish the groundwork for successful implementation of a standardized handover process. Integrating a continuous quality-improvement process into such an initiative can promote active participation of busy general surgery residents and lead to successful implementation of standard procedures. Copyright © 2011 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
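A sign-out template of the kind the residents designed amounts to a fixed set of required fields plus a completeness check, which is essentially what the compliance percentages above were tracking. The field names in the sketch below are illustrative, not the actual template.

```python
# Sketch of a standardized handover (sign-out) entry: fixed fields plus a
# completeness check. Field names are illustrative assumptions.
from dataclasses import dataclass, fields

@dataclass
class SignOut:
    patient_id: str
    diagnosis: str
    hospital_course: str
    active_issues: str
    contingency_plans: str   # "if X happens, do Y"
    code_status: str

def complete(entry: SignOut) -> bool:
    """An entry is compliant only if every field is filled in."""
    return all(str(getattr(entry, f.name)).strip() for f in fields(entry))

entry = SignOut("MRN001", "acute appendicitis", "POD1 s/p lap appy",
                "pain control", "if fever > 38.5 C, draw cultures", "full code")
print(complete(entry))
```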
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 10 2010-07-01 2010-07-01 false Closed vent systems and control devices; or emissions routed to a fuel gas system or process standards. 63.1034 Section 63.1034 Protection... stringent. The 20 parts per million by volume standard is not applicable to the provisions of § 63.1016. (ii...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 10 2011-07-01 2011-07-01 false Closed vent systems and control devices; or emissions routed to a fuel gas system or process standards. 63.1034 Section 63.1034 Protection... stringent. The 20 parts per million by volume standard is not applicable to the provisions of § 63.1016. (ii...
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the second of five volumes of the Information System Life-Cycle and Documentation Standards. This volume provides a well-organized, easily used standard for management plans used in acquiring, assuring, and developing information systems and software, hardware, and operational procedures components, and related processes.
Perskvist, Nasrin; Norlin, Loreana; Dillner, Joakim
2015-04-01
This article addresses the important issue of the standardization of the biobank process. It reports on i) the implementation of standard operating procedures for the processing of liquid-based cervical cells, ii) the standardization of storage conditions, and iii) the ultimate establishment of nationwide standardized biorepositories for cervical specimens. Given the differences in the infrastructure and healthcare systems of various county councils in Sweden, these efforts were designed to develop standardized methods of biobanking across the nation. The standardization of cervical sample processing and biobanking is an important and widely acknowledged issue. Efforts to address these concerns will facilitate better patient care and improve research based on retrospective and prospective collections of patient samples and cohorts. The successful nationalization of the Cervical Cytology Biobank in Sweden is based on three vital issues: i) the flexibility of the system to adapt to other regional systems, ii) the development of the system based on national collaboration between the university and the county councils, and iii) stable governmental financing by the provider, the Biobanking and Molecular Resource Infrastructure of Sweden (BBMRI.se). We will share our experiences with biorepository communities to promote understanding of and advances in opportunities to establish a nationalized biobank which covers the healthcare of the entire nation.
A Strategy for Improved System Assurance
2007-06-20
Discusses quality and life-cycle standards relevant to system assurance, including ISO/IEC 12207 (Software Life Cycle Processes), ISO 9001 (Quality Management Systems), ISO/IEC 14598 (Software Product Evaluation), ISO/IEC 90003 (Guidelines for the Application of ISO 9001:2000 to Computer Software), IEEE 12207 (Industry Implementation of International Standard ISO/IEC 12207), and IEEE 1220 (Standard for Application and Management of the Systems Engineering Process).
Adopting Industry Standards for Control Systems Within Advanced Life Support
NASA Technical Reports Server (NTRS)
Young, James Scott; Boulanger, Richard
2002-01-01
This paper gives a description of OPC (Object Linking and Embedding for Process Control) standards for process control and outlines the experiences at JSC with using these standards to interface with I/O hardware from three independent vendors. The I/O hardware was integrated with a commercially available SCADA/HMI software package to make up the control and monitoring system for the Environmental Systems Test Stand (ESTS). OPC standards were utilized for communicating with I/O hardware and the software was used for implementing monitoring, PC-based distributed control, and redundant data storage over an Ethernet physical layer using an embedded din-rail mounted PC.
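The value of OPC in this setup is that monitoring code is written against one abstract read interface, with each vendor's server implementing it. The sketch below illustrates that pattern; the OpcServer protocol, the simulated server, and the ESTS tag names are hypothetical stand-ins, not a real OPC client library's API.

```python
# Sketch of the vendor-neutral pattern OPC enables: the polling loop codes
# against one abstract read interface, and each vendor's server implements it.
# The interface, server class, and tag names below are hypothetical.
from typing import Protocol

class OpcServer(Protocol):
    def read(self, item_id: str) -> float: ...

class SimulatedVendorServer:
    """Stand-in for one vendor's OPC server; returns canned values."""
    def read(self, item_id: str) -> float:
        return {"ESTS.CabinTemp": 21.4, "ESTS.CO2ppm": 412.0}[item_id]

def poll(server: OpcServer, items: list) -> dict:
    """Read a list of process tags; works with any OpcServer implementation."""
    return {item: server.read(item) for item in items}

print(poll(SimulatedVendorServer(), ["ESTS.CabinTemp", "ESTS.CO2ppm"]))
```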
Putting the Power of Configuration in the Hands of the Users
NASA Technical Reports Server (NTRS)
Al-Shihabi, Mary-Jo; Brown, Mark; Rigolini, Marianne
2011-01-01
The goal was to reduce the overall cost of human space flight while maintaining the most demanding standards for safety and mission success. In support of this goal, a project team was chartered to replace 18 legacy Space Shuttle nonconformance processes and systems with one fully integrated system. Problem Reporting and Corrective Action (PRACA) processes provide a closed-loop system for the identification, disposition, resolution, closure, and reporting of all Space Shuttle hardware/software problems. PRACA processes are integrated throughout the Space Shuttle organizational processes and are critical to assuring a safe and successful program. The primary project objectives were to develop a fully integrated system that provides an automated workflow with electronic signatures, to support multiple NASA programs and contracts with a single "system" architecture, and to define standard processes, implement best practices, and minimize process variations.
NASA Technical Reports Server (NTRS)
Wood, Richard J.
1992-01-01
The Architecture for Survivable Systems Processing (ASSP) program is a two phase program whose objective is the derivation, specification, development and validation of an open system architecture capable of supporting advanced processing needs of space, ground, and launch vehicle operations. The output of the first phase is a set of hardware and software standards and specifications defining this architecture at three levels. The second phase will validate these standards and develop the technology necessary to achieve strategic hardness, packaging density, throughput requirements, and interoperability/interchangeability.
Strategic, Organizational and Standardization Aspects of Integrated Information Systems. Volume 6.
1987-12-01
Discusses strategic, organizational, and standardization aspects of integrated information systems, including technical reasons for system choices (such as the desired level of processing power and the amount of storage space) and organizational reasons (such as each department obtaining its own systems). Notes that as the cost of processing power falls, Abbott can afford to subordinate efficient processing to organizational effectiveness, and outlines steps in an analytical process.
Maximizing your Process Improvement ROI through Harmonization
2008-03-01
Frameworks such as ISO 12207 provide comprehensive guidance on what system and software engineering processes are needed, and the frameworks of Six Sigma provide specific guidance. One company's veloci-Q enterprise integrated system includes ISO 9001, CMM, P-CMM, TL9000, British Standard 7799, and Six Sigma; they chose to blend process maturity models and ISO standards to support their process improvement objectives.
40 CFR 60.692-2 - Standards: Individual drain systems.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Emissions From Petroleum Refinery Wastewater Systems § 60.692-2 Standards: Individual drain systems. (a)(1... section. (e) Refinery wastewater routed through new process drains and a new first common downstream...
40 CFR 60.692-2 - Standards: Individual drain systems.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Emissions From Petroleum Refinery Wastewater Systems § 60.692-2 Standards: Individual drain systems. (a)(1... section. (e) Refinery wastewater routed through new process drains and a new first common downstream...
40 CFR 60.692-2 - Standards: Individual drain systems.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Emissions From Petroleum Refinery Wastewater Systems § 60.692-2 Standards: Individual drain systems. (a)(1... section. (e) Refinery wastewater routed through new process drains and a new first common downstream...
40 CFR 60.692-2 - Standards: Individual drain systems.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Emissions From Petroleum Refinery Wastewater Systems § 60.692-2 Standards: Individual drain systems. (a)(1... section. (e) Refinery wastewater routed through new process drains and a new first common downstream...
40 CFR 60.692-2 - Standards: Individual drain systems.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Emissions From Petroleum Refinery Wastewater Systems § 60.692-2 Standards: Individual drain systems. (a)(1... section. (e) Refinery wastewater routed through new process drains and a new first common downstream...
ASSIP Study of Real-Time Safety-Critical Embedded Software-Intensive System Engineering Practices
2008-02-01
Surveys practices spanning assessment, product engineering, and tooling processes, and the process standards relevant to real-time safety-critical embedded software-intensive system engineering, including IEC/ISO 12207 (software life cycle processes), ISO/IEC 15026 (system and software integrity levels), and SAE ARP 4754 (certification considerations). Notes process frameworks in revision: ISO 9001, ISO 9004, ISO 15288/ISO 12207 harmonization, RTCA DO-178B, and UK MOD Standard 00-56/3, along with associated methods and tools.
A portable real-time data processing system for standard meteorological radiosondes
NASA Technical Reports Server (NTRS)
Staffanson, F. L.
1983-01-01
The UMET-1 is a microprocessor-based portable system for automatic real-time processing of flight data transmitted from the standard RAWINSONDE upper atmosphere meteorological balloonsonde. The first 'target system' is described which was designed to receive data from a mobile tracking and telemetry receiving station (TRADAT), as the balloonsonde ascends to apogee. After balloon-burst, the UMET-1 produces user-ready hardcopy.
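A typical reduction such a processor performs on the telemetered pressure/temperature profile is computing layer thicknesses via the hypsometric equation. The sketch below shows that step with standard constants; the sounding values are illustrative, and this is not code from the UMET-1 itself.

```python
# Worked example of one standard radiosonde data reduction: geopotential
# layer thickness from the hypsometric equation. Sounding values are made up.
import math

R_D = 287.06    # gas constant for dry air, J/(kg K)
G0 = 9.80665    # standard gravity, m/s^2

def thickness(p_bottom_hpa: float, p_top_hpa: float, t_mean_k: float) -> float:
    """Geopotential thickness of a layer, in meters."""
    return (R_D * t_mean_k / G0) * math.log(p_bottom_hpa / p_top_hpa)

# 1000-500 hPa layer with a mean virtual temperature of 0 C:
print(f"{thickness(1000.0, 500.0, 273.15):.0f} m")   # about 5540 m
```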
40 CFR 60.562-1 - Standards: Process emissions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Compound (VOC) Emissions from the Polymer Manufacturing Industry § 60.562-1 Standards: Process emissions... vent stream from a control device shall have car-sealed opened all valves in the vent system from the emission source to the control device and car-sealed closed all valves in vent system that would lead the...
Twenty new ISO standards on dosimetry for radiation processing
NASA Astrophysics Data System (ADS)
Farrar, H., IV
2000-03-01
Twenty standards on essentially all aspects of dosimetry for radiation processing were published as new ISO standards in December 1998. The standards are based on 20 standard practices and guides developed over the past 14 years by Subcommittee E10.01 of the American Society for Testing and Materials (ASTM). The transformation to ISO standards using the 'fast track' process under ISO Technical Committee 85 (ISO/TC85) commenced in 1995 and resulted in some overlap of technical information between three of the new standards and the existing ISO Standard 11137 Sterilization of health care products — Requirements for validation and routine control — Radiation sterilization. Although the technical information in these four standards was consistent, compromise wording in the scopes of the three new ISO standards to establish precedence for use were adopted. Two of the new ISO standards are specifically for food irradiation applications, but the majority apply to all forms of gamma, X-ray, and electron beam radiation processing, including dosimetry for sterilization of health care products and the radiation processing of fruit, vegetables, meats, spices, processed foods, plastics, inks, medical wastes, and paper. Most of the standards provide exact procedures for using individual dosimetry systems or for characterizing various types of irradiation facilities, but one covers the selection and calibration of dosimetry systems, and another covers the treatment of uncertainties using the new ISO Type A and Type B evaluations. Unfortunately, nine of the 20 standards just adopted by the ISO are not the most recent versions of these standards and are therefore already out of date. To help solve this problem, efforts are being made to develop procedures to coordinate the ASTM and ISO development and revision processes for these and future ASTM-originating dosimetry standards. In the meantime, an additional four dosimetry standards have recently been published by the ASTM but have not yet been submitted to the ISO, and six more dosimetry standards are under development.
Making the connection: the VA-Regenstrief project.
Martin, D K
1992-01-01
The Regenstrief Automated Medical Record System is a well-established clinical information system with powerful facilities for querying and decision support. My colleagues and I introduced this system into the Indianapolis Veterans Affairs (VA) Medical Center by interfacing it to the institution's automated data-processing system, the Decentralized Hospital Computer Program (DHCP), using a recently standardized method for clinical data interchange. This article discusses some of the challenges encountered in that process, including the translation of vocabulary terms and maintenance of the software interface. Efforts such as these demonstrate the importance of standardization in medical informatics and the need for data standards at all levels of information exchange.
The Standardization of Time: A Sociohistorical Perspective.
ERIC Educational Resources Information Center
Zerubavel, Eviatar
1982-01-01
Explores the social process of establishing a standard time-reckoning framework. The paper examines the introduction of Greenwich Mean Time in Britain, the establishment of the American railway time-zone system, and the almost universal enforcement of the international standard time-zone system. (AM)
Government Open Systems Interconnection Profile (GOSIP) transition strategy
NASA Astrophysics Data System (ADS)
Laxen, Mark R.
1993-09-01
This thesis analyzes the Government Open Systems Interconnection Profile (GOSIP) and the requirements of the Federal Information Processing Standard (FIPS) Publication 146-1. It begins by examining the International Organization for Standardization (ISO) Open Systems Interconnection (OSI) architecture and protocol suites and the distinctions between GOSIP versions one and two. Additionally, it explores some of the GOSIP protocol details and discusses the process by which standards organizations have developed their recommendations. Implementation considerations from both government and vendor perspectives illustrate the barriers and requirements faced by information systems managers, as well as basic transition strategies. The thesis concludes by showing a transition strategy based on an extended and coordinated period of coexistence, necessitated by extensive legacy systems and GOSIP product unavailability. Recommendations for GOSIP protocol standards to include capabilities outside the OSI model are also presented.
2011-05-27
Discusses quality assurance in relation to life-cycle process frameworks, including CMMI-DEV and the IEEE/ISO/IEC 15288 and 12207 systems and software life cycle processes and artifacts. Notes that ISO/IEC/IEEE 15288/12207 and IEEE 730 SQA need to align; the P730 IEEE standards working group has expanded the scope of the SQA process standard to align with IS 12207.
The Role of Metadata Standards in EOSDIS Search and Retrieval Applications
NASA Technical Reports Server (NTRS)
Pfister, Robin
1999-01-01
Metadata standards play a critical role in data search and retrieval systems. Metadata tie software to data so the data can be processed, stored, searched, retrieved and distributed. Without metadata these actions are not possible. The process of populating metadata to describe science data is an important service to the end user community so that a user who is unfamiliar with the data, can easily find and learn about a particular dataset before an order decision is made. Once a good set of standards are in place, the accuracy with which data search can be performed depends on the degree to which metadata standards are adhered during product definition. NASA's Earth Observing System Data and Information System (EOSDIS) provides examples of how metadata standards are used in data search and retrieval.
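In practice, search and retrieval over standardized metadata reduces to matching query criteria against uniformly populated fields. The sketch below shows a toy catalog search; the entries and attribute names loosely echo EOSDIS-style fields but are illustrative, not actual EOSDIS metadata or its API.

```python
# Sketch of metadata-driven data search: a query can only match fields that
# every provider populated to the same standard. Catalog entries are examples.

CATALOG = [
    {"shortname": "MOD11A1", "platform": "Terra",
     "parameter": "surface temperature", "temporal": ("2000-02-24", "present")},
    {"shortname": "AVHRR_SST", "platform": "NOAA-14",
     "parameter": "sea surface temperature", "temporal": ("1995-01-01", "2001-12-31")},
]

def search(catalog: list, **criteria: str) -> list:
    """Return datasets whose metadata fields contain the given substrings."""
    return [d for d in catalog
            if all(c.lower() in str(d.get(k, "")).lower()
                   for k, c in criteria.items())]

for hit in search(CATALOG, parameter="surface temperature", platform="Terra"):
    print(hit["shortname"])   # MOD11A1
```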
Multi-mission space science data processing systems - Past, present, and future
NASA Technical Reports Server (NTRS)
Stallings, William H.
1990-01-01
Packetized telemetry that is consistent with the international Consultative Committee for Space Data Systems (CCSDS) has been baselined for future NASA missions such as Space Station Freedom. Some experiences from past and present multimission systems are examined, including current experiences in implementing a CCSDS standard packetized data processing system, relative to the effectiveness of the multimission approach in lowering life cycle cost and the complexity of meeting new mission needs. It is shown that the continued effort toward standardization of telemetry and processing support will permit the development of multimission systems needed to meet the increased requirements of future NASA missions.
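The CCSDS packet standard referenced above fixes a 6-byte primary header (version, type, secondary header flag, APID, sequence flags, sequence count, data length), which is what lets one multimission system route packets from many spacecraft. The parser below follows that published bit layout; the sample bytes are made up.

```python
# Sketch of parsing the 6-byte CCSDS space packet primary header defined by
# the Space Packet Protocol. The sample bytes are invented for illustration.
import struct

def parse_primary_header(header: bytes) -> dict:
    """Decode the CCSDS primary header fields from the first 6 bytes."""
    word1, word2, length = struct.unpack(">HHH", header[:6])
    return {
        "version":        (word1 >> 13) & 0x7,
        "type":           (word1 >> 12) & 0x1,   # 0 = telemetry, 1 = telecommand
        "sec_hdr_flag":   (word1 >> 11) & 0x1,
        "apid":            word1 & 0x7FF,
        "sequence_flags": (word2 >> 14) & 0x3,
        "sequence_count":  word2 & 0x3FFF,
        "data_length":     length + 1,           # field stores (octets - 1)
    }

sample = bytes([0x08, 0x64, 0xC0, 0x2A, 0x00, 0x0F])
print(parse_primary_header(sample))   # APID 100, sequence count 42, 16 octets
```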
A Proven Method for Meeting Export Control Objectives in Postal and Shipping Sectors
2015-02-01
Within months, the USPIS team developed and implemented an export screening standard operating procedure and implemented new and updated processes and systems to reduce the incidence of mail shipments violating export control laws, regulations, and standards. The effort supports the USPIS mission to support and protect the U.S. Postal Service and its employees, infrastructure, and customers, and to enforce the laws that defend the nation's mail system, and included evaluating current processes and systems to identify improvements.
40 CFR 63.443 - Standards for the pulping system at kraft, soda, and semi-chemical processes.
Code of Federal Regulations, 2014 CFR
2014-07-01
... operated at a minimum temperature of 871 °C (1600 °F) and a minimum residence time of 0.75 seconds; or (4... Paper Industry § 63.443 Standards for the pulping system at kraft, soda, and semi-chemical processes. (a...)(ii)(C) of this section. (A) Each knotter system with emissions of 0.05 kilograms or more of total HAP...
40 CFR 63.443 - Standards for the pulping system at kraft, soda, and semi-chemical processes.
Code of Federal Regulations, 2013 CFR
2013-07-01
... operated at a minimum temperature of 871 °C (1600 °F) and a minimum residence time of 0.75 seconds; or (4... Paper Industry § 63.443 Standards for the pulping system at kraft, soda, and semi-chemical processes. (a...)(ii)(C) of this section. (A) Each knotter system with emissions of 0.05 kilograms or more of total HAP...
40 CFR 63.443 - Standards for the pulping system at kraft, soda, and semi-chemical processes.
Code of Federal Regulations, 2012 CFR
2012-07-01
... operated at a minimum temperature of 871 °C (1600 °F) and a minimum residence time of 0.75 seconds; or (4... Paper Industry § 63.443 Standards for the pulping system at kraft, soda, and semi-chemical processes. (a...)(ii)(C) of this section. (A) Each knotter system with emissions of 0.05 kilograms or more of total HAP...
40 CFR 63.443 - Standards for the pulping system at kraft, soda, and semi-chemical processes.
Code of Federal Regulations, 2010 CFR
2010-07-01
... operated at a minimum temperature of 871 °C (1600 °F) and a minimum residence time of 0.75 seconds; or (4... Paper Industry § 63.443 Standards for the pulping system at kraft, soda, and semi-chemical processes. (a...)(ii)(C) of this section. (A) Each knotter system with emissions of 0.05 kilograms or more of total HAP...
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the fourth of five volumes on Information System Life-Cycle and Documentation Standards. This volume provides a well-organized, easily used standard for assurance documentation for information systems and software, hardware, and operational procedures components, and related processes. The specifications are developed in conjunction with the corresponding management plans specifying the assurance activities to be performed.
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the fifth of five volumes on Information System Life-Cycle and Documentation Standards. This volume provides a well-organized, easily used standard for management control and status reports used in monitoring and controlling the management, development, and assurance of information systems and software, hardware, and operational procedures components, and related processes.
Human Integration Design Processes (HIDP)
NASA Technical Reports Server (NTRS)
Boyer, Jennifer
2014-01-01
The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human-systems and human-rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASA-STD-3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements. Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference missions. The HIDP is a reference document intended to be used during the development of crewed space systems and operations to guide human-systems development process activities.
48 CFR 9904.414-50 - Techniques for application.
Code of Federal Regulations, 2010 CFR
2010-10-01
... the case of process cost accounting systems, the contracting parties may agree to substitute an.... 9904.414-50 Section 9904.414-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.414-50 Techniques for application. (a) The investment...
CMMI(Registered) for Development, Version 1.3
2010-11-01
ISO/IEC 15288:2008 Systems and Software Engineering – System Life Cycle Processes [ISO 2008b] ISO/IEC 27001:2005 Information technology – Security...IEC 2005 International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 27001 Information Technology...International Electrotechnical Commission (ISO/IEC) body of standards. CMMs focus on improving processes in an organization. They contain the
ERIC Educational Resources Information Center
Hack, David
This report on telephone networks and computer networks in a global context focuses on the processes and organizations through which the standards that make this possible are set. The first of five major sections presents descriptions of the standardization process, including discussions of the various kinds of standards, advantages and…
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 15 2010-07-01 2010-07-01 false Standards: Closed vent systems and control devices; or emissions routed to a fuel gas system or process. 65.115 Section 65.115 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONSOLIDATED FEDERAL AIR RULE Equipment Leaks § 65.115 Standards:...
Aircraft Alerting Systems Standardization Study. Phase IV. Accident Implications on Systems Design.
1982-06-01
computing and processing to assimilate and process status information using...provided with capabilities in computing and processing, sensing, interfacing, and controlling and displaying. o Computing and Processing - Algorithms...alerting system to perform a flight status monitor function would require additional sensing, computing and processing, interfacing, and controlling
40 CFR 63.1031 - Compressors standards.
Code of Federal Regulations, 2012 CFR
2012-07-01
... specified in the referencing subpart. (b) Seal system standard. Each compressor shall be equipped with a seal system that includes a barrier fluid system and that prevents leakage of process fluid to the.... Each compressor seal system shall meet the applicable requirements specified in paragraph (b)(1), (b)(2...
40 CFR 63.1031 - Compressors standards.
Code of Federal Regulations, 2014 CFR
2014-07-01
... specified in the referencing subpart. (b) Seal system standard. Each compressor shall be equipped with a seal system that includes a barrier fluid system and that prevents leakage of process fluid to the.... Each compressor seal system shall meet the applicable requirements specified in paragraph (b)(1), (b)(2...
An overview of the National Space Science data Center Standard Information Retrieval System (SIRS)
NASA Technical Reports Server (NTRS)
Shapiro, A.; Blecher, S.; Verson, E. E.; King, M. L. (Editor)
1974-01-01
A general overview is given of the National Space Science Data Center (NSSDC) Standard Information Retrieval System. A description is given, in general terms, of the information system that contains the data files and of the software system that processes and manipulates the files maintained at the Data Center. Emphasis is placed on providing users with an overview of the capabilities and uses of the NSSDC Standard Information Retrieval System (SIRS). Examples given are taken from the files at the Data Center. Detailed information about NSSDC data files is documented in a set of File Users Guides, with one user's guide prepared for each file processed by SIRS. Detailed information about SIRS is presented in the SIRS Users Guide.
40 CFR 63.1032 - Sampling connection systems standards.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) [Reserved] (3) Be designed and operated to capture and transport all the purged process fluid to a control... (CONTINUED) National Emission Standards for Equipment Leaks-Control Level 2 Standards § 63.1032 Sampling... design and operation. Each closed-purge, closed-loop, or closed vent system as required in paragraph (b...
40 CFR 63.1032 - Sampling connection systems standards.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) [Reserved] (3) Be designed and operated to capture and transport all the purged process fluid to a control... (CONTINUED) National Emission Standards for Equipment Leaks-Control Level 2 Standards § 63.1032 Sampling... design and operation. Each closed-purge, closed-loop, or closed vent system as required in paragraph (b...
Retinal Information Processing for Minimum Laser Lesion Detection and Cumulative Damage
1992-09-17
...possible beneficial visual function of the small retinal image movements. B. Visual System Models: prior models of visual system information processing have...against standard secondary sources whose calibrations can be traced to the National Bureau of Standards. B. Electrophysiological Techniques: Extracellular
Quality Space and Launch Requirements Addendum to AS9100C
2015-03-05
8.9.1 Statistical Process Control (SPC); 8.9.1.1 Out of Control...Systems Center; SME Subject Matter Expert; SOW Statement of Work; SPC Statistical Process Control; SPO System Program Office; SRP Standard Repair...individual data exceeding the control limits. Control limits are developed using standard statistical methods or other approved techniques and are based on
12 CFR 234.3 - Standards for payment systems.
Code of Federal Regulations, 2014 CFR
2014-01-01
... SYSTEM (CONTINUED) DESIGNATED FINANCIAL MARKET UTILITIES (REGULATION HH) § 234.3 Standards for payment systems. (a) A designated financial market utility that is designated on the basis of its role as the... arrangements for timely completion of daily processing. (8) The payment system provides a means of making...
12 CFR 234.3 - Standards for payment systems.
Code of Federal Regulations, 2013 CFR
2013-01-01
... SYSTEM (CONTINUED) DESIGNATED FINANCIAL MARKET UTILITIES (REGULATION HH) § 234.3 Standards for payment systems. (a) A designated financial market utility that is designated on the basis of its role as the... arrangements for timely completion of daily processing. (8) The payment system provides a means of making...
Future of Software Engineering Standards
NASA Technical Reports Server (NTRS)
Poon, Peter T.
1997-01-01
In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools, as well as consumer-oriented systems.
A standard-driven approach for electronic submission to pharmaceutical regulatory authorities.
Lin, Ching-Heng; Chou, Hsin-I; Yang, Ueng-Cheng
2018-03-01
Using standards is not only useful for data interchange during the process of a clinical trial, but also for analyzing data in a review process. Any step that speeds up approval of new drugs may benefit patients. As a result, adopting standards for regulatory submission has become mandatory in some countries. However, preparing standard-compliant documents, such as the annotated case report form (aCRF), needs a great deal of knowledge and experience, and the process is complex and labor-intensive. There is therefore a need to use information technology to facilitate this process. Instead of standardizing data after the completion of a clinical trial, this study proposed a standard-driven approach, achieved by implementing a computer-assisted "standard-driven pipeline" (SDP) in an existing clinical data management system. The SDP used CDISC standards to drive all processes of a clinical trial, such as the design, data acquisition, and tabulation. A completed phase I/II trial was used to prove the concept and to evaluate the effects of this approach. By using the CDISC-compliant question library, aCRFs were generated automatically when the eCRFs were completed. For comparison purposes, the data collection process was simulated and the collected data were transformed by the SDP. This new approach reduced the number of missing data fields from sixty-two to eight, and the number of controlled-term mismatch fields from eight to zero, during data tabulation. The standard-driven approach accelerated CRF annotation and assured data tabulation integrity. The benefits of this approach include an improvement in the use of standards during the clinical trial and a reduction in missing and unexpected data during tabulation. The standard-driven approach is an advanced design idea that can be used for future clinical information system development. Copyright © 2018 Elsevier Inc. All rights reserved.
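As a rough illustration of the kind of checking the SDP automates, the sketch below validates collected fields against a toy CDISC-style controlled-terminology codelist; the field names and codelist entries are hypothetical, not drawn from the study:

```python
# Hypothetical controlled terminology: a CDISC-style codelist for SEX
CONTROLLED_TERMS = {"SEX": {"M", "F", "U"}}
REQUIRED_FIELDS = ["SUBJID", "SEX", "VISITNUM"]

def check_record(record: dict) -> tuple[list, list]:
    """Return (missing required fields, fields with non-codelist values)."""
    missing = [f for f in REQUIRED_FIELDS if not record.get(f)]
    mismatched = [f for f, terms in CONTROLLED_TERMS.items()
                  if record.get(f) and record[f] not in terms]
    return missing, mismatched

print(check_record({"SUBJID": "001", "SEX": "Male"}))
# (['VISITNUM'], ['SEX'])  -- 'Male' is not in the codelist {M, F, U}
```

Driving such checks from the standard at data-entry time, rather than after trial completion, is the essence of the standard-driven approach described above.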
Specifications for a Federal Information Processing Standard Data Dictionary System
NASA Technical Reports Server (NTRS)
Goldfine, A.
1984-01-01
The development of a software specification that Federal agencies may use in evaluating and selecting data dictionary systems (DDS) is discussed. To supply the flexibility needed by widely different applications and environments in the Federal Government, the Federal Information Processing Standard (FIPS) specifies a core DDS together with an optional set of modules. The focus and status of the development project are described. Functional specifications for the FIPS DDS are examined for the dictionary, the dictionary schema, and the dictionary processing system. The DDS user interfaces and DDS software interfaces are discussed, as well as dictionary administration.
National Pipeline Mapping System (NPMS) : repository standards
DOT National Transportation Integrated Search
1997-07-01
This draft document contains 7 sections. They are as follows: 1. General Topics, 2. Data Formats, 3. Metadata, 4. Attribute Data, 5. Data Flow, 6. Descriptive Process, and 7. Validation and Processing of Submitted Data. These standards were created w...
Space Generic Open Avionics Architecture (SGOAA) standard specification
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1993-01-01
The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture interface model to the design of a specific avionics hardware/software system. This standard defines a generic set of system interface points to facilitate identification of critical interfaces and establishes the requirements for applying appropriate low level detailed implementation standards to those interface points. The generic core avionics system and processing architecture models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
CMMI(Registered) for Acquisition, Version 1.3. CMMI-ACQ, V1.3
2010-11-01
and Software Engineering – System Life Cycle Processes [ISO 2008b] ISO/IEC 27001:2005 Information technology – Security techniques – Information...International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 27001 Information Technology – Security Techniques...International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) body of standards. CMMs focus on improving processes
Advanced Map For Real-Time Process Control
NASA Astrophysics Data System (ADS)
Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto
1987-10-01
MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of a real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicasting communication voluntarily and periodically in the priority order of data to be exchanged.
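A minimal sketch of the idea of voluntary, periodic, priority-ordered multicasting follows; the group address, tags, and periods are invented for illustration, and real MAP runs over its own protocol stack rather than UDP/IP:

```python
import socket, time

GROUP, PORT = "239.1.2.3", 5007        # illustrative multicast group/port

# Data channels, highest priority first; each has its own publish period (s).
CHANNELS = [
    {"tag": b"ALARM", "period": 0.1},  # priority 0: scanned first
    {"tag": b"PV",    "period": 0.5},  # priority 1: process variables
]

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)

next_due = {c["tag"]: 0.0 for c in CHANNELS}
deadline = time.monotonic() + 2.0      # run the demo for two seconds
while time.monotonic() < deadline:
    now = time.monotonic()
    for chan in CHANNELS:              # priority order: earlier entries win
        if now >= next_due[chan["tag"]]:
            sock.sendto(chan["tag"] + b":sample", (GROUP, PORT))
            next_due[chan["tag"]] = now + chan["period"]
    time.sleep(0.01)
```

Publishing each data item unsolicited on its own period, rather than waiting for a request-response exchange, is what removes the per-transaction overhead the abstract identifies as the throughput limit.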
40 CFR 420.15 - Pretreatment standards for existing sources (PSES).
Code of Federal Regulations, 2014 CFR
2014-07-01
..., shall be provided for process wastewaters from wet coke oven gas desulfurization systems, but only to... process wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems...
40 CFR 420.15 - Pretreatment standards for existing sources (PSES).
Code of Federal Regulations, 2012 CFR
2012-07-01
..., shall be provided for process wastewaters from wet coke oven gas desulfurization systems, but only to... process wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems...
40 CFR 420.15 - Pretreatment standards for existing sources (PSES).
Code of Federal Regulations, 2011 CFR
2011-07-01
..., shall be provided for process wastewaters from wet coke oven gas desulfurization systems, but only to... process wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems...
40 CFR 420.15 - Pretreatment standards for existing sources (PSES).
Code of Federal Regulations, 2013 CFR
2013-07-01
..., shall be provided for process wastewaters from wet coke oven gas desulfurization systems, but only to... process wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems...
NASA Technical Reports Server (NTRS)
Navarro, Robert J.; Grimm, Barry
1996-01-01
The agency has developed this reference publication to aid NASA organizations and their suppliers in the transition to ISO 9000. This guide focuses on the standard's intent, clarifies its requirements, offers implementation examples, and highlights interrelated areas. It can assist anyone developing or evaluating NASA or supplier quality management systems. The ISO 9000 standards contain the basic elements for managing those processes that affect an organization's ability to consistently meet customer requirements. ISO 9000 was developed through the International Organization for Standardization and has been adopted as the U.S. national standard. These standards define a flexible foundation for customer-focused process measurement, management, and improvement that is the hallmark of world-class enterprises.
Common Approach to Geoprocessing of Uav Data across Application Domains
NASA Astrophysics Data System (ADS)
Percivall, G. S.; Reichardt, M.; Taylor, T.
2015-08-01
UAVs are a disruptive technology bringing new geographic data and information to many application domains. UASs are similar to other geographic imagery systems, so existing frameworks are applicable. But the diversity of UAVs as platforms, along with the diversity of available sensors, presents challenges in the processing and creation of geospatial products. Efficient processing and dissemination of the data is achieved using software and systems that implement open standards. The challenges identified point to the need to use existing standards and to extend them. Results from the use of the OGC Sensor Web Enablement set of standards are presented. Next steps in the progress of UAVs and UASs may follow the path of open data, open source, and open standards.
Reference dosimeter system of the IAEA
NASA Astrophysics Data System (ADS)
Mehta, Kishor; Girzikowsky, Reinhard
1995-09-01
Quality assurance programmes must be in operation at radiation processing facilities to satisfy national and international standards. Since dosimetry has a vital function in these QA programmes, it is imperative that the dosimetry systems in use at these facilities are well calibrated, with traceability to a Primary Standard Dosimetry Laboratory. As a service to the Member States, the International Atomic Energy Agency operates the International Dose Assurance Service (IDAS) to assist in this process. The transfer standard dosimetry system used for this service is based on ESR spectrometry. The paper describes the activities undertaken at the IAEA Dosimetry Laboratory to establish the QA programme for its reference dosimetry system. There are four key elements of such a programme: a quality assurance manual; calibration that is traceable to a Primary Standard Dosimetry Laboratory; a clear and detailed statement of uncertainty in the dose measurement; and periodic quality audits.
NASA Astrophysics Data System (ADS)
Kiekebusch, Mario J.; Lucuix, Christian; Erm, Toomas M.; Chiozzi, Gianluca; Zamparelli, Michele; Kern, Lothar; Brast, Roland; Pirani, Werther; Reiss, Roland; Popovic, Dan; Knudstrup, Jens; Duchateau, Michel; Sandrock, Stefan; Di Lieto, Nicola
2014-07-01
ESO is currently in the final phase of the standardization process for PC-based Programmable Logic Controllers (PLCs) as the new platform for the development of control systems for future VLT/VLTI instruments. The standard solution used until now consists of a Local Control Unit (LCU), a VME-based system having a CPU and commercial and proprietary boards. This system includes several layers of software and many thousands of lines of code developed and maintained in house. LCUs have been used for several years as the interface to control instrument functions but are now being replaced by commercial off-the-shelf (COTS) systems based on BECKHOFF Embedded PCs and the EtherCAT fieldbus. ESO is working on the completion of the software framework that enables a seamless integration into the VLT control system in order to be ready to support upcoming instruments like ESPRESSO and ERIS, which will be the first fully VLT-compliant instruments using the new standard. The technology evaluation and standardization process has been a long and combined effort of various engineering disciplines, such as electronics, control, and software, working together to define a solution that meets the requirements and minimizes the impact on observatory operations and maintenance. This paper presents the challenges of the standardization process and the steps involved in such a change. It provides a technical overview of how industrial standards like EtherCAT, OPC-UA, PLCopen MC, and TwinCAT can be used to replace LCU features in various areas like software engineering and programming languages, motion control, time synchronization, and astronomical tracking.
Exploring the Use of Enterprise Content Management Systems in Unification Types of Organizations
NASA Astrophysics Data System (ADS)
Izza Arshad, Noreen; Mehat, Mazlina; Ariff, Mohamed Imran Mohamed
2014-03-01
The aim of this paper is to better understand how highly standardized and integrated businesses, known as unification types of organizations, use Enterprise Content Management Systems (ECMS) to support their business processes. A multiple case study approach was used to study the ways two unification organizations use their ECMS in their daily work practices. Arising from these case studies are insights into the differing ways in which ECMS is used to support businesses. Based on the comparisons of the two cases, this study proposes that unification organizations may use ECMS in four ways: (1) collaboration, (2) information sharing that supports a standardized process structure, (3) building custom workflows that support integrated and standardized processes, and (4) providing links and access to information systems. These findings may guide organizations that are highly standardized and integrated in this fashion to achieve their intended ECMS use, to understand reasons for ECMS failures and underutilization, and to exploit technology investments.
NASA Technical Reports Server (NTRS)
Dehghani, Navid; Tankenson, Michael
2006-01-01
This paper details an architectural description of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission set of ground data processing components providing uplink, downlink, and data management capabilities, which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed based on a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry-standard messaging bus is used to transfer information among system components. Components generate standard messages, which are used to capture system information as well as triggers to support the event-driven architecture of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data and for supporting automation for many mission operations processes.
Park, Yu Rang; Yoon, Young Jo; Koo, HaYeong; Yoo, Soyoung; Choi, Chang-Min; Beck, Sung-Ho
2018-01-01
Background: Clinical trials pose potential risks in both communications and management due to the various stakeholders involved when performing clinical trials. The academic medical center has a responsibility and obligation to conduct and manage clinical trials while maintaining a sufficiently high level of quality; therefore, it is necessary to build an information technology system to support standardized clinical trial processes and comply with relevant regulations. Objective: The objective of the study was to address the challenges identified while performing clinical trials at an academic medical center, Asan Medical Center (AMC) in Korea, by developing and utilizing a clinical trial management system (CTMS) that complies with standardized processes from multiple departments or units, controlled vocabularies, and security and privacy regulations. Methods: This study describes the methods, considerations, and recommendations for the development and utilization of the CTMS as a consolidated research database in an academic medical center. A task force was formed to define and standardize the clinical trial performance process at the site level. On the basis of the agreed standardized process, the CTMS was designed and developed as an all-in-one system complying with privacy and security regulations. Results: In this study, the processes and standard mapped vocabularies of a clinical trial were established at the academic medical center. On the basis of these processes and vocabularies, a CTMS was built which interfaces with the existing trial systems such as the electronic institutional review board, health information system, enterprise resource planning, and the barcode system. To protect patient data, the CTMS implements data governance and access rules, and excludes 21 personal health identifiers according to the Health Insurance Portability and Accountability Act (HIPAA) privacy rule and Korean privacy laws. Since December 2014, the CTMS has been successfully implemented and used by 881 internal and external users for managing 11,645 studies and 146,943 subjects. Conclusions: The CTMS was introduced at the Asan Medical Center to manage the large amounts of data involved with clinical trial operations. Inter- and intraunit control of data and resources can be easily conducted through the CTMS. To our knowledge, this is the first CTMS developed in-house at an academic medical center that can enhance the efficiency of clinical trial management in compliance with privacy and security laws. PMID:29691212
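As a loose illustration of the identifier-exclusion step described above (not the AMC implementation), the sketch below masks a few identifier-like patterns with regular expressions; the patterns are simplistic placeholders, and production de-identification requires vetted tooling:

```python
import re

# Illustrative patterns only; real de-identification is far more involved.
PATTERNS = {
    "PHONE": re.compile(r"\b\d{2,3}-\d{3,4}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "DATE":  re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),
}

def scrub(text: str) -> str:
    """Replace each matched identifier with a bracketed category label."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(scrub("Visit on 2014-12-01, contact 010-1234-5678 or a@b.org"))
# Visit on [DATE], contact [PHONE] or [EMAIL]
```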
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... Classification System. \\2\\ Maximum Achievable Control Technology. Table 2 is not intended to be exhaustive, but..., methods, systems, or techniques that reduce the volume of or eliminate HAP emissions through process changes, substitution of materials, or other modifications; enclose systems or processes to eliminate...
Processing of meteorological data with ultrasonic thermoanemometers
NASA Astrophysics Data System (ADS)
Telminov, A. E.; Bogushevich, A. Ya.; Korolkov, V. A.; Botygin, I. A.
2017-11-01
The article describes a software system intended to support scientific research of the atmosphere by processing data gathered by multi-level ultrasonic complexes for automated monitoring of meteorological and turbulent parameters in the ground layer of the atmosphere. The system allows the user to process files containing data sets of instantaneous values of temperature, the three orthogonal components of wind speed, humidity, and pressure. Processing is executed in multiple stages. During the first stage, the system executes the researcher's query for meteorological parameters. At the second stage, the system computes a series of standard statistical properties of the meteorological fields, such as averages, dispersion, standard deviation, asymmetry (skewness) coefficients, excess (kurtosis), and correlation. The third stage prepares for computing the parameters of atmospheric turbulence. The computation results are displayed to the user and stored on the hard drive.
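A small sketch of the second-stage statistics, assuming NumPy and using common sample definitions of skewness and excess kurtosis (the paper does not specify its exact estimators):

```python
import numpy as np

def standard_stats(x: np.ndarray, y: np.ndarray) -> dict:
    """Second-stage statistics for one averaging interval of sonic data."""
    mean, var = x.mean(), x.var(ddof=1)
    std = np.sqrt(var)
    z = (x - mean) / std
    return {
        "mean": mean,
        "dispersion": var,                # variance of the series
        "std": std,
        "asymmetry": (z ** 3).mean(),     # skewness coefficient
        "excess": (z ** 4).mean() - 3.0,  # excess kurtosis
        "correlation": np.corrcoef(x, y)[0, 1],
    }

rng = np.random.default_rng(0)
t = 20 + 0.3 * rng.standard_normal(6000)  # synthetic temperature samples
w = 2 + 0.5 * rng.standard_normal(6000)   # synthetic vertical wind samples
print(standard_stats(t, w))
```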
Specifications of Standards in Systems and Synthetic Biology.
Schreiber, Falk; Bader, Gary D; Golebiewski, Martin; Hucka, Michael; Kormeier, Benjamin; Le Novère, Nicolas; Myers, Chris; Nickerson, David; Sommer, Björn; Waltemath, Dagmar; Weise, Stephan
2015-09-04
Standards shape our everyday life. From nuts and bolts to electronic devices and technological processes, standardised products and processes are all around us. Standards have technological and economic benefits, such as making information exchange, production, and services more efficient. However, novel, innovative areas often either lack proper standards, or documents about standards in these areas are not available from a centralised platform or formal body (such as the International Standardisation Organisation). Systems and synthetic biology is a relatively novel area, and it is only in the last decade that the standardisation of data, information, and models related to systems and synthetic biology has become a community-wide effort. Several open standards have been established and are under continuous development as a community initiative. COMBINE, the 'COmputational Modeling in BIology' NEtwork, has been established as an umbrella initiative to coordinate and promote the development of the various community standards and formats for computational models. There are two yearly meetings: HARMONY (Hackathons on Resources for Modeling in Biology), hackathon-type meetings with a focus on developing support for the standards, and COMBINE forums, workshop-style events with oral presentations, discussions, posters, and breakout sessions for further developing the standards. For more information see http://co.mbine.org/. So far the different standards have been published and made accessible through the standards' webpages or preprint services. The aim of this special issue is to provide a single, easily accessible, and citable platform for the publication of standards in systems and synthetic biology. This special issue is intended to serve as a central access point to standards and related initiatives in systems and synthetic biology; it will be published annually to provide an opportunity for standard development groups to communicate updated specifications.
Intermountain Health Care, Inc.: Standard Costing System Methodology and Implementation
Rosqvist, W.V.
1984-01-01
Intermountain Health Care, Inc. (IHC), a not-for-profit hospital chain with 22 hospitals in the intermountain area and corporate offices located in Salt Lake City, Utah, has developed a Standard Costing System to provide hospital management with a tool for confronting increased cost pressures in the health care environment. This document serves as a description of the methodology used in developing the standard costing system and outlines the implementation process.
Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service
Hatano, Kenji; Ohe, Kazuhiko
2003-01-01
An information retrieval system for the Japanese Standard Disease-Code Master using XML Web Service has been developed. XML Web Service is a new distributed processing system built on standard Internet technologies. With the seamless remote method invocation of XML Web Service, users are able to get the latest disease-code master information from their rich desktop applications or Internet web sites, which refer to this service. PMID:14728364
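A hedged sketch of such a client follows; the endpoint URL and XML element names are hypothetical stand-ins for whatever the actual service's interface defines:

```python
import requests
import xml.etree.ElementTree as ET

# Hypothetical endpoint; the real service's WSDL defines the actual
# operation names and response schema.
URL = "https://example.org/disease-master/search"

def lookup(term: str) -> list[tuple[str, str]]:
    """Query the (hypothetical) master service and parse its XML reply."""
    resp = requests.get(URL, params={"name": term}, timeout=10)
    resp.raise_for_status()
    root = ET.fromstring(resp.text)
    return [(e.findtext("code"), e.findtext("name"))
            for e in root.iter("disease")]

# Usage (requires a live endpoint):
# for code, name in lookup("diabetes"):
#     print(code, name)
```

Because the service speaks plain XML over HTTP, any client platform can stay current with the master without redistributing the code tables themselves.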
40 CFR 63.1012 - Compressor standards.
Code of Federal Regulations, 2012 CFR
2012-07-01
... in the referencing subpart. (b) Seal system standard. Each compressor shall be equipped with a seal..., except as provided in § 63.1002(b) and paragraphs (e) and (f) of this section. Each compressor seal...-loop system that purges the barrier fluid directly into a process stream. (c) Barrier fluid system. The...
40 CFR 63.1012 - Compressor standards.
Code of Federal Regulations, 2014 CFR
2014-07-01
... in the referencing subpart. (b) Seal system standard. Each compressor shall be equipped with a seal..., except as provided in § 63.1002(b) and paragraphs (e) and (f) of this section. Each compressor seal...-loop system that purges the barrier fluid directly into a process stream. (c) Barrier fluid system. The...
40 CFR 63.1012 - Compressor standards.
Code of Federal Regulations, 2013 CFR
2013-07-01
... in the referencing subpart. (b) Seal system standard. Each compressor shall be equipped with a seal..., except as provided in § 63.1002(b) and paragraphs (e) and (f) of this section. Each compressor seal...-loop system that purges the barrier fluid directly into a process stream. (c) Barrier fluid system. The...
[The design and implementation of DICOM self-help film printing system].
Wang, Xiaodong; Jiang, Mowen
2013-09-01
This article focuses on the design and implementation of a self-help film printing system based on the DICOM standard. In accordance with the DICOM standard and the working process of the radiology department, the system realizes self-help film printing as well as monitoring and management of the film printing business.
Manufacturing Bms/Iso System Review
NASA Technical Reports Server (NTRS)
Gomez, Yazmin
2004-01-01
The Quality Management System (QMS) is one that recognizes the need to continuously change and improve an organization's products and services as determined by system feedback and corresponding management decisions. The purpose of a Quality Management System is to minimize quality variability of an organization's products and services. The optimal Quality Management System balances the need for an organization to maintain flexibility in the products and services it provides with the need for providing the appropriate level of discipline and control over the processes used to provide them. The goal of a Quality Management System is to ensure the quality of the products and services while consistently (through minimizing quality variability) meeting or exceeding customer expectations. The GRC Business Management System (BMS) is the foundation of the Center's ISO 9001:2000 registered quality system. ISO 9001 is a quality system model developed by the International Organization for Standardization. BMS supports and promotes the Glenn Research Center Quality Policy and aims to ensure customer satisfaction while also meeting quality standards. My assignment this summer is to examine the manufacturing processes used to develop research hardware, which in most cases is one-of-a-kind hardware made with nonconventional equipment and materials. Based on my observations of the hardware development processes, I will determine the best way to meet customer requirements while achieving the GRC quality standards. The purpose of my task is to review the manufacturing processes, identifying opportunities to optimize the efficiency of the processes, and to establish a plan for implementation and continuous improvement.
Magnetic Field Experiment Data Analysis System
NASA Technical Reports Server (NTRS)
Holland, D. B.; Zanetti, L. J.; Suther, L. L.; Potemra, T. A.; Anderson, B. J.
1995-01-01
The Johns Hopkins University Applied Physics Laboratory (JHU/APL) Magnetic Field Experiment Data Analysis System (MFEDAS) has been developed to process and analyze satellite magnetic field experiment data from the TRIAD, MAGSAT, AMPTE/CCE, Viking, Polar BEAR, DMSP, HILAT, UARS, and Freja satellites. The MFEDAS provides extensive data management and analysis capabilities. The system is based on standard data structures and a standard user interface. The MFEDAS has two major elements: (1) a set of satellite-unique telemetry processing programs for uniform and rapid conversion of the raw data to a standard format and (2) the program Magplot, which has file handling, data analysis, and data display sections. This system is an example of software reuse, allowing new data sets and software extensions to be added in a cost-effective and timely manner. Future additions to the system will include standard-format file import routines, modification of the display routines to use a commercial graphics package based on X-Window protocols, and a generic utility for telemetry data access and conversion.
Standard services for the capture, processing, and distribution of packetized telemetry data
NASA Technical Reports Server (NTRS)
Stallings, William H.
1989-01-01
Standard functional services for the capture, processing, and distribution of packetized data are discussed with particular reference to the future implementation of packet processing systems, such as those for the Space Station Freedom. The major functions are listed under the following major categories: input processing, packet processing, and output processing. A functional block diagram of a packet data processing facility is presented, showing the distribution of the various processing functions as well as the primary data flow through the facility.
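To make the three-stage split concrete, here is a toy Python pipeline in the same input/packet/output shape; the sync marker, field names, and routing table are invented for illustration:

```python
# A toy three-stage pipeline mirroring the capture/process/distribute split.
def input_processing(frames):
    """Capture: drop frames that fail a simple integrity check."""
    for frame in frames:
        if frame.get("sync") == 0x1ACF:  # assumed sync marker
            yield frame

def packet_processing(frames):
    """Process: regroup accepted frame payloads into per-APID packets."""
    for frame in frames:
        yield {"apid": frame["apid"], "data": frame["payload"]}

def output_processing(packets, subscribers):
    """Distribute: route each packet to the users registered for its APID."""
    for pkt in packets:
        for user in subscribers.get(pkt["apid"], []):
            print(f"-> {user}: APID {pkt['apid']}, {len(pkt['data'])} bytes")

frames = [{"sync": 0x1ACF, "apid": 5, "payload": b"abc"},
          {"sync": 0xDEAD, "apid": 5, "payload": b"bad"}]
output_processing(packet_processing(input_processing(frames)), {5: ["LabA"]})
```

Keeping the stages decoupled in this way is what lets a facility swap or scale any one function without disturbing the rest of the data flow.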
Ackerman, Sara L; Gourley, Gato; Le, Gem; Williams, Pamela; Yazdany, Jinoos; Sarkar, Urmimala
2018-03-14
The aim of the study was to develop standards for tracking patient safety gaps in ambulatory care in safety net health systems. Leaders from five California safety net health systems were invited to participate in a modified Delphi process sponsored by the Safety Promotion Action Research and Knowledge Network (SPARKNet) and the California Safety Net Institute in 2016. During each of the three Delphi rounds, the feasibility and validity of 13 proposed patient safety measures were discussed and prioritized. Surveys and transcripts from the meetings were analyzed to understand the decision-making process. The Delphi process included eight panelists. Consensus was reached to adopt 9 of 13 proposed measures. All 9 measures were unanimously considered valid, but concern was expressed about the feasibility of implementing several of the measures. Although safety net health systems face high barriers to standardized measurement, our study demonstrates that consensus can be reached on acceptable and feasible methods for tracking patient safety gaps in safety net health systems. If accompanied by the active participation of key stakeholder groups, including patients, clinicians, staff, data system professionals, and health system leaders, the consensus measures reported here represent one step toward improving ambulatory patient safety in safety net health systems.
NASA Astrophysics Data System (ADS)
Gromakov, E. I.; Gazizov, A. T.; Lukin, V. P.; Chimrov, A. V.
2017-01-01
The paper analyzes the efficiency (interference resistance) of standard TT, TN, and IT networks in the control links of automatic control systems (ACS) for technical processes (TP) in oil and gas production. Electromagnetic compatibility (EMC) is the standard term used to describe interference in grounding circuits. Improved EMC of ACS TP can significantly reduce the risks and costs of equipment malfunctions that could have serious consequences. It is shown that an IT network is the best grounding type for protecting ACS TP under real-life conditions, as it reduces interference to the level specified in the standards of oil and gas companies.
Problems of Automation and Management Principles Information Flow in Manufacturing
NASA Astrophysics Data System (ADS)
Grigoryuk, E. N.; Bulkin, V. V.
2017-07-01
Automated control systems for technological processes are complex systems characterized by elements of overall focus, the systemic nature of the implemented algorithms for the exchange and processing of information, and a large number of functional subsystems. The article gives examples of automatic control systems and automated control systems for technological processes, drawing parallels between them and identifying their strengths and weaknesses. A non-standard process control system is also proposed.
Terminal Information Processing System (TIPS) Consolidated CAB Display (CCD) Comparative Analysis.
1982-04-01
Barometric pressure 3. Center field wind speed, direction and gusts 4. Runway visual range 5. Low-level wind shear 6. Vortex advisory 7. Runway equipment...PASSWORD Command (standard user) u. PAUSE Command (standard user) v. PMSG Command (standard user) w. PPD Command (standard user) x. PURGE Command (standard
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
SCALE, a modular code system for Standardized Computer Analyses Licensing Evaluation, has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.
A management, leadership, and board road map to transforming care for patients.
Toussaint, John
2013-01-01
Over the last decade I have studied 115 healthcare organizations in 11 countries, examining them from the boardroom to the patient bedside. In that time, I have observed one critical element missing from just about every facility: a set of standards that could reliably produce zero-defect care for patients. This lack of standards is largely rooted in the Sloan management approach, a top-down management and leadership structure that is devoid of standardized accountability. This article offers an alternative approach: management by process, an operating system that engages frontline staff in decisions and imposes standards and processes on the act of managing. Organizations that have adopted management by process have seen quality improve and costs decrease because the people closest to the work are expected to identify problems and solve them. Also detailed are the leadership behaviors required for an organization to successfully implement the management-by-process operating system and the board of trustees' role in supporting the transformation.
40 CFR 63.11498 - What are the standards and compliance requirements for wastewater systems?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Chemical Manufacturing Area Sources Standards and Compliance Requirements § 63.11498 What are the standards... each wastewater stream using process knowledge, engineering assessment, or test data. Also, you must...
40 CFR 63.11498 - What are the standards and compliance requirements for wastewater systems?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Chemical Manufacturing Area Sources Standards and Compliance Requirements § 63.11498 What are the standards... each wastewater stream using process knowledge, engineering assessment, or test data. Also, you must...
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.
This paper highlights the use of the CAPE-OPEN (CO) standard interfaces in the Advanced Process Engineering Co-Simulator (APECS) developed at the National Energy Technology Laboratory (NETL). The APECS system uses the CO unit operation, thermodynamic, and reaction interfaces to provide its plug-and-play co-simulation capabilities, including the integration of process simulation with computational fluid dynamics (CFD) simulation. APECS also relies heavily on the use of a CO COM/CORBA bridge for running process/CFD co-simulations on multiple operating systems. For process optimization in the face of multiple and sometimes conflicting objectives, APECS offers stochastic modeling and multi-objective optimization capabilities developed to comply with the CO software standard. At NETL, system analysts are applying APECS to a wide variety of advanced power generation systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based FutureGen power and hydrogen production plant.
Automated process planning system
NASA Technical Reports Server (NTRS)
Mann, W.
1978-01-01
Program helps process engineers set up manufacturing plans for machined parts. System allows one to develop and store library of similar parts characteristics, as related to particular facility. Information is then used in interactive system to help develop manufacturing plans that meet required standards.
Re-engineering Nascom's network management architecture
NASA Technical Reports Server (NTRS)
Drake, Brian C.; Messent, David
1994-01-01
The development of Nascom systems for ground communications began in 1958 with Project Vanguard. The low-speed systems (rates less than 9.6 Kbs) were developed following existing standards; but, there were no comparable standards for high-speed systems. As a result, these systems were developed using custom protocols and custom hardware. Technology has made enormous strides since the ground support systems were implemented. Standards for computer equipment, software, and high-speed communications exist and the performance of current workstations exceeds that of the mainframes used in the development of the ground systems. Nascom is in the process of upgrading its ground support systems and providing additional services. The Message Switching System (MSS), Communications Address Processor (CAP), and Multiplexer/Demultiplexer (MDM) Automated Control System (MACS) are all examples of Nascom systems developed using standards such as, X-windows, Motif, and Simple Network Management Protocol (SNMP). Also, the Earth Observing System (EOS) Communications (Ecom) project is stressing standards as an integral part of its network. The move towards standards has produced a reduction in development, maintenance, and interoperability costs, while providing operational quality improvement. The Facility and Resource Manager (FARM) project has been established to integrate the Nascom networks and systems into a common network management architecture. The maximization of standards and implementation of computer automation in the architecture will lead to continued cost reductions and increased operational efficiency. The first step has been to derive overall Nascom requirements and identify the functionality common to all the current management systems. The identification of these common functions will enable the reuse of processes in the management architecture and promote increased use of automation throughout the Nascom network. The MSS, CAP, MACS, and Ecom projects have indicated the potential value of commercial-off-the-shelf (COTS) and standards through reduced cost and high quality. The FARM will allow the application of the lessons learned from these projects to all future Nascom systems.
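Since SNMP is one of the standards the Nascom systems above adopted, a minimal polling sketch may help illustrate the kind of device monitoring involved; it assumes the third-party pysnmp package (4.x hlapi) and an illustrative agent address:

```python
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

# Query a single standard MIB object (the system description) from an
# assumed agent at 192.0.2.10 using an SNMPv1 read community.
errorIndication, errorStatus, errorIndex, varBinds = next(
    getCmd(SnmpEngine(),
           CommunityData("public", mpModel=0),
           UdpTransportTarget(("192.0.2.10", 161)),
           ContextData(),
           ObjectType(ObjectIdentity("SNMPv2-MIB", "sysDescr", 0)))
)
if errorIndication:
    print(f"poll failed: {errorIndication}")
else:
    for name, value in varBinds:
        print(f"{name} = {value}")
```

Polling standard MIB objects like this, rather than speaking a custom protocol per device, is precisely the cost and interoperability advantage the standards-based architecture is after.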
Gómez-Álvarez, Sandra; Porta-Oltra, Begoña; Hernandez-Griso, Marta; Pérez-Labaña, Francisca; Climente-Martí, Mónica
2016-01-01
To assess the impact of two closed-system drug transfer devices on local and environmental contamination and on preparation times in the preparation of parenteral chemotherapy, compared with the standard system. Prospective observational study. Two different closed-system providers, Care Fusion® and Icu Medical®, were compared with standard preparation. Fifteen nurses of the Pharmacy Department made five preparations each: one with the standard procedure and four using the closed systems. To evaluate contamination, a 0.5% fluorescein solution was prepared. Two kinds of contamination were evaluated: local (three connection points: closed system to vial, syringe, and final infusion bag) and environmental (gloves and countertop). The percentage of contaminated preparations was obtained for each system, and the time taken by each nurse for each preparation was recorded. In total, 75 preparations were made. Local contamination was reduced by 21% and 75% with the Icu Medical® and Care Fusion® closed systems, respectively. With the Care Fusion® closed system, local contamination was significantly lower than with the standard system at the vial, syringe, and final package, while with the Icu Medical® closed system it was significantly lower only at the connection to the vial. Preparation time increased significantly with the use of the closed systems, by between 23.4 and 30.5 seconds. Both closed-system drug transfer devices showed an improvement in contamination over the standard system; however, preparation time was significantly increased with the use of both systems. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
A High Efficiency System for Science Instrument Commanding for the Mars Global Surveyor Mission
NASA Technical Reports Server (NTRS)
Brooks, R. N., Jr.
1995-01-01
The Mars Global Surveyor (MGS) mission will return to Mars to recover most of the science lost when the ill-fated Mars Observer spacecraft suffered a catastrophic anomaly in its propulsion system and did not go into orbit. Described in detail are the methods employed by the MGS Sequence Team to accelerate science command processing by using a standard command generation process and standard UNIX control scripts.
Contextualizing Learning Scenarios According to Different Learning Management Systems
ERIC Educational Resources Information Center
Drira, R.; Laroussi, M.; Le Pallec, X.; Warin, B.
2012-01-01
In this paper, we first demonstrate that an instructional design process of Technology Enhanced Learning (TEL) systems based on a Model Driven Approach (MDA) addresses the limits of Learning Technology Standards (LTS), such as SCORM and IMS-LD. Although these standards ensure the interoperability of TEL systems across different Learning Management…
The Standard Autonomous File Server, A Customized, Off-the-Shelf Success Story
NASA Technical Reports Server (NTRS)
Semancik, Susan K.; Conger, Annette M.; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
The Standard Autonomous File Server (SAFS), which includes both off-the-shelf hardware and software, uses an improved automated file transfer process to provide quicker, more reliable, prioritized file distribution for customers of near-real-time data without interfering with the assets involved in the acquisition and processing of the data. It operates as a stand-alone solution, monitoring itself and providing an automated fail-over process to enhance reliability. This paper describes the unique problems and lessons learned both during the COTS selection and integration into SAFS and during the system's first year of operation in support of NASA's satellite ground network. COTS was the key factor in allowing the two-person development team to deploy systems in less than a year, meeting the required launch schedule. The SAFS system has been so successful that it is becoming a NASA standard resource, leading to its nomination for NASA's Software of the Year Award in 1999.
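A minimal sketch of prioritized distribution with fail-over behavior, using Python's standard heapq; the file names and the copy-based "transfer" are placeholders for SAFS's actual transfer mechanism:

```python
import heapq, itertools, os, shutil

# Lower priority numbers ship first; the counter keeps FIFO order
# within a priority level.
_counter = itertools.count()
queue = []

def enqueue(path: str, priority: int):
    heapq.heappush(queue, (priority, next(_counter), path))

def distribute(dest_dir: str):
    while queue:
        priority, _, path = heapq.heappop(queue)
        try:
            shutil.copy(path, dest_dir)  # stand-in for the real transfer
            print(f"sent {path} (priority {priority})")
        except OSError as err:
            print(f"transfer of {path} failed: {err}; failing over")
            # a real system would alert operators and switch to a standby

enqueue("quicklook.dat", 0)   # near-real-time product jumps the queue
enqueue("archive.tar", 9)
distribute(os.path.expanduser("~"))
```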
NASA-STD-6016 Standard Materials and Processes Requirements for Spacecraft
NASA Technical Reports Server (NTRS)
Hirsch, David B.
2009-01-01
The standards for materials and processes surrounding spacecraft are discussed. The presentation focused on minimum requirements for Materials and Processes (M&P) used in design, fabrication, and testing of flight components for NASA manned, unmanned, robotic, launch vehicle, lander, in-space and surface systems, and spacecraft program/project hardware elements. Included is information on flammability, offgassing, compatibility requirements, and processes; both metallic and non-metallic materials are mentioned.
Security Implications of OPC, OLE, DCOM, and RPC in Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2006-01-01
OPC is a collection of software programming standards and interfaces used in the process control industry. It is intended to provide open connectivity and vendor equipment interoperability. The use of OPC technology simplifies the development of control systems that integrate components from multiple vendors and support multiple control protocols. OPC-compliant products are available from most control system vendors, and are widely used in the process control industry. OPC was originally known as OLE for Process Control; the first standards for OPC were based on underlying services in the Microsoft Windows computing environment. These underlying services (OLE [Object Linking and Embedding], DCOM [Distributed Component Object Model], and RPC [Remote Procedure Call]) have been the source of many severe security vulnerabilities. It is not feasible to automatically apply vendor patches and service packs to mitigate these vulnerabilities in a control systems environment. Control systems using the original OPC data access technology can thus inherit the vulnerabilities associated with these services. Current OPC standardization efforts are moving away from the original focus on Microsoft protocols, with a distinct trend toward web-based protocols that are independent of any particular operating system. However, the installed base of OPC equipment consists mainly of legacy implementations of the OLE for Process Control protocols.
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are very relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common Information Systems Development practice, business modeling activities are still largely empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics for Business Vocabularies and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
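To make the extraction idea concrete, the sketch below pulls candidate vocabulary terms out of a BPMN 2.0 XML model. Only the BPMN namespace and task elements come from the standard; the naive verb-object split is an assumed heuristic for illustration, not the authors' extraction rules.

    import xml.etree.ElementTree as ET

    BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"   # BPMN 2.0 model namespace

    sample = f"""<definitions xmlns="{BPMN_NS}">
      <process id="claims">
        <task name="Register insurance claim"/>
        <userTask name="Assess claim damage"/>
        <serviceTask name="Notify claimant"/>
      </process>
    </definitions>"""

    root = ET.fromstring(sample)
    vocabulary = set()
    for elem in root.iter():
        local = elem.tag.split('}')[-1]                # strip the namespace prefix
        if local in {"task", "userTask", "serviceTask", "manualTask"}:
            verb, _, noun = elem.get("name", "").partition(" ")
            if noun:
                vocabulary.add((noun.lower(), verb.lower()))

    for noun, verb in sorted(vocabulary):
        print(f"term: {noun:16} verb: {verb}")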
View of the Interior of Building 125, the Standards Laboratory
The primary function of the Standards Laboratory was to ensure and implement a system of quality control for incoming materials used in manufacturing processes. Several engineering controls were used to assure the accuracy of the calibration processes, including flex-free granite tables, air-locked doors, temperature controls, and a super-clean environment. - Rocky Flats Plant, Standards Laboratory, immediately north of 215A water tower & adjacent to Third Street, Golden, Jefferson County, CO
Paperless Procurement: The Impact of Advanced Automation
1992-09-01
POPS = Paperless Order Processing System; RADMIS = Research and Development Management Information System; SAACONS = Standard Army Automated… An EDI delivery order can be passed directly to the distributor's or manufacturer's order processing system, which then updates the contractor's production (or delivery) scheduling and contract accounting applications.
NASA Technical Reports Server (NTRS)
Nashman, Marilyn; Chaconas, Karen J.
1988-01-01
The sensory processing system for the NASA/NBS Standard Reference Model (NASREM) for telerobotic control is described. This control system architecture was adopted by NASA for the Flight Telerobotic Servicer. The control system is hierarchically designed and consists of three parallel systems: task decomposition, world modeling, and sensory processing. The Sensory Processing System is examined, with particular attention to the image processing hardware and software used to extract features at low levels of sensory processing for tasks representative of those envisioned for the Space Station, such as assembly and maintenance.
Proposed Computer System for Library Catalog Maintenance. Part II: System Design.
ERIC Educational Resources Information Center
Stein (Theodore) Co., New York, NY.
The logic of the system presented in this report is divided into six parts for computer processing and manipulation. They are: (1) processing of Library of Congress copy, (2) editing of input into standard format, (3) processing of information into and out from the authority files, (4) creation of the catalog records, (5) production of the…
40 CFR 63.1013 - Sampling connection systems standards.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) National Emission Standards for Equipment Leaks-Control Level 1 § 63.1013 Sampling connection... container are not required to be collected or captured. (c) Equipment design and operation. Each closed... process fluid to a process; or (3) Be designed and operated to capture and transport all the purged...
Systems Engineering and Management Applications of ISO 9001:2015 for Government
NASA Technical Reports Server (NTRS)
Shepherd, Christena C.
2016-01-01
The manufacturing segment of the business world is busy assessing the impact of ISO 9001:2015, and updating their management systems to meet the required compliance date. What does the new revision mean for government agencies that deliver large engineering projects rather than mass production? In fact, the standard, especially the new revision, can be used quite readily for government agencies, or applied to specific projects, once it is understood in terms of the similarities with systems engineering and project management. From there it can be extrapolated to "mission realization" systems, and a Quality Management System (QMS) is a logical result that can bring order to processes and systems that likely already exist in some fashion. ISO 9001:2015 is less product-oriented than previous versions. It can be more broadly applied to public organizations as well as private, and to services (missions) as well as products. The emphasis on risk management in the revised standard provides the needed balance for weighing decisions with respect to cost, schedule, technical, safety, and regulatory compliance; so if this is not part of agency governance already, this is a good place to start, especially for large engineering projects. The Systems Engineering standard used for this analysis is NASA's NPR 7123.1 NASA Systems Engineering Processes and Requirements; however, those who are more familiar with ISO/IEC 26702 Systems Engineering-application and management of the systems engineering process, or SAE/EIA 632 Processes for Engineering a System will also recognize the similarities. In reality, the QMS outlined by ISO 9001 reinforces the systems engineering processes, and serves to ensure that they are adequately implemented, although most of the ISO 9001 literature emphasizes the production and process aspects of the standard. Rather than beginning with ISO 9001 and getting lost in the vocabulary, it is useful to begin with the systems engineering lifecycle. Identification of stakeholder expectations, identifying solutions, creating specific product or service designs, production of the product or service, delivery to the public, and the associated management, planning, and control processes are a familiar place to begin thinking of the overall system of identifying, designing, and completing a project or mission. Lining up this lifecycle with the ISO requirements (see Figure 1) illustrates how a quality management system is concerned with the same processes, and provides a governance and assurance function. If implemented properly, there are cost savings resulting from less rework, repair, reprocessing, failures, misplaced documents, and similar types of deficiencies. Starting with an organization's systems engineering processes allows the organization to use their own terminology for a QMS plan, and tailor the plan to their own project or organization, so that it is more easily developed, understood, and implemented.
Park, Yu Rang; Yoon, Young Jo; Koo, HaYeong; Yoo, Soyoung; Choi, Chang-Min; Beck, Sung-Ho; Kim, Tae Won
2018-04-24
Clinical trials pose potential risks in both communications and management due to the various stakeholders involved when performing clinical trials. The academic medical center has a responsibility and obligation to conduct and manage clinical trials while maintaining a sufficiently high level of quality; it is therefore necessary to build an information technology system to support standardized clinical trial processes and comply with relevant regulations. The objective of the study was to address the challenges identified while performing clinical trials at an academic medical center, Asan Medical Center (AMC) in Korea, by developing and utilizing a clinical trial management system (CTMS) that complies with standardized processes from multiple departments or units, controlled vocabularies, security, and privacy regulations. This study describes the methods, considerations, and recommendations for the development and utilization of the CTMS as a consolidated research database in an academic medical center. A task force was formed to define and standardize the clinical trial performance process at the site level. On the basis of the agreed standardized process, the CTMS was designed and developed as an all-in-one system complying with privacy and security regulations. In this study, the processes and standard mapped vocabularies of a clinical trial were established at the academic medical center. On the basis of these processes and vocabularies, a CTMS was built which interfaces with the existing trial systems such as the electronic institutional review board, health information system, enterprise resource planning, and the barcode system. To protect patient data, the CTMS implements data governance and access rules, and excludes 21 personal health identifiers according to the Health Insurance Portability and Accountability Act (HIPAA) privacy rule and Korean privacy laws. Since December 2014, the CTMS has been successfully implemented and used by 881 internal and external users for managing 11,645 studies and 146,943 subjects. The CTMS was introduced in the Asan Medical Center to manage the large amounts of data involved with clinical trial operations. Inter- and intraunit control of data and resources can be easily conducted through the CTMS. To our knowledge, this is the first CTMS developed in-house at an academic medical center site that can enhance the efficiency of clinical trial management in compliance with privacy and security laws. ©Yu Rang Park, Young Jo Yoon, HaYeong Koo, Soyoung Yoo, Chang-Min Choi, Sung-Ho Beck, Tae Won Kim. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 24.04.2018.
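One concrete fragment of the data-governance layer can be sketched as a de-identification filter applied before records leave the consolidated database. The field names and the deny-list below are hypothetical; the actual CTMS excludes 21 identifier types under the HIPAA privacy rule and Korean law.

    # Hypothetical deny-list; the real system excludes 21 identifier types.
    EXCLUDED_FIELDS = {"name", "phone", "email", "address", "birth_date",
                       "medical_record_number", "national_id"}

    def deidentify(record: dict) -> dict:
        """Return a copy of the record with identifier fields removed."""
        return {k: v for k, v in record.items() if k not in EXCLUDED_FIELDS}

    subject = {"subject_code": "AMC-0001", "name": "Hong Gildong",
               "birth_date": "1970-01-01", "arm": "treatment"}
    print(deidentify(subject))   # {'subject_code': 'AMC-0001', 'arm': 'treatment'}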
A System Evaluation Theory Analyzing Value and Results Chain for Institutional Accreditation in Oman
ERIC Educational Resources Information Center
Paquibut, Rene Ymbong
2017-01-01
Purpose: This paper aims to apply the system evaluation theory (SET) to analyze the institutional quality standards of Oman Academic Accreditation Authority using the results chain and value chain tools. Design/methodology/approach: In systems thinking, the institutional standards are connected as input, process, output and feedback and lead to…
ERIC Educational Resources Information Center
Radack, Shirley M.
1994-01-01
Examines the role of the National Institute of Standards and Technology (NIST) in the development of the National Information Infrastructure (NII). Highlights include the standards process; voluntary standards; Open Systems Interconnection problems; Internet Protocol Suite; consortia; government's role; and network security. (16 references) (LRW)
Reusable Models of Pedagogical Concepts--A Framework for Pedagogical and Content Design.
ERIC Educational Resources Information Center
Pawlowski, Jan M.
Standardization initiatives in the field of learning technologies have produced standards for the interoperability of learning environments and learning management systems. Learning resources based on these standards can be reused, recombined, and adapted to the user. However, these standards follow a content-oriented approach; the process of…
NASA Technical Reports Server (NTRS)
Robinson, Harriss
1992-01-01
The move to visualization and image processing in data systems is increasing the demand for larger and faster mass storage systems. The technology of choice is magnetic tape. This paper briefly reviews the technology: past, present, and projected. A case is made for standards and for the value of those standards to users.
[Development and clinical evaluation of an anesthesia information management system].
Feng, Jing-yi; Chen, Hua; Zhu, Sheng-mei
2010-09-21
To study the design, implementation, and clinical evaluation of an anesthesia information management system. To record, process, and store peri-operative patient data automatically, all bedside monitoring equipment is connected to the system through information-integrating technology. After statistical analysis of the patient data by data-mining technology, patient status can be evaluated automatically against a risk-prediction standard and a decision support system, enabling the anesthetist to carry out reasonable and safe clinical processes. With clinical processes recorded electronically, standard record tables can be generated and the clinical workflow optimized as well. With the system, various patient data can be collected, stored, analyzed, and archived; various anesthesia documents can be generated; and patient status can be evaluated to support clinical decisions. The anesthesia information management system is useful for improving anesthesia quality, decreasing risk to patients and clinicians, and helping to provide clinical evidence.
40 CFR 420.14 - New source performance standards (NSPS).
Code of Federal Regulations, 2010 CFR
2010-07-01
... provided for process wastewaters from coke oven gas wet desulfurization systems, but only to the extent... wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems, but only...
40 CFR 420.14 - New source performance standards (NSPS).
Code of Federal Regulations, 2011 CFR
2011-07-01
... provided for process wastewaters from coke oven gas wet desulfurization systems, but only to the extent... wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems, but only...
40 CFR 420.14 - New source performance standards (NSPS).
Code of Federal Regulations, 2012 CFR
2012-07-01
... provided for process wastewaters from coke oven gas wet desulfurization systems, but only to the extent... wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems, but only...
40 CFR 420.14 - New source performance standards (NSPS).
Code of Federal Regulations, 2013 CFR
2013-07-01
... provided for process wastewaters from coke oven gas wet desulfurization systems, but only to the extent... wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems, but only...
40 CFR 420.14 - New source performance standards (NSPS).
Code of Federal Regulations, 2014 CFR
2014-07-01
... provided for process wastewaters from coke oven gas wet desulfurization systems, but only to the extent... wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems, but only...
[Research strategies in standard decoction of medicinal slices].
Chen, Shi-Lin; Liu, An; Li, Qi; Toru, Sugita; Zhu, Guang-Wei; Sun, Yi; Dai, Yun-Tao; Zhang, Jun; Zhang, Tie-Jun; Takehisa, Tomoda; Liu, Chang-Xiao
2016-04-01
This paper discusses the state of research on the standard decoction of medicinal slices at home and abroad. Combining this with experimental data, the authors propose that the standard decoction of medicinal slices be made from a single herb using a standardized process guided by the theory of traditional Chinese medicine, based on clinical practice, and informed by modern extraction methods. The authors also propose principles for establishing the specification of process parameters and quality standards, and establish the basis of the drug efficacy material and biological reference. As a standard material and standard system, the standard decoction of medicinal slices can provide standards for clinical medication and standardize the use of new types of medicinal slices, especially the dispensing granules widely used in clinical practice. It can ensure the accuracy of drugs and the consistency of dose, and help solve current supervision difficulties. Moreover, the study of the standard decoction of medicinal slices will provide a useful reference for research on dispensing granules, standard decoctions of traditional Chinese medicine prescriptions, and standard decoctions of couplet medicines. Copyright© by the Chinese Pharmaceutical Association.
[The standardization of medical care and the training of medical personnel].
Korbut, V B; Tyts, V V; Boĭshenko, V A
1997-09-01
Medical specialist training at all levels (medical orderly, doctor's assistant, general practitioner, doctor) should be based on medical care standards. Preliminary studies in the field of military medicine standards have demonstrated that the medical service of the Armed Forces of Russia needs standards for medical resources, for structure and organization, and for technology. Military medical service resource standards should reflect the requirements for the qualification of all medical specialists, equipment and materiel for medical set-ups, field medical systems, drugs, etc. Standards for structure and organization should include requirements for command and control systems in the medical services of military formations and task forces and their information support; health-care and evacuation functions; sanitary control and anti-epidemic measures; and personnel health protection. The development of technology standards could improve and regulate health care procedures in the process of evacuation. Standards development will help to solve the problem of a data base for the military medicine education system and for medical research.
40 CFR 63.11502 - What definitions apply to this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
...: process knowledge, an engineering assessment, or test data. Byproduct means a chemical (liquid, gas, or... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources... system(s); (5) A gas stream routed to other processes for reaction or other use in another process (i.e...
Comparative Analysis of the Measurement of Total Instructional Alignment
ERIC Educational Resources Information Center
Kick, Laura C.
2013-01-01
In 2007, Lisa Carter created the Total Instructional Alignment system--a process that aligns standards, curriculum, assessment, and instruction. Employed in several hundred school systems, the TIA process is a successful professional development program. The researcher developed an instrument to measure the success of the TIA process with the…
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
40 CFR 65.113 - Standards: Sampling connection systems.
Code of Federal Regulations, 2011 CFR
2011-07-01
... be collected or captured. (c) Equipment design and operation. Each closed-purge, closed-loop, or... system; or (2) Collect and recycle the purged process fluid to a process; or (3) Be designed and operated to capture and transport all the purged process fluid to a control device that meets the requirements...
40 CFR 65.113 - Standards: Sampling connection systems.
Code of Federal Regulations, 2014 CFR
2014-07-01
... be collected or captured. (c) Equipment design and operation. Each closed-purge, closed-loop, or... system; or (2) Collect and recycle the purged process fluid to a process; or (3) Be designed and operated to capture and transport all the purged process fluid to a control device that meets the requirements...
40 CFR 65.113 - Standards: Sampling connection systems.
Code of Federal Regulations, 2010 CFR
2010-07-01
... be collected or captured. (c) Equipment design and operation. Each closed-purge, closed-loop, or... system; or (2) Collect and recycle the purged process fluid to a process; or (3) Be designed and operated to capture and transport all the purged process fluid to a control device that meets the requirements...
Fornwall, M.; Gisiner, R.; Simmons, S. E.; Moustahfid, Hassan; Canonico, G.; Halpin, P.; Goldstein, P.; Fitch, R.; Weise, M.; Cyr, N.; Palka, D.; Price, J.; Collins, D.
2012-01-01
The US Integrated Ocean Observing System (IOOS) has recently adopted standards for biological core variables in collaboration with the US Geological Survey/Ocean Biogeographic Information System (USGS/OBIS-USA) and other federal and non-federal partners. In this Community White Paper (CWP) we provide a process to bring into IOOS a rich new source of biological observing data, visual line transect surveys, and to establish quality data standards for visual line transect observations, an important source of at-sea bird, turtle and marine mammal observation data. The processes developed through this exercise will be useful for other similar biogeographic observing efforts, such as passive acoustic point and line transect observations, tagged animal data, and mark-recapture (photo-identification) methods. Furthermore, we suggest that the processes developed through this exercise will serve as a catalyst for broadening involvement by the larger marine biological data community within the goals and processes of IOOS.
Acceleration of Linear Finite-Difference Poisson-Boltzmann Methods on Graphics Processing Units.
Qi, Ruxi; Botello-Smith, Wesley M; Luo, Ray
2017-07-11
Electrostatic interactions play crucial roles in biophysical processes such as protein folding and molecular recognition. Poisson-Boltzmann equation (PBE)-based models have emerged as a widely used approach for modeling these important processes. Though great efforts have been put into developing efficient PBE numerical models, challenges still remain due to the high dimensionality of typical biomolecular systems. In this study, we implemented and analyzed commonly used linear PBE solvers on the ever-improving graphics processing units (GPUs) for biomolecular simulations, including both standard and preconditioned conjugate gradient (CG) solvers with several alternative preconditioners. Our implementation utilizes the standard Nvidia CUDA libraries cuSPARSE, cuBLAS, and CUSP. Extensive tests show that good numerical accuracy can be achieved given that single precision is often used for numerical applications on GPU platforms. The optimal GPU performance was observed with the Jacobi-preconditioned CG solver, with a significant speedup over the standard CG solver on CPU in our diversified test cases. Our analysis further shows that different matrix storage formats also considerably affect the efficiency of different linear PBE solvers on GPU, with the diagonal format best suited for our standard finite-difference linear systems. Further efficiency may be possible with matrix-free operations and integrated grid stencil setup specifically tailored for the banded matrices in PBE-specific linear systems.
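A minimal CPU-side sketch of the Jacobi-preconditioned CG setup benchmarked above, written with NumPy/SciPy rather than the authors' CUDA/cuSPARSE/CUSP implementation; the grid size and right-hand side are arbitrary assumptions. The diagonal (DIA) storage used below echoes the format the study found best suited to finite-difference systems.

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import cg, LinearOperator

    n = 64                                         # grid points per dimension
    I = sp.identity(n)
    T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
    A = (sp.kron(I, T) + sp.kron(T, I)).todia()    # 2-D finite-difference Laplacian, DIA storage

    b = np.random.default_rng(0).standard_normal(n * n)

    # Jacobi preconditioner: approximate A^-1 by the reciprocal of its diagonal.
    inv_diag = 1.0 / A.diagonal()
    M = LinearOperator(A.shape, matvec=lambda v: inv_diag * v)

    x, info = cg(A, b, M=M)                        # info == 0 signals convergence
    print("residual:", np.linalg.norm(b - A @ x), "info:", info)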
Quality Assurance By Laser Scanning And Imaging Techniques
NASA Astrophysics Data System (ADS)
SchmalfuB, Harald J.; Schinner, Karl Ludwig
1989-03-01
Laser scanning systems are well established in the world of fast industrial in-process quality inspection systems. The materials inspected by laser scanning systems are, e.g., "endless" sheets of steel, paper, textile, film or foils. The web width varies from 50 mm up to 5000 mm or more. The web speed depends strongly on the production process and can reach several hundred meters per minute. The continuous data flow in any one of the different channels of the optical receiving system exceeds ten megapixels per second. It is therefore clear that the electronic evaluation system has to process these data streams in real time and that no image storage is possible. But sometimes (e.g., on first installation of the system or on a change of the defect classification) it would be very helpful to be able to view the original, i.e. unprocessed, sensor data. First we show the principal setup of a standard laser scanning system. Then we introduce a large image memory especially designed for the needs of high-speed inspection sensors. This image memory cooperates with the standard on-line evaluation electronics and therefore allows an easy comparison between processed and non-processed data. We discuss the basic system structure and show the first industrial results.
Lobach, David F; Kawamoto, Kensaku; Anstrom, Kevin J; Russell, Michael L; Woods, Peter; Smith, Dwight
2007-01-01
Clinical decision support is recognized as one potential remedy for the growing crisis in healthcare quality in the United States and other industrialized nations. While decision support systems have been shown to improve care quality and reduce errors, these systems are not widely available. This lack of availability arises in part because most decision support systems are not portable or scalable. The Health Level 7 (HL7) international standards development organization recently adopted a draft standard known as the Decision Support Service standard to facilitate the implementation of clinical decision support systems using software services. In this paper, we report the first implementation of a clinical decision support system using this new standard. This system provides point-of-care chronic disease management for diabetes and other conditions and is deployed throughout a large regional health system. We also report process measures and usability data concerning the system. Use of the Decision Support Service standard provides a portable and scalable approach to clinical decision support that could facilitate the more extensive use of decision support systems.
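Reduced to its core, the service pattern looks like the sketch below: a client hands structured patient data to an evaluation service and receives machine-interpretable care recommendations. This toy only illustrates the call shape; the rule conditions, thresholds, and names are hypothetical and are not taken from the HL7 Decision Support Service standard.

    # All names below are hypothetical and not taken from the HL7 standard.
    def evaluate(patient_data: dict, rules) -> list:
        """Decision-support 'service': apply each care rule, return recommendations."""
        return [advice for rule, advice in rules if rule(patient_data)]

    diabetes_rules = [
        (lambda p: p.get("hba1c", 0) > 9.0, "Intensify glycemic therapy"),
        (lambda p: p.get("months_since_eye_exam", 0) >= 12, "Refer for retinal exam"),
    ]

    print(evaluate({"hba1c": 9.6, "months_since_eye_exam": 14}, diabetes_rules))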
Final Report of the NASA Office of Safety and Mission Assurance Agile Benchmarking Team
NASA Technical Reports Server (NTRS)
Wetherholt, Martha
2016-01-01
To ensure that the NASA Safety and Mission Assurance (SMA) community remains in a position to perform reliable Software Assurance (SA) on NASA's critical software (SW) systems as the software industry rapidly transitions from waterfall to Agile processes, Terry Wilcutt, Chief, Safety and Mission Assurance, Office of Safety and Mission Assurance (OSMA), established the Agile Benchmarking Team (ABT). The Team's tasks were: 1. Research background literature on current Agile processes; 2. Perform benchmark activities with other organizations that are involved in software Agile processes to determine best practices; 3. Collect information on Agile-developed systems to enable improvements to the current NASA standards and processes to enhance their ability to perform reliable software assurance on NASA Agile-developed systems; 4. Suggest additional guidance and recommendations for updates to those standards and processes, as needed. The ABT's findings and recommendations for software management, engineering, and software assurance are addressed herein.
Handwriting in Lebanese Bigraphic Children: Standardization of the BHK Scale
ERIC Educational Resources Information Center
Matta Abizeid, Carla; Tabsh Nakib, Amira; Younès Harb, Céleste; Ghantous Faddoul, Shereen; Albaret, Jean-Michel
2017-01-01
Educational systems in Lebanon are bilingual. They simultaneously impose two handwriting systems in Arabic and Latin. This historically driven situation could constitute a significant impact on the process and development of handwriting skills. Using an accurate and valid handwriting evaluation tool standardized for the Lebanese population is a…
Building Dynamic Conceptual Physics Understanding
ERIC Educational Resources Information Center
Trout, Charlotte; Sinex, Scott A.; Ragan, Susan
2011-01-01
Models are essential to the learning and doing of science, and systems thinking is key to appreciating many environmental issues. The National Science Education Standards include models and systems in their unifying concepts and processes standard, while the AAAS Benchmarks include them in their common themes chapter. Hyerle and Marzano argue for…
Unified System Of Data On Materials And Processes
NASA Technical Reports Server (NTRS)
Key, Carlo F.
1989-01-01
Wide-ranging sets of data for the aerospace industry are described. The document describes the Materials and Processes Technical Information System (MAPTIS), a computerized set of integrated databases for use by NASA and the aerospace industry. It stores information in a standard format for fast retrieval in searches and surveys of data, helps engineers select materials and verify their properties, and promotes standardized nomenclature as well as standardized tests and presentation of data. The document is in the format of photographic projection slides used in lectures. It presents examples of reports from various databases.
ERIC Educational Resources Information Center
Schutte, Marc; Spottl, Georg
2011-01-01
Developing countries such as Malaysia and Oman have recently established occupational standards based on core work processes (functional clusters of work objects, activities and performance requirements), to which competencies (performance determinants) can be linked. While the development of work-process-based occupational standards is supposed…
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Nichols, Kelvin F.; Witherspoon, Keith R.
2006-01-01
To date very little effort has been made to provide interoperability between various space agency projects. To effectively get to the Moon and beyond, systems must interoperate. To provide interoperability, standardization and registries of various technologies will be required. These registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering to the many processes, schemas, vehicles, robots, space suits, and technologies (e.g. versions), to name a few, in the highly complex Constellation initiative is imperative. The number of corporations, developer personnel, system interfaces, and people interfaces will require standardization and registries on a scale not currently envisioned. It would only take one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations, and operators the ability to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, the need to provide registry support in terms of setup and operations, resolution of conflicts between registries, and other issues will need to be addressed. Registries should not be confused with repositories: no end-user data is "stored" in a registry, nor is it a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we identify and define a registry process that is compatible with the Constellation initiative and other non-related space activities and organizations. We then identify and define the various technologies that should use a registry to provide interoperability. The first set of technologies will be those that are currently in need of expansion, namely the assignment of satellite designations and the process which controls assignments. Second, we analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we analyze the current CCSDS working group and Birds of a Feather (BoF) activities to ascertain registry requirements. Lastly, we identify technologies that are either currently under the auspices of another standards body or technologies that are currently not standardized. For activities one through three, we provide the analysis by either discipline or technology with rationale, identification, and a brief description of requirements and precedence. For activity four, we provide a list of current standards bodies (e.g., IETF) and a list of potential candidates.
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2011-12-01
Services that preserve and enable future access to scientific data are necessary to ensure that the data that are being collected today will be available for use by future generations of scientists. Many data centers, archives, and other digital repositories are working to improve their ability to serve as long-term stewards of scientific data. Trust in sustainable data management and preservation capabilities of digital repositories can influence decisions to use these services to deposit or obtain scientific data. Building on the Open Archival Information System (OAIS) Reference Model developed by the Consultative Committee for Space Data Systems (CCSDS) and adopted by the International Organization for Standardization as ISO 14721:2003, new standards are being developed to improve long-term data management processes and documentation. The Draft Information Standard ISO/DIS 16363, "Space data and information transfer systems - Audit and certification of trustworthy digital repositories" offers the potential to evaluate digital repositories objectively in terms of their trustworthiness as long-term stewards of digital resources. In conjunction with this, the CCSDS and ISO are developing another draft standard for the auditing and certification process, ISO/DIS 16919, "Space data and information transfer systems - Requirements for bodies providing audit and certification of candidate trustworthy digital repositories". Six test audits were conducted of scientific data centers and archives in Europe and the United States to test the use of these draft standards and identify potential improvements for the standards and for the participating digital repositories. We present a case study of the test audit conducted on the NASA Socioeconomic Data and Applications Center (SEDAC) and describe the preparation, the audit process, recommendations received, and next steps to obtain certification as a trustworthy digital repository, after approval of the ISO/DIS standards.
15 CFR 200.105 - Standard reference data.
Code of Federal Regulations, 2010 CFR
2010-01-01
... for application in energy, environment and health, industrial process design, materials durability... Institute of Physics, in the National Standard Reference Data System reports as the NSRDS-NIST series, and...
NASA Astrophysics Data System (ADS)
Bradbury-Bailey, Mary
With the implementation of No Child Left Behind came a wave of educational reform intended for those working with student populations whose academic performance seemed to indicate an alienation from the educational process. Central to these reforms was the implementation of standards-based instruction and their accompanying standardized assessments; however, in one area reform seemed nonexistent---the teacher's gradebook (Erickson, 2010; Marzano, 2006; Scriffiny, 2008). Given the link between the grading process and achievement motivation, Ames (1992) suggested the use of practices that promote mastery goal orientation. The purpose of this study was to examine the impact of a standards-based grading system, as a factor contributing to mastery goal orientation, on the academic performance of urban African American students. To determine the degree of impact, this study first compared the course content averages and End-of-Course-Test (EOCT) scores for science classes using a traditional grading system to those using a standards-based grading system by employing an Analysis of Covariance (ANCOVA). While there was an increase in all grading areas, two showed a significant difference---the Physical Science course content average (p = 0.024) and the Biology EOCT scores (p = 0.0876). These gains suggest that standards-based grading can have a positive impact on the academic performance of African American students. Secondly, this study examined the correlation between the course content averages and the EOCT scores for both the traditional and standards-based grading systems; for both Physical Science and Biology, there was a stronger correlation between these two scores for the standards-based grading system.
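The comparison described, post scores by grading system with a covariate adjustment, can be sketched as an ANCOVA on simulated data. The covariate (a pretest score) and the simulated effect sizes are assumptions for illustration; only the test structure mirrors the study.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(1)
    n = 120
    grading = np.repeat(["traditional", "standards"], n // 2)
    pretest = rng.normal(70, 10, n)
    # Simulated outcome with a small advantage for standards-based grading.
    score = 0.6 * pretest + 3.0 * (grading == "standards") + rng.normal(0, 8, n)

    df = pd.DataFrame({"score": score, "grading": grading, "pretest": pretest})
    model = smf.ols("score ~ C(grading) + pretest", data=df).fit()
    print(anova_lm(model, typ=2))   # F test for grading system, adjusted for the covariate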
Saghaeiannejad-Isfahani, Sakineh; Mirzaeian, Razieh; Jannesari, Hasan; Ehteshami, Asghar; Feizi, Awat; Raeisi, Ahmadreza
2014-01-01
Supporting a therapeutic approach and medication therapy management, the pharmacy information system (PIS) acts as one of the pillars of a hospital information system, ensuring that medication therapy is supported with an optimal level of safety and quality, as are other treatments and services. The present study is an applied, cross-sectional study conducted on the PIS in use in selected hospitals. The research population included all users of the PIS, and the research sample is the same as the research population. The data collection instrument was a self-designed checklist developed from the guidelines of the American Society of Health System Pharmacists, the Pharmaceutical Society of Australia, and the therapeutic guidelines of the Drug Commission of the German Medical Association. The checklist's validity was assessed by the research supervisors as well as PIS users and pharmacists. The findings of this study revealed that, regarding the degree of meeting the standards given in the guidelines issued by the societies of pharmacists, the highest rank in observing input standards belonged to Social Services hospitals, with a mean score of 32.75. Although teaching hospitals gained the highest scores both in process standards, with a mean score of 29.15, and in output standards, with a mean score of 43.95, the private hospitals had the lowest mean scores of 23.32, 17.78, and 24.25 in input, process, and output standards, respectively. Based on the findings, it can be claimed that the studied hospitals had minimal compliance with the input, output, and processing standards related to the PIS.
2011-10-01
Systems engineering knowledge has also been documented through the standards bodies, most notably:
• ISO/IEC/IEEE 15288, Systems Engineering - System Life Cycle Processes, 2008 (see [10])
• ANSI/EIA 632, Processes for Engineering a System, 1998
• IEEE 1220, ISO/IEC 26702, Application and Management of the Systems Engineering Process
• United States Defense Acquisition Guidebook, Chapter 4, June 27, 2011
• IEEE/EIA 12207, Software Life Cycle Processes, 2008
• United…
Multifractal Properties of Process Control Variables
NASA Astrophysics Data System (ADS)
Domański, Paweł D.
2017-06-01
A control system is an inevitable element of any industrial installation, and its quality significantly affects overall process performance. Assessing whether a control system needs improvement requires relevant and constructive measures. There are various methods, such as time-domain measures, Minimum Variance, Gaussian and non-Gaussian statistical factors, and fractal and entropy indexes. The majority of approaches use time series of control variables and are able to cover many phenomena. But process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that the signals originating from industrial installations have multifractal properties and that such an analysis may extend the standard approach with further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process. It helps to discover internal dependencies and human factors, which are otherwise hardly detectable.
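As a concrete illustration, generalized Hurst exponents can be estimated from q-th order structure functions; if the estimate varies with q, the series is a multifractal candidate. This is a generic estimator sketched under simple assumptions, not necessarily the method used in the paper, and the synthetic random walk merely stands in for a logged control variable.

    import numpy as np

    def generalized_hurst(x, qs=(1, 2, 4), lags=range(2, 50)):
        """Structure-function estimate of h(q); q-dependent h(q) hints at multifractality."""
        x = np.asarray(x, dtype=float)
        hq = {}
        for q in qs:
            s = [np.mean(np.abs(x[lag:] - x[:-lag]) ** q) for lag in lags]
            slope = np.polyfit(np.log(list(lags)), np.log(s), 1)[0]
            hq[q] = slope / q      # S_q(tau) ~ tau**(q*h(q))
        return hq

    rng = np.random.default_rng(0)
    pv = np.cumsum(rng.standard_normal(20_000))   # stand-in for a logged process variable
    print(generalized_hurst(pv))                  # ~0.5 for all q on uncorrelated increments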
Safety of clinical and non-clinical decision makers in telephone triage: a narrative review.
Wheeler, Sheila Q; Greenberg, Mary E; Mahlmeister, Laura; Wolfe, Nicole
2015-09-01
Patient safety is a persistent problem in telephone triage research; however, studies have not differentiated between clinicians' and non-clinicians' respective safety. Currently, four groups of decision makers perform aspects of telephone triage: clinicians (physicians, nurses) and non-clinicians (emergency medical dispatchers (EMDs) and clerical staff). Using studies published between 2002-2012, we applied Donabedian's structure-process-outcome model to examine the groups' systems for evidence of system completeness (a minimum measure of structure and quality). We defined system completeness as the presence of a decision maker and four additional components: guidelines, documentation, training, and standards. Defining safety as appropriate referrals (AR) - right time, right place, with the right person - we measured each group's corresponding AR rate percentages (outcomes). We analyzed each group's respective decision-making process as a safe match to the telephone triage task, based on each group's system structure completeness, process, and AR rates (outcome). Studies uniformly noted system component presence: nurses (2-4), physicians (1), EMDs (2), clerical staff (1). Nurses had the highest average appropriate referral (AR) rates (91%); physicians averaged 82%. Clerical staff had no system and did not perform telephone triage by standard definitions; EMDs may represent the use of the wrong system. Telephone triage appears least safe after hours, when decision makers with the least complete systems (physicians, clerical staff) typically manage calls. At minimum, telephone triage decision makers should be clinicians; however, clinicians' safety calls for improvement. With improved training, standards, and CDSS quality, the 24/7 clinical call center has potential to represent the national standard. © The Author(s) 2015.
Minding Impacting Events in a Model of Stochastic Variance
Duarte Queirós, Sílvio M.; Curado, Evaldo M. F.; Nobre, Fernando D.
2011-01-01
We introduce a generalization of the well-known ARCH process, widely used for generating uncorrelated stochastic time series with long-term non-Gaussian distributions and long-lasting correlations in the (instantaneous) standard deviation exhibiting a clustering profile. Specifically, inspired by the fact that in a variety of systems impacting events are hardly forgotten, we split the process into two different regimes: a first one for regular periods, where the average volatility of the fluctuations within a certain period of time is below a certain threshold, and another one when the local standard deviation exceeds that threshold. In the former situation we use standard rules for heteroscedastic processes, whereas in the latter case the system starts recalling past values that surpassed the threshold. Our results show that for appropriate parameter values the model is able to provide fat-tailed probability density functions and strong persistence of the instantaneous variance, characterized by large values of the Hurst exponent (H), which are ubiquitous features in complex systems. PMID:21483864
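A toy rendering of the two-regime idea for readers who want to experiment: an ordinary ARCH(1) update during regular periods, and a recall of remembered past extreme values once local volatility crosses a threshold. The parameter values, window length, and recall rule are assumptions for illustration, not the authors' exact specification.

    import numpy as np

    rng = np.random.default_rng(42)
    T, a0, a1 = 10_000, 0.1, 0.6           # length and ARCH(1) coefficients (arbitrary)
    window, threshold = 20, 0.8            # local-volatility window and regime threshold
    x = np.zeros(T)
    extremes = []                          # past values that surpassed the threshold

    for t in range(1, T):
        if x[max(0, t - window):t].std() > threshold and extremes:
            # "Impacting" regime: the variance recalls a remembered extreme value.
            sigma2 = a0 + a1 * rng.choice(extremes) ** 2
        else:
            sigma2 = a0 + a1 * x[t - 1] ** 2        # ordinary ARCH(1) update
        x[t] = np.sqrt(sigma2) * rng.standard_normal()
        if abs(x[t]) > threshold:
            extremes.append(x[t])

    kurt = ((x - x.mean()) ** 4).mean() / x.var() ** 2
    print("excess kurtosis:", kurt - 3.0)  # > 0 indicates fat tails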
Halabi, Sam F; Lin, Ching-Fu
An extensive global system of private food regulation is under construction, one that exceeds conventional regulation thought of as being driven by public authorities like FDA and USDA in the U.S. or the Food Standards Agency in the UK. Agrifood and grocer organizations, in concert with some farming groups, have been the primary designers of this new food regulatory regime. These groups have established alliances that compete with national regulators in complex ways. This article analyzes the relationship between public and private sources of food safety regulation by examining standards adopted by the Codex Alimentarius Commission, a food safety organization jointly run by the Food and Agricultural Organization and the World Health Organization, and GlobalG.A.P., a farm assurance program created in the late 1990s by supermarket chains and their major suppliers which has now expanded into a global certifying coalition. While Codex standards are adopted, often as written, by national food safety regulators who are principal drivers of the standard setting process, customers for agricultural products in many countries now demand evidence of GlobalG.A.P. certification as a prerequisite for doing business. This article tests not only the durability and strength of private sector standard setting in the food safety system, but also the desirability of that system as an alternative to formal, governmental processes embodied, for our purposes, in the standards adopted by Codex. In many cases, official standards and GlobalG.A.P. standards clash in ways that implicate not only food safety but the flow of agricultural products in the global trading system. The article analyzes current weaknesses in both regimes and possibilities for change that will better reconcile the two competing systems.
NASA Astrophysics Data System (ADS)
Nikolaev, V. N.; Titov, D. V.; Syryamkin, V. I.
2018-05-01
A comparative assessment is made of the channel capacity of different variants of the structural organization of automated information processing systems. A model for assessing information processing time, depending on the type of standard elements and their structural organization, is developed.
29 CFR 1956.10 - Specific criteria.
Code of Federal Regulations, 2010 CFR
2010-07-01
... develop or adopt such standards. Indices of the effectiveness of standards and procedures for the... agencies shall have the authority through appropriate legal process to compel such entry. (f) Prohibition... plan. Subject to the results of evaluations, conformity with the Standards for a Merit System of...
[Description of the ISO 9001/2000 certification process in the parenteral nutrition area].
Miana Mena, M T; Fontanals Martínez, S; López Púa, Y; López Suñé, E; Codina Jané, C; Ribas Sala, J
2007-01-01
In order to guarantee quality and safety and to increase user satisfaction, healthcare organisations have integrated quality management systems into their structures. This study describes the process for introducing the UNE-EN-ISO-9001/2000 standard in the parenteral nutrition area. A multidisciplinary group established the scope of the standard, focusing on transcription, preparation, dispensation and microbiological control. A detailed procedure describing the sequences of circuits and associated activities, the responsible staff and the action guidelines to be followed was established. Quality and activity markers were also established. This process has enabled a standard system to be implemented, with its operation perfectly described and documented, allowing its stages to be traceable and supervised. As there is no record of the data obtained beforehand, no direct comparison can be made; its evolution must therefore be analysed in the future.
The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies
NASA Technical Reports Server (NTRS)
Mulqueen, Jack; Jones, David; Hopkins, Randy
2011-01-01
This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies, including human space exploration missions, space transportation system studies, and in-space science missions. The paper describes the design team structure and specialized analytical tools that have been developed to enable a unique rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition, and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined, beginning with the study planning process, followed by definition of ground rules and assumptions, definition of study trades, mission analysis, and subsystem analyses leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives, from technology definition and requirements definition to preliminary design studies, will be addressed. The paper will also describe the applicability of the collaborative engineering process to include an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.
Workflow management systems in radiology
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim
1998-07-01
In a situation of shrinking health care budgets, increasing cost pressure, and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely re-design their processes. Although information technology is agreed to potentially contribute to cost reduction and efficiency improvement, the real success factors are the re-definition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow-oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. We discuss the need for and the benefits of such an approach, emphasize the separation of the workflow management system from the application systems, and examine the consequences that arise for the architecture of workflow-oriented information systems. This includes an appropriate workflow terminology and the definition of standard interfaces for workflow-aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited for a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process-oriented and application independent, appear suitable for use in radiology.
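The architectural point, a workflow enactment service kept separate from its client applications behind a small standardized interface, can be sketched as follows. Class, method, and activity names are illustrative assumptions; real WfMS interfaces are considerably richer.

    # Toy separation of workflow enactment from applications: the engine owns the
    # process definition and case state; clients only fetch and complete work items.
    class WorkflowEngine:
        def __init__(self, process_definition):
            self.steps = process_definition      # ordered list of activity names
            self.cases = {}                      # case_id -> index of current step

        def start_case(self, case_id):
            self.cases[case_id] = 0

        def next_work_item(self, case_id):
            i = self.cases[case_id]
            return self.steps[i] if i < len(self.steps) else None

        def complete(self, case_id):
            self.cases[case_id] += 1

    # A radiology reporting flow as a process definition (activity names illustrative).
    engine = WorkflowEngine(["register_patient", "acquire_images",
                             "read_study", "sign_report"])
    engine.start_case("case-001")
    while (item := engine.next_work_item("case-001")) is not None:
        print("dispatch to client application:", item)   # a RIS/PACS client would act here
        engine.complete("case-001")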
An Approach for Implementation of Project Management Information Systems
NASA Astrophysics Data System (ADS)
Běrziša, Solvita; Grabis, Jānis
Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and processes and on existing XML-based representations of project management. A demonstration example of a project management information system's configuration is provided.
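A minimal sketch of the configuration step: read a standardized XML methodology specification and transform it into a configuration structure for the target system. The schema and element names below are invented for illustration; the chapter's actual specification framework is richer.

    import xml.etree.ElementTree as ET

    # Hypothetical specification snippet; real frameworks define richer schemas.
    spec = """<pm-methodology name="demo">
      <phase name="initiation"><artifact>charter</artifact></phase>
      <phase name="planning"><artifact>schedule</artifact><artifact>risk-register</artifact></phase>
    </pm-methodology>"""

    root = ET.fromstring(spec)
    config = {phase.get("name"): [a.text for a in phase.findall("artifact")]
              for phase in root.findall("phase")}
    print(config)   # transformation output: e.g., create one PMIS workspace per phase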
40 CFR 61.132 - Standard: Process vessels, storage tanks, and tar-intercepting sumps.
Code of Federal Regulations, 2010 CFR
2010-07-01
... POLLUTANTS National Emission Standard for Benzene Emissions from Coke By-Product Recovery Plants § 61.132... system, or other enclosed point in the by-product recovery process where the benzene in the gas will be... or operator of a furnace coke by-product recovery plant also shall comply with the requirements of...
76 FR 16728 - Announcement of the American Petroleum Institute's Standards Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-25
... voluntary standards for equipment, materials, operations, and processes for the petroleum and natural gas... Techniques for Designing and/or Optimizing Gas-lift Wells and Systems, 1st Ed. RP 13K, Chemical Analysis of... Q2, Quality Management Systems for Service Supply Organizations for the Petroleum and Natural Gas...
Types of Standard Systems and Categories of Measurement.
ERIC Educational Resources Information Center
Aftanas, Marion S.
The term measurement has been used in a number of different sub-areas of psychology without an explicit recognition of the commonalities and potential differences in measurement characteristics. Analysis of these measurement situations reveals that the one common factor is that a mechanism or discriminative process, that is a standard system of…
E-Business Reporting: Towards a Global Standard for Financial Reporting Systems Using XBRL
ERIC Educational Resources Information Center
Long, Margaret J.
2013-01-01
Reporting systems can provide transparency into financial markets necessary for a sustainable, prosperous global economy. The most widely used global platform for exchanging electronic information about companies to regulatory bodies is XBRL. Standards for this platform are in the process of becoming legally harmonized, but not all countries are…
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1993-01-01
This paper presents an overview of the application of the Space Generic Open Avionics Architecture (SGOAA) to the Space Shuttle Data Processing System (DPS) architecture design. This application has been performed to validate the SGOAA and its potential use in flight-critical systems. The paper summarizes key elements of the Space Shuttle avionics architecture, data processing system requirements, and software architecture as currently implemented. It then summarizes the SGOAA architecture and describes a tailoring of the SGOAA to the Space Shuttle. The SGOAA consists of a generic system architecture for the entities in spacecraft avionics, a generic processing external and internal hardware architecture, a six-class model of interfaces, and functional subsystem architectures for data services and operations control capabilities. It has been proposed as an avionics architecture standard to the National Aeronautics and Space Administration (NASA), through its Strategic Avionics Technology Working Group, and is being considered by the Society of Automotive Engineers (SAE) as an SAE avionics standard. This architecture was developed for the Flight Data Systems Division of JSC by the Lockheed Engineering and Sciences Company, Houston, Texas.
ERIC Educational Resources Information Center
Haritonov, R. P.
1971-01-01
An important feature of standardization work in the Soviet Union is the preparation and establishment of State standards enabling unified systems to be introduced for documentation, classification, coding and technical and economic information, as well as standards for all kinds of information storage media. (Author/MM)
40 CFR 267.204 - What air emission standards apply?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 26 2010-07-01 2010-07-01 false What air emission standards apply? 267... PERMIT Tank Systems § 267.204 What air emission standards apply? You must manage all hazardous waste... incinerator, flame, boiler, process heater, condenser, and carbon absorption unit. ...
Integration and Interoperability: An Analysis to Identify the Attributes for System of Systems
2008-09-01
divisions of the enterprise. Examples of the current I2 are: • a nightly feed of e-learning information is captured through an automated and...standardized process throughout the enterprise and • the LMS has been integrated with SkillSoft, a third-party e-learning software system, (http...Command (JITC) is responsible for testing all programs that utilize standard interfaces to specific global nets or systems. Many times programs that
Wang, Tieyu; Zhou, Yunqiao; Bi, Cencen; Lu, Yonglong; He, Guizhen; Giesy, John P
2017-07-01
There is a need to formulate water environment standards (WESs) from the current water quality criteria (WQC) in China. To this end, we briefly summarize typical mechanisms applied in several countries with longer histories of developing WESs and identify three limitations to formulating WESs in China. After analyzing feasibility factors, including economic development, scientific support capability, and environmental policies, we conclude that China is not yet ready for a complete change from its current nationwide unified WES system to a local-standard-based system. Thus, we propose a framework for transformation from WQC to WESs in China. The framework consists of three parts: responsibilities, processes, and policies. The responsibilities include research authorization, development of guidelines, and collection of information at both national and local levels; the processes include four steps and an impact factor system to establish water quality standards; and the policies include seven specific proposals. Copyright © 2016. Published by Elsevier B.V.
FPGA-based firmware model for extended measurement systems with data quality monitoring
NASA Astrophysics Data System (ADS)
Wojenski, A.; Pozniak, K. T.; Mazon, D.; Chernyshova, M.
2017-08-01
Modern physics experiments require the construction of advanced, modular measurement systems for data processing and registration purposes. Components are often designed in one of the common mechanical and electrical standards, e.g., VME or uTCA. The paper focuses on measurement systems using FPGAs as data processing blocks, especially for plasma diagnostics using GEM detectors with data quality monitoring. The article proposes a standardized model of HDL FPGA firmware implementation for use in a wide range of different measurement systems. Particular effort was made to support flexible implementation of data quality monitoring along with dynamic selection of source data. The paper discusses a standard measurement system model, followed by a detailed model of FPGA firmware for modular measurement systems. Both functional blocks and data buses are considered. In the summary, the necessary blocks and signal lines are described. Firmware implemented following the presented rules should yield a modular design in which different parts can easily be changed. The key benefit is the construction of a universal, modular HDL design that can be applied in different measurement systems with simple adjustments.
106-17 Telemetry Standards Chapter 1
2017-07-01
Telemetry Standards, RCC Standard 106-17 Chapter 1, July 2017 1-1 CHAPTER 1 Introduction The Telemetry Standards address the here-to-date...generally devoted to a different element of the telemetry system or process. Chapters 21 through 28 address the topic of network telemetry. These...Commonly used terms are defined in standard reference glossaries and dictionaries. Definitions of terms with special applications are included when
A Preliminary Anthropometry Standard for Australian Army Equipment Evaluation
2014-08-01
UNCLASSIFIED Authors Mark Edwards Land Division Mark Edwards holds an undergraduate degree in Industrial Design, a Masters in Ergonomics ...equipment. Given that a built system is not a requirement of the processes described, this standard can also be used early in the design process to de-risk the design process. It must be noted that the data provided in this report are representative of the 2012 ADF Army population. The impacts
Achieving mask order processing automation, interoperability and standardization based on P10
NASA Astrophysics Data System (ADS)
Rodriguez, B.; Filies, O.; Sadran, D.; Tissier, Michel; Albin, D.; Stavroulakis, S.; Voyiatzis, E.
2007-02-01
Last year the MUSCLE (Masks through User's Supply Chain: Leadership by Excellence) project was presented; here we report on its progress. A key process in mask supply chain management is the exchange of technical information for ordering masks. This process is large, complex, company-specific, and error-prone, and leads to longer cycle times and higher costs due to missing or wrong inputs. Its automation and standardization could produce significant benefits. We need to agree on a standard for mandatory and optional parameters, and also on a common way to describe parameters when ordering. A system was created to improve performance in terms of Key Performance Indicators (KPIs) such as cycle time and cost of production. This tool allows us to evaluate and measure the effect of individual factors, as well as the effect of implementing the improvements of the complete project. Next, a benchmark study and a gap analysis were performed. These studies show the feasibility of standardization, as there is a large overlap in requirements. We see that the SEMI P10 standard needs enhancements. A format supporting the standard is required, and XML offers the ability to describe P10 in a flexible way. Beyond using XML for P10, the semantics of the mask order should also be addressed. A system design and requirements for a reference implementation of a P10-based management system are presented, covering a mechanism for evolution and version management and a design for P10 editing and data validation.
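As a rough illustration of the kind of automation the project targets, the sketch below checks a hypothetical XML-encoded mask order against an assumed set of mandatory parameters; the tag names and the mandatory/optional split are invented and do not reproduce the SEMI P10 schema:

```python
# Sketch: catch incomplete mask orders before they cost cycle time.
import xml.etree.ElementTree as ET

MANDATORY = {"mask_id", "grid_size", "pattern_file"}  # assumed mandatory set
OPTIONAL = {"pellicle_type"}                          # optional: allowed, not required

ORDER = """
<mask_order>
  <mask_id>M-42</mask_id>
  <grid_size>0.25</grid_size>
</mask_order>
"""

def validate(order_xml: str) -> list:
    """Return the mandatory parameters missing from an order document."""
    present = {child.tag for child in ET.fromstring(order_xml)}
    return sorted(MANDATORY - present)

print(validate(ORDER))  # ['pattern_file'] -> incomplete order flagged at entry
```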
[HL7 standard--features, principles, and methodology].
Koncar, Miroslav
2005-01-01
The mission of the HL7 Inc. non-profit organization is to provide standards for the exchange, management, and integration of data that support clinical patient care and the management, delivery, and evaluation of healthcare services. As the standards developed by HL7 Inc. represent the world's most influential standardization efforts in the field of medical informatics, the HL7 family of standards has been recognized by the technical and scientific community as the foundation for the next generation of healthcare information systems. Versions 1 and 2 of the HL7 standard solved many issues, but also demonstrated the size and complexity of the health information sharing problem. As the solution, a completely new methodology has been adopted, encompassed in the HL7 Version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all derived domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project in the Republic of Croatia in 2002, the decision was to go directly to Version 3. The target scope of work includes clinical, financial, and administrative data management in the domain of healthcare processes. By using the HL7v3 standardized methodology we were able to completely map the Croatian primary healthcare domain to HL7v3 artefacts. Further refinement processes planned for the future will provide semantic interoperability and detailed descriptions of all elements in HL7 messages. Our HL7 Business Component is continuously studying different legacy applications, laying a solid foundation for their integration into an HL7-enabled communication environment.
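For context, the HL7 Version 2.x messages mentioned above use a pipe-and-hat syntax that can be parsed in a few lines, whereas Version 3 replaces this with RIM-derived XML structures. The sketch below parses an invented v2-style message and is illustrative only:

```python
# Minimal parse of an HL7 v2.x message: segments separated by carriage
# returns, fields by '|', components by '^'. Message content is invented.
RAW = "MSH|^~\\&|HIS|HOSP|LAB|HOSP|202301010830||ADT^A01|MSG0001|P|2.3\r" \
      "PID|1||12345^^^HOSP||Doe^John||19700101|M"

def parse_v2(message: str) -> dict:
    """Split a v2 message into {segment_id: [fields, ...]}."""
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields[1:]
    return segments

msg = parse_v2(RAW)
print(msg["MSH"][7])                   # 'ADT^A01' -- message type
print(msg["PID"][4].split("^")[0:2])   # ['Doe', 'John'] -- name components
```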
Software And Systems Engineering Risk Management
2010-04-01
RSKM 2004 COSO Enterprise RSKM Framework 2006 ISO/IEC 16085 Risk Management Process 2008 ISO/IEC 12207 Software Lifecycle Processes 2009 ISO/IEC...1 Software And Systems Engineering Risk Management John Walz VP Technical and Conferences Activities, IEEE Computer Society Vice-Chair Planning...Software & Systems Engineering Standards Committee, IEEE Computer Society US TAG to ISO TMB Risk Management Working Group Systems and Software
NASA Technical Reports Server (NTRS)
West, R. S.
1975-01-01
The system is described as a computer-based system designed to track the status of problems and corrective actions pertinent to Space Shuttle hardware. The input, processing, output, and performance requirements of the system are presented along with standard display formats and examples. Operational requirements, hardware requirements, and test requirements are also included.
Mapping Perinatal Nursing Process Measurement Concepts to Standard Terminologies.
Ivory, Catherine H
2016-07-01
The use of standard terminologies is an essential component for using data to inform practice and conduct research; perinatal nursing data standardization is needed. This study explored whether 76 distinct process elements important for perinatal nursing were present in four American Nurses Association-recognized standard terminologies. The 76 process elements were taken from a valid paper-based perinatal nursing process measurement tool. Using terminology-supported browsers, the elements were manually mapped to the selected terminologies by the researcher. A five-member expert panel validated 100% of the mapping findings. The majority of the process elements (n = 63, 83%) were present in SNOMED-CT, 28% (n = 21) in LOINC, 34% (n = 26) in ICNP, and 15% (n = 11) in CCC. SNOMED-CT and LOINC are terminologies currently recommended for use to facilitate interoperability in the capture of assessment and problem data in certified electronic medical records. Study results suggest that SNOMED-CT and LOINC contain perinatal nursing process elements and are useful standard terminologies to support perinatal nursing practice in electronic health records. Terminology mapping is the first step toward incorporating traditional paper-based tools into electronic systems.
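The coverage figures above amount to checking, for each process element, whether a matching concept exists in each terminology. A toy sketch of that computation follows, with invented elements and terminology content rather than the study's instrument or the real code systems:

```python
# Toy coverage computation: what fraction of process elements map to each terminology?
ELEMENTS = ["assess fetal heart rate", "document cervical exam", "notify provider"]

TERMINOLOGY_CONTENT = {
    "SNOMED-CT": {"assess fetal heart rate", "document cervical exam", "notify provider"},
    "LOINC": {"assess fetal heart rate"},
}

for name, concepts in TERMINOLOGY_CONTENT.items():
    mapped = sum(e in concepts for e in ELEMENTS)
    print(f"{name}: {mapped}/{len(ELEMENTS)} elements mapped "
          f"({100 * mapped / len(ELEMENTS):.0f}%)")
```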
The effect of image processing on the detection of cancers in digital mammography.
Warren, Lucy M; Given-Wilson, Rosalind M; Wallis, Matthew G; Cooke, Julie; Halling-Brown, Mark D; Mackenzie, Alistair; Chakraborty, Dev P; Bosmans, Hilde; Dance, David R; Young, Kenneth C
2014-08-01
OBJECTIVE. The objective of our study was to investigate the effect of image processing on the detection of cancers in digital mammography images. MATERIALS AND METHODS. Two hundred seventy pairs of breast images (both breasts, one view) were collected from eight systems using Hologic amorphous selenium detectors: 80 image pairs showed breasts containing subtle malignant masses; 30 image pairs, biopsy-proven benign lesions; 80 image pairs, simulated calcification clusters; and 80 image pairs, no cancer (normal). The 270 image pairs were processed with three types of image processing: standard (full enhancement), low contrast (intermediate enhancement), and pseudo-film-screen (no enhancement). Seven experienced observers inspected the images, locating and rating regions they suspected to be cancer for likelihood of malignancy. The results were analyzed using a jackknife-alternative free-response receiver operating characteristic (JAFROC) analysis. RESULTS. The detection of calcification clusters was significantly affected by the type of image processing: The JAFROC figure of merit (FOM) decreased from 0.65 with standard image processing to 0.63 with low-contrast image processing (p = 0.04) and from 0.65 with standard image processing to 0.61 with film-screen image processing (p = 0.0005). The detection of noncalcification cancers was not significantly different among the image-processing types investigated (p > 0.40). CONCLUSION. These results suggest that image processing has a significant impact on the detection of calcification clusters in digital mammography. For the three image-processing versions and the system investigated, standard image processing was optimal for the detection of calcification clusters. The effect on cancer detection should be considered when selecting the type of image processing in the future.
Trickling Filters. Student Manual. Biological Treatment Process Control.
ERIC Educational Resources Information Center
Richwine, Reynold D.
The textual material for a unit on trickling filters is presented in this student manual. Topic areas discussed include: (1) trickling filter process components (preliminary treatment, media, underdrain system, distribution system, ventilation, and secondary clarifier); (2) operational modes (standard rate filters, high rate filters, roughing…
CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.
Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng
2017-01-01
Standardized terminology is a prerequisite for data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is therefore necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidence model to determine how terminologies are mapped to a standard system, like ICD-10. The model uses mappings from different health care organizations and evaluates the diversity of the mappings to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping result and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.
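A minimal sketch of the aggregation idea, assuming mappings are votes from several organizations and agreement serves as the confidence score; the local term, the ICD-10 codes, and the threshold are illustrative choices, not CrowdMapping's actual rule:

```python
# Sketch: aggregate crowd-submitted mappings and use agreement as confidence.
from collections import Counter

# Candidate mappings of one local term to ICD-10, as submitted by four sites.
SUBMISSIONS = {"local:chest pain": ["R07.4", "R07.4", "R07.4", "I20.9"]}

def consensus(term: str, threshold: float = 0.6):
    votes = Counter(SUBMISSIONS[term])
    code, count = votes.most_common(1)[0]
    confidence = count / sum(votes.values())
    # Low-agreement terms would be routed back to users for rating/review.
    return (code, confidence) if confidence >= threshold else (None, confidence)

print(consensus("local:chest pain"))  # ('R07.4', 0.75)
```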
Saqaeian Nejad Isfahani, Sakineh; Mirzaeian, Razieh; Habibi, Mahbobe
2013-01-01
In supporting a therapeutic approach and medication therapy management, the pharmacy information system acts as one of the central pillars of the information system. This ensures that medication therapy is supported and evaluated with an optimal level of safety and quality, similar to other treatments and services. This research aims to evaluate the performance of the pharmacy information system in three types of hospitals: teaching, private, and Social Services-affiliated. The present study is an applied, descriptive, and analytical study which was conducted on the pharmacy information systems in use in the selected hospitals. The research population included all the users of pharmacy information systems in the selected hospitals, and the research sample was the same as the research population. Researchers collected data using a self-designed checklist developed following the guidelines of the American Society of Health-System Pharmacists, the Pharmaceutical Society of Australia, and the therapeutic guidelines of the Drug Commission of the German Medical Association. The checklist validity was assessed by research supervisors and by pharmacy information system pharmacists and users. To collect data, besides observation, questionnaires were distributed among pharmacy information system pharmacists and users. Finally, the analysis of the data was performed using the SPSS software. The pharmacy information system was found to be semi-automated in 16 hospitals and automated in 3. Regarding the standards in the guidelines issued by the societies of pharmacists, the highest rank in observing the input standards belonged to the Social Services-affiliated hospitals, with a mean score of 32.75, while teaching hospitals gained the highest scores in both processing standards (mean 29.15) and output standards (mean 43.95), and the private hospitals had the lowest mean scores of 23.32, 17.78, and 24.25 in input, process, and output standards, respectively. Based on the findings, the studied hospitals had minimal compliance with the input, output, and processing standards related to the pharmacy information system. It is suggested that the establishment of a team composed of operational managers, computer experts, health information managers, pharmacists, and physicians may contribute to the promotion of the capabilities of the pharmacy information system to focus on health care practitioners' and users' requirements.
Development of a standardized, citywide process for managing smart-pump drug libraries.
Walroth, Todd A; Smallwood, Shannon; Arthur, Karen; Vance, Betsy; Washington, Alana; Staublin, Therese; Haslar, Tammy; Reddan, Jennifer G; Fuller, James
2018-06-15
Development and implementation of an interprofessional consensus-driven process for review and optimization of smart-pump drug libraries and dosing limits are described. The Indianapolis Coalition for Patient Safety (ICPS), which represents 6 Indianapolis-area health systems, identified an opportunity to reduce clinically insignificant alerts that smart infusion pumps present to end users. Through a consensus-driven process, ICPS aimed to identify best practices to implement at individual hospitals in order to establish specific action items for smart-pump drug library optimization. A work group of pharmacists, nurses, and industrial engineers met to evaluate variability within and lack of scrutiny of smart-pump drug libraries. The work group used Lean Six Sigma methodologies to generate a list of key needs and barriers to be addressed in process standardization. The group reviewed targets for smart-pump drug library optimization, including dosing limits, types of alerts reviewed, policies, and safety best practices. The work group also analyzed existing processes at each site to develop a final consensus statement outlining a model process for reviewing alerts and managing smart-pump data. Analysis of the total number of alerts per device across ICPS-affiliated health systems over a 4-year period indicated a 50% decrease (from 7.2 to 3.6 alerts per device per month) after implementation of the model by ICPS member organizations. Through implementation of a standardized, consensus-driven process for smart-pump drug library optimization, ICPS member health systems reduced clinically insignificant smart-pump alerts. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Systems Architecture for a Nationwide Healthcare System.
Abin, Jorge; Nemeth, Horacio; Friedmann, Ignacio
2015-01-01
To provide Internet technology support at the national level, the Nationwide Integrated Healthcare System in Uruguay requires an Information Systems Architecture model. This system has multiple healthcare providers (public and private) and a strong component of supplementary services. Thus, the data processing system should have an architecture that takes this into account, while integrating the central services provided by the Ministry of Public Health. The national electronic health record, as well as other related data processing systems, should be based on this architecture. The architecture model described here conceptualizes a federated framework of electronic health record systems, according to the IHE affinity model, HL7 standards, and local standards on interoperability and security, as well as technical advice provided by AGESIC. It is the outcome of the research done by AGESIC and the Systems Integration Laboratory (LINS) on the development and use of the e-Government Platform since 2008, as well as the research done by the Salud.uy team since 2013.
Health IT for Patient Safety and Improving the Safety of Health IT.
Magrabi, Farah; Ong, Mei-Sing; Coiera, Enrico
2016-01-01
Alongside their benefits, health IT applications can pose new risks to patient safety. Problems with IT have been linked to many different types of clinical errors, including errors in the prescribing and administration of medications, wrong-patient and wrong-site errors, and delays in procedures. There is also growing concern about the risks of data breaches and cyber-security. IT-related clinical errors have their origins in the processes undertaken to design, build, implement, and use software systems in a broader sociotechnical context. Safety can be improved with greater standardization of clinical software and by improving the quality of processes at different points in the technology life cycle, spanning design, build, implementation, and use in clinical settings. Oversight processes can be set up at a regional or national level to ensure that clinical software systems meet specific standards. Certification and regulation are two mechanisms to improve oversight. In the absence of clear standards, guidelines are useful to promote safe design and implementation practices. Processes to identify and mitigate hazards can be formalised via a safety management system. Minimizing new patient safety risks is critical to realizing the benefits of IT.
Technical Standards for Command and Control Information Systems (CCISs) and Information Technology
1994-02-01
formatting, transmitting, receiving, and processing imagery and imagery-related information. The NITFS is in essence the suite of individual standards...also known as Limited Operational Capability-Europe) and the German Joint Analysis System Military Intelligence (JASMIN). Among the approaches being...essence, the other systems utilize a one-level address space where addressing consists of identifying the fire support unit. However, AFATDS utilizes a two
Defense ADP Acquisition Study.
1981-11-30
Logistics ALS - Advanced Logistics System AMP - ADPS Master Plan ANSI - American National Standards Institute APR - Agency Procurement Request ASD(C...Computers IRM - Information Resources Management ISO - International Standards Organization L LCC - Life Cycle Costs LCM - Life Cycle Management LE...management in the process * Lack of a mission orientation * Lack of systems management and life cycle perspectives * Lack of effective leadership
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Petrie, L.M.; Westfall, R.M.
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.
PATRAM '80. Proceedings. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huebner, H.W.
Volume 1 contains papers from the following sessions: Plenary Session; Regulations, Licensing and Standards; LMFBR Systems Concepts; Risk/Safety Assessment I; Systems and Package Design; US Institutional Issues; Risk/Safety Assessment II; Leakage, Leak Rate and Seals; Poster Session A; Operations and Systems Experience I; Manufacturing Processes and Materials; and Quality Assurance and Maintenance. Individual papers were processed. (LM)
Ride quality criteria and the design process. [standards for ride comfort
NASA Technical Reports Server (NTRS)
Ravera, R. J.
1975-01-01
Conceptual designs for advanced ground transportation systems often hinge on obtaining acceptable vehicle ride quality while attempting to keep the total guideway cost (initial and subsequent maintenance) as low as possible. Two ride quality standards used extensively in work sponsored by the U.S. Department of Transportation (DOT) are the DOT-Urban Tracked Air Cushion Vehicle (UTACV) standard and the International Standards Organization (ISO) reduced ride comfort criteria. These standards are reviewed and some of the deficiencies, which become apparent when trying to apply them in practice, are noted. Through the use of a digital simulation, the impact of each of these standards on an example design process is examined. It is shown that meeting the ISO specification for the particular vehicle/guideway case investigated is easier than meeting the UTACV standard.
Traceability of Software Safety Requirements in Legacy Safety Critical Systems
NASA Technical Reports Server (NTRS)
Hill, Janice L.
2007-01-01
How can traceability of software safety requirements be created for legacy safety critical systems? Requirements in safety standards are imposed most times during contract negotiations. On the other hand, there are instances where safety standards are levied on legacy safety critical systems, some of which may be considered for reuse for new applications. Safety standards often specify that software development documentation include process-oriented and technical safety requirements, and also require that system and software safety analyses are performed supporting technical safety requirements implementation. So what can be done if the requisite documents for establishing and maintaining safety requirements traceability are not available?
A Converter from the Systems Biology Markup Language to the Synthetic Biology Open Language.
Nguyen, Tramy; Roehner, Nicholas; Zundel, Zach; Myers, Chris J
2016-06-17
Standards are important to synthetic biology because they enable exchange and reproducibility of genetic designs. This paper describes a procedure for converting between two standards: the Systems Biology Markup Language (SBML) and the Synthetic Biology Open Language (SBOL). SBML is a standard for behavioral models of biological systems at the molecular level. SBOL describes structural and basic qualitative behavioral aspects of a biological design. Converting SBML to SBOL enables a consistent connection between behavioral and structural information for a biological design. The conversion process described in this paper leverages Systems Biology Ontology (SBO) annotations to enable inference of a design's qualitative function.
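A highly simplified view of the SBO-driven inference step, assuming plain dictionaries in place of the real SBML/SBOL object models; the two SBO terms are real ontology entries, but the mapping table and data structures are illustrative, not the converter's implementation:

```python
# Sketch: an SBO term on an SBML species suggests the qualitative role the
# corresponding component plays in the SBOL design.
SBO_TO_ROLE = {
    "SBO:0000252": "protein",      # polypeptide chain
    "SBO:0000251": "dna_region",   # deoxyribonucleic acid
}

sbml_species = [
    {"id": "TetR", "sboTerm": "SBO:0000252"},
    {"id": "pTet", "sboTerm": "SBO:0000251"},
]

sbol_components = [
    {"displayId": s["id"], "role": SBO_TO_ROLE.get(s["sboTerm"], "unknown")}
    for s in sbml_species
]
print(sbol_components)
```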
Size reduction techniques for vital compliant VHDL simulation models
Rich, Marvin J.; Misra, Ashutosh
2006-08-01
A method and system select delay values from a VHDL standard delay file that correspond to an instance of a logic gate in a logic model. The system then collects all the delay values of the selected instance and builds super generics for the rise-time and the fall-time of that instance. The system repeats this process for every delay value in the standard delay file (310) that corresponds to an instance of a logic gate in the logic model. The system then outputs a reduced-size standard delay file (314) containing the super generics for every instance of every logic gate in the logic model.
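A sketch of the reduction idea follows, assuming a simplified record format and worst-case (maximum) aggregation; both are assumptions of this illustration, not details taken from the patent:

```python
# Sketch: collapse all rise/fall delays per gate instance into one
# "super generic" pair, shrinking the delay file.
from collections import defaultdict

# (instance, rise_delay_ns, fall_delay_ns) records as read from a delay file
SDF_RECORDS = [
    ("u1/and2", 0.12, 0.10),
    ("u1/and2", 0.15, 0.11),   # same instance, another timing arc
    ("u2/or2", 0.20, 0.18),
]

super_generics = defaultdict(lambda: [0.0, 0.0])
for inst, rise, fall in SDF_RECORDS:
    g = super_generics[inst]
    g[0] = max(g[0], rise)   # one worst-case rise time per instance
    g[1] = max(g[1], fall)   # one worst-case fall time per instance

for inst, (rise, fall) in super_generics.items():
    print(f"{inst}: rise={rise} fall={fall}")  # far fewer entries than the full file
```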
Code of Federal Regulations, 2012 CFR
2012-04-01
... HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS INTELLIGENT TRANSPORTATION SYSTEM ARCHITECTURE AND STANDARDS § 940.3 Definitions. Intelligent Transportation System (ITS... projects or groups of projects. Systems engineering is a structured process for arriving at a final design...
Code of Federal Regulations, 2013 CFR
2013-04-01
... HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS INTELLIGENT TRANSPORTATION SYSTEM ARCHITECTURE AND STANDARDS § 940.3 Definitions. Intelligent Transportation System (ITS... projects or groups of projects. Systems engineering is a structured process for arriving at a final design...
Code of Federal Regulations, 2014 CFR
2014-04-01
... HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION INTELLIGENT TRANSPORTATION SYSTEMS INTELLIGENT TRANSPORTATION SYSTEM ARCHITECTURE AND STANDARDS § 940.3 Definitions. Intelligent Transportation System (ITS... projects or groups of projects. Systems engineering is a structured process for arriving at a final design...
Ishii, Lisa; Pronovost, Peter J; Demski, Renee; Wylie, Gill; Zenilman, Michael
2016-06-01
An increasing volume of ambulatory surgeries has led to an increase in the number of ambulatory surgery centers (ASCs). Some academic health systems have aligned with ASCs to create a more integrated care delivery system. Yet, these centers are diverse in many areas, including specialty types, ownership models, management, physician employment, and regulatory oversight. Academic health systems then face challenges in integrating these ASCs into their organizations. Johns Hopkins Medicine created the Ambulatory Surgery Coordinating Council in 2014 to manage, standardize, and promote peer learning among its eight ASCs. The Armstrong Institute for Patient Safety and Quality provided support and a model for this organization through its quality management infrastructure. The physician-led council defined a mission and created goals to identify best practices, uniformly provide the highest-quality patient-centered care, and continuously improve patient outcomes and experience across ASCs. Council members built trust and agreed on a standardized patient safety and quality dashboard to report measures that include regulatory, care process, patient experience, and outcomes data. The council addressed unintentional outcomes and process variation across the system and agreed to standard approaches to optimize quality. Council members also developed a process for identifying future goals, standardizing care practices and electronic medical record documentation, and creating quality and safety policies. The early success of the council supports the continuation of the Armstrong Institute model for physician-led quality management. Other academic health systems can learn from this model as they integrate ASCs into their complex organizations.
NASA Technical Reports Server (NTRS)
Moe, Karen L.; Perkins, Dorothy C.; Szczur, Martha R.
1987-01-01
The user support environment (USE) which is a set of software tools for a flexible standard interactive user interface to the Space Station systems, platforms, and payloads is described in detail. Included in the USE concept are a user interface language, a run time environment and user interface management system, support tools, and standards for human interaction methods. The goals and challenges of the USE are discussed as well as a methodology based on prototype demonstrations for involving users in the process of validating the USE concepts. By prototyping the key concepts and salient features of the proposed user interface standards, the user's ability to respond is greatly enhanced.
Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim
2005-01-01
With the introduction of the ICD-10 as the standard for diagnoses, the development of an electronic representation of its complete content, inherent semantics, and coding rules is necessary. Our concept refers to current efforts of CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the Extensible Markup Language (XML) that facilitates integration into current information systems or coding software, taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard tools for processing, which helps to develop interoperable applications.
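A small sketch of what an XML representation of the classification hierarchy might look like and how standard tools traverse it; the element and attribute names are invented for illustration and do not reproduce the authors' or CEN/TC 251's schema:

```python
# Sketch: a fragment of the ICD-10 hierarchy as XML, walked with stdlib tools.
import xml.etree.ElementTree as ET

ICD_FRAGMENT = """
<chapter code="IX" title="Diseases of the circulatory system">
  <block code="I20-I25" title="Ischaemic heart diseases">
    <category code="I21" title="Acute myocardial infarction"/>
  </block>
</chapter>
"""

root = ET.fromstring(ICD_FRAGMENT)
for node in root.iter():
    print(node.get("code"), "-", node.get("title"))
```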
[Standardization and modeling of surgical processes].
Strauss, G; Schmitz, P
2016-12-01
Due to the technological developments around the operating room, surgery in the twenty-first century is undergoing a paradigm shift. Which technologies have already been integrated into the surgical routine? How can a favorable cost-benefit balance be achieved by the implementation of new software-based assistance systems? This article presents the state of the art technology as exemplified by a semi-automated operation system for otorhinolaryngology surgery. The main focus is on systems for implementation of digital handbooks and navigational functions in situ. On the basis of continuous development in digital imaging, decisions may be facilitated by individual patient models, thus allowing procedures to be optimized. The ongoing digitization and linking of all relevant information enable a high level of standardization in terms of operating procedures. This may be used by assistance systems as a basis for complete documentation and high process reliability. Automation of processes in the operating room results in an increase in quality, precision, and standardization so that the effectiveness and efficiency of treatment can be improved; however, detrimental consequences, such as loss of skills and placing too much faith in technology, must be avoided by adapted training concepts.
Preliminary design review package for the solar heating and cooling central data processing system
NASA Technical Reports Server (NTRS)
1976-01-01
The Central Data Processing System (CDPS) is designed to transform the raw data collected at remote sites into performance evaluation information for assessing the performance of solar heating and cooling systems. Software requirements for the CDPS are described. The programming standards to be used in development, documentation, and maintenance of the software are discussed along with the CDPS operations approach in support of daily data collection and processing.
A Lifecycle Approach to Brokered Data Management for Hydrologic Modeling Data Using Open Standards.
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Booth, N.; Kunicki, T.; Walker, J.
2012-12-01
The U.S. Geological Survey Center for Integrated Data Analytics has formalized an information management architecture to facilitate hydrologic modeling and subsequent decision support throughout a project's lifecycle. The architecture is based on open standards and open source software to decrease the adoption barrier and to build on existing, community-supported software. The components of this system have been developed and evaluated to support data management activities of the interagency Great Lakes Restoration Initiative, the Department of Interior's Climate Science Centers, and the WaterSmart National Water Census. Much of the research and development of this system has been in cooperation with international interoperability experiments conducted within the Open Geospatial Consortium. Community-developed standards and software, implemented to meet the unique requirements of specific disciplines, are used as a system of interoperable, discipline-specific data types and interfaces. This approach has allowed adoption of existing software that satisfies the majority of system requirements. Four major features of the system include: 1) assistance in model parameter and forcing creation from large enterprise data sources; 2) conversion of model results and calibrated parameters to standard formats, making them available via standard web services; 3) tracking a model's processes, inputs, and outputs as a cohesive metadata record, allowing provenance tracking via reference to web services; and 4) generalized decision support tools which rely on a suite of standard data types and interfaces, rather than particular manually curated model-derived datasets. Recent progress made in data and web service standards related to sensor and/or model derived station time series, dynamic web processing, and metadata management are central to this system's function and will be presented briefly along with a functional overview of the applications that make up the system. As the separate pieces of this system progress, they will be combined and generalized to form a sort of social network for nationally consistent hydrologic modeling.
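Feature (3), provenance tracking by reference to web services, can be pictured as one metadata record per model run; the sketch below uses hypothetical URLs and keys, not the actual USGS system's record format:

```python
# Sketch: a model run recorded as a single provenance record whose inputs and
# outputs are references to standard web services rather than copied files.
import json

provenance_record = {
    "model": "example-hydro-model-v1",
    "inputs": [{"name": "precip_forcing",
                "service": "https://example.gov/wps?request=Execute&id=precip"}],
    "outputs": [{"name": "streamflow",
                 "service": "https://example.gov/sos?request=GetObservation&id=q"}],
    "processes": ["parameterize", "calibrate", "simulate"],
}
print(json.dumps(provenance_record, indent=2))
```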
Virtual file system on NoSQL for processing high volumes of HL7 messages.
Kimura, Eizen; Ishihara, Ken
2015-01-01
The Standardized Structured Medical Information Exchange (SS-MIX) is intended to be the standard repository for HL7 messages, but it depends on a local file system and its scalability is therefore limited. We implemented a virtual file system using NoSQL to incorporate modern computing technology into SS-MIX and to allow the system to integrate local patient IDs from different healthcare systems into a universal system. We discuss its implementation using the database MongoDB and describe its performance in a case study.
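A minimal sketch of the idea, assuming MongoDB via pymongo, an invented document layout, and an invented ID-mapping table (none of which are defined by the SS-MIX specification); it requires a running MongoDB instance:

```python
# Sketch: HL7 messages keyed by a universal patient ID in MongoDB instead of
# local directories.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
messages = client["ssmix"]["hl7_messages"]
messages.create_index([("universal_pid", 1), ("msg_type", 1)])

# Map a hospital-local patient ID to the universal ID before storing.
LOCAL_TO_UNIVERSAL = {("hospital-A", "12345"): "U-000042"}

def store(facility: str, local_pid: str, msg_type: str, raw_message: str):
    messages.insert_one({
        "universal_pid": LOCAL_TO_UNIVERSAL[(facility, local_pid)],
        "facility": facility,
        "msg_type": msg_type,
        "raw": raw_message,  # the HL7 payload itself stays opaque
    })

store("hospital-A", "12345", "ADT^A01", "MSH|^~\\&|...")
print(messages.find_one({"universal_pid": "U-000042"})["msg_type"])  # ADT^A01
```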
Performance measures for rural transportation systems : guidebook.
DOT National Transportation Integrated Search
2006-06-01
This Performance Measures for Rural Transportation Systems Guidebook provides a standardized and supportable performance measurement process that can be applied to transportation systems in rural areas. The guidance included in this guidebook was...
A Study on the Development of Service Quality Index for Incheon International Airport
NASA Technical Reports Server (NTRS)
Lee, Kang Seok; Lee, Seung Chang; Hong, Soon Kil
2003-01-01
The main purpose of this study is to develop an Omnibus Monitoring System (OMS) for internal management, which will make it possible to establish standards, identify matters to be improved, and assess their treatment in a systematic way. This is done by developing subjective and objective estimation tools that combine usage importance, perceived level, and a composite index for each principal service item at the international airport. The study was directed at developing a metric analysis tool: utilizing quantitative secondary data, analyzing perceived data through airport user surveys, systematizing the data collection-input-analysis process, presenting results graphically, planning service encounters and assigning control attribution, and ensuring competitiveness at minimal international standards. It was important to set up a pre-investigation plan on the basis of the existing foreign literature and actual inspections of international airports. Two tasks were then executed together on the basis of this pre-investigation: one was developing subjective estimation standards for departing passengers, arriving passengers, and airport residents, and the other was developing objective standards as complementary methods. The study proceeded to monitor airport services both regularly and irregularly by developing a software system for operating the standards, after ensuring the reliability and feasibility of the estimation standards by substantive and statistical means.
Bauer, Daniel R; Otter, Michael; Chafin, David R
2018-01-01
Studying and developing preanalytical tools and technologies for the purpose of obtaining high-quality samples for histological assays is a growing field. Currently, there does not exist a standard practice for collecting, fixing, and monitoring these precious samples. There has been some advancement in standardizing collection for the highest profile tumor types, such as breast, where HER2 testing drives therapeutic decisions. This review examines the area of tissue collection, transport, and monitoring of formalin diffusion and details a prototype system that could be used to help standardize tissue collection efforts. We have surveyed recent primary literature sources and conducted several site visits to understand the most error-prone processes in histology laboratories. This effort identified errors that resulted from sample collection techniques and subsequent transport delays from the operating room (OR) to the histology laboratories. We have therefore devised a prototype sample collection and transport concept. The system consists of a custom data logger and cold transport box and takes advantage of a novel cold + warm (named 2 + 2) fixation method. This review highlights the beneficial aspects of standardizing tissue collection, fixation, and monitoring. In addition, a prototype system is introduced that could help standardize these processes and is compatible with use directly in the OR and from remote sites.
Electrophoresis gel image processing and analysis using the KODAK 1D software.
Pizzonia, J
2001-06-01
The present article reports on the performance of the KODAK 1D Image Analysis Software for the acquisition of information from electrophoresis experiments and highlights the utility of several mathematical functions for subsequent image processing, analysis, and presentation. Digital images of Coomassie-stained polyacrylamide protein gels containing molecular weight standards and ethidium bromide stained agarose gels containing DNA mass standards are acquired using the KODAK Electrophoresis Documentation and Analysis System 290 (EDAS 290). The KODAK 1D software is used to optimize lane and band identification using features such as isomolecular weight lines. Mathematical functions for mass standard representation are presented, and two methods for estimation of unknown band mass are compared. Given the progressive transition of electrophoresis data acquisition and daily reporting in peer-reviewed journals to digital formats ranging from 8-bit systems such as EDAS 290 to more expensive 16-bit systems, the utility of algorithms such as Gaussian modeling, which can correct geometric aberrations such as clipping due to signal saturation common at lower bit depth levels, is discussed. Finally, image-processing tools that can facilitate image preparation for presentation are demonstrated.
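The value of the Gaussian modeling mentioned above can be shown on synthetic data: fitting only the unclipped samples of a band profile recovers the true peak height lost to saturation. This is a generic illustration using scipy, not the KODAK 1D algorithm itself:

```python
# Sketch: recover the peak of a saturation-clipped band via Gaussian fitting.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, center, width):
    return amplitude * np.exp(-((x - center) ** 2) / (2 * width ** 2))

x = np.arange(50, dtype=float)
true_band = gaussian(x, amplitude=300.0, center=25.0, width=4.0)
clipped = np.minimum(true_band, 255.0)  # 8-bit saturation clips the peak

# Fit only the unclipped samples; the model extrapolates the lost peak height.
mask = clipped < 255.0
params, _ = curve_fit(gaussian, x[mask], clipped[mask], p0=(255.0, 25.0, 5.0))
print(f"recovered amplitude ~ {params[0]:.0f} (true 300, clipped at 255)")
```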
Ionization-Assisted Getter Pumping for Ultra-Stable Trapped Ion Frequency Standards
NASA Technical Reports Server (NTRS)
Tjoelker, Robert L.; Burt, Eric A.
2010-01-01
A method eliminates (or recovers from) residual methane buildup in getter-pumped atomic frequency standard systems by applying ionizing assistance. Ultra-high stability trapped ion frequency standards for applications requiring very high reliability, and/or low power and mass (both for ground-based and space-based platforms) benefit from using sealed vacuum systems. These systems require careful material selection and system processing (cleaning and high-temperature bake-out). Even under the most careful preparation, residual hydrogen outgassing from vacuum chamber walls typically limits the base pressure. Non-evaporable getter pumps (NEGs) provide a convenient pumping option for sealed systems because of low mass and volume, and no power once activated. An ion gauge in conjunction with a NEG can be used to provide a low mass, low-power method for avoiding the deleterious effects of methane buildup in high-performance frequency standard vacuum systems.
Multi-core processing and scheduling performance in CMS
NASA Astrophysics Data System (ADS)
Hernández, J. M.; Evans, D.; Foulkes, S.
2012-12-01
Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to utilize the multi-core architecture effectively. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry, and conditions data, resulting in much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model of computing resource allocation, departing from the standard single-core allocation per job. The experiment job management system needs control over a larger quantum of resources, since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g., I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues, compared to the standard single-core processing workflows.
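The memory saving of the multi-core model comes from sharing read-only data among workers. A generic sketch, not the CMS framework: on fork-based platforms, data loaded before the worker pool is created is shared copy-on-write rather than duplicated per core:

```python
# Sketch: large read-only data loaded before the fork is shared among workers
# (copy-on-write on fork-based platforms), not duplicated per core.
import multiprocessing as mp

CONDITIONS = list(range(1_000_000))  # stand-in for geometry/conditions data

def process_events(worker_id: int) -> int:
    # Each worker reads the shared table; pages are not copied unless written.
    return sum(CONDITIONS[worker_id::1000])

if __name__ == "__main__":
    with mp.Pool(processes=4) as pool:
        results = pool.map(process_events, range(4))
    print(results)
```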
Usability of HL7 and SNOMED CT standards in Java Persistence API environment.
Antal, Gábor; Végh, Ádám Zoltán; Bilicki, Vilmos
2014-01-01
Due to the need for an efficient way of communication between the different stakeholders of healthcare (e.g., doctors, pharmacists, hospitals, patients), the possibility of integrating different healthcare systems arises. However, during the integration process several problems of heterogeneity may come up, which can turn integration into a difficult task. These problems motivated the development of healthcare information standards. The main goal of the HL7 family of standards is the standardization of communication between clinical systems and the unification of clinical document formats on the structural level. The SNOMED CT standard aims at the unification of healthcare terminology, thus providing a standard on the lexical level. The goal of this article is to introduce the usability of these two standards in a Java Persistence API (JPA) environment and to examine how standards-based system components can be efficiently generated. First, we briefly introduce the structure of the standards and their advantages and disadvantages. Then, we present an architecture design method which can help to eliminate the possible structural drawbacks of the standards and makes code-generating tools applicable for the automatic production of certain system components.
40 CFR 63.104 - Heat exchange system requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Standards for Organic Hazardous Air Pollutants From the Synthetic Organic Chemical Manufacturing Industry... subpart shall monitor each heat exchange system used to cool process equipment in a chemical manufacturing process unit meeting the conditions of § 63.100 (b)(1) through (b)(3) of this subpart, except for chemical...
40 CFR 63.104 - Heat exchange system requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Standards for Organic Hazardous Air Pollutants From the Synthetic Organic Chemical Manufacturing Industry... subpart shall monitor each heat exchange system used to cool process equipment in a chemical manufacturing process unit meeting the conditions of § 63.100 (b)(1) through (b)(3) of this subpart, except for chemical...
40 CFR 63.104 - Heat exchange system requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Standards for Organic Hazardous Air Pollutants From the Synthetic Organic Chemical Manufacturing Industry... subpart shall monitor each heat exchange system used to cool process equipment in a chemical manufacturing process unit meeting the conditions of § 63.100 (b)(1) through (b)(3) of this subpart, except for chemical...
The Impact of Flagging on the Admission Process.
ERIC Educational Resources Information Center
Cahalan-Laitusis, Cara; Mandinach, Ellen B.; Camara, Wayne J.
2003-01-01
Study explored issues surrounding flagging test scores taken under non-standard conditions and how the admission process could better serve students with disabilities. Respondents to survey felt current system was not adequately serving subgroups of students, believing some non-disabled students were manipulating the system to gain an advantage on…
Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José
2012-07-01
This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE11073, Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, simple object access protocol, extensible markup language, or business process execution language. Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.
Brown, Jesslyn; Howard, Daniel M.; Wylie, Bruce K.; Friesz, Aaron M.; Ji, Lei; Gacke, Carolyn
2015-01-01
Monitoring systems benefit from high temporal frequency image data collected from the Moderate Resolution Imaging Spectroradiometer (MODIS) system. Because of near-daily global coverage, MODIS data are beneficial to applications that require timely information about vegetation condition related to drought, flooding, or fire danger. Rapid satellite data streams in operational applications have clear benefits for monitoring vegetation, especially when information can be delivered as fast as changing surface conditions. An "expedited" processing system called "eMODIS" operated by the U.S. Geological Survey provides rapid MODIS surface reflectance data to operational applications in less than 24 h, offering tailored, consistently processed information products that complement standard MODIS products. We assessed eMODIS quality and consistency by comparing it to standard MODIS data. Only land data with known high quality were analyzed in a central U.S. study area. When compared to standard MODIS (MOD/MYD09Q1), the eMODIS Normalized Difference Vegetation Index (NDVI) maintained a strong, significant relationship to standard MODIS NDVI, whether from morning (Terra) or afternoon (Aqua) orbits. The Aqua eMODIS data were more prone to noise than the Terra data, likely due to differences in the internal cloud mask used in MOD/MYD09Q1 or compositing rules. Post-processing temporal smoothing decreased noise in eMODIS data.
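The NDVI compared above is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red). A short numpy sketch with synthetic reflectance values (not actual MODIS data):

```python
# Sketch: the standard NDVI formula applied to red and near-infrared bands.
import numpy as np

red = np.array([0.08, 0.10, 0.30])   # red surface reflectance
nir = np.array([0.40, 0.45, 0.32])   # near-infrared surface reflectance

ndvi = (nir - red) / (nir + red)
print(ndvi.round(2))  # dense vegetation ~0.6-0.7; bare soil near 0
```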
Pollock, James; Bolton, Glen; Coffman, Jon; Ho, Sa V; Bracewell, Daniel G; Farid, Suzanne S
2013-04-05
This paper presents an integrated experimental and modelling approach to evaluate the potential of semi-continuous chromatography for the capture of monoclonal antibodies (mAb) in clinical and commercial manufacture. Small-scale single-column experimental breakthrough studies were used to derive design equations for the semi-continuous affinity chromatography system. Verification runs with the semi-continuous 3-column and 4-column periodic counter current (PCC) chromatography system indicated the robustness of the design approach. The product quality profiles and step yields (after wash step optimisation) achieved were comparable to the standard batch process. The experimentally-derived design equations were incorporated into a decisional tool comprising dynamic simulation, process economics and sizing optimisation. The decisional tool was used to evaluate the economic and operational feasibility of whole mAb bioprocesses employing PCC affinity capture chromatography versus standard batch chromatography across a product's lifecycle from clinical to commercial manufacture. The tool predicted that PCC capture chromatography would offer more significant savings in direct costs for early-stage clinical manufacture (proof-of-concept) (∼30%) than for late-stage clinical (∼10-15%) or commercial (∼5%) manufacture. The evaluation also highlighted the potential facility fit issues that could arise with a capture resin (MabSelect) that experiences losses in binding capacity when operated in continuous mode over lengthy commercial campaigns. Consequently, the analysis explored the scenario of adopting the PCC system for clinical manufacture and switching to the standard batch process following product launch. The tool determined the PCC system design required to operate at commercial scale without facility fit issues and with similar costs to the standard batch process whilst pursuing a process change application. A retrofitting analysis established that the direct cost savings obtained by 8 proof-of-concept batches would be sufficient to pay back the investment cost of the pilot-scale semi-continuous chromatography system. Copyright © 2013 Elsevier B.V. All rights reserved.
A multiarchitecture parallel-processing development environment
NASA Technical Reports Server (NTRS)
Townsend, Scott; Blech, Richard; Cole, Gary
1993-01-01
A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.
Emergency healthcare process automation using mobile computing and cloud services.
Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G
2012-10-01
Emergency care is basically concerned with the provision of pre-hospital and in-hospital medical and/or paramedical services, and it typically involves a wide variety of interdependent and distributed activities that can be interconnected to form emergency care processes within and between Emergency Medical Service (EMS) agencies and hospitals. Hence, in developing an information system for emergency care processes, it is essential to support individual process activities and to satisfy collaboration and coordination needs by providing ready access to patient and operational information regardless of location and time. Filling this information gap by enabling the provision of the right information, to the right people, at the right time poses new challenges, including the specification of a common information format, interoperability among heterogeneous institutional information systems, and the development of new, ubiquitous trans-institutional systems. This paper is concerned with the development of integrated computer support for emergency care processes by evolving and cross-linking institutional healthcare systems. To this end, an integrated EMS cloud-based architecture has been developed that allows authorized users to access emergency case information in standardized document form, as proposed by the Integrating the Healthcare Enterprise (IHE) profile, uses the Organization for the Advancement of Structured Information Standards (OASIS) standard Emergency Data Exchange Language (EDXL) Hospital Availability Exchange (HAVE) for exchanging operational data with hospitals, and incorporates an intelligent module that supports triaging and selecting the most appropriate ambulances and hospitals for each case.
ERIC Educational Resources Information Center
Lang, Laura B.; Schoen, Robert R.; LaVenia, Mark; Oberlin, Maureen
2014-01-01
The Florida Center for Research in Science, Technology, Engineering and Mathematics (FCR-STEM) was awarded a grant by the Florida Department of Education to develop a Mathematics Formative Assessment System (MFAS) aligned with the Common Core State Standards (CCSS). Intended for both teachers and students, formative assessment is a process that…
40 CFR 265.1055 - Standards: Sampling connection systems.
Code of Federal Regulations, 2013 CFR
2013-07-01
... with the requirements of § 265.1060 of this subpart. (c) In-situ sampling systems and sampling systems... required in paragraph (a) of this section shall: (1) Return the purged process fluid directly to the...
40 CFR 265.1055 - Standards: Sampling connection systems.
Code of Federal Regulations, 2014 CFR
2014-07-01
... with the requirements of § 265.1060 of this subpart. (c) In-situ sampling systems and sampling systems... required in paragraph (a) of this section shall: (1) Return the purged process fluid directly to the...
40 CFR 265.1055 - Standards: Sampling connection systems.
Code of Federal Regulations, 2012 CFR
2012-07-01
... with the requirements of § 265.1060 of this subpart. (c) In-situ sampling systems and sampling systems... required in paragraph (a) of this section shall: (1) Return the purged process fluid directly to the...
Wiemuth, M; Junger, D; Leitritz, M A; Neumann, J; Neumuth, T; Burgert, O
2017-08-01
Medical processes can be modeled using different methods and notations. Currently used modeling systems like Business Process Model and Notation (BPMN) are not capable of describing the highly flexible and variable medical processes in sufficient detail. We combined two modeling approaches, Business Process Management (BPM) and Adaptive Case Management (ACM), to be able to model non-deterministic medical processes. We used the new standards Case Management Model and Notation (CMMN) and Decision Model and Notation (DMN). First, we explain how CMMN, DMN and BPMN can be used to model non-deterministic medical processes. We applied this methodology to model 79 cataract operations provided by University Hospital Leipzig, Germany, and four cataract operations provided by University Eye Hospital Tuebingen, Germany. Our model consists of 85 tasks and about 20 decisions in BPMN. We were able to expand the system to cover more complex situations that might appear during an intervention. Effective modeling of the cataract intervention is possible using the combination of BPM and ACM, which makes it possible to depict complex processes with complex decisions. This combination offers a significant advantage for modeling perioperative processes.
Kropf, Stefan; Chalopin, Claire; Lindner, Dirk; Denecke, Kerstin
2017-06-28
Access to patient data within the hospital or between hospitals is still problematic, since a variety of information systems are in use, applying different vendor-specific terminologies and underlying knowledge models. Moreover, the development of electronic health record systems (EHRSs) is time- and resource-consuming. Thus, there is a substantial need for a development strategy for standardized EHRSs. We are applying a reuse-oriented process model and demonstrate its feasibility and realization on a practical medical use case: an EHRS holding all relevant data arising in the context of treatment of tumors of the sella region. In this paper, we describe the development process and our practical experiences. Requirements for the development of the EHRS were collected through interviews with a neurosurgeon and patient data analysis. For modelling of patient data, we selected openEHR as the standard and exploited the software tools provided by the openEHR foundation. The patient information model forms the core of the development process, which comprises the EHR generation and the implementation of an EHRS architecture. Moreover, a reuse-oriented process model from the business domain was adapted to the development of the EHRS, providing a suitable abstraction of both the modeling and the development of an EHR-centered EHRS. The information modeling process resulted in 18 archetypes that were aggregated in a template and formed the starting point of the model-driven development. The EHRs and the EHRS were developed using openEHR and W3C standards, tightly supported by well-established XML techniques. The GUI of the final EHRS integrates and visualizes information from various examinations, medical reports, findings and laboratory test results. We conclude that the development of a standardized overarching EHR and an EHRS is feasible using openEHR and W3C standards, enabling a high degree of semantic interoperability. The standardized representation visualizes data and can in this way support the decision process of clinicians.
Evaluation of an attributive measurement system in the automotive industry
NASA Astrophysics Data System (ADS)
Simion, C.
2016-08-01
Measurement System Analysis (MSA) is a critical component of any quality improvement process. MSA is defined as an experimental and mathematical method of determining how much the variation within the measurement process contributes to overall process variability, and it falls into two categories: attribute and variable. The most problematic measurement system issues come from measuring attribute data, which are usually the result of human judgment (visual inspection). Because attributive measurement systems are often used in manufacturing processes, their assessment is important to obtain confidence in the inspection process, to see where the problems are in order to eliminate them, and to guide process improvement. It was the aim of this paper to address such an issue by presenting a case study made in a local company from the Sibiu region supplying products for the automotive industry, specifically the bag (a technical textile component, i.e. the fabric) for the airbag module. Because defects are inherent in every manufacturing process, and because in the field of airbag systems a minor defect can influence performance on which lives depend, stringent visual inspection of defects in the bag material is required. The purposes of this attribute MSA were: to determine whether all inspectors use the same criteria to distinguish “pass” from “fail” product (i.e. the fabric); to assess company inspection standards against the customer's requirements; to determine how well inspectors conform to themselves; to identify how inspectors conform to a “known master,” which includes how often operators ship defective product and how often operators dispose of acceptable product; and to discover areas where training is required, procedures must be developed, and standards are not available. The results were analyzed using MINITAB software with its Attribute Agreement Analysis module. The conclusion was that the inspection process must be improved by operator training, developing visual aids/boundary samples, and establishing standards and set-up procedures.
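The core statistics of an Attribute Agreement Analysis are easy to compute directly. A simplified sketch (inspectors, parts and ratings are invented; MINITAB additionally reports kappa statistics and confidence intervals, omitted here):

    standard = ["pass", "fail", "pass", "pass", "fail", "pass"]   # the known master
    ratings = {  # each inspector rates the same six parts in two trials
        "inspector_A": (["pass", "fail", "pass", "pass", "fail", "pass"],
                        ["pass", "fail", "pass", "fail", "fail", "pass"]),
        "inspector_B": (["pass", "fail", "fail", "pass", "pass", "pass"],
                        ["pass", "fail", "fail", "pass", "pass", "pass"]),
    }

    n = len(standard)
    for name, (t1, t2) in ratings.items():
        repeat = sum(a == b for a, b in zip(t1, t2)) / n                    # self-consistency
        vs_std = sum(a == b == s for a, b, s in zip(t1, t2, standard)) / n  # vs known master
        shipped = sum(s == "fail" and "pass" in (a, b) for a, b, s in zip(t1, t2, standard))
        scrapped = sum(s == "pass" and "fail" in (a, b) for a, b, s in zip(t1, t2, standard))
        print(f"{name}: repeatability={repeat:.0%}, vs_standard={vs_std:.0%}, "
              f"defects_passed={shipped}, good_parts_failed={scrapped}")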
Kreimeyer, Kory; Foster, Matthew; Pandey, Abhishek; Arya, Nina; Halford, Gwendolyn; Jones, Sandra F; Forshee, Richard; Walderhaug, Mark; Botsis, Taxiarchis
2017-09-01
We followed a systematic approach based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses to identify existing clinical natural language processing (NLP) systems that generate structured information from unstructured free text. Seven literature databases were searched with a query combining the concepts of natural language processing and structured data capture. Two reviewers screened all records for relevance during two screening phases, and information about clinical NLP systems was collected from the final set of papers. A total of 7149 records (after removing duplicates) were retrieved and screened, and 86 were determined to fit the review criteria. These papers contained information about 71 different clinical NLP systems, which were then analyzed. The NLP systems address a wide variety of important clinical and research tasks. Certain tasks are well addressed by the existing systems, while others remain as open challenges that only a small number of systems attempt, such as extraction of temporal information or normalization of concepts to standard terminologies. This review has identified many NLP systems capable of processing clinical free text and generating structured output, and the information collected and evaluated here will be important for prioritizing development of new approaches for clinical NLP. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
1972-01-01
The IDAPS (Image Data Processing System) is a user-oriented, computer-based, language and control system, which provides a framework or standard for implementing image data processing applications, simplifies set-up of image processing runs so that the system may be used without a working knowledge of computer programming or operation, streamlines operation of the image processing facility, and allows multiple applications to be run in sequence without operator interaction. The control system loads the operators, interprets the input, constructs the necessary parameters for each application, and calls the application. The overlay feature of the IBSYS loader (IBLDR) provides the means of running multiple operators which would otherwise overflow core storage.
New Directions in Space Operations Services in Support of Interplanetary Exploration
NASA Technical Reports Server (NTRS)
Bradford, Robert N.
2005-01-01
To gain access to the necessary operational processes and data in support of NASA's Lunar/Mars Exploration Initiative, new services, adequate levels of computing cycles, and access to myriad forms of data must be provided to onboard spacecraft and ground-based personnel/systems (earth, lunar and Martian) to enable interplanetary exploration by humans. These systems, cycles and access to vast amounts of development, test and operational data will be required to provide a level of service not currently available to existing spacecraft, onboard crews and other operational personnel. Although current voice, video and data systems in support of space-based operations have been adequate, new highly reliable and autonomous processes and services will be necessary for future space exploration activities. These services will range from the mundane, voice in LEO, to voice in interplanetary travel, which, because of the high latencies, will require new voice processes and standards. New services, like component failure predictions based on data mining of significant quantities of data located at disparate locations, will be required. 3D or holographic representation of onboard components, systems or family members will greatly improve maintenance, operations and service restoration, not to mention crew morale. Current operational systems and standards, like the Internet Protocol, will not be able to provide the level of service required end to end, from an end point on the Martian surface, such as a scientific instrument, to a researcher at a university. Ground operations, whether earth, lunar or Martian, and in-flight operations to the moon and especially to Mars will require significant autonomy, which in turn will require access to highly reliable processing capabilities and data storage based on network storage technologies. Significant processing cycles will be needed onboard but could be borrowed from other locations, either ground-based or onboard other spacecraft. Reliability will be a key factor, with onboard and distributed backup processing an absolutely necessary requirement. Current cluster-processing/Grid technologies may provide the basis for these services. An overview of existing services, future services that will be required, and the technologies and standards needing development will be presented. The purpose of this paper is to initiate a technological roadmap, albeit at a high level, from current voice, video, data and network technologies and standards (which show promise for adaptation or evolution) to those technologies and standards that need to be redefined or adjusted, and areas where new ones require development. The roadmap should begin the differentiation between non-manned and manned processes/services where applicable. The paper is based in part on the activities of the CCSDS Monitor and Control working group, which is beginning the process of standardizing these processes. Another element of the paper is based on an analysis of current technologies supporting spaceflight processes and services at JSC, MSFC, GSFC and, to a lesser extent, at KSC. Work being accomplished in areas such as Grid computing, data mining and network storage at ARC, IBM and the University of Alabama at Huntsville will be researched and analyzed.
On the Risk Management and Auditing of SOA Based Business Processes
NASA Astrophysics Data System (ADS)
Orriens, Bart; Heuvel, Willem-Jan V./D.; Papazoglou, Mike
SOA-enabled business processes stretch across many cooperating and coordinated systems, possibly crossing organizational boundaries, and technologies like XML and Web services have made system-to-system interactions commonplace. Business processes form the foundation of all organizations and, as such, are impacted by industry regulations. This requires organizations to review their business processes and ensure that they meet the compliance standards set forth in legislation. In this paper we sketch a SOA-based service risk management and auditing methodology, including a compliance enforcement and verification system, that assures verifiable business process compliance. This is done on the basis of a knowledge-based system that allows integration of internal control systems into business processes conforming to pre-defined compliance rules, monitors both the normal process behavior and that of the control systems during process execution, and logs these behaviors to facilitate retrospective auditing.
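The enforcement-and-audit idea comes down to evaluating declarative rules against logged process events. A schematic sketch (the event fields and the four-eyes rule below are invented for illustration and are not taken from the paper):

    # Business-process event log; in a SOA setting these events would arrive
    # as XML/Web-service messages, represented here as plain dictionaries.
    event_log = [
        {"process": "invoice-42", "activity": "approve_payment", "actor": "alice", "amount": 12000},
        {"process": "invoice-42", "activity": "execute_payment", "actor": "alice", "amount": 12000},
    ]

    def four_eyes_rule(log):
        # Compliance rule: the same payment must not be approved and executed by one actor
        approvals = {e["process"]: e["actor"] for e in log if e["activity"] == "approve_payment"}
        return [e for e in log
                if e["activity"] == "execute_payment" and approvals.get(e["process"]) == e["actor"]]

    # Usable both for runtime monitoring and for retrospective auditing of the log
    for violation in four_eyes_rule(event_log):
        print("compliance violation:", violation)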
EOS MLS Science Data Processing System: A Description of Architecture and Capabilities
NASA Technical Reports Server (NTRS)
Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.
2006-01-01
This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.
40 CFR 430.35 - New source performance standards (NSPS).
Code of Federal Regulations, 2011 CFR
2011-07-01
...-chemical (cross recovery) process and/or a combined unbleached kraft and semi-chemical process, wherein the spent semi-chemical cooking liquor is burned within the unbleached kraft chemical recovery system...
Library Information-Processing System
NASA Technical Reports Server (NTRS)
1985-01-01
System works with Library of Congress MARC II format. System composed of subsystems that provide wide range of library information-processing capabilities. Format is American National Standards Institute (ANSI) format for machine-readable bibliographic data. Adaptable to any medium-to-large library.
Chen, Elizabeth S.; Maloney, Francine L.; Shilmayster, Eugene; Goldberg, Howard S.
2009-01-01
A systematic and standard process for capturing information within free-text clinical documents could facilitate opportunities for improving quality and safety of patient care, enhancing decision support, and advancing data warehousing across an enterprise setting. At Partners HealthCare System, the Medical Language Processing (MLP) services project was initiated to establish a component-based architectural model and processes to facilitate putting MLP functionality into production for enterprise consumption, promote sharing of components, and encourage reuse. Key objectives included exploring the use of an open-source framework called the Unstructured Information Management Architecture (UIMA) and leveraging existing MLP-related efforts, terminology, and document standards. This paper describes early experiences in defining the infrastructure and standards for extracting, encoding, and structuring clinical observations from a variety of clinical documents to serve enterprise-wide needs. PMID:20351830
Analysis of the Integration of Skill Standards into Community College Curriculum
ERIC Educational Resources Information Center
Aragon, Steven R.; Woo, Hui-Jeong; Marvel, Matthew R.
2004-01-01
The utilization of skill standards in the curriculum development process has become an increasingly prominent aspect of the reform movement in career and technical education over the past 10 years. Standards are seen as a way to achieve better accountability within Career and Technical Education (CTE) systems, and improve their quality as well as…
A Distributed Processing Approach to Payroll Time Reporting for a Large School District.
ERIC Educational Resources Information Center
Freeman, Raoul J.
1983-01-01
Describes a system for payroll reporting from geographically disparate locations in which data is entered, edited, and verified locally on minicomputers and then uploaded to a central computer for the standard payroll process. Communications and hardware, time-reporting software, data input techniques, system implementation, and its advantages are…
ERIC Educational Resources Information Center
Landolfi, Adrienne M.
2016-01-01
As accountability measures continue to increase within education, public school systems have integrated standards-based evaluation systems to formally assess professional practices among educators. The purpose of this study was to explore the extent in which the communication process between evaluators and teachers impacts teacher performance…
Quality Space and Launch Requirements, Addendum to AS9100C
2015-05-08
8.9.1 Statistical Process Control (SPC). SMC: Space and Missile Systems Center; SME: Subject Matter Expert; SOW: Statement of Work; SPC: Statistical Process Control; SPO: System Program Office; SRP … occur without any individual data exceeding the control limits. Control limits are developed using standard statistical methods or other approved
NASA Astrophysics Data System (ADS)
Peckham, Scott
2016-04-01
Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders, time interpolators and unit converters) as necessary to mediate the differences between them so they can work together. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model or data set to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. Recent efforts to bring powerful uncertainty analysis and inverse modeling toolkits such as DAKOTA into modeling frameworks will also be described. This talk will conclude with an overview of several related modeling projects that have been funded by NSF's EarthCube initiative, namely the Earth System Bridge, OntoSoft and GeoSemantics projects.
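The interface pattern the talk describes can be made concrete with a toy model. A schematic Python rendering of a BMI-style component (the model physics is a placeholder, the standard name shown is merely illustrative, and the full BMI specification defines further functions for grids, units and time metadata):

    class ToyBmiModel:
        # Toy process model exposing a BMI-style interface (schematic, not the full spec)

        _output_names = ("land_surface__temperature",)  # CSDMS-style name (illustrative)

        # --- control functions: make the model fully controllable by a framework ---
        def initialize(self, config_file=None):
            self.time, self.dt = 0.0, 1.0
            self.temperature = 280.0

        def update(self):
            # Advance the model's state variables by one time step
            self.temperature += 0.1 * self.dt
            self.time += self.dt

        def finalize(self):
            pass

        # --- description functions: make the model self-describing to a caller ---
        def get_output_var_names(self):
            return self._output_names

        def get_var_units(self, name):
            return {"land_surface__temperature": "K"}[name]

        def get_value(self, name):
            return self.temperature  # toy model holds a single scalar state

    # A framework can now drive any such model generically, without knowing its internals:
    model = ToyBmiModel()
    model.initialize()
    for _ in range(10):
        model.update()
    print(model.get_output_var_names()[0], model.get_value("land_surface__temperature"))
    model.finalize()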
Human Spaceflight Safety for the Next Generation on Orbital Space Systems
NASA Technical Reports Server (NTRS)
Mango, Edward J.
2011-01-01
The National Aeronautics and Space Administration (NASA) Commercial Crew Program (CCP) has been chartered to facilitate the development of a United States (U.S.) commercial crew space transportation capability with the goal of achieving safe, reliable, and cost-effective access to and from low Earth orbit (LEO) and the International Space Station (ISS) as soon as possible. Once the capability is matured and is available to the Government and other customers, NASA expects to purchase commercial services to meet its ISS crew rotation and emergency return objectives. The primary role of the CCP is to enable and ensure safe human spaceflight processes for the next generation of Earth orbital space systems. The architecture of the Program delineates the process for investment performance in safe orbital systems, Crew Transportation System (CTS) certification, and CTS flight readiness. A series of six technical documents builds up the architecture to address the top-level CTS requirements and standards. They include Design Reference Missions, with the near-term focus on ISS crew services, Certification and Service Requirements, Technical Management Processes, and Technical and Operations Standards Evaluation Processes.
Leonard, Mandy C; Thyagarajan, Rema; Wilson, Amy J; Sekeres, Mikkael A
2018-04-01
Lessons learned from the creation of a multihospital health-system formulary management and pharmacy and therapeutics (P&T) committee are described. A health system can create and implement a multihospital system formulary and P&T committee to provide evidence-based medications for ideal healthcare. The formulary and P&T process should be multidisciplinary and include adequate representation from system hospitals. The aim of a system formulary and P&T committee is standardization; however, the system should allow flexibility for differences. Key points for a successful multihospital system formulary and P&T committee are patience, collaboration, resilience, and communication. When establishing a multihospital health-system formulary and P&T committee, the needs of individual hospitals are crucial. A designated member of the pharmacy department needs to centrally coordinate and manage formulary requests, medication reviews and monographs, meeting agendas and minutes, and a summary of decisions for implementation. It is imperative to create a timeline for formulary reviews to set expectations, as well as a process for formulary appeals. Collaboration across the various hospitals is critical for successful formulary standardization. When implementing a health-system P&T committee or standardizing a formulary system, it is important to be patient and give local sites time to make practice changes. Evidence-based data and rationale must be provided to all sites to support formulary changes. Finally, there must be multidisciplinary collaboration. There are several options for formulary structures and P&T committees in a health system. Potential strengths and barriers should be evaluated before selecting a formulary management process. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.
2013-09-01
Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real-time and extracting valuable information from them are challenging issues in these systems from a technical and scientific point of view. Ever-increasing population growth in urban areas has caused problems in developing countries which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in megacities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, together with Sensor Web technology, to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is examined to discover emergency situations, and if necessary air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure, yielding an interoperable air quality monitoring system that supports disaster management with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system can also retrieve SOS observations using WPS in a cascaded service-chaining pattern to monitor trends in timely sensor observations.
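The AQI calculation mentioned above is a piecewise-linear interpolation between pollutant breakpoints. A minimal sketch using the standard formula with US EPA 8-hour CO breakpoints; the Tehran system may use different national breakpoints and notification thresholds, so the tabulated values and the alert rule are illustrative:

    # (conc_low, conc_high, index_low, index_high) for CO in ppm (US EPA 8-hour table)
    CO_BREAKPOINTS = [
        (0.0, 4.4, 0, 50), (4.5, 9.4, 51, 100), (9.5, 12.4, 101, 150),
        (12.5, 15.4, 151, 200), (15.5, 30.4, 201, 300), (30.5, 40.4, 301, 400),
    ]

    def co_aqi(conc_ppm):
        for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
            if c_lo <= conc_ppm <= c_hi:
                return round((i_hi - i_lo) / (c_hi - c_lo) * (conc_ppm - c_lo) + i_lo)
        raise ValueError("concentration outside tabulated range")

    aqi = co_aqi(10.2)        # -> 113
    if aqi > 100:             # assumed threshold for notifying registered users
        print(f"send warning e-mail: CO AQI = {aqi}")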
Service Oriented Architecture for Wireless Sensor Networks in Agriculture
NASA Astrophysics Data System (ADS)
Sawant, S. A.; Adinarayana, J.; Durbha, S. S.; Tripathy, A. K.; Sudharsan, D.
2012-08-01
Rapid advances in Wireless Sensor Networks (WSNs) for agricultural applications have provided a platform for better decision making in crop planning and management, particularly in precision agriculture. Due to the ever-increasing spread of WSNs, there is a need for standards, i.e. a set of specifications and encodings, to bring multiple sensor networks onto a common platform. Distributed sensing systems, when brought together, can facilitate better decision making in the agricultural domain. The Open Geospatial Consortium (OGC), through Sensor Web Enablement (SWE), provides guidelines for semantic and syntactic standardization of sensor networks. In this work, two distributed sensing systems (Agrisens and FieldServer) were selected to implement the OGC SWE standards through a Service Oriented Architecture (SOA) approach. Online interoperable data processing was developed through SWE components such as the Sensor Model Language (SensorML) and the Sensor Observation Service (SOS). An integrated web client was developed to visualize the sensor observations and measurements, enabling the retrieval of crop water resource availability and requirements in a systematic manner for both sensing devices. Further, the client is also able to operate in an interoperable manner with any other OGC-standardized WSN system. The study of these WSN systems has shown that there is a need to augment the operations/processing capabilities of SOS in order to better understand the collected sensor data and to implement modelling services. Also, given the increasingly low-cost availability of WSN systems in the future, it will be possible to implement the OGC-standardized SWE framework for agricultural applications with open-source software tools.
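For concreteness, this is roughly what an interoperable observation request against such an SOS endpoint looks like. The parameter names follow the OGC SOS 2.0 key-value-pair binding; the host and the offering/property identifiers are placeholders, not the actual Agrisens or FieldServer URIs:

    from urllib.parse import urlencode

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "http://example.org/offering/agrisens",
        "observedProperty": "http://example.org/property/soil_moisture",
        "temporalFilter": "om:phenomenonTime,2012-06-01T00:00:00Z/2012-06-08T00:00:00Z",
    }
    url = "http://example.org/sos?" + urlencode(params)
    print(url)   # fetch with any HTTP client; the response is an O&M observation document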
Flexibility First, Then Standardize: A Strategy for Growing Inter-Departmental Systems.
á Torkilsheyggi, Arnvør
2015-01-01
Any attempt to use IT to standardize work practices faces the challenge of finding a balance between standardization and flexibility. In implementing electronic whiteboards with the goal of standardizing inter-departmental practices, a hospital in Denmark chose to follow the strategy of "flexibility first, then standardization." To improve the local grounding of the system, they first focused on flexibility by configuring the whiteboards to support intra-departmental practices. Subsequently, they focused on standardization by using the whiteboards to negotiate standardization of inter-departmental practices. This paper investigates the chosen strategy and finds: that super users on many wards managed to configure the whiteboard to support intra-departmental practices; that initiatives to standardize inter-departmental practices improved coordination of certain processes; and that the chosen strategy posed a challenge for finding the right time and manner to shift the balance from flexibility to standardization.
Defense Logistics Standard Systems Functional Requirements.
1987-03-01
Artificial Intelligence - the development of a machine capability to perform functions normally concerned with human intelligence, such as learning, adapting … Basic Data Base Machine Configurations … PART I: MODELS - DEFENSE LOGISTICS STANDARD SYSTEMS FUNCTIONAL REQUIREMENTS … On-line, Interactive Access. Integrating user input and machine output in a dynamic, real-time, give-and-take process is considered the optimum mode
First Workshop on Convergence and Consolidation towards Standard AAL Platform Services
NASA Astrophysics Data System (ADS)
Lázaro, Juan-Pablo; Guillén, Sergio; Farshchian, Babak; Mikalsen, Marius
The following document describes the call for papers for a workshop aimed at identifying the potential commonalities that are important for an AAL system, so that they can be discussed and proposed as the start of a standardization process. Component groups such as context management, user interaction management and the semantic description of services are frequent components and technologies in AAL systems.
Increasing the Automation and Autonomy for Spacecraft Operations with Criteria Action Table
NASA Technical Reports Server (NTRS)
Li, Zhen-Ping; Savki, Cetin
2005-01-01
The Criteria Action Table (CAT) is an automation tool developed for monitoring real-time system messages for specific events and processes in order to take user-defined actions based on a set of user-defined rules. CAT was developed by Lockheed Martin Space Operations as part of a larger NASA effort at the Goddard Space Flight Center (GSFC) to create a component-based, middleware-based, and standards-based general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. CAT has been integrated into the upgraded ground systems for the Tropical Rainfall Measuring Mission (TRMM) and Small Explorer (SMEX) satellites, where it plays the central role in the automation effort to reduce cost and increase reliability in spacecraft operations. The GMSEC architecture provides a standard communication interface and protocol for components to publish messages to, and subscribe to messages from, an information bus. It also provides a standard message definition, so components send and receive messages through the bus interface rather than directly to each other, thus reducing component-to-component coupling and simplifying interface, protocol, and link (socket) management. With the GMSEC architecture, components can publish standard event messages to the bus for all nominal, significant, and surprising events in regard to satellite, celestial, ground system, or any other activity. In addition to sending standard event messages, each GMSEC-compliant component is required to accept and process GMSEC directive request messages.
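The criteria/action pattern itself is simple to sketch: each rule pairs a predicate over an incoming event message with an action. In this illustration the message fields, rules and actions are invented and do not reproduce the actual GMSEC message schema:

    rules = [
        (lambda msg: msg["severity"] == "CRITICAL",
         lambda msg: print("page on-call engineer:", msg["text"])),
        (lambda msg: msg["subsystem"] == "PASS_AUTOMATION" and msg["event"] == "AOS",
         lambda msg: print("start pass script for", msg["spacecraft"])),
    ]

    def dispatch(msg):
        # Evaluate every rule against an event message received from the bus
        for criteria, action in rules:
            if criteria(msg):
                action(msg)

    dispatch({"severity": "CRITICAL", "subsystem": "EPS", "event": "LIMIT",
              "text": "battery temperature out of limits", "spacecraft": "TRMM"})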
Viirs Land Science Investigator-Led Processing System
NASA Astrophysics Data System (ADS)
Devadiga, S.; Mauoka, E.; Roman, M. O.; Wolfe, R. E.; Kalb, V.; Davidson, C. C.; Ye, G.
2015-12-01
The objective of NASA's Suomi National Polar-orbiting Partnership (S-NPP) Land Science Investigator-led Processing System (Land SIPS), housed at the NASA Goddard Space Flight Center (GSFC), is to produce high-quality land products from the Visible Infrared Imaging Radiometer Suite (VIIRS) to extend the Earth System Data Records (ESDRs) developed from NASA's heritage Earth Observing System (EOS) Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the EOS Terra and Aqua satellites. In this paper we present the functional description and capabilities of the S-NPP Land SIPS, including system development phases and production schedules, the timeline for processing, and the delivery of land science products based on coordination with the S-NPP Land science team members. The Land SIPS processing stream is expected to be operational by December 2016, generating land products using either the NASA science team delivered algorithms or the "best-of" science algorithms currently in operation at NASA's Land Product Evaluation and Algorithm Testing Element (PEATE). In addition to generating the standard land science products by processing NASA's VIIRS Level 0 data record, the Land SIPS processing system is also used to produce a suite of near-real-time products for NASA's application community. Land SIPS will also deliver the standard products, ancillary data sets, software and supporting documentation (ATBDs) to the assigned Distributed Active Archive Centers (DAACs) for archival and distribution. Quality assessment and validation will be an integral part of the Land SIPS processing system; the former is performed at the Land Data Operational Product Evaluation (LDOPE) facility, while the latter is conducted under the auspices of the CEOS Working Group on Calibration & Validation (WGCV) Land Product Validation (LPV) Subgroup, adopting the best practices and tools used to assess the quality of heritage EOS-MODIS products generated at the MODIS Adaptive Processing System (MODAPS).
Integration of image capture and processing: beyond single-chip digital camera
NASA Astrophysics Data System (ADS)
Lim, SukHwan; El Gamal, Abbas
2001-05-01
An important trend in the design of digital cameras is the integration of capture and processing onto a single CMOS chip. Although integrating the components of a digital camera system onto a single chip significantly reduces system size and power, it does not fully exploit the potential advantages of integration. We argue that a key advantage of integration is the ability to exploit the high-speed imaging capability of the CMOS image sensor to enable new applications, such as multiple capture for enhancing dynamic range, and to improve the performance of existing applications, such as optical flow estimation. Conventional digital cameras operate at low frame rates, and it would be too costly, if not infeasible, to operate their chips at high frame rates. Integration solves this problem. The idea is to capture images at much higher frame rates than the standard frame rate, process the high frame rate data on chip, and output the video sequence and the application-specific data at the standard frame rate. This idea is applied to optical flow estimation, where significant performance improvements are demonstrated over methods using standard frame rate sequences. We then investigate the constraints on memory size and processing power that can be integrated with a CMOS image sensor in a 0.18 micrometer process and below. We show that enough memory and processing power can be integrated not only to perform the functions of a conventional camera system but also to run applications such as real-time optical flow estimation.
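The multiple-capture idea can be illustrated offline. A NumPy sketch that fuses several short captures taken within one standard frame time, keeping for each pixel the longest unsaturated exposure (the bit depth, exposure ratios and fusion rule here are assumptions; the on-chip algorithm in the paper may differ):

    import numpy as np

    def combine_captures(frames, exposure_times, full_scale=1023, sat_margin=0.95):
        # For each pixel, use the longest exposure that is not saturated and
        # normalize by its exposure time to estimate scene radiance.
        frames = np.asarray(frames, dtype=float)        # shape (n_captures, H, W)
        radiance = frames[0] / exposure_times[0]        # shortest exposure as fallback
        for frame, t in zip(frames[1:], exposure_times[1:]):  # ascending exposure times
            unsaturated = frame < sat_margin * full_scale
            radiance = np.where(unsaturated, frame / t, radiance)
        return radiance

    # Three captures at 1x, 4x and 16x of a short base exposure, 10-bit sensor
    rng = np.random.default_rng(0)
    scene = rng.uniform(0, 4000, size=(4, 4))           # "true" radiance, exceeds 10-bit range
    times = [0.25, 1.0, 4.0]
    frames = [np.clip(scene * t, 0, 1023) for t in times]
    print(combine_captures(frames, times))              # recovers the scene values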
Engineering Lessons Learned and Systems Engineering Applications
NASA Technical Reports Server (NTRS)
Gill, Paul S.; Garcia, Danny; Vaughan, William W.
2005-01-01
Systems Engineering is fundamental to good engineering, which in turn depends on the integration and application of engineering lessons learned. Thus, good Systems Engineering also depends on systems engineering lessons learned from within the aerospace industry being documented and applied. About ten percent of the engineering lessons learned documented in the NASA Lessons Learned Information System are directly related to Systems Engineering. A key issue associated with lessons learned datasets is the communication and incorporation of this information into engineering processes. As part of the NASA Technical Standards Program activities, engineering lessons learned datasets have been identified from a number of sources. These are being searched and screened for those having a relation to Technical Standards. This paper will address some of these Systems Engineering Lessons Learned and how they are being related to Technical Standards within the NASA Technical Standards Program, including linking to the Agency's Interactive Engineering Discipline Training Courses and the life cycle for a flight vehicle development program.
40 CFR 63.983 - Closed vent systems.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 10 2010-07-01 2010-07-01 false Closed vent systems. 63.983 Section 63... Emission Standards for Closed Vent Systems, Control Devices, Recovery Devices and Routing to a Fuel Gas System or a Process § 63.983 Closed vent systems. (a) Closed vent system equipment and operating...
Apply creative thinking of decision support in electrical nursing record.
Hao, Angelica Te-Hui; Hsu, Chien-Yeh; Li-Fang, Huang; Jian, Wen-Shan; Wu, Li-Bin; Kao, Ching-Chiu; Lu, Mei-Show; Chang, Her-Kung
2006-01-01
The nursing process consists of five interrelated steps: assessment, diagnosis, planning, intervention, and evaluation. In the nursing process, the nurse collects a great deal of data and information. The amount of data and information may exceed the amount the nurse can process efficiently and correctly. Thus, the nurse needs assistance to become proficient in the planning of nursing care, due to the difficulty of simultaneously processing a large set of information. Computer systems are viewed as tools to expand the capabilities of the nurse's mind. Using computer technology to support clinicians' decision making may provide high-quality, patient-centered, and efficient healthcare. Although some existing nursing information systems aid in the nursing process, they only provide the most fundamental decision support--i.e., standard care plans associated with common nursing diagnoses. Such a computerized decision support system helps the nurse develop a care plan step by step, but it does not assist the nurse in the decision-making process: the decisions about how to generate nursing diagnoses from data and how to individualize care plans still remain with the nurse. The purpose of this study is to develop a pilot structure for an electronic nursing record system, integrated with international nursing standards, to improve the proficiency and accuracy of care planning in the clinical pathway process. The proposed pilot system assists not only student nurses and nurses who are novices in nursing practice, but also experts who need to work in a practice area with which they are not familiar.
Gathering Information from Transport Systems for Processing in Supply Chains
NASA Astrophysics Data System (ADS)
Kodym, Oldřich; Unucka, Jakub
2016-12-01
This paper deals with a complex system for processing information from means of transport acting as parts of a train (rail or road). It focuses on automated information gathering using AutoID technology, information transmission via Internet of Things networks, and the use of this information in the information systems of logistics firms to support selected processes at the MES and ERP levels. Different kinds of information gathered from the whole transport chain are discussed, and compliance with existing standards is addressed. Security of information over the full life cycle is an integral part of the presented system. A design for a fully equipped system based on synthesized functional nodes is presented.
Fox, W.E.; McCollum, D.W.; Mitchell, J.E.; Swanson, L.E.; Kreuter, U.P.; Tanaka, J.A.; Evans, G.R.; Theodore, Heintz H.; Breckenridge, R.P.; Geissler, P.H.
2009-01-01
Currently, there is no standard method to assess the complex systems in rangeland ecosystems. Decision makers need baselines to create a common language of current rangeland conditions and standards for continued rangeland assessment. The Sustainable Rangeland Roundtable (SRR), a group of private and public organizations and agencies, has created a forum to discuss rangeland sustainability and assessment. The SRR has worked to integrate social, economic, and ecological disciplines related to rangelands and has identified a standard set of indicators that can be used to assess rangeland sustainability. As part of this process, SRR has developed a two-tiered conceptual framework from a systems perspective to study the validity of indicators and the relationships among them. The first tier categorizes rangeland characteristics into four states. The second tier defines processes affecting these states through time and space. The framework clearly shows that the processes affect and are affected by each other. © 2009 Taylor & Francis Group, LLC.
Multi-core processing and scheduling performance in CMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hernandez, J. M.; Evans, D.; Foulkes, S.
2012-01-01
Commodity hardware is going many-core. We might soon not be able to satisfy the per-core job memory needs of the current single-core processing model in High Energy Physics. In addition, an ever-increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to utilize the multi-core architecture effectively. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model of computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs control over a larger quantum of resources, since multi-core-aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of data/workflow management (e.g. I/O caching and local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues, compared to the standard single-core processing workflows.
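The memory argument reduces to simple arithmetic. In the sketch below all sizes are invented; only the structure, counting shared read-only data once per job rather than once per core, reflects the abstract:

    # Illustrative memory-footprint comparison (all sizes assumed, in GB)
    shared = 1.2     # code libraries + detector geometry + conditions data (read-only)
    private = 0.8    # per-core event-processing working set
    cores = 8

    single_core_jobs = cores * (shared + private)   # N independent jobs, data duplicated
    multi_core_job = shared + cores * private       # one job, shared data mapped once
    print(f"{single_core_jobs:.1f} GB vs {multi_core_job:.1f} GB "
          f"({1 - multi_core_job / single_core_jobs:.0%} saved)")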
40 CFR 63.1089 - What records must I keep?
Code of Federal Regulations, 2011 CFR
2011-07-01
...) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Recordkeeping and Reporting Requirements for Heat Exchange Systems § 63.1089 What records must I...
40 CFR 63.1089 - What records must I keep?
Code of Federal Regulations, 2010 CFR
2010-07-01
...) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Recordkeeping and Reporting Requirements for Heat Exchange Systems § 63.1089 What records must I...
40 CFR 63.1089 - What records must I keep?
Code of Federal Regulations, 2013 CFR
2013-07-01
...) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Recordkeeping and Reporting Requirements for Heat Exchange Systems § 63.1089 What records must I...
40 CFR 63.1089 - What records must I keep?
Code of Federal Regulations, 2014 CFR
2014-07-01
...) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Recordkeeping and Reporting Requirements for Heat Exchange Systems § 63.1089 What records must I...
40 CFR 63.1089 - What records must I keep?
Code of Federal Regulations, 2012 CFR
2012-07-01
...) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Recordkeeping and Reporting Requirements for Heat Exchange Systems § 63.1089 What records must I...
Intelligent Information Systems.
ERIC Educational Resources Information Center
Zabezhailo, M. I.; Finn, V. K.
1996-01-01
An Intelligent Information System (IIS) uses data warehouse technology to facilitate the cycle of data and knowledge processing, including input, standardization, storage, representation, retrieval, calculation, and delivery. This article provides an overview of IIS products and artificial intelligence systems, illustrates examples of IIS…
Modernized Techniques for Dealing with Quality Data and Derived Products
NASA Astrophysics Data System (ADS)
Neiswender, C.; Miller, S. P.; Clark, D.
2008-12-01
“I just want a picture of the ocean floor in this area” is expressed all too often by researchers, educators, and students in the marine geosciences. As more sophisticated systems are developed to handle data collection and processing, the demand for quality data and standardized products continues to grow. Data management is an invisible bridge between science and researchers/educators. The SIOExplorer digital library presents more than 50 years of ocean-going research. Prior to publication, all data are checked for quality using standardized criteria developed for each data stream. Despite the evolution of data formats and processing systems, SIOExplorer continues to present derived products in well-established formats. Standardized products are published for each cruise and include a cruise report, MGD77 merged data, a multi-beam flipbook, and underway profiles. Creation of these products is made possible by processing scripts, which continue to change with ever-evolving data formats. We continue to explore the potential of database-enabled creation of standardized products, such as the metadata-rich MGD77 header file. Database-enabled, automated processing produces standards-compliant metadata for each data product and derived product. Metadata facilitate discovery and interpretation of published products. This descriptive information is stored both in an ASCII file and in a searchable digital library database. SIOExplorer's underlying technology allows focused search and retrieval of data and products. For example, users can initiate a search of only multi-beam data, which includes data-specific parameters. This customization is made possible by a synthesis of database, XML, and PHP technology. The combination of standardized products and digital library technology puts quality data and derived products in the hands of scientists. Interoperable systems enable distribution of these published resources using technologies such as web services. By developing modernized strategies for dealing with data, Scripps Institution of Oceanography is able to produce and distribute well-formed, quality-tested derived products, which aid research, understanding, and education.
NASA Technical Reports Server (NTRS)
Hall, David G.; Bridges, James
1992-01-01
A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.
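The narrowband analysis step that such a system performs can be sketched offline. A NumPy example of a Hann-windowed averaged periodogram (the sample rate, tone, FFT length and scaling are assumptions for illustration, not the LeRC system's actual processing chain):

    import numpy as np

    fs = 250_000                      # sample rate supporting a 100 kHz analysis bandwidth
    t = np.arange(fs) / fs            # one second of data from one microphone channel
    # Synthetic signal: a 40 kHz tone buried in broadband noise
    x = np.sin(2 * np.pi * 40_000 * t) + np.random.default_rng(1).normal(0, 1, fs)

    nfft = 4096                       # narrowband resolution: fs/nfft ~ 61 Hz per bin
    window = np.hanning(nfft)
    segments = x[: len(x) // nfft * nfft].reshape(-1, nfft) * window
    psd = (np.abs(np.fft.rfft(segments, axis=1)) ** 2).mean(axis=0)
    psd /= fs * (window ** 2).sum()   # scale averaged periodogram to a density
    freqs = np.fft.rfftfreq(nfft, d=1 / fs)
    print(freqs[np.argmax(psd)])      # ~40 kHz (nearest FFT bin)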
NASA Astrophysics Data System (ADS)
Topolskiy, D.; Topolskiy, N.; Solomin, E.; Topolskaya, I.
2016-04-01
In the present paper the authors discuss some ways of solving energy-saving problems in mechanical engineering. In the authors' opinion, one way of solving this problem is the integrated modernization of the power engineering objects of mechanical engineering companies, aimed at increasing the efficiency of energy supply control and improving the commercial accounting of electric energy. The authors propose the use of digital current and voltage transformers for these purposes. To check the compliance of this equipment with the IEC 61850 International Standard, we built a mathematical model of the data exchange process between measuring transformers and a universal SCADA system. The results of the modeling show that the discussed equipment meets the requirements of the Standard and that the use of a universal SCADA system for these purposes is preferable and economically reasonable. In the modeling the authors used the following software: MasterScada, Master OPC_DI_61850, and OPNET.
Leveraging Terminology Services for Extract-Transform-Load Processes: A User-Centered Approach
Peterson, Kevin J.; Jiang, Guoqian; Brue, Scott M.; Liu, Hongfang
2016-01-01
Terminology services serve an important role supporting clinical and research applications, and underpin a diverse set of processes and use cases. Through standardization efforts, terminology service-to-system interactions can leverage well-defined interfaces and predictable integration patterns. Often, however, users interact more directly with terminologies, and no such blueprints are available for describing terminology service-to-user interactions. In this work, we explore the main architecture principles necessary to build a user-centered terminology system, using an Extract-Transform-Load process as our primary usage scenario. To analyze our architecture, we present a prototype implementation based on the Common Terminology Services 2 (CTS2) standard using the Patient-Centered Network of Learning Health Systems (LHSNet) project as a concrete use case. We perform a preliminary evaluation of our prototype architecture using three architectural quality attributes: interoperability, adaptability and usability. We find that a design-time focus on user needs, cognitive models, and existing patterns is essential to maximize system utility. PMID:28269898
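To make the usage scenario concrete, the Transform step of an ETL pipeline can call a terminology service to normalize codes. A sketch assuming the requests library; the endpoint URL pattern and the JSON response shape are hypothetical stand-ins and do not reproduce the exact CTS2 REST binding used by the prototype:

    import requests

    TERMINOLOGY_SERVICE = "https://example.org/cts2"   # hypothetical endpoint

    def resolve_code(system, code):
        # Look up the preferred designation for a code (hypothetical URL/response shape)
        resp = requests.get(f"{TERMINOLOGY_SERVICE}/codesystem/{system}/entity/{code}",
                            params={"format": "json"}, timeout=10)
        resp.raise_for_status()
        return resp.json()["entity"]["designation"]

    def transform(row):
        # ETL Transform step: replace a raw code with its standardized display name
        row["diagnosis_name"] = resolve_code("ICD-10-CM", row["diagnosis_code"])
        return row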
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-30
... process. Key components of the antitheft device will include a passive immobilizer, a warning message... feature as standard equipment. When the system is activated, the alarm will trigger if one of the doors...
48 CFR 9903.202-3 - Amendments and revisions.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... 9903.202-3 Section 9903.202-3 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS CONTRACT COVERAGE CAS Program Requirements 9903.202-3 Amendments and revisions..., Disclosure Statements is discouraged except when extensive changes require it to assist the review process. ...
40 CFR 63.605 - Monitoring requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants From Phosphoric Acid Manufacturing Plants § 63.605... the mass flow of phosphorus-bearing feed material to the process. The monitoring system shall have an...
de Oliveira, Neurilene Batista; Peres, Heloisa Helena Ciqueto
2015-01-01
The aim was to evaluate the functional performance and the technical quality of the Electronic Documentation System of the Nursing Process of the Teaching Hospital of the University of São Paulo, in an exploratory-descriptive study. The evaluation applied the Quality Model of standard 25010 and the Evaluation Process defined under standard 25040, both of the International Organization for Standardization/International Electrotechnical Commission. The quality characteristics evaluated were: functional suitability, reliability, usability, performance efficiency, compatibility, security, maintainability and portability. The sample was made up of 37 evaluators. In the evaluation by the information technology specialists, only the characteristic of usability obtained a rate of positive responses of less than 70%. For the nurse lecturers, all the quality characteristics obtained a rate of positive responses of over 70%. The staff nurses of the medical and surgical clinics (with experience in using the system) and staff nurses from other units of the hospital and from other health institutions (without experience in using the system) gave rates of positive responses of more than 70% for functional suitability, usability, and security; however, performance efficiency, reliability and compatibility all obtained rates below the established parameter. Overall, the software achieved rates of positive responses of over 70% for the majority of the quality characteristics evaluated.
A Standard Kinematic Model for Flight Simulation at NASA Ames
NASA Technical Reports Server (NTRS)
Mcfarland, R. E.
1975-01-01
A standard kinematic model for aircraft simulation exists at NASA-Ames on a variety of computer systems, one of which is used to control the flight simulator for advanced aircraft (FSAA). The derivation of the kinematic model is given and various mathematical relationships are presented as a guide. These include descriptions of standardized simulation subsystems such as the atmospheric turbulence model and the generalized six-degrees-of-freedom trim routine, as well as an introduction to the emulative batch-processing system which enables this facility to optimize its real-time environment.
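For reference, the attitude portion of such a kinematic model reduces to the standard strapdown relations mapping body rates (p, q, r) to Euler-angle rates; the sketch below is the generic textbook form, not the FSAA code:

    import math

    def euler_rates(phi, theta, p, q, r):
        """Body rates (p, q, r) to Euler-angle rates; singular at theta = +/-90 deg."""
        phi_dot = p + (q * math.sin(phi) + r * math.cos(phi)) * math.tan(theta)
        theta_dot = q * math.cos(phi) - r * math.sin(phi)
        psi_dot = (q * math.sin(phi) + r * math.cos(phi)) / math.cos(theta)
        return phi_dot, theta_dot, psi_dot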
Comparative effectiveness of the SNaP™ Wound Care System.
Hutton, David W; Sheehan, Peter
2011-04-01
Diabetic lower extremity wounds cause substantial burden to healthcare systems, costing tens of thousands of dollars per episode. Negative pressure wound therapy (NPWT) devices have been shown to be cost-effective at treating these wounds, but the traditional devices use bulky electrical pumps that require a durable medical equipment rental-based procurement process. The Spiracur SNaP™ Wound Care System is an ultraportable NPWT system that does not use an electric pump and is fully disposable. It has superior healing compared to standard of care with modern dressings and comparable healing to traditional NPWT devices while giving patients greater mobility and giving clinicians a simpler procurement process. We used a mathematical model to analyse the costs of the SNaP™ system and compare them to standard of care and electrically powered NPWT devices. When compared to standard of care, the SNaP™ system saves over $9000 per wound treated and more than doubles the number of patients healed. The SNaP system has similar healing time to powered NPWT devices, but saves $2300 in Medicare payments or $2800 for private payers per wound treated. Our analysis shows that the SNaP™ system could save substantial treatment costs in addition to allowing patients greater freedom and mobility. © 2011 The Authors. © 2011 Blackwell Publishing Ltd and Medicalhelplines.com Inc.
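The comparison logic of such a model reduces to expected-cost arithmetic per wound; a minimal sketch with placeholder inputs (the device costs, durations, and healing probabilities below are illustrative assumptions, not the paper's parameters):

    def expected_cost(device_cost, weekly_cost, weeks, heal_rate, failure_cost):
        """Expected per-wound cost: therapy cost plus expected cost of non-healing."""
        return device_cost + weekly_cost * weeks + (1 - heal_rate) * failure_cost

    snap = expected_cost(device_cost=800, weekly_cost=150, weeks=12,
                         heal_rate=0.6, failure_cost=30000)   # placeholder values
    soc = expected_cost(device_cost=0, weekly_cost=100, weeks=20,
                        heal_rate=0.3, failure_cost=30000)    # placeholder values
    print(f"modelled saving per wound: ${soc - snap:,.0f}")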
Implementation of a low-cost, commercial orbit determination system
NASA Technical Reports Server (NTRS)
Corrigan, Jim
1994-01-01
This paper describes the implementation and potential applications of a workstation-based orbit determination system developed by Storm Integration, Inc. called the Precision Orbit Determination System (PODS). PODS is offered as a layered product to the commercially-available Satellite Tool Kit (STK) produced by Analytical Graphics, Inc. PODS also incorporates the Workstation/Precision Orbit Determination (WS/POD) product offered by Van Martin System, Inc. The STK graphical user interface is used to access and invoke the PODS capabilities and to display the results. WS/POD is used to compute a best-fit solution to user-supplied tracking data. PODS provides the capability to simultaneously estimate the orbits of up to 99 satellites based on a wide variety of observation types including angles, range, range rate, and Global Positioning System (GPS) data. PODS can also estimate ground facility locations, Earth geopotential model coefficients, solar pressure and atmospheric drag parameters, and observation data biases. All determined data is automatically incorporated into the STK data base, which allows storage, manipulation and export of the data to other applications. PODS is offered in three levels: Standard, Basic GPS and Extended GPS. Standard allows processing of non-GPS observation types for any number of vehicles and facilities. Basic GPS adds processing of GPS pseudo-ranging data to the Standard capabilities. Extended GPS adds the ability to process GPS carrier phase data.
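At its core, fitting orbits and biases to mixed tracking data is an iterated weighted least-squares problem. A generic sketch of one batch normal-equations correction (not PODS internals; H is the partials matrix of observations with respect to the estimated state):

    import numpy as np

    def wls_update(x, H, residuals, weights):
        """One batch weighted least-squares correction: dx = (H'WH)^-1 H'W r."""
        W = np.diag(weights)
        N = H.T @ W @ H                      # normal matrix
        dx = np.linalg.solve(N, H.T @ W @ residuals)
        return x + dx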
Mengel, M; Sis, B; Halloran, P F
2007-10-01
The Banff process defined the diagnostic histologic lesions for renal allograft rejection and created a standardized classification system where none had existed. By correcting this deficit, the process had universal impact on clinical practice and on clinical and basic research. All trials of new drugs since the early 1990s benefited, because the Banff classification of lesions permitted the end point of biopsy-proven rejection. The Banff process has strengths, weaknesses, opportunities and threats (SWOT). The strength is its self-organizing group structure for creating consensus. Consensus does not mean correctness: defining consensus is essential if a widely held view is to be proved wrong. The weaknesses of the Banff process are the absence of an independent external standard to test the classification, and its almost exclusive reliance on histopathology, which has inherent limitations in intra- and interobserver reproducibility; the interface between borderline and rejection, in particular, is exactly where clinicians demand precision. The opportunity lies in new technology such as transcriptomics, which can form an external standard and can be incorporated into a new classification combining the elegance of histopathology with the objectivity of transcriptomics. The threat is the degree to which the renal transplant community will participate in and support this process.
Lu, Tu-Lin; Li, Jin-Ci; Yu, Jiang-Yong; Cai, Bao-Chang; Mao, Chun-Qin; Yin, Fang-Zhou
2014-01-01
Traditional Chinese medicine (TCM) reference standards play an important role in the quality control of Chinese herbal pieces. This paper overviewed the development of TCM reference standards. By analyzing the 2010 edition of the Chinese Pharmacopoeia, the application of TCM reference standards in the quality control of Chinese herbal pieces was summarized, and the problems existing in the system were put forward. To improve the quality control of Chinese herbal pieces, various advanced methods and technologies should be used to research the characteristic reference standards of Chinese herbal pieces, and more reasonable reference standards should be introduced into the quality control system. This article discussed solutions concerning TCM reference standards, and the future development of quality control of Chinese herbal pieces is prospected.
40 CFR 63.1012 - Compressor standards.
Code of Federal Regulations, 2011 CFR
2011-07-01
... fluid system degassing reservoir that is routed to a process or fuel gas system or connected by a closed... sensor that will detect failure of the seal system, barrier fluid system, or both. Each sensor shall be... the seal system, the barrier fluid system, or both. If the sensor indicates failure of the seal system...
40 CFR 63.1012 - Compressor standards.
Code of Federal Regulations, 2010 CFR
2010-07-01
... fluid system degassing reservoir that is routed to a process or fuel gas system or connected by a closed... sensor that will detect failure of the seal system, barrier fluid system, or both. Each sensor shall be... the seal system, the barrier fluid system, or both. If the sensor indicates failure of the seal system...
40 CFR 63.1080 - What is the purpose of this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and... requirements for controlling emissions of hazardous air pollutants (HAP) from heat exchange systems and waste...
40 CFR 63.1080 - What is the purpose of this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and... requirements for controlling emissions of hazardous air pollutants (HAP) from heat exchange systems and waste...
40 CFR 63.1080 - What is the purpose of this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and... requirements for controlling emissions of hazardous air pollutants (HAP) from heat exchange systems and waste...
40 CFR 63.1080 - What is the purpose of this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and... requirements for controlling emissions of hazardous air pollutants (HAP) from heat exchange systems and waste...
40 CFR 63.1080 - What is the purpose of this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and... requirements for controlling emissions of hazardous air pollutants (HAP) from heat exchange systems and waste...
Code of Federal Regulations, 2012 CFR
2012-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.68 Processing. (a) Sterile system. All administration and transfer sets inserted into blood containers used for processing Source Plasma intended for manufacturing into injectable or noninjectable products and all interior surfaces of plasma containers used for...
Code of Federal Regulations, 2011 CFR
2011-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.68 Processing. (a) Sterile system. All administration and transfer sets inserted into blood containers used for processing Source Plasma intended for manufacturing into injectable or noninjectable products and all interior surfaces of plasma containers used for...
Code of Federal Regulations, 2013 CFR
2013-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.68 Processing. (a) Sterile system. All administration and transfer sets inserted into blood containers used for processing Source Plasma intended for manufacturing into injectable or noninjectable products and all interior surfaces of plasma containers used for...
Code of Federal Regulations, 2014 CFR
2014-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Source Plasma § 640.68 Processing. (a) Sterile system. All administration and transfer sets inserted into blood containers used for processing Source Plasma intended for manufacturing into injectable or noninjectable products and all interior surfaces of plasma containers used for...
Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards
NASA Technical Reports Server (NTRS)
Conroy, Mike; Gill, Paul; Hill, Bradley; Ibach, Brandon; Jones, Corey; Ungar, David; Barch, Jeffrey; Ingalls, John; Jacoby, Joseph; Manning, Josh;
2014-01-01
The Technical Data Interoperability (TDI) project investigates trending technical data standards for applicability to NASA vehicles, space stations, payloads, facilities, and equipment. TDI tested COTS software compatible with a particular suite of related industry standards, assessing both the capabilities of the individual tools and their interoperability. These standards not only enable Information Technology (IT) efficiencies, but also address efficient structures and standard content for business processes. We used source data from generic industry samples as well as NASA and European Space Agency (ESA) data from space systems.
An array processing system for lunar geochemical and geophysical data
NASA Technical Reports Server (NTRS)
Eliason, E. M.; Soderblom, L. A.
1977-01-01
A computerized array processing system has been developed to reduce, analyze, display, and correlate a large number of orbital and earth-based geochemical, geophysical, and geological measurements of the moon on a global scale. The system supports the activities of a consortium of about 30 lunar scientists involved in data synthesis studies. The system was modeled after standard digital image-processing techniques but differs in that processing is performed with floating point precision rather than integer precision. Because of flexibility in floating-point image processing, a series of techniques that are impossible or cumbersome in conventional integer processing were developed to perform optimum interpolation and smoothing of data. Recently color maps of about 25 lunar geophysical and geochemical variables have been generated.
40 CFR 63.445 - Standards for the bleaching system.
Code of Federal Regulations, 2010 CFR
2010-07-01
... process using secondary or non-wood fibers, that use chlorine dioxide. (b) The equipment at each bleaching... system. (a) Each bleaching system that does not use any chlorine or chlorinated compounds for bleaching... systems shall meet all the provisions of this section: (1) Bleaching systems that use chlorine; (2...
40 CFR 63.445 - Standards for the bleaching system.
Code of Federal Regulations, 2011 CFR
2011-07-01
... process using secondary or non-wood fibers, that use chlorine dioxide. (b) The equipment at each bleaching... system. (a) Each bleaching system that does not use any chlorine or chlorinated compounds for bleaching... systems shall meet all the provisions of this section: (1) Bleaching systems that use chlorine; (2...
Systems Management of Air Force Standard Communications-Computer systems: There is a Better Way
1988-04-01
upgrade or replacement of systems. AFR 700-6, Information Systems Operation Management, AFR 700-7, Information Processing Center Operations Management... and AFR 700-8, Telephone Systems Operation Management provide USAF guidance, policy and procedures governing this phase. 2. 800-Series Regulations
NASA Astrophysics Data System (ADS)
Yang, C.; Zheng, W.; Zhang, M.; Yuan, T.; Zhuang, G.; Pan, Y.
2016-06-01
Measurement and control of the plasma in real time are critical for advanced Tokamak operation. This requires high-speed real-time data acquisition and processing. ITER has designed the Fast Plant System Controllers (FPSC) for these purposes. At the J-TEXT Tokamak, a real-time data acquisition and processing framework has been designed and implemented using standard ITER FPSC technologies. The main hardware components of this framework are an Industrial Personal Computer (IPC) with a real-time system and FPGA-based FlexRIO devices. With FlexRIO devices, data can be processed by the FPGA in real time before being passed to the CPU. The software elements are based on a real-time framework which runs under Red Hat Enterprise Linux MRG-R and uses the Experimental Physics and Industrial Control System (EPICS) for monitoring and configuration, keeping the framework in line with ITER FPSC standard technology. With this framework, any kind of data acquisition and processing FlexRIO FPGA program can be configured with an FPSC. An application using the framework has been implemented for the polarimeter-interferometer diagnostic system on J-TEXT. The application extracts phase-shift information from the intermediate frequency signal produced by the polarimeter-interferometer diagnostic and calculates the plasma density profile in real time. Different algorithm implementations on the FlexRIO FPGA are compared in the paper.
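One common way to extract phase shift from an intermediate-frequency signal is quadrature demodulation; the sketch below illustrates the idea in NumPy (a generic approach with an illustrative moving-average low-pass, not necessarily the exact FPGA algorithm compared in the paper):

    import numpy as np

    def extract_phase(sig, f_if, fs):
        """Demodulate an IF signal and return its unwrapped phase in radians."""
        t = np.arange(len(sig)) / fs
        i = sig * np.cos(2 * np.pi * f_if * t)     # in-phase mixing
        q = -sig * np.sin(2 * np.pi * f_if * t)    # quadrature mixing
        k = max(int(fs / f_if), 1)                 # crude low-pass: one IF period
        kern = np.ones(k) / k
        i, q = np.convolve(i, kern, "same"), np.convolve(q, kern, "same")
        return np.unwrap(np.arctan2(q, i))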
Integrated control system for electron beam processes
NASA Astrophysics Data System (ADS)
Koleva, L.; Koleva, E.; Batchkova, I.; Mladenov, G.
2018-03-01
The ISO/IEC 62264 standard is widely used for integration of the business systems of a manufacturer with the corresponding manufacturing control systems based on hierarchical equipment models, functional data and manufacturing operations activity models. In order to achieve the integration of control systems, formal object communication models must be developed, together with manufacturing operations activity models, which coordinate the integration between different levels of control. In this article, the development of integrated control system for electron beam welding process is presented as part of a fully integrated control system of an electron beam plant, including also other additional processes: surface modification, electron beam evaporation, selective melting and electron beam diagnostics.
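The hierarchical equipment model the standard prescribes can be pictured as a simple tree of typed nodes (enterprise, site, area, work center, work unit). A minimal sketch with illustrative names, not the plant's actual model:

    from dataclasses import dataclass, field

    @dataclass
    class Equipment:
        """Node in an IEC/ISO 62264-style equipment hierarchy."""
        name: str
        level: str                     # e.g. "enterprise", "site", "area", "work unit"
        children: list = field(default_factory=list)

    plant = Equipment("EB-Plant", "site", [
        Equipment("WeldingArea", "area", [Equipment("EBW-Gun-1", "work unit")]),
        Equipment("EvaporationArea", "area"),
    ])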
NASA Technical Reports Server (NTRS)
Jester, Peggy L.; Hancock, David W., III
1999-01-01
This document provides the Data Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Facility (ISF) Software. This Plan addresses the identification, authority, and description of the interface nodes associated with the GLAS Standard Data Products and the GLAS Ancillary Data.
NASA Astrophysics Data System (ADS)
Liang, J.; Sédillot, S.; Traverson, B.
1997-09-01
This paper addresses the federation of a transactional object standard - the Object Management Group (OMG) object transaction service (OTS) - with the X/Open distributed transaction processing (DTP) model and the International Organization for Standardization (ISO) open systems interconnection (OSI) transaction processing (TP) communication protocol. The two-phase commit propagation rules within a distributed transaction tree are similar in the X/Open, ISO and OMG models. Building an OTS on an OSI TP protocol machine is possible because the two specifications are largely complementary: OTS defines a set of external interfaces without a specific internal protocol machine, while OSI TP specifies an internal protocol machine without any application programming interface. Given these observations, and having already implemented an X/Open two-phase commit transaction toolkit based on an OSI TP protocol machine, we analyse the feasibility of using this implementation as a transaction service provider for OMG interfaces. Based on the favourable result of this feasibility study, we are implementing an OTS-compliant system which, by exploiting the extensibility and openness strengths of OSI TP, is able to provide interoperability between the X/Open DTP and OMG OTS models.
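The two-phase commit propagation shared by these models looks, in skeletal form, like the following (a textbook sketch of the voting and completion phases, not the OTS or OSI TP protocol machines themselves):

    class Participant:
        """Resource manager in a distributed transaction tree."""
        def __init__(self, can_commit=True):
            self.can_commit = can_commit
            self.state = "active"
        def prepare(self):                 # phase 1: vote
            self.state = "prepared"
            return self.can_commit
        def commit(self):                  # phase 2: completion
            self.state = "committed"
        def rollback(self):
            self.state = "rolled_back"

    def two_phase_commit(participants):
        """Commit only if every participant votes yes in the voting phase."""
        votes = [p.prepare() for p in participants]   # ask everyone, no short-circuit
        decision = all(votes)
        for p in participants:
            p.commit() if decision else p.rollback()
        return decision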
Graham, Denise H
2004-11-01
The quality improvement plan relies on controlling quality of care through improving the process or system as a whole. Your ongoing data collection is paramount to the process of system-wide improvement and performance, enhancement of financial performance, operational performance and overall service performance and satisfaction. The threat of litigation and having to defend yourself from a claim of wrongdoing still looms every time your wheels turn. Your runsheet must serve and protect you. Look at the NFPA 1710 standard, which was enacted to serve and protect firefighters. This standard was enacted with their personal safety and well-being as the principle behind staffing requirements. At what stage of draft do you suppose the NFPA 1710 standard would be today if the relevant data had been collected sporadically or not tracked for each service-related death? It may have taken many more service-related deaths to effect change for a system-wide improvement in operational performance. Every call merits documentation and data collection. Your data are catalysts for change.
Assessment and certification of neonatal incubator sensors through an inferential neural network.
de Araújo, José Medeiros; de Menezes, José Maria Pires; Moura de Albuquerque, Alberto Alexandre; da Mota Almeida, Otacílio; Ugulino de Araújo, Fábio Meneghetti
2013-11-15
Measurement and diagnostic systems based on electronic sensors have been increasingly essential in the standardization of hospital equipment. The technical standard IEC (International Electrotechnical Commission) 60601-2-19 establishes requirements for neonatal incubators and specifies the calibration procedure and validation tests for such devices using sensors systems. This paper proposes a new procedure based on an inferential neural network to evaluate and calibrate a neonatal incubator. The proposal presents significant advantages over the standard calibration process, i.e., the number of sensors is drastically reduced, and it runs with the incubator under operation. Since the sensors used in the new calibration process are already installed in the commercial incubator, no additional hardware is necessary; and the calibration necessity can be diagnosed in real time without the presence of technical professionals in the neonatal intensive care unit (NICU). Experimental tests involving the aforementioned calibration system are carried out in a commercial incubator in order to validate the proposal.
Assessment and Certification of Neonatal Incubator Sensors through an Inferential Neural Network
de Araújo Júnior, José Medeiros; de Menezes Júnior, José Maria Pires; de Albuquerque, Alberto Alexandre Moura; Almeida, Otacílio da Mota; de Araújo, Fábio Meneghetti Ugulino
2013-01-01
Measurement and diagnostic systems based on electronic sensors have been increasingly essential in the standardization of hospital equipment. The technical standard IEC (International Electrotechnical Commission) 60601-2-19 establishes requirements for neonatal incubators and specifies the calibration procedure and validation tests for such devices using sensors systems. This paper proposes a new procedure based on an inferential neural network to evaluate and calibrate a neonatal incubator. The proposal presents significant advantages over the standard calibration process, i.e., the number of sensors is drastically reduced, and it runs with the incubator under operation. Since the sensors used in the new calibration process are already installed in the commercial incubator, no additional hardware is necessary; and the calibration necessity can be diagnosed in real time without the presence of technical professionals in the neonatal intensive care unit (NICU). Experimental tests involving the aforementioned calibration system are carried out in a commercial incubator in order to validate the proposal. PMID:24248278
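The inferential idea above can be sketched compactly: train a regression network to predict one sensor from the remaining installed sensors, then flag calibration drift when the measured and inferred values diverge. A minimal scikit-learn sketch on synthetic data (the tolerance and data are illustrative, not the paper's model or the IEC 60601-2-19 limits):

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    others = rng.normal(36.0, 0.3, size=(500, 4))          # synthetic sensor readings
    target = others.mean(axis=1) + rng.normal(0, 0.02, 500)

    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(others, target)

    def needs_calibration(reading_others, reading_target, tol=0.3):
        """Flag the monitored sensor when it deviates from the inferred value."""
        inferred = model.predict(reading_others.reshape(1, -1))[0]
        return abs(inferred - reading_target) > tol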
48 CFR 9.203 - QPL's, QML's, and QBL's.
Code of Federal Regulations, 2010 CFR
2010-10-01
...) Qualification and listing in a QPL, QML, or QBL is the process by which products are obtained from manufacturers... and Standardization Information System (ASSIST) at (http://assist.daps.dla.mil). (c) Instructions concerning qualification procedures are included in the following publications: (1) Federal Standardization...
48 CFR 352.239-71 - Standard for encryption language.
Code of Federal Regulations, 2011 CFR
2011-10-01
... language. 352.239-71 Section 352.239-71 Federal Acquisition Regulations System HEALTH AND HUMAN SERVICES... Information Processing Standard (FIPS) 140-2-compliant encryption (Security Requirements for Cryptographic Module, as amended) to protect all instances of HHS sensitive information during storage and transmission...
Innovating the Standard Procurement System Utilizing Intelligent Agent Technologies
1999-12-01
[Table-of-contents extraction residue; recoverable topics: the Standard Procurement System (overview, functions, advantages, disadvantages, summary), procurement process innovation results, and intelligent agent (IA) technology (overview, advantages, disadvantages). E-commerce mechanisms mentioned include the DoD Electronic Mall (EMALL), GSA Advantage, web invoicing, Electronic Funds Transfer (EFT), and the International Merchant Purchase Authorization Card (IMPAC).]
Risk Management Considerations for Interoperable Acquisition
2006-08-01
Electronics Engineers (IEEE) to harmonize the standards for software (IEEE 12207) and system (IEEE 15288) life-cycle processes. A goal of this harmonization... management (ISO/IEC 16085) is being generalized to apply to the systems level. The revised, generalized standard will add requirements and guidance for the... risk management. The documents include the following: • ISO/IEC Guide 73: Risk Management—Vocabulary—Guidelines for use in standards [ISO 02...
Quality assurance in military medical research and medical radiation accident management.
Hotz, Mark E; Meineke, Viktor
2012-08-01
The provision of quality radiation-related medical diagnostic and therapeutic treatments cannot occur without the presence of robust quality assurance and standardization programs. Medical laboratory services are essential in patient treatment and must be able to meet the needs of all patients and the clinical personnel responsible for the medical care of these patients. Clinical personnel involved in patient care must embody the quality assurance process in daily work to ensure program sustainability. In conformance with the German Federal Government's concept for modern departmental research, the international standard ISO 9001, one of the relevant standards of the International Organization for Standardization (ISO), is applied in quality assurance in military medical research. By its holistic approach, this internationally accepted standard provides an excellent basis for establishing a modern quality management system in line with international standards. Furthermore, this standard can serve as a sound basis for the further development of an already established quality management system when additional standards shall apply, as for instance in reference laboratories or medical laboratories. Besides quality assurance, a military medical facility must manage additional risk events in the context of early recognition/detection of health risks of military personnel on deployment in order to be able to take appropriate preventive and protective measures; for instance, with medical radiation accident management. The international standard ISO 31000:2009 can serve as a guideline for establishing risk management. Clear organizational structures and defined work processes are required when individual laboratory units seek accreditation according to specific laboratory standards. Furthermore, international efforts to develop health laboratory standards must be reinforced that support sustainable quality assurance, as in the exchange and comparison of test results within the scope of external quality assurance, but also in the exchange of special diagnosis data among international research networks. In summary, the acknowledged standard for a quality management system to ensure quality assurance is the very generic standard ISO 9001. Health Phys. 103(2):221-225; 2012.
An ecological method to understand agricultural standardization in peach orchard ecosystems
Wan, Nian-Feng; Zhang, Ming-Yi; Jiang, Jie-Xian; Ji, Xiang-Yun; Hao-Zhang
2016-01-01
While the worldwide standardization of agricultural production has been advocated and recommended, relatively little research has focused on the ecological significance of such a shift. The ecological concerns stemming from the standardization of agricultural production may require new methodology. In this study, we concentrated on how ecological two-sidedness and ecological processes affect the standardization of agricultural production, which was divided into three phases (pre-, mid- and post-production), considering both the positive and negative effects of agricultural processes. We constructed evaluation indicator systems for the pre-, mid- and post-production phases and here we presented a Standardization of Green Production Index (SGPI) based on the Full Permutation Polygon Synthetic Indicator (FPPSI) method which we used to assess the superiority of three methods of standardized production for peaches. The values of SGPI for pre-, mid- and post-production were 0.121 (Level IV, “Excellent” standard), 0.379 (Level III, “Good” standard), and 0.769 × 10(-2) (Level IV, “Excellent” standard), respectively. Here we aimed to explore the integrated application of ecological two-sidedness and ecological process in agricultural production. Our results are of use to decision-makers and ecologists focusing on eco-agriculture and those farmers who hope to implement standardized agricultural production practices. PMID:26899360
An ecological method to understand agricultural standardization in peach orchard ecosystems.
Wan, Nian-Feng; Zhang, Ming-Yi; Jiang, Jie-Xian; Ji, Xiang-Yun; Hao-Zhang
2016-02-22
While the worldwide standardization of agricultural production has been advocated and recommended, relatively little research has focused on the ecological significance of such a shift. The ecological concerns stemming from the standardization of agricultural production may require new methodology. In this study, we concentrated on how ecological two-sidedness and ecological processes affect the standardization of agricultural production, which was divided into three phases (pre-, mid- and post-production), considering both the positive and negative effects of agricultural processes. We constructed evaluation indicator systems for the pre-, mid- and post-production phases and here we presented a Standardization of Green Production Index (SGPI) based on the Full Permutation Polygon Synthetic Indicator (FPPSI) method which we used to assess the superiority of three methods of standardized production for peaches. The values of SGPI for pre-, mid- and post-production were 0.121 (Level IV, "Excellent" standard), 0.379 (Level III, "Good" standard), and 0.769 × 10(-2) (Level IV, "Excellent" standard), respectively. Here we aimed to explore the integrated application of ecological two-sidedness and ecological process in agricultural production. Our results are of use to decision-makers and ecologists focusing on eco-agriculture and those farmers who hope to implement standardized agricultural production practices.
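The polygon-style composite behind such an index can be sketched as follows: normalized indicators are placed on radial axes and the enclosed polygon area is compared with the regular unit polygon. This is a simplified stand-in for the FPPSI computation, not the paper's exact formula:

    import math

    def polygon_index(values):
        """Composite index: area of a radar polygon over indicators normalized
        to [0, 1], relative to the regular polygon with all indicators at 1."""
        n = len(values)
        ang = 2 * math.pi / n
        area = sum(0.5 * values[i] * values[(i + 1) % n] * math.sin(ang)
                   for i in range(n))
        full = 0.5 * n * math.sin(ang)        # area with every indicator at 1
        return area / full

    print(polygon_index([0.8, 0.6, 0.9, 0.7]))   # illustrative indicator values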
EOforge: Generic Open Framework for Earth Observation Data Processing Systems
2006-09-01
Allow the use of existing interfaces, i.e. MUIS: ESA multimission catalogue for EO products. • Support the latest EO system technologies, i.e. MASS... 5. Extensibility and configurability to allow customisation and the inclusion of new functionality. 6. Multi-instrument and multi-mission processing... such as: • MUIS: ESA multimission catalogue for EO products. • MASS (Multi-Application Support Service System): ESA web services technology standard
2008-12-01
A Systems Engineering Process Supporting the Development of Operational Requirements Driven Federations. Tolk, Litwin and Kewley. [Report-form header residue removed; the surviving excerpt concerns evaluating capabilities and their relative changes based on the system to be evaluated.]
Control of Complex Dynamic Systems by Neural Networks
NASA Technical Reports Server (NTRS)
Spall, James C.; Cristion, John A.
1993-01-01
This paper considers the use of neural networks (NN's) in controlling a nonlinear, stochastic system with unknown process equations. The NN is used to model the resulting unknown control law. The approach here is based on using the output error of the system to train the NN controller without the need to construct a separate model (NN or other type) for the unknown process dynamics. To implement such a direct adaptive control approach, it is required that connection weights in the NN be estimated while the system is being controlled. As a result of the feedback of the unknown process dynamics, however, it is not possible to determine the gradient of the loss function for use in standard (back-propagation-type) weight estimation algorithms. Therefore, this paper considers the use of a new stochastic approximation algorithm for this weight estimation, which is based on a 'simultaneous perturbation' gradient approximation that only requires the system output error. It is shown that this algorithm can greatly enhance the efficiency over more standard stochastic approximation algorithms based on finite-difference gradient approximations.
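The simultaneous-perturbation gradient approximation at the heart of this approach needs only two evaluations of the output-error loss per step, regardless of the number of weights. A minimal sketch (gain values are illustrative; in practice the gains decay over iterations):

    import numpy as np

    def spsa_step(w, loss, a=0.1, c=0.05):
        """One simultaneous-perturbation step: two loss evaluations yield a
        gradient estimate for the entire weight vector at once."""
        delta = np.random.choice([-1.0, 1.0], size=w.shape)   # Rademacher perturbation
        g_hat = (loss(w + c * delta) - loss(w - c * delta)) / (2 * c * delta)
        return w - a * g_hat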
Defining and reconstructing clinical processes based on IHE and BPMN 2.0.
Strasser, Melanie; Pfeifer, Franz; Helm, Emmanuel; Schuler, Andreas; Altmann, Josef
2011-01-01
This paper describes the current status and the results of our process management system for defining and reconstructing clinical care processes, which contributes to comparing, analyzing and evaluating clinical processes and, further, to identifying high-cost tasks or stays. The system is founded on IHE, which guarantees standardized interfaces and interoperability between clinical information systems. At the heart of the system is BPMN, a modeling notation and specification language which allows the definition and execution of clinical processes. The system provides functionality to define clinical core processes independent of the healthcare information system and to execute the processes in a workflow engine. Furthermore, the reconstruction of clinical processes is done by evaluating an IHE audit log database, which records patient movements within a healthcare facility. The main goal of the system is to assist hospital operators and clinical process managers in detecting discrepancies between defined and actual clinical processes, and in identifying the main causes of high medical costs. Beyond that, the system can potentially contribute to reconstructing and improving clinical processes and to enhancing cost control and patient care quality.
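The reconstruction step can be pictured as grouping audit-log events into per-patient traces ordered in time; a minimal sketch over a generic event shape (the field names are illustrative, not the IHE audit schema):

    from collections import defaultdict

    def reconstruct_traces(audit_events):
        """Group audit-log events by patient and order them in time,
        yielding one observed care-process trace per patient."""
        traces = defaultdict(list)
        for ev in sorted(audit_events, key=lambda e: e["timestamp"]):
            traces[ev["patient_id"]].append(ev["activity"])
        return dict(traces)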
Butt, Muhammad Arif; Akram, Muhammad
2016-01-01
We present a new intuitionistic fuzzy rule-based decision-making system, based on intuitionistic fuzzy sets, for the process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm inputs the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers the appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of all the processes in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
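The scheduling skeleton is easy to sketch. Below, a crisp weighted score stands in for the paper's intuitionistic fuzzy inference engine (the priority formula and ranges are illustrative assumptions, not the published rules):

    def dynamic_priority(nice, burst, max_nice=19.0, max_burst=100.0):
        """Crisp stand-in for the fuzzy inference: lower nice values and
        shorter bursts yield a higher dynamic priority in [0, 1]."""
        return 0.5 * (1 - nice / max_nice) + 0.5 * (1 - burst / max_burst)

    ready_queue = [{"pid": 1, "nice": 0, "burst": 80},
                   {"pid": 2, "nice": 10, "burst": 20}]
    ready_queue.sort(key=lambda p: dynamic_priority(p["nice"], p["burst"]),
                     reverse=True)
    next_process = ready_queue[0]   # dispatched to the CPU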
Convergence Toward Common Standards in Machine-Readable Cataloging *
Gull, C. D.
1969-01-01
The adoption of the MARC II format for the communication of bibliographic information by the three National Libraries of the U.S.A. makes it possible for those libraries to converge on the remaining necessary common standards for machine-readable cataloging. Three levels of standards are identified: fundamental, the character set; intermediate, MARC II; and detailed, the codes for identifying data elements. The convergence on these standards implies that the National Libraries can create and operate a Joint Bibliographic Data Bank requiring standard book numbers and universal serial numbers for identifying monographs and serials and that the system will thoroughly process contributed catalog entries before adding them to the Data Bank. There is reason to hope that the use of the MARC II format will facilitate catalogers' decision processes. PMID:5782261
Fulga, Netta
2013-06-01
Quality management and accreditation in the analytical laboratory setting are developing rapidly and becoming the standard worldwide. Quality management refers to all the activities used by organizations to ensure product or service consistency. Accreditation is a formal recognition by an authoritative regulatory body that a laboratory is competent to perform examinations and report results. The Motherisk Drug Testing Laboratory is licensed to operate at the Hospital for Sick Children in Toronto, Ontario. The laboratory performs toxicology tests of hair and meconium samples for research and clinical purposes. Most of the samples are involved in a chain of custody cases. Establishing a quality management system and achieving accreditation became mandatory by legislation for all Ontario clinical laboratories since 2003. The Ontario Laboratory Accreditation program is based on International Organization for Standardization 15189-Medical laboratories-Particular requirements for quality and competence, an international standard that has been adopted as a national standard in Canada. The implementation of a quality management system involves management commitment, planning and staff education, documentation of the system, validation of processes, and assessment against the requirements. The maintenance of a quality management system requires control and monitoring of the entire laboratory path of workflow. The process of transformation of a research/clinical laboratory into an accredited laboratory, and the benefits of maintaining an effective quality management system, are presented in this article.
Deda, H; Yakupoglu, H
2002-01-01
Science must have a common language. For centuries, the Latin language carried out this job, but the progress in computer technology and the internet world over the last 20 years began to produce a new language for the new century: the computer language. The information masses which need data language standardization are the following: digital libraries and medical education systems, consumer health informatics, World Wide Web applications, database systems, medical language processing, automatic indexing systems, image processing units, telemedicine, and the New Generation Internet (NGI).
Relevance of deterministic chaos theory to studies in functioning of dynamical systems
NASA Astrophysics Data System (ADS)
Glagolev, S. N.; Bukhonova, S. M.; Chikina, E. D.
2018-03-01
The paper considers the chaotic behavior of dynamical systems typical for social and economic processes. Approaches to the analysis and evaluation of system development processes are studied from the point of view of controllability and determinateness. Explanations are given for the necessity to apply non-standard mathematical tools to explain the states of dynamical social and economic systems on the basis of fractal theory. Features of fractal structures, such as non-regularity, self-similarity, dimensionality and fractionality, are considered.
Lindoerfer, Doris; Mansmann, Ulrich
2017-07-01
Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches are needed both in the selection of software and in the construction of proprietary systems. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards shows a broad consensus but differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step to create standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.
Ultrasonic imaging system for in-process fabric defect detection
Sheen, Shuh-Haw; Chien, Hual-Te; Lawrence, William P.; Raptis, Apostolos C.
1997-01-01
An ultrasonic method and system are provided for monitoring a fabric to identify a defect. A plurality of ultrasonic transmitters generate ultrasonic waves relative to the fabric. An ultrasonic receiver, responsive to the generated ultrasonic waves from the transmitters, receives ultrasonic waves coupled through the fabric and generates a signal. An integrated peak value of the generated signal is applied to a digital signal processor and is digitized. The digitized signal is processed to identify a defect in the fabric. The digitized signal processing includes a median value filtering step to filter out high frequency noise. Then the mean value and standard deviation of the median-filtered signal are calculated. The calculated mean value and standard deviation are compared with predetermined threshold values to identify a defect in the fabric.
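The processing chain described (median filtering, then mean/standard-deviation thresholding) can be sketched compactly in Python with SciPy; the kernel size and sigma multiple below are illustrative, not the patented values:

    import numpy as np
    from scipy.signal import medfilt

    def detect_defect(peaks, kernel=5, n_sigma=3.0):
        """Median-filter integrated peak values (odd kernel size), then flag
        samples outside mean +/- n_sigma * std as candidate fabric defects."""
        smooth = medfilt(np.asarray(peaks, dtype=float), kernel_size=kernel)
        mu, sigma = smooth.mean(), smooth.std()
        return np.abs(smooth - mu) > n_sigma * sigma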
Wiltz, Jennifer L; Blanck, Heidi M; Lee, Brian; Kocot, S Lawrence; Seeff, Laura; McGuire, Lisa C; Collins, Janet
2017-10-26
Electronic information technology standards facilitate high-quality, uniform collection of data for improved delivery and measurement of health care services. Electronic information standards also aid information exchange between secure systems that link health care and public health for better coordination of patient care and better-informed population health improvement activities. We developed international data standards for healthy weight that provide common definitions for electronic information technology. The standards capture healthy weight data on the "ABCDs" of a visit to a health care provider that addresses initial obesity prevention and care: assessment, behaviors, continuity, identify resources, and set goals. The process of creating healthy weight standards consisted of identifying needs and priorities, developing and harmonizing standards, testing the exchange of data messages, and demonstrating use-cases. Healthy weight products include 2 message standards, 5 use-cases, 31 LOINC (Logical Observation Identifiers Names and Codes) question codes, 7 healthy weight value sets, 15 public-private engagements with health information technology implementers, and 2 technical guides. A logic model and action steps outline activities toward better data capture, interoperable systems, and information use. Sharing experiences and leveraging this work in the context of broader priorities can inform the development of electronic information standards for similar core conditions and guide strategic activities in electronic systems.
Blanck, Heidi M.; Lee, Brian; Kocot, S. Lawrence; Seeff, Laura; McGuire, Lisa C.; Collins, Janet
2017-01-01
Electronic information technology standards facilitate high-quality, uniform collection of data for improved delivery and measurement of health care services. Electronic information standards also aid information exchange between secure systems that link health care and public health for better coordination of patient care and better-informed population health improvement activities. We developed international data standards for healthy weight that provide common definitions for electronic information technology. The standards capture healthy weight data on the “ABCDs” of a visit to a health care provider that addresses initial obesity prevention and care: assessment, behaviors, continuity, identify resources, and set goals. The process of creating healthy weight standards consisted of identifying needs and priorities, developing and harmonizing standards, testing the exchange of data messages, and demonstrating use-cases. Healthy weight products include 2 message standards, 5 use-cases, 31 LOINC (Logical Observation Identifiers Names and Codes) question codes, 7 healthy weight value sets, 15 public–private engagements with health information technology implementers, and 2 technical guides. A logic model and action steps outline activities toward better data capture, interoperable systems, and information use. Sharing experiences and leveraging this work in the context of broader priorities can inform the development of electronic information standards for similar core conditions and guide strategic activities in electronic systems. PMID:29072985
Sachs, Peter B; Hunt, Kelly; Mansoubi, Fabien; Borgstede, James
2017-02-01
Building and maintaining a comprehensive yet simple set of standardized protocols for cross-sectional imaging can be a daunting task. A single department may have difficulty preventing "protocol creep," which almost inevitably occurs when an organized "playbook" of protocols does not exist and individual radiologists and technologists alter protocols at will and on a case-by-case basis. When multiple departments or groups function in a large health system, the lack of uniformity of protocols can increase exponentially. In 2012, the University of Colorado Hospital formed a large health system (UCHealth) and became a 5-hospital provider network. CT and MR imaging studies are conducted at multiple locations by different radiology groups. To facilitate consistency in ordering, acquisition, and appearance of a given study, regardless of location, we minimized the number of protocols across all scanners and sites of practice with a clinical indication-driven protocol selection and standardization process. Here we review the steps utilized to perform this process improvement task and ensure its stability over time. Actions included creation of a standardized protocol template, which allowed for changes in electronic storage and management of protocols, designing a change request form, and formation of a governance structure. We utilized rapid improvement events (1 day for CT, 2 days for MR) and reduced 248 CT protocols to 97 standardized protocols and 168 MR protocols to 66. Additional steps are underway to further standardize output and reporting of imaging interpretation. This will result in an improved, consistent radiologist, patient, and provider experience across the system.
Information Processing in Memory Tasks.
ERIC Educational Resources Information Center
Johnston, William A.
The intensity of information processing engendered in different phases of standard memory tasks was examined in six experiments. Processing intensity was conceptualized as system capacity consumed, and was measured via a divided-attention procedure in which subjects performed a memory task and a simple reaction-time (RT) task concurrently. The…
Code of Federal Regulations, 2010 CFR
2010-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Plasma § 640.34 Processing. (a) Plasma. Plasma shall be... collecting, processing, and storage system unless the product is to be stored as Liquid Plasma. (b) Fresh Frozen Plasma. Fresh frozen plasma shall be prepared from blood collected by a single uninterrupted...
Code of Federal Regulations, 2012 CFR
2012-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Plasma § 640.34 Processing. (a) Plasma. Plasma shall be... collecting, processing, and storage system unless the product is to be stored as Liquid Plasma. (b) Fresh Frozen Plasma. Fresh frozen plasma shall be prepared from blood collected by a single uninterrupted...
Code of Federal Regulations, 2011 CFR
2011-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Plasma § 640.34 Processing. (a) Plasma. Plasma shall be... collecting, processing, and storage system unless the product is to be stored as Liquid Plasma. (b) Fresh Frozen Plasma. Fresh frozen plasma shall be prepared from blood collected by a single uninterrupted...
Code of Federal Regulations, 2013 CFR
2013-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Plasma § 640.34 Processing. (a) Plasma. Plasma shall be... collecting, processing, and storage system unless the product is to be stored as Liquid Plasma. (b) Fresh Frozen Plasma. Fresh frozen plasma shall be prepared from blood collected by a single uninterrupted...
Code of Federal Regulations, 2014 CFR
2014-04-01
... STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Plasma § 640.34 Processing. (a) Plasma. Plasma shall be... collecting, processing, and storage system unless the product is to be stored as Liquid Plasma. (b) Fresh Frozen Plasma. Fresh frozen plasma shall be prepared from blood collected by a single uninterrupted...
40 CFR 63.1082 - What definitions do I need to know?
Code of Federal Regulations, 2010 CFR
2010-07-01
...) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste... resulting from the quench and compression of cracked gas (the cracking furnace effluent) at an ethylene... within an ethylene production unit. Process wastewater is not organic wastes, process fluids, product...
NASA Astrophysics Data System (ADS)
Malyshev, Mikhail; Kreimer, Johannes
2013-09-01
Safety analyses for electrical, electronic and/or programmable electronic (E/E/PE) safety-related systems used in payload applications on board the International Space Station (ISS) are often based on failure modes, effects and criticality analysis (FMECA). For industrial applications of E/E/PE safety-related systems, comparable strategies exist and are defined in the IEC 61508 standard. This standard defines quantitative criteria based on potential failure modes (for example, the Safe Failure Fraction). These criteria can be calculated for an E/E/PE system or its components to assess their compliance with the requirements of a particular Safety Integrity Level (SIL). The standard defines several SILs depending on how much risk has to be mitigated by a safety-critical system. When a FMECA is available for an ISS payload or its subsystem, it may be possible to calculate the same or similar parameters as defined in the 61508 standard. One example of a payload that has a dedicated functional safety subsystem is the Electromagnetic Levitator (EML). This payload for the ISS is planned to be operated on board starting in 2014. The EML is a high-temperature materials processing facility. The dedicated subsystem "Hazard Control Electronics" (HCE) is implemented to provide the required failure tolerance, limiting sample processing parameters so that generation of potentially toxic by-products stays within safe limits, in line with the requirements applied to payloads by the ISS Program. The objective of this paper is to assess the implementation of the HCE in the EML against the criteria for functional safety systems in the IEC 61508 standard and to evaluate commonalities and differences with respect to the safety requirements levied on ISS payloads. An attempt is made to assess the possibility of using commercially available components and systems certified for compliance with industrial functional safety standards in ISS payloads.
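For orientation, the Safe Failure Fraction mentioned above is computed from FMECA-derived failure rates as the share of failures that are either safe or dangerous but detected; a small sketch with illustrative rates (not EML figures):

    def safe_failure_fraction(lam_safe, lam_dd, lam_du):
        """IEC 61508 Safe Failure Fraction: (safe + dangerous-detected)
        failure rates over the total failure rate (rates in failures/hour)."""
        return (lam_safe + lam_dd) / (lam_safe + lam_dd + lam_du)

    # Illustrative FMECA-derived rates, chosen for the example only:
    print(safe_failure_fraction(lam_safe=2e-6, lam_dd=1e-6, lam_du=2e-7))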
Implementation of a formulary management process.
Karel, Lauren I; Delisle, Dennis R; Anagnostis, Ellena A; Wordell, Cindy J
2017-08-15
The application of lean methodology in an initiative to redesign the formulary maintenance process at an academic medical center is described. Maintaining a hospital formulary requires clear communication and coordination among multiple members of the pharmacy department. Using principles of lean methodology, pharmacy department personnel within a multihospital health system launched a multifaceted initiative to optimize formulary management systemwide. The ongoing initiative began with creation of a formulary maintenance redesign committee consisting of pharmacy department personnel with expertise in informatics, automation, purchasing, drug information, and clinical pharmacy services. The committee met regularly and used lean methodology to design a standardized process for management of formulary additions and deletions and changes to medications' formulary status. Through value stream analysis, opportunities for process and performance improvement were identified; staff suggestions on process streamlining were gathered during a series of departmental kaizen events. A standardized template for development and dissemination of monographs associated with formulary additions and status changes was created. In addition, a shared Web-based checklist was developed to facilitate information sharing and timely initiation and completion of tasks involved in formulary status changes, and a permanent formulary maintenance committee was established to monitor and refine the formulary management process. A clearly defined, standardized process within the pharmacy department was developed for tracking necessary steps in enacting formulary changes to encourage safe and efficient workflow. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kwon, Seyong; Cho, Chang Hyun; Kwon, Youngmee; Lee, Eun Sook; Park, Je-Kyun
2017-04-01
Immunohistochemistry (IHC) plays an important role in biomarker-driven cancer therapy. Although there has been a high demand for standardized and quality assured IHC, it has rarely been achieved due to the complexity of IHC testing and the subjective validation-based process flow of IHC quality control. We present here a microfluidic immunostaining system for the standardization of IHC by creating a microfluidic linearly graded antibody (Ab)-staining device and a reference cell microarray. Unlike conventional efforts, our system deals primarily with the screening of biomarker staining conditions for quantitative quality assurance testing in IHC. We characterized the microfluidic matching of Ab staining intensity using three HER2 Abs produced by different manufacturers. The quality of HER2 Ab was also validated using tissues of breast cancer patients, demonstrating that our system is an efficient and powerful tool for the standardization and quality assurance of IHC.
NASA Astrophysics Data System (ADS)
Flores, Jorge L.; García-Torales, G.; Ponce Ávila, Cristina
2006-08-01
This paper describes an in situ image recognition system designed to inspect the quality standards of chocolate pops during their production. The essence of the recognition system is the localization of events (i.e., defects) in the input images that affect the quality standards of the pops. To this end, processing modules based on a correlation filter and on image segmentation are employed to measure the quality standards. We therefore designed the correlation filter and defined a set of features from the correlation plane. The desired values for these parameters are obtained by exploiting information about the objects to be rejected, in order to find the optimal discrimination capability of the system. Based on this set of features, a pop can be correctly classified. The efficacy of the system has been tested thoroughly under laboratory conditions using at least 50 images containing 3 different types of possible defects.
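The correlation-plane computation at the core of such a system can be sketched with an FFT-based matched filter; the feature extraction shown (peak location) is only the simplest of the correlation-plane features the paper refers to:

    import numpy as np

    def correlate(image, template):
        """FFT-based cross-correlation of an image with a matched filter
        (template); the peak location and sharpness feed the classifier."""
        F = np.fft.fft2(image)
        H = np.conj(np.fft.fft2(template, s=image.shape))   # matched filter
        plane = np.real(np.fft.ifft2(F * H))
        peak = np.unravel_index(np.argmax(plane), plane.shape)
        return plane, peak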
Solutions for acceleration measurement in vehicle crash tests
NASA Astrophysics Data System (ADS)
Dima, D. S.; Covaciu, D.
2017-10-01
Crash tests are useful for validating computer simulations of road traffic accidents. One of the most important parameters measured is the acceleration. The evolution of acceleration versus time during a crash test forms the crash pulse. The correctness of the crash pulse determination depends on the data acquisition system used. Recommendations regarding the instrumentation for impact tests are given in standards, which are focused on the use of accelerometers as impact sensors. The goal of this paper is to present the device and software developed by the authors for data acquisition and processing. The system includes two accelerometers with different input ranges, a processing unit based on a 32-bit microcontroller, and a data logging unit with an SD card. Data collected on the card as text files are processed with dedicated software running on personal computers. The processing is based on diagrams and includes the digital filters recommended in the standards.
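Standards such as SAE J211 specify channel frequency class (CFC) filters for crash pulses; a common software stand-in is a zero-phase Butterworth low-pass. The sketch below shows that idea with illustrative sampling and cutoff values; it is a generic approximation, not the authors' firmware.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_zero_phase(accel, fs, cutoff_hz, order=2):
    """Zero-phase Butterworth low-pass, a common software stand-in for the
    channel-frequency-class (CFC) filters recommended in impact-test standards."""
    b, a = butter(order, cutoff_hz / (fs / 2.0))  # normalized cutoff frequency
    return filtfilt(b, a, accel)                  # forward-backward: no phase lag

# Usage with illustrative numbers: 10 kHz sampling, CFC 60-like cutoff ~100 Hz.
fs = 10000.0
t = np.arange(0, 0.2, 1.0 / fs)
raw = np.exp(-((t - 0.05) ** 2) / 2e-5) * 400 + np.random.normal(0, 5, t.size)
pulse = lowpass_zero_phase(raw, fs, cutoff_hz=100.0)
```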
Managing Interoperability for GEOSS - A Report from the SIF
NASA Astrophysics Data System (ADS)
Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.
2009-04-01
The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.
Hume, Samuel; Chow, Anthony; Evans, Julie; Malfait, Frederik; Chason, Julie; Wold, J. Darcy; Kubick, Wayne; Becnel, Lauren B.
2018-01-01
The Clinical Data Interchange Standards Consortium (CDISC) is a global non-profit standards development organization that creates consensus-based standards for clinical and translational research. Several of these standards are now required by regulators for electronic submissions of regulated clinical trials' data and by government funding agencies. These standards are free and open, available for download on the CDISC Website as PDFs. While these documents are human readable, they are not amenable to ready use by electronic systems. CDISC launched the CDISC Shared Health And Research Electronic library (SHARE) to provide the standards metadata in machine-readable formats to facilitate the automated management and implementation of the standards. This paper describes how CDISC SHARE's standards can facilitate collecting, aggregating, and analyzing standardized data from early design to final analysis, and its role as a central resource providing information systems with metadata that drives process automation, including study setup and data pipelining. PMID:29888049
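A small sketch can show what machine-readable standards metadata enables: validating collected data against the standard automatically. The JSON structure, its content, and the dataset names below are hypothetical illustrations, not the actual SHARE schema or API.

```python
import json

# Hypothetical machine-readable export of a standard's variable metadata;
# the structure is illustrative only, not the actual SHARE format.
metadata = json.loads("""{
  "dataset": "VS",
  "variables": [
    {"name": "USUBJID", "type": "Char", "required": true},
    {"name": "VSTESTCD", "type": "Char", "required": true},
    {"name": "VSORRES", "type": "Char", "required": false}
  ]
}""")

def check_columns(columns, meta):
    """Flag required standard variables missing from a collected dataset."""
    required = {v["name"] for v in meta["variables"] if v["required"]}
    return sorted(required - set(columns))

print(check_columns(["USUBJID", "VSORRES"], metadata))  # ['VSTESTCD']
```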
A computational imaging target specific detectivity metric
NASA Astrophysics Data System (ADS)
Preece, Bradley L.; Nehmetallah, George
2017-05-01
Due to the large quantity of low-cost, high-speed computational processing available today, computational imaging (CI) systems are expected to have a major role in next-generation multifunctional cameras. The purpose of this work is to quantify the performance of these CI systems in a standardized manner. Due to the diversity of CI system designs available today or proposed for the near future, significant challenges arise in modeling and calculating a standardized detection signal-to-noise ratio (SNR) to measure the performance of these systems. In this paper, we develop a path forward for a standardized detectivity metric for CI systems. The detectivity metric is designed to evaluate the performance of a CI system searching for a specific known target or signal of interest, and is defined as the optimal linear matched filter SNR, similar to the Hotelling SNR, calculated in computational space with special considerations for standardization. The detectivity metric is therefore designed to be flexible, in order to handle various types of CI systems and specific targets, while keeping the complexity and assumptions about the systems to a minimum.
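The metric's core computation can be sketched directly from its definition: for a known target signature s and noise covariance Σ, the optimal linear matched-filter SNR is sqrt(sᵀ Σ⁻¹ s). A minimal numerical illustration, with made-up numbers:

```python
import numpy as np

def matched_filter_snr(target, noise_cov):
    """Optimal linear matched-filter (Hotelling-like) SNR for a known target
    signature s with noise covariance Sigma: SNR = sqrt(s^T Sigma^-1 s)."""
    s = np.asarray(target, dtype=float).ravel()
    # Solve Sigma x = s instead of forming an explicit inverse.
    x = np.linalg.solve(noise_cov, s)
    return float(np.sqrt(s @ x))

# Illustrative use: a 3-pixel signature with correlated noise.
s = np.array([1.0, 2.0, 1.0])
Sigma = np.array([[1.0, 0.3, 0.0],
                  [0.3, 1.0, 0.3],
                  [0.0, 0.3, 1.0]])
print(matched_filter_snr(s, Sigma))
```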
On a more rigorous gravity field processing for future LL-SST type gravity satellite missions
NASA Astrophysics Data System (ADS)
Daras, I.; Pail, R.; Murböck, M.
2013-12-01
In order to meet the growing demands of the user community concerning the accuracy of temporal gravity field models, future gravity missions of low-low satellite-to-satellite tracking (LL-SST) type are planned to carry more precise sensors than their predecessors. A breakthrough is planned with the improved LL-SST measurement link, where the traditional K-band microwave instrument of 1 μm accuracy will be complemented by an inter-satellite ranging instrument of several nm accuracy. This study focuses on investigations concerning the potential performance of the new sensors and their impact on gravity field solutions. The processing methods for gravity field recovery have to meet the new sensor standards and be able to take full advantage of the new accuracies that they provide. We use full-scale simulations in a realistic environment to investigate whether the standard processing techniques suffice to fully exploit the new sensor standards. We achieve that by performing full numerical closed-loop simulations based on the integral equation approach. In our simulation scheme, we simulate dynamic orbits in a conventional tracking analysis to compute pseudo inter-satellite ranges or range-rates that serve as observables. Each part of the processing is validated separately, with special emphasis on numerical errors and their impact on gravity field solutions. We demonstrate that processing with standard precision may be a limiting factor in taking full advantage of the new generation of sensors that future satellite missions will carry. We have therefore created versions of our simulator with enhanced processing precision, with the primary aim of minimizing round-off errors. Results using the enhanced precision show a large reduction of the system errors that were present in standard-precision processing, even for the error-free scenario, and reveal the improvements the new sensors will bring to the gravity field solutions. As a next step, we analyze the contribution of individual error sources to the system's error budget. More specifically, we analyze sensor noise from the laser interferometer and the accelerometers, errors in the kinematic orbits and the background fields, as well as temporal and spatial aliasing errors. We take special care in the assessment of error sources with stochastic behavior, such as the laser interferometer and the accelerometers, and in their consistent stochastic modeling within the adjustment process.
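One classic source of the round-off errors mentioned here is naive floating-point accumulation over very long series. A minimal, generic illustration of enhanced-precision accumulation is compensated (Kahan) summation; this demonstrates the principle only and is not the authors' simulator.

```python
def kahan_sum(values):
    """Compensated (Kahan) summation: carries a running error term so that
    round-off does not accumulate over very long sums."""
    total = 0.0
    comp = 0.0  # running compensation for lost low-order bits
    for v in values:
        y = v - comp
        t = total + y
        comp = (t - total) - y
        total = t
    return total

# Illustration: a million tiny increments next to a large value.
vals = [1.0] + [1e-16] * 1_000_000
print(sum(vals) - 1.0)        # 0.0: each tiny term is rounded away
print(kahan_sum(vals) - 1.0)  # ~1e-10: the contributions are retained
```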
Cruz, Márcio Freire; Cavalcante, Carlos Arthur Mattos Teixeira; Sá Barretto, Sérgio Torres
2018-05-30
Health Level Seven (HL7) is one of the standards most used to centralize data from different vital sign monitoring systems. This solution significantly limits the data available for historical analysis, because it typically relies on databases that are not effective at storing large volumes of data. In industry, a specific Big Data historian, known as a Process Information Management System (PIMS), solves this problem. This work proposes the same solution to overcome the restriction on storing vital sign data. To store data, the PIMS needs a compatible communication standard; the one most commonly used is OLE for Process Control (OPC). This paper presents an HL7-OPC server that enables communication between vital sign monitoring systems and a PIMS, thus allowing the storage of long historical series of vital signs. In addition, it reviews local and cloud-based Big Medical Data research, followed by an analysis of the PIMS in a health IT environment. It then describes the architecture of the HL7 and OPC standards. Finally, it presents the HL7-OPC server and a sequence of tests that proved its full operation and performance.
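The bridging idea can be sketched in a few lines: parse an HL7 v2 observation segment (OBX) and write its value into a tag space of the kind a process historian exposes. The dictionary standing in for the OPC server and the tag naming are illustrative assumptions; no specific OPC library API is implied.

```python
# Minimal sketch: map one HL7 v2 OBX segment to a process-historian tag write.
def parse_obx(segment: str):
    f = segment.split("|")
    code = f[3].split("^")[0]   # OBX-3: observation identifier (e.g., LOINC)
    value = float(f[5])         # OBX-5: observation value
    units = f[6]                # OBX-6: units
    return code, value, units

opc_tags = {}  # stand-in for an OPC tag space, not a real OPC client

def write_tag(code, value, units):
    opc_tags[f"VitalSigns/{code}"] = {"value": value, "units": units}

seg = "OBX|1|NM|8867-4^Heart rate^LN||72|/min|||||F"
write_tag(*parse_obx(seg))
print(opc_tags)  # {'VitalSigns/8867-4': {'value': 72.0, 'units': '/min'}}
```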
From IHE Audit Trails to XES Event Logs Facilitating Process Mining.
Paster, Ferdinand; Helm, Emmanuel
2015-01-01
Recently, business intelligence approaches like process mining have been applied to the healthcare domain. The goal of process mining is to gain knowledge about processes, their compliance, and room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The approach presented here shows how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails makes these methods applicable to all IHE-based information systems.
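The transformation can be sketched as grouping audit records into per-case traces and emitting XES elements (traces and events carrying concept:name and time:timestamp attributes). The simplified input records below are stand-ins for real ATNA audit messages.

```python
import xml.etree.ElementTree as ET

# Simplified stand-ins for ATNA audit-trail records.
audit = [
    {"case": "patient-001", "event": "Query", "time": "2015-01-01T10:00:00"},
    {"case": "patient-001", "event": "Retrieve", "time": "2015-01-01T10:02:00"},
]

log = ET.Element("log", {"xes.version": "1.0"})
traces = {}
for rec in audit:
    trace = traces.get(rec["case"])
    if trace is None:  # one XES trace per case (here: per patient)
        trace = ET.SubElement(log, "trace")
        ET.SubElement(trace, "string", {"key": "concept:name", "value": rec["case"]})
        traces[rec["case"]] = trace
    event = ET.SubElement(trace, "event")
    ET.SubElement(event, "string", {"key": "concept:name", "value": rec["event"]})
    ET.SubElement(event, "date", {"key": "time:timestamp", "value": rec["time"]})

print(ET.tostring(log, encoding="unicode"))
```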
Trends in Planetary Data Analysis. Executive summary of the Planetary Data Workshop
NASA Technical Reports Server (NTRS)
Evans, N.
1984-01-01
Planetary data include non-imaging remote sensing data, comprising spectrometric, radiometric, and polarimetric remote sensing observations, as well as in-situ data, radio/radar data, and Earth-based observations. The development of a planetary data system is also discussed. A catalog to identify observations will be the initial entry point for all levels of users into the data system. There are seven distinct data support services: encyclopedia, data index, data inventory, browse, search, sample, and acquire. Data systems for planetary science users must provide access to data and must process, store, and display data. Two standards will be incorporated into the planetary data system: a standard communications protocol and the Standard Formatted Data Unit. The data system configuration must combine the features of a distributed system with those of a centralized system. Fiscal constraints have made prioritization important. Activities include saving previous mission data, planning/cost analysis, and publishing of proceedings.
Decentralized or onsite wastewater treatment (OWT) systems have long been implicated in being a major source of N inputs to surface and ground waters and numerous regulatory bodies have promulgated strict total N (TN) effluent standards in N-sensitive areas. These standards, howe...
75 FR 35689 - System Personnel Training Reliability Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-23
... using realistic simulations.\\14\\ \\13\\ Id. P 1331. \\14\\ Reliability Standard PER-002-0. 9. In Order No... development process to: (1) Include formal training requirements for reliability coordinators similar to those... simulation technology such as a simulator, virtual technology, or other technology in their emergency...
NASA Astrophysics Data System (ADS)
Peckham, S. D.
2013-12-01
Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework service components as necessary to mediate the differences between the coupled models. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. To illustrate the power of standardized model interfaces and metadata, a smart, light-weight modeling framework written in Python will be introduced that can automatically (without user intervention) couple a set of BMI-enabled hydrologic process components together to create a spatial hydrologic model. The same mechanisms could also be used to provide seamless integration (import/export) of data and models.
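The flavor of such a standardized interface can be sketched with a toy component that exposes BMI-style control functions (initialize, update, finalize) and description functions. The linear-reservoir physics and the variable name below are illustrative only; the actual BMI specification defines a considerably larger set of functions.

```python
# Minimal sketch of a BMI-style component: control functions make the model
# fully callable by a framework; query functions make it self-describing.
class LinearReservoir:
    def initialize(self, config=None):
        self.storage = 1.0       # state variable (toy physics)
        self.k = 0.1             # outflow coefficient
        self.dt = 1.0            # time step
        self.time = 0.0

    def update(self):
        """Advance the state variables by one time step."""
        self.storage -= self.k * self.storage * self.dt
        self.time += self.dt

    def finalize(self):
        self.storage = None

    # Description functions: a framework queries these to mediate coupling.
    def get_output_var_names(self):
        return ("water_storage__volume",)  # CSDMS-style standard name

    def get_value(self, name):
        assert name == "water_storage__volume"
        return self.storage

model = LinearReservoir()
model.initialize()
for _ in range(10):
    model.update()
print(model.time, model.get_value("water_storage__volume"))
model.finalize()
```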
Availability of the OGC geoprocessing standard: March 2011 reality check
NASA Astrophysics Data System (ADS)
Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier
2012-10-01
This paper presents an investigation of the servers available in March 2011 that conform to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification supports standard Web-based geoprocessing. The data used in this research were collected using a focused crawler configured for finding OGC Web services. The research goals are (i) to provide a reality check of the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant for a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test whether the advances in the use of search engines and focused crawlers for finding Web services can be applied to finding geoscience processing systems. The results show the feasibility of the discovery approach and provide data about the implementation of the Web Processing Service specification. They also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
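The crawler's basic probe can be sketched as a key-value-pair GetCapabilities request per the WPS 1.0.0 specification, with availability judged by whether a capabilities document comes back. The endpoint URL below is a hypothetical placeholder.

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# A KVP GetCapabilities probe of the kind a focused crawler can issue.
base_url = "https://example.org/wps"  # hypothetical endpoint
params = {"service": "WPS", "request": "GetCapabilities", "AcceptVersions": "1.0.0"}

def probe_wps(url):
    """Return True if the endpoint answers with a WPS capabilities document."""
    try:
        with urlopen(f"{url}?{urlencode(params)}", timeout=10) as resp:
            body = resp.read(65536).decode("utf-8", errors="replace")
        return "Capabilities" in body
    except OSError:
        return False  # unreachable servers count against availability

# print(probe_wps(base_url))
```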
RICIS Symposium 1992: Mission and Safety Critical Systems Research and Applications
NASA Technical Reports Server (NTRS)
1992-01-01
This conference deals with computer systems which control systems whose failure to operate correctly could produce the loss of life and or property, mission and safety critical systems. Topics covered are: the work of standards groups, computer systems design and architecture, software reliability, process control systems, knowledge based expert systems, and computer and telecommunication protocols.
A low-cost vector processor boosting compute-intensive image processing operations
NASA Technical Reports Server (NTRS)
Adorf, Hans-Martin
1992-01-01
Low-cost vector processing (VP) is within reach of everyone seriously engaged in scientific computing. The advent of affordable add-on VP boards for standard workstations, complemented by mathematical/statistical libraries, is beginning to impact compute-intensive tasks such as image processing. A case in point is the restoration of distorted images from the Hubble Space Telescope. A low-cost implementation of the standard Tarasko-Richardson-Lucy restoration algorithm on an Intel i860-based VP board, seamlessly interfaced to a commercial interactive image processing system, is presented. First experience is reported (including some benchmarks for standalone FFTs) and some conclusions are drawn.
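For reference, the Richardson-Lucy iteration itself is compact; a plain NumPy/SciPy version (rather than a vector-processor implementation) looks roughly like this, assuming a known point-spread function:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iterations=20):
    """Standard Richardson-Lucy iteration: u <- u * (K^T * (d / (K * u))),
    where K is convolution with the PSF and K^T uses the flipped PSF."""
    d = np.asarray(observed, dtype=float)
    u = np.full_like(d, d.mean())   # flat, non-negative starting estimate
    psf_flipped = psf[::-1, ::-1]   # adjoint of the blur operator
    for _ in range(iterations):
        estimate = fftconvolve(u, psf, mode="same")
        ratio = d / np.maximum(estimate, 1e-12)  # guard against division by zero
        u *= fftconvolve(ratio, psf_flipped, mode="same")
    return u
```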
ERIC Educational Resources Information Center
Pawlowski, Jan M.
2007-01-01
In 2005, the new quality standard for learning, education, and training, ISO/IEC 19796-1, was published. Its purpose is to help educational organizations to develop quality systems and to improve the quality of their processes, products, and services. In this article, the standard is presented and compared to existing approaches, showing the…
Adding Bite to the Bark: Using LibGuides2 Migration as Impetus to Introduce Strong Content Standards
ERIC Educational Resources Information Center
Fritch, Melia; Pitts, Joelle E.
2016-01-01
The authors discuss the long-term accumulation of unstandardized and inaccessible content within the Libguides system and the decision-making process to create and implement a set of standards using the migration to the LibGuides2 platform as a vehicle for change. Included in the discussion are strategies for the creation of standards and…
[Establishment of database with standard 3D tooth crowns based on 3DS MAX].
Cheng, Xiaosheng; An, Tao; Liao, Wenhe; Dai, Ning; Yu, Qing; Lu, Peijun
2009-08-01
The database with standard 3D tooth crowns has laid the groundwork for dental CAD/CAM system. In this paper, we design the standard tooth crowns in 3DS MAX 9.0 and create a database with these models successfully. Firstly, some key lines are collected from standard tooth pictures. Then we use 3DS MAX 9.0 to design the digital tooth model based on these lines. During the design process, it is important to refer to the standard plaster tooth model. After some tests, the standard tooth models designed with this method are accurate and adaptable; furthermore, it is very easy to perform some operations on the models such as deforming and translating. This method provides a new idea to build the database with standard 3D tooth crowns and a basis for dental CAD/CAM system.
Standardization of XML Database Exchanges and the James Webb Space Telescope Experience
NASA Technical Reports Server (NTRS)
Gal-Edd, Jonathan; Detter, Ryan; Jones, Ron; Fatig, Curtis C.
2007-01-01
Personnel from the National Aeronautics and Space Administration (NASA) James Webb Space Telescope (JWST) Project have been working with various standards communities, such as the Object Management Group (OMG) and the Consultative Committee for Space Data Systems (CCSDS), to assist in the definition of a common Extensible Markup Language (XML) database exchange format. The CCSDS and OMG standards are intended for the exchange of core command and telemetry information, not for all database information needed to exercise a NASA space mission. The mission-specific database, containing all the information needed for a space mission, is translated from/to the standard using a translator. The standard is meant to provide a system that encompasses 90% of the information needed for command and telemetry processing. This paper discusses standardization of the XML database exchange format, the tools used, and the JWST experience, as well as future work with both commercial and government XML standards groups.
Aeronautical Mobile Airport Communications System (AeroMACS)
NASA Technical Reports Server (NTRS)
Budinger, James M.; Hall, Edward
2011-01-01
To help increase the capacity and efficiency of the nation's airports, a secure wideband wireless communications system is proposed for use on the airport surface. This paper provides an overview of the research and development process for the Aeronautical Mobile Airport Communications System (AeroMACS). AeroMACS is based on a specific commercial profile of the Institute of Electrical and Electronics Engineers (IEEE) 802.16 standard known as Worldwide Interoperability for Microwave Access or WiMAX (WiMAX Forum). The paper includes background on the need for global interoperability in air/ground data communications, describes potential AeroMACS applications, addresses allocated frequency spectrum constraints, summarizes the international standardization process, and provides findings and recommendations from the world's first AeroMACS prototype implemented in Cleveland, Ohio, USA.
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC.
These guidelines provide a handbook for use by federal organizations in structuring physical security and risk management programs for their automatic data processing facilities. This publication discusses security analysis, natural disasters, supporting utilities, system reliability, procedural measures and controls, off-site facilities,…
A Study of Alternative Computer Architectures for System Reliability and Software Simplification.
1981-04-22
…compression. Several known applications of neighborhood processing, such as noise removal and boundary smoothing, are shown to be special cases of… A small effort was undertaken to implement image array processing at a very low cost. To this end, a standard Qwip Facsimile…
A methodology for Manufacturing Execution Systems (MES) implementation
NASA Astrophysics Data System (ADS)
Govindaraju, Rajesri; Putra, Krisna
2016-02-01
A manufacturing execution system (MES) is an information systems (IS) application that bridges the gap between IS at the top level, namely enterprise resource planning (ERP), and IS at the lower levels, namely the automation systems. An MES provides a medium for optimizing the manufacturing process as a whole on a real-time basis. By using an MES in combination with ERP and other automation systems, a manufacturing company is expected to achieve high competitiveness. In implementing an MES, functional integration (making all the components of the manufacturing system work well together) is the most difficult challenge. For this, there is an industry standard that specifies the sub-systems of a manufacturing execution system and defines the boundaries between ERP systems, MES, and other automation systems. The standard is known as ISA-95. Although the advantages of using an MES have been stated in some studies, not much research has been done on how to implement an MES effectively. The purpose of this study is to develop a methodology describing how an MES implementation project should be managed, utilizing the support of the ISA-95 reference model in the system development process. A proposed methodology was developed based on a general IS development methodology and then revisited based on an understanding of the specific characteristics of MES implementation projects found in an implementation case at an Indonesian steel manufacturing company. The case study highlighted the importance of applying an effective requirements elicitation method during the initial system assessment, managing system interfaces and labor division in the design process, and performing a pilot deployment before putting the whole system into operation.
40 CFR 63.1083 - Does this subpart apply to my heat exchange system?
Code of Federal Regulations, 2011 CFR
2011-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Applicability for Heat Exchange Systems § 63.1083 Does this subpart apply to my heat exchange system? The provisions of this subpart apply to your heat exchange system if you own...
NASA Astrophysics Data System (ADS)
Friedman, Gary; Schwuttke, Ursula M.; Burliegh, Scott; Chow, Sanguan; Parlier, Randy; Lee, Lorrine; Castro, Henry; Gersbach, Jim
1993-03-01
In the early days of JPL's solar system exploration, each spacecraft mission required its own dedicated data system with all software applications written in the mainframe's native assembly language. Although these early telemetry processing systems were a triumph of engineering in their day, since that time the computer industry has advanced to the point where it is now advantageous to replace these systems with more modern technology. The Space Flight Operations Center (SFOC) Prototype group was established in 1985 as a workstation and software laboratory. The charter of the lab was to determine if it was possible to construct a multimission telemetry processing system using commercial, off-the-shelf computers that communicated via networks. The staff of the lab mirrored that of a typical skunk works operation -- a small, multi-disciplinary team with a great deal of autonomy that could get complex tasks done quickly. In an effort to determine which approaches would be useful, the prototype group experimented with all types of operating systems, inter-process communication mechanisms, network protocols, packet size parameters. Out of that pioneering work came the confidence that a multi-mission telemetry processing system could be built using high-level languages running in a heterogeneous, networked workstation environment. Experience revealed that the operating systems on all nodes should be similar (i.e., all VMS or all PC-DOS or all UNIX), and that a unique Data Transport Subsystem tool needed to be built to address the incompatibilities of network standards, byte ordering, and socket buffering. The advantages of building a telemetry processing system based on emerging industry standards were numerous: by employing these standards, we would no longer be locked into a single vendor. When new technology came to market which offered ten times the performance at one eighth the cost, it would be possible to attach the new machine to the network, re-compile the application code, and run. In addition, we would no longer be plagued with lack of manufacturer support when we encountered obscure bugs. And maybe, hopefully, the eternal elusive goal of software portability across different vendors' platforms would finally be available. Some highlights of our prototyping efforts are described.
NASA Astrophysics Data System (ADS)
Kepner, J. V.; Janka, R. S.; Lebak, J.; Richards, M. A.
1999-12-01
The Vector/Signal/Image Processing Library (VSIPL) is a DARPA-initiated effort made up of industry, government, and academic representatives who have defined an industry-standard API for vector, signal, and image processing primitives for real-time signal processing on high-performance systems. VSIPL supports a wide range of data types (int, float, complex, ...) and layouts (vectors, matrices, and tensors) and is ideal for astronomical data processing. The VSIPL API is intended to serve as an open, vendor-neutral, industry-standard interface. The object-based VSIPL API abstracts the memory architecture of the underlying machine by using the concept of memory blocks and views. Early experiments with VSIPL code conversions have been carried out by the High Performance Computing Program team at UCSD. Commercially, several major vendors of signal processors are actively developing implementations. VSIPL has also been explicitly required as part of a recent Rome Labs teraflop procurement. This poster presents the VSIPL API, its functionality, and the status of various implementations.
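The block/view abstraction at the heart of the API can be illustrated conceptually (here with NumPy views rather than the actual VSIPL C calls): several views with different shapes are bound to one underlying memory block, so an operation through one view is visible through the others.

```python
import numpy as np

# Conceptual illustration (NumPy, not the VSIPL C API) of blocks and views:
# one contiguous memory block, with vector and matrix views bound to it.
block = np.zeros(16)                      # the raw memory block

vector_view = block[0:8]                  # a length-8 vector view on the block
matrix_view = block[0:12].reshape(3, 4)   # a 3x4 matrix view on the same data

vector_view[:] = np.arange(8)             # writing through one view...
print(matrix_view[0])                     # ...is visible through the other:
                                          # [0. 1. 2. 3.]
```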
Wang, Zhi; Liang, Jiabin; Rong, Xing; Zhou, Hao; Duan, Chuanwei; Du, Weijia; Liu, Yimin
2015-12-01
This study investigated noise hazards and their influence on hearing loss among workers in the automotive component manufacturing industry. Noise levels in the workplaces of automotive component manufacturing enterprises were measured, and hearing examinations were performed for workers to analyze the features and exposure levels of noise in each process, as well as their influence on hearing loss. Among the manufacturing processes for different products in this industry, the manufacturing of automobile hubs and of suspension and steering systems had the highest degrees of noise hazard, with over-standard rates of 79.8% and 57.1%, respectively. Among the different technical processes, punching and casting had the highest degrees of noise hazard, with over-standard rates of 65.0% and 50%, respectively. Workers engaged in automotive air conditioning system manufacturing had the highest rate of abnormal hearing (up to 3.1%). In the automotive component manufacturing industry, noise hazards seriously exceed the standard. Although the rate of abnormal hearing is lower than the average for the automobile manufacturing industry in China, it tends to increase gradually. Sufficient emphasis should be placed on the noise hazards in this industry.
40 CFR Table 1 to Subpart Uuuu of... - Emission Limits and Work Practice Standards
Code of Federal Regulations, 2014 CFR
2014-07-01
... least once per month as specified in § 63.148(f)(2)). 12. heat exchanger system that cools process equipment or materials in the process unit each existing or new affected source monitor and repair the heat exchanger system according to § 63.104(a) through (e), except that references to “chemical manufacturing...
The Bologna Club: What U.S. Higher Education Can Learn from a Decade of European Reconstruction
ERIC Educational Resources Information Center
Adelman, Clifford
2008-01-01
This report examines the efforts of 46 European nations to harmonize (not "standardize") their higher education systems and indicates that the United States higher education system needs to adopt some of the features of the Bologna Process. Based on what can be learned from the Bologna Process, this report makes concrete suggestions for…
NASA Astrophysics Data System (ADS)
Hu, Chen; Chen, Mian-zhou; Li, Hong-bin; Zhang, Zhu; Jiao, Yang; Shao, Haiming
2018-05-01
Ordinarily, electronic voltage transformers (EVTs) are calibrated off-line, and the calibration procedure requires complex switching operations, which affect the reliability of the power grid and cause large economic losses. To overcome this problem, this paper investigates a 110 kV on-site calibration system for EVTs, comprising a standard channel, a calibrated channel, and a PC running the LabView environment. The standard channel employs a standard capacitor and an analogue integrating circuit to reconstruct the primary voltage signal. Moreover, an adaptive full-phase discrete Fourier transform (DFT) algorithm is proposed to extract the electrical parameters. The algorithm involves extracting the frequency of the grid, adjusting the operating points, and calculating the results using the DFT. In addition, an insulated automatic lifting device, driven by a wireless remote controller, is designed to realize live connection of the standard capacitor. A performance test verifies the accuracy of the standard capacitor. A system calibration test shows that the system ratio error is less than 0.04% and the phase error is below 2′, which meets the requirement of the 0.2 accuracy class. Finally, the developed calibration system was used in a substation, and the field test data validate the availability of the system.
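The parameter-extraction step can be sketched as a single-bin DFT at the pre-estimated grid frequency, evaluated over an integer number of periods, from which ratio and phase errors between the two channels follow. This is a simplified stand-in for the paper's adaptive full-phase DFT algorithm; all numbers below are illustrative.

```python
import numpy as np

def fundamental_phasor(signal, fs, f0):
    """Single-bin DFT at the (pre-estimated) grid frequency f0, evaluated over
    an integer number of periods to limit spectral leakage."""
    n_per_period = fs / f0
    n = int(round(n_per_period * np.floor(len(signal) / n_per_period)))
    t = np.arange(n) / fs
    return np.sum(signal[:n] * np.exp(-2j * np.pi * f0 * t)) * 2 / n

def ratio_and_phase_error(standard, tested, fs, f0):
    """Ratio error (%) and phase error (arcmin) of the tested channel
    relative to the standard channel."""
    ps = fundamental_phasor(standard, fs, f0)
    pt = fundamental_phasor(tested, fs, f0)
    ratio_err = (abs(pt) - abs(ps)) / abs(ps) * 100.0
    phase_err = np.angle(pt / ps) * 180.0 / np.pi * 60.0  # degrees -> arcmin
    return ratio_err, phase_err

# Illustration: a tested channel with +0.02% ratio error and 1' phase error.
fs, f0 = 10000.0, 50.0
t = np.arange(0, 0.2, 1 / fs)
std = np.sin(2 * np.pi * f0 * t)
tst = 1.0002 * np.sin(2 * np.pi * f0 * t + (1 / 60.0) * np.pi / 180.0)
print(ratio_and_phase_error(std, tst, fs, f0))  # approx (0.02, 1.0)
```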
Design and implementation of the standards-based personal intelligent self-management system (PICS).
von Bargen, Tobias; Gietzelt, Matthias; Britten, Matthias; Song, Bianying; Wolf, Klaus-Hendrik; Kohlmann, Martin; Marschollek, Michael; Haux, Reinhold
2013-01-01
Against the background of demographic change and a diminishing care workforce, there is a growing need for personalized decision support. The aim of this paper is to describe the design and implementation of the standards-based personal intelligent care system (PICS). PICS makes consistent use of internationally accepted standards, such as the Health Level 7 (HL7) Arden syntax for the representation of the decision logic and the HL7 Clinical Document Architecture for information representation, and is based on an open-source service-oriented architecture framework and a business process management system. Its functionality is exemplified for the application scenario of a patient suffering from congestive heart failure. Several vital signs sensors provide data for the decision support system, and a number of flexible communication channels are available for interaction with the patient or caregiver. PICS is a standards-based, open, and flexible system enabling personalized decision support. Further development will include the implementation of components on small computers and sensor nodes.
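The kind of decision logic involved can be illustrated with a small rule in the spirit of an Arden Syntax medical logic module for the heart-failure scenario; the threshold, window, and message below are hypothetical, not taken from the PICS knowledge base.

```python
from dataclasses import dataclass

# Illustrative decision rule; thresholds are hypothetical examples only.
@dataclass
class WeightReading:
    day: int      # day index of the measurement
    kg: float     # measured body weight

def weight_gain_alert(readings, gain_kg=2.0, window_days=3):
    """Alert when body weight rises by more than gain_kg within window_days,
    a commonly monitored decompensation sign in heart failure."""
    by_day = {r.day: r.kg for r in readings}
    for day, kg in by_day.items():
        earlier = by_day.get(day - window_days)
        if earlier is not None and kg - earlier > gain_kg:
            return f"Weight gain of {kg - earlier:.1f} kg in {window_days} days"
    return None

readings = [WeightReading(0, 82.0), WeightReading(3, 84.5)]
print(weight_gain_alert(readings))  # triggers: 2.5 kg gain in 3 days
```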
1988-11-01
…system, using graphic techniques which enable users, analysts, and designers to get a clear and common picture of the system and how its parts fit… boxes into hierarchies suitable for computer implementation… Structured Design uses tools, especially graphic ones, to render systems readily…
The systems engineering overview and process (from the Systems Engineering Management Guide, 1990)
NASA Technical Reports Server (NTRS)
1993-01-01
The past several decades have seen the rise of large, highly interactive systems that are on the forward edge of technology. As a result of this growth and the increased usage of digital systems (computers and software), the concept of systems engineering has gained increasing attention. Some of this attention is no doubt due to large program failures which possibly could have been avoided, or at least mitigated, through the use of systems engineering principles. The complexity of modern day weapon systems requires conscious application of systems engineering concepts to ensure producible, operable and supportable systems that satisfy mission requirements. Although many authors have traced the roots of systems engineering to earlier dates, the initial formalization of the systems engineering process for military development began to surface in the mid-1950s on the ballistic missile programs. These early ballistic missile development programs marked the emergence of engineering discipline 'specialists', a trend which has since continued to grow. Each of these specialties not only has a need to take data from the overall development process, but also to supply data, in the form of requirements and analysis results, to the process. A number of technical instructions, military standards and specifications, and manuals were developed as a result of these development programs. In particular, MIL-STD-499 was issued in 1969 to assist both government and contractor personnel in defining the systems engineering effort in support of defense acquisition programs. This standard was updated to MIL-STD-499A in 1974, and formed the foundation for current application of systems engineering principles to military development programs.
Integration of CBIR in radiological routine in accordance with IHE
NASA Astrophysics Data System (ADS)
Welter, Petra; Deserno, Thomas M.; Fischer, Benedikt; Wein, Berthold B.; Ott, Bastian; Günther, Rolf W.
2009-02-01
Increasing use of digital image processing leads to an enormous amount of imaging data. Access to picture archiving and communication systems (PACS), however, is text-based only, leading to sparse retrieval results because of ambiguous or missing image descriptions. Content-based image retrieval (CBIR) systems can improve the clinical diagnostic outcome significantly. However, current CBIR systems are not able to integrate their results with the clinical workflow and PACS. Existing communication standards like DICOM and HL7 leave many options for implementation and do not ensure full interoperability. We present a concept for the standardized integration of a CBIR system into the radiology workflow in accordance with the Integrating the Healthcare Enterprise (IHE) framework. This is based on the IHE integration profile 'Post-Processing Workflow' (PPW), which defines responsibilities as well as standardized communication, and utilizes the DICOM Structured Report (DICOM SR). Because most PACS and RIS today are not yet fully IHE compliant with respect to PPW, we also suggest an intermediate approach based on the concepts of the CAD-PACS Toolkit. The integration is independent of the particular PACS and RIS, and therefore supports the widespread application of CBIR in radiological routine. As an example, the approach is applied to the Image Retrieval in Medical Applications (IRMA) framework.
40 CFR 63.867 - Reporting requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Emission Standards for Hazardous Air Pollutants for Chemical Recovery Combustion Sources at Kraft, Soda...) Additional reporting requirements for HAP metals standards. (1) Any owner or operator of a group of process units in a chemical recovery system at a mill complying with the PM emissions limits in § 63.862(a)(1...
ISO 9000 Quality Systems: Application to Higher Education.
ERIC Educational Resources Information Center
Clery, Roger G.
This paper describes and explains the 20 elements of the International Organization for Standardization (ISO) 9000 series, a model for quality assurance in the business processes of design/development, production, installation, and servicing. The standards were designed in 1987 to provide a common denominator for business quality, particularly to…
Applying Standard Interfaces to a Process-Control Language
NASA Technical Reports Server (NTRS)
Berthold, Richard T.
2005-01-01
A method of applying open-operating-system standard interfaces to the NASA User Interface Language (UIL) has been devised. UIL is a computing language that can be used in monitoring and controlling automated processes: for example, the Timeliner computer program, written in UIL, is a general-purpose software system for monitoring and controlling sequences of automated tasks in a target system. In providing the major elements of connectivity between UIL and the target system, the present method offers advantages over the prior method. Most notably, unlike in the prior method, the software description of the target system can be made independent of the applicable compiler software and need not be linked to the applicable executable compiler image. Also unlike in the prior method, it is not necessary to recompile the source code and relink the source code to a new executable compiler image. Abstraction of the description of the target system to a data file can be defined easily, with intuitive syntax, and knowledge of the source-code language is not needed for the definition.
Tailoring Systems Engineering Projects for Small Satellite Missions
NASA Technical Reports Server (NTRS)
Horan, Stephen; Belvin, Keith
2013-01-01
NASA maintains excellence in its spaceflight systems by utilizing rigorous engineering processes based on over 50 years of experience. The NASA systems engineering process for flight projects described in NPR 7120.5E was initially developed for major flight projects. The design and development of low-cost small satellite systems does not entail the financial and risk consequences traditionally associated with spaceflight projects. Consequently, an approach to tailoring the processes is offered so that small satellite missions benefit from the engineering rigor without overly burdensome overhead. In this paper we outline the approaches to tailoring the standard processes for these small missions and describe how the tailoring will be applied in a proposed small satellite mission.
Detwiller, Maureen; Petillion, Wendy
2014-06-01
Moving a large healthcare organization from an old, nonstandardized clinical information system to a new user-friendly, standards-based system was much more than an upgrade to technology. This project to standardize terminology, optimize key processes, and implement a new clinical information system was a large change initiative over 4 years that affected clinicians across the organization. Effective change management and engagement of clinical stakeholders were critical to the success of the initiative. The focus of this article was to outline the strategies and methodologies used and the lessons learned.
Standardization of quality control plans for highway bridges in Europe: COST Action TU 1406
NASA Astrophysics Data System (ADS)
Casas, Joan R.; Matos, Jose Campos e.
2017-09-01
In Europe, as all over the world, the need to manage roadway bridges in an efficient way has led to the development of different management systems. Hence, nowadays, many European countries have their own system. Although these systems present a similar architectural framework, several differences can be identified. These differences constitute a divergent mechanism that may lead to different decisions on maintenance actions. Within the roadway bridge management process, the identification of maintenance needs is more effective when developed in a uniform and repeatable manner. This can be accomplished by the identification of performance indicators and the definition of performance goals and key performance indicators (KPI), improving the planning of maintenance strategies. Therefore, a discussion at the European level, seeking to achieve a standardized approach in this subject, will bring significant benefits. Accordingly, a COST Action is under way in Europe with the aim of standardizing the establishment of quality control plans for roadway bridges.
Automated installation methods for photovoltaic arrays
NASA Astrophysics Data System (ADS)
Briggs, R.; Daniels, A.; Greenaway, R.; Oster, J., Jr.; Racki, D.; Stoeltzing, R.
1982-11-01
Since installation expenses constitute a substantial portion of the cost of a large photovoltaic power system, methods for reducing these costs were investigated. The installation of the photovoltaic arrays includes all areas, starting with site preparation (i.e., trenching, wiring, drainage, foundation installation, lightning protection, grounding, and installation of the panel) and concluding with the termination of the bus at the power conditioner building. To identify the optimum combination of standard installation procedures and automated/mechanized techniques, the installation process was investigated, including the equipment and hardware available, the photovoltaic array structure systems and interfaces, and the array field and site characteristics. Preliminary designs of hardware for the standard installation method, the automated/mechanized method, and a mix of standard and mechanized procedures were identified to determine which process most effectively reduced installation costs. In addition, costs associated with each type of installation method and with the design, development, and fabrication of new installation hardware were generated.
Expert system verification and validation study. Delivery 3A and 3B: Trip summaries
NASA Technical Reports Server (NTRS)
French, Scott
1991-01-01
Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how these might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods for applying mathematical techniques to the verification of rule bases and on techniques for capturing information relating to the software development process. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.
Coherent Frequency Reference System for the NASA Deep Space Network
NASA Technical Reports Server (NTRS)
Tucker, Blake C.; Lauf, John E.; Hamell, Robert L.; Gonzaler, Jorge, Jr.; Diener, William A.; Tjoelker, Robert L.
2010-01-01
The NASA Deep Space Network (DSN) requires state-of-the-art frequency references that are derived and distributed from very stable atomic frequency standards. A new Frequency Reference System (FRS) and Frequency Reference Distribution System (FRD) have been developed, which together replace the previous Coherent Reference Generator System (CRG). The FRS and FRD each provide new capabilities that significantly improve operability and reliability. The FRS allows for selection and switching between frequency standards, a flywheel capability (to avoid interruptions when switching frequency standards), and a frequency synthesis system (to generate standardized 5-, 10-, and 100-MHz reference signals). The FRS is powered by redundant, specially filtered, and sustainable power systems and includes a monitor and control capability for station operations to interact and control the frequency-standard selection process. The FRD receives the standardized 5-, 10-, and 100-MHz reference signals and distributes signals to distribution amplifiers in a fan out fashion to dozens of DSN users that require the highly stable reference signals. The FRD is also powered by redundant, specially filtered, and sustainable power systems. The new DSN Frequency Distribution System, which consists of the FRS and FRD systems described here, is central to all operational activities of the NASA DSN. The frequency generation and distribution system provides ultra-stable, coherent, and very low phase-noise references at 5, l0, and 100 MHz to between 60 and 100 separate users at each Deep Space Communications Complex.
National Incident Management System (NIMS) Standards Review Panel Workshop Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stenner, Robert D.; Kirk, Jennifer L.; Stanton, James R.
The importance of and need for fully compliant implementation of NIMS nationwide was clearly demonstrated during the Hurricane Katrina event, as expressed in Secretary Chertoff's October 4, 2005 letter addressed to the States' governors. It states, "Hurricane Katrina was a stark reminder of how critical it is for our nation to approach incident management in a coordinated, consistent, and efficient manner. We must be able to come together, at all levels of government, to prevent, prepare for, respond to, and recover from any emergency or disaster. Our operations must be seamless and based on common incident management doctrine, because the challenges we face as a nation are far greater than the capabilities of any one jurisdiction." The NIMS is a system/architecture for organizing response on a national level. It incorporates ICS as a main component of that structure (i.e., it institutionalizes ICS in NIMS). In a paper published on the NIMS Website, the following statements were made: "NIMS represents a core set of doctrine, principles, terminology, and organizational processes to enable effective, efficient and collaborative incident management at all levels. To provide the framework for interoperability and compatibility, the NIMS is based on a balance between flexibility and standardization." Thus the NIC is challenged with the need to adopt quality SDO-generated standards to support NIMS compliance, while maintaining the flexibility necessary so that response operations can be tailored to the specific jurisdictional and geographical needs across the nation. In support of this large and complex challenge facing the NIC, the Pacific Northwest National Laboratory (PNNL) was asked to provide technical support to the NIC, through its DHS Science and Technology Standards Portfolio contract, to help identify, review, and develop key standards for NIMS compliance. Upon examining the challenge, the following general process appears to be a reasonable approach for identifying and establishing existing standards applicable to NIMS compliance: (1) establish search criteria from the NIMS and its support documents, (2) search SDO databases to identify key existing nationally and/or internationally recognized standards that have potential application to NIMS compliance needs, (3) review the identified standards against the specific component needs of the NIMS, (4) identify the pertinent aspects/components of those standards that clearly address specific NIMS compliance needs, (5) establish a process to adopt the pertinent standards, which includes the generation of formalized FEMA guidance that identifies the specific NIMS component compliance needs addressed in the respective standard, (6) develop performance criteria with which to measure compliance with the identified NIMS components addressed by the respective adopted standard, and (7) adopt the standard, publish the guidance and performance criteria, and incorporate them into routine FEMA/NIC NIMS management operations. This review process will also help identify real gaps in standards for which new NIMS-specific standards should be developed.
To jump start this process and hopefully identify some key ''low hanging fruit'' standards the NIC could use to begin such a process, a panel of first-responder experts (familiar with the current standards of common use in the first-responder community) from various response disciplines was formed and a workshop held. The workshop included a pre-workshop information gathering process. This report discusses the workshop and its findings in detail.
Multimission image processing and science data visualization
NASA Technical Reports Server (NTRS)
Green, William B.
1993-01-01
The Operational Science Analysis (OSA) functional area supports science instrument data display, analysis, visualization, and photo processing in support of flight operations of planetary spacecraft managed by the Jet Propulsion Laboratory (JPL). This paper describes the data products generated by the OSA functional area and the current computer system used to generate these data products. The objectives of a system upgrade now in progress are described. The design approach for the new system, including the use of the Unix operating system and X Window System display standards to provide platform independence, portability, and modularity, is reviewed. The new system should provide a modular and scalable capability supporting a variety of future missions at JPL.
Device and methods for "gold standard" registration of clinical 3D and 2D cerebral angiograms
NASA Astrophysics Data System (ADS)
Madan, Hennadii; Likar, Boštjan; Pernuš, Franjo; Špiclin, Žiga
2015-03-01
Translation of novel and existing 3D-2D image registration methods into clinical image-guidance systems is limited by the lack of objective validation on clinical image datasets. The main reason is that, besides the calibration of the 2D imaging system, a reference or "gold standard" registration is very difficult to obtain on clinical image datasets. In the context of cerebral endovascular image-guided interventions (EIGIs), we present a calibration device in the form of a headband with integrated fiducial markers and propose an automated pipeline comprising 3D and 2D image processing, analysis, and annotation steps, the result of which is a retrospective calibration of the 2D imaging system and an optimal, i.e., "gold standard," registration of 3D and 2D images. The device and methods were used to create the "gold standard" on 15 datasets of 3D and 2D cerebral angiograms, each acquired on a patient undergoing EIGI for either aneurysm coiling or embolization of an arteriovenous malformation. The use of the device integrated seamlessly into the clinical workflow of EIGI, while the automated pipeline eliminated all manual input and interactive image processing, analysis, or annotation. In this way, the time to obtain the "gold standard" was reduced from 30 minutes to less than one minute, and the "gold standard" 3D-2D registration on all 15 datasets of cerebral angiograms was obtained with sub-0.1 mm accuracy.
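The accuracy figure quoted above is the kind of number that falls out of a fiducial-based reprojection check. The following Python sketch shows one plausible way to score such a registration; the projection matrix, marker coordinates, and annotated positions are fabricated for illustration and do not come from the paper.

```python
# Minimal sketch (not the authors' pipeline): project known 3D fiducial
# centers through a calibrated 3x4 projection matrix and measure the 2D
# reprojection error against their annotated 2D positions.
import numpy as np

K = np.array([[1200.0, 0.0, 256.0],
              [0.0, 1200.0, 256.0],
              [0.0, 0.0, 1.0]])                         # assumed intrinsics
Rt = np.hstack([np.eye(3), [[0.0], [0.0], [800.0]]])    # assumed rigid pose
P = K @ Rt                                              # 3x4 projection matrix

fiducials_3d = np.array([[10.0, -5.0, 50.0],            # headband marker centers
                         [-12.0, 8.0, 55.0],
                         [4.0, 15.0, 48.0]])
annotated_2d = np.array([[270.2, 249.0], [239.1, 267.3], [261.7, 277.2]])

homog = np.hstack([fiducials_3d, np.ones((3, 1))])
proj = (P @ homog.T).T
reproj_2d = proj[:, :2] / proj[:, 2:3]                  # perspective divide
errors = np.linalg.norm(reproj_2d - annotated_2d, axis=1)
print("mean reprojection error (pixels):", errors.mean())
```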
NASA Astrophysics Data System (ADS)
Moreira, Francisco; Silva, Nuno
2016-08-01
Safety systems require accident avoidance. This is covered by application standards, processes, techniques, and tools that support the identification, analysis, and elimination or reduction to an acceptable level of system risks and hazards. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially concerning completeness of the hazards, formalization, and timely analysis that can influence the specifications and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever any significant change is proposed to the railway system in a European Member State. This paper provides insights into the fundamentals of CSM-RA, based on and complemented with hazard analysis. It highlights when and how to apply these processes and their relation and similarities to industry standards and system life cycles. Finally, the paper shows how CSM-RA can form the basis of a change management process, guiding the identification and management of hazards and helping to ensure a safety level similar to that of the initial system. The paper also shows how the CSM-RA principles can be used in other domains, particularly for space system evolution.
Approach for Configuring a Standardized Vessel for Processing Radioactive Waste Slurries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bamberger, Judith A.; Enderlin, Carl W.; Minette, Michael J.
2015-09-10
A standardized vessel design is being considered at the Waste Treatment and Immobilization Plant (WTP) that is under construction at Hanford, Washington. The standardized vessel design will be used for storing, blending, and chemical processing of slurries that exhibit a variable process feed including Newtonian to non-Newtonian rheologies over a range of solids loadings. Developing a standardized vessel is advantageous and reduces the testing required to evaluate the performance of the design. The objectives of this paper are to: 1) present a design strategy for developing a standard vessel mixing system design for the pretreatment portion of the waste treatment plant that must process rheologically and physically challenging process streams, 2) identify performance criteria that the design for the standard vessel must satisfy, 3) present parameters that are to be used for assessing the performance criteria, and 4) describe operation of the selected technology. Vessel design performance will be assessed for both Newtonian and non-Newtonian simulants which represent a range of waste types expected during operation. Desired conditions for the vessel operations are the ability to shear the slurry so that flammable gas does not accumulate within the vessel, that settled solids will be mobilized, that contents can be blended, and that contents can be transferred from the vessel. A strategy is presented for adjusting the vessel configuration to ensure that all these conditions are met.
Three-dimensional measurement system for crime scene documentation
NASA Astrophysics Data System (ADS)
Adamczyk, Marcin; Hołowko, Elwira; Lech, Krzysztof; Michoński, Jakub; Mączkowski, Grzegorz; Bolewicki, Paweł; Januszkiewicz, Kamil; Sitnik, Robert
2017-10-01
Three-dimensional measurement techniques (such as photogrammetry, time of flight, structure from motion, or structured light) are becoming a standard in the crime scene documentation process. The use of 3D measurement techniques provides an opportunity to prepare a more insightful investigation and helps to show every trace in the context of the entire crime scene. In this paper we present a hierarchical, three-dimensional measurement system designed for the crime scene documentation process. Our system reflects current standards in crime scene documentation: it performs measurements in two stages. The first, most general stage of documentation uses a scanner with relatively low spatial resolution but a large measuring volume and covers the whole scene. The second stage is much more detailed: high resolution but a smaller measuring volume, for areas that require a more detailed approach. The documentation process is supervised by a specialized application, CrimeView3D, a software platform for measurement management (connecting with scanners, carrying out measurements, and automatic or semi-automatic data registration in real time) and data visualization (3D visualization of documented scenes). It also provides a series of useful tools for forensic technicians: a virtual measuring tape, searching for sources of blood spatter, a virtual walk-through of the crime scene, and many others. We also report the results of a metrological validation of the scanners, performed according to the VDI/VDE standard, and outcomes from measurement sessions conducted at real crime scenes in cooperation with technicians from the Central Forensic Laboratory of the Police.
NASA Astrophysics Data System (ADS)
Jöckel, P.; Sander, R.; Kerkweg, A.; Tost, H.; Lelieveld, J.
2005-02-01
The development of a comprehensive Earth System Model (ESM) to study the interactions between chemical, physical, and biological processes requires coupling of the different domains (land, ocean, atmosphere, ...). One strategy is to link existing domain-specific models with a universal coupler, i.e., an independent standalone program organizing the communication between other programs. In many cases, however, a much simpler approach is more feasible. We have developed the Modular Earth Submodel System (MESSy). It comprises (1) a modular interface structure to connect submodels to a base model, (2) an extendable set of such submodels for miscellaneous processes, and (3) a coding standard. MESSy is therefore not a coupler in the classical sense, but exchanges data between a base model and several submodels within one comprehensive executable. The internal complexity of the submodels is controllable in a transparent and user-friendly way. This provides remarkable new possibilities to study feedback mechanisms (by two-way coupling). Note that the MESSy and the coupler approach can be combined. For instance, an atmospheric model implemented according to the MESSy standard could easily be coupled to an ocean model by means of an external coupler. The vision is to ultimately form a comprehensive ESM which includes a large set of submodels, and a base model which contains only a central clock and runtime control. This can be reached stepwise, since each process can be included independently. Starting from an existing model, process submodels can be reimplemented according to the MESSy standard. This procedure guarantees the availability of a state-of-the-art model for scientific applications at any time during the development. In principle, MESSy can be implemented into any kind of model, either global or regional. So far, the MESSy concept has been applied to the general circulation model ECHAM5 and a number of process box models.
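The base-model/submodel coupling idea can be illustrated with a toy example. The sketch below is not MESSy code; it only mimics the registration of process submodels with a base model that exchanges data through shared state, with invented process names and numbers.

```python
# Toy sketch of the coupling idea: process submodels register with a base
# model and exchange data through shared state, so each process can be
# switched on or off independently. Fields and rates are invented.
class BaseModel:
    def __init__(self):
        self.state = {"temperature": 288.0, "o3": 30.0}  # illustrative fields
        self.submodels = []

    def register(self, submodel):
        self.submodels.append(submodel)

    def step(self):
        for submodel in self.submodels:
            submodel.update(self.state)   # two-way coupling via shared state

class Submodel:
    def __init__(self, name, update):
        self.name, self.update = name, update

def radiation(state):
    state["temperature"] += 0.1           # stand-in for a radiative process

def chemistry(state):
    state["o3"] -= 0.05 * (state["temperature"] - 288.0)  # stand-in kinetics

model = BaseModel()
model.register(Submodel("RAD", radiation))
model.register(Submodel("CHEM", chemistry))
for _ in range(3):
    model.step()
print(model.state)
```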
Prue-Owens, Kathy; Watkins, Miko; Wolgast, Kelly A
2011-01-01
The Patient CaringTouch System emerged from a comprehensive assessment and gap analysis of clinical nursing capabilities in the Army. The Patient CaringTouch System now provides the framework and set of standards by which we drive excellence in quality nursing care for our patients and excellence in quality of life for our nurses in Army Medicine. As part of this enterprise transformation, we placed particular emphasis on the delivery of nursing care at the bedside as well as the integration of a formal professional peer feedback process in support of individual nurse practice enhancement. The Warrior Care Imperative Action Team was chartered to define and establish the standards for care teams in the clinical settings and the process by which we established formal peer feedback for our professional nurses. This back-to-basics approach is a cornerstone of the Patient CaringTouch System implementation and sustainment.
A reference model for space data system interconnection services
NASA Astrophysics Data System (ADS)
Pietras, John; Theis, Gerhard
1993-03-01
The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).
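The layering principle borrowed from the OSI-RM can be sketched as a stack of functions, each consuming the output of the layer below. The layer names and stub behaviors below are invented for illustration and are not taken from the SDSI-RM or CCSDS documents.

```python
# Minimal sketch of layered space data handling: each layer adds processing
# on top of the one below, analogous to OSI-RM stacking. All names are
# illustrative stand-ins, not CCSDS-defined services.
def packet_layer(raw_frames):
    """Reassemble source packets from transfer frames (stub)."""
    return [frame["payload"] for frame in raw_frames]

def data_product_layer(packets):
    """Group packets into a higher-level data product (stub)."""
    return {"product": packets, "count": len(packets)}

def information_layer(product):
    """Attach metadata so the product is usable by applications (stub)."""
    return {"metadata": {"packets": product["count"]}, "data": product["product"]}

frames = [{"payload": b"\x01\x02"}, {"payload": b"\x03\x04"}]
result = information_layer(data_product_layer(packet_layer(frames)))
print(result)
```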
A reference model for space data system interconnection services
NASA Technical Reports Server (NTRS)
Pietras, John; Theis, Gerhard
1993-01-01
The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).
Portable Wireless LAN Device and Two-way Radio Threat Assessment for Aircraft Navigation Radios
NASA Technical Reports Server (NTRS)
Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Williams, Reuben A.; Smith, Laura J.; Salud, Maria Theresa P.
2003-01-01
Measurement processes, data, and analysis are provided to address the concern that Wireless Local Area Network devices and two-way radios may cause electromagnetic interference to aircraft navigation radio systems. A radiated emission measurement process is developed, and spurious radiated emissions from various devices are characterized using reverberation chambers. Spurious radiated emissions in aircraft radio frequency bands from several wireless network devices are compared with baseline emissions from standard computer laptops and personal digital assistants. In addition, spurious radiated emission data in aircraft radio frequency bands from seven pairs of two-way radios are provided. A description of the measurement process, device modes of operation, and the measurement results is reported. Aircraft interference path loss measurements were conducted on four Boeing 747 and Boeing 737 aircraft for several aircraft radio systems. The measurement approach is described, and the path loss results are compared with existing data from reference documents, standards, and NASA partnerships. In-band on-channel interference thresholds are compiled from an existing reference document. Using these data, a risk assessment is provided for interference from wireless network devices and two-way radios to aircraft systems, including Localizer, Glideslope, Very High Frequency Omnidirectional Range, Microwave Landing System, and Global Positioning System. The report compares the interference risks associated with emissions from wireless network devices and two-way radios against standard laptops and personal digital assistants. Existing receiver interference threshold references are identified as requiring more data for better interference risk assessments.
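The risk assessment described above ultimately reduces to link-budget arithmetic: emission level minus interference path loss, compared against a receiver threshold. A minimal sketch follows; every dB value in it is an invented placeholder, not a measured result from the report.

```python
# Back-of-envelope sketch of the assessment arithmetic: received interference
# = device emission minus aircraft interference path loss; compare against
# the receiver's in-band interference threshold. Values are placeholders.
def interference_margin(emission_dbm, path_loss_db, threshold_dbm):
    received_dbm = emission_dbm - path_loss_db
    return threshold_dbm - received_dbm   # positive = emission stays below threshold

cases = {
    "WLAN device vs Localizer": (-70.0, 65.0, -106.0),
    "two-way radio vs GPS": (-30.0, 70.0, -120.0),
}
for name, (emission_dbm, loss_db, threshold_dbm) in cases.items():
    margin = interference_margin(emission_dbm, loss_db, threshold_dbm)
    verdict = "acceptable" if margin > 0 else "potential interference"
    print(f"{name}: margin {margin:+.1f} dB -> {verdict}")
```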
Standard model light-by-light scattering in SANC: Analytic and numeric evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardin, D. Yu., E-mail: bardin@nu.jinr.ru; Kalinovskaya, L. V., E-mail: kalinov@nu.jinr.ru; Uglov, E. D., E-mail: corner@nu.jinr.r
2010-11-15
The implementation of the Standard Model process γγ → γγ through a fermion and boson loop into the framework of the SANC system, and additional precomputation modules used for calculation of massive box diagrams, are described. The computation of this process takes into account the nonzero mass of the loop particles. The covariant and helicity amplitudes for this process, some particular cases of the D₀ and C₀ Passarino-Veltman functions, and numerical results of the corresponding SANC module evaluation are presented. Whenever possible, the results are compared with those existing in the literature.
Dewan, Shaveta; Sibal, Anupam; Uberoi, R S; Kaur, Ishneet; Nayak, Yogamaya; Kar, Sujoy; Loria, Gaurav; Yatheesh, G; Balaji, V
2014-01-01
Creating and implementing processes to deliver quality care in compliance with accreditation standards is a challenging task, but even more daunting is sustaining these processes and systems. There is a need for frequent monitoring of the gap between the expected level of care and the level of care actually delivered so as to achieve a consistent level of care. The Apollo Accreditation Program (AAP) was implemented as a web-based, single measurable dashboard to display, measure, and compare compliance levels for established standards of care in JCI-accredited hospitals every quarter, and resulted in an overall 15.5% improvement in compliance levels over one year.
Implementation of the fugitive emissions system program: The OxyChem experience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deshmukh, A.
An overview is provided of the Fugitive Emissions System (FES) that has been implemented at Occidental Chemical in conjunction with the computer-based maintenance system called PassPort® developed by Indus Corporation. The goal of the PassPort® FES program has been to interface with facilities data, equipment information, work standards, and work orders. Along the way, several implementation hurdles had to be overcome before a monitoring and regulatory system could be standardized for the appropriate maintenance, process, and environmental groups. This presentation includes a step-by-step account of several case studies that developed during the implementation of the FES system.
75 FR 6364 - Process for Requesting a Variance From Vegetation Standards for Levees and Floodwalls
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-09
..., channels, or shoreline or river-bank protection systems such as revetments, sand dunes, and barrier...) toe (subject to preexisting right-of-way). f. The vegetation variance process is not a mechanism to...
Generic System for Remote Testing and Calibration of Measuring Instruments: Security Architecture
NASA Astrophysics Data System (ADS)
Jurčević, M.; Hegeduš, H.; Golub, M.
2010-01-01
Testing and calibration of laboratory instruments and reference standards is a routine activity and a resource- and time-consuming process. Since many modern instruments include communication interfaces, it is possible to create a remote calibration system. This approach addresses a wide range of possible applications and permits driving a number of different devices. On the other hand, the remote calibration process involves a number of security issues due to the recommendations specified in standard ISO/IEC 17025, since it is not under the total control of the calibration laboratory personnel who will sign the calibration certificate. This implies that the traceability and integrity of the calibration process directly depend on the collected measurement data. Reliable and secure remote control and monitoring of instruments is therefore a crucial aspect of an internet-enabled calibration procedure.
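One generic way to protect the integrity and traceability of remotely collected measurement data is to authenticate each record with a keyed hash. The sketch below shows that pattern with Python's standard hmac module; it is a generic illustration, not the security architecture proposed in the paper, and the key, field names, and values are invented.

```python
# Illustrative integrity measure: the instrument side signs each measurement
# record with a shared-secret HMAC so the calibration laboratory can detect
# tampering in transit. Generic pattern, not the paper's architecture.
import hmac, hashlib, json

SECRET = b"calibration-lab-shared-key"   # assumed pre-shared key

def sign_record(record: dict) -> dict:
    payload = json.dumps(record, sort_keys=True).encode()
    record["hmac"] = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return record

def verify_record(record: dict) -> bool:
    claimed = record.pop("hmac")
    payload = json.dumps(record, sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(claimed, expected)

rec = sign_record({"instrument": "DMM-17", "reading_v": 9.99987,
                   "t": "2010-01-01T12:00Z"})
print("verified:", verify_record(dict(rec)))
```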
Twomey, Michèle; Šijački, Ana; Krummrey, Gert; Welzel, Tyson; Exadaktylos, Aristomenis K; Ercegovac, Marko
2018-03-12
Emergency center visits are mostly unscheduled, undifferentiated, and unpredictable. A standardized triage process is an opportunity to obtain real-time data that paints a picture of the variation in acuity found in emergency centers. This is particularly pertinent as the influx of people seeking asylum or in transit mostly present with emergency care needs or first seek help at an emergency center. Triage not only reduces the risk of missing or losing a patient that may be deteriorating in the waiting room but also enables a time-critical response in the emergency care service provision. As part of a joint emergency care system strengthening and patient safety initiative, the Serbian Ministry of Health in collaboration with the Centre of Excellence in Emergency Medicine (CEEM) introduced a standardized triage process at the Clinical Centre of Serbia (CCS). This paper describes four crucial stages that were considered for the integration of a standardized triage process into acute care pathways.
On-board Attitude Determination System (OADS) [for advanced spacecraft missions]
NASA Technical Reports Server (NTRS)
Carney, P.; Milillo, M.; Tate, V.; Wilson, J.; Yong, K.
1978-01-01
The requirements, capabilities, and system design for an on-board attitude determination system (OADS) to be flown on advanced spacecraft missions were determined. Based upon the OADS requirements and system performance evaluation, a preliminary on-board attitude determination system is proposed. The proposed OADS consists of one NASA Standard IRU (DRIRU-2) as the primary attitude determination sensor, two improved NASA Standard Star Trackers (SST) for periodic update of attitude information, a GPS receiver to provide on-board space vehicle position and velocity vector information, and a multiple-microcomputer system for data processing and attitude determination functions. The functional block diagram of the proposed OADS is shown. The computational requirements are evaluated based upon this proposed system.
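The division of labor among the proposed sensors (IRU for continuous propagation, star trackers for periodic updates) can be caricatured in one axis. The toy sketch below uses invented rates, biases, and gains; it is not the OADS algorithm, just an illustration of why periodic star-tracker fixes bound gyro drift.

```python
# Toy one-axis illustration: the gyro (IRU) propagates attitude between the
# periodic star-tracker fixes that bound its drift. All numbers are invented.
import random

true_rate = 0.01           # deg/s, assumed constant true body rate
gyro_bias = 0.002          # deg/s, uncompensated drift
dt = 1.0
attitude_est, attitude_true = 0.0, 0.0

for t in range(1, 601):
    attitude_true += true_rate * dt
    attitude_est += (true_rate + gyro_bias) * dt         # IRU propagation drifts
    if t % 120 == 0:                                     # periodic SST update
        star_fix = attitude_true + random.gauss(0.0, 0.003)
        attitude_est += 0.9 * (star_fix - attitude_est)  # pull toward the fix

print(f"final attitude error: {abs(attitude_est - attitude_true):.4f} deg")
```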
7 CFR 274.4 - Reconciliation and reporting.
Code of Federal Regulations, 2014 CFR
2014-01-01
... basis and consist of: (1) Information on how the system operates relative to its performance standards..., shall be submitted by each State agency operating an issuance system. The report shall be prepared at... reconciliation process. The EBT system shall provide reports and documentation pertaining to the following: (1...
Effectiveness of the Department of Defense Information Assurance Accreditation Process
2013-03-01
meeting the requirements of ISO 27001, Information Security Management System. ISO 27002 provides "security techniques" or best practices that can be... efforts to the next level and implement a recognized standard such as the International Organization for Standardization (ISO) 27000 series of standards... implemented by an organization as part of their certification effort. Most likely, the main motivation a company would have for achieving an ISO
NASA Astrophysics Data System (ADS)
Davies, D.; Murphy, K. J.; Michael, K.
2013-12-01
NASA's Land Atmosphere Near real-time Capability for EOS (Earth Observing System) (LANCE) provides data and imagery from the Terra, Aqua, and Aura satellites in less than 3 hours from satellite observation, to meet the needs of the near real-time (NRT) applications community. This article describes the architecture of LANCE and outlines the modifications made to achieve the 3-hour latency requirement, with a view to informing future NRT satellite distribution capabilities. It also describes how latency is determined. LANCE is a distributed system that builds on existing EOS Data and Information System (EOSDIS) capabilities. To achieve the NRT latency requirement, many components of the EOS satellite operations, ground, and science processing systems have been made more efficient without compromising the quality of science data processing. The EOS Data and Operations System (EDOS) processes the NRT stream with higher priority than the science data stream in order to minimize latency. In addition to expediting transfer times, the key difference between the NRT Level 0 products and those for standard science processing is the data used to determine the precise location and tilt of the satellite. Standard products use definitive geo-location (attitude and ephemeris) data provided daily, whereas NRT products use predicted geo-location provided by the instrument Global Positioning System (GPS) or an approximation of navigational data (depending on platform). Level 0 data are processed into higher-level products at designated Science Investigator-led Processing Systems (SIPS). The processes used by LANCE have been streamlined and adapted to work with datasets as soon as they are downlinked from satellites or transmitted from ground stations. Level 2 products that require ancillary data have modified production rules that relax the requirements for ancillary data, thereby reducing processing times. Looking to the future, experience gained from LANCE can provide valuable lessons on satellite and ground system architectures and on how the delivery of NRT products from other NASA missions might be achieved.
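The two scheduling ideas described above (NRT granules jump the queue, and NRT products accept predicted rather than definitive geolocation) can be sketched with a priority queue. The field names and granule records below are invented for illustration.

```python
# Sketch of NRT prioritization: NRT granules are processed ahead of the
# standard science stream, using predicted geolocation instead of waiting
# for definitive data. Fields and IDs are invented placeholders.
import heapq

queue = []

def submit(granule, nrt):
    priority = 0 if nrt else 1          # NRT stream preempts standard stream
    heapq.heappush(queue, (priority, granule["id"], granule))

submit({"id": "g1", "geolocation": "definitive"}, nrt=False)
submit({"id": "g2", "geolocation": "predicted"}, nrt=True)
submit({"id": "g3", "geolocation": "predicted"}, nrt=True)

while queue:
    _, _, granule = heapq.heappop(queue)
    print("processing", granule["id"], "with", granule["geolocation"], "geolocation")
```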
40 CFR 61.242-3 - Standards: Compressors.
Code of Federal Regulations, 2010 CFR
2010-07-01
... barrier fluid system degassing reservoir that is routed to a process or fuel gas system or connected by a... paragraphs (a)-(c) of this section shall be equipped with a sensor that will detect failure of the seal system, barrier fluid system, or both. (e)(1) Each sensor as required in paragraph (d) of this section...
40 CFR 61.242-3 - Standards: Compressors.
Code of Federal Regulations, 2011 CFR
2011-07-01
... barrier fluid system degassing reservoir that is routed to a process or fuel gas system or connected by a... paragraphs (a)-(c) of this section shall be equipped with a sensor that will detect failure of the seal system, barrier fluid system, or both. (e)(1) Each sensor as required in paragraph (d) of this section...
Development of the Gross Motor Function Classification System (1997)
ERIC Educational Resources Information Center
Morris, Christopher
2008-01-01
To address the need for a standardized system to classify the gross motor function of children with cerebral palsy, the authors developed a five-level classification system analogous to the staging and grading systems used in medicine. Nominal group process and Delphi survey consensus methods were used to examine content validity and revise the…
International Federation of Nurse Anesthetists' anesthesia program approval process.
Horton, B J; Anang, S P; Riesen, M; Yang, H-J; Björkelund, K B
2014-06-01
The International Federation of Nurse Anesthetists is improving anaesthesia patient care through a voluntary Anesthesia Program Approval Process (APAP) for schools and programmes. It is the result of a coordinated effort by anaesthesia leaders from many nations to implement a voluntary quality improvement system for education. These leaders firmly believe that meeting international education standards is an important way to improve anaesthesia, pain management and resuscitative care to patients worldwide. By 2013, 14 anaesthesia programmes from France, Iceland, Indonesia, Philippines, Sweden, Switzerland, Netherlands, Tunisia and the USA had successfully completed the process. Additional programmes were scheduled for review in 2014. Faculty from these programmes, who have successfully completed APAP, show how anaesthesia educators throughout the world seek to continually improve education and patient care by pledging to meet common education standards. As national governments, education ministers and heads of education institutions work to decrease shortages of healthcare workers, they would benefit from considering the value offered by quality improvement systems supported by professional organizations. When education programmes are measured against standards developed by experts in a profession, policy makers can be assured that the programmes have met certain standards of quality. They can also be confident that graduates of approved programmes are appropriately trained healthcare workers for their citizens. © 2014 International Council of Nurses.
JPSS-1 Data and the EOSDIS System: It's seamless
NASA Astrophysics Data System (ADS)
Hall, A.; Behnke, J.; Ho, E.
2017-12-01
The continuity of climate and environmental data is the key to the NASA Earth science program to develop a scientific understanding of Earth's system and its response to changes. NASA has made a long-term investment in processing, archiving, and distributing Earth science data through the Earth Observing System (EOS) Data and Information System (EOSDIS). The use of the EOSDIS infrastructure and services provides seamless integration of Suomi National Polar-orbiting Partnership (SNPP) and future Joint Polar Satellite System (JPSS-1) products, as it does for the entire NASA Earth science data collection. This continuity of measurements from all the missions is supported by the use of common data structures and standards in the generation of products and the subsequent services, tools, and access to those products. Similar to the EOS missions, five Science Investigator-led Processing Systems (SIPS) were established for SNPP: Land, Ocean, Atmosphere, Ozone, and Sounder. Along with NASA's Clouds and the Earth's Radiant Energy System and Ozone Mapper/Profiler Suite Limb systems, these SIPS now produce the NASA SNPP standard Level 1, Level 2, and Level 3 products developed by the NASA science teams.
Modular integration of electronics and microfluidic systems using flexible printed circuit boards.
Wu, Amy; Wang, Lisen; Jensen, Erik; Mathies, Richard; Boser, Bernhard
2010-02-21
Microfluidic systems offer an attractive alternative to conventional wet chemical methods with benefits including reduced sample and reagent volumes, shorter reaction times, high-throughput, automation, and low cost. However, most present microfluidic systems rely on external means to analyze reaction products. This substantially adds to the size, complexity, and cost of the overall system. Electronic detection based on sub-millimetre size integrated circuits (ICs) has been demonstrated for a wide range of targets including nucleic and amino acids, but deployment of this technology to date has been limited due to the lack of a flexible process to integrate these chips within microfluidic devices. This paper presents a modular and inexpensive process to integrate ICs with microfluidic systems based on standard printed circuit board (PCB) technology to assemble the independently designed microfluidic and electronic components. The integrated system can accommodate multiple chips of different sizes bonded to glass or PDMS microfluidic systems. Since IC chips and flex PCB manufacturing and assembly are industry standards with low cost, the integrated system is economical for both laboratory and point-of-care settings.
40 CFR 63.1007 - Pumps in light liquid service standards.
Code of Federal Regulations, 2011 CFR
2011-07-01
... sensor that indicates failure of the seal system, the barrier fluid system, or both. The owner or... reservoir that is routed to a process or fuel gas system or connected by a closed vent system to a control... liquid service. (iv) Each barrier fluid system is equipped with a sensor that will detect failure of the...
40 CFR 63.1007 - Pumps in light liquid service standards.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sensor that indicates failure of the seal system, the barrier fluid system, or both. The owner or... reservoir that is routed to a process or fuel gas system or connected by a closed vent system to a control... liquid service. (iv) Each barrier fluid system is equipped with a sensor that will detect failure of the...
Parallel-Processing Test Bed For Simulation Software
NASA Technical Reports Server (NTRS)
Blech, Richard; Cole, Gary; Townsend, Scott
1996-01-01
Second-generation Hypercluster computing system is multiprocessor test bed for research on parallel algorithms for simulation in fluid dynamics, electromagnetics, chemistry, and other fields with large computational requirements but relatively low input/output requirements. Built from standard, off-the-shelf hardware readily upgraded as improved technology becomes available. System used for experiments with such parallel-processing concepts as message-passing algorithms, debugging software tools, and computational steering. First-generation Hypercluster system described in "Hypercluster Parallel Processor" (LEW-15283).
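A message-passing experiment of the kind mentioned can be mimicked with ordinary Python multiprocessing: workers compute partial results and pass them back over queues. This is a generic illustration, not Hypercluster software, and the toy kernel is invented.

```python
# Generic message-passing sketch: worker processes compute partial results
# and return them to a coordinator over a queue.
from multiprocessing import Process, Queue

def worker(rank, chunk, results):
    results.put((rank, sum(x * x for x in chunk)))  # toy compute kernel

if __name__ == "__main__":
    data = list(range(1000))
    results = Queue()
    chunks = [data[i::4] for i in range(4)]         # split work across 4 ranks
    procs = [Process(target=worker, args=(r, c, results))
             for r, c in enumerate(chunks)]
    for p in procs:
        p.start()
    total = sum(results.get()[1] for _ in procs)    # collect before joining
    for p in procs:
        p.join()
    print("sum of squares:", total)
```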
Aerobic Digestion. Biological Treatment Process Control. Instructor's Guide.
ERIC Educational Resources Information Center
Klopping, Paul H.
This unit on aerobic sludge digestion covers the theory of the process, system components, factors that affect the process performance, standard operational concerns, indicators of steady-state operations, and operational problems. The instructor's guide includes: (1) an overview of the unit; (2) lesson plan; (3) lecture outline (keyed to a set of…
40 CFR 63.11395 - What are the standards and compliance requirements for existing sources?
Code of Federal Regulations, 2013 CFR
2013-07-01
... routine and long-term maintenance) and continuous monitoring system. (4) A list of operating parameters... polymerization process equipment and monomer recovery process equipment and convey the collected gas stream.... (2) 0.05 lb/hr of AN from the control device for monomer recovery process equipment. (3) If you do...
40 CFR 63.11395 - What are the standards and compliance requirements for existing sources?
Code of Federal Regulations, 2012 CFR
2012-07-01
... routine and long-term maintenance) and continuous monitoring system. (4) A list of operating parameters... polymerization process equipment and monomer recovery process equipment and convey the collected gas stream.... (2) 0.05 lb/hr of AN from the control device for monomer recovery process equipment. (3) If you do...
40 CFR 63.11395 - What are the standards and compliance requirements for existing sources?
Code of Federal Regulations, 2014 CFR
2014-07-01
... routine and long-term maintenance) and continuous monitoring system. (4) A list of operating parameters... polymerization process equipment and monomer recovery process equipment and convey the collected gas stream.... (2) 0.05 lb/hr of AN from the control device for monomer recovery process equipment. (3) If you do...
Optical radiation hazards of laser welding processes. Part 1: Neodymium-YAG laser.
Rockwell, R J; Moss, C E
1983-08-01
High power laser devices are being used for numerous metalworking processes such as welding, cutting, and heat treating. Such laser devices are totally enclosed either by the manufacturer or the end-user. When this is done, the total laser system is usually certified by the manufacturer following the federal requirements of the Code of Federal Regulations (CFR) 1040.10 and 1040.11 as a Class I laser system. Similarly, the end-user may also reclassify an enclosed high-power laser into the Class I category following the requirements of the American National Standards Institute (ANSI) Z-136.1 (1980) standard. There are, however, numerous industrial laser applications where Class IV systems are required to be used in an unenclosed manner. In such applications, there is concern for both ocular and skin hazards caused by direct and scattered laser radiation, as well as potential hazards caused by the optical radiation created by the laser beam's interaction with the metal (i.e., the plume radiation). Radiant energy measurements are reported for both the scattered laser radiation and the resultant plume radiation produced during typical unenclosed Class IV Neodymium-YAG laser welding processes. Evaluation of the plume radiation was done with both radiometric and spectroradiometric measurement equipment. The data obtained were compared to applicable safety standards.
40 CFR 63.166 - Standards: Sampling connection systems.
Code of Federal Regulations, 2012 CFR
2012-07-01
... fluid to a process; or (3) Be designed and operated to capture and transport the purged process fluid to a control device that complies with the requirements of § 63.172 of this subpart; or (4) Collect... of subpart G of this part applicable to group 1 wastewater streams. If the purged process fluid does...
40 CFR 63.166 - Standards: Sampling connection systems.
Code of Federal Regulations, 2014 CFR
2014-07-01
... fluid to a process; or (3) Be designed and operated to capture and transport the purged process fluid to a control device that complies with the requirements of § 63.172 of this subpart; or (4) Collect... of subpart G of this part applicable to group 1 wastewater streams. If the purged process fluid does...
ERIC Educational Resources Information Center
Critchosin, Heather
2014-01-01
Executing Quality describes the perceived process experienced by participants while engaging in Keystone Standards, Training, Assistance, Resources, and Support (Keystone STARS) quality rating improvement system (QRIS). The purpose of this qualitative inquiry was to understand the process of Keystone STARS engagement in order to generate a…
40 CFR Table 2 to Subpart Mmm of... - Standards for New and Existing PAI Sources
Code of Federal Regulations, 2012 CFR
2012-07-01
... a HAP Particulate matter concentration not to exceed 0.01 gr/dscf. Heat exchange systems Each heat exchange system used to cool process equipment in PAI manufacturing operations Monitoring and leak repair...
40 CFR Table 2 to Subpart Mmm of... - Standards for New and Existing PAI Sources
Code of Federal Regulations, 2011 CFR
2011-07-01
... a HAP Particulate matter concentration not to exceed 0.01 gr/dscf. Heat exchange systems Each heat exchange system used to cool process equipment in PAI manufacturing operations Monitoring and leak repair...
CERT Resilience Management Model, Version 1.0
2010-05-01
practice such as ISO 27000, COBIT, or ITIL. If you are a member of an established process improvement community, particularly one centered on CMMI... Systems Audit and Control Association; ISO, International Organization for Standardization; ISSA, Information Systems Security Association; IT
DOT National Transportation Integrated Search
2006-01-01
This pamphlet gives a brief introduction to the National Intelligent Transportation Systems (ITS) architecture and regional ITS architectures. It gives an overview of architecture, project, and standards requirements, and describes the availability o...
Center/TRACON Automation System: Development and Evaluation in the Field
DOT National Transportation Integrated Search
1993-10-01
Technological advances are changing the way that advanced air traffic control automation should be developed and assessed. Current standards and practices of system development place field testing at the end of the development process. While su...
NASA Astrophysics Data System (ADS)
Yin, Bo; Liu, Li; Wang, Jiahan; Li, Xiran; Liu, Zhenbo; Li, Dewei; Wang, Jun; Liu, Lu; Wu, Jun; Xu, Tingting; Cui, He
2017-10-01
Electric energy measurement is fundamental work: accurate measurement is vital to the economic interests of both parties to a power supply contract, and standardized management of metering laboratories at all levels directly affects the fairness of measurement. Currently, metering laboratories generally use one-dimensional bar codes as the recognition object and advance the testing process through manual management, so most test data require human input to generate reports. This process carries many problems and potential risks: data cannot be saved completely, the status of an inspection cannot be traced, the inspection process is not fully controllable, and so on. To meet a provincial metrology center's requirements for whole-process management of performance tests of power-measuring appliances, we used large-capacity RF tags as the process-management information medium and developed a general measurement experiment management system. We formulated a standardized full performance test process, improved raw-data recording during experiments, developed an automatic warehouse inventory device, and established a strict sample transfer and storage system. These measures ensure that all raw inspection data can be traced, achieve full life-cycle control of each sample, and significantly improve the quality control level and the effectiveness of inspection work.
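The traceability scheme described above is essentially a logged state machine keyed by the RFID tag. The sketch below illustrates that idea with invented state names and operators; it is not the authors' system.

```python
# Sketch of sample traceability: each RFID-tagged sample moves through a
# fixed sequence of inspection states, and every transition is logged so the
# full test history can be traced. State names are invented.
STATES = ["received", "stored", "under_test", "report_generated", "returned"]

class Sample:
    def __init__(self, tag_id):
        self.tag_id, self.state_idx, self.history = tag_id, 0, []

    def advance(self, operator):
        if self.state_idx + 1 >= len(STATES):
            raise RuntimeError("sample already returned")
        self.state_idx += 1
        self.history.append((STATES[self.state_idx], operator))

meter = Sample("RFID-000123")
for op in ["warehouse", "lab-2", "lab-2", "dispatch"]:
    meter.advance(op)
print(meter.tag_id, "trace:", meter.history)
```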
Using ISO 25040 standard for evaluating electronic health record systems.
Oliveira, Marília; Novaes, Magdala; Vasconcelos, Alexandre
2013-01-01
Quality of electronic health record systems (EHR-S) is one of the key points in the discussion about the safe use of this kind of system. It stimulates the creation of technical standards and certifications in order to establish the minimum requirements expected for these systems. [1] On the other hand, EHR-S suppliers need to invest in evaluating their products to ensure that the systems meet these requirements. This work proposes using the ISO 25040 standard, which focuses on the evaluation of software products, to define an evaluation model for EHR-S in relation to the Brazilian Certification for Electronic Health Record Systems (SBIS-CFM Certification). The proposal instantiates the process described in the ISO 25040 standard using the set of requirements that is the scope of the Brazilian certification. As first results, this research has produced an evaluation model and a scale for classifying an EHR-S by its level of compliance with the certification. This work in progress is part of the requirements for a master's degree in Computer Science at the Federal University of Pernambuco.
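A compliance scale of the kind the authors describe can be sketched as a checklist score mapped to levels. The thresholds, level names, and requirement names below are illustrative assumptions, not the SBIS-CFM or ISO 25040 definitions.

```python
# Hedged sketch of a compliance scale: score an EHR-S against a requirement
# checklist and map the ratio to a level. Thresholds and names are invented.
def compliance_level(results: dict) -> str:
    ratio = sum(results.values()) / len(results)
    if ratio == 1.0:
        return "fully compliant"
    if ratio >= 0.75:
        return "largely compliant"
    if ratio >= 0.5:
        return "partially compliant"
    return "non-compliant"

requirements = {"access_control": True, "audit_trail": True,
                "digital_signature": False, "data_export": True}
print(compliance_level(requirements))  # -> largely compliant (3/4 = 0.75)
```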
NASA Technical Reports Server (NTRS)
Barber, Peter W.; Demerdash, Nabeel A. O.; Wang, R.; Hurysz, B.; Luo, Z.
1991-01-01
The goal is to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) develop analytical tools (models and computer programs); (2) conduct parameterization studies; (3) predict the global space station EMI environment; and (4) provide a basis for modification of EMI standards.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2008-07-01
This case study describes how the Kaiser Aluminum plant in Sherman, Texas, achieved annual savings of $360,000 and 45,000 MMBtu, and improved furnace energy intensity by 11.1% after receiving a DOE Save Energy Now energy assessment and implementing recommendations to improve the efficiency of its process heating system.
Research on Secure Systems and Automatic Programming. Volume I
1977-10-14
for the enforcement of adherence to authorization; they include physical limitations, legal codes, social pressures, and the psychological makeup of... job statistics and possibly indications of abnormal termination... support instructions; the criteria for their inclusion were high execution... interrupt processes, however, use the standard SWITCH PROCESS instruction for the output data page. Jobs may also terminate abnormally by executing an...
Standard Generalized Markup Language for self-defining structured reports.
Kahn, C E
1999-01-01
Structured reporting is the process of using standardized data elements and predetermined data-entry formats to record observations. The Standard Generalized Markup Language (SGML; International Standards Organization (ISO) 8879:1986)--an open, internationally accepted standard for document interchange--was used to encode medical observations acquired in an Internet-based structured reporting system. The resulting report is self-documenting: it includes a definition of its allowable data fields and values encoded as a report-specific SGML document type definition (DTD). The data-entry forms, DTD, and report document instances are based on report specifications written in a simple, SGML-based language designed for that purpose. Reporting concepts can be linked with those of external vocabularies such as the Unified Medical Language System (UMLS) Metathesaurus. The use of open standards such as SGML is an important step in the creation of open, universally comprehensible structured reports.
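The self-defining aspect (a report that carries its own DTD) can be sketched by generating a report-specific DTD from a field specification. The element names, field list, and allowed values below are invented, and real SGML DTDs are considerably richer than this fragment.

```python
# Sketch of the "self-defining report" idea: derive a report-specific DTD
# from a field specification so the document carries its own definition.
# Element names and allowed values are invented examples.
fields = {"finding": ["mass", "calcification", "normal"],
          "laterality": ["left", "right"]}

def make_dtd(root, fields):
    lines = [f"<!ELEMENT {root} ({' , '.join(fields)})>"]
    for name, values in fields.items():
        lines.append(f"<!ELEMENT {name} (#PCDATA)>")
        lines.append(f"<!ATTLIST {name} value ({' | '.join(values)}) #REQUIRED>")
    return "\n".join(lines)

print(make_dtd("mammogram", fields))
```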
Dynamical Evolution of Planetary Embryos
NASA Technical Reports Server (NTRS)
Wetherill, George W.
2002-01-01
During the past decade, progress has been made by relating the 'standard model' for the formation of planetary systems to computational and observational advances. A significant contribution to this has been provided by this grant. The consequence of this is that the rigor of the physical modeling has improved considerably. This has identified discrepancies between the predictions of the standard model and recent observations of extrasolar planets. In some cases, the discrepancies can be resolved by recognition of the stochastic nature of the planetary formation process, leading to variations in the final state of a planetary system. In other cases, it seems more likely that there are major deficiencies in the standard model, requiring our identifying variations to the model that are not so strongly constrained to our Solar System.
Innovating the Standard Procurement System Through Electronic Commerce Technologies
1999-12-01
commerce are emerging almost daily as businesses continue to realize the overwhelming ability of agent applications to reduce costs and improve... processed using the SPS. The result may reduce cycle time, assist contracting professionals, improve the acquisition process, save money and aid... of innovation processes, and it offers enormous potential for helping organizations achieve major improvements in terms of process cost, time
48 CFR 30.604 - Processing changes to disclosed or established cost accounting practices.
Code of Federal Regulations, 2010 CFR
2010-10-01
... disclosed or established cost accounting practices. 30.604 Section 30.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS COST ACCOUNTING STANDARDS ADMINISTRATION CAS Administration 30.604 Processing changes to disclosed or established cost accounting practices...
48 CFR 30.604 - Processing changes to disclosed or established cost accounting practices.
Code of Federal Regulations, 2011 CFR
2011-10-01
... disclosed or established cost accounting practices. 30.604 Section 30.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS COST ACCOUNTING STANDARDS ADMINISTRATION CAS Administration 30.604 Processing changes to disclosed or established cost accounting practices...
Integrated Multi-process Microfluidic Systems for Automating Analysis
Yang, Weichun; Woolley, Adam T.
2010-01-01
Microfluidic technologies have been applied extensively in rapid sample analysis. Some current challenges for standard microfluidic systems are relatively high detection limits, and reduced resolving power and peak capacity compared to conventional approaches. The integration of multiple functions and components onto a single platform can overcome these separation and detection limitations of microfluidics. Multiplexed systems can greatly increase peak capacity in multidimensional separations and can increase sample throughput by analyzing many samples simultaneously. On-chip sample preparation, including labeling, preconcentration, cleanup and amplification, can all serve to speed up and automate processes in integrated microfluidic systems. This paper summarizes advances in integrated multi-process microfluidic systems for automated analysis, their benefits and areas for needed improvement. PMID:20514343
Open source cardiology electronic health record development for DIGICARDIAC implementation
NASA Astrophysics Data System (ADS)
Dugarte, Nelson; Medina, Rubén.; Huiracocha, Lourdes; Rojas, Rubén.
2015-12-01
This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level 7 (HL7) international standards. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools, and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools support HRECG signal analysis in search of patterns indicative of cardiovascular pathologies. Incorporating telecardiology tools allows the system to communicate with other health care centers, decreasing the time needed to access patient information. The CEHR system was developed entirely with open-source software. Preliminary results of process validation demonstrated the system's efficiency.
Eftimov, Tome; Korošec, Peter; Koroušić Seljak, Barbara
2017-01-01
The European Food Safety Authority has developed a standardized food classification and description system called FoodEx2. It uses facets to describe food properties and aspects from various perspectives, making it easier to compare food consumption data from different sources and perform more detailed data analyses. However, both food composition data and food consumption data, which need to be linked, are lacking in FoodEx2 because the process of classification and description has to be manually performed—a process that is laborious and requires good knowledge of the system and also good knowledge of food (composition, processing, marketing, etc.). In this paper, we introduce a semi-automatic system for classifying and describing foods according to FoodEx2, which consists of three parts. The first involves a machine learning approach and classifies foods into four FoodEx2 categories, with two for single foods: raw (r) and derivatives (d), and two for composite foods: simple (s) and aggregated (c). The second uses a natural language processing approach and probability theory to describe foods. The third combines the result from the first and the second part by defining post-processing rules in order to improve the result for the classification part. We tested the system using a set of food items (from Slovenia) manually-coded according to FoodEx2. The new semi-automatic system obtained an accuracy of 89% for the classification part and 79% for the description part, or an overall result of 79% for the whole system. PMID:28587103
Eftimov, Tome; Korošec, Peter; Koroušić Seljak, Barbara
2017-05-26
The European Food Safety Authority has developed a standardized food classification and description system called FoodEx2. It uses facets to describe food properties and aspects from various perspectives, making it easier to compare food consumption data from different sources and perform more detailed data analyses. However, both food composition data and food consumption data, which need to be linked, are lacking in FoodEx2 because the process of classification and description has to be manually performed, a process that is laborious and requires good knowledge of the system and also good knowledge of food (composition, processing, marketing, etc.). In this paper, we introduce a semi-automatic system for classifying and describing foods according to FoodEx2, which consists of three parts. The first involves a machine learning approach and classifies foods into four FoodEx2 categories, with two for single foods: raw (r) and derivatives (d), and two for composite foods: simple (s) and aggregated (c). The second uses a natural language processing approach and probability theory to describe foods. The third combines the result from the first and the second part by defining post-processing rules in order to improve the result for the classification part. We tested the system using a set of food items (from Slovenia) manually coded according to FoodEx2. The new semi-automatic system obtained an accuracy of 89% for the classification part and 79% for the description part, or an overall result of 79% for the whole system.
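The classification part alone can be sketched with a standard text classifier, under the assumption that a bag-of-words model over food names is a reasonable stand-in for the authors' machine-learning step. The training items below are invented, and the label set follows the four FoodEx2 categories named above (r, d, s, c).

```python
# Sketch of the classification step only: a TF-IDF + logistic regression
# pipeline over food names. Training examples are invented; this is not the
# authors' model or feature set.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_items = ["raw apple", "wheat flour", "vegetable soup", "frozen mixed meal",
               "fresh carrot", "olive oil", "chicken salad", "lasagna ready meal"]
train_labels = ["r", "d", "s", "c", "r", "d", "s", "c"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(train_items, train_labels)

queries = ["boiled potato", "fruit yogurt"]
print(dict(zip(queries, clf.predict(queries))))
```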
Procedures and Standards for Residential Ventilation System Commissioning: An Annotated Bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stratton, J. Chris; Wray, Craig P.
2013-04-01
Beginning with the 2008 version of Title 24, new homes in California must comply with ANSI/ASHRAE Standard 62.2-2007 requirements for residential ventilation. Where installed, the limited data available indicate that mechanical ventilation systems do not always perform optimally or even as many codes and forecasts predict. Commissioning such systems when they are installed or during subsequent building retrofits is a step towards eliminating deficiencies and optimizing the tradeoff between energy use and acceptable IAQ. Work funded by the California Energy Commission about a decade ago at Berkeley Lab documented procedures for residential commissioning, but did not focus on ventilation systems. Since then, standards and approaches for commissioning ventilation systems have been an active area of work in Europe. This report describes our efforts to collect new literature on commissioning procedures and to identify information that can be used to support the future development of residential-ventilation-specific procedures and standards. We recommend that a standardized commissioning process and a commissioning guide for practitioners be developed, along with a combined energy and IAQ benefit assessment standard and tool, and a diagnostic guide for estimating continuous pollutant emission rates of concern in residences (including a database that lists emission test data for commercially-available labeled products).
ERIC Educational Resources Information Center
McNeal, Larry; Christy, W. Keith
This brief paper is a presentation that preceeded another case of considering the ongoing dialogue on the advantages and disadvantages of centralized and decentralized school-improvement processes. It attempts to raise a number of questions about the relationship between state-designed standards and accountability initiatives and change and…
75 FR 14386 - Interpretation of Transmission Planning Reliability Standard
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-25
... created electronically using word processing software should be filed in native applications or print-to-PDF format... FERC, 564 F.3d 1342 (DC Cir. 2009). Mandatory Reliability Standards for the Bulk-Power System... print-to-PDF format and not in a scanned format. Commenters filing electronically do not need to make a...
Quality assurance, training, and certification in ozone air pollution studies
Susan Schilling; Paul Miller; Brent Takemoto
1996-01-01
Uniform, or standard, measurement methods are critical to projects monitoring change in forest systems. Standardized methods, with known or estimable errors, contribute greatly to the confidence associated with decisions made on the basis of field data collections (Zedaker and Nicholas 1990). Quality assurance (QA) for the measurement process includes operations and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-09
... Standards (UPCS) inspection protocol was designed to be a uniform inspection process and standard for HUD's... frequency of inspections based on the results of the UPCS inspection. UPCS was designed to assess the condition... physical assessment score. HUD Response: The UPCS inspection protocol as designed assesses the physical...
On Systems Thinking and Ways of Building It in Learning
ERIC Educational Resources Information Center
Abdyrova, Aitzhan; Galiyev, Temir; Yessekeshova, Maral; Aldabergenova, Saule; Alshynbayeva, Zhuldyz
2016-01-01
The article focuses on the issue of shaping learners' systems thinking skills in the context of traditional education using specially elaborated system methods that are implemented based on the standard textbook. Applying these methods naturally complements the existing learning process and contributes to an efficient development of learners'…
40 CFR 63.1090 - What reports must I submit?
Code of Federal Regulations, 2013 CFR
2013-07-01
...) National Emission Standards for Ethylene Manufacturing Process Units: Heat Exchange Systems and Waste Operations Recordkeeping and Reporting Requirements for Heat Exchange Systems § 63.1090 What reports must I submit? If you delay repair for your heat exchange system, you must report the delay of repair in the...
A Manual Transactional System for the IR Office.
ERIC Educational Resources Information Center
Black, Frank S.
The development and operation of a transactional system (defined as a routine method for carrying out certain processes) without computer support at Texas Southern University's Office of Institutional Research is described. The system was developed in 1975 primarily to improve the internal management of the office. It is a standard operating…
Data Processing for NASA's TDRSS DAMA Channel
NASA Technical Reports Server (NTRS)
Long, Christopher C.; Horan, Stephen
1996-01-01
A concept for the addition of a Demand Assignment Multiple Access (DAMA) service to NASA's current Space Network (SN) is developed. Specifically, the design of a receiver for the DAMA channel is outlined, along with the procedures taken to process the received service requests. The modifications to the SN system are minimal. The post-reception processing is accomplished using standard commercial off-the-shelf (COTS) packages. The result is a random access system capable of receiving requests for service.
Lunar Regolith Simulant Materials: Recommendations for Standardization, Production, and Usage
NASA Technical Reports Server (NTRS)
Sibille, L.; Carpenter, P.; Schlagheck, R.; French, R. A.
2006-01-01
Experience gained during the Apollo program demonstrated the need for extensive testing of surface systems in relevant environments, including regolith materials similar to those encountered on the lunar surface. As NASA embarks on a return to the Moon, it is clear that the current lunar sample inventory is not only insufficient to support lunar surface technology and system development, but its scientific value is too great to be consumed by destructive studies. Every effort must be made to utilize standard simulant materials, which will allow developers to reduce the cost, development, and operational risks to surface systems. The Lunar Regolith Simulant Materials Workshop held in Huntsville, AL, on January 24-26, 2005, identified the need for widely accepted standard reference lunar simulant materials to perform research and development of technologies required for lunar operations. The workshop also established a need for a common, traceable, and repeatable process regarding the standardization, characterization, and distribution of lunar simulants. This document presents recommendations for the standardization, production, and usage of lunar regolith simulant materials.
High Available COTS Based Computer for Space
NASA Astrophysics Data System (ADS)
Hartmann, J.; Magistrati, Giorgio
2015-09-01
The availability and reliability factors of a system are central requirements of a target application. From a simple fuel injection system used in cars up to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under the target application's boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures that fulfill the availability and reliability demands as well as the increase in required data processing power. In contrast to these increased quality demands, simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards, are clear customer needs. Data processing system design is mostly dominated by strict fulfillment of customer requirements, and reuse of available computer systems has not always been possible because of the obsolescence of EEE parts, insufficient I/O capabilities, or the fact that available data processing systems did not provide the required scalability and performance.
Software Certification - Coding, Code, and Coders
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Holzmann, Gerard J.
2011-01-01
We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.
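As a toy illustration of the "code is mechanically checked against the standard" step, the sketch below scans C source files for calls that flight-software rule sets in the spirit of Holzmann's "Power of Ten" typically restrict (e.g., dynamic memory allocation after initialization, setjmp/longjmp). The rule list and script are illustrative assumptions, not JPL's actual tooling; certification in practice relies on industrial static source code analyzers rather than a regex pass.

    import re
    import sys

    # Hypothetical subset of banned constructs; a real flight-software
    # coding standard is far richer and checked with static analyzers.
    BANNED = ("malloc", "calloc", "realloc", "free", "setjmp", "longjmp")
    PATTERN = re.compile(r"\b(" + "|".join(BANNED) + r")\s*\(")

    def check(path):
        """Return (line number, banned call) pairs found in one C source file."""
        hits = []
        with open(path) as src:
            for lineno, line in enumerate(src, start=1):
                for match in PATTERN.finditer(line):
                    hits.append((lineno, match.group(1)))
        return hits

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            for lineno, call in check(path):
                print(f"{path}:{lineno}: banned call '{call}'")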
A data types profile suitable for use with ISO EN 13606.
Sun, Shanghua; Austin, Tony; Kalra, Dipak
2012-12-01
ISO EN 13606 is a five part International Standard specifying how Electronic Healthcare Record (EHR) information should be communicated between different EHR systems and repositories. Part 1 of the standard defines an information model for representing the EHR information itself, including the representation of types of data value. A later International Standard, ISO 21090:2010, defines a comprehensive set of models for data types needed by all health IT systems. This latter standard is vast, and duplicates some of the functions already handled by ISO EN 13606 part 1. A profile (sub-set) of ISO 21090 would therefore be expected to provide EHR system vendors with a more specially tailored set of data types to implement and avoid the risk of providing more than one modelling option for representing the information properties. This paper describes the process and design decisions made for developing a data types profile for EHR interoperability.
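To make the idea of a data types profile concrete, here is a minimal, hypothetical sketch of two widely cited ISO 21090 data types (PQ, a physical quantity, and CD, a coded concept) reduced to a handful of attributes, as a profile for EHR communication might do. The attribute sub-set and the example code value are assumptions for illustration; the paper's actual profile may keep a different sub-set.

    from dataclasses import dataclass

    @dataclass
    class CD:
        code: str              # concept code ("0000-0" below is a placeholder)
        codeSystem: str        # OID of the source terminology
        displayName: str = ""  # optional human-readable label

    @dataclass
    class PQ:
        value: float           # numeric magnitude
        unit: str              # UCUM unit string, e.g. "mmol/L"

    # Usage: a coded observation paired with a quantitative value.
    finding = CD(code="0000-0", codeSystem="2.16.840.1.113883.6.1",
                 displayName="hypothetical glucose concept")
    result = PQ(value=5.4, unit="mmol/L")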
Applying Use Cases to Describe the Role of Standards in e-Health Information Systems
NASA Astrophysics Data System (ADS)
Chávez, Emma; Finnie, Gavin; Krishnan, Padmanabhan
Individual health records (IHRs) contain a person's lifetime records of their key health history and care within a health system (National E-Health Transition Authority, Retrieved Jan 12, 2009 from http://www.nehta.gov.au/coordinated-care/whats-in-iehr, 2004). This information can be processed and stored in different ways. The record should be available electronically to authorized health care providers and the individual anywhere, anytime, to support high-quality care. Many organizations provide a diversity of solutions for e-health and its services. Standards play an important role in enabling these organizations to support information interchange and improve the efficiency of health care delivery. However, there are numerous standards to choose from and not all of them are accessible to the software developer. This chapter proposes a framework to describe the e-health standards that can be used by software engineers to implement e-health information systems.
Healthcare software assurance.
Cooper, Jason G; Pauley, Keith A
2006-01-01
Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA's software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance are discussed, and the necessity for enhancements to the current processes is highlighted. PMID:17238324
The six critical attributes of the next generation of quality management software systems.
Clark, Kathleen
2011-07-01
Driven both by the need to meet regulatory requirements and by a genuine desire to drive improved quality, quality management systems encompassing standard operating procedures, corrective and preventive actions, and related processes have existed for many years, in both paper and electronic form. The impact of quality management systems on 'actual' quality, however, is often reported to be far less than desired. A quality management software system that moves beyond formal forms-driven processes to include a true closed-loop design, manage disparate processes across the enterprise, provide support for collaborative processes, and deliver insight into the overall state of control has the potential to close the gap between simply accomplishing regulatory compliance and delivering measurable improvements in quality and efficiency.
48 CFR 239.7201 - Solicitation requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Information Processing Standards are incorporated into solicitations. [71 FR 39011, July 11, 2006] ... SYSTEM, DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY...
Implementation of standardization in clinical practice: not always an easy task.
Panteghini, Mauro
2012-02-29
As soon as a new reference measurement system is adopted, clinical validation of correctly calibrated commercial methods should take place. Tracing back the calibration of routine assays to a reference system can actually modify the relation of analyte results to existing reference intervals and decision limits, and this may invalidate some of the clinical decision-making criteria currently used. To maintain the accumulated clinical experience, the quantitative relationship to the previous calibration system should be established and, if necessary, the clinical decision-making criteria should be adjusted accordingly. The implementation of standardization should take place in a concerted action of laboratorians, manufacturers, external quality assessment scheme organizers and clinicians. Dedicated meetings with manufacturers should be organized to discuss the process of assay recalibration, and studies should be performed to obtain convincing evidence that the standardization works, improving result comparability. Another important issue relates to the surveillance of the performance of standardized assays through the organization of appropriate analytical internal and external quality controls. Last but not least, measurement uncertainty that is fit for purpose must be defined across the entire traceability chain, starting with the available reference materials, extending through the manufacturers and their processes for assignment of calibrator values, and ultimately to the final result reported to clinicians by laboratories.
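A minimal sketch of the recalibration step described above, assuming a hypothetical method-comparison panel: the quantitative relationship between the old and the re-standardized calibration is estimated by regression and used to transfer an established decision limit onto the new scale. Ordinary least squares is used here only for brevity; Deming or Passing-Bablok regression is the usual choice because both methods carry measurement error.

    import numpy as np

    # Hypothetical panel: the same specimens measured with the previously
    # calibrated assay (old) and the re-standardized assay (new).
    old = np.array([2.1, 3.4, 5.0, 6.8, 8.9, 11.2, 14.5])
    new = np.array([2.5, 3.9, 5.7, 7.6, 9.9, 12.4, 15.9])

    # Quantitative relationship between the two calibrations.
    slope, intercept = np.polyfit(old, new, 1)

    # Transfer an established clinical decision limit onto the new scale
    # so accumulated clinical experience is preserved.
    old_cutoff = 6.0
    print(f"adjusted decision limit: {slope * old_cutoff + intercept:.2f}")
    # -> about 6.74 with these illustrative numbers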
Use of artificial intelligence in analytical systems for the clinical laboratory
Truchaud, Alain; Ozawa, Kyoichi; Pardue, Harry; Schnipelsky, Paul
1995-01-01
The incorporation of information-processing technology into analytical systems in the form of standard computing software has recently been advanced by the introduction of artificial intelligence (AI), both as expert systems and as neural networks. This paper considers the role of software in system operation, control and automation, and attempts to define intelligence. AI is characterized by its ability to deal with incomplete and imprecise information and to accumulate knowledge. Expert systems, building on standard computing techniques, depend heavily on the domain experts and knowledge engineers who have programmed them to represent the real world. Neural networks are intended to emulate the pattern-recognition and parallel processing capabilities of the human brain and are taught rather than programmed. The future may lie in a combination of the recognition ability of the neural network and the rationalization capability of the expert system. In the second part of the paper, examples are given of applications of AI in stand-alone systems for knowledge engineering and medical diagnosis and in embedded systems for failure detection, image analysis, user interfacing, natural language processing, robotics and machine learning, as related to clinical laboratories. It is concluded that AI constitutes a collective form of intellectual property, and that there is a need for better documentation, evaluation and regulation of the systems already being used in clinical laboratories. PMID:18924784
Office Skills: Measuring Typewriting Output.
ERIC Educational Resources Information Center
Petersen, Lois E.; Kruk, Leonard B.
1978-01-01
The advent of word processing centers has provided typewriting teachers with an alternative measurement system that, instead of penalizing errors, grades students according to Usable Lines Produced (ULP). The ULP system is job-oriented and promotes realistic office standards in typewriting productivity. (MF)
Failure Analysis of Network Based Accessible Pedestrian Signals in Closed-Loop Operation
DOT National Transportation Integrated Search
2011-03-01
The potential failure modes of a network based accessible pedestrian system were analyzed to determine the limitations and benefits of closed-loop operation. The vulnerabilities of the system are assessed using the industry standard process known as ...
40 CFR Table 2 to Subpart Mmm of... - Standards for New and Existing PAI Sources
Code of Federal Regulations, 2014 CFR
2014-07-01
... feedstock that is a solid and a HAP Particulate matter concentration not to exceed 0.01 gr/dscf. Heat exchange systems Each heat exchange system used to cool process equipment in PAI manufacturing operations...
Proposed standards for peer-reviewed publication of computer code
USDA-ARS?s Scientific Manuscript database
Computer simulation models are mathematical abstractions of physical systems. In the area of natural resources and agriculture, these physical systems encompass selected interacting processes in plants, soils, animals, or watersheds. These models are scientific products and have become important i...
Blind guidance system based on laser triangulation
NASA Astrophysics Data System (ADS)
Wu, Jih-Huah; Wang, Jinner-Der; Fang, Wei; Lee, Yun-Parn; Shan, Yi-Chia; Kao, Hai-Ko; Ma, Shih-Hsin; Jiang, Joe-Air
2012-05-01
We propose a new guidance system for the blind that uses an optical triangulation method. The main components of the proposed system are a notebook computer, a camera, and two laser modules. The image of the light beam's track on the ground or on an object is captured by the camera and sent to the notebook computer for further processing and analysis. Using a purpose-developed signal-processing algorithm, the system determines the object width and the distance between the object and the blind person from the positions of the light lines in the image. A series of feasibility tests of the developed blind guidance system was conducted. The experimental results show that the distance between the test object and the blind person can be measured with a standard deviation of less than 8.5% within the range of 40 to 130 cm, while the object width can be measured with a standard deviation of less than 4.5% over the same range. The designed system shows clear application potential for blind guidance.
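A minimal sketch of the underlying triangulation geometry, under an assumed pinhole-camera model with the laser mounted parallel to the optical axis; the paper's actual calibration and two-laser arrangement may differ, and all numbers below are hypothetical.

    def distance_from_offset(pixel_offset, focal_length_px, baseline_m):
        """Range to the laser spot under a simple pinhole triangulation model:
        the laser is parallel to the optical axis at a fixed baseline, so
        distance = baseline * focal_length / pixel_offset."""
        if pixel_offset <= 0:
            raise ValueError("laser spot must be offset from the image centre")
        return baseline_m * focal_length_px / pixel_offset

    def width_from_extent(distance_m, pixel_extent, focal_length_px):
        """Object width from its pixel extent at a known range
        (small-angle approximation): width = distance * extent / focal_length."""
        return distance_m * pixel_extent / focal_length_px

    # Hypothetical numbers: 800 px focal length, 10 cm laser-camera baseline.
    d = distance_from_offset(pixel_offset=80, focal_length_px=800, baseline_m=0.10)
    w = width_from_extent(d, pixel_extent=160, focal_length_px=800)
    print(f"distance ~{d:.2f} m, width ~{w:.2f} m")   # -> ~1.00 m, ~0.20 m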
Shan, Bao-Qing; Li, Nan; Tang, Wen-Zhong
2012-11-01
An ecological drainage system (EDS) comprising ditches, ponds, and a wetland was constructed at Paifangchen village, north of Chaohu Lake, Anhui, and its pollutant retention performance was investigated. Combining the functions of sewage discharge, collection, and treatment, the system intercepted runoff pollutants effectively. Results from three rainfall events showed that the retention rates of the EDS for TSS, COD, TP, and TN were 78.2%, 57.8%, 55.5%, and 64.2%, respectively, and the outflow concentrations of TSS, COD, TP, and NH4(+)-N were 23.5, 66.3, 0.49, and 3.03 mg·L(-1), respectively, meeting the first level of the "Integrated Wastewater Discharge Standard". Ponds were an important unit of the EDS; their daily water quality concentrations of TSS, COD, TP, and TN were 28.0, 31.2, 0.47, and 4.65 mg·L(-1), respectively, essentially meeting Class V of the "Environmental Quality Standards for Surface Water".
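For clarity, event retention as reported here is conventionally computed from inflow and outflow pollutant loads (concentration times runoff volume); a one-line sketch with hypothetical loads follows.

    def retention_rate(load_in, load_out):
        """Event retention (%): the share of the inflow pollutant load
        that is not exported at the system outflow."""
        return 100.0 * (load_in - load_out) / load_in

    # Hypothetical TSS loads (kg) for one event reproducing the reported 78.2%.
    print(retention_rate(load_in=50.0, load_out=10.9))   # -> 78.2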
Payload transportation system study
NASA Technical Reports Server (NTRS)
1976-01-01
A standard size set of shuttle payload transportation equipment was defined that will substantially reduce the cost of payload transportation and accommodate a wide range of payloads with minimum impact on payload design. The system was designed to accommodate payload shipments between the level 4 payload integration sites and the launch site during the calendar years 1979-1982. In addition to defining transportation multi-use mission support equipment (T-MMSE), the mode of travel, prime movers, and ancillary equipment required in the transportation process were also considered. Consistent with the STS goals of low cost and the use of standardized interfaces, the transportation system was designed to commercial grade standards and uses the payload flight mounting interfaces for transportation. The technical, cost, and programmatic data required to permit selection of a baseline system of MMSE for intersite movement of shuttle payloads were developed.
American Pharmacists Association; Bough, Marcie
2011-01-01
To develop an improved risk evaluation and mitigation strategies (REMS) system for maximizing effective and safe patient medication use while minimizing burden on the health care delivery system. 34 stakeholders gathered October 6-7, 2010, in Arlington, VA, for the REMS Stakeholder Meeting, convened by the American Pharmacists Association (APhA). Participants included national health care provider associations, including representatives for physicians, physician assistants, nurses, nurse practitioners, and pharmacists, as well as representatives for patient advocates, drug distributors, community pharmacists (chain and independent), drug manufacturer associations (brand, generic, and biologic organizations), and health information technology, standards, and safety organizations. Staff from the Food and Drug Administration (FDA) Center for Drug Evaluation and Research participated as observers. The meeting built on themes from the APhA's 2009 REMS white paper. The current REMS environment presents many challenges for health care providers due to the growing number of REMS programs and the lack of standardization or similarities among various REMS programs. A standardized REMS process that focuses on maximizing patient safety and minimizing impacts on patient access and provider implementation could offset these challenges. A new process that includes effective provider interventions and standardized tools and systems for implementing REMS programs may improve patient care and overcome some of the communication issues providers and patients currently face. Metrics could be put in place to evaluate the effectiveness of REMS elements. By incorporating REMS program components into existing technologies and data infrastructures, achieving REMS implementation that is workflow neutral and minimizes administrative burden may be possible. An appropriate compensation model could ensure providers have adequate resources for patient care and REMS implementation. Overall, stakeholders should continue to work collaboratively with FDA and manufacturers to improve REMS program design and implementation issues. A workable REMS system will require effective patient interventions, standardized elements that limit barriers to implementation for both patients and providers, standardized yet flexible implementation strategies, use of existing technologies in practice settings, increased opportunities for provider input early in REMS design processes, improved communication strategies and awareness of program requirements, and viable provider compensation models needed to offset costs to implement and comply with REMS program requirements.
Enterprise and system of systems capability development life-cycle processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, David Franklin
2014-08-01
This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added by adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architecture Framework (DoDAF), it is suitable for use with other architectural description frameworks.
Repetto, Robert; Levy, Richard
2015-01-01
The application of single-use systems, or disposables, has increased dramatically in the past 10 years. Although some elements of the pharmaceutical and biotech manufacturing process were single-use, and therefore disposable and not reused, the majority of the process equipment and fluid path was cleaned and reused by end users. Today, much more of the manufacturing process is composed of single-use systems, and some biotech plants use single-use systems exclusively. This movement toward disposables has placed increased reliance on suppliers, each of which manufactures its products independently to meet customer needs; this has led to non-uniformity in design for connectors and similar sub-processes, and has created an urgent need for more formal standards specifically for single-use system technology. The objective of this PDA-sponsored workshop held on May 14, 2014 was twofold: (1) to promote a harmonized approach to supporting single-use system activities within the industry and in so doing to minimize duplication of efforts, and (2) to communicate ongoing single-use system initiatives among the group. Representatives of ASME, ASTM, BPOG, BPSA, ELSIE, PDA, PQRI, and USP, as well as representatives of CBER and CDER of FDA, attended. © PDA, Inc. 2015.
Nilsson, Lars B; Skansen, Patrik
2012-06-30
The investigations in this article were triggered by two observations in the laboratory: for some liquid chromatography/tandem mass spectrometry (LC/MS/MS) systems it was possible to obtain linear calibration curves over extreme concentration ranges, while for other systems seemingly linear calibration curves gave good accuracy at low concentrations only when a quadratic regression function was used. The absolute and relative responses were tested for three different LC/MS/MS systems by injecting solutions of a model compound and a stable isotope labeled internal standard. The analyte concentration range for the solutions was 0.00391 to 500 μM (128,000×), giving overload of the chromatographic column at the highest concentrations. The stable isotope labeled internal standard concentration was 0.667 μM in all samples. The absolute response per concentration unit decreased rapidly as higher concentrations were injected. The relative response, the ratio of the analyte peak area to the internal standard peak area, per concentration unit was calculated. For system 1, the ionization process was found to limit the response and the relative response per concentration unit was constant. For systems 2 and 3, the ion detection process was the limiting factor, resulting in decreasing relative response at increasing concentrations. For systems behaving like system 1, simple linear regression can be used for any concentration range, while for systems behaving like systems 2 and 3, non-linear regression is recommended for all concentration ranges. Another consequence is that ionization-capacity-limited systems will be insensitive to matrix ion suppression when an ideal internal standard is used, while detection-capacity-limited systems are at risk of giving erroneous results at high concentrations if the matrix ion suppression varies for different samples in a run. Copyright © 2012 John Wiley & Sons, Ltd.
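A small self-contained sketch of the regression comparison described above, using synthetic data whose response saturates mildly at high concentration (mimicking the detection-capacity-limited systems 2 and 3). The model coefficients and concentrations are illustrative assumptions, not the paper's data.

    import numpy as np

    # Synthetic calibration data: area ratio saturates slightly at high
    # concentration (ratio = 0.1*c - 0.0002*c**2, no added noise).
    conc  = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 100.0])   # uM
    ratio = 0.1 * conc - 2e-4 * conc**2   # analyte / internal-standard area ratio

    lin  = np.polyfit(conc, ratio, 1)     # linear calibration function
    quad = np.polyfit(conc, ratio, 2)     # quadratic calibration function

    def back_calc(coeffs, r):
        """Back-calculate concentration: smallest real root of f(c) = r."""
        p = np.array(coeffs, dtype=float)
        p[-1] -= r
        roots = np.roots(p)
        real = np.sort(roots[np.isreal(roots)].real)
        return real[0] if real.size else float("nan")

    for name, c in (("linear", lin), ("quadratic", quad)):
        est = back_calc(c, ratio[0])
        print(f"{name}: lowest standard back-calculates to {est:.2f} uM "
              f"(nominal 1.00 uM)")
    # The unweighted linear fit badly mis-reads the low standard (about
    # -0.69 uM here), while the quadratic recovers 1.00 uM, matching the
    # observation that such systems need non-linear regression.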
ICESat Science Investigator led Processing System (I-SIPS)
NASA Astrophysics Data System (ADS)
Bhardwaj, S.; Bay, J.; Brenner, A.; Dimarzio, J.; Hancock, D.; Sherman, M.
2003-12-01
The ICESat Science Investigator-led Processing System (I-SIPS) generates the GLAS standard data products. It consists of two main parts: the Scheduling and Data Management System (SDMS) and the Geoscience Laser Altimeter System (GLAS) Science Algorithm Software (GSAS). The system has been operational since the successful launch of ICESat. It ingests data from the GLAS instrument, generates GLAS data products, and distributes them to the GLAS Science Computing Facility (SCF), the Instrument Support Facility (ISF), and the National Snow and Ice Data Center (NSIDC) ECS DAAC. The SDMS is the planning, scheduling, and data management component that runs the GLAS Science Algorithm Software. GSAS is based on the Algorithm Theoretical Basis Documents provided by the Science Team and is developed independently of SDMS. The SDMS provides the processing environment to plan jobs based on existing data and to control job flow, data distribution, and archiving. The SDMS design is based on a mission-independent architecture that imposes few constraints on the science code, thereby facilitating I-SIPS integration. I-SIPS currently works in an autonomous manner to ingest GLAS instrument data, distribute these data to the ISF, run the science processing algorithms to produce the GLAS standard products, reprocess data when new versions of the science algorithms are released, and distribute the products to the SCF, ISF, and NSIDC. I-SIPS has a proven performance record, having delivered data to the SCF within hours after initial instrument activation. The I-SIPS design philosophy gives this system high potential for reuse in other science missions.
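A toy sketch of the data-driven job planning attributed to SDMS: a product-generation job is released only once every input it declares is present. The product and file names below are hypothetical, not actual GLAS granule names.

    # Inputs currently available to the processing system.
    available = {"instrument_granule.dat", "orbit_ancillary.dat"}

    # Each job declares the input files it needs before it can run.
    jobs = {
        "make_level1_product": {"instrument_granule.dat", "orbit_ancillary.dat"},
        "make_level2_product": {"level1_product.dat"},   # waits for upstream output
    }

    ready = sorted(job for job, inputs in jobs.items() if inputs <= available)
    print("schedulable now:", ready)   # -> ['make_level1_product']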
Smart laser hole drilling for gas turbine combustors
NASA Astrophysics Data System (ADS)
Laraque, Edy
1991-04-01
A smart laser drilling system, which incorporates in-process airflow inspection of the holes and intelligent real-time process-parameter corrections, is described. The system, together with well-developed laser parameters, has proved efficient for producing cooling holes that meet the highest aeronautical standards. To date, the system has been used for percussion drilling of combustion chamber cooling holes. It is considered very economical thanks to its drilling-on-the-fly capability, which can produce up to 3 holes of 0.025-in. diameter per second.
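A minimal sketch of the kind of real-time parameter correction described, assuming a simple proportional rule that nudges pulse energy based on the in-process airflow measurement; the gain and the energy-to-flow relation are illustrative assumptions, not the system's actual control law.

    def corrected_pulse_energy(energy_j, measured_flow, target_flow, gain=0.5):
        """One step of a hypothetical proportional correction: if the
        in-process airflow check shows the hole is undersized (low flow),
        raise the pulse energy for the next hole; if oversized, lower it."""
        error = (target_flow - measured_flow) / target_flow
        return energy_j * (1.0 + gain * error)

    # Example: airflow 10% below target nudges the pulse energy up by 5%.
    print(corrected_pulse_energy(energy_j=2.0, measured_flow=0.9, target_flow=1.0))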
Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K
1996-03-01
An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
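The abstract does not detail the standard-deviation routine; a commonly published approach for NIR blend monitoring is the moving-block standard deviation, sketched below on synthetic spectra as a plausible stand-in. The block size, homogeneity threshold, and data are all assumptions.

    import numpy as np

    def moving_block_sd(spectra, block=5):
        """Moving-block standard deviation for blend monitoring: for each
        block of consecutive spectra, compute the SD at every wavelength
        and average across wavelengths. The blend is judged homogeneous
        once this value settles below a preset threshold."""
        n = len(spectra)
        return np.array([
            spectra[i:i + block].std(axis=0, ddof=1).mean()
            for i in range(n - block + 1)
        ])

    # Hypothetical run: 40 spectra over 200 wavelengths, with
    # spectrum-to-spectrum variability shrinking as the blend mixes.
    rng = np.random.default_rng(0)
    spectra = 1.0 + rng.normal(0, 0.05, (40, 200)) * np.linspace(1, 0.05, 40)[:, None]
    mbsd = moving_block_sd(spectra)
    print("homogeneous from block", int(np.argmax(mbsd < 0.01)))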