Sample records for standardization work process

  1. 40 CFR 63.7890 - What emissions limitations and work practice standards must I meet for process vents?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS... Pollutants: Site Remediation Process Vents § 63.7890 What emissions limitations and work practice standards...

  2. 40 CFR 63.7890 - What emissions limitations and work practice standards must I meet for process vents?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS... Pollutants: Site Remediation Process Vents § 63.7890 What emissions limitations and work practice standards...

  3. 40 CFR 63.7890 - What emissions limitations and work practice standards must I meet for process vents?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS... Pollutants: Site Remediation Process Vents § 63.7890 What emissions limitations and work practice standards...

  4. 40 CFR 63.7890 - What emissions limitations and work practice standards must I meet for process vents?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS... Pollutants: Site Remediation Process Vents § 63.7890 What emissions limitations and work practice standards...

  5. NASA's Earth Science Data Systems Standards Process Experiences

    NASA Technical Reports Server (NTRS)

    Ullman, Richard E.; Enloe, Yonsook

    2007-01-01

    NASA has impaneled several internal working groups to provide recommendations to NASA management on ways to evolve and improve Earth Science Data Systems. One of these working groups is the Standards Process Group (SPG). The SPG is drawn from NASA-funded Earth Science Data Systems stakeholders, and it directs a process of community review and evaluation of proposed NASA standards. The working group's goal is to promote interoperability and interuse of NASA Earth Science data by facilitating NASA management endorsement of standards that have proven implementation and operational benefit to NASA Earth science. The SPG now has two years of experience with this approach to the identification of standards. We will discuss real examples of the different types of candidate standards that have been proposed to NASA's Standards Process Group, such as OPeNDAP's Data Access Protocol, the Hierarchical Data Format, and the Open Geospatial Consortium's Web Map Server. Each of the three types of proposals requires a different sort of criteria for understanding the broad concepts of "proven implementation" and "operational benefit" in the context of NASA Earth Science data systems. We will discuss how our standards process has evolved with our experience with these three candidate standards.
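
    Of the three candidate standards named in the abstract, OGC's Web Map Server is the most request-oriented: a WMS 1.3.0 GetMap call is simply an HTTP URL with well-known query parameters. The sketch below assembles such a URL; the endpoint and layer name are hypothetical placeholders, not actual NASA services.

    ```python
    from urllib.parse import urlencode

    def build_getmap_url(base_url, layer, bbox, width, height,
                         crs="EPSG:4326", fmt="image/png", version="1.3.0"):
        """Assemble a WMS 1.3.0 GetMap request URL from its standard parameters."""
        params = {
            "SERVICE": "WMS",
            "VERSION": version,
            "REQUEST": "GetMap",
            "LAYERS": layer,
            "CRS": crs,
            "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
            "WIDTH": width,
            "HEIGHT": height,
            "FORMAT": fmt,
        }
        return base_url + "?" + urlencode(params)

    url = build_getmap_url(
        "https://example.gov/wms",        # hypothetical endpoint
        layer="global_true_color",         # hypothetical layer name
        bbox=(-180, -90, 180, 90),
        width=1024, height=512,
    )
    print(url)
    ```

    A client fetching this URL would receive a rendered map image; the point of the standard is that any conforming server understands the same parameter vocabulary.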

  6. 40 CFR 63.7500 - What emission limits, work practice standards, and operating limits must I meet?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Industrial, Commercial, and Institutional Boilers and Process Heaters Emission Limits and Work Practice... emission limit and work practice standard in Table 1 to this subpart that applies to your boiler or process... 40 Protection of Environment 13 2010-07-01 2010-07-01 false What emission limits, work practice...

  7. Development of Entry-Level Competence Tests: A Strategy for Evaluation of Vocational Education Training Systems

    ERIC Educational Resources Information Center

    Schutte, Marc; Spottl, Georg

    2011-01-01

    Developing countries such as Malaysia and Oman have recently established occupational standards based on core work processes (functional clusters of work objects, activities and performance requirements), to which competencies (performance determinants) can be linked. While the development of work-process-based occupational standards is supposed…

  8. IEEE Std 730 Software Quality Assurance: Supporting CMMI-DEV v1.3, Product and Process Quality Assurance

    DTIC Science & Technology

    2011-05-27

    frameworks 4 CMMI-DEV IEEE / ISO / IEC 15288 / 12207 Quality Assurance ©2011 Walz IEEE Life Cycle Processes & Artifacts • Systems Life Cycle Processes...TAG to ISO TC 176 Quality Management • Quality: ASQ, work experience • Software: three books, consulting, work experience • Systems: Telecom & DoD...and IEEE 730 SQA need to align. The P730 IEEE standards working group has expanded the scope of the SQA process standard to align with IS 12207

  9. Standards Handbook. Version 4.0. What Works Clearinghouse™

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2017

    2017-01-01

    The What Works Clearinghouse (WWC) systematic review process is the basis of many of its products, enabling the WWC to use consistent, objective, and transparent standards and procedures in its reviews, while also ensuring comprehensive coverage of the relevant literature. The WWC systematic review process consists of five steps: (1) Developing…

  10. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    ERIC Educational Resources Information Center

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  11. Standardization efforts of digital pathology in Europe.

    PubMed

    Rojo, Marcial García; Daniel, Christel; Schrader, Thomas

    2012-01-01

    EURO-TELEPATH is a European COST Action IC0604. It started in 2007 and will end in November 2011. Its main objectives are evaluating and validating the common technological framework and communication standards required to access, transmit, and manage digital medical records by pathologists and other medical specialties in a networked environment. Working Group 1, "Business Modelling in Pathology," has designed main pathology processes - Frozen Study, Formalin Fixed Specimen Study, Telepathology, Cytology, and Autopsy - using Business Process Modelling Notation (BPMN). Working Group 2 has been dedicated to promoting the application of informatics standards in pathology, collaborating with Integrating Healthcare Enterprise (IHE), Digital Imaging and Communications in Medicine (DICOM), Health Level Seven (HL7), and other standardization bodies. Health terminology standardization research has become a topic of great interest. Future research work should focus on standardizing automatic image analysis and tissue microarrays imaging.

  12. The COST Action IC0604 "Telepathology Network in Europe" (EURO-TELEPATH).

    PubMed

    García-Rojo, Marcial; Gonçalves, Luís; Blobel, Bernd

    2012-01-01

    The COST Action IC0604 "Telepathology Network in Europe" (EURO-TELEPATH) is a European COST Action that ran from 2007 to 2011. COST Actions are funded by the COST (European Cooperation in the field of Scientific and Technical Research) Agency, supported by the Seventh Framework Programme for Research and Technological Development (FP7) of the European Union. EURO-TELEPATH's main objectives were evaluating and validating the common technological framework and communication standards required to access, transmit and manage digital medical records by pathologists and other medical professionals in a networked environment. The project was organized in four working groups. Working Group 1, "Business Modeling in Pathology," has designed the main pathology processes - Frozen Study, Formalin Fixed Specimen Study, Telepathology, Cytology, and Autopsy - using Business Process Modeling Notation (BPMN). Working Group 2, "Informatics Standards in Pathology," has been dedicated to promoting the development and application of informatics standards in pathology, collaborating with Integrating the Healthcare Enterprise (IHE), Digital Imaging and Communications in Medicine (DICOM), Health Level Seven (HL7), and other standardization bodies. Working Group 3, "Images: Analysis, Processing, Retrieval and Management," worked on the use of virtual or digital slides, which are fostering the use of image processing and analysis in pathology not only for research purposes but also in daily practice. Working Group 4, "Technology and Automation in Pathology," focused on studying the adequacy of currently existing technical solutions, including, e.g., the quality of images obtained by slide scanners and the efficiency of image analysis applications. A major outcome of this Action is the collaboration with international health informatics standardization bodies to foster the development of standards for digital pathology, offering a new approach to workflow analysis based on business process modeling. Health terminology standardization research has become a topic of high interest. Future research work should focus on the standardization of automatic image analysis and tissue microarray imaging.
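
    The pathology processes designed by Working Group 1 are expressed in BPMN, which is serialized as XML. As a rough, hypothetical illustration of what such a model looks like under the hood, the sketch below assembles a minimal BPMN 2.0 fragment for a "Frozen Study"-style process (start event, one task, end event); the namespace comes from the OMG BPMN schema, while the process and task names are invented for illustration.

    ```python
    import xml.etree.ElementTree as ET

    BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"
    ET.register_namespace("bpmn", BPMN_NS)

    def q(tag):
        """Qualify a tag name with the BPMN model namespace."""
        return f"{{{BPMN_NS}}}{tag}"

    # Minimal process: start event -> "Freeze specimen" task -> end event
    defs = ET.Element(q("definitions"))
    proc = ET.SubElement(defs, q("process"), id="FrozenStudy", isExecutable="false")
    ET.SubElement(proc, q("startEvent"), id="start")
    ET.SubElement(proc, q("task"), id="freeze", name="Freeze specimen")
    ET.SubElement(proc, q("endEvent"), id="end")
    ET.SubElement(proc, q("sequenceFlow"), id="f1", sourceRef="start", targetRef="freeze")
    ET.SubElement(proc, q("sequenceFlow"), id="f2", sourceRef="freeze", targetRef="end")

    xml = ET.tostring(defs, encoding="unicode")
    print(xml)
    ```

    Real BPMN models of a frozen-section workflow would of course carry many more tasks, gateways, and lanes; the value of the notation is that the same XML is readable by any BPMN-aware tool.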

  13. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. Technical requirements concern "must work" and "must not work" functions in the system; process-oriented requirements are software engineering and safety management process requirements. Some standards address the system perspective, while others cover just the software in the system; NASA-STD-8719.13B, the Software Safety Standard, is the current standard of interest. NASA programs/projects will have their own set of safety requirements derived from the standard. Safety cases: (a) a documented demonstration that a system complies with the specified safety requirements; (b) evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]; (c) problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.

  14. 40 CFR 65.3 - Compliance with standards and operation and maintenance requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...)(4)(i) and (ii) do not apply to Group 2A or Group 2B process vents. Compliance with design, equipment, work practice, and operational standards, including those for equipment leaks, shall be determined... this part. (5) Design, equipment, work practice, or operational standards. Paragraphs (b)(5)(i) and (ii...

  15. International standards on working postures and movements ISO 11226 and EN 1005-4.

    PubMed

    Delleman, N J; Dul, J

    2007-11-01

    Standards organizations have given considerable attention to the problem of work-related musculoskeletal disorders. The publication of international standards for evaluating working postures and movements, ISO 11226 in 2000 and EN 1005-4 in 2005, may be considered a support for those involved in preventing and controlling these disorders. The former is a tool for evaluating existing work situations, whereas the latter is a tool for evaluation during a design/engineering process. Key publications and considerations that led to the content of the standards are presented, followed by examples of application.

  16. Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report.

    PubMed

    Levitt, Heidi M; Bamberg, Michael; Creswell, John W; Frost, David M; Josselson, Ruthellen; Suárez-Orozco, Carola

    2018-01-01

    The American Psychological Association Publications and Communications Board Working Group on Journal Article Reporting Standards for Qualitative Research (JARS-Qual Working Group) was charged with examining the state of journal article reporting standards as they applied to qualitative research and with generating recommendations for standards that would be appropriate for a wide range of methods within the discipline of psychology. These standards describe what should be included in a research report to enable and facilitate the review process. This publication marks a historical moment: the first inclusion of qualitative research in APA Style, which is the basis of both the Publication Manual of the American Psychological Association (APA, 2010) and APA Style CENTRAL, an online program to support APA Style. In addition to the general JARS-Qual guidelines, the Working Group has developed standards for both qualitative meta-analysis and mixed methods research. The reporting standards were developed for psychological qualitative research but may hold utility for a broad range of social sciences. They honor a range of qualitative traditions, methods, and reporting styles. The Working Group was composed of a group of researchers with backgrounds in varying methods, research topics, and approaches to inquiry. In this article, they present these standards and their rationale, and they detail the ways that the standards differ from the quantitative research reporting standards. They describe how the standards can be used by authors in the process of writing qualitative research for submission as well as by reviewers and editors in the process of reviewing research. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  17. 40 CFR Table 1 to Subpart Hhhhh of... - Emission Limits and Work Practice Standards for Process Vessels

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants... to a condenser that reduces the outlet gas temperature to: <10 °C if the process vessel contains HAP...

  18. ISO 9001 in a neonatal intensive care unit (NICU).

    PubMed

    Vitner, Gad; Nadir, Erez; Feldman, Michael; Yurman, Shmuel

    2011-01-01

    The aim of this paper is to present the process for approving and certifying a neonatal intensive care unit to ISO 9001 standards. The process started with the department head's decision to improve services quality before deciding to achieve ISO 9001 certification. Department processes were mapped and quality management mechanisms were developed. Process control and performance measurements were defined and implemented to monitor the daily work. A service satisfaction review was conducted to get feedback from families. In total, 28 processes and related work instructions were defined. Process yields showed service improvements. Family satisfaction improved. The paper is based on preparing only one neonatal intensive care unit to the ISO 9001 standard. The case study should act as an incentive for hospital managers aiming to improve service quality based on the ISO 9001 standard. ISO 9001 is becoming a recommended tool to improve clinical service quality.

  19. Standard operating procedures for serum and plasma collection: early detection research network consensus statement standard operating procedure integration working group.

    PubMed

    Tuck, Melissa K; Chan, Daniel W; Chia, David; Godwin, Andrew K; Grizzle, William E; Krueger, Karl E; Rom, William; Sanda, Martin; Sorbara, Lynn; Stass, Sanford; Wang, Wendy; Brenner, Dean E

    2009-01-01

    Specimen collection is an integral component of clinical research. Specimens from subjects with various stages of cancers or other conditions, as well as those without disease, are critical tools in the hunt for biomarkers, predictors, or tests that will detect serious diseases earlier or more readily than currently possible. Analytic methodologies evolve quickly. Access to high-quality specimens, collected and handled in standardized ways that minimize potential bias or confounding factors, is key to the "bench to bedside" aim of translational research. It is essential that standard operating procedures, "the how" of creating the repositories, be defined prospectively when designing clinical trials. Small differences in the processing or handling of a specimen can have dramatic effects in analytical reliability and reproducibility, especially when multiplex methods are used. A representative working group, Standard Operating Procedures Internal Working Group (SOPIWG), comprised of members from across Early Detection Research Network (EDRN) was formed to develop standard operating procedures (SOPs) for various types of specimens collected and managed for our biomarker discovery and validation work. This report presents our consensus on SOPs for the collection, processing, handling, and storage of serum and plasma for biomarker discovery and validation.

  20. Implementing PAT with Standards

    NASA Astrophysics Data System (ADS)

    Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.

    2016-02-01

    Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent to inconsistent representation of business processes, and the interoperability issues that arise in PAT-like cap-and-trade mechanisms, especially when scaled. Studies by various agencies have highlighted that as the mechanism evolves to include more industrial sectors and industries in its ambit, implementation will become more challenging. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing and verifying energy-saving reports, and providing technical support and guidance to stakeholders) and how the aforementioned challenges affect them. Though current technologies can handle these challenges to an extent, standardization activities for PAT implementation have been scanty, and this work attempts to develop them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems, are addressed. This paper proposes the adoption of two standards into PAT: Business Process Model and Notation, for maintaining consistency in business process modelling, and the Common Information Model (IEC 61970, 61968, and 62325 combined), for information exchange. The detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.
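
    The cap-and-trade core of PAT can be sketched in a few lines: each designated consumer receives an energy-consumption target, and the gap between target and actual consumption translates into tradable energy-saving certificates. The sketch below is illustrative only; the class and plant names are invented, and the real scheme's accounting and verification rules are considerably more involved.

    ```python
    from dataclasses import dataclass

    @dataclass
    class DesignatedConsumer:
        """One PAT participant with a consumption target (illustrative model)."""
        name: str
        target_toe: float   # assigned target, tonnes of oil equivalent
        actual_toe: float   # verified consumption in the compliance period

        def escert_balance(self) -> float:
            """Positive: certificates earned (over-achiever);
            negative: certificates that must be purchased (under-achiever)."""
            return self.target_toe - self.actual_toe

    plants = [
        DesignatedConsumer("Plant A", target_toe=1000.0, actual_toe=950.0),
        DesignatedConsumer("Plant B", target_toe=800.0, actual_toe=830.0),
    ]
    balances = {p.name: p.escert_balance() for p in plants}
    print(balances)  # Plant A earns 50 certificates; Plant B must buy 30
    ```

    Tracking, auditing, and trading these balances consistently across many such participants is exactly where the paper argues that shared models (BPMN for the processes, CIM for the exchanged data) become necessary.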

  1. 40 CFR 405.44 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  2. 40 CFR 405.104 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  3. 40 CFR 428.76 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to this subpart that introduces process wastewater pollutants into a publicly owned treatment works... this section. (a) The following pretreatment standard establishes the quantity or quality of pollutants... treatment works by a new point source subject to the provisions of this subpart: Pollutant or pollutant...

  4. 40 CFR 428.76 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... to this subpart that introduces process wastewater pollutants into a publicly owned treatment works... this section. (a) The following pretreatment standard establishes the quantity or quality of pollutants... treatment works by a new point source subject to the provisions of this subpart: Pollutant or pollutant...

  5. 40 CFR 458.26 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: Pollutant or pollutant property...

  6. 40 CFR 406.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works...

  7. 40 CFR 406.44 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  8. 40 CFR 458.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: Pollutant or pollutant property...

  9. 40 CFR 406.34 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  10. 40 CFR 405.124 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  11. 40 CFR 428.106 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... section. (a) The following pretreatment standard establishes the quantity or quality of pollutants or... works by a new point source subject to the provisions of this subpart: Pollutant or pollutant property...

  12. 40 CFR 458.36 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: Pollutant or pollutant property...

  13. 40 CFR 458.46 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: Pollutant or pollutant property...

  14. 48 CFR 46.202-3 - Standard inspection requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-3 Standard inspection... and tests while work is in process; and (3) Require the contractor to keep complete, and make available to the Government, records of its inspection work. [48 FR 42415, Sept. 19, 1983. Redesignated at...

  15. 40 CFR 406.14 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works...

  16. 40 CFR 405.44 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  17. 40 CFR 458.26 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: Pollutant or pollutant property...

  18. 40 CFR 406.54 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  19. 48 CFR 46.202-3 - Standard inspection requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-3 Standard inspection... and tests while work is in process; and (3) Require the contractor to keep complete, and make available to the Government, records of its inspection work. [48 FR 42415, Sept. 19, 1983. Redesignated at...

  20. 40 CFR 406.64 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  1. 40 CFR 427.96 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new point source subject to the provisions of this subpart. Pollutant or pollutant...

  2. 40 CFR 428.106 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... section. (a) The following pretreatment standard establishes the quantity or quality of pollutants or... works by a new point source subject to the provisions of this subpart: Pollutant or pollutant property...

  3. 48 CFR 46.202-3 - Standard inspection requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-3 Standard inspection... and tests while work is in process; and (3) Require the contractor to keep complete, and make available to the Government, records of its inspection work. [48 FR 42415, Sept. 19, 1983. Redesignated at...

  4. 40 CFR 446.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works...

  5. 40 CFR 427.106 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new point source subject to the provisions of this subpart. Pollutant or pollutant...

  6. 40 CFR 405.44 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  7. 40 CFR 428.56 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... to this subpart that introduces process wastewater pollutants into a publicly owned treatment works... this section. (a) The following pretreatment standard establishes the quantity or quality of pollutants... treatment works by a new point source subject to the provisions of this subpart: Pollutant or pollutant...

  8. 40 CFR 458.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: Pollutant or pollutant property...

  9. 48 CFR 46.202-3 - Standard inspection requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... CONTRACT MANAGEMENT QUALITY ASSURANCE Contract Quality Requirements 46.202-3 Standard inspection... and tests while work is in process; and (3) Require the contractor to keep complete, and make available to the Government, records of its inspection work. [48 FR 42415, Sept. 19, 1983. Redesignated at...

  10. 40 CFR 446.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works...

  11. 40 CFR 458.36 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: Pollutant or pollutant property...

  12. 40 CFR 427.116 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new point source subject to the provisions of this subpart. Pollutant or pollutant...

  13. 40 CFR 428.56 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to this subpart that introduces process wastewater pollutants into a publicly owned treatment works... this section. (a) The following pretreatment standard establishes the quantity or quality of pollutants... treatment works by a new point source subject to the provisions of this subpart: Pollutant or pollutant...

  14. 40 CFR 408.156 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: Pollutant or pollutant property...

  15. 40 CFR 447.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: There shall be no discharge of...

  16. 40 CFR 405.94 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... that introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  17. 40 CFR 405.14 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... that introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  18. 40 CFR 427.116 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of pollutants or... works by a new point source subject to the provisions of this subpart. Pollutant or pollutant property...

  19. 40 CFR 405.34 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... that introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  20. 40 CFR 405.34 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... that introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  1. 40 CFR 446.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: There shall be no discharge of...

  2. 40 CFR 447.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: There shall be no discharge of...

  3. 40 CFR 405.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... that introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  4. 40 CFR 426.86 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new point source subject to the provisions of this subpart. Because of the recognition...

  5. 40 CFR 427.116 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of pollutants or... works by a new point source subject to the provisions of this subpart. Pollutant or pollutant property...

  6. 40 CFR 405.114 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... that introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  7. 40 CFR 405.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... that introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  8. 40 CFR 405.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  9. 40 CFR 446.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: There shall be no discharge of...

  10. 40 CFR 405.34 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  11. 40 CFR 405.14 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... that introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  12. 40 CFR 446.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: There shall be no discharge of...

  13. 40 CFR 405.14 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a point source subject to the provisions of this subpart. Pollutant or pollutant property...

  14. 40 CFR 447.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... introduces process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In addition, the following pretreatment standard establishes the quantity or quality of... treatment works by a new source subject to the provisions of this subpart: There shall be no discharge of...

  15. 40 CFR 63.7184 - What emission limitations, operating limits, and work practice standards must I meet?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Pollutants for Semiconductor Manufacturing Emission Standards § 63.7184 What emission limitations, operating... this section on and after the compliance dates specified in § 63.7183. (b) Process vents—organic HAP emissions. For each organic HAP process vent, other than process vents from storage tanks, you must limit...

  16. 40 CFR 63.7184 - What emission limitations, operating limits, and work practice standards must I meet?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Pollutants for Semiconductor Manufacturing Emission Standards § 63.7184 What emission limitations, operating... this section on and after the compliance dates specified in § 63.7183. (b) Process vents—organic HAP emissions. For each organic HAP process vent, other than process vents from storage tanks, you must limit...

  17. Final Report. An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenthal, Andrew

The DOE grant, “An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group,” to New Mexico State University created the Solar America Board for Codes and Standards (Solar ABCs). From 2007 to 2013, with funding from this grant, Solar ABCs identified current issues, established a dialogue among key stakeholders, and catalyzed appropriate activities to support the development of codes and standards that facilitated the installation of high-quality, safe photovoltaic systems. Solar ABCs brought the following resources to the PV stakeholder community: formal coordination in the planning or revision of interrelated codes and standards, removing “stovepipes” that have only roofing experts working on roofing codes, PV experts on PV codes, fire enforcement experts working on fire codes, etc.; a conduit through which all interested stakeholders were able to see the steps being taken in the development or modification of codes and standards and participate directly in the processes; a central clearing house for new documents, standards, proposed standards, analytical studies, and recommendations of best practices available to the PV community; a forum of experts that invites and welcomes all interested parties into the process of performing studies, evaluating results, and building consensus on standards and code-related topics that affect all aspects of the market; and a biennial gap analysis to formally survey the PV community to identify needs that are unmet and inhibiting the market and necessary technical developments.

  18. Indexicality and "Standard" Edited American English: Examining the Link between Conceptions of Standardness and Perceived Authorial Identity

    ERIC Educational Resources Information Center

    Davila, Bethany

    2012-01-01

    This article explores the indexicality (the ideological process that links language and identity) of "standard" edited American English and the ideologies (specifically, standard language ideology and Whiteness) that work to create and justify common patterns that associate privileged White students with written standardness and that disassociate…

  19. Teaching Practise Utilising Embedded Indigenous Cultural Standards

    ERIC Educational Resources Information Center

    Gilbert, Stephanie

    2017-01-01

    The Wollotuka Institute, University of Newcastle, New South Wales, is the first university or organisation to enter into the accreditation process with the World Indigenous Higher Education Consortium (WINHEC). Part of that process includes identifying the local cultural standards and protocols that drive and shape our work as a cultural entity.…

  20. Interrelationships between Working Memory, Processing Speed, and Language Development in the Age Range 2-4 Years

    ERIC Educational Resources Information Center

    Newbury, Jayne; Klee, Thomas; Stokes, Stephanie F.; Moran, Catherine

    2016-01-01

    Purpose: This study explored associations between working memory and language in children aged 2-4 years. Method: Seventy-seven children aged 24-30 months were assessed on tests measuring language, visual cognition, verbal working memory (VWM), phonological short-term memory (PSTM), and processing speed. A standardized test of receptive and…

  1. Social work, technology, and ethical practices: a review and evaluation of the National Association of Social Workers' technology standards.

    PubMed

    Lopez, Amy

    2014-10-01

    Information and communication technologies (ICTs) are becoming essential to social work practice by providing increased treatment possibilities and reducing barriers to service. While recognizing the importance of ICTs in practice, social work practitioners have had concerns about ethical use. In response, NASW compiled the Standards for Technology and Social Work Practice. While the guidelines set the groundwork, they were not embedded in a process that would allow them to adapt to the swift pace of ICT changes. This article reviews the current Standards, evaluates how these have been implemented by practitioners, and offers suggestions for updates.

  2. National Education Standards: The Complex Challenge for Educational Leaders.

    ERIC Educational Resources Information Center

    Faidley, Ray; Musser, Steven

    1991-01-01

    National standards for education are important elements in the excellence process, but standards imposed by a central authority simply do not work in the Information Era. It would be wise to increase teachers' decision-making role in establishing and implementing local level excellence standards and train teachers to employ the Japanese "kaizen"…

  3. Cost minimizing of cutting process for CNC thermal and water-jet machines

    NASA Astrophysics Data System (ADS)

    Tavaeva, Anastasia; Kurennov, Dmitry

    2015-11-01

    This paper deals with the optimization of the cutting process for CNC thermal and water-jet machines. The accuracy of the objective function parameters used in the optimization problem is investigated. The paper shows that working tool path speed is not a constant value; it depends on several parameters described in the paper. Relations for working tool path speed as a function of the number of NC program frames, the length of a straight cut, and the part configuration are presented. Based on the results obtained, correction coefficients for the working tool speed are defined. Additionally, the optimization problem may be solved using a mathematical model that takes into account the additional restrictions of thermal cutting (choice of piercing and output tool points, precedence conditions, thermal deformations). The second part of the paper considers non-standard cutting techniques, which may reduce cutting cost and time compared with standard techniques, and evaluates the effectiveness of their application. Future research directions are indicated at the end of the paper.

  4. Using continuous process improvement methodology to standardize nursing handoff communication.

    PubMed

    Klee, Kristi; Latta, Linda; Davis-Kirsch, Sallie; Pecchia, Maria

    2012-04-01

    The purpose of this article was to describe the use of continuous performance improvement (CPI) methodology to standardize nurse shift-to-shift handoff communication. The goals of the process were to standardize the content and process of shift handoff, improve patient safety, increase patient and family involvement in the handoff process, and decrease end-of-shift overtime. This article describes process changes made over a 4-year period as a result of applying the plan-do-check-act procedure, an integral part of the CPI methodology, and discusses further work needed to continue to refine this critical nursing care process. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Enhancing the NASA Expendable Launch Vehicle Payload Safety Review Process Through Program Activities

    NASA Technical Reports Server (NTRS)

    Palo, Thomas E.

    2007-01-01

    The safety review process for NASA spacecraft flown on Expendable Launch Vehicles (ELVs) has been guided by NASA-STD 8719.8, Expendable Launch Vehicle Payload Safety Review Process Standard. The standard focused primarily on the safety approval required to begin pre-launch processing at the launch site. Subsequent changes in the contractual, technical, and operational aspects of payload processing, combined with lessons learned, supported the need to reassess the standard. This resulted in the formation of a NASA ELV Payload Safety Program, which has been working to address the programmatic issues that will enhance and supplement the existing process, while continuing to ensure the safety of ELV payload activities.

  6. The international development of forensic science standards - A review.

    PubMed

    Wilson-Wilde, Linzi

    2018-04-16

    Standards establish specifications and procedures designed to ensure products, services and systems are safe, reliable and consistently perform as intended. Standards can be used in the accreditation of forensic laboratories or facilities and in the certification of products and services. In recent years there have been various international activities aiming at developing forensic science standards and guidelines. The most significant initiative currently underway within the global forensic community is the development of International Organization for Standardization (ISO) standards. This paper reviews the main bodies working on standards for forensic science, the processes used and the implications for accreditation. This paper specifically discusses the work of ISO Technical Committee TC272, the future TC272 work program for the development of forensic science standards and associated timelines. Also discussed, are the lessons learnt to date in navigating the complex environment of multi-country stakeholder deliberations in standards development. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.

  7. 30 CFR 816.14 - Casing and sealing of drilled holes: Temporary.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., DEPARTMENT OF THE INTERIOR PERMANENT PROGRAM PERFORMANCE STANDARDS PERMANENT PROGRAM PERFORMANCE STANDARDS... approved permit application for use to return coal processing waste or water to underground workings, or to...

  8. 40 CFR 411.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) EFFLUENT GUIDELINES AND STANDARDS CEMENT MANUFACTURING POINT SOURCE CATEGORY Leaching Subcategory § 411.24... process wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. In...

  9. Standard development at the Human Variome Project.

    PubMed

    Smith, Timothy D; Vihinen, Mauno

    2015-01-01

    The Human Variome Project (HVP) is a world organization working towards facilitating the collection, curation, interpretation and free and open sharing of genetic variation information. A key component of HVP activities is the development of standards and guidelines. HVP Standards are systems, procedures and technologies that the HVP Consortium has determined must be used by HVP-affiliated data sharing infrastructure and should be used by the broader community. HVP guidelines are considered to be beneficial for HVP affiliated data sharing infrastructure and the broader community to adopt. The HVP also maintains a process for assessing systems, processes and tools that implement HVP Standards and Guidelines. Recommended System Status is an accreditation process designed to encourage the adoption of HVP Standards and Guidelines. Here, we describe the HVP standards development process and discuss the accepted standards, guidelines and recommended systems as well as those under acceptance. Certain HVP Standards and Guidelines are already widely adopted by the community and there are committed users for the others. © The Author(s) 2015. Published by Oxford University Press.

  10. Standard development at the Human Variome Project

    PubMed Central

    Smith, Timothy D.; Vihinen, Mauno

    2015-01-01

    The Human Variome Project (HVP) is a world organization working towards facilitating the collection, curation, interpretation and free and open sharing of genetic variation information. A key component of HVP activities is the development of standards and guidelines. HVP Standards are systems, procedures and technologies that the HVP Consortium has determined must be used by HVP-affiliated data sharing infrastructure and should be used by the broader community. HVP guidelines are considered to be beneficial for HVP affiliated data sharing infrastructure and the broader community to adopt. The HVP also maintains a process for assessing systems, processes and tools that implement HVP Standards and Guidelines. Recommended System Status is an accreditation process designed to encourage the adoption of HVP Standards and Guidelines. Here, we describe the HVP standards development process and discuss the accepted standards, guidelines and recommended systems as well as those under acceptance. Certain HVP Standards and Guidelines are already widely adopted by the community and there are committed users for the others. PMID:25818894

  11. 40 CFR 420.45 - Pretreatment standards for existing sources (PSES).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... owned treatment works must comply with 40 CFR part 403 and achieve the following pretreatment standards for existing sources. (a) Electric arc furnace steelmaking—semi-wet. No discharge of process... electric arc furnace steelmaking—wet. Subpart D Pollutant or pollutant property Pretreatment standards for...

  12. 40 CFR 420.45 - Pretreatment standards for existing sources (PSES).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... owned treatment works must comply with 40 CFR part 403 and achieve the following pretreatment standards for existing sources. (a) Electric arc furnace steelmaking—semi-wet. No discharge of process... electric arc furnace steelmaking—wet. Subpart D Pollutant or pollutant property Pretreatment standards for...

  13. 40 CFR 420.45 - Pretreatment standards for existing sources (PSES).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... owned treatment works must comply with 40 CFR part 403 and achieve the following pretreatment standards for existing sources. (a) Electric arc furnace steelmaking—semi-wet. No discharge of process... electric arc furnace steelmaking—wet. Subpart D Pollutant or pollutant property Pretreatment standards for...

  14. 40 CFR 420.45 - Pretreatment standards for existing sources (PSES).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... owned treatment works must comply with 40 CFR part 403 and achieve the following pretreatment standards for existing sources. (a) Electric arc furnace steelmaking—semi-wet. No discharge of process... electric arc furnace steelmaking—wet. Subpart D Pollutant or pollutant property Pretreatment standards for...

  15. Openness of Common-Standards Process at Issue

    ERIC Educational Resources Information Center

    Cavanagh, Sean

    2009-01-01

    As the most concerted venture to craft common academic standards in more than a decade rolls forward, the process has drawn criticism from those who say too much of the nitty-gritty work is taking place behind closed doors. The organizations leading the effort--the National Governors Association (NGA) and the Council of Chief State School Officers…

  16. [Work organisation improvement methods applied to activities of Blood Transfusion Establishments (BTE): Lean Manufacturing, VSM, 5S].

    PubMed

    Bertholey, F; Bourniquel, P; Rivery, E; Coudurier, N; Follea, G

    2009-05-01

    Continuous improvement of efficiency, as well as new expectations from customers (quality and safety of blood products) and employees (working conditions), imply constant efforts in Blood Transfusion Establishments (BTE) to improve work organisation. The Lean method (from "lean" meaning "thin") aims at identifying wastes in the process (overproduction, waiting, over-processing, inventory, transport, motion) and then reducing them by establishing a map of the value chain (Value Stream Mapping), which determines the added value of each step of the process from a customer perspective. Lean also consists in standardizing operations while involving and empowering all staff. The name 5S comes from the first letters of five operations of a Japanese management technique: to clear, rank, keep clean, standardize, and make durable. The 5S method develops teamwork and induces an evolution in the way management is performed. The Lean VSM method has been applied to blood processing (component laboratory) in the Pays de la Loire BTE. The Lean 5S method has been applied to blood processing, quality control, purchasing, warehousing, human resources and quality assurance in the Rhône-Alpes BTE. Feedback from both BTE shows that these methods improved: (1) the processes and working conditions from a quality perspective, (2) staff satisfaction, and (3) efficiency. These experiences, implemented in two BTE for different processes, confirm the applicability and usefulness of these methods for improving work organisation in BTE.

  17. Standardization of the Principal Processes in Scientific and Technical Information and Librarianship in the U.S.S.R.

    ERIC Educational Resources Information Center

    Haritonov, R. P.

    1971-01-01

    An important feature of standardization work in the Soviet Union is the preparation and establishment of State standards enabling unified systems to be introduced for documentation, classification, coding and technical and economic information, as well as standards for all kinds of information storage media. (Author/MM)

  18. Standard work for room entry: Linking lean, hand hygiene, and patient-centeredness.

    PubMed

    O'Reilly, Kristin; Ruokis, Samantha; Russell, Kristin; Teves, Tim; DiLibero, Justin; Yassa, David; Berry, Hannah; Howell, Michael D

    2016-03-01

    Healthcare-associated infections are costly and can be fatal. Substantial front-line, administrative, regulatory, and research efforts have focused on improving hand hygiene. While broad agreement exists that hand hygiene is the most important single approach to infection prevention, compliance with hand hygiene is typically only about 40% (1). Our aim was to develop a standard process for room entry in the intensive care unit that improved compliance with hand hygiene and allowed for maximum efficiency. We recognized that hand hygiene is a single step in a substantially more complicated process of room entry. We applied Lean engineering techniques to develop a standard process that included both physical steps and standard communication elements from provider to patients and families, and created a physical environment to support this. We observed meaningful improvement in the performance of the new standard as well as time savings for clinical providers with each room entry. We also observed an increase in room entries that included verbal communication and an explanation of what the clinician was entering the room to do. The design and implementation of a standardized room entry process, and the creation of an environment that supports that new process, have resulted in measurable positive outcomes in the medical intensive care unit, including quality, patient experience, efficiency, and staff satisfaction. Designing a process, rather than viewing tasks that need to happen in close proximity in time (either serially or in parallel) as unrelated, simplifies work for staff and results in higher compliance with individual tasks. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Issues of Teaching Metrology in Higher Education Institutions of Civil Engineering in Russia

    ERIC Educational Resources Information Center

    Pukharenko, Yurii Vladimirovich; Norin, Veniamin Aleksandrovich

    2017-01-01

    The work analyses the training process condition in teaching the discipline "Metrology, Standardization, Certification and Quality Control." It proves that the current educational standard regarding the instruction of the discipline "Metrology, Standardization, Certification and Quality Control" does not meet the needs of the…

  20. Template for success: using a resident-designed sign-out template in the handover of patient care.

    PubMed

    Clark, Clancy J; Sindell, Sarah L; Koehler, Richard P

    2011-01-01

    Report our implementation of a standardized handover process in a general surgery residency program. The standardized handover process, sign-out template, method of implementation, and continuous quality improvement process were designed by general surgery residents with support of faculty and senior hospital administration using standard work principles and business models of the Virginia Mason Production System and the Toyota Production System. Nonprofit, tertiary referral teaching hospital. General surgery residents, residency faculty, patient care providers, and hospital administration. After instruction in quality improvement initiatives, a team of general surgery residents designed a sign-out process using an electronic template and standard procedures. The initial implementation phase resulted in 73% compliance. Using resident-driven continuous quality improvement processes, real-time feedback enabled residents to modify and improve this process, eventually attaining 100% compliance and acceptance by residents. The creation of a standardized template and protocol for patient handovers might eliminate communication failures. Encouraging residents to participate in this process can establish the groundwork for successful implementation of a standardized handover process. Integrating a continuous quality-improvement process into such an initiative can promote active participation of busy general surgery residents and lead to successful implementation of standard procedures. Copyright © 2011 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  1. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey.

    PubMed

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    The importance of quality assurance in the software development process cannot be overemphasized, because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, and process standardization and improvement, amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered quality planning, adherence to standardized processes, and the inherent challenges, this work has been extended to include quality control, software process improvement, and membership of international quality standards organizations. It also makes a comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. A qualitative research approach, specifically the use of questionnaire instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, and this can be a cause of low patronage. Moreover, software practitioners are aware neither of international standards organizations nor of the required process improvement techniques; as such, their claimed standards are not aligned with those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha and was found to be reliable.
For the software industry in developing countries to grow strong and become a viable source of external revenue, software assurance practices have to be taken seriously, because their effect is evident in the final product. Moreover, quality frameworks and tools that require minimal time and cost are highly needed in these countries.
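The internal-consistency check mentioned above can be sketched in a few lines. This is the generic textbook formula for Cronbach's alpha, not the authors' instrument, and the response matrix below is hypothetical:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 4 respondents x 3 items.
responses = [[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2]]
alpha = cronbach_alpha(responses)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency.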

  2. Quality Space and Launch Requirements Addendum to AS9100C

    DTIC Science & Technology

    2015-03-05

    [Search-result snippets only; no abstract is available for this record. Snippets: table-of-contents entries "8.9.1 Statistical Process Control (SPC) ... 45" and "8.9.1.1 Out of Control ..."; acronym-list entries SME (Subject Matter Expert), SOW (Statement of Work), SPC (Statistical Process Control), SPO (System Program Office), SRP (Standard Repair ...); and the passage "... individual data exceeding the control limits. Control limits are developed using standard statistical methods or other approved techniques and are based on ..."]
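As a rough illustration of the control-limit construction the excerpt refers to, here is a minimal Shewhart individuals-chart sketch. The moving-range estimator with the bias constant d2 = 1.128 is one common "standard statistical method"; the data are hypothetical, and this is not the addendum's actual procedure:

```python
def individuals_control_limits(data, sigma_mult=3.0):
    """Shewhart individuals chart: center line and +/- sigma_mult-sigma
    limits, with sigma estimated from the average moving range
    (bias constant d2 = 1.128 for subgroups of size 2)."""
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    sigma_hat = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    center = sum(data) / len(data)
    return (center - sigma_mult * sigma_hat,
            center,
            center + sigma_mult * sigma_hat)

def out_of_control(data):
    """Points outside the control limits (the simplest out-of-control rule)."""
    lcl, _, ucl = individuals_control_limits(data)
    return [x for x in data if x < lcl or x > ucl]
```

For the hypothetical series [10, 10.2, 9.8, 10.1, 9.9, 15], only the final point falls outside the limits.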

  3. 77 FR 41075 - Delegation of National Emission Standards for Hazardous Air Pollutants for Source Categories...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-12

    [Search-result snippet of a delegation table; no abstract is available for this record. The table marks delegation status ("X") for NESHAP source categories, including: ... and Sulfur Recovery Units; VVV, Publicly Owned Treatment Works; XXX, Ferroalloys ...; Clay Ceramics Manufacturing; LLLLL, Asphalt Roofing and Processing; MMMMM, Flexible ...; Area Source Standards for Aluminum, Copper, and Other Nonferrous Foundries; AAAAAAA, Asphalt Processing and ...]

  4. Implementation of standardized time limits in sickness insurance and return-to-work: experiences of four actors.

    PubMed

    Ståhl, Christian; Müssener, Ulrika; Svensson, Tommy

    2012-01-01

    In 2008, time limits were introduced in Swedish sickness insurance, comprising a pre-defined schedule for return-to-work. The purpose of this study was to explore the experienced consequences of these time limits. Sick-listed persons, physicians, insurance officials and employers were interviewed regarding the process of sick-listing, rehabilitation and return-to-work in relation to the reform. The study comprises qualitative interviews with 11 sick-listed persons, 4 insurance officials, 5 employers and 4 physicians (n = 24). Physicians, employers, and sick-listed persons described insurance officials as increasingly passive and reported that responsibility for the process was placed on the sick-listed. Several ethical dilemmas were identified in which officials were forced to act against their ethical principles. Insurance officials' principle of care often clashed with the standardization of the process, which is based on principles of egalitarianism and equal treatment. The cases reported in this study suggest that a policy for activation and early return-to-work has in some cases had the opposite effect: central actors remain passive, and responsibility is placed on the sick-listed, who lack the strength and knowledge to understand and navigate the system. The standardized insurance system thus promoted experiences of procedural injustice for both officials and sick-listed persons.

  5. Exploring the New Standards

    ERIC Educational Resources Information Center

    Willard, Ted; Pratt, Harold; Workosky, Cindy

    2012-01-01

    This is an exciting time to be in science education. New science standards are being developed by a group of science educators from across the country, working with 26 states in a process managed by Achieve, Inc., a non-profit education reform organization. The development of the "Next Generation Science Standards" (NGSS) promises to be the most…

  6. The HPS experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colaneri, Luca

    2017-04-01

    With the experimental discovery of the Higgs boson, the Standard Model has been considered verified in all its predictions. The Standard Model, though, is still considered an incomplete theory, because it fails to address many theoretical and phenomenological issues. Among those, it does not provide any viable Dark Matter candidate. Many Beyond-Standard-Model theories, such as the Supersymmetric Standard Model, provide possible solutions. In this work we report the experimental observations that led to the consideration of a new force, mediated by a new massive vector boson, that could address all the observed phenomenology. This new dark force could open an observational channel between the Standard Model and a new Dark Sector, conveyed by the interaction of the Standard Model photon with the massive dark photon, also called the A'. The purpose of this work was to develop an independent study of the background processes and the implementation of an independent event generator, to better understand the kinematics of the produced particles in the process e⁻ + W → e⁻ + W′ + e⁺ + e⁻ and to validate, or invalidate, the official event generator.

  7. Analytical and Experimental Performance Evaluation of BLE Neighbor Discovery Process Including Non-Idealities of Real Chipsets

    PubMed Central

    Perez-Diaz de Cerio, David; Hernández, Ángela; Valenzuela, Jose Luis; Valdovinos, Antonio

    2017-01-01

    The purpose of this paper is to evaluate from a real perspective the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have investigated the discovery process through analytical and simulation models, according to the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and correspond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on the discovery process performance. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been done by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage. PMID:28273801

  8. Analytical and Experimental Performance Evaluation of BLE Neighbor Discovery Process Including Non-Idealities of Real Chipsets.

    PubMed

    Perez-Diaz de Cerio, David; Hernández, Ángela; Valenzuela, Jose Luis; Valdovinos, Antonio

    2017-03-03

    The purpose of this paper is to evaluate from a real perspective the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have investigated the discovery process through analytical and simulation models, according to the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and correspond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on the discovery process performance. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been done by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage.
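The ideal scanning behavior that the paper's measurements depart from can be illustrated with a toy predicate. This is our own simplification (the scanner is assumed to listen at the start of every scan interval, with none of the real-chipset gaps the paper measures), not the authors' model:

```python
def discovered(adv_time_ms, scan_interval_ms, scan_window_ms):
    """Idealized discovery check: the scanner listens for scan_window_ms
    at the start of every scan_interval_ms period, with no extra gaps.
    An advertising packet is heard only if it lands inside a window."""
    return (adv_time_ms % scan_interval_ms) < scan_window_ms
```

The scanning gaps reported in the paper effectively shrink the usable window, so a packet that this idealized check accepts may still be missed by a real chipset.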

  9. A comparison of BPMN 2.0 with other notations for manufacturing processes

    NASA Astrophysics Data System (ADS)

    García-Domínguez, A.; Marcos, Mariano; Medina, I.

    2012-04-01

    In order to study their current practices and improve on them, manufacturing firms need to view their processes from several viewpoints at various abstraction levels. Several notations have been developed for this purpose, such as Value Stream Mappings or IDEF models. More recently, the BPMN 2.0 standard from the Object Management Group has been proposed for modeling business processes. A process organizes several activities (manual or automatic) into a single higher-level entity, which can be reused elsewhere in the organization. Its potential for standardizing business interactions is well-known, but there is little work on using BPMN 2.0 to model manufacturing processes. In this work some of the previous notations are outlined and BPMN 2.0 is positioned among them after discussing it in more depth. Some guidelines on using BPMN 2.0 for manufacturing are offered, and its advantages and disadvantages in comparison with the other notations are presented.

  10. The Future of Geospatial Standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. 
This presentation will provide an analysis of the work advanced in the OGC consortium, including standards and testbeds, from which we can extract trends for the future of geospatial standards. We see a number of key elements in focus, but simultaneously a broadening of standards to address the needs of particular communities.

  11. National Standards for United States History: Exploring the American Experience. Grades 5-12. Expanded Edition. Including Examples of Student Achievement.

    ERIC Educational Resources Information Center

    Crabtree, Charlotte; Nash, Gary B.

    Developed through a broad-based national consensus building process, the National History Standards project has involved working toward agreement both on the larger purposes of history in the school curriculum and on the more specific history understandings and thinking processes that all students should have equal opportunity to acquire over 12…

  12. National Standards for World History: Exploring Paths to the Present. Grades 5-12, Expanded Edition. Including Examples of Student Achievement.

    ERIC Educational Resources Information Center

    Crabtree, Charlotte; Nash, Gary B.

    Developed through a broad based national consensus building process, the National History Standards project has involved working toward agreement both on the larger purposes of history in the school curriculum and on the more specific history understandings and thinking processes that all students should have equal opportunity to acquire over 12…

  13. Standardization of uveitis nomenclature for reporting clinical data. Results of the First International Workshop.

    PubMed

    Jabs, Douglas A; Nussenblatt, Robert B; Rosenbaum, James T

    2005-09-01

    To begin a process of standardizing the methods for reporting clinical data in the field of uveitis. Consensus workshop. Members of an international working group were surveyed about diagnostic terminology, inflammation grading schema, and outcome measures, and the results were used to develop a series of proposals to better standardize the use of these entities. Small groups employed nominal group techniques to achieve consensus on several of these issues. The group affirmed that an anatomic classification of uveitis should be used as a framework for subsequent work on diagnostic criteria for specific uveitic syndromes, and that the classification of uveitis entities should be based on the location of the inflammation and not on the presence of structural complications. Issues regarding the use of the terms "intermediate uveitis," "pars planitis," "panuveitis," and descriptors of the onset and course of the uveitis were addressed. The following were adopted: standardized grading schema for anterior chamber cells, anterior chamber flare, and vitreous haze; standardized methods of recording structural complications of uveitis; standardized definitions of outcomes, including "inactive" inflammation, "improvement" and "worsening" of the inflammation, and "corticosteroid sparing"; and standardized guidelines for reporting visual acuity outcomes. A process of standardizing the approach to reporting clinical data in uveitis research has begun, and several terms have been standardized.

  14. Scheduling on the basis of the research of dependences among the construction process parameters

    NASA Astrophysics Data System (ADS)

    Romanovich, Marina; Ermakov, Alexander; Mukhamedzhanova, Olga

    2017-10-01

    The dependences among the construction process parameters are investigated in the article: the average integrated qualification of the shift, the number of workers per shift, and the average daily amount of completed work are considered on the basis of the correlation coefficient. Basic data for the research of dependences among the above-stated parameters were collected during the construction of two standard objects, A and B (monolithic houses), over four months of construction (October, November, December, January). Correlation coefficients close to 1 supported the use of the Cobb-Douglas production function, which is simple to use and well suited to describing the considered dependences. A development function describing the relationship among the considered parameters of the construction process is derived. The development function makes it possible to select the optimal quantitative and qualitative (qualification) composition of the brigade link for work during the next period of time, according to a preset amount of work. A function of the optimized amounts of work, reflecting the interrelation of the key parameters of the construction process, is also developed; its values should be used as the average standard for scheduling the storming periods of construction.
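A Cobb-Douglas fit of this kind can be sketched as a log-linear least-squares problem. The crew-size, qualification, and work-volume figures below are hypothetical, not the data from objects A and B:

```python
import numpy as np

# Hypothetical daily records: crew size L, average qualification Q,
# and completed work volume Y (illustrative units, not the paper's data).
L = np.array([4.0, 5.0, 6.0, 6.0, 8.0, 10.0])
Q = np.array([3.0, 3.2, 3.1, 3.5, 3.4, 3.6])
Y = np.array([20.0, 26.0, 30.0, 34.0, 42.0, 55.0])

# Cobb-Douglas: Y = A * L**alpha * Q**beta.  Taking logarithms gives the
# linear model ln Y = ln A + alpha*ln L + beta*ln Q, solved by least squares.
X = np.column_stack([np.ones_like(L), np.log(L), np.log(Q)])
(lnA, alpha, beta), *_ = np.linalg.lstsq(X, np.log(Y), rcond=None)

def predicted_work(crew, qual):
    """Predicted daily work volume for a given crew size and qualification."""
    return np.exp(lnA) * crew**alpha * qual**beta
```

Inverting `predicted_work` for a target volume is what lets a scheduler pick a crew composition for a preset amount of work, as the abstract describes.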

  15. Work flow of signal processing data of ground penetrating radar case of rigid pavement measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Handayani, Gunawan

    The signal processing of Ground Penetrating Radar (GPR) requires a certain work flow to obtain good results. Even though Ground Penetrating Radar data look similar to seismic reflection data, GPR data have particular signatures that seismic reflection data do not have. This has to do with the coupling between the antennae and the ground surface. Because of this, GPR data should be treated differently from the seismic reflection data processing work flow, even though most of the processing steps still follow the same work flow as seismic reflection data, such as filtering, predictive deconvolution, etc. This paper presents the work flow of GPR data processing for rigid pavement measurements. The processing steps start from the raw data with the de-wow process and DC removal, and continue with the standard process to get rid of noise, i.e., filtering. Some particular radargram features of rigid pavement along with pile foundations are presented.
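The de-wow and DC-removal steps at the start of such a work flow can be sketched as simple per-trace filters. The moving-average background used for de-wow, and the window length, are illustrative choices, not the paper's parameterization:

```python
import numpy as np

def remove_dc(trace):
    """DC removal: subtract the mean amplitude of the trace."""
    trace = np.asarray(trace, dtype=float)
    return trace - trace.mean()

def dewow(trace, window=9):
    """De-wow: subtract a running mean from the trace to suppress the
    low-frequency 'wow' caused by antenna-ground coupling.  A plain
    moving-average background is used here as a sketch."""
    trace = np.asarray(trace, dtype=float)
    kernel = np.ones(window) / window
    background = np.convolve(trace, kernel, mode="same")
    return trace - background
```

Both filters would be applied trace by trace across the radargram before band-pass filtering and deconvolution.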

  16. Aligning Your Curriculum to the Common Core State Standards

    ERIC Educational Resources Information Center

    Crawford, Joe

    2011-01-01

    Now that most states have adopted the new Common Core State Standards, the next major challenge is to simplify and implement them by 2014. That is why it is important to begin this work now. Joe Crawford, Milken Award-winning educator and author of "Using Power Standards to Build an Aligned Curriculum", shares his proven process for…

  17. Ride quality criteria and the design process. [standards for ride comfort

    NASA Technical Reports Server (NTRS)

    Ravera, R. J.

    1975-01-01

    Conceptual designs for advanced ground transportation systems often hinge on obtaining acceptable vehicle ride quality while attempting to keep the total guideway cost (initial and subsequent maintenance) as low as possible. Two ride quality standards used extensively in work sponsored by the U.S. Department of Transportation (DOT) are the DOT-Urban Tracked Air Cushion Vehicle (UTACV) standard and the International Standards Organization (ISO) reduced ride comfort criteria. These standards are reviewed and some of the deficiencies, which become apparent when trying to apply them in practice, are noted. Through the use of a digital simulation, the impact of each of these standards on an example design process is examined. It is shown that meeting the ISO specification for the particular vehicle/guideway case investigated is easier than meeting the UTACV standard.

  18. Cerebellar Damage Produces Selective Deficits in Verbal Working Memory

    ERIC Educational Resources Information Center

    Ravizza, Susan M.; Mccormick, Cristin A.; Schlerf, John E.; Justus, Timothy; Ivry, Richard B.; Fiez, Julie A.

    2006-01-01

    The cerebellum is often active in imaging studies of verbal working memory, consistent with a putative role in articulatory rehearsal. While patients with cerebellar damage occasionally exhibit a mild impairment on standard neuropsychological tests of working memory, these tests are not diagnostic for exploring these processes in detail. The…

  19. Ethical Issues in the Research of Group Work

    ERIC Educational Resources Information Center

    Goodrich, Kristopher M.; Luke, Melissa

    2017-01-01

    This article provides a primer for researchers exploring ethical issues in the research of group work. The article begins with an exploration of relevant ethical issues through the research process and current standards guiding its practice. Next, the authors identify resources that group work researchers can consult prior to constructing their…

  20. Corrected simulations for one-dimensional diffusion processes with naturally occurring boundaries.

    PubMed

    Shafiey, Hassan; Gan, Xinjun; Waxman, David

    2017-11-01

    To simulate a diffusion process, a usual approach is to discretize the time in the associated stochastic differential equation. This is the approach used in the Euler method. In the present work we consider a one-dimensional diffusion process where the terms occurring within the stochastic differential equation prevent the process from entering a region. The outcome is a naturally occurring boundary (which may be absorbing or reflecting). A complication occurs in a simulation of this situation. The term involving a random variable, within the discretized stochastic differential equation, may take a trajectory across the boundary into a "forbidden region." The naive way of dealing with this problem, which we refer to as the "standard" approach, is simply to reset the trajectory to the boundary, based on the argument that crossing the boundary actually signifies achieving the boundary. In this work we show, within the framework of the Euler method, that such resetting introduces a spurious force into the original diffusion process. This force may have a significant influence on trajectories that come close to a boundary. We propose a corrected numerical scheme for simulating one-dimensional diffusion processes with naturally occurring boundaries. This involves correcting the standard approach so that an exact property of the diffusion process is precisely respected. As a consequence, the proposed scheme does not introduce a spurious force into the dynamics. We present numerical test cases, based on exactly soluble one-dimensional problems with one or two boundaries, which suggest that, for a given value of the discrete time step, the proposed scheme leads to substantially more accurate results than the standard approach. Alternatively, the standard approach needs considerably more computation time to obtain a comparable level of accuracy, because it requires a significantly smaller time step.

  1. Corrected simulations for one-dimensional diffusion processes with naturally occurring boundaries

    NASA Astrophysics Data System (ADS)

    Shafiey, Hassan; Gan, Xinjun; Waxman, David

    2017-11-01

    To simulate a diffusion process, a usual approach is to discretize the time in the associated stochastic differential equation. This is the approach used in the Euler method. In the present work we consider a one-dimensional diffusion process where the terms occurring within the stochastic differential equation prevent the process from entering a region. The outcome is a naturally occurring boundary (which may be absorbing or reflecting). A complication occurs in a simulation of this situation. The term involving a random variable, within the discretized stochastic differential equation, may take a trajectory across the boundary into a "forbidden region." The naive way of dealing with this problem, which we refer to as the "standard" approach, is simply to reset the trajectory to the boundary, based on the argument that crossing the boundary actually signifies achieving the boundary. In this work we show, within the framework of the Euler method, that such resetting introduces a spurious force into the original diffusion process. This force may have a significant influence on trajectories that come close to a boundary. We propose a corrected numerical scheme for simulating one-dimensional diffusion processes with naturally occurring boundaries. This involves correcting the standard approach so that an exact property of the diffusion process is precisely respected. As a consequence, the proposed scheme does not introduce a spurious force into the dynamics. We present numerical test cases, based on exactly soluble one-dimensional problems with one or two boundaries, which suggest that, for a given value of the discrete time step, the proposed scheme leads to substantially more accurate results than the standard approach. Alternatively, the standard approach needs considerably more computation time to obtain a comparable level of accuracy, because it requires a significantly smaller time step.
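The "standard" resetting approach that these abstracts critique can be sketched within the Euler (Euler-Maruyama) framework. The Wright-Fisher-like noise term, step size, and seed below are illustrative assumptions rather than the paper's test cases, and the paper's corrected scheme itself is not reproduced here:

```python
import math
import random

def euler_with_reset(x0, drift, noise, dt, steps, lo=0.0, hi=1.0, seed=0):
    """Euler-Maruyama integration with the 'standard' fix: any step that
    crosses a natural boundary is reset to the boundary.  (The paper
    argues this resetting introduces a spurious force near boundaries.)"""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        dw = rng.gauss(0.0, math.sqrt(dt))       # Brownian increment
        x = x + drift(x) * dt + noise(x) * dw    # Euler-Maruyama step
        x = min(max(x, lo), hi)                  # reset if boundary crossed
    return x

# Illustrative example: a driftless Wright-Fisher-like diffusion with
# noise sigma(x) = sqrt(x(1-x)), which has natural boundaries at 0 and 1.
final = euler_with_reset(
    x0=0.5,
    drift=lambda x: 0.0,
    noise=lambda x: math.sqrt(max(x * (1.0 - x), 0.0)),
    dt=1e-3,
    steps=1000,
)
```

The clamping line is exactly the step the authors identify as the source of the spurious force; their corrected scheme replaces it so that an exact property of the diffusion is respected.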

  2. Building the United States National Vegetation Classification

    USGS Publications Warehouse

    Franklin, S.B.; Faber-Langendoen, D.; Jennings, M.; Keeler-Wolf, T.; Loucks, O.; Peet, R.; Roberts, D.; McKerrow, A.

    2012-01-01

    The Federal Geographic Data Committee (FGDC) Vegetation Subcommittee, the Ecological Society of America Panel on Vegetation Classification, and NatureServe have worked together to develop the United States National Vegetation Classification (USNVC). The current standard was accepted in 2008 and fosters consistency across Federal agencies and non-federal partners for the description of each vegetation concept and its hierarchical classification. The USNVC is structured as a dynamic standard, where changes to types at any level may be proposed at any time as new information comes in. But, because much information already exists from previous work, the NVC partners first established methods for screening existing types to determine their acceptability with respect to the 2008 standard. Current efforts include a screening process to assign confidence to Association and Group level descriptions, and a review of the upper three levels of the classification. For the upper levels especially, the expectation is that the review process includes international scientists. Immediate future efforts include the review of remaining levels and the development of a proposal review process.

  3. Development of a standardized, citywide process for managing smart-pump drug libraries.

    PubMed

    Walroth, Todd A; Smallwood, Shannon; Arthur, Karen; Vance, Betsy; Washington, Alana; Staublin, Therese; Haslar, Tammy; Reddan, Jennifer G; Fuller, James

    2018-06-15

    Development and implementation of an interprofessional consensus-driven process for review and optimization of smart-pump drug libraries and dosing limits are described. The Indianapolis Coalition for Patient Safety (ICPS), which represents 6 Indianapolis-area health systems, identified an opportunity to reduce clinically insignificant alerts that smart infusion pumps present to end users. Through a consensus-driven process, ICPS aimed to identify best practices to implement at individual hospitals in order to establish specific action items for smart-pump drug library optimization. A work group of pharmacists, nurses, and industrial engineers met to evaluate variability within and lack of scrutiny of smart-pump drug libraries. The work group used Lean Six Sigma methodologies to generate a list of key needs and barriers to be addressed in process standardization. The group reviewed targets for smart-pump drug library optimization, including dosing limits, types of alerts reviewed, policies, and safety best practices. The work group also analyzed existing processes at each site to develop a final consensus statement outlining a model process for reviewing alerts and managing smart-pump data. Analysis of the total number of alerts per device across ICPS-affiliated health systems over a 4-year period indicated a 50% decrease (from 7.2 to 3.6 alerts per device per month) after implementation of the model by ICPS member organizations. Through implementation of a standardized, consensus-driven process for smart-pump drug library optimization, ICPS member health systems reduced clinically insignificant smart-pump alerts. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  4. Review of the transportation planning process in the Denver metropolitan area

    DOT National Transportation Integrated Search

    2012-11-01

    Harmonization Task Groups 1 and 3 (HTG1 and 3) were established by the EU-US International Standards Harmonization Working Group to attempt to harmonize standards (including ISO, CEN, ETSI, IEEE) on security (HTG1) and communications protocols (HTG3)...

  5. Review of the transportation planning process in the Pittsburgh metropolitan area

    DOT National Transportation Integrated Search

    2012-11-12

    Harmonization Task Groups 1 and 3 (HTG1 and 3) were established by the EU-US International Standards Harmonization Working Group to attempt to harmonize standards (including ISO, CEN, ETSI, IEEE) on security (HTG1) and communications protocols (HTG3)...

  6. There's gold in them thar' databases.

    PubMed

    Gillespie, G

    2000-11-01

    Some health care organizations are using sophisticated data mining applications to unearth hidden truths buried in their online clinical and financial information. But the lack of a standard clinical vocabulary and standard work processes is an obstacle CIOs must blast through to reach their treasure.

  7. [Work process and working conditions in poultry processing plants: report of a survey on occupational health surveillance].

    PubMed

    Oliveira, Paulo Antonio Barros; Mendes, Jussara Maria Rosa

    2014-12-01

    This article presents the report of a survey on health surveillance activities performed in poultry processing plants in the south of Brazil. It aims to contribute to an understanding of the work process developed, the growth of the sector, the organization of labor and the confrontation with the economic model of this sector, which has been exposing employees to working conditions that undermine their health. The working conditions identified are considered largely incompatible with health and human dignity. The study supports interinstitutional intervention, especially with the Public Ministry of Labor, criticizes the weak implementation of specific government interventions in health conditions in the industry and introduces the new Regulatory Standard 36 as a positive perspective for the near future.

  8. A Framework for Comprehensive Health Terminology Systems in the United States

    PubMed Central

    Chute, Christopher G.; Cohn, Simon P.; Campbell, James R.

    1998-01-01

    Health care in the United States has become an information-intensive industry, yet electronic health records represent patient data inconsistently for lack of clinical data standards. Classifications that have achieved common acceptance, such as the ICD-9-CM or ICD, aggregate heterogeneous patients into broad categories, which preclude their practical use in decision support, development of refined guidelines, or detailed comparison of patient outcomes or benchmarks. This document proposes a framework for the integration and maturation of clinical terminologies that would have practical applications in patient care, process management, outcome analysis, and decision support. Arising from the two working groups within the standards community—the ANSI (American National Standards Institute) Healthcare Informatics Standards Board Working Group and the Computer-based Patient Records Institute Working Group on Codes and Structures—it outlines policies regarding 1) functional characteristics of practical terminologies, 2) terminology models that can broaden their applications and contribute to their sustainability, 3) maintenance attributes that will enable terminologies to keep pace with rapidly changing health care knowledge and process, and 4) administrative issues that would facilitate their accessibility, adoption, and application to improve the quality and efficiency of American health care. PMID:9824798

  9. Reporting standards for studies of diagnostic test accuracy in dementia

    PubMed Central

    Noel-Storr, Anna H.; McCleery, Jenny M.; Richard, Edo; Ritchie, Craig W.; Flicker, Leon; Cullum, Sarah J.; Davis, Daniel; Quinn, Terence J.; Hyde, Chris; Rutjes, Anne W.S.; Smailagic, Nadja; Marcus, Sue; Black, Sandra; Blennow, Kaj; Brayne, Carol; Fiorivanti, Mario; Johnson, Julene K.; Köpke, Sascha; Schneider, Lon S.; Simmons, Andrew; Mattsson, Niklas; Zetterberg, Henrik; Bossuyt, Patrick M.M.; Wilcock, Gordon

    2014-01-01

    Objective: To provide guidance on standards for reporting studies of diagnostic test accuracy for dementia disorders. Methods: An international consensus process on reporting standards in dementia and cognitive impairment (STARDdem) was established, focusing on studies presenting data from which sensitivity and specificity were reported or could be derived. A working group led the initiative through 4 rounds of consensus work, using a modified Delphi process and culminating in a face-to-face consensus meeting in October 2012. The aim of this process was to agree on how best to supplement the generic standards of the STARD statement to enhance their utility and encourage their use in dementia research. Results: More than 200 comments were received during the wider consultation rounds. The areas at most risk of inadequate reporting were identified and a set of dementia-specific recommendations to supplement the STARD guidance were developed, including better reporting of patient selection, the reference standard used, avoidance of circularity, and reporting of test-retest reliability. Conclusion: STARDdem is an implementation of the STARD statement in which the original checklist is elaborated and supplemented with guidance pertinent to studies of cognitive disorders. Its adoption is expected to increase transparency, enable more effective evaluation of diagnostic tests in Alzheimer disease and dementia, contribute to greater adherence to methodologic standards, and advance the development of Alzheimer biomarkers. PMID:24944261

  10. 1996 DOE technical standards program workshop: Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-07-01

The workshop theme is 'The Strategic Standardization Initiative - A Technology Exchange and Global Competitiveness Challenge for DOE.' The workshop goal is to inform the DOE technical standards community of strategic standardization activities taking place in the Department, other Government agencies, standards developing organizations, and industry. Individuals working on technical standards will be challenged to improve cooperation and communications with the involved organizations in response to the initiative. Workshop sessions include presentations by representatives from various Government agencies that focus on coordination among and participation of Government personnel in the voluntary standards process; reports by standards organizations, industry, and DOE representatives on current technology exchange programs; and how the road ahead appears for 'information superhighway' standardization. Another session highlights successful standardization case studies selected from several sites across the DOE complex. The workshop concludes with a panel discussion on the goals and objectives of the DOE Technical Standards Program as envisioned by senior DOE management. The annual workshop on technical standards has proven to be an effective medium for communicating information related to standards throughout the DOE community. Technical standards are used to transfer technology and standardize work processes to produce consistent, acceptable results. They provide a practical solution to the Department's challenge to protect the environment and the health and safety of the public and workers during all facility operations. Through standards, the technologies of industries and governments worldwide are available to DOE. The DOE Technical Standards Program, a Department-wide effort that crosscuts all organizations and disciplines, links the Department to those technologies.

  11. Neural systems and time course of proactive interference in working memory.

    PubMed

    Du, Yingchun; Zhang, John X; Xiao, Zhuangwei; Wu, Renhua

    2007-01-01

The storage of information in working memory suffers as a function of proactive interference. Many neuroimaging studies have investigated the brain mechanisms of interference resolution; however, less is known about the time course of this process. The event-related potential (ERP) method and standardized Low Resolution Brain Electromagnetic Tomography (sLORETA) were used in this study to uncover the time course of interference resolution in working memory. The anterior P2 was thought to reflect interference resolution; if so, this process occurs earlier in working memory than in long-term memory.

  12. Review of the transportation planning process in the Portland, Oregon, metropolitan area

    DOT National Transportation Integrated Search

    2012-11-01

    Harmonization Task Groups 1 and 3 (HTG1 and 3) were established by the EU-US International Standards Harmonization Working Group to attempt to harmonize standards (including ISO, CEN, ETSI, IEEE) on security (HTG1) and communications protocols (HTG3)...

  13. Review of the transportation planning process in the southern California metropolitan area

    DOT National Transportation Integrated Search

    2012-11-01

    Harmonization Task Groups 1 and 3 (HTG1 and 3) were established by the EU-US International Standards Harmonization Working Group to attempt to harmonize standards (including ISO, CEN, ETSI, IEEE) on security (HTG1) and communications protocols (HTG3)...

  14. Review of the transportation planning process in the Minneapolis-St. Paul metropolitan area

    DOT National Transportation Integrated Search

    2012-11-01

    Harmonization Task Groups 1 and 3 (HTG1 and 3) were established by the EU-US International Standards Harmonization Working Group to attempt to harmonize standards (including ISO, CEN, ETSI, IEEE) on security (HTG1) and communications protocols (HTG3)...

  15. Progress on uncooled PbSe detectors for low-cost applications

    NASA Astrophysics Data System (ADS)

    Vergara, German; Gomez, Luis J.; Villamayor, Victor; Alvarez, M.; Rodrigo, Maria T.; del Carmen Torquemada, Maria; Sanchez, Fernando J.; Verdu, Marina; Diezhandino, Jorge; Rodriguez, Purificacion; Catalan, Irene; Almazan, Rosa; Plaza, Julio; Montojo, Maria T.

    2004-08-01

This work reports on progress in the development of polycrystalline PbSe infrared detectors at the Centro de Investigacion y Desarrollo de la Armada (CIDA). Since the mid-nineties, CIDA has owned an innovative technology for processing uncooled MWIR detectors of polycrystalline PbSe. Based on this technology, some applications have been developed. However, future applications demand smarter, more complex, faster, yet cheaper detectors. Aiming to open new perspectives for polycrystalline PbSe detectors, we are currently working in different directions: 1) Processing of 2D arrays: a) Designing and processing low-density x-y addressed arrays with 16x16 and 32x32 elements, as an extension of our standard technology. b) Trying to make standard CMOS and polycrystalline PbSe technologies compatible in order to process monolithic large-format arrays. 2) Adding new features to the detector, such as monolithically integrated spectral discrimination.

  16. Making working memory work: The effects of extended practice on focus capacity and the processes of updating, forward access, and random access

    PubMed Central

    Price, John M.; Colflesh, Gregory J. H.; Cerella, John; Verhaeghen, Paul

    2014-01-01

We investigated the effects of 10 hours of practice on variations of the N-Back task to examine the processes underlying possible expansion of the focus of attention within working memory. Using subtractive logic, we showed that random access (i.e., Sternberg-like search) yielded a modest effect (a 50% increase in speed), whereas the processes of forward access (i.e., retrieval in order, as in a standard N-Back task) and updating (i.e., changing the contents of working memory) were executed about 5 times faster after extended practice. We additionally found that extended practice increased working memory capacity as measured by the size of the focus of attention for the forward-access task, but not for variations where probing was in random order. This suggests that working memory capacity may depend on the type of search process engaged, and that certain working-memory-related cognitive processes are more amenable to practice than others. PMID:24486803

  17. 40 CFR 63.7499 - What are the subcategories of boilers and process heaters?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 13 2010-07-01 2010-07-01 false What are the subcategories of boilers..., and Institutional Boilers and Process Heaters Emission Limits and Work Practice Standards § 63.7499 What are the subcategories of boilers and process heaters? The subcategories of boilers and process...

  18. Simulation of springback and microstructural analysis of dual phase steels

    NASA Astrophysics Data System (ADS)

    Kalyan, T. Sri.; Wei, Xing; Mendiguren, Joseba; Rolfe, Bernard

    2013-12-01

With increasing demand for weight reduction and better crashworthiness in car development, advanced high-strength Dual Phase (DP) steels have been progressively used in making automotive parts. Higher-strength steels exhibit higher springback and lower dimensional accuracy after stamping. This has necessitated the simulation of each stamped component prior to production to estimate the part's dimensional accuracy. Understanding the micro-mechanical behaviour of AHSS sheet may provide more accuracy to stamping simulations. This work divides into two parts: first, modelling a standard channel forming process; second, modelling the microstructure of the process. The standard top-hat channel forming process, benchmark NUMISHEET'93, is used for investigating the springback of WISCO Dual Phase steels. The second part of this work comprises finite element analysis of microstructures to understand the behaviour of the multi-phase steel at a more fundamental level. The outcomes of this work will help in the dimensional control of steels during the manufacturing stage based on the material's microstructure.

  19. Thermal stress, human performance, and physical employment standards.

    PubMed

    Cheung, Stephen S; Lee, Jason K W; Oksa, Juha

    2016-06-01

    Many physically demanding occupations in both developed and developing economies involve exposure to extreme thermal environments that can affect work capacity and ultimately health. Thermal extremes may be present in either an outdoor or an indoor work environment, and can be due to a combination of the natural or artificial ambient environment, the rate of metabolic heat generation from physical work, processes specific to the workplace (e.g., steel manufacturing), or through the requirement for protective clothing impairing heat dissipation. Together, thermal exposure can elicit acute impairment of work capacity and also chronic effects on health, greatly contributing to worker health risk and reduced productivity. Surprisingly, in most occupations even in developed economies, there are rarely any standards regarding enforced heat or cold safety for workers. Furthermore, specific physical employment standards or accommodations for thermal stressors are rare, with workers commonly tested under near-perfect conditions. This review surveys the major occupational impact of thermal extremes and existing employment standards, proposing guidelines for improvement and areas for future research.

  20. Using the virtual reality device Oculus Rift for neuropsychological assessment of visual processing capabilities

    PubMed Central

    Foerster, Rebecca M.; Poth, Christian H.; Behler, Christian; Botsch, Mario; Schneider, Werner X.

    2016-01-01

Neuropsychological assessment of human visual processing capabilities strongly depends on visual testing conditions, including room lighting, stimuli, and viewing distance. This limits standardization, threatens reliability, and prevents the assessment of core visual functions such as visual processing speed. Increasingly available virtual reality devices make it possible to address these problems. One such device is the portable, light-weight, and easy-to-use Oculus Rift. It is head-mounted and covers the entire visual field, thereby shielding and standardizing the visual stimulation. A fundamental prerequisite for using Oculus Rift for neuropsychological assessment is sufficient test-retest reliability. Here, we compare the test-retest reliabilities of Bundesen’s visual processing components (visual processing speed, threshold of conscious perception, capacity of visual working memory) as measured with Oculus Rift and a standard CRT computer screen. Our results show that Oculus Rift measures the processing components as reliably as the standard CRT. This means that Oculus Rift is applicable for standardized and reliable assessment and diagnosis of elementary cognitive functions in laboratory and clinical settings. Oculus Rift thus provides the opportunity to compare visual processing components between individuals and institutions and to establish statistical norm distributions. PMID:27869220

  1. Using the virtual reality device Oculus Rift for neuropsychological assessment of visual processing capabilities.

    PubMed

    Foerster, Rebecca M; Poth, Christian H; Behler, Christian; Botsch, Mario; Schneider, Werner X

    2016-11-21

Neuropsychological assessment of human visual processing capabilities strongly depends on visual testing conditions, including room lighting, stimuli, and viewing distance. This limits standardization, threatens reliability, and prevents the assessment of core visual functions such as visual processing speed. Increasingly available virtual reality devices make it possible to address these problems. One such device is the portable, light-weight, and easy-to-use Oculus Rift. It is head-mounted and covers the entire visual field, thereby shielding and standardizing the visual stimulation. A fundamental prerequisite for using Oculus Rift for neuropsychological assessment is sufficient test-retest reliability. Here, we compare the test-retest reliabilities of Bundesen's visual processing components (visual processing speed, threshold of conscious perception, capacity of visual working memory) as measured with Oculus Rift and a standard CRT computer screen. Our results show that Oculus Rift measures the processing components as reliably as the standard CRT. This means that Oculus Rift is applicable for standardized and reliable assessment and diagnosis of elementary cognitive functions in laboratory and clinical settings. Oculus Rift thus provides the opportunity to compare visual processing components between individuals and institutions and to establish statistical norm distributions.

  2. Standardization of XML Database Exchanges and the James Webb Space Telescope Experience

    NASA Technical Reports Server (NTRS)

    Gal-Edd, Jonathan; Detter, Ryan; Jones, Ron; Fatig, Curtis C.

    2007-01-01

Personnel from the National Aeronautics and Space Administration (NASA) James Webb Space Telescope (JWST) Project have been working with various standards communities, such as the Object Management Group (OMG) and the Consultative Committee for Space Data Systems (CCSDS), to assist in the definition of a common eXtensible Markup Language (XML) database exchange format. The CCSDS and OMG standards are intended for the exchange of core command and telemetry information, not for all database information needed to exercise a NASA space mission. The mission-specific database, containing all the information needed for a space mission, is translated from/to the standard using a translator. The standard is meant to provide a system that encompasses 90% of the information needed for command and telemetry processing. This paper discusses standardization of the XML database exchange format, the tools used, and the JWST experience, as well as future work with XML standards groups, both commercial and government.

  3. Evaluating Faculty Work: Expectations and Standards of Faculty Performance in Research Universities

    ERIC Educational Resources Information Center

    Hardre, Patricia; Cox, Michelle

    2009-01-01

    Expectations and the way they are communicated can influence employees' motivation and performance. Previous research has demonstrated individual effects of workplace climate and individual differences on faculty productivity. The present study focused on the characteristics of institutional performance standards, evaluation processes and…

  4. National Standards for History for Grades K-4: Expanding Children's World in Time and Space. Expanded Edition. Including Examples of Student Achievement for Grades K-2 and 3-4.

    ERIC Educational Resources Information Center

    Crabtree, Charlotte; Nash, Gary B.

    Developed through a broad-based national consensus building process, the national history standards project has involved working toward agreement both on the larger purposes of history in the school curriculum and on the more specific history understandings and thinking processes all students should have equal opportunity to acquire over 12 years…

  5. An evaluation of nonclinical dissociation utilizing a virtual environment shows enhanced working memory and attention.

    PubMed

    Saidel-Goley, Isaac N; Albiero, Erin E; Flannery, Kathleen A

    2012-02-01

    Dissociation is a mental process resulting in the disruption of memory, perception, and sometimes identity. At a nonclinical level, only mild dissociative experiences occur. The nature of nonclinical dissociation is disputed in the literature, with some asserting that it is a beneficial information processing style and others positing that it is a psychopathological phenomenon. The purpose of this study was to further the understanding of nonclinical dissociation with respect to memory and attention, by including a more ecologically valid virtual reality (VR) memory task along with standard neuropsychological tasks. Forty-five undergraduate students from a small liberal arts college in the northeast participated for course credit. The participants completed a battery of tasks including two standard memory tasks, a standard attention task, and an experimental VR memory task; the VR task included immersion in a virtual apartment, followed by incidental object-location recall for objects in the virtual apartment. Support for the theoretical model portraying nonclinical dissociation as a beneficial information processing style was found in this study. Dissociation scores were positively correlated with working memory scores and attentional processing scores on the standard neuropsychological tasks. In terms of the VR task, dissociation scores were positively correlated with more false positive memories that could be the result of a tendency of nonclinical highly dissociative individuals to create more elaborative schemas. This study also demonstrates that VR paradigms add to the prediction of cognitive functioning in testing protocols using standard neuropsychological tests, while simultaneously increasing ecological validity.

  6. 75 FR 52701 - Approval and Promulgation of Implementation Plans; State of Missouri

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-27

    ... information claimed to be Confidential Business Information (CBI) or other information whose disclosure is.... Ventilation Limits 5. Ongoing Ventilation Testing and Reporting Requirements 6. Winter Construction Work..., including building enclosure and ventilation projects, implementation of work practice standards, process...

  7. Studies of the physical, yield and failure behavior of aliphatic polyketones

    NASA Astrophysics Data System (ADS)

    Karttunen, Nicole Renee

This thesis describes an investigation into the multiaxial yield and failure behavior of an aliphatic polyketone terpolymer. The behavior is studied as a function of: stress state, strain rate, temperature, and sample processing conditions. Results of this work include: elucidation of the behavior of a recently commercialized polymer, increased understanding of the effects listed above, insight into the effects of processing conditions on the morphology of the polyketone, and a description of yield strength of this material as a function of stress state, temperature, and strain rate. The first portion of work focuses on the behavior of a set of samples that are extruded under "common" processing conditions. Following this reference set of tests, the effect of testing this material at different temperatures is studied. A total of four different temperatures are examined. In addition, the effect of altering strain rate is examined. Testing is performed under pseudo-strain rate control at constant nominal octahedral shear strain rate for each failure envelope. A total of three different rates are studied. An extension of the first portion of work involves modeling the yield envelope. This is done by combining two approaches: continuum level and molecular level. The use of both methods allows the description of the yield envelope as a function of stress state, strain rate and temperature. The second portion of work involves the effects of processing conditions. For this work, additional samples are extruded with different shear and thermal histories than the "standard" material. One set of samples is processed with shear rates higher and lower than the standard. A second set is processed at higher and lower cooling rates than the standard. In order to understand the structural cause for changes in behavior with processing conditions, morphological characterization is performed on these samples. In particular, the effect on spherulitic structure is important. Residual stresses are also determined to be important to the behavior of the samples. Finally, an investigation into the crystalline structure of a family of aliphatic polyketones is performed. The effects of side group concentration and size are described.

  8. Software Development Standard Processes (SDSP)

    NASA Technical Reports Server (NTRS)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.

    2011-01-01

A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  9. [The design and implementation of DICOM self-help film printing system].

    PubMed

    Wang, Xiaodong; Jiang, Mowen

    2013-09-01

This article focuses on the design and implementation of a self-service film printing system based on the DICOM standard. In accordance with the DICOM standard and the working process of the radiology department, the system realizes self-service film printing as well as monitoring and management of the film printing business.

  10. Travaux Neuchatelois de Linguistique (TRANEL) (Neuchatel Working Papers in Linguistics), Volume 14.

    ERIC Educational Resources Information Center

    Py, Bernard, Ed.; Rubattel, Christian, Ed.

    1989-01-01

    Three papers in linguistics, all in French, are presented. "La delocutivite lexicale en francais standard: esquisse d'un modele derivationnel" ("Lexical Delocutivity in Standard French: Sketch of a Derivational Model"), by Marc Bonhomme, examines the process by which certain expressions become neologisms. "La terminologie…

  11. From Covert Processes to Overt Outcomes of Refutation Text Reading: The Interplay of Science Text Structure and Working Memory Capacity through Eye Fixations

    ERIC Educational Resources Information Center

    Ariasi, Nicola; Mason, Lucia

    2014-01-01

    This study extends current research on the refutation text effect by investigating it in learners with different levels of working memory capacity. The purpose is to outline the link between online processes (revealed by eye fixation indices) and off-line outcomes in these learners. In science education, unlike a standard text, a refutation text…

  12. 40 CFR 63.2465 - What requirements must I meet for process vents that emit hydrogen halide and halogen HAP or HAP...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Pollutants: Miscellaneous Organic Chemical Manufacturing Emission Limits, Work Practice Standards, and... the mass emission rate of HAP metals based on process knowledge, engineering assessment, or test data...

  13. Work extraction from quantum systems with bounded fluctuations in work.

    PubMed

    Richens, Jonathan G; Masanes, Lluis

    2016-11-25

    In the standard framework of thermodynamics, work is a random variable whose average is bounded by the change in free energy of the system. This average work is calculated without regard for the size of its fluctuations. Here we show that for some processes, such as reversible cooling, the fluctuations in work diverge. Realistic thermal machines may be unable to cope with arbitrarily large fluctuations. Hence, it is important to understand how thermodynamic efficiency rates are modified by bounding fluctuations. We quantify the work content and work of formation of arbitrary finite dimensional quantum states when the fluctuations in work are bounded by a given amount c. By varying c we interpolate between the standard and minimum free energies. We derive fundamental trade-offs between the magnitude of work and its fluctuations. As one application of these results, we derive the corrected Carnot efficiency of a qubit heat engine with bounded fluctuations.

  14. Work extraction from quantum systems with bounded fluctuations in work

    PubMed Central

    Richens, Jonathan G.; Masanes, Lluis

    2016-01-01

    In the standard framework of thermodynamics, work is a random variable whose average is bounded by the change in free energy of the system. This average work is calculated without regard for the size of its fluctuations. Here we show that for some processes, such as reversible cooling, the fluctuations in work diverge. Realistic thermal machines may be unable to cope with arbitrarily large fluctuations. Hence, it is important to understand how thermodynamic efficiency rates are modified by bounding fluctuations. We quantify the work content and work of formation of arbitrary finite dimensional quantum states when the fluctuations in work are bounded by a given amount c. By varying c we interpolate between the standard and minimum free energies. We derive fundamental trade-offs between the magnitude of work and its fluctuations. As one application of these results, we derive the corrected Carnot efficiency of a qubit heat engine with bounded fluctuations. PMID:27886177

  15. Work extraction from quantum systems with bounded fluctuations in work

    NASA Astrophysics Data System (ADS)

    Richens, Jonathan G.; Masanes, Lluis

    2016-11-01

    In the standard framework of thermodynamics, work is a random variable whose average is bounded by the change in free energy of the system. This average work is calculated without regard for the size of its fluctuations. Here we show that for some processes, such as reversible cooling, the fluctuations in work diverge. Realistic thermal machines may be unable to cope with arbitrarily large fluctuations. Hence, it is important to understand how thermodynamic efficiency rates are modified by bounding fluctuations. We quantify the work content and work of formation of arbitrary finite dimensional quantum states when the fluctuations in work are bounded by a given amount c. By varying c we interpolate between the standard and minimum free energies. We derive fundamental trade-offs between the magnitude of work and its fluctuations. As one application of these results, we derive the corrected Carnot efficiency of a qubit heat engine with bounded fluctuations.
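
The free-energy bound that this abstract opens with can be stated explicitly. The following is the standard form of the second law for work extraction, written here for orientation rather than quoted from the paper:

```latex
\langle W \rangle \le F(\rho) - F(\rho'), \qquad
F(\rho) = \operatorname{tr}(H\rho) - k_{B} T\, S(\rho),
```

where $\rho$ and $\rho'$ are the initial and final states, $H$ is the Hamiltonian, and $S$ is the von Neumann entropy. The paper's contribution is to study how this bound tightens when the fluctuations of $W$ about its average are constrained to lie within a given amount $c$.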

  16. Making Teacher Work Samples Work at the University of Northern Colorado

    ERIC Educational Resources Information Center

    Parker, Melissa; Sinclair, Christina

    2010-01-01

    Teacher Work Samples (TWS) can be viewed in terms of a product and a process. As a product, the TWS measures a teacher candidate's (TC's) ability to promote student achievement, documents that TCs have met minimum national standards, and validates teacher education programs. Teacher candidates engage in observable, job-related behaviors that serve…

  17. Worldwide Environmental Compliance Assessment and Management Program (ECAMP). German Supplement

    DTIC Science & Technology

    1991-01-01

auditing of technical installations. The law for the Protection from Harmful Effects from Air Pollution, Noise, Vibrations, and Similar Processes (The...when handling carcinogenic work materials); - Standard Publication number ZH 1/140 (Safety regulations for air pollution prevention in work areas); - Z111

  18. Process' standardization and change management in higher education. The case of TEI of Athens

    NASA Astrophysics Data System (ADS)

    Chalaris, Ioannis; Chalaris, Manolis; Gritzalis, Stefanos; Belsis, Petros

    2015-02-01

The establishment of mature operational procedures, and the effort of standardizing and certifying them, is a particularly arduous and demanding task. It requires strong commitment from management to the stated objectives, administrative stability and continuity, availability of resources, an adequate implementation team with support from all stakeholders, and of course great patience until tangible results of the investment appear. Ensuring these conditions, particularly in times of economic crisis, is extremely difficult for large organizations such as TEI of Athens, where heterogeneity in personnel and changes in the administrative hierarchy raise a plethora of additional difficulties and require effective change management. In this work we depict the path of standardization and certification of the administrative functions of TEI of Athens, with emphasis on the difficulties encountered and how they were addressed, in particular issues of change management and the culture surrounding this effort. The infrastructure required for process and strategic management, in terms of both processes and tools, is embedded so that mechanisms for continuous process improvement and for the storage and recovery of the resulting knowledge can evolve. The work concludes with a general design of a road map for internal audit and continuous process improvement at a large institution of higher education.

  19. PC based PLCs and ethernet based fieldbus: the new standard platform for future VLT instrument control

    NASA Astrophysics Data System (ADS)

    Kiekebusch, Mario J.; Lucuix, Christian; Erm, Toomas M.; Chiozzi, Gianluca; Zamparelli, Michele; Kern, Lothar; Brast, Roland; Pirani, Werther; Reiss, Roland; Popovic, Dan; Knudstrup, Jens; Duchateau, Michel; Sandrock, Stefan; Di Lieto, Nicola

    2014-07-01

ESO is currently in the final phase of the standardization process for PC-based Programmable Logic Controllers (PLCs) as the new platform for the development of control systems for future VLT/VLTI instruments. The standard solution used until now consists of a Local Control Unit (LCU), a VME-based system with a CPU and commercial and proprietary boards. This system includes several layers of software and many thousands of lines of code developed and maintained in house. LCUs have been used for several years as the interface to control instrument functions, but they are now being replaced by commercial off-the-shelf (COTS) systems based on BECKHOFF Embedded PCs and the EtherCAT fieldbus. ESO is working on the completion of the software framework that enables a seamless integration into the VLT control system, in order to be ready to support upcoming instruments like ESPRESSO and ERIS, which will be the first fully VLT-compliant instruments using the new standard. The technology evaluation and standardization process has been a long, combined effort of various engineering disciplines, such as electronics, control and software, working together to define a solution that meets the requirements and minimizes the impact on observatory operations and maintenance. This paper presents the challenges of the standardization process and the steps involved in such a change. It provides a technical overview of how industrial standards like EtherCAT, OPC-UA, PLCopen MC and TwinCAT can be used to replace LCU features in various areas, such as software engineering and programming languages, motion control, time synchronization and astronomical tracking.

  20. Pollution prevention options to meet impending MACT standards: A compendium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berglund, R.L.; Pickron, J.S.

    1997-12-31

Under the Clean Air Act Amendments of 1990, the EPA was charged with developing MACT standards for a large group of operations representing a variety of different industry categories. While pollution prevention opportunities for meeting the requirements of the HON standards for the synthetic organic chemical manufacturing industry (SOCMI) and similar standards for the refining industry have drawn significant discussion and attention, little guidance on pollution prevention options for meeting other MACT standards has been provided. Yet, in working with companies to meet the requirements of proposed MACT standards for the shipbuilding, wood processing, plastics and gas processing industries, a number of pollution prevention opportunities for meeting these requirements were identified in early compliance strategies. This paper will provide a compendium of pollution prevention options for meeting these and other proposed and promulgated MACT standards.

  1. 40 CFR 60.103a - Design, equipment, work practice or operational standards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) Description and simple process flow diagram showing the interconnection of the following components of the... rate. (iv) Description and simple process flow diagram showing all gas lines (including flare, purge... which lines are monitored and identify on the process flow diagram the location and type of each monitor...

  2. 40 CFR 60.103a - Design, equipment, work practice or operational standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) Description and simple process flow diagram showing the interconnection of the following components of the... rate. (iv) Description and simple process flow diagram showing all gas lines (including flare, purge... which lines are monitored and identify on the process flow diagram the location and type of each monitor...

  3. The coming commoditization of processes.

    PubMed

    Davenport, Thomas H

    2005-06-01

Despite the much-ballyhooed increase in outsourcing, most companies are in do-it-yourself mode for the bulk of their processes, in large part because there's no way to compare outside organizations' capabilities with those of internal functions. Given the lack of comparability, it's almost surprising that anyone outsources today. But it's not surprising that cost is by far companies' primary criterion for evaluating outsourcers or that many companies are dissatisfied with their outsourcing relationships. A new world is coming, says the author, and it will lead to dramatic changes in the shape and structure of corporations. A broad set of process standards will soon make it easy to determine whether a business capability can be improved by outsourcing it. Such standards will also help businesses compare service providers and evaluate the costs versus the benefits of outsourcing. Eventually these costs and benefits will be so visible to buyers that outsourced processes will become a commodity, and prices will drop significantly. The low costs and low risk of outsourcing will accelerate the flow of jobs offshore, force companies to reassess their strategies, and change the basis of competition. The speed with which some businesses have already adopted process standards suggests that many previously unscrutinized areas are ripe for change. In the field of technology, for instance, the Carnegie Mellon Software Engineering Institute has developed a global standard for software development processes, called the Capability Maturity Model (CMM). For companies that don't have process standards in place, it makes sense to create them by working with customers, competitors, software providers, businesses to which processes may be outsourced, and objective researchers and standard-setters. Setting standards is likely to lead to the improvement of both internal and outsourced processes.

  4. Making working memory work: the effects of extended practice on focus capacity and the processes of updating, forward access, and random access.

    PubMed

    Price, John M; Colflesh, Gregory J H; Cerella, John; Verhaeghen, Paul

    2014-05-01

We investigated the effects of 10 hours of practice on variations of the N-Back task to examine the processes underlying possible expansion of the focus of attention within working memory. Using subtractive logic, we showed that random access (i.e., Sternberg-like search) yielded a modest effect (a 50% increase in speed), whereas the processes of forward access (i.e., retrieval in order, as in a standard N-Back task) and updating (i.e., changing the contents of working memory) were executed about 5 times faster after extended practice. We additionally found that extended practice increased working memory capacity as measured by the size of the focus of attention for the forward-access task, but not for variations where probing was in random order. This suggests that working memory capacity may depend on the type of search process engaged, and that certain working-memory-related cognitive processes are more amenable to practice than others. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. 20 CFR 416.999 - What is expedited reinstatement?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... disability benefits due to your work activity. The expedited reinstatement provision provides you the option... new application for a new period of eligibility. Since January 1, 2001, you can request to be... to work or unless an exception under the medical improvement review standard process applies. We...

  6. Job Grading Standard for Electroplater, WG-3711.

    ERIC Educational Resources Information Center

    Civil Service Commission, Washington, DC. Bureau of Policies and Standards.

    The standard for Electroplating Worker WG-7 and Electroplater WG-9 covers work involving the use of electrolytic and chemical processes to plate, coat, and treat surfaces of metals and metal alloys for purposes of protection, repair, maintenance, and fabrication of parts and equipment. A knowledge of the preparation, testing, and maintenance of…

  7. Grounding our practice in nursing professional development.

    PubMed

    Dickerson, Pamela S

    2014-07-01

    The Nursing Professional Development: Scope and Standards of Practice is foundational to the work of nurses in a continuing professional development role. Use of the practice and professional performance aspects of the standards supports both quality of learning activities and the continuous growth process of nurses engaged in this area of practice. Copyright 2014, SLACK Incorporated.

  8. Developing Carbon Nanotube Standards at NASA

    NASA Technical Reports Server (NTRS)

    Nikolaev, Pasha; Arepalli, Sivaram; Sosa, Edward; Gorelik, Olga; Yowell, Leonard

    2007-01-01

    Single wall carbon nanotubes (SWCNTs) are currently being produced and processed by several methods. Many researchers are continuously modifying existing methods and developing new methods to incorporate carbon nanotubes into other materials and utilize the phenomenal properties of SWCNTs. These applications require availability of SWCNTs with known properties and there is a need to characterize these materials in a consistent manner. In order to monitor such progress, it is critical to establish a means by which to define the quality of SWCNT material and develop characterization standards to evaluate of nanotube quality across the board. Such characterization standards should be applicable to as-produced materials as well as processed SWCNT materials. In order to address this issue, NASA Johnson Space Center has developed a protocol for purity and dispersion characterization of SWCNTs (Ref.1). The NASA JSC group is currently working with NIST, ANSI and ISO to establish purity and dispersion standards for SWCNT material. A practice guide for nanotube characterization is being developed in cooperation with NIST (Ref.2). Furthermore, work is in progress to incorporate additional characterization methods for electrical, mechanical, thermal, optical and other properties of SWCNTs.

  9. Developing Carbon Nanotube Standards at NASA

    NASA Technical Reports Server (NTRS)

    Nikolaev, Pasha; Arepalli, Sivaram; Sosa, Edward; Gorelik, Olga; Yowell, Leonard

    2007-01-01

    Single wall carbon nanotubes (SWCNTs) are currently being produced and processed by several methods. Many researchers are continuously modifying existing methods and developing new methods to incorporate carbon nanotubes into other materials and utilize the phenomenal properties of SWCNTs. These applications require availability of SWCNTs with known properties and there is a need to characterize these materials in a consistent manner. In order to monitor such progress, it is critical to establish a means by which to define the quality of SWCNT material and develop characterization standards to evaluate of nanotube quality across the board. Such characterization standards should be applicable to as-produced materials as well as processed SWCNT materials. In order to address this issue, NASA Johnson Space Center has developed a protocol for purity and dispersion characterization of SWCNTs. The NASA JSC group is currently working with NIST, ANSI and ISO to establish purity and dispersion standards for SWCNT material. A practice guide for nanotube characterization is being developed in cooperation with NIST. Furthermore, work is in progress to incorporate additional characterization methods for electrical, mechanical, thermal, optical and other properties of SWCNTs.

  10. New procedures of ergonomics design in a large oil company.

    PubMed

    Alhadeff, Cynthia Mossé; Silva, Rosana Fernandes da; Reis, Márcia Sales dos

    2012-01-01

This study presents the challenge involved in the negotiation and construction of a standard process in a major petroleum company, whose purpose is to guide the implementation of ergonomic studies in the development of projects, systemising ergonomics design. The standard was created by a multi-disciplinary working group consisting of specialists in ergonomics who work in a number of different areas of the company. The objective was to guide how to undertake ergonomics in all projects, taking into consideration the development of ergonomic appraisals of work. It also established that the whole process, in each project phase, should be accompanied by a specialist in ergonomics. This process, as an innovation in the conception of projects in this company, signals a change of culture and, for this reason, requires broad dissemination throughout the company's several leadership levels, as well as training of professionals in ergonomics design projects. An implementation plan was also prepared and approved by the corporate governance, complementing the proposed challenge. In this way, this major oil company will implement new procedures of ergonomics design to promote the health, safety and wellbeing of the workforce, besides improving the performance and reliability of its systems and processes.

  11. Standardizing Navigation Data: A Status Update

    NASA Technical Reports Server (NTRS)

    VanEepoel, John M.; Berry, David S.; Pallaschke, Siegmar; Foliard, Jacques; Kiehling, Reinhard; Ogawa, Mina; Showell, Avanaugh; Fertig, Juergen; Castronuovo, Marco

    2007-01-01

    This paper presents the work of the Navigation Working Group of the Consultative Committee for Space Data Systems (CCSDS) on development of standards addressing the transfer of orbit, attitude and tracking data for space objects. Much progress has been made since the initial presentation of the standards in 2004, including the progression of the orbit data standard to an accepted standard, and the near completion of the attitude and tracking data standards. The orbit, attitude and tracking standards attempt to address predominant parameterizations for their respective data, and create a message format that enables communication of the data across space agencies and other entities. The messages detailed in each standard are built upon a keyword = value paradigm, where a fixed list of keywords is provided in the standard where users specify information about their data, and also use keywords to encapsulate their data. The paper presents a primer on the CCSDS standardization process to put in context the state of the message standards, and the parameterizations supported in each standard, then shows examples of these standards for orbit, attitude and tracking data. Finalization of the standards is expected by the end of calendar year 2007.
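
The "keyword = value" paradigm described above is simple enough to sketch in a few lines. The following is an illustrative parser for such messages, a minimal sketch only; the sample keywords are modeled on the CCSDS Orbit Parameter Message but are not quoted from the standard, and a real implementation would also validate mandatory keywords and units.

```python
def parse_ccsds_kv(message: str) -> dict:
    """Parse a CCSDS-style 'KEYWORD = VALUE' message into a dict.

    Blank lines and COMMENT lines are skipped; whitespace around the
    keyword and value is stripped.
    """
    result = {}
    for line in message.splitlines():
        line = line.strip()
        if not line or line.startswith("COMMENT"):
            continue
        key, _, value = line.partition("=")
        result[key.strip()] = value.strip()
    return result


# Hypothetical message fragment (keywords are illustrative).
sample = """\
CCSDS_OPM_VERS = 2.0
CREATION_DATE  = 2007-065T16:00:00
ORIGINATOR     = GSFC
COMMENT This line carries no data and is skipped.
OBJECT_NAME    = TEST SAT 1
"""

fields = parse_ccsds_kv(sample)
print(fields["ORIGINATOR"])
```

Because every field is a flat `KEYWORD = VALUE` pair, a message produced by one agency can be consumed by another without a shared binary format, which is the interoperability point the abstract makes.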

  12. Setting new standards in MEMS

    NASA Astrophysics Data System (ADS)

    Rimskog, Magnus; O'Loughlin, Brian J.

    2007-02-01

Silex Microsystems handles a wide range of customized MEMS components. This talk describes Silex's MEMS foundry work model for providing customized MEMS-based solutions in a cost-effective and well-controlled manner. The key factors for success are the capabilities to reformulate a customer product concept into manufacturing processes in the wafer fab, using standard process modules and production equipment. A well-controlled system increases the likelihood of a first-batch success and enables fast ramp-up into volume production. The following success factors can be listed: strong, enduring relationships with the customers; highly qualified, experienced specialists working closely with the customer; process solutions and building blocks ready to use out of a library; addressing manufacturing issues in the early design phase; in-house know-how to meet demands for volume manufacturing; access to a wafer fab with high capacity, good organization, high equipment availability, and short lead times; and process development done in the manufacturing environment using production equipment for easy ramp-up to volume production. The article covers a method of working to address these factors: building long and enduring relationships with customers, utilizing MEMS expertise and working closely with customers to translate their product ideas into MEMS components; maintaining stable process solutions for features such as low-ohmic vias, spiked electrodes, cantilevers, silicon optical mirrors, and micro needles, which can be used and modified for customer needs; and using a structured development and design methodology in order to handle hundreds of process modules and to set up standard run sheets. It is also very important to do real-time process development in the manufacturing line: it minimizes the lead time for the ramp-up of production. Access to a state-of-the-art wafer fab that is well organized, controlled and flexible, with high capacity and short lead times for prototypes, is also required. It is crucial to have intimate control of processes, equipment, organization, production flow control and WIP. This has been addressed by using a fully computerized control and reporting system.

  13. Application of Mean of Absolute Deviation Method for the Selection of Best Nonlinear Component Based on Video Encryption

    NASA Astrophysics Data System (ADS)

    Anees, Amir; Khan, Waqar Ahmad; Gondal, Muhammad Asif; Hussain, Iqtadar

    2013-07-01

    The aim of this work is to make use of the mean of absolute deviation (MAD) method for the evaluation process of substitution boxes used in the advanced encryption standard. In this paper, we use the MAD technique to analyze some popular and prevailing substitution boxes used in encryption processes. In particular, MAD is applied to advanced encryption standard (AES), affine power affine (APA), Gray, Lui J., Residue Prime, S8 AES, SKIPJACK, and Xyi substitution boxes.
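
The mean of absolute deviation itself is a simple statistic: the average distance of each sample from the sample mean. The sketch below shows the computation and applies it, purely for illustration, to a byte stream before and after substitution through an S-box; the identity S-box here is a placeholder for AES, APA, or any of the boxes named in the abstract, and the comparison is not the paper's actual evaluation procedure.

```python
def mean_absolute_deviation(values):
    """MAD = (1/n) * sum(|x_i - mean(x)|)."""
    mean = sum(values) / len(values)
    return sum(abs(v - mean) for v in values) / len(values)


# Illustrative only: substitute a toy byte sequence through an 8-bit S-box
# (identity here, as a stand-in for a real cryptographic substitution box)
# and compare the spread of the input and output streams.
sbox = list(range(256))          # placeholder substitution box
data = [10, 12, 11, 200, 13, 9]  # toy "plaintext" bytes
substituted = [sbox[b] for b in data]

print(mean_absolute_deviation(data))
print(mean_absolute_deviation(substituted))
```

With a real S-box, differences in the MAD of the output distribution give a crude, easily computed signal of how the substitution spreads values, which is the spirit of using MAD as an evaluation criterion.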

  14. Implementation Plan for Flexible Automation in U.S. Shipyards

    DTIC Science & Technology

    1985-01-01

process steps, cramped work sites, interrupted geometries, irregular or novel shapes, and other factors that affect automatability. We also try to...held by 2 hands in awkward places. Interrupted geometry of plates and beams. Cannot predict outcome. Creates need to measure and recut. Automation, if...of standard. enough over time I every job. I Rearrange work. Redefine work units. Too many interruptions Time, space, geometry only a little work gets

  15. A management, leadership, and board road map to transforming care for patients.

    PubMed

    Toussaint, John

    2013-01-01

Over the last decade I have studied 115 healthcare organizations in 11 countries, examining them from the boardroom to the patient bedside. In that time, I have observed one critical element missing from just about every facility: a set of standards that could reliably produce zero-defect care for patients. This lack of standards is largely rooted in the Sloan management approach, a top-down management and leadership structure that is devoid of standardized accountability. This article offers an alternative approach: management by process, an operating system that engages frontline staff in decisions and imposes standards and processes on the act of managing. Organizations that have adopted management by process have seen quality improve and costs decrease because the people closest to the work are expected to identify problems and solve them. Also detailed are the leadership behaviors required for an organization to successfully implement the management-by-process operating system, and the board of trustees' role in supporting the transformation.

  16. Finding Common Ground with the Common Core

    ERIC Educational Resources Information Center

    Moisan, Heidi

    2015-01-01

    This article examines the journey of museum educators at the Chicago History Museum in understanding the Common Core State Standards and implementing them in our work with the school audience. The process raised questions about our teaching philosophy and our responsibility to our audience. Working with colleagues inside and outside of our…

  17. 40 CFR 426.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. (a) Applicability. The provisions of this section shall apply to discharges of process waste water pollutants into publicly owned treatment works except for that portion of the waste stream which constitutes cullet water...

  18. 40 CFR 426.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... wastewater pollutants into a publicly owned treatment works must comply with 40 CFR part 403. (a) Applicability. The provisions of this section shall apply to discharges of process waste water pollutants into publicly owned treatment works except for that portion of the waste stream which constitutes cullet water...

  19. 29 CFR 30.3 - Equal opportunity standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., assignment of work, job performance, rotation among all work processes of the trade, imposition of penalties... apprenticeship including goals and timetables for women and minorities which has been approved as meeting the... amendment will qualify for this exception only if the goals and timetables for minorities and women for the...

  20. 29 CFR 30.3 - Equal opportunity standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., assignment of work, job performance, rotation among all work processes of the trade, imposition of penalties... apprenticeship including goals and timetables for women and minorities which has been approved as meeting the... amendment will qualify for this exception only if the goals and timetables for minorities and women for the...

  1. 40 CFR 458.26 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... subpart that introduces process wastewater pollutants into a publicly owned treatment works must comply... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a new source subject to the provisions of this subpart: Pollutant or...

  2. 40 CFR 426.64 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a point source subject to the provisions of this subpart. Pollutant or...

  3. 40 CFR 426.44 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a point source subject to the provisions of this subpart. Pollutant or...

  4. 40 CFR 458.36 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... subpart that introduces process wastewater pollutants into a publicly owned treatment works must comply... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a new source subject to the provisions of this subpart: Pollutant or...

  5. 40 CFR 426.106 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... subpart that introduces process wastewater pollutants into a publicly owned treatment works must comply... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a new point source subject to the provisions of this subpart. Pollutant...

  6. 40 CFR 458.46 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... subpart that introduces process wastewater pollutants into a publicly owned treatment works must comply... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a new source subject to the provisions of this subpart: Pollutant or...

  7. 40 CFR 405.64 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a point source subject to the provisions of this subpart. Pollutant or...

  8. 40 CFR 408.154 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a point source subject to the provisions of this subpart. Pollutant or...

  9. 40 CFR 408.166 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a new source subject to the provisions of this subpart: Pollutant or...

  10. 40 CFR 408.144 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a point source subject to the provisions of this subpart. Pollutant or...

  11. 40 CFR 458.16 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... subpart that introduces process wastewater pollutants into a publicly owned treatment works must comply... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a new source subject to the provisions of this subpart: Pollutant or...

  12. 40 CFR 408.176 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a new source subject to the provisions of this subpart: Pollutant or...

  13. 40 CFR 426.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a point source subject to the provisions of this subpart. Pollutant or...

  14. 40 CFR 426.34 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a point source subject to the provisions of this subpart. Pollutant or...

  15. 40 CFR 427.86 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... subpart that introduces process wastewater pollutants into a publicly owned treatment works must comply... quality of pollutants or pollutant properties, controlled by this section, which may be discharged to a publicly owned treatment works by a new point source subject to the provisions of this subpart. Pollutant...

  16. 40 CFR 405.64 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... this subpart that introduces process wastewater pollutants into a publicly owned treatment works must... quality of pollutants or pollutant properties controlled by this section which may be discharged to a publicly owned treatment works by a point source subject to the provisions of this subpart. Pollutant or...

  17. 48 CFR 46.202-3 - Standard inspection requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... inspection system that is acceptable to the Government; (2) Give the Government the right to make inspections and tests while work is in process; and (3) Require the contractor to keep complete, and make available to the Government, records of its inspection work. [48 FR 42415, Sept. 19, 1983. Redesignated at...

  18. Generic worklist handler for workflow-enabled products

    NASA Astrophysics Data System (ADS)

    Schmidt, Joachim; Meetz, Kirsten; Wendler, Thomas

    1999-07-01

Workflow management (WfM) is an emerging field of medical information technology. It appears as a promising key technology to model, optimize, and automate processes, for the sake of improved efficiency, reduced costs, and improved patient care. The application of WfM concepts requires the standardization of architectures and interfaces. A component of central interest proposed in this report is a generic worklist handler: a standardized interface between a workflow enactment service and an application system. Application systems with embedded worklist handlers will be called 'Workflow Enabled Application Systems'. In this paper we discuss the functional requirements of worklist handlers, as well as their integration into workflow architectures and interfaces. To lay the foundation for this specification, basic workflow terminology, the fundamentals of workflow management and - later in the paper - the available standards as defined by the Workflow Management Coalition are briefly reviewed.

  19. IEC 61511 and the capital project process--a protective management system approach.

    PubMed

    Summers, Angela E

    2006-03-17

This year, the process industry has reached an important milestone in process safety: the acceptance of an internationally recognized standard for safety instrumented systems (SIS). This standard, IEC 61511, documents good engineering practice for the assessment, design, operation, maintenance, and management of SISs. The foundation of the standard is established by several requirements in Part 1, Clauses 5-7, which cover the development of a management system aimed at ensuring that functional safety is achieved. The management system includes a quality assurance process for the entire SIS lifecycle, requiring the development of procedures, the identification of resources, and the acquisition of tools. For maximum benefit, the deliverables and quality control checks required by the standard should be integrated into the capital project process, addressing safety, environmental, plant productivity, and asset protection. Industry has become inundated with a multitude of programs focusing on safety, quality, and cost performance. This paper introduces a protective management system that builds upon the work process identified in IEC 61511. Typical capital project phases are integrated with the management system to yield one comprehensive program for efficiently managing process risk. Finally, the paper highlights areas where internal practices or guidelines should be developed to improve program performance and cost effectiveness.

  20. Lean principles optimize on-time vascular surgery operating room starts and decrease resident work hours.

    PubMed

    Warner, Courtney J; Walsh, Daniel B; Horvath, Alexander J; Walsh, Teri R; Herrick, Daniel P; Prentiss, Steven J; Powell, Richard J

    2013-11-01

Lean process improvement techniques are used in industry to improve efficiency and quality while controlling costs. These techniques are less commonly applied in health care. This study assessed the effectiveness of Lean principles on first case on-time operating room starts and quantified effects on resident work hours. Standard process improvement techniques (DMAIC methodology: define, measure, analyze, improve, control) were used to identify causes of delayed vascular surgery first case starts. Value stream maps and process flow diagrams were created. Process data were analyzed with Pareto and control charts. High-yield changes were identified and simulated in computer and live settings prior to implementation. The primary outcome measure was the proportion of on-time first case starts; secondary outcomes included hospital costs, resident rounding time, and work hours. Data were compared with existing benchmarks. Prior to implementation, 39% of first cases started on time. Process mapping identified late resident arrival in preoperative holding as a cause of delayed first case starts. Resident rounding process inefficiencies were identified and changed through the use of checklists, standardization, and elimination of nonvalue-added activity. Following implementation of process improvements, first case on-time starts improved to 71% at 6 weeks (P = .002). Improvement was sustained with an 86% on-time rate at 1 year (P < .001). Resident rounding time was reduced by 33% (from 70 to 47 minutes). At 9 weeks following implementation, these changes generated an opportunity cost potential of $12,582. Use of Lean principles allowed rapid identification and implementation of perioperative process changes that improved efficiency and resulted in significant cost savings. This improvement was sustained at 1 year. Downstream effects included improved resident efficiency with decreased work hours. Copyright © 2013 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.

  1. A nonmonotonic dependence of standard rate constant on reorganization energy for heterogeneous electron transfer processes on electrode surface

    NASA Astrophysics Data System (ADS)

    Xu, Weilin; Li, Songtao; Zhou, Xiaochun; Xing, Wei; Huang, Mingyou; Lu, Tianhong; Liu, Changpeng

    2006-05-01

In the present work, a nonmonotonic dependence of the standard rate constant (k0) on the reorganization energy (λ) was derived qualitatively from electron transfer (Marcus-Hush-Levich) theory for heterogeneous electron transfer (ET) processes on an electrode surface. The nonmonotonic dependence of k0 on λ was found to be a second consequence, besides the disappearance of the famous Marcus inverted region, of the continuum of electronic states in the electrode: as λ increases, the ET for both Process I and Process II varies continuously from the nonadiabatic to the adiabatic regime, and while the λ dependence of k0 for Process I remains monotonic throughout, for Process II on the electrode surface the λ dependence of k0 can become nonmonotonic.
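The purely nonadiabatic limit against which the abstract contrasts its result can be sketched with the standard Marcus-Hush-Chidsey integral at zero overpotential. This is an illustrative sketch only: the thermal energy, grid bounds, and unit prefactor are assumptions, and the nonadiabatic-to-adiabatic crossover that produces the paper's nonmonotonicity is deliberately not modeled here, so k0 decreases monotonically with λ.

```python
import numpy as np

def mhc_k0(lmbda_eV, kT=0.0257):
    """Relative standard rate constant from the nonadiabatic
    Marcus-Hush-Chidsey integral at zero overpotential:

        k0 ∝ ∫ exp(-(x - λ)² / (4 λ kT)) / (1 + exp(x / kT)) dx,

    where x is the electrode-state energy relative to the Fermi
    level (eV). All quantities in eV; the prefactor is set to 1.
    """
    x = np.linspace(-2.0, lmbda_eV + 2.0, 20001)   # integration grid (eV)
    dx = x[1] - x[0]
    gauss = np.exp(-(x - lmbda_eV) ** 2 / (4.0 * lmbda_eV * kT))
    fermi = 1.0 / (1.0 + np.exp(np.clip(x / kT, -500.0, 500.0)))
    return float(np.sum(gauss * fermi) * dx)

# In this limit the λ dependence is monotonic (roughly ~ exp(-λ/4kT)):
for lam in (0.3, 0.6, 0.9, 1.2):
    print(f"lambda = {lam:.1f} eV  ->  k0 (rel.) = {mhc_k0(lam):.3e}")
```

Adding a λ-dependent electronic-coupling (adiabaticity) factor in front of the integrand is what would reintroduce the competing trend the paper analyzes.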

  2. Standard Waste Box Lid Screw Removal Option Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anast, Kurt Roy

This report provides results from test work conducted to address removal of the screws securing the lids of the standard waste boxes (SWBs) that hold the remediated nitrate salt (RNS) drums. The test work evaluated equipment and process alternatives for removing the 42 screws that hold each SWB lid in place. The screws were secured with a red Loctite thread locker that makes removal very difficult: the rivet that a screw threads into would slip before the screw could be freed from it, making it impossible to remove the screw and therefore the SWB lid.

  3. A cross sectional study on nursing process implementation and associated factors among nurses working in selected hospitals of Central and Northwest zones, Tigray Region, Ethiopia.

    PubMed

    Baraki, Zeray; Girmay, Fiseha; Kidanu, Kalayou; Gerensea, Hadgu; Gezehgne, Dejen; Teklay, Hafte

    2017-01-01

The nursing process is a systematic method of planning, delivering, and evaluating individualized care for clients in any state of health or illness. Many countries have adopted the nursing process as the standard of care to guide nursing practice; the problem, however, is its implementation. If nurses fail to carry out the necessary nursing care through the nursing process, the effectiveness of patient progress may be compromised, which can lead to preventable adverse events. This study aimed to assess the implementation of the nursing process and associated factors among nurses working in selected hospitals of the central and northwest zones of Tigray, Ethiopia, in 2015. A cross-sectional observational study design was utilized. Data were collected from 200 participants using a structured self-administered questionnaire that was contextually adapted from standardized, reliable, and validated measures. The data were entered using Epi Info version 7 and analyzed using SPSS version 20. Data were summarized and described using descriptive statistics, and multivariate logistic regression was used to determine the relationships between the independent and dependent variables. Finally, data were presented in tables, graphs, and frequency percentages of the different variables. Seventy participants (35%) had implemented the nursing process. Several factors showed significant associations. Nurses who worked in a stressful workplace atmosphere were 99% less likely to implement the nursing process than nurses who worked in a very good atmosphere. Nurses with a BSc degree were 6.972 times more likely to implement the nursing process than those with a diploma qualification. Nurses without a consistent material supply for the nursing process were 95.1% less likely to implement it than nurses with a consistent supply. The majority of the participants were not implementing the nursing process properly. Many factors hinder them from applying the nursing process, of which level of education, knowledge of nurses, skill of nurses, workplace atmosphere, shortage of material supply, and high patient load were statistically significant in the association test.
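The effect sizes quoted in abstracts like this one ("6.972 times more likely", "95.1% less likely") are odds ratios from the logistic regression. A minimal sketch of how a fitted coefficient maps to that phrasing (the coefficients below are hypothetical, chosen only to reproduce the abstract's numbers):

```python
import math

def odds_ratio(beta):
    """Convert a logistic-regression coefficient to an odds ratio."""
    return math.exp(beta)

def describe(or_value):
    """Phrase an odds ratio the way such abstracts do (illustrative only)."""
    if or_value >= 1.0:
        return f"{or_value:.3f} times more likely"
    return f"{(1.0 - or_value) * 100:.1f}% less likely"

# Hypothetical coefficients matching the reported odds ratios:
print(describe(odds_ratio(math.log(6.972))))   # BSc vs. diploma
print(describe(odds_ratio(math.log(0.049))))   # no consistent supply vs. consistent
```

An odds ratio below 1 is conventionally reported as a percentage reduction, (1 - OR) x 100%, which is where "95.1% less likely" comes from (OR ≈ 0.049).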

  4. 75 FR 65067 - National Emission Standards for Hazardous Air Pollutant Emissions: Hard and Decorative Chromium...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-21

    ... non-air quality health and environmental impacts) and are commonly referred to as maximum achievable... process, stack, storage, or fugitive emissions point, (D) are design, equipment, work practice, or... combination of the above. CAA section 112(d)(2)(A)-(E). The MACT standard may take the form of a design...

  5. Teachers Working Together to Resist and Remake Educational Policy in Contexts of Standardization

    ERIC Educational Resources Information Center

    Pease-Alvarez, Lucinda; Thompson, Alisun

    2014-01-01

    Recently, those examining the role teachers of language minority students play in the language policy-making process have found that their autonomy has been threatened by increasing standardization as reflected in rigid one-size fits all curricular mandates focused on the learning of discrete skills in the national language, enforced high-stakes…

  6. Professional Standards for Teachers: How Do They "Work"? An Experiment in Tracing Standardisation In-the-Making in Teacher Education

    ERIC Educational Resources Information Center

    Ceulemans, Carlijne; Simons, Maarten; Struyf, Elke

    2012-01-01

    During the last two decades, professional standards describing competencies for teaching staff have emerged in nation states all around the world. This article reports on a pilot-study that applies a sociotechnological "lens" to examine this standardisation process in educational policy. In line with ethnographic analyses drawing on…

  7. [Development of a medical equipment support information system based on PDF portable document].

    PubMed

    Cheng, Jiangbo; Wang, Weidong

    2010-07-01

According to the organizational structure and management system of hospital medical engineering support, the medical engineering support workflow is integrated to ensure that medical engineering data are collected effectively, accurately, and comprehensively and kept in electronic archives. The workflow of medical equipment support work is analyzed, and all work processes are recorded in portable electronic documents. Using XML middleware technology and a SQL Server database, the system implements process management, data calculation, submission, storage, and other functions. Practical application shows that the medical equipment support information system optimizes the existing work process, making it standardized and digital, automatic and efficient, orderly and controllable. A medical equipment support information system based on portable electronic documents can effectively optimize and improve hospital medical engineering support work, improve performance, reduce costs, and provide complete and accurate digital data.
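The storage pattern described (XML work records kept in a relational database) can be sketched as follows. This is a generic illustration, not the paper's system: the table name, element names, and use of SQLite (in place of the paper's SQL Server) are all assumptions made to keep the sketch self-contained.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical schema: one row per equipment-support work order,
# with the full process record kept as an XML payload.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE work_orders (id INTEGER PRIMARY KEY, device TEXT, record_xml TEXT)"
)

# Build an XML work record (element names are illustrative).
record = ET.Element("workOrder")
ET.SubElement(record, "device").text = "infusion pump"
ET.SubElement(record, "action").text = "preventive maintenance"
xml_text = ET.tostring(record, encoding="unicode")

conn.execute(
    "INSERT INTO work_orders (device, record_xml) VALUES (?, ?)",
    ("infusion pump", xml_text),
)

# Retrieve and parse the stored record.
device, payload = conn.execute(
    "SELECT device, record_xml FROM work_orders"
).fetchone()
action = ET.fromstring(payload).findtext("action")
print(device, "->", action)  # infusion pump -> preventive maintenance
```

Keeping the payload as XML lets the relational layer index a few key columns while the document itself stays portable, which matches the PDF/portable-document motivation of the paper.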

  8. MTpy: A Python toolbox for magnetotellurics

    NASA Astrophysics Data System (ADS)

    Krieger, Lars; Peacock, Jared R.

    2014-11-01

We present the software package MTpy that allows handling, processing, and imaging of magnetotelluric (MT) data sets. Written in Python, the code is open source, containing sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides the independent definition of classes and functions, MTpy provides wrappers and convenience scripts to call standard external data processing and modelling software. In its current state, modules and functions of MTpy work on raw and pre-processed MT data. However, rather than providing a static compilation of software, we prefer to introduce MTpy as a flexible software toolbox, whose contents can be combined and utilised according to the respective needs of the user. Just as the overall functionality of a mechanical toolbox can be extended by adding new tools, MTpy is a flexible framework, which will be dynamically extended in the future. Furthermore, it can help to unify and extend existing codes and algorithms within the (academic) MT community. In this paper, we introduce the structure and concept of MTpy. Additionally, we show some examples from an everyday work-flow of MT data processing: the generation of standard EDI data files from raw electric (E-) and magnetic flux density (B-) field time series as input, the conversion into MiniSEED data format, as well as the generation of a graphical data representation in the form of a Phase Tensor pseudosection.
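The core quantities in such an MT workflow are derived from the impedance tensor. A minimal sketch of the textbook relations (not MTpy's API), assuming an impedance element Z given in ohms under the E/H convention: apparent resistivity ρ_a = |Z|² / (ωμ₀) and impedance phase φ = arg(Z).

```python
import cmath
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def apparent_resistivity(z_ohm, period_s):
    """Apparent resistivity (ohm·m) from one impedance-tensor
    element z_ohm (complex, ohms) at the given period (s)."""
    omega = 2.0 * math.pi / period_s
    return abs(z_ohm) ** 2 / (omega * MU0)

def impedance_phase_deg(z_ohm):
    """Impedance phase in degrees."""
    return math.degrees(cmath.phase(z_ohm))

# Sanity check: over a uniform half-space of resistivity rho,
# |Z| = sqrt(omega * mu0 * rho) and the phase is 45 degrees.
rho, period = 100.0, 10.0
omega = 2.0 * math.pi / period
z = math.sqrt(omega * MU0 * rho) * cmath.exp(1j * math.pi / 4)
print(apparent_resistivity(z, period), impedance_phase_deg(z))
```

Phase-tensor quantities, as plotted in the paper's pseudosections, are built from the same tensor elements but are constructed to be unaffected by galvanic distortion.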

  9. 33 CFR 335.7 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... AND MAINTENANCE OF ARMY CORPS OF ENGINEERS CIVIL WORKS PROJECTS INVOLVING THE DISCHARGE OF DREDGED OR... engineering practices and meeting the environmental standards established by the 404(b)(1) evaluation process...

  10. 33 CFR 335.7 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... AND MAINTENANCE OF ARMY CORPS OF ENGINEERS CIVIL WORKS PROJECTS INVOLVING THE DISCHARGE OF DREDGED OR... engineering practices and meeting the environmental standards established by the 404(b)(1) evaluation process...

  11. 33 CFR 335.7 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... AND MAINTENANCE OF ARMY CORPS OF ENGINEERS CIVIL WORKS PROJECTS INVOLVING THE DISCHARGE OF DREDGED OR... engineering practices and meeting the environmental standards established by the 404(b)(1) evaluation process...

  12. 33 CFR 335.7 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... AND MAINTENANCE OF ARMY CORPS OF ENGINEERS CIVIL WORKS PROJECTS INVOLVING THE DISCHARGE OF DREDGED OR... engineering practices and meeting the environmental standards established by the 404(b)(1) evaluation process...

  13. 33 CFR 335.7 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... AND MAINTENANCE OF ARMY CORPS OF ENGINEERS CIVIL WORKS PROJECTS INVOLVING THE DISCHARGE OF DREDGED OR... engineering practices and meeting the environmental standards established by the 404(b)(1) evaluation process...

  14. Medical professionalism of foreign-born and foreign-trained physicians under close scrutiny: A qualitative study with stakeholders in Germany.

    PubMed

    Klingler, Corinna; Ismail, Fatiha; Marckmann, Georg; Kuehlmeyer, Katja

    2018-01-01

Hospitals in Germany employ increasing numbers of foreign-born and foreign-trained (FB&FT) physicians. Studies have investigated how FB&FT physicians experience their professional integration into the German healthcare system; however, the perspectives of stakeholders working with and shaping the work experiences of FB&FT physicians in German hospitals have so far been neglected. This study explores relevant stakeholders' opinions and attitudes towards FB&FT physicians-which likely influence how these physicians settle in-and how these opinions were formed. We conducted a qualitative interview study with 25 stakeholders working in hospitals or in health policy development. The interviews were analyzed within a constructivist research paradigm using methods derived from Grounded Theory (situational analysis as well as open, axial and selective coding). We found that stakeholders tended to focus on problems in FB&FT physicians' work performance. Participants criticized FB&FT physicians' work for deviating from presumably shared professional standards (skill or knowledge and behavioral standards). The professional standards invoked to justify problem-focused statements comprised the definition of an ideal behavior, attitude or ability and a tolerance range that was adapted in a dynamic process. Behavior falling outside the tolerance range was criticized as unacceptable, requiring action to prevent similar deviations in the future. Furthermore, we derived three strategies (minimization, homogenization and quality management) proposed by participants to manage deviations from assumed professional standards by FB&FT physicians. We critically reflect on the social processes of evaluation and problematization and question the legitimacy of the professional standards invoked. We also discuss discriminatory tendencies visible in evaluative statements of some participants as well as in some of the strategies proposed. We suggest it will be key to develop and implement better support strategies for FB&FT physicians while also addressing problematic attitudes within the receiving system to further professional integration.

  15. Medical professionalism of foreign-born and foreign-trained physicians under close scrutiny: A qualitative study with stakeholders in Germany

    PubMed Central

    Ismail, Fatiha; Marckmann, Georg; Kuehlmeyer, Katja

    2018-01-01

Hospitals in Germany employ increasing numbers of foreign-born and foreign-trained (FB&FT) physicians. Studies have investigated how FB&FT physicians experience their professional integration into the German healthcare system; however, the perspectives of stakeholders working with and shaping the work experiences of FB&FT physicians in German hospitals have so far been neglected. This study explores relevant stakeholders’ opinions and attitudes towards FB&FT physicians—which likely influence how these physicians settle in—and how these opinions were formed. We conducted a qualitative interview study with 25 stakeholders working in hospitals or in health policy development. The interviews were analyzed within a constructivist research paradigm using methods derived from Grounded Theory (situational analysis as well as open, axial and selective coding). We found that stakeholders tended to focus on problems in FB&FT physicians’ work performance. Participants criticized FB&FT physicians’ work for deviating from presumably shared professional standards (skill or knowledge and behavioral standards). The professional standards invoked to justify problem-focused statements comprised the definition of an ideal behavior, attitude or ability and a tolerance range that was adapted in a dynamic process. Behavior falling outside the tolerance range was criticized as unacceptable, requiring action to prevent similar deviations in the future. Furthermore, we derived three strategies (minimization, homogenization and quality management) proposed by participants to manage deviations from assumed professional standards by FB&FT physicians. We critically reflect on the social processes of evaluation and problematization and question the legitimacy of the professional standards invoked. We also discuss discriminatory tendencies visible in evaluative statements of some participants as well as in some of the strategies proposed. We suggest it will be key to develop and implement better support strategies for FB&FT physicians while also addressing problematic attitudes within the receiving system to further professional integration. PMID:29447259

  16. Cognitive and behavioural assessment of people with traumatic brain injury in the work place: occupational therapists' perceptions.

    PubMed

    Bootes, Kylie; Chapparo, Christine J

    2002-01-01

    Cognitive and behavioural impairments, in the absence of severe physical disability, are commonly related to poor return to work outcomes for people with traumatic brain injury (TBI). Along with other health professionals, occupational therapists make judgements about cognitive and behavioural dimensions of work capacity of clients with TBI during the return to work process. Unlike many physical functional capacity evaluations, there is no standard method that therapists use to assess the ability of people with TBI to perform cognitive operations required for work. Little is known about what information occupational therapists use in their assessment of cognitive and behavioural aspects of client performance within the work place. This study employed qualitative research methods to determine what information is utilised by 20 therapists who assess the work capacity of people with TBI in the workplace. Results indicated that the process of making judgements about cognitive and behavioural competence within the work place is a multifaceted process. Therapists triangulate client information from multiple sources and types of data to produce an accurate view of client work capacity. Central to this process is the relationship between the client, the job and the work environment.

  17. Participative Work Design in Lean Production: A Strategy for Dissolving the Paradox between Standardized Work and Team Proactivity by Stimulating Team Learning?

    ERIC Educational Resources Information Center

    Lantz, Annika; Hansen, Niklas; Antoni, Conny

    2015-01-01

    Purpose: The purpose of this paper is to explore job design mechanisms that enhance team proactivity within a lean production system where autonomy is uttermost restricted. We propose and test a model where the team learning process of building shared meaning of work mediates the relationship between team participative decision-making, inter team…

  18. 40 CFR 63.7890 - What emissions limitations and work practice standards must I meet for process vents?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... total organic compounds (TOC) (minus methane and ethane) to a level below 1.4 kg/hr and 2.8 Mg/yr (3.0... process vents the emissions of TOC (minus methane and ethane) by 95 percent by weight or more. (c) For...

  19. Statistical Process Control: Going to the Limit for Quality.

    ERIC Educational Resources Information Center

    Training, 1987

    1987-01-01

Defines the concept of statistical process control, a quality control method used especially in manufacturing, in which users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)
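The "standard levels that must be met" in statistical process control are typically three-sigma control limits around the process average. A minimal p-chart (fraction-nonconforming) sketch with hypothetical sample data, not drawn from the article:

```python
import math

def p_chart_limits(defect_counts, sample_size):
    """Three-sigma control limits for a p-chart.

    defect_counts: nonconforming units found in each fixed-size sample.
    Returns (LCL, center line p-bar, UCL) as fractions.
    """
    pbar = sum(defect_counts) / (len(defect_counts) * sample_size)
    sigma = math.sqrt(pbar * (1.0 - pbar) / sample_size)
    lcl = max(0.0, pbar - 3.0 * sigma)   # fraction cannot be negative
    ucl = min(1.0, pbar + 3.0 * sigma)
    return lcl, pbar, ucl

# Hypothetical data: nonconforming units in 10 samples of 200.
counts = [6, 4, 7, 5, 8, 3, 6, 5, 4, 7]
lcl, center, ucl = p_chart_limits(counts, 200)
flagged = [i for i, c in enumerate(counts) if not lcl <= c / 200 <= ucl]
print(f"center = {center:.4f}, limits = [{lcl:.4f}, {ucl:.4f}]")
print("samples out of control:", flagged)
```

A point falling outside the limits signals a special cause worth investigating, which is exactly the management responsibility the article emphasizes: operators plot the points, but management must act on the signals.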

  20. KSC's work flow assistant

    NASA Technical Reports Server (NTRS)

    Wilkinson, John; Johnson, Earl

    1991-01-01

    The work flow assistant (WFA) is an advanced technology project under the shuttle processing data management system (SPDMS) at Kennedy Space Center (KSC). It will be utilized for short range scheduling, controlling work flow on the floor, and providing near real-time status for all major space transportation systems (STS) work centers at KSC. It will increase personnel and STS safety and improve productivity through deeper active scheduling that includes tracking and correlation of STS and ground support equipment (GSE) configuration and work. It will also provide greater accessibility to this data. WFA defines a standards concept for scheduling data which permits both commercial off-the-shelf (COTS) scheduling tools and WFA developed applications to be reused. WFA will utilize industry standard languages and workstations to achieve a scalable, adaptable, and portable architecture which may be used at other sites.

  1. Scholarship: The Key to Creating Change through Outreach

    ERIC Educational Resources Information Center

    Bruns, Karen; Conklin, Nikki; Wright, Mindy; Hoover, David; Brace, Ben; Wise, Greg; Pendleton, Fariba; Dann, Michael; Martin, Michael; Childers, Jeri

    2002-01-01

    Outreach can and should exemplify the characteristics typical of any scholarly work if it is to create change in our communities and universities. Glassick, Huber, and Maeroff's insight on the standards for scholarly work are reflected in the processes commonly used to implement outreach. Boyer challenges us to think of scholarship as a communal…

  2. Science under Scrutiny

    ERIC Educational Resources Information Center

    Wright, Lynne

    2003-01-01

    Increasingly, coordinators are undertaking a scrutiny of work in order to check standards in their subjects. This is done frequently for English and mathematics, where annual targets for attainment in year 6 have to be set, but less so for science. Carrying out a scrutiny of work can be a daunting and time-consuming process. Faced with a pile of…

  3. Designing Class Activities to Meet Specific Core Training Competencies: A Developmental Approach

    ERIC Educational Resources Information Center

    Guth, Lorraine J.; McDonnell, Kelly A.

    2004-01-01

    This article presents a developmental model for designing and utilizing class activities to meet specific Association for Specialists in Group Work (ASGW) core training competencies for group workers. A review of the relevant literature about teaching group work and meeting core training standards is provided. The authors suggest a process by…

  4. 40 CFR 63.7530 - How do I demonstrate initial compliance with the emission limitations, fuel specifications and...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... compliance with the emission limitations, fuel specifications and work practice standards? 63.7530 Section 63... Institutional Boilers and Process Heaters Testing, Fuel Analyses, and Initial Compliance Requirements § 63.7530 How do I demonstrate initial compliance with the emission limitations, fuel specifications and work...

  5. 40 CFR 63.7530 - How do I demonstrate initial compliance with the emission limitations, fuel specifications and...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... compliance with the emission limitations, fuel specifications and work practice standards? 63.7530 Section 63... Institutional Boilers and Process Heaters Testing, Fuel Analyses, and Initial Compliance Requirements § 63.7530 How do I demonstrate initial compliance with the emission limitations, fuel specifications and work...

  6. Architecture for Survivable System Processing (ASSP)

    NASA Astrophysics Data System (ADS)

    Wood, Richard J.

    1991-11-01

The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity and interoperability of an open system architecture and are being developed to apply new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  7. Architecture for Survivable System Processing (ASSP)

    NASA Technical Reports Server (NTRS)

    Wood, Richard J.

    1991-01-01

The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity and interoperability of an open system architecture and are being developed to apply new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  8. Single-Run Single-Mask Inductively-Coupled-Plasma Reactive-Ion-Etching Process for Fabricating Suspended High-Aspect-Ratio Microstructures

    NASA Astrophysics Data System (ADS)

    Yang, Yao-Joe; Kuo, Wen-Cheng; Fan, Kuang-Chao

    2006-01-01

    In this work, we present a single-run single-mask (SRM) process for fabricating suspended high-aspect-ratio structures on standard silicon wafers using an inductively coupled plasma-reactive ion etching (ICP-RIE) etcher. This process eliminates extra fabrication steps which are required for structure release after trench etching. Released microstructures with 120 μm thickness are obtained by this process. The corresponding maximum aspect ratio of the trench is 28. The SRM process is an extended version of the standard process proposed by BOSCH GmbH (BOSCH process). The first step of the SRM process is a standard BOSCH process for trench etching, then a polymer layer is deposited on trench sidewalls as a protective layer for the subsequent structure-releasing step. The structure is released by dry isotropic etching after the polymer layer on the trench floor is removed. All the steps can be integrated into a single-run ICP process. Also, only one mask is required. Therefore, the process complexity and fabrication cost can be effectively reduced. Discussions on each SRM step and considerations for avoiding undesired etching of the silicon structures during the release process are also presented.
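    The release sequence described above amounts to an ordered single-run recipe. The sketch below encodes it as data, with an aspect-ratio helper; the step labels are paraphrased from the abstract, and the ~4.3 μm trench width is merely implied by the reported 120 μm depth and aspect ratio of 28, not a stated process parameter.

```python
# Ordered steps of the single-run single-mask (SRM) process, paraphrased
# from the abstract; labels are illustrative, not the authors' recipe values.
srm_recipe = [
    {"step": "BOSCH trench etch", "purpose": "define high-aspect-ratio trenches"},
    {"step": "sidewall polymer deposition", "purpose": "protect sidewalls during release"},
    {"step": "trench-floor polymer removal", "purpose": "expose silicon below the trench"},
    {"step": "isotropic dry etch", "purpose": "undercut and release the structure"},
]

def trench_aspect_ratio(depth_um: float, width_um: float) -> float:
    """Trench aspect ratio = depth / width."""
    return depth_um / width_um

# 120 um deep trenches at aspect ratio 28 imply ~4.3 um wide trenches.
implied_width_um = 120 / 28
```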

  9. Characterizing neurocognitive late effects in childhood leukemia survivors using a combination of neuropsychological and cognitive neuroscience measures.

    PubMed

    Van Der Plas, Ellen; Erdman, Lauren; Nieman, Brian J; Weksberg, Rosanna; Butcher, Darci T; O'Connor, Deborah L; Aufreiter, Susanne; Hitzler, Johann; Guger, Sharon L; Schachar, Russell J; Ito, Shinya; Spiegler, Brenda J

    2017-10-10

    Knowledge about cognitive late effects in survivors of childhood acute lymphoblastic leukemia (ALL) is largely based on standardized neuropsychological measures and parent reports. To examine whether cognitive neuroscience paradigms provided additional insights into neurocognitive and behavioral late effects in ALL survivors, we assessed cognition and behavior using a selection of cognitive neuroscience tasks and standardized measures probing domains previously demonstrated to be affected by chemotherapy. 130 ALL survivors and 158 control subjects, between 8 and 18 years old at time of testing, completed the n-back (working memory) and stop-signal (response inhibition) tasks. ALL survivors also completed standardized measures of intelligence (Wechsler Intelligence Scales [WISC-IV]), motor skills (Grooved Pegboard), math abilities (WIAT-III), and executive functions (Delis-Kaplan Executive Function System [D-KEFS]). Parents completed behavioral measures of executive functions (Behavior Rating Inventory of Executive Function [BRIEF]) and attention (Conners-3). ALL survivors exhibited deficiencies in working memory and response inhibition compared with controls. ALL survivors also exhibited deficits on WISC-IV working memory and processing speed, Grooved Pegboard, WIAT-III addition and subtraction fluency, and numerical operations, as well as D-KEFS number-letter switching. Parent reports suggested more attention deficits (Conners-3) and behavioral difficulties (BRIEF) in ALL survivors compared with referenced norms. Low correspondence between standardized and experimental measures of working memory and response inhibition was noted. The use of cognitive neuroscience paradigms complements our understanding of the cognitive deficits evident after treatment of ALL. These measures could further delineate cognitive processes involved in neurocognitive late effects, providing opportunities to explore their underlying mechanisms.
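    The stop-signal task above is typically scored by estimating the stop-signal reaction time (SSRT). A minimal sketch of the standard integration-method estimator follows; the RT and delay values are hypothetical, and this is the textbook estimator rather than necessarily the study's exact pipeline.

```python
import statistics

def ssrt_integration(go_rts, stop_signal_delays, p_respond_given_signal):
    """Integration-method SSRT: the go-RT at the quantile equal to
    P(respond | stop signal), minus the mean stop-signal delay (SSD)."""
    sorted_rts = sorted(go_rts)
    idx = min(int(p_respond_given_signal * len(sorted_rts)), len(sorted_rts) - 1)
    return sorted_rts[idx] - statistics.mean(stop_signal_delays)

go_rts = [420, 450, 480, 500, 510, 530, 560, 590, 620, 700]  # ms, hypothetical
ssds = [150, 200, 250, 300]                                  # ms, hypothetical
ssrt = ssrt_integration(go_rts, ssds, 0.5)  # 530 - 225 = 305 ms
```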

  10. 40 CFR Table 2 to Subpart Ffff of... - Emission Limits and Work Practice Standards for Batch Process Vents

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...: Miscellaneous Organic Chemical Manufacturing Pt. 63, Subpt. FFFF, Table 2 Table 2 to Subpart FFFF of Part 63... vents a. Reduce collective uncontrolled organic HAP emissions from the sum of all batch process vents... applicable. b. Reduce collective uncontrolled organic HAP emissions from the sum of all batch process vents...

  11. 40 CFR Table 2 to Subpart Ffff of... - Emission Limits and Work Practice Standards for Batch Process Vents

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: Miscellaneous Organic Chemical Manufacturing Pt. 63, Subpt. FFFF, Table 2 Table 2 to Subpart FFFF of Part 63... vents a. Reduce collective uncontrolled organic HAP emissions from the sum of all batch process vents... applicable. b. Reduce collective uncontrolled organic HAP emissions from the sum of all batch process vents...

  12. Lattice Gauge Theories Within and Beyond the Standard Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gelzer, Zechariah John

    The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving $B$ mesons. Consequently, $B$-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of $B$ mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of $B$ mesons that are mediated by both charged currents ($B \to \pi \ell \nu$) …

  13. The National United States Center Data Repository: Core essential interprofessional practice & education data enabling triple aim analytics.

    PubMed

    Pechacek, Judith; Shanedling, Janet; Lutfiyya, May Nawal; Brandt, Barbara F; Cerra, Frank B; Delaney, Connie White

    2015-01-01

    Understanding the impact that interprofessional education and collaborative practice (IPECP) might have on triple aim patient outcomes is of high interest to health care providers, educators, administrators, and policy makers. Before the work undertaken by the National Center for Interprofessional Practice and Education at the University of Minnesota, no standard mechanism to acquire and report outcome data related to interprofessional education and collaborative practice and its effect on triple aim outcomes existed. This article describes the development and adoption of the National Center Data Repository (NCDR) designed to capture data related to IPECP processes and outcomes to support analyses of the relationship of IPECP on the Triple Aim. The data collection methods, web-based survey design and implementation process are discussed. The implications of this informatics work to the field of IPECP and health care quality and safety include creating standardized capacity to describe interprofessional practice and measure outcomes connecting interprofessional education and collaborative practice to the triple aim within and across sites/settings, leveraging an accessible data collection process using user friendly web-based survey design to support large data scholarship and instrument testing, and establishing standardized data elements and variables that can potentially lead to enhancements to national/international information system and academic accreditation standards to further team-based, interprofessional, collaborative research in the field.

  14. The National United States Center Data Repository: Core essential interprofessional practice & education data enabling triple aim analytics

    PubMed Central

    Pechacek, Judith; Shanedling, Janet; Lutfiyya, May Nawal; Brandt, Barbara F.; Cerra, Frank B.; Delaney, Connie White

    2015-01-01

    Abstract Understanding the impact that interprofessional education and collaborative practice (IPECP) might have on triple aim patient outcomes is of high interest to health care providers, educators, administrators, and policy makers. Before the work undertaken by the National Center for Interprofessional Practice and Education at the University of Minnesota, no standard mechanism to acquire and report outcome data related to interprofessional education and collaborative practice and its effect on triple aim outcomes existed. This article describes the development and adoption of the National Center Data Repository (NCDR) designed to capture data related to IPECP processes and outcomes to support analyses of the relationship of IPECP on the Triple Aim. The data collection methods, web-based survey design and implementation process are discussed. The implications of this informatics work to the field of IPECP and health care quality and safety include creating standardized capacity to describe interprofessional practice and measure outcomes connecting interprofessional education and collaborative practice to the triple aim within and across sites/settings, leveraging an accessible data collection process using user friendly web-based survey design to support large data scholarship and instrument testing, and establishing standardized data elements and variables that can potentially lead to enhancements to national/international information system and academic accreditation standards to further team-based, interprofessional, collaborative research in the field. PMID:26652631

  15. A nonmonotonic dependence of standard rate constant on reorganization energy for heterogeneous electron transfer processes on electrode surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu Weilin; Li Songtao; Zhou Xiaochun

    2006-05-07

    In the present work a nonmonotonic dependence of the standard rate constant (k⁰) on the reorganization energy (λ) was discovered qualitatively from electron transfer (Marcus-Hush-Levich) theory for heterogeneous electron transfer processes on an electrode surface. It was found that the nonmonotonic dependence of k⁰ on λ is another consequence, besides the disappearance of the famous Marcus inverted region, of the continuum of electronic states in the electrode: with increasing λ, the states for both Process I and Process II electron transfer processes vary continuously from the nonadiabatic to the adiabatic regime, and the λ dependence of k⁰ for Process I is monotonic throughout, while for Process II on the electrode surface the λ dependence of k⁰ can show nonmonotonicity.
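    The λ dependence discussed above can be explored numerically with the closely related Marcus-Hush-Chidsey integral, which averages the Marcus activation factor over the Fermi-distributed continuum of electrode states. The dimensionless sketch below (unit prefactor, energies in kT units) illustrates that state-continuum integral only; it is not the paper's own calculation.

```python
import numpy as np

def mhc_rate(lam_kT: float, prefactor: float = 1.0) -> float:
    """Marcus-Hush-Chidsey standard rate constant at zero overpotential:
    integrate the Marcus Gaussian activation factor over the Fermi-Dirac
    occupation of electrode states (electron energy x in units of kT)."""
    x = np.linspace(-60.0, 60.0, 20001)
    integrand = np.exp(-(x - lam_kT) ** 2 / (4.0 * lam_kT)) / (1.0 + np.exp(x))
    dx = x[1] - x[0]
    return prefactor * float(np.sum(integrand) * dx)

# Standard rate constant versus reorganization energy (in kT units).
lams = [1.0, 5.0, 10.0, 20.0, 40.0]
rates = [mhc_rate(lam) for lam in lams]
```

At zero driving force this bare integral decays with λ, so nonmonotonicity of the kind the abstract reports would have to enter through additional λ-dependent factors such as the nonadiabatic-to-adiabatic crossover.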

  16. Demography of Principals' Work and School Improvement: Content Validity of Kentucky's Standards and Indicators for School Improvement (SISI)

    ERIC Educational Resources Information Center

    Lindle, Jane Clark; Stalion, Nancy; Young, Lu

    2005-01-01

    Kentucky's accountability system includes a school-processes audit known as Standards and Indicators for School Improvement (SISI), which is in a nascent stage of validation. Content validity methods include comparison to instruments measuring similar constructs as well as other techniques such as job analysis. This study used a two-phase process…

  17. Women's Socialization into Nontraditional Heavy Work: A Case Study.

    PubMed

    Atwood Sanders, M

    1994-01-01

    An increasing number of opportunities are available for women in nontraditional, blue-collar work. However, for many women the lack of job skills, poor work conditions, differential treatment, shift work, and heavy physical labor deter them from entering or remaining in such jobs. This case study attempted to uncover workplace issues, the socialization process, and adaptive strategies for a female steel worker. Her strategies for survival included hard work, high work standards, "courteous but cool" relationships with coworkers, "standing up" for herself, and good friends or support systems.

  18. Ergonomics, quality and continuous improvement--conceptual and empirical relationships in an industrial context.

    PubMed

    Eklund, J

    1997-10-01

    This paper reviews the literature comparing the fields of ergonomics and quality, mainly in an industrial context, including mutual influences, similarities and differences. Relationships between ergonomics and the factors work conditions, product design, ISO 9000, continuous improvements and TQM are reviewed in relation to the consequence, application, and process domains. The definitions of ergonomics and quality overlap substantially. Quality deficiencies, human errors and ergonomics problems often have the same cause, which in many cases can be traced to the design of work, workplace and environment, e.g., noise, light, postures, loads, pace and work content. In addition, the possibility of performing to a high standard at work is an important prerequisite for satisfaction and well-being. Contradictions between the two fields have been identified with regard to concepts such as standardization, reduction of variability and copying of best practice, requiring further research. The field of quality would gain by incorporating ergonomics knowledge, especially in the areas of work design and human capability, since these factors are decisive for human performance and therefore also for the performance of the systems involved. The field of ergonomics, on the other hand, would benefit from developing a stronger emphasis on methodologies and structures for improvement processes, including a clearer link with leadership and company strategies. Just as important is a further development of practicable participative ergonomics methods and tools for use at workplaces by the workers themselves, in order to integrate the top-down and the bottom-up processes and achieve better impact. Using participative processes for problem-solving and continuous improvement that focus on ergonomics and quality jointly has great potential for improving working conditions and quality results simultaneously, and for satisfying most of the interested parties.

  19. Managing Interoperability for GEOSS - A Report from the SIF

    NASA Astrophysics Data System (ADS)

    Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.

    2009-04-01

    The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively, the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.

  20. Taming Many-Parameter BSM Models with Bayesian Neural Networks

    NASA Astrophysics Data System (ADS)

    Kuchera, M. P.; Karbo, A.; Prosper, H. B.; Sanchez, A.; Taylor, J. Z.

    2017-09-01

    The search for physics Beyond the Standard Model (BSM) is a major focus of large-scale high energy physics experiments. One method is to look for specific deviations from the Standard Model that are predicted by BSM models. In cases where the model has a large number of free parameters, standard search methods become intractable due to computation time. This talk presents results using Bayesian Neural Networks, a supervised machine learning method, to enable the study of higher-dimensional models. The popular phenomenological Minimal Supersymmetric Standard Model was studied as an example of the feasibility and usefulness of this method. Graphics Processing Units (GPUs) are used to expedite the calculations. Cross-section predictions for 13 TeV proton collisions will be presented. My participation in the Conference Experience for Undergraduates (CEU) in 2004-2006 exposed me to the national and global significance of cutting-edge research. At the 2005 CEU, I presented work from the previous summer's SULI internship at Lawrence Berkeley Laboratory, where I learned to program while working on the Majorana Project. That work inspired me to follow a similar research path, which led me to my current work on computational methods applied to BSM physics.

  1. Lessons Learned from LIBS Calibration Development

    NASA Astrophysics Data System (ADS)

    Dyar, M. D.; Breves, E. A.; Lepore, K. H.; Boucher, T. F.; Giguere, S.

    2016-10-01

    More than two decades of work have been dedicated to development of robust standards, data processing, and calibration tools for LIBS. Here we summarize major considerations for improving accuracy of LIBS chemical analyses.

  2. How personal and standardized coordination impact implementation of integrated care.

    PubMed

    Benzer, Justin K; Cramer, Irene E; Burgess, James F; Mohr, David C; Sullivan, Jennifer L; Charns, Martin P

    2015-10-02

    Integrating health care across specialized work units has the potential to lower costs and increase quality and access to mental health care. However, a key challenge for healthcare managers is how to develop policies, procedures, and practices that coordinate care across specialized units. The purpose of this study was to identify how organizational factors impacted coordination, and how to facilitate implementation of integrated care. Semi-structured interviews were conducted in August 2009 with 30 clinic leaders and 35 frontline staff who were recruited from a convenience sample of 16 primary care and mental health clinics across eight medical centers. Data were drawn from a management evaluation of primary care-mental health integration in the US Department of Veterans Affairs. To protect informant confidentiality, the institutional review board did not allow quotations. Interviews identified antecedents of organizational coordination processes, and highlighted how these antecedents can impact the implementation of integrated care. Overall, implementing new workflow practices was reported to create conflicts with pre-existing standardized coordination processes. Personal coordination (i.e., interpersonal communication processes) between primary care leaders and staff was reported to be effective in overcoming these barriers both by working around standardized coordination barriers and by modifying standardized procedures. This study identifies challenges to integrated care that might be solved with attention to personal and standardized coordination. A key finding was that personal coordination both between primary care and mental health leaders and between frontline staff is important for resolving barriers related to integrated care implementation. Integrated care interventions can involve both new standardized procedures and adjustments to existing procedures. Aligning and integrating procedures between primary care and specialty care requires personal coordination amongst leaders. Interpersonal relationships should be strengthened between staff when personal connections are important for coordinating patient care across clinical settings.

  3. Peer Review: Promoting Efficient School District Operations

    ERIC Educational Resources Information Center

    Hale, Jason S.

    2010-01-01

    Many professions recognize the benefits of peer reviews to assess processes and operations because peers can more easily identify one another's inefficiencies and provide some kind of intervention. Generally, the goal of the peer review process is to verify whether the work satisfies the standards set by the industry. A number of states have begun…

  4. Quality Space and Launch Requirements, Addendum to AS9100C

    DTIC Science & Technology

    2015-05-08

    Excerpted front matter and text include: Section 8.9.1, Statistical Process Control (SPC); the acronyms SMC (Space and Missile Systems Center), SME (Subject Matter Expert), SOW (Statement of Work), SPC (Statistical Process Control), SPO (System Program Office), and SRP; and guidance noting that an out-of-control condition can occur without any individual data point exceeding the control limits, and that control limits are developed using standard statistical methods or other approved …
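    The excerpt's point that a process can be out of control even though no single point exceeds the control limits is illustrated below with a Shewhart-style sketch: 3-sigma limits plus a simple run rule. All data values are hypothetical.

```python
import statistics

def control_limits(samples):
    """3-sigma Shewhart control limits estimated from historical data."""
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mean - 3 * sigma, mean + 3 * sigma

def run_rule_violation(points, center, run=8):
    """Western-Electric-style run rule: `run` consecutive points on the same
    side of the center line signal an out-of-control condition, even when
    every individual point lies inside the control limits."""
    side = [p > center for p in points]
    for i in range(len(side) - run + 1):
        window = side[i:i + run]
        if all(window) or not any(window):
            return True
    return False

history = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]   # in-control baseline
center = statistics.mean(history)
lcl, ucl = control_limits(history)

# A drifted process: every point within the limits, yet all above center.
drift = [10.2, 10.3, 10.2, 10.25, 10.3, 10.2, 10.25, 10.3]
```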

  5. Short-Term Memory and Auditory Processing Disorders: Concurrent Validity and Clinical Diagnostic Markers

    ERIC Educational Resources Information Center

    Maerlender, Arthur

    2010-01-01

    Auditory processing disorders (APDs) are of interest to educators and clinicians, as they impact school functioning. Little work has been completed to demonstrate how children with APDs perform on clinical tests. In a series of studies, standard clinical (psychometric) tests from the Wechsler Intelligence Scale for Children, Fourth Edition…

  6. Connecting the Curriculum through National Science and Mathematics Standards: A Matrix Approach.

    ERIC Educational Resources Information Center

    Francis, Raymond

    This paper provides instructions for linking conceptual understandings using the Connections Matrix. The Connections Matrix and the process of connecting the curriculum works equally well with state-level learning objectives or outcomes. The intent of this process is to help educators see the overlap and connections between what teachers say they…

  7. Automated Procurement System (APS): Project management plan (DS-03), version 1.2

    NASA Technical Reports Server (NTRS)

    Murphy, Diane R.

    1994-01-01

    The National Aeronautics and Space Administration (NASA) Marshall Space Flight Center (MSFC) is implementing an Automated Procurement System (APS) to streamline its business activities that are used to procure goods and services. This Project Management Plan (PMP) is the governing document throughout the implementation process and is identified as the APS Project Management Plan (DS-03). At this point in time, the project plan includes the schedules and tasks necessary to proceed through implementation. Since the basis of APS is an existing COTS system, the implementation process is revised from the standard SDLC. The purpose of the PMP is to provide the framework for the implementation process. It discusses the roles and responsibilities of the NASA project staff, the functions to be performed by the APS Development Contractor (PAI), and the support required of the NASA computer support contractor (CSC). To be successful, these three organizations must work together as a team, working towards the goals established in this Project Plan. The Project Plan includes a description of the proposed system, describes the work to be done, establishes a schedule of deliverables, and discusses the major standards and procedures to be followed.

  8. Examples as an Instructional Tool in Mathematics and Science Classrooms: Teachers' Perceptions and Attitudes

    ERIC Educational Resources Information Center

    Huang, Xiaoxia; Cribbs, Jennifer

    2017-01-01

    This study examined mathematics and science teachers' perceptions and use of four types of examples, including typical textbook examples (standard worked examples) and erroneous worked examples in the written form as well as mastery modelling examples and peer modelling examples involving the verbalization of the problem-solving process. Data…

  9. Urban School Formation: Identity Work and Constructing an Origin Story

    ERIC Educational Resources Information Center

    Silver, Lauren J.

    2015-01-01

    In this article, I analyze the formation of Shearwater High School in St. Louis, Missouri and I explore the identity work involved in building the school. Diverging from standardization trends in urban education, the school came to fruition through a creative process, which was in-flux and supported by diverse communities in St. Louis. This…

  10. Exploring Expressive Vocabulary Variability in Two-Year-Olds: The Role of Working Memory

    ERIC Educational Resources Information Center

    Newbury, Jayne; Klee, Thomas; Stokes, Stephanie F.; Moran, Catherine

    2015-01-01

    Purpose: This study explored whether measures of working memory ability contribute to the wide variation in 2-year-olds' expressive vocabulary skills. Method: Seventy-nine children (aged 24-30 months) were assessed by using standardized tests of vocabulary and visual cognition, a processing speed measure, and behavioral measures of verbal working…

  11. Process Evaluation of an Integrated Health Promotion/Occupational Health Model in WellWorks-2

    ERIC Educational Resources Information Center

    Hunt, Mary Kay; Lederman, Ruth; Stoddard, Anne M.; LaMontagne, Anthony D.; McLellan, Deborah; Combe, Candace; Barbeau, Elizabeth; Sorensen, Glorian

    2005-01-01

    Disparities in chronic disease risk by occupation call for new approaches to health promotion. WellWorks-2 was a randomized, controlled study comparing the effectiveness of a health promotion/occupational health program (HP/OHS) with a standard intervention (HP). Interventions in both studies were based on the same theoretical foundations. Results…

  12. Opening the black box of ethics policy work: evaluating a covert practice.

    PubMed

    Frolic, Andrea; Drolet, Katherine; Bryanton, Kim; Caron, Carole; Cupido, Cynthia; Flaherty, Barb; Fung, Sylvia; McCall, Lori

    2012-01-01

    Hospital ethics committees (HECs) and ethicists generally describe themselves as engaged in four domains of practice: case consultation, research, education, and policy work. Despite the increasing attention to quality indicators, practice standards, and evaluation methods for the other domains, comparatively little is known or published about the policy work of HECs or ethicists. This article attempts to open the "black box" of this health care ethics practice by providing two detailed case examples of ethics policy reviews. We also describe the development and application of an evaluation strategy to assess the quality of ethics policy review work, and to enable continuous improvement of ethics policy review processes. Given the potential for policy work to impact entire patient populations and organizational systems, it is imperative that HECs and ethicists develop clearer roles, responsibilities, procedural standards, and evaluation methods to ensure the delivery of consistent, relevant, and high-quality ethics policy reviews.

  13. Exploring the Use of Enterprise Content Management Systems in Unification Types of Organizations

    NASA Astrophysics Data System (ADS)

    Izza Arshad, Noreen; Mehat, Mazlina; Ariff, Mohamed Imran Mohamed

    2014-03-01

    The aim of this paper is to better understand how highly standardized and integrated businesses, known as unification types of organizations, use Enterprise Content Management Systems (ECMS) to support their business processes. A multiple case study approach was used to study the ways two unification organizations use their ECMS in their daily work practices. Arising from these case studies are insights into the differing ways in which ECMS is used to support businesses. Based on the comparison of the two cases, this study proposes that unification organizations may use ECMS in four ways: (1) collaboration, (2) information sharing that supports a standardized process structure, (3) building custom workflows that support integrated and standardized processes, and (4) providing links and access to information systems. These findings may guide organizations that are highly standardized and integrated to achieve their intended ECMS use, to understand reasons for ECMS failures and underutilization, and to exploit technology investments.

  14. 40 CFR Table 1 to Subpart Ffff of... - Emission Limits and Work Practice Standards for Continuous Process Vents

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Pollutants: Miscellaneous Organic Chemical Manufacturing Pt. 63, Subpt. FFFF, Table 1 Table 1 to Subpart FFFF... continuous process vent a. Not applicable i. Reduce emissions of total organic HAP by ≥98 percent by weight or to an outlet process concentration ≤20 ppmv as organic HAP or TOC by venting emissions through a...

  15. Integrated flexible manufacturing program for manufacturing automation and rapid prototyping

    NASA Technical Reports Server (NTRS)

    Brooks, S. L.; Brown, C. W.; King, M. S.; Simons, W. R.; Zimmerman, J. J.

    1993-01-01

    The Kansas City Division of Allied Signal Inc., as part of the Integrated Flexible Manufacturing Program (IFMP), is developing an integrated manufacturing environment. Several systems are being developed to produce standards and automation tools for specific activities within the manufacturing environment. The Advanced Manufacturing Development System (AMDS) is concentrating on information standards (STEP) and product data transfer; the Expert Cut Planner system (XCUT) is concentrating on machining operation process planning standards and automation capabilities; the Advanced Numerical Control system (ANC) is concentrating on NC data preparation standards and NC data generation tools; the Inspection Planning and Programming Expert system (IPPEX) is concentrating on inspection process planning, coordinate measuring machine (CMM) inspection standards and CMM part program generation tools; and the Intelligent Scheduling and Planning System (ISAPS) is concentrating on planning and scheduling tools for a flexible manufacturing system environment. All of these projects are working together to address information exchange, standardization, and information sharing to support rapid prototyping in a Flexible Manufacturing System (FMS) environment.

  16. Overlap junctions for high coherence superconducting qubits

    NASA Astrophysics Data System (ADS)

    Wu, X.; Long, J. L.; Ku, H. S.; Lake, R. E.; Bal, M.; Pappas, D. P.

    2017-07-01

    Fabrication of sub-micron Josephson junctions is demonstrated using standard processing techniques for high-coherence, superconducting qubits. These junctions are made in two separate lithography steps with normal-angle evaporation. Most significantly, this work demonstrates that it is possible to achieve high coherence with junctions formed on aluminum surfaces cleaned in situ by Ar plasma before junction oxidation. This method eliminates the angle-dependent shadow masks typically used for small junctions. Therefore, this is conducive to the implementation of typical methods for improving margins and yield using conventional CMOS processing. The current method uses electron-beam lithography and an additive process to define the top and bottom electrodes. Extension of this work to optical lithography and subtractive processes is discussed.

  17. [Investigation on pattern of quality control for Chinese materia medica based on famous-region drug and bioassay--the work reference].

    PubMed

    Yan, Dan; Xiao, Xiaohe

    2011-05-01

Selection and standardization of the work reference are technical issues to be faced in the bioassay of Chinese materia medica. Taking the bioassay of Coptis chinensis as an example, the manufacturing process of the famous-region drug extraction was explained from the aspects of original identification, routine examination, component analysis and bioassay. The common technologies were extracted, and the selection and standardization procedures of the work reference for the bioassay of Chinese materia medica were drawn up, so as to provide technical support for constructing a new mode and method of quality control of Chinese materia medica based on famous-region drugs and bioassay.

  18. Gold-standard evaluation of a folksonomy-based ontology learning model

    NASA Astrophysics Data System (ADS)

    Djuana, E.

    2018-03-01

Folksonomy, as one result of the collaborative tagging process, has been acknowledged for its potential in improving the categorization and searching of web resources. However, folksonomy contains ambiguities such as synonymy and polysemy, as well as different abstractions, the generality problem. To maximize its potential, methods for associating folksonomy tags with semantics and structural relationships have been proposed, such as ontology learning. This paper evaluates our previous work in ontology learning according to the gold-standard evaluation approach, in comparison to a notable state-of-the-art work and several baselines. The results show that our method is comparable to the state-of-the-art work, which further validates our approach, previously validated using a task-based evaluation approach.

  19. DICOM static and dynamic representation through unified modeling language

    NASA Astrophysics Data System (ADS)

    Martinez-Martinez, Alfonso; Jimenez-Alaniz, Juan R.; Gonzalez-Marquez, A.; Chavez-Avelar, N.

    2004-04-01

The DICOM standard, as all standards do, specifies in a generic way the management of digital medical images and their related information in network and storage media environments. However, understanding the specifications for a particular implementation is not trivial. This work is therefore about understanding and modelling parts of the DICOM standard using object-oriented methodologies, as part of software development processes. This has offered different static and dynamic views, in accordance with the standard specifications, and the resultant models have been represented through the Unified Modelling Language (UML). The modelled parts are related to the network conformance claim: Network Communication Support for Message Exchange, Message Exchange, Information Object Definitions, Service Class Specifications, Data Structures and Encoding, and Data Dictionary. The resultant models have given a better understanding of the DICOM parts and have opened the possibility of creating a software library to develop DICOM-conformant PACS applications.

  20. Process Improvement in a Radically Changing Organization

    NASA Technical Reports Server (NTRS)

    Varga, Denise M.; Wilson, Barbara M.

    2007-01-01

This presentation describes how the NASA Glenn Research Center planned and implemented a process improvement effort in response to a radically changing environment. As a result of a presidential decision to redefine the Agency's mission, many ongoing projects were canceled and future workload would be awarded based on relevance to the Exploration Initiative. NASA imposed a new Procedural Requirements standard on all future software development, and the Center needed to redesign its processes from CMM Level 2 objectives to meet the new standard and position itself for CMMI. The intended audience for this presentation is systems/software developers and managers in a large, research-oriented organization that may need to respond to imposed standards while also pursuing CMMI Maturity Level goals. A set of internally developed tools will be presented, including an overall Process Improvement Action Item database, a formal inspection/peer review tool, a metrics collection spreadsheet, and other related technologies. The Center also found a need to charter Technical Working Groups (TWGs) to address particular Process Areas. In addition, a Marketing TWG was needed to communicate the process changes to the development community, including an innovative web site portal.

  1. NASA Standard for Models and Simulations: Credibility Assessment Scale

    NASA Technical Reports Server (NTRS)

    Babula, Maria; Bertch, William J.; Green, Lawrence L.; Hale, Joseph P.; Mosier, Gary E.; Steele, Martin J.; Woods, Jody

    2009-01-01

As one of its many responses to the 2003 Space Shuttle Columbia accident, NASA decided to develop a formal standard for models and simulations (M&S). Work commenced in May 2005. An interim version was issued in late 2006. This interim version underwent considerable revision following an extensive Agency-wide review in 2007, along with some additional revisions as a result of the review by the NASA Engineering Management Board (EMB) in the first half of 2008. Issuance of the revised, permanent version, hereafter referred to as the M&S Standard or just the Standard, occurred in July 2008. Bertch, Zang and Steele provided a summary review of the development process of this standard up through the start of the review by the EMB. A thorough recount of the entire development process, major issues, key decisions, and all review processes is available in Ref. v. This is the second of a pair of papers providing a summary of the final version of the Standard. Its focus is the Credibility Assessment Scale, a key feature of the Standard, including an example of its application to a real-world M&S problem for the James Webb Space Telescope. The companion paper summarizes the overall philosophy of the Standard and an overview of the requirements. Verbatim quotes from the Standard are integrated into the text of this paper and are indicated by quotation marks.

  2. European standardization effort: interworking the goal

    NASA Astrophysics Data System (ADS)

    Mattheus, Rudy A.

    1993-09-01

Within the European Standardization Committee (CEN), the technical committee responsible for standardization activities in medical informatics (CEN TC 251) has agreed upon the directions to follow in this field. They are described in the Directory of the European Standardization Requirements for Healthcare Informatics and Programme for the Development of Standards, adopted on 02-28-1991 by CEN/TC 251 and approved by CEN/BT. Top-down objectives describe the common framework and items like terminology and security, while more bottom-up oriented items cover fields like medical imaging and multimedia. The draft standard is described: the general framework model and object-oriented model, the interworking aspects, the relation to ISO standards, and the DICOM proposal. This paper also focuses on the boundaries of the standardization work, which also influence the standardization process.

  3. Quality assurance in military medical research and medical radiation accident management.

    PubMed

    Hotz, Mark E; Meineke, Viktor

    2012-08-01

    The provision of quality radiation-related medical diagnostic and therapeutic treatments cannot occur without the presence of robust quality assurance and standardization programs. Medical laboratory services are essential in patient treatment and must be able to meet the needs of all patients and the clinical personnel responsible for the medical care of these patients. Clinical personnel involved in patient care must embody the quality assurance process in daily work to ensure program sustainability. In conformance with the German Federal Government's concept for modern departmental research, the international standard ISO 9001, one of the relevant standards of the International Organization for Standardization (ISO), is applied in quality assurance in military medical research. By its holistic approach, this internationally accepted standard provides an excellent basis for establishing a modern quality management system in line with international standards. Furthermore, this standard can serve as a sound basis for the further development of an already established quality management system when additional standards shall apply, as for instance in reference laboratories or medical laboratories. Besides quality assurance, a military medical facility must manage additional risk events in the context of early recognition/detection of health risks of military personnel on deployment in order to be able to take appropriate preventive and protective measures; for instance, with medical radiation accident management. The international standard ISO 31000:2009 can serve as a guideline for establishing risk management. Clear organizational structures and defined work processes are required when individual laboratory units seek accreditation according to specific laboratory standards. 
Furthermore, international efforts to develop health laboratory standards that support sustainable quality assurance must be reinforced, as in the exchange and comparison of test results within the scope of external quality assurance, but also in the exchange of special diagnosis data among international research networks. In summary, the acknowledged standard for a quality management system to ensure quality assurance is the very generic standard ISO 9001. Health Phys. 103(2):221-225; 2012.

  4. 40 CFR Table 1 to Subpart Uuuu of... - Emission Limits and Work Practice Standards

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... cellulosic sponge operation i. reduce total uncontrolled sulfide emissions (reported as carbon disulfide) by... process unit” mean “cellulose food casing, rayon, cellulosic sponge, cellophane, or cellulose ether...

  5. Visual analysis of trash bin processing on garbage trucks in low resolution video

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Loibner, Gernot

    2015-03-01

We present a system for trash can detection and counting from a camera mounted on a garbage collection truck. A working prototype has been successfully implemented and tested with several hours of real-world video. The detection pipeline consists of HOG detectors for two trash can sizes, plus meanshift tracking and low-level image processing for the analysis of the garbage disposal process. Considering the harsh environment and unfavorable imaging conditions, the process already works well enough that very useful measurements can be extracted from the video data. The false positive/false negative rate of the full processing pipeline is about 5-6% in fully automatic operation. Video data of a full day (about 8 hrs) can be processed in about 30 minutes on a standard PC.
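The counting described in this abstract hinges on associating per-frame detections into tracks so that each bin is counted only once. A minimal, dependency-free sketch of that association step under assumed conventions (axis-aligned `(x, y, w, h)` boxes, greedy IoU matching, an illustrative 0.3 threshold; the paper's actual HOG/meanshift implementation is not reproduced here):

```python
def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax2, ay2 = a[0] + a[2], a[1] + a[3]
    bx2, by2 = b[0] + b[2], b[1] + b[3]
    iw = max(0, min(ax2, bx2) - max(a[0], b[0]))
    ih = max(0, min(ay2, by2) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union else 0.0

def count_tracks(frames, thresh=0.3):
    """Greedy frame-to-frame association: a detection that overlaps no
    box from the previous frame starts a new track, so the number of
    tracks started is the object count."""
    prev = []    # boxes seen in the previous frame
    count = 0
    for dets in frames:
        matched = set()
        for d in dets:
            best, best_iou = None, thresh
            for i, p in enumerate(prev):
                if i not in matched and iou(d, p) >= best_iou:
                    best, best_iou = i, iou(d, p)
            if best is None:
                count += 1          # no match: a new object enters
            else:
                matched.add(best)   # match: same object continues
        prev = dets
    return count
```

A detection that overlaps nothing in the previous frame opens a new track, so a bin seen across many consecutive frames still contributes one to the count.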

  6. Manufacturing Cell Therapies Using Engineered Biomaterials.

    PubMed

    Abdeen, Amr A; Saha, Krishanu

    2017-10-01

    Emerging manufacturing processes to generate regenerative advanced therapies can involve extensive genomic and/or epigenomic manipulation of autologous or allogeneic cells. These cell engineering processes need to be carefully controlled and standardized to maximize safety and efficacy in clinical trials. Engineered biomaterials with smart and tunable properties offer an intriguing tool to provide or deliver cues to retain stemness, direct differentiation, promote reprogramming, manipulate the genome, or select functional phenotypes. This review discusses the use of engineered biomaterials to control human cell manufacturing. Future work exploiting engineered biomaterials has the potential to generate manufacturing processes that produce standardized cells with well-defined critical quality attributes appropriate for clinical testing. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Working conditions and effects of ISO 9000 in six furniture-making companies: implementation and processes.

    PubMed

    Karltun, J; Axelsson, J; Eklund, J

    1998-08-01

What effects will the implementation of the quality standard ISO 9000 have on working conditions and competitive advantage? Which change process characteristics are most important for assuring improved working conditions and other desired effects? These are the main questions behind this study of six furniture-making companies which implemented ISO 9000 during the period 1991-1994. The results show that customer requirements were the dominant motive for implementing ISO 9000. Five of the six companies succeeded in gaining certification. The influence on working conditions was limited, but included better order and housekeeping, more positive attitudes towards discussing quality shortcomings, a few workplace improvements, work enrichment caused by additional tasks within the quality system, and a better understanding of external customer demands. Among the negative effects were new, apparently meaningless, tasks for individual workers as well as more stress and more physically strenuous work. The effects on the companies included a decrease in external quality-related costs and improved delivery precision. The study confirms that the design of the change process is important for efficient change, and identifies 'improvement methodology' as the most important process characteristic. Improved working conditions are enhanced by adding relevant strategic goals and by a participative implementation process.

  8. Proposed minimum reporting standards for chemical analysis Chemical Analysis Working Group (CAWG) Metabolomics Standards Initiative (MSI)

    PubMed Central

    Amberg, Alexander; Barrett, Dave; Beale, Michael H.; Beger, Richard; Daykin, Clare A.; Fan, Teresa W.-M.; Fiehn, Oliver; Goodacre, Royston; Griffin, Julian L.; Hankemeier, Thomas; Hardy, Nigel; Harnly, James; Higashi, Richard; Kopka, Joachim; Lane, Andrew N.; Lindon, John C.; Marriott, Philip; Nicholls, Andrew W.; Reily, Michael D.; Thaden, John J.; Viant, Mark R.

    2013-01-01

There is a general consensus that supports the need for standardized reporting of metadata or information describing large-scale metabolomics and other functional genomics data sets. Reporting of standard metadata provides a biological and empirical context for the data, facilitates experimental replication, and enables the re-interrogation and comparison of data by others. Accordingly, the Metabolomics Standards Initiative is building a general consensus concerning the minimum reporting standards for metabolomics experiments; the Chemical Analysis Working Group (CAWG) is a member of this community effort. This article proposes the minimum reporting standards related to the chemical analysis aspects of metabolomics experiments including: sample preparation, experimental analysis, quality control, metabolite identification, and data pre-processing. These minimum standards currently focus mostly upon mass spectrometry and nuclear magnetic resonance spectroscopy due to the popularity of these techniques in metabolomics. However, additional input concerning other techniques is welcomed and can be provided via the CAWG on-line discussion forum at http://msi-workgroups.sourceforge.net/ or http://Msi-workgroups-feedback@lists.sourceforge.net. Further, community input related to this document can also be provided via this electronic forum. PMID:24039616
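Minimum reporting standards of this kind are typically enforced in practice as a machine-checkable required-field list. A small illustrative sketch of such a completeness check (the section and field names below are hypothetical placeholders, not the CAWG's actual schema):

```python
# Illustrative required metadata fields for a metabolomics submission;
# the real CAWG minimum reporting standard defines its own field set.
REQUIRED = {
    "sample_preparation": ["extraction_method", "solvent"],
    "experimental_analysis": ["instrument", "acquisition_parameters"],
    "quality_control": ["qc_sample_type"],
    "metabolite_identification": ["identification_level"],
    "data_preprocessing": ["software", "normalization"],
}

def missing_fields(report):
    """Return 'section.field' names absent from a submitted report dict."""
    missing = []
    for section, fields in REQUIRED.items():
        present = report.get(section, {})
        for f in fields:
            if f not in present:
                missing.append(f"{section}.{f}")
    return missing
```

A submission pipeline would reject (or flag) any report for which `missing_fields` returns a non-empty list.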

  9. Drafting standards on cognitive accessibility: a global collaboration.

    PubMed

    Steel, Emily J; Janeslätt, Gunnel

    2017-05-01

The International Organization for Standardization (ISO) is working on accessibility of products to support people with cognitive impairment. Working Group 10, within the technical committee 173 (assistive products for persons with disability), was formed in 2014 to draft standards for assistive products that support people with cognitive impairment. This article explains the scope and purpose of the working group and the context for its formation, and describes the plans and process for drafting and publishing new international standards. The proposed suite of standards is presented, with examples from a draft standard on daily time management. It draws on international research evidence for the effectiveness of assistive products designed to support time management in people with cognitive impairment. Examples of assistive products and their key features are provided based on domains of time as defined in the International Classification of Functioning, Disability and Health for Children and Youth (ICF-CY). The proposed standards will provide design recommendations for features and functions that increase the accessibility of products used by people with cognitive impairment. They are intended to be used by designers, manufacturers, educators and service providers, to facilitate their commitment to inclusion and demonstrate their willingness to work with accessibility regulation. Implications for Rehabilitation New standards based on universal design (UD) principles can guide the design of more user-friendly assistive products for people with cognitive impairment. Greater usability of assistive products, whether mainstream or specially-designed, will make it easier for practitioners to find and introduce assistive solutions to individuals with cognitive impairment. Greater usability of assistive products for daily time management can decrease the need for user training and support and enable participation.

  10. An International Collaborative Standardizing a Comprehensive Patient-Centered Outcomes Measurement Set for Colorectal Cancer.

    PubMed

    Zerillo, Jessica A; Schouwenburg, Maartje G; van Bommel, Annelotte C M; Stowell, Caleb; Lippa, Jacob; Bauer, Donna; Berger, Ann M; Boland, Gilles; Borras, Josep M; Buss, Mary K; Cima, Robert; Van Cutsem, Eric; van Duyn, Eino B; Finlayson, Samuel R G; Hung-Chun Cheng, Skye; Langelotz, Corinna; Lloyd, John; Lynch, Andrew C; Mamon, Harvey J; McAllister, Pamela K; Minsky, Bruce D; Ngeow, Joanne; Abu Hassan, Muhammad R; Ryan, Kim; Shankaran, Veena; Upton, Melissa P; Zalcberg, John; van de Velde, Cornelis J; Tollenaar, Rob

    2017-05-01

    Global health systems are shifting toward value-based care in an effort to drive better outcomes in the setting of rising health care costs. This shift requires a common definition of value, starting with the outcomes that matter most to patients. The International Consortium for Health Outcomes Measurement (ICHOM), a nonprofit initiative, was formed to define standard sets of outcomes by medical condition. In this article, we report the efforts of ICHOM's working group in colorectal cancer. The working group was composed of multidisciplinary oncology specialists in medicine, surgery, radiation therapy, palliative care, nursing, and pathology, along with patient representatives. Through a modified Delphi process during 8 months (July 8, 2015 to February 29, 2016), ICHOM led the working group to a consensus on a final recommended standard set. The process was supported by a systematic PubMed literature review (1042 randomized clinical trials and guidelines from June 3, 2005, to June 3, 2015), a patient focus group (11 patients with early and metastatic colorectal cancer convened during a teleconference in August 2015), and a patient validation survey (among 276 patients with and survivors of colorectal cancer between October 15, 2015, and November 4, 2015). After consolidating findings of the literature review and focus group meeting, a list of 40 outcomes was presented to the WG and underwent voting. The final recommendation includes outcomes in the following categories: survival and disease control, disutility of care, degree of health, and quality of death. Selected case-mix factors were recommended to be collected at baseline to facilitate comparison of results across treatments and health care professionals. A standardized set of patient-centered outcome measures to inform value-based health care in colorectal cancer was developed. Pilot efforts are under way to measure the standard set among members of the working group.

  11. Aeromedical Disposition and Waiver Consideration for ISS Crewmembers

    NASA Technical Reports Server (NTRS)

    Taddeo, Terrance

    2012-01-01

Aeromedical certification of astronauts and cosmonauts traveling to the International Space Station is a multi-tiered process that involves standards agreed to by the partner agencies, and participation by the individual agency aeromedical boards and a multilateral space medicine board. Medical standards are updated continually by a multilateral working group. The boards operate by consensus and strive to achieve effective decision making through experience, medical judgment, medical evidence and risk modeling. The aim of the certification process is to minimize the risk to the ISS program of loss of mission objectives due to human health issues.

  12. 40 CFR Table 2 to Subpart Ffff of... - Emission Limits and Work Practice Standards for Batch Process Vents

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Pollutants: Miscellaneous Organic Chemical Manufacturing Pt. 63, Subpt. FFFF, Table 2 Table 2 to Subpart FFFF... Group 1 batch process vents a. Reduce collective uncontrolled organic HAP emissions from the sum of all... a flare); or Not applicable. b. Reduce collective uncontrolled organic HAP emissions from the sum of...

  13. An assessment model for quality management

    NASA Astrophysics Data System (ADS)

    Völcker, Chr.; Cass, A.; Dorling, A.; Zilioli, P.; Secchi, P.

    2002-07-01

SYNSPACE, together with InterSPICE and Alenia Spazio, is developing an assessment method to determine the capability of an organisation in the area of quality management. The method, sponsored by the European Space Agency (ESA), is called S9kS (SPiCE-9000 for SPACE). S9kS is based on ISO 9001:2000 with additions from the quality standards issued by the European Committee for Space Standardization (ECSS) and ISO 15504 (process assessment). The result is a reference model that supports the expansion of the generic process assessment framework provided by ISO 15504 to non-software areas. In order to be compliant with ISO 15504, requirements from ISO 9001, ECSS-Q-20 and Q-20-09 have been turned into process definitions in terms of purpose and outcomes, supported by a list of detailed indicators such as practices, work products and work product characteristics. In coordination with this project, the capability dimension of ISO 15504 has been revised to be consistent with ISO 9001. As the contributions from ISO 9001 and the space quality assurance standards are separable, the stripped-down version S9k offers organisations in all industries an assessment model based solely on ISO 9001, and is therefore of interest to any organisation that intends to improve its quality management system based on ISO 9001.

  14. Quality standards in a rheumatology Day-Care Hospital Unit. The proposal of the Spanish Society of Rheumatology Day Hospitals' Working Group.

    PubMed

    García-Vicuña, Rosario; Montoro, María; Egües Dubuc, César Antonio; Bustabad Reyes, Sagrario; Gómez-Centeno, Antonio; Muñoz-Fernández, Santiago; Pérez Pampín, Eva; Román Ivorra, Jose Andrés; Balsa, Alejandro; Loza, Estíbaliz

    2014-01-01

In recent years, Rheumatology Day-Care Hospital Units (DHU) have undergone extensive development. However, quality standards are poorly documented and mainly limited to structure items rather than covering broad and specific areas of this specialty. The objective was to develop specific quality standards for Rheumatology DHU. After a systematic review of the literature and related documents, a working group (WG) of 8 DHU-experienced rheumatologists developed an initial proposal of quality standards, under the supervision of an expert methodologist. A second round was held by the WG to review the initial proposal and to consider further suggestions. Once the content was agreed upon by consensus, a final report was prepared. In total, 17 structure standards, 25 process standards and 10 results standards were defined, with special emphasis on specific aspects of the Rheumatology DHU. The proposal includes: 1) essential standards, 2) excellent standards, 3) a Rheumatology DHU services portfolio and 4) performance criteria. The proposed quality standards are the basis for developing indicators and other management tools for Rheumatology DHU, thereby ensuring a patient-oriented practice based on both evidence and experience. Copyright © 2013 Elsevier España, S.L.U. All rights reserved.

  15. Employment training for disadvantaged or dependent populations.

    PubMed

    Stern, H

    1982-01-01

    The vocational rehabilitation process is viewed as having two dominant work-related components: the actual work-training experience and employability skills. The paper argues that both components are critical and must be integrated. The major role of the vocational rehabilitation agency is viewed as that of provider of employability (or job-seeking) skills programs. These programs consist of: (1) employability skills courses, (2) work performance demand standard setting, and (3) on-the-job rotational task schemes. Actual work skills can only be provided in the "real world" of work. Centralized work-training programs are viewed as creating inappropriate socialization and only moderately transferable skills.

  16. Examining the Learning Outcomes Included in the Turkish Science Curriculum in Terms of Science Process Skills: A Document Analysis with Standards-Based Assessment

    ERIC Educational Resources Information Center

    Duruk, Umit; Akgün, Abuzer; Dogan, Ceylan; Gülsuyu, Fatma

    2017-01-01

    Science process skills have provided a valuable chance for everyone to construct their own knowledge by means of scientific inquiry. If students are to understand what science is and how it actually works, then they should necessarily make use of their science process skills as well as scientific content knowledge compulsory to be learned in any…

  17. Process Definition and Modeling Guidebook. Version 01.00.02

    DTIC Science & Technology

    1992-12-01

material (and throughout the guidebook) process definition is considered to be the act of representing the important characteristics of a process in a...characterized by software standards and guidelines, software inspections and reviews, and more formalized testing (including test plans, test support tools...paper-based approach works well for training, examples, and possibly even small pilot projects and case studies. However, large projects will benefit from

  18. A whole process quality control system for energy measuring instruments inspection based on IOT technology

    NASA Astrophysics Data System (ADS)

    Yin, Bo; Liu, Li; Wang, Jiahan; Li, Xiran; Liu, Zhenbo; Li, Dewei; Wang, Jun; Liu, Lu; Wu, Jun; Xu, Tingting; Cui, He

    2017-10-01

Electric energy measurement is fundamental work: accurate measurement plays a vital role in protecting the economic interests of both parties to the power supply, and the standardized management of measurement laboratories at all levels directly affects the fairness of measurement. Currently, the management of metering laboratories generally uses one-dimensional bar codes as the recognition object, advances the testing process through manual management, and generates most test reports from manually entered data. There are many problems and potential risks in this process: data cannot be saved completely, the status of an inspection cannot be traced, the inspection process is not fully controllable, and so on. To meet the provincial metrology center's requirements for whole-process management of performance tests of power measuring appliances, we used large-capacity RF tags as the process management information medium and developed a general measurement experiment management system. We formulated a standardized full performance test process, improved the raw data recording mode of the experimental process, developed an automatic storehouse inventory device, and established a strict system for test sample transfer and storage. This ensures that all raw inspection data can be traced back, achieves full life-cycle control of the sample, and significantly improves the quality control level and the effectiveness of inspection work.

  19. Library Information-Processing System

    NASA Technical Reports Server (NTRS)

    1985-01-01

System works with Library of Congress MARC II format. System composed of subsystems that provide wide range of library information-processing capabilities. Format is American National Standards Institute (ANSI) format for machine-readable bibliographic data. Adaptable to any medium-to-large library.
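The MARC format mentioned here (ANSI Z39.2 / ISO 2709) is rigidly structured: a 24-character leader, a directory of 12-character entries (3-character tag, 4-character field length, 5-character starting position), then the field data. A minimal sketch of that directory mechanism, parsed against a synthetic record rather than real Library of Congress data:

```python
FT = "\x1e"  # field terminator
RT = "\x1d"  # record terminator

def parse_marc(record):
    """Parse a MARC/ISO 2709-style record into {tag: field_data}."""
    leader = record[:24]
    base = int(leader[12:17])              # base address of data
    directory = record[24:record.index(FT)]
    fields = {}
    for i in range(0, len(directory), 12):
        tag = directory[i:i + 3]
        length = int(directory[i + 3:i + 7])     # includes terminator
        start = int(directory[i + 7:i + 12])     # offset from base
        fields[tag] = record[base + start:base + start + length].rstrip(FT)
    return fields
```

For a record with one field tagged `245` containing `Title`, the directory entry is `245` + `0006` + `00000` and the leader's base address points just past the directory's terminator.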

  20. 40 CFR 63.7891 - How do I demonstrate initial compliance with the emissions limitations and work practice...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL... Standards for Hazardous Air Pollutants: Site Remediation Process Vents § 63.7891 How do I demonstrate...

  1. 40 CFR 63.7893 - How do I demonstrate continuous compliance with the emissions limitations and work practice...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....7893 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... Emission Standards for Hazardous Air Pollutants: Site Remediation Process Vents § 63.7893 How do I...

  2. 40 CFR 63.7891 - How do I demonstrate initial compliance with the emissions limitations and work practice...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....7891 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... Emission Standards for Hazardous Air Pollutants: Site Remediation Process Vents § 63.7891 How do I...

  3. 40 CFR 63.7893 - How do I demonstrate continuous compliance with the emissions limitations and work practice...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ....7893 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... Emission Standards for Hazardous Air Pollutants: Site Remediation Process Vents § 63.7893 How do I...

  4. 40 CFR 63.7893 - How do I demonstrate continuous compliance with the emissions limitations and work practice...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ....7893 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... Emission Standards for Hazardous Air Pollutants: Site Remediation Process Vents § 63.7893 How do I...

  5. 40 CFR 63.7891 - How do I demonstrate initial compliance with the emissions limitations and work practice...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ....7891 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... Emission Standards for Hazardous Air Pollutants: Site Remediation Process Vents § 63.7891 How do I...

  6. 40 CFR 63.7891 - How do I demonstrate initial compliance with the emissions limitations and work practice...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ....7891 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... Emission Standards for Hazardous Air Pollutants: Site Remediation Process Vents § 63.7891 How do I...

  7. 40 CFR 63.7893 - How do I demonstrate continuous compliance with the emissions limitations and work practice...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ....7893 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... Emission Standards for Hazardous Air Pollutants: Site Remediation Process Vents § 63.7893 How do I...

  8. Guidance on Systematic Planning Using the Data Quality Objectives Process, EPA QA/G-4

    EPA Pesticide Factsheets

    Provides a standard working tool for project managers and planners to develop DQO for determining the type, quantity, and quality of data needed to reach defensible decisions or make credible estimates.

  9. Implementation of the fugitive emissions system program: The OxyChem experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deshmukh, A.

An overview is provided of the Fugitive Emissions System (FES) that has been implemented at Occidental Chemical in conjunction with the computer-based maintenance system called PassPort®, developed by Indus Corporation. The goal of the PassPort® FES program has been to interface with facilities data, equipment information, work standards, and work orders. Along the way, several implementation hurdles had to be overcome before a monitoring and regulatory system could be standardized for the appropriate maintenance, process, and environmental groups. This presentation includes a step-by-step account of several case studies that developed during the implementation of the FES system.

  10. Forensic entomology: implementing quality assurance for expertise work.

    PubMed

    Gaudry, Emmanuel; Dourel, Laurent

    2013-09-01

    The Department of Forensic Entomology (Institut de Recherche Criminelle de la Gendarmerie Nationale, France) was accredited by the French Committee of Accreditation (Cofrac's Healthcare section) in October 2007 on the basis of NF EN ISO/CEI 17025 standard. It was the first accreditation in this specific field of forensic sciences in France and in Europe. The present paper introduces the accreditation process in forensic entomology (FE) through the experience of the Department of Forensic Entomology. Based upon the identification of necrophagous insects and the study of their biology, FE must, as any other expertise work in forensic sciences, demonstrate integrity and good working practice to satisfy both the courts and the scientific community. FE does not, strictly speaking, follow an analytical method. This could explain why, to make up for a lack of appropriate quality reference, a specific documentation was drafted and written by the staff of the Department of Forensic Entomology in order to define working methods complying with quality standards (testing methods). A quality assurance system is laborious to set up and maintain and can be perceived as complex, time-consuming and never-ending. However, a survey performed in 2011 revealed that the accreditation process in the frame of expertise work has led to new well-defined working habits, based on an effort at transparency. It also requires constant questioning and a proactive approach, both profitable for customers (magistrates, investigators) and analysts (forensic entomologists).

  11. Preparing skilled labor in industry through production-based curriculum approach in vocational high school

    NASA Astrophysics Data System (ADS)

    Yoto

    2017-09-01

Vocational high school (Sekolah Menengah Kejuruan / SMK) aims to prepare mid-level skilled laborers to work in industry and to create self-employment opportunities. For those reasons, the curriculum in SMK should be based on meeting the needs of industry and should prepare learners to master competence in accordance with the skills program of their choice. A production-based curriculum is one in which the learning process is designed together with the production process, or uses the production process as a learning medium. The primary intention of this approach is to introduce students to a real working environment, not merely simulations. In the production-based curriculum implementation model, students are directly involved in industry through industrial working practices, work in production units at school, and do practical work at school performing jobs as they are done in industry, using industry-standard machines.

  12. Defense Finance and Accounting Service Work on the Navy Defense Business Operations Fund FY 1995 Financial Statements

    DTIC Science & Technology

    1996-11-22

consolidation of financial statements, and for an automated process to transfer financial statement data from the Central Data Base to a... consolidation of financial statements. The Deputy Chief Financial Officer also indicated that the DFAS Cleveland Center approved a system change request...ently is developing Standard Operating Procedures to ensure consistency and standardization in the adjustment and consolidation of financial statements.

  13. The Development of a Dental Diagnostic Terminology

    PubMed Central

    Kalenderian, Elsbeth; Ramoni, Rachel L.; White, Joel M.; Schoonheim-Klein, Meta E.; Stark, Paul C.; Kimmes, Nicole S.; Zeller, Gregory G.; Willis, George P.; Walji, Muhammad F.

    2011-01-01

    There is no commonly accepted standardized terminology for oral diagnoses. The purpose of this article is to report the development of a standardized dental diagnostic terminology by a work group of dental faculty members. The work group developed guiding principles for decision making and adhered to principles of terminology development. The members used an iterative process to develop a terminology incorporating concepts represented in the Toronto/University of California, San Francisco/Creighton University and International Classification of Diseases (ICD)-9/10 codes and periodontal and endodontic diagnoses. Domain experts were consulted to develop a final list of diagnostic terms. A structure was developed, consisting of thirteen categories, seventy-eight subcategories, and 1,158 diagnostic terms, hierarchically organized and mappable to other terminologies and ontologies. Use of this standardized diagnostic terminology will reinforce the diagnosis-treatment link and will facilitate clinical research, quality assurance, and patient communication. Future work will focus on implementation and approaches to enhance the validity and reliability of diagnostic term utilization. PMID:21205730

  14. 78 FR 23289 - Public Review of Draft National Shoreline Data Content Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-18

...The Federal Geographic Data Committee (FGDC) is conducting a public review of the draft National Shoreline Data Content Standard. The FGDC has developed a draft National Shoreline Data Content Standard that provides a framework for shoreline data development, sharing of data, and shoreline data transformation and fusion. The FGDC Coastal and Marine Spatial Data Subcommittee, chaired by the National Oceanic and Atmospheric Administration (NOAA), sponsored development of the draft standard. The FGDC Coordination Group, comprised of representatives of Federal agencies, has approved releasing this draft standard for public review and comment. The draft National Shoreline Data Content Standard defines attributes or elements that are common for shoreline data development and provides suggested domains for the elements. The functional scope includes definition of data models, schemas, entities, relationships, definitions, and crosswalks to related standards. The draft National Shoreline Data Content Standard is intended to enhance the shoreline framework by providing technical guidance on shoreline semantics, data structures and their relationships to builders and users of shoreline data. The geographical scope of the draft standard comprises all shorelines of navigable waters within the United States and its territories. The primary intended users of the National Shoreline Data Content Standard are the mapping, shoreline engineering, coastal zone management, flood insurance, and natural resource management communities. The FGDC invites all stakeholders to comment on this draft standard to ensure that it meets their needs. The draft National Shoreline Data Content Standard may be downloaded from https://www.fgdc.gov/standards/projects/FGDC-standards-projects/shoreline-data-content/revisedDraftNationalShorelineDataContentStandard. Comments shall be submitted using the content template at http://www.fgdc.gov/standards/process/standards-directives/template.doc.
Instructions for completing the comment template are found in FGDC Standards Directive 2d, Standards Working Group Review Guidelines: Review Comment Template, http://www.fgdc.gov/standards/process/standards-directives/directive-2d-standards-working-group-review-guidelines-review-comment-template. Comments that concern specific issues/changes/additions may result in revisions to the National Shoreline Data Content Standard. Reviewers may obtain information about how comments were addressed upon request. After formal endorsement of the standard by the FGDC, the National Shoreline Data Content Standard and a summary analysis of the changes will be made available to the public on the FGDC Web site, www.fgdc.gov.

  15. Writing standard operating procedures (SOPs) for cryostorage protocols: using shoot meristem cryopreservation as an example.

    PubMed

    Harding, Keith; Benson, Erica E

    2015-01-01

    Standard operating procedures are a systematic way of making sure that biopreservation processes, tasks, protocols, and operations are correctly and consistently performed. They are the basic documents of biorepository quality management systems and are used in quality assurance, control, and improvement. Methodologies for constructing workflows and writing standard operating procedures and work instructions are described using a plant cryopreservation protocol as an example. This chapter is pertinent to other biopreservation sectors because how methods are written, interpreted, and implemented can affect the quality of storage outcomes.

  16. Standard operating procedures for clinical research departments.

    PubMed

    Kee, Ashley Nichole

    2011-01-01

    A set of standard operating procedures (SOPs) provides a clinical research department with clear roles, responsibilities, and processes to ensure compliance, accuracy, and timeliness of data. SOPs also serve as a standardized training program for new employees. A practice may have an employee that can assist in the development of SOPs. There are also consultants that specialize in working with a practice to develop and write practice-specific SOPs. Making SOPs a priority will save a practice time and money in the long run and make the research practice more attractive to corporate study sponsors.

  17. NDT standards from the perspective of the Department of Defense

    NASA Astrophysics Data System (ADS)

    Strauss, Bernard

    1992-09-01

The interactions of the DoD with non-Government Society (NGS) bodies in the area of nondestructive testing (NDT) are illustrated. The adoption process for NGS documents is outlined, including the criteria for adoption, what adoption means, and the advantages of DoD/NGS interaction. The tasks of the DoD's Standardization Program Plan for NDT are described, along with DoD's efforts on a Joint Army, Navy, Air Force (JANNAF) NDE Subcommittee and on an international standardization group (America, Britain, Canada, and Australia) called the Quadripartite Working Group on Proofing, Inspection, and Quality Assurance.

  18. A CMOS microdisplay with integrated controller utilizing improved silicon hot carrier luminescent light sources

    NASA Astrophysics Data System (ADS)

    Venter, Petrus J.; Alberts, Antonie C.; du Plessis, Monuko; Joubert, Trudi-Heleen; Goosen, Marius E.; Janse van Rensburg, Christo; Rademeyer, Pieter; Fauré, Nicolaas M.

    2013-03-01

Microdisplay technology, the miniaturization and integration of small displays for various applications, is predominantly based on OLED and LCoS technologies. Silicon light emission from hot carrier electroluminescence has been shown to emit light that is visibly perceptible without the aid of any additional intensification, although the electrical-to-optical conversion efficiency is not as high as that of the technologies mentioned above. For some applications, this drawback may be traded off against the major cost advantage and superior integration opportunities offered by CMOS microdisplays using integrated silicon light sources. This work introduces an improved version of our previously published microdisplay, making use of new efficiency-enhanced CMOS light-emitting structures and an increased display resolution. Silicon hot carrier luminescence is often created when reverse-biased pn-junctions enter the breakdown regime, where impact ionization results in carrier transport across the junction. Avalanche breakdown is typically unwanted in modern CMOS processes: design rules and process design are generally tailored to prevent breakdown, while the voltages associated with breakdown are too high to interact directly with the rest of the CMOS standard library. This work shows that it is possible to lower the operating voltage of CMOS light sources without compromising the optical output power. This results in more efficient light sources with improved interaction with other standard library components. This work proves that it is possible to create a reasonably high-resolution microdisplay while integrating the active matrix controller and drivers on the same integrated circuit die, without additional modifications, in a standard CMOS process.

  19. Ar+ and CuBr laser-assisted chemical bleaching of teeth: estimation of whiteness degree

    NASA Astrophysics Data System (ADS)

    Dimitrov, S.; Todorovska, Roumyana; Gizbrecht, Alexander I.; Raychev, L.; Petrov, Lyubomir P.

    2003-11-01

This work presents the adaptation of objective methods for color determination, aimed at developing techniques for estimating the degree of human tooth whiteness that are handy enough for common use in clinical practice. To validate and illustrate the techniques, standards of teeth colors were used, as well as model and naturally discolored human teeth treated with two bleaching chemical compositions, each activated by three light sources: Ar+ and CuBr lasers, and a standard halogen photopolymerization lamp. Typical reflection and fluorescence spectra of some samples are presented; the sample colors were estimated by standard computer processing in RGB and B coordinates. The results of the applied spectral and colorimetric techniques are in good agreement with those of the standard computer processing of the corresponding digital photographs, and comply with the visually estimated degree of tooth whiteness judged against the standard reference scale commonly used in aesthetic dentistry.
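
    A color scored in RGB coordinates can be reduced to a single whiteness measure for ranking samples against a reference scale. The sketch below uses Euclidean distance from the reference white (255, 255, 255) as an illustrative metric; this specific formula is an assumption for demonstration, not the colorimetric method used in the study.

    ```python
    import math

    def whiteness_score(r, g, b):
        """Illustrative whiteness measure: 100 for pure white (255, 255, 255),
        decreasing with Euclidean distance from white in RGB space.
        This metric is a stand-in, not the study's exact formula."""
        d = math.sqrt((255 - r) ** 2 + (255 - g) ** 2 + (255 - b) ** 2)
        d_max = math.sqrt(3.0) * 255.0  # distance from black to white
        return 100.0 * (1.0 - d / d_max)
    ```

    A bleached (lighter, less yellow) sample then scores higher than a discolored one, which allows before/after comparisons on a single numeric scale.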

  20. Comparison of data used for setting occupational exposure limits.

    PubMed

    Schenk, Linda

    2010-01-01

It has previously been shown that occupational exposure limits (OELs) for the same substance can vary significantly between different standard-setters. The work presented in this paper identifies the steps in the process of establishing an OEL and how variations in those processes could account for these differences. The study selects for further scrutiny substances whose OELs vary by a factor of 100, focussing on 45 documents concerning 14 substances from eight standard-setters. Several of the OELs studied were more than 20 years old and based on outdated knowledge. Furthermore, different standard-setters sometimes based their OELs on different sets of data, and data availability alone could not explain all differences in the data sets selected by standard-setters. While the interpretation of key studies did not differ significantly between standard-setters' documentations, the evaluations of the key studies' quality did. Also, differences concerning the critical effect coincided with differences in the level of OELs for half of the substances.

  1. BIM integration in education: A case study of the construction technology project Bolt Tower Dolni Vitkovice

    NASA Astrophysics Data System (ADS)

    Venkrbec, Vaclav; Bittnerova, Lucie

    2017-12-01

Building information modeling (BIM) can support effectiveness during many activities in the AEC industry, including the processing of a construction-technological project. This paper presents an approach to using a building information model in higher education, especially during work on a diploma thesis and its supervision. A diploma thesis is project-based work that aims to compile a construction-technological project for a selected construction. The paper describes the use of the input data and the work with it, and compares this process with standard input data such as printed design documentation. The effectiveness of using the building information model as input data for a construction-technological project is described in the conclusion.

  2. Standards for the Analysis and Processing of Surface-Water Data and Information Using Electronic Methods

    USGS Publications Warehouse

    Sauer, Vernon B.

    2002-01-01

Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U.S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system easily can monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods is an interactive process between the user and the electronic processing system, with much of this processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, control structures, and unsteady-flow models, such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stage-discharge, velocity-index, slope, rate-of-change in stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs.
Documentation, review, finalization, and publication of records are provided for with the electronic processing system, as well as archiving, quality assurance, and quality control.
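
    The rating curves and shifts central to the computation procedures above are commonly modeled as a power-law stage-discharge relation. A minimal Python sketch, with made-up coefficients rather than USGS-published values:

    ```python
    def rating_discharge(stage, e, c, b):
        """Power-law stage-discharge rating: Q = C * (h - e)^b,
        where h is gage height and e is the gage height of zero flow."""
        if stage <= e:
            return 0.0  # no flow at or below the point of zero flow
        return c * (stage - e) ** b

    def shifted_discharge(stage, shift, e, c, b):
        """A 'shift' adjusts the effective stage before the rating is applied,
        compensating for temporary changes in the channel control."""
        return rating_discharge(stage + shift, e, c, b)
    ```

    For example, with e=1.0, c=25.0, b=1.5, a gage height of 2.0 yields Q = 25.0 in the rating's discharge units; a negative shift lowers the effective stage and thus the computed discharge.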

  3. Sup wit Eval Ext?

    ERIC Educational Resources Information Center

    Patton, Michael Quinn

    2008-01-01

    Extension and evaluation share some similar challenges, including working with diverse stakeholders, parallel processes for focusing priorities, meeting common standards of excellence, and adapting to globalization, new technologies, and changing times. Evaluations of extension programs have helped clarify how change occurs, especially the…

  4. 17 CFR 256.154 - Materials and supplies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... attributable to work orders for service company property in process of construction shall be charged to account... cumulative average, first-in-first-out, or such other method of inventory accounting as conforms with generally accepted accounting standards consistently applied. ...

  5. Development of uniform standards for allowable lane closure : final report, September 2008.

    DOT National Transportation Integrated Search

    2008-09-01

    Procedures for determining allowable lane closure hours to perform maintenance, construction, resurfacing, regional permit and major access permit work on the state highway system were evaluated. The current process involves the collection of traffic...

  6. WaterML, an Information Standard for the Exchange of in-situ hydrological observations

    NASA Astrophysics Data System (ADS)

    Valentine, D.; Taylor, P.; Zaslavsky, I.

    2012-04-01

The WaterML 2.0 Standards Working Group (SWG), working within the Open Geospatial Consortium (OGC) and in cooperation with the joint OGC-World Meteorological Organization (WMO) Hydrology Domain Working Group (HDWG), has developed an open standard for the exchange of water observation data: WaterML 2.0. The focus of the standard is time-series data, commonly generated from in-situ style monitoring. This is high-value data for hydrological applications such as flood forecasting, environmental reporting and supporting hydrological infrastructure (e.g. dams, supply systems), which is commonly exchanged, but a lack of standards inhibits efficient reuse and automation. The process of developing WaterML required a harmonization analysis of existing standards to identify overlapping concepts and come to agreement on a harmonized definition. Generally the formats captured similar requirements, all with subtle differences, such as how time-series point metadata was handled. The in-progress standard WaterML 2.0 incorporates the semantics of the hydrologic information: location, procedure, and observations, and is implemented as an application schema of the Geography Markup Language version 3.2.1, making use of the OGC Observations & Measurements standards. WaterML 2.0 is designed as an extensible schema to allow encoding of data to be used in a variety of exchange scenarios. Example areas of usage are: exchange of data for operational hydrological monitoring programs; supporting operation of infrastructure (e.g. dams, supply systems); cross-border exchange of observational data; release of data for public dissemination; enhancing disaster management through data exchange; and exchange in support of national reporting. The first phase of WaterML 2.0 focused on structural definitions allowing for the transfer of time-series, with less work on harmonization of vocabulary items such as quality codes.
Vocabularies from various organizations tend to be specific and take time to come to agreement on. This will be continued in future work for the HDWG, along with extending the information model to cover additional types of hydrologic information: rating and gauging information, and water quality. Rating curves, gaugings and river cross sections are commonly exchanged in addition to standard time-series data to convey information relating to conversions such as river level to discharge. Members of the HDWG plan to initiate this work in early 2012. Water quality data is varied in the way it is processed and in the number of phenomena it measures. It will require specific extensions to the WaterML 2.0 model, most likely making use of the specimen types within O&M and extensive use of controlled vocabularies. Other future work involves different target encodings for the WaterML 2.0 conceptual model: encodings such as JSON, netCDF and CSV are optimized for particular needs, such as efficiency in encoding size and ease of parsing, but may not be capable of representing the full extent of the WaterML 2.0 information model. Certain encodings are best matched to particular needs, and the community has begun investigating when and how best to implement these.
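
    Concretely, a WaterML 2.0 time series is carried as the result of an O&M observation, encoded as time-value pairs. The fragment below is a simplified, illustrative sketch: identifiers and values are hypothetical, namespace declarations are abbreviated, and required observation metadata (observed property, procedure, feature of interest) is elided.

    ```xml
    <wml2:Collection xmlns:wml2="http://www.opengis.net/waterml/2.0">
      <wml2:observationMember>
        <om:OM_Observation gml:id="obs-1">
          <!-- observed property, procedure, and feature of interest omitted -->
          <om:result>
            <wml2:MeasurementTimeseries gml:id="ts-1">
              <wml2:point>
                <wml2:MeasurementTVP>
                  <wml2:time>2012-01-01T00:00:00Z</wml2:time>
                  <wml2:value>42.1</wml2:value>
                </wml2:MeasurementTVP>
              </wml2:point>
            </wml2:MeasurementTimeseries>
          </om:result>
        </om:OM_Observation>
      </wml2:observationMember>
    </wml2:Collection>
    ```

    Because the time-series structure is separated from vocabulary items such as quality codes, extensions like rating curves or water-quality specimens can be layered on without disturbing this core encoding.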

  7. Software And Systems Engineering Risk Management

    DTIC Science & Technology

    2010-04-01

...RSKM; 2004 COSO Enterprise RSKM Framework; 2006 ISO/IEC 16085 Risk Management Process; 2008 ISO/IEC 12207 Software Lifecycle Processes; 2009 ISO/IEC... Software And Systems Engineering Risk Management. John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning, Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group, Systems and Software.

  8. Medication room madness: calming the chaos.

    PubMed

    Conrad, Carole; Fields, Willa; McNamara, Tracey; Cone, Maryann; Atkins, Patricia

    2010-01-01

    Nurses work in stressful environments, encountering interruptions and distractions at almost every turn. The aim of this medication safety project was to improve the physical design and organizational layout of the medication room, reduce nurse interruptions and distractions, and create a standard medication process for enhanced patient safety and efficiency. This successful change improved the medication administration process, decreased medication errors, and enhanced nursing satisfaction.

  9. 40 CFR Table 1 to Subpart Uuuu of... - Emission Limits and Work Practice Standards

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... least once per month as specified in § 63.148(f)(2)). 12. heat exchanger system that cools process equipment or materials in the process unit each existing or new affected source monitor and repair the heat exchanger system according to § 63.104(a) through (e), except that references to “chemical manufacturing...

  10. 40 CFR Table 1 to Subpart Uuuu of... - Emission Limits and Work Practice Standards

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... least once per month as specified in § 63.148(f)(2)). 12. heat exchanger system that cools process equipment or materials in the process unit each existing or new affected source monitor and repair the heat exchanger system according to § 63.104(a) through (e), except that references to “chemical manufacturing...

  11. 40 CFR Table 1 to Subpart Uuuu of... - Emission Limits and Work Practice Standards

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... least once per month as specified in § 63.148(f)(2)). 12. heat exchanger system that cools process equipment or materials in the process unit each existing or new affected source monitor and repair the heat exchanger system according to § 63.104(a) through (e), except that references to “chemical manufacturing...

  12. 40 CFR Table 1 to Subpart Uuuu of... - Emission Limits and Work Practice Standards

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... least once per month as specified in § 63.148(f)(2)). 12. heat exchanger system that cools process equipment or materials in the process unit each existing or new affected source monitor and repair the heat exchanger system according to § 63.104(a) through (e), except that references to “chemical manufacturing...

  13. Implementation Cryptography Data Encryption Standard (DES) and Triple Data Encryption Standard (3DES) Method in Communication System Based Near Field Communication (NFC)

    NASA Astrophysics Data System (ADS)

    Ratnadewi; Pramono Adhie, Roy; Hutama, Yonatan; Saleh Ahmar, A.; Setiawan, M. I.

    2018-01-01

Cryptography is a method used to create secure communication by manipulating messages during transmission so that only the intended party can know their content. Among the most commonly used cryptographic methods for protecting messages, especially text, are the DES and 3DES methods. This research explains the DES and 3DES cryptographic methods and their use for securing data stored in smart cards working in an NFC-based communication system. It covers how the DES and 3DES methods carry out the protection of data, and the software engineering, through an application written in C++, used to realize and test the performance of DES and 3DES in encrypting data written to smart cards and decrypting data read from them. The execution time of writing data to and reading data from a smart card is faster with the DES cryptography method than with 3DES.
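
    The relationship between DES and 3DES can be sketched as the standard encrypt-decrypt-encrypt (EDE) construction. In the sketch below a toy XOR "cipher" stands in for the real DES block function, which is omitted for brevity; real code should use a vetted library implementation, and this is not the paper's C++ application.

    ```python
    def toy_encrypt(block: int, key: int) -> int:
        return block ^ key  # stand-in for DES encryption of a 64-bit block

    def toy_decrypt(block: int, key: int) -> int:
        return block ^ key  # stand-in for DES decryption

    def triple_encrypt(block, k1, k2, k3):
        # 3DES (EDE): encrypt with K1, decrypt with K2, encrypt with K3
        return toy_encrypt(toy_decrypt(toy_encrypt(block, k1), k2), k3)

    def triple_decrypt(block, k1, k2, k3):
        # Decryption reverses the sequence: decrypt K3, encrypt K2, decrypt K1
        return toy_decrypt(toy_encrypt(toy_decrypt(block, k3), k2), k1)
    ```

    The EDE form explains two observed properties: 3DES costs roughly three block-cipher invocations per block (hence the slower smart-card timings), and setting K1 = K2 = K3 collapses it to a single encryption, giving backward compatibility with plain DES.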

  14. [Quality Management System in Pathological Laboratory].

    PubMed

    Koyatsu, Junichi; Ueda, Yoshihiko

    2015-07-01

Even compared to other clinical laboratories, the pathological laboratory involves troublesome work, and many of its work processes are manual. Therefore, the introduction of systematic administrative management is necessary, and using existing standards such as ISO 15189 is a shortcut for this purpose. There is no standard specialized for the pathological laboratory, but the following points are considered particularly important for one. 1. Safety management of personnel and environmental conditions: comply with laws and regulations concerning the handling of hazardous materials. 2. Pre-examination processes: the laboratory shall have documented procedures for the proper collection and handling of primary samples, and developed and documented criteria for acceptance or rejection of samples are applied. 3. Examination processes: selection, verification, and validation of the examination procedures; devise a system that can constantly monitor the traceability of the sample. 4. Post-examination processes: storage, retention, and disposal of clinical samples. 5. Release of results: when examination results fall within established alert or critical intervals, immediately notify the physicians. The important point is to recognize the needs of the client and be aware that pathological diagnoses are always "the final diagnoses".

  15. Accelerated pharmacokinetic map determination for dynamic contrast enhanced MRI using frequency-domain based Tofts model.

    PubMed

    Vajuvalli, Nithin N; Nayak, Krupa N; Geethanath, Sairam

    2014-01-01

Dynamic Contrast Enhanced Magnetic Resonance Imaging (DCE-MRI) is widely used in the diagnosis of cancer and is also a promising tool for monitoring tumor response to treatment. The Tofts model has become a standard for the analysis of DCE-MRI. The curve fitting employed with the Tofts equation to obtain the pharmacokinetic (PK) parameters is time-consuming for high-resolution scans. The current work demonstrates a frequency-domain approach applied to the standard Tofts equation to speed up curve fitting and obtain the pharmacokinetic parameters. The results show that, with the frequency-domain approach, curve fitting is computationally more efficient than with the time-domain approach.
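
    The speed-up rests on the convolution theorem: the standard Tofts model expresses the tissue curve as a convolution of the arterial input function with an exponential impulse response, Ct(t) = Ktrans · Cp(t) ⊗ exp(-kep·t), and convolution in the time domain becomes element-wise multiplication in the frequency domain. A minimal pure-Python sketch of that equivalence (illustrative only; the paper's actual fitting implementation is not reproduced here):

    ```python
    import cmath

    def dft(x):
        """Naive discrete Fourier transform (FFT libraries do this faster)."""
        n = len(x)
        return [sum(x[k] * cmath.exp(-2j * cmath.pi * m * k / n)
                    for k in range(n)) for m in range(n)]

    def idft(X):
        """Inverse DFT; real part suffices for real-valued input signals."""
        n = len(X)
        return [sum(X[m] * cmath.exp(2j * cmath.pi * m * k / n)
                    for m in range(n)).real / n for k in range(n)]

    def circ_convolve_time(a, b):
        """Direct circular convolution in the time domain: O(n^2)."""
        n = len(a)
        return [sum(a[j] * b[(k - j) % n] for j in range(n)) for k in range(n)]

    def circ_convolve_freq(a, b):
        """Same result via the convolution theorem: multiply the spectra."""
        A, B = dft(a), dft(b)
        return idft([x * y for x, y in zip(A, B)])
    ```

    With an FFT in place of the naive DFT, the frequency-domain route drops the per-voxel convolution from O(n^2) to O(n log n), which is where the curve-fitting speed-up comes from.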

  16. Characterisation of Ductile Prepregs

    NASA Astrophysics Data System (ADS)

    Pinto, F.; White, A.; Meo, M.

    2013-04-01

This study is focused on the analysis of micro-perforated prepregs created from standard, off-the-shelf prepregs modified by a particular laser process to enhance ductility for better formability and drapability. Fibres are shortened through the use of laser cutting in a predetermined pattern intended to maintain alignment, and therefore mechanical properties, yet increase ductility at the working temperature. The increase in ductility allows the product to be more effectively optimised for specific forming techniques. Tensile tests were conducted on several specimens in order to understand the ductility enhancement offered by this process with different micro-perforation patterns over standard prepregs. Furthermore, the effects of forming temperature were also analysed to assess the applicability of this material to hot draping techniques and other heated processes.

  17. Standardization and quality management in next-generation sequencing.

    PubMed

    Endrullat, Christoph; Glökler, Jörn; Franke, Philipp; Frohme, Marcus

    2016-09-01

    DNA sequencing continues to evolve quickly, even after more than 30 years. Many new platforms have suddenly appeared, and formerly established systems have vanished almost as quickly. Since the establishment of next-generation sequencing devices, this progress has gained momentum due to the continually growing demand for higher throughput, lower costs and better data quality. As a consequence of this rapid development, standardized procedures and data formats as well as comprehensive quality management considerations are still scarce. Here, we list and summarize current standardization efforts and quality management initiatives from companies, organizations and societies in the form of published studies and ongoing projects. These comprise, on the one hand, quality documentation issues such as technical notes, accreditation checklists and guidelines for validation of sequencing workflows. On the other hand, general standard proposals and quality metrics are being developed and applied to the sequencing workflow steps, with the main focus on upstream processes. Finally, certain standard developments for downstream pipeline data handling, processing and storage are discussed in brief. These standardization approaches represent a first basis for continuing work toward prospectively implementing next-generation sequencing in important areas such as clinical diagnostics, where reliable results and fast processing are crucial. Additionally, these efforts will exert a decisive influence on the traceability and reproducibility of sequence data.

  18. The labor movement's role in gaining federal safety and health standards to protect America's workers.

    PubMed

    Weinstock, Deborah; Failey, Tara

    2014-11-01

    In the United States, unions, sometimes joined by worker advocacy groups (e.g., Public Citizen and the American Public Health Association), have played a critical role in strengthening worker safety and health protections. They have sought to improve standards that protect workers by participating in the rulemaking process, through written comments and involvement in hearings; lobbying decision-makers; petitioning the Department of Labor; and defending improved standards in court. Their efforts have culminated in more stringent exposure standards, access to information about the presence of potentially hazardous toxic chemicals, and improved access to personal protective equipment, further improving working conditions in the United States.

  19. Diversification and Challenges of Software Engineering Standards

    NASA Technical Reports Server (NTRS)

    Poon, Peter T.

    1994-01-01

    The author poses certain questions in this paper: 'In the future, should there be just one software engineering standards set? If so, how can we work towards that goal? What are the challenges of internationalizing standards?' Based on the author's personal view, the statement of his position is as follows: 'There should NOT be just one set of software engineering standards in the future. At the same time, there should NOT be the proliferation of standards, and the number of sets of standards should be kept to a minimum. It is important to understand the diversification of the areas which are spanned by the software engineering standards.' The author goes on to describe the diversification of processes, the diversification in the national and international character of standards organizations, the diversification of the professional organizations producing standards, the diversification of the types of businesses and industries, and the challenges of internationalizing standards.

  20. Digital approach to stabilizing optical frequency combs and beat notes of CW lasers

    NASA Astrophysics Data System (ADS)

    Čížek, Martin; Číp, Ondřej; Šmíd, Radek; Hrabina, Jan; Mikel, Břetislav; Lazar, Josef

    2013-10-01

    In cases when it is necessary to lock optical frequencies generated by an optical frequency comb to a precise radio frequency (RF) standard (GPS-disciplined oscillator, H-maser, etc.), the usual practice is to implement phase- and frequency-locked loops. Such a system takes the signal generated by the RF standard (usually 10 MHz or 100 MHz) as a reference and stabilizes the repetition and offset frequencies of the comb contained in the RF output of the f-2f interferometer. These control loops are usually built around analog electronic circuits processing the output signals from photodetectors. This results in transferring the stability of the standard from the RF to the optical frequency domain. The presented work describes a different approach based on digital signal processing and software-defined radio algorithms used for processing the f-2f and beat-note signals. Several applications of digital phase and frequency locks to an RF standard are demonstrated: the repetition rate (frep) and offset frequency (fceo) of the comb, and the frequency of the beat note between a CW laser source and a single component of the optical frequency comb spectrum.
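    The digital approach replaces the analog mixer-and-loop-filter chain with arithmetic on sampled data: a numerically controlled oscillator (NCO) is compared against the digitized beat note, the wrapped phase error drives a software PI loop, and the NCO frequency converges to the input frequency. A toy baseband simulation of such a software phase/frequency lock (sample rate, frequencies and gains are illustrative assumptions, not values from the paper):

```python
import numpy as np

fs = 1.0e6        # sample rate in Hz (illustrative)
f_in = 12_345.0   # beat-note frequency the loop must find
kp = 0.1          # proportional gain: rad of NCO phase per rad of error
ki = 100.0        # integral gain: Hz of NCO frequency per rad of error

phase_in = phase_nco = 0.0
f_nco = 12_000.0  # NCO starts 345 Hz off-frequency

for _ in range(20_000):
    phase_in += 2.0 * np.pi * f_in / fs
    # wrapped phase error between input and NCO, in (-pi, pi]
    err = np.angle(np.exp(1j * (phase_in - phase_nco)))
    f_nco += ki * err                                 # integral action on frequency
    phase_nco += 2.0 * np.pi * f_nco / fs + kp * err  # NCO advance + proportional kick

print(round(f_nco, 3))   # converges to ~12345.0 Hz
```

    With these gains both closed-loop poles are real and inside the unit circle, so the lock acquires without ringing; in a real instrument the same loop runs on samples of the f-2f or beat-note signal instead of a synthesized phase ramp.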

  1. Study and validation of tools interoperability in JPSEC

    NASA Astrophysics Data System (ADS)

    Conan, V.; Sadourny, Y.; Jean-Marie, K.; Chan, C.; Wee, S.; Apostolopoulos, J.

    2005-08-01

    Digital imagery is important in many applications today, and its security is likely to gain in importance in the near future. The emerging international standard ISO/IEC JPEG-2000 Security (JPSEC) is designed to provide security for digital imagery, and in particular digital imagery coded with the JPEG-2000 image coding standard. One of the primary goals of a standard is to ensure interoperability between creator and consumer implementations produced by different manufacturers. The JPSEC standard, similar to the popular JPEG and MPEG family of standards, specifies only the bitstream syntax and the receiver's processing, and not how the bitstream is created or the details of how it is consumed. This paper examines interoperability for the JPSEC standard, and presents an example JPSEC consumption process which can provide insights into the design of JPSEC consumers. Initial interoperability tests between different groups with independently created implementations of JPSEC creators and consumers have been successful in providing the JPSEC security services of confidentiality (via encryption) and authentication (via message authentication codes, or MACs). Further interoperability work is ongoing.
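    The authentication service can be illustrated generically: the creator computes a MAC over the protected codestream bytes, and any consumer with the shared key recomputes and compares it. A minimal sketch using HMAC-SHA-256 (the key, the placeholder bytes, and the function names are illustrative; the JPSEC syntax for carrying keys and MACs inside the codestream is not shown):

```python
import hmac
import hashlib

key = b"shared-secret-key"               # placeholder; JPSEC key management not shown
codestream = b"illustrative codestream"  # stands in for protected JPEG-2000 bytes

def create_mac(data: bytes) -> bytes:
    """Creator side: tag the protected bytes."""
    return hmac.new(key, data, hashlib.sha256).digest()

def verify(data: bytes, tag: bytes) -> bool:
    """Consumer side: recompute and compare in constant time."""
    return hmac.compare_digest(create_mac(data), tag)

tag = create_mac(codestream)
print(verify(codestream, tag))           # True: authentic
print(verify(codestream + b"x", tag))    # False: tampering detected
```

    Interoperability then reduces to both sides agreeing, via the bitstream syntax, on which bytes are covered, which algorithm is used, and where the tag is carried.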

  2. Comments on the Serial Homology and Homologues of Vertebral Lateral Projections in Crocodylia (Eusuchia).

    PubMed

    Gomes de Souza, Rafael

    2018-03-07

    The literature on crocodylian anatomy uses the term transverse process with an ambiguous meaning: it can denote all lateral expansions derived from the neural arch, in vertebrae from the cervical to the caudal series, or, in a more restrictive sense, be applied only to lumbar vertebrae. The lateral expansions of sacral and caudal vertebrae usually referred to as transverse processes have been shown to be fused ribs, adding further ambiguity to the term. Given this lack of a definition for the transverse process and other associated terms, the present work proposes a nomenclatural standardization, together with definitions and biological meaning, for vertebral rib-related structures. Vertebrae from 87 museum specimens of 22 species spanning all extant Crocodylia genera were studied. All vertebrae, except the cervical and first three dorsal vertebrae, exhibit transverse processes. The transverse process is more developed in dorsal and lumbar vertebrae than in sacral and caudal vertebrae, where it is suppressed by the fused ribs. The serial homology hypotheses proposed here can also be applied to other Crurotarsi and to saurischian dinosaur specimens. This standardization clarifies the understanding of serial homology among these homotypes and reduces ambiguity and misinterpretation in future comparisons. Anat Rec, 2018. © 2018 Wiley Periodicals, Inc.

  3. Productivity standards for histology laboratories.

    PubMed

    Buesa, René J

    2010-04-01

    The information from 221 US histology laboratories (histolabs) and 104 from 24 other countries with workloads from 600 to 116 000 cases per year was used to calculate productivity standards for 23 technical and 27 nontechnical tasks and for 4 types of work flow indicators. The sample includes 254 human, 40 forensic, and 31 veterinary pathology services. Statistical analyses demonstrate that most productivity standards are not different between services or worldwide. The total workload for the US human pathology histolabs averaged 26 061 cases per year, with 54% between 10 000 and less than 30 000. The total workload for 70% of the histolabs from other countries was less than 20 000, with an average of 15 226 cases per year. The fundamental manual technical tasks in the histolab and their productivity standards are as follows: grossing (14 cases per hour), cassetting (54 cassettes per hour), embedding (50 blocks per hour), and cutting (24 blocks per hour). All the other tasks, each with their own productivity standards, can be completed by auxiliary staff or using automatic instruments. Depending on the level of automation of the histolab, all the tasks derived from a workload of 25 cases will require 15.8 to 17.7 hours of work completed by 2.4 to 2.7 employees with 18% of their working time not directly dedicated to the production of diagnostic slides. This article explains how to extrapolate this productivity calculation for any workload and different levels of automation. The overall performance standard for all the tasks, including 8 hours for automated tissue processing, is 3.2 to 3.5 blocks per hour; and its best indicator is the value of the gross work flow productivity that is essentially dependent on how the work is organized. This article also includes productivity standards for forensic and veterinary histolabs, but the staffing benchmarks for histolabs will be the subject of a separate article. Copyright 2010 Elsevier Inc. All rights reserved.
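    The extrapolation the article describes reduces to simple arithmetic: convert the caseload to blocks, divide by the overall performance standard of 3.2 to 3.5 blocks per hour, and gross up staffing for the 18% of working time not spent producing diagnostic slides. A sketch of that calculation (the blocks-per-case figure is an assumption chosen to roughly reproduce the 25-case example, not a number stated in the abstract):

```python
def histolab_staffing(cases, blocks_per_case=2.2,
                      blocks_per_hour=3.2, nonproductive=0.18, shift_hours=8.0):
    """Estimate technical hours and head count from a daily caseload."""
    blocks = cases * blocks_per_case
    hours = blocks / blocks_per_hour                      # total productive hours
    employees = hours / (shift_hours * (1.0 - nonproductive))
    return hours, employees

# 25 cases at the high- and low-automation performance standards
for rate in (3.5, 3.2):
    h, e = histolab_staffing(25, blocks_per_hour=rate)
    print(f"{rate} blocks/h -> {h:.1f} h, {e:.1f} employees")
```

    With these assumed inputs the result lands in the same range the abstract reports (roughly 16-17 hours and about 2.4-2.7 employees for 25 cases); any laboratory would substitute its own blocks-per-case ratio and automation-dependent rate.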

  4. The Paris System for Reporting Urinary Cytology: The Quest to Develop a Standardized Terminology.

    PubMed

    Barkan, Güliz A; Wojcik, Eva M; Nayar, Ritu; Savic-Prince, Spasenija; Quek, Marcus L; Kurtycz, Daniel F I; Rosenthal, Dorothy L

    2016-07-01

    The main purpose of urine cytology is to detect high-grade urothelial carcinoma. With this principle in mind, The Paris System (TPS) Working Group, composed of cytopathologists, surgical pathologists, and urologists, has proposed and published a standardized reporting system that includes specific diagnostic categories and cytomorphologic criteria for the reliable diagnosis of high-grade urothelial carcinoma. This paper outlines the essential elements of TPS and the process that led to the formation and rationale of the reporting system. The TPS Working Group, organized at the 2013 International Congress of Cytology, conceived a standardized platform on which to base the cytologic interpretation of urine samples. The widespread dissemination of this approach to cytologic examination and reporting of urologic samples, and the scheme's universal acceptance by pathologists and urologists, is critical for its success. For urologists, understanding the diagnostic criteria, their clinical implications, and the limitations of TPS is essential if they are to utilize urine cytology and noninvasive ancillary tests in a thoughtful and practical manner. This is the first international, inclusive attempt at standardizing urinary cytology. The success of TPS will depend on the pathology and urology communities working collectively to improve this seminal paradigm shift and optimize its impact on patient care.

  5. Algorithms for Computation of Fundamental Properties of Seawater. Endorsed by Unesco/SCOR/ICES/IAPSO Joint Panel on Oceanographic Tables and Standards and SCOR Working Group 51. Unesco Technical Papers in Marine Science, No. 44.

    ERIC Educational Resources Information Center

    Fofonoff, N. P.; Millard, R. C., Jr.

    Algorithms for computation of fundamental properties of seawater, based on the Practical Salinity Scale (PSS-78) and the international equation of state for seawater (EOS-80), are compiled in the present report for implementing and standardizing computer programs for oceanographic data processing. Sample FORTRAN subprograms and tables are given…
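    As an example of the kind of algorithm the report standardizes, the freezing point of seawater as a function of practical salinity and pressure transcribes directly into code. The coefficients below follow the widely published UNESCO check formula (the report itself remains the authoritative source):

```python
def freezing_point(s, p=0.0):
    """Freezing point of seawater in degrees C, from practical salinity s
    (PSS-78) and gauge pressure p in decibars."""
    return (-0.0575 * s
            + 1.710523e-3 * s ** 1.5
            - 2.154996e-4 * s ** 2
            - 7.53e-4 * p)

print(round(freezing_point(35.0, 0.0), 4))   # standard check value, about -1.9223
```

    Publishing a check value alongside each subprogram, as the report does, is what lets independently written implementations be verified against one another.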

  6. Office Requirements in the Portland Standard Metropolitan Area

    ERIC Educational Resources Information Center

    Robertson, Leonard

    1975-01-01

    The findings evolved from questionnaires received from 204 firms pertaining to needed skills in spelling, typewriting, automatic typewriting, calculating machines, transcription machines, shorthand, and word processing, as well as to attributes of job attendance, cooperation, courtesy, telephone personality and appearance. (Author)

  7. 76 FR 14807 - Delegation of National Emission Standards for Hazardous Air Pollutants for Source Categories...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-18

    ... Reforming, and Sulfur Recovery Units. VVV Publicly Owned Treatment X X X Works. XXX Ferroalloys Production.... LLLLL Asphalt Roofing and X X X Processing. MMMMM Flexible Polyurethane X X X Foam Fabrication Operation...

  8. Envisioning Transformation in VA Mental Health Services Through Collaborative Site Visits.

    PubMed

    Kearney, Lisa K; Schaefer, Jeanne A; Dollar, Katherine M; Iwamasa, Gayle Y; Katz, Ira; Schmitz, Theresa; Schohn, Mary; Resnick, Sandra G

    2018-04-16

    This column reviews the unique contributions of multiple partners in establishing a standardized site visit process to promote quality improvement in mental health care at the Veterans Health Administration. Working as a team, leaders in policy and operations, staff of research centers, and regional- and facility-level mental health leaders developed a standardized protocol for evaluating mental health services at each site and using the data to help implement policy goals. The authors discuss the challenges experienced and lessons learned in this systemwide process and how this information can be part of a framework for improving mental health services on a national level.

  9. Design, fabrication and characterization of a poly-silicon PN junction

    NASA Astrophysics Data System (ADS)

    Tower, Jason D.

    This thesis details the design, fabrication, and characterization of a PN junction formed from p-type mono-crystalline silicon and n-type poly-crystalline silicon. The primary product of this project was a library of standard operating procedures (SOPs) for the fabrication of such devices, laying the foundations for future work and the development of a class in fabrication processes. The fabricated PN junction was characterized; in particular, its current-voltage relationship was measured and fit to models. This characterization was performed to determine whether or not the fabrication process could produce working PN junctions with acceptable operational parameters.
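    Fitting a measured current-voltage relationship typically means extracting the saturation current I_s and ideality factor n from the Shockley diode equation, I = I_s(e^(V/(nV_T)) - 1). A sketch on synthetic forward-bias data (the parameter values are illustrative assumptions; poly-Si junctions often show n well above 1):

```python
import numpy as np

VT = 0.02585   # thermal voltage at ~300 K, in volts

def diode_current(v, i_s, n):
    """Shockley diode equation."""
    return i_s * np.expm1(v / (n * VT))

# synthetic forward-bias measurements with 2% multiplicative noise
rng = np.random.default_rng(1)
v = np.linspace(0.25, 0.60, 40)
i = diode_current(v, 1e-12, 1.8) * (1.0 + 0.02 * rng.standard_normal(v.size))

# in the exponential region ln(I) is linear in V: slope = 1/(n*VT)
slope, intercept = np.polyfit(v, np.log(i), 1)
n_fit = 1.0 / (slope * VT)
i_s_fit = np.exp(intercept)
print(n_fit, i_s_fit)   # near the true n = 1.8 and I_s = 1e-12 A
```

    Fitting in log space keeps the low-current points from being swamped by the exponential growth at high bias, which is the usual practice when judging whether a fabricated junction's parameters are acceptable.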

  10. Toxicological Benchmarks for Screening Potential Contaminants of Concern for Effects on Soil and Litter Invertebrates and Heterotrophic Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Will, M.E.

    1994-01-01

    This report presents a standard method for deriving benchmarks for the purpose of "contaminant screening," performed by comparing measured ambient concentrations of chemicals against the benchmarks. The work was performed under Work Breakdown Structure 1.4.12.2.3.04.07.02 (Activity Data Sheet 8304). In addition, this report presents sets of data concerning the effects of chemicals in soil on invertebrates and soil microbial processes, benchmarks for chemicals potentially associated with United States Department of Energy sites, and literature describing the experiments from which data were drawn for benchmark derivation.

  11. Multifractal Properties of Process Control Variables

    NASA Astrophysics Data System (ADS)

    Domański, Paweł D.

    2017-06-01

    A control system is an inevitable element of any industrial installation, and its quality significantly affects overall process performance. Assessing whether a control system needs improvement requires relevant and constructive measures. Various methods exist: time-domain measures, Minimum Variance benchmarks, Gaussian and non-Gaussian statistical factors, and fractal and entropy indices. The majority of approaches use time series of control variables and are able to cover many phenomena, but process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that the signals originating from industrial installations have multifractal properties, and such an analysis may extend the standard approach to further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process, and helps to discover internal dependencies and human factors that are otherwise hardly detectable.
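    A standard way to probe a control-error time series for multifractal properties is multifractal detrended fluctuation analysis (MFDFA): a spread of the generalized Hurst exponent h(q) across moment orders q indicates multifractality, while a monofractal signal gives a flat h(q). A compact sketch (white noise serves as a monofractal reference; real loop data would replace it, and the scale/q choices are illustrative):

```python
import numpy as np

def mfdfa(x, scales, qs, order=1):
    """Generalized Hurst exponents h(q) via multifractal DFA."""
    y = np.cumsum(x - np.mean(x))                       # integrated profile
    fq = np.zeros((len(qs), len(scales)))
    for j, s in enumerate(scales):
        nseg = len(y) // s
        segs = y[:nseg * s].reshape(nseg, s)
        t = np.arange(s)
        # variance of each segment around its local polynomial trend
        var = np.array([np.mean((seg - np.polyval(np.polyfit(t, seg, order), t)) ** 2)
                        for seg in segs])
        for k, q in enumerate(qs):
            fq[k, j] = (np.exp(0.5 * np.mean(np.log(var))) if q == 0
                        else np.mean(var ** (q / 2.0)) ** (1.0 / q))
    # h(q) is the log-log slope of F_q(s) against scale s
    return np.array([np.polyfit(np.log(scales), np.log(fq[k]), 1)[0]
                     for k in range(len(qs))])

rng = np.random.default_rng(2)
h = mfdfa(rng.standard_normal(8192), scales=[16, 32, 64, 128, 256], qs=[-2, 0, 2])
print(h)   # all near 0.5 for uncorrelated noise; a wide spread signals multifractality
```

    Applied to control errors, a wide multifractal spectrum can flag intermittent behavior, such as operator interventions or regime changes, that variance-based indices average away.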

  12. Flexibility First, Then Standardize: A Strategy for Growing Inter-Departmental Systems.

    PubMed

    á Torkilsheyggi, Arnvør

    2015-01-01

    Any attempt to use IT to standardize work practices faces the challenge of finding a balance between standardization and flexibility. In implementing electronic whiteboards with the goal of standardizing inter-departmental practices, a hospital in Denmark chose to follow the strategy of "flexibility first, then standardization." To improve the local grounding of the system, they first focused on flexibility by configuring the whiteboards to support intra-departmental practices. Subsequently, they focused on standardization by using the whiteboards to negotiate standardization of inter-departmental practices. This paper investigates the chosen strategy and finds: that super users on many wards managed to configure the whiteboard to support intra-departmental practices; that initiatives to standardize inter-departmental practices improved coordination of certain processes; and that the chosen strategy posed a challenge for finding the right time and manner to shift the balance from flexibility to standardization.

  13. Workflow-Based Software Development Environment

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).

  14. Recommendations for improved and coherent acquisition and processing of backscatter data from seafloor-mapping sonars

    NASA Astrophysics Data System (ADS)

    Lamarche, Geoffroy; Lurton, Xavier

    2018-06-01

    Multibeam echosounders are becoming widespread for the purposes of seafloor bathymetry mapping, but the acquisition and the use of seafloor backscatter measurements, acquired simultaneously with the bathymetric data, are still insufficiently understood, controlled and standardized. This presents an obstacle to well-accepted, standardized analysis and application by end users. The Marine Geological and Biological Habitat Mapping group (Geohab.org) has long recognized the need for better coherence and common agreement on acquisition, processing and interpretation of seafloor backscatter data, and established the Backscatter Working Group (BSWG) in May 2013. This paper presents an overview of this initiative, the mandate, structure and program of the working group, and a synopsis of the BSWG Guidelines and Recommendations to date. The paper includes (1) an overview of the current status in sensors and techniques available in seafloor backscatter data from multibeam sonars; (2) the presentation of the BSWG structure and results; (3) recommendations to operators, end-users, sonar manufacturers, and software developers using sonar backscatter for seafloor-mapping applications, for best practice methods and approaches for data acquisition and processing; and (4) a discussion on the development needs for future systems and data processing. We propose for the first time a nomenclature of backscatter processing levels that affords a means to accurately and efficiently describe the data processing status, and to facilitate comparisons of final products from various origins.

  15. Altered standards of care during an influenza pandemic: identifying ethical, legal, and practical principles to guide decision making.

    PubMed

    Levin, Donna; Cadigan, Rebecca Orfaly; Biddinger, Paul D; Condon, Suzanne; Koh, Howard K

    2009-12-01

    Although widespread support favors prospective planning for altered standards of care during mass casualty events, the literature includes few, if any, accounts of groups that have formally addressed the overarching policy considerations at the state level. We describe the planning process undertaken by public health officials in the Commonwealth of Massachusetts, along with community and academic partners, to explore the issues surrounding altered standards of care in the event of pandemic influenza. Throughout 2006, the Massachusetts Department of Public Health and the Harvard School of Public Health Center for Public Health Preparedness jointly convened a working group comprising ethicists, lawyers, clinicians, and local and state public health officials to consider issues such as allocation of antiviral medications, prioritization of critical care, and state seizure of private assets. Community stakeholders were also engaged in the process through facilitated discussion of case scenarios focused on these and other issues. The objective of this initiative was to establish a framework and some fundamental principles that would subsequently guide the process of establishing specific altered standards of care protocols. The group collectively identified 4 goals and 7 principles to guide the equitable allocation of limited resources and establishment of altered standards of care protocols. Reviewing and analyzing this process to date may serve as a resource for other states.

  16. The 5S lean method as a tool of industrial management performances

    NASA Astrophysics Data System (ADS)

    Filip, F. C.; Marascu-Klein, V.

    2015-11-01

    Implementation of the 5S (seiri, seiton, seiso, seiketsu, and shitsuke) method is examined through a study whose purpose is to analyse and improve management performance: highlighting problems and working mistakes, reducing waste (stationary and waiting times), increasing flow transparency, organizing storage areas through proper marking and labelling, establishing standard work (everyone knows exactly where the necessary things are), and providing safe and ergonomic working places (protecting the health of all employees). The study describes the impact of the 5S lean method implemented for storing, cleaning, developing and sustaining a production working place in an industrial company. In order to check and sustain the 5S process, an internal audit, called a “5S audit”, is needed. Implementing the 5S methodology requires organization and safety of the working process, proper marking and labelling of the working place, and audits to establish the work in progress and to maintain the improved activities.

  17. Speech Recognition as a Transcription Aid: A Randomized Comparison With Standard Transcription

    PubMed Central

    Mohr, David N.; Turner, David W.; Pond, Gregory R.; Kamath, Joseph S.; De Vos, Cathy B.; Carpenter, Paul C.

    2003-01-01

    Objective. Speech recognition promises to reduce information entry costs for clinical information systems. It is most likely to be accepted across an organization if physicians can dictate without concerning themselves with real-time recognition and editing; assistants can then edit and process the computer-generated document. Our objective was to evaluate the use of speech-recognition technology in a randomized controlled trial using our institutional infrastructure. Design. Clinical note dictations from physicians in two specialty divisions were randomized to either a standard transcription process or a speech-recognition process. Secretaries and transcriptionists also were assigned randomly to each of these processes. Measurements. The duration of each dictation was measured. The amount of time spent processing a dictation to yield a finished document also was measured. Secretarial and transcriptionist productivity, defined as hours of secretary work per minute of dictation processed, was determined for speech recognition and standard transcription. Results. Secretaries in the endocrinology division were 87.3% (confidence interval, 83.3%, 92.3%) as productive with the speech-recognition technology as implemented in this study as they were using standard transcription. Psychiatry transcriptionists and secretaries were similarly less productive. Author, secretary, and type of clinical note were significant (p < 0.05) predictors of productivity. Conclusion. When implemented in an organization with an existing document-processing infrastructure (which included training and interfaces of the speech-recognition editor with the existing document entry application), speech recognition did not improve the productivity of secretaries or transcriptionists. PMID:12509359

  18. The MP (Materialization Pattern) Model for Representing Math Educational Standards

    NASA Astrophysics Data System (ADS)

    Choi, Namyoun; Song, Il-Yeol; An, Yuan

    Representing natural languages with UML has been an important research issue for various reasons. Little work has been done on modeling imperative-mood sentences, which are the sentence structure of math educational standard statements. In this paper, we propose the MP (Materialization Pattern) model that captures the semantics of English sentences used in math educational standards. The MP model is based on the Reed-Kellogg sentence diagrams and creates MP schemas with the UML notation. The MP model explicitly represents the semantics of the sentences by extracting math concepts and the cognitive process of math concepts from math educational standard statements, and simplifies modeling. The MP model is also developed to be used for aligning math educational standard statements via schema matching.

  19. To Trace a Law: Use of Library Materials in a Classroom Exercise.

    ERIC Educational Resources Information Center

    Shannon, Michael Owen

    A legislative history shows the various stages in the process of enacting laws. In order to follow the legislative process the student is asked to select a topic of interest and research the various steps as a bill becomes law. Then he is given descriptions of some current and standard reference works which will help him find information on the…

  20. 40 CFR Table 1 to Subpart Ffff of... - Emission Limits and Work Practice Standards for Continuous Process Vents

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...: Miscellaneous Organic Chemical Manufacturing Pt. 63, Subpt. FFFF, Table 1 Table 1 to Subpart FFFF of Part 63... vent a. Not applicable i. Reduce emissions of total organic HAP by ≥98 percent by weight or to an outlet process concentration ≤20 ppmv as organic HAP or TOC by venting emissions through a closed-vent...

  1. 40 CFR Table 1 to Subpart Ffff of... - Emission Limits and Work Practice Standards for Continuous Process Vents

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: Miscellaneous Organic Chemical Manufacturing Pt. 63, Subpt. FFFF, Table 1 Table 1 to Subpart FFFF of Part 63... vent a. Not applicable i. Reduce emissions of total organic HAP by ≥98 percent by weight or to an outlet process concentration ≤20 ppmv as organic HAP or TOC by venting emissions through a closed-vent...

  2. The Collaborating States Initiative (CSI) Recommended Process for Developing State Policies and Guidelines to Support Social and Emotional Learning

    ERIC Educational Resources Information Center

    Dusenbury, Linda; Yoder, Nick

    2017-01-01

    In the work of the authors with states over the years, they have observed that most follow a similar process when they develop policies or guidelines to support statewide implementation of social and emotional learning (SEL), such as establishing learning goals or standards for student social and emotional competencies, or providing guidance to…

  3. Standardization as an Arena for Open Innovation

    NASA Astrophysics Data System (ADS)

    Grøtnes, Endre

    This paper argues that anticipatory standardization can be viewed as an arena for open innovation and shows this through two cases from mobile telecommunication standardization. One case is the Android initiative by Google and the Open Handset Alliance, while the second case is the general standardization work of the Open Mobile Alliance. The paper shows how anticipatory standardization intentionally uses inbound and outbound streams of research and intellectual property to create new innovations. This is at the heart of the open innovation model. The standardization activities use both pooling of R&D and the distribution of freely available toolkits to create products and architectures that can be utilized by the participants and third parties to leverage their innovation. The paper shows that the technology being standardized needs to have a systemic nature to be part of an open innovation process.

  4. Neuronal Adaptive Mechanisms Underlying Intelligent Information Processing

    DTIC Science & Technology

    1981-05-01

    Physiol. 134: 451-470, 1956. Freud, S., unpublished, untitled paper (1895), subsequently published in Freud, Sigmund - Standard Edition...of the Complete Psychological Works of Freud, edited by J. Strachey. New York, Macmillan 1: 281-287, 1964. Gallagher, J.P. and Shinnick-Gallagher

  5. Exploring Expressive Vocabulary Variability in Two-Year-Olds: The Role of Working Memory.

    PubMed

    Newbury, Jayne; Klee, Thomas; Stokes, Stephanie F; Moran, Catherine

    2015-12-01

    This study explored whether measures of working memory ability contribute to the wide variation in 2-year-olds' expressive vocabulary skills. Seventy-nine children (aged 24-30 months) were assessed by using standardized tests of vocabulary and visual cognition, a processing speed measure, and behavioral measures of verbal working memory and phonological short-term memory. Strong correlations were observed between phonological short-term memory, verbal working memory, and expressive vocabulary. Speed of spoken word recognition showed a moderate significant correlation with expressive vocabulary. In a multivariate regression model for expressive vocabulary, the most powerful predictor was a measure of phonological short-term memory (accounting for 66% unique variance), followed by verbal working memory (6%), sex (2%), and age (1%). Processing speed did not add significant unique variance. These findings confirm previous research positing a strong role for phonological short-term memory in early expressive vocabulary acquisition. They also extend previous research in two ways. First, a unique association between verbal working memory and expressive vocabulary in 2-year-olds was observed. Second, processing speed was not a unique predictor of variance in expressive vocabulary when included alongside measures of working memory.

  6. Effects of working memory contents and perceptual load on distractor processing: When a response-related distractor is held in working memory.

    PubMed

    Koshino, Hideya

    2017-01-01

    Working memory and attention are closely related. Recent research has shown that working memory can be viewed as internally directed attention. Working memory can affect attention in at least two ways. One is the effect of working memory load on attention, and the other is the effect of working memory contents on attention. In the present study, an interaction between working memory contents and perceptual load in distractor processing was investigated. Participants performed a perceptual load task in a standard form in one condition (Single task). In the other condition, a response-related distractor was maintained in working memory, rather than presented in the same stimulus display as a target (Dual task). For the Dual task condition, a significant compatibility effect was found under high perceptual load; however, there was no compatibility effect under low perceptual load. These results suggest that the way the contents of working memory affect visual search depends on perceptual load. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. FDA recognition of consensus standards in the premarket notification program.

    PubMed

    Marlowe, D E; Phillips, P J

    1998-01-01

    "The FDA has long advocated the use of standards as a significant contributor to safety and effectiveness of medical devices," Center for Devices and Radiological Health's (CDRH) Donald E. Marlowe and Philip J. Phillips note in the following article, highlighting the latest U.S. Food and Drug Administration (FDA) plans for use of standards. They note that the important role standards can play has been reinforced as part of FDA reengineering efforts undertaken in anticipation of an increased regulatory workload and declining agency resources. As part of its restructuring effort, the FDA announced last spring that it would recognize some consensus standards for use in the device approval process. Under the new 510(k) paradigm--the FDA's proposal to streamline premarket review, which includes incorporating the use of standards in the review of 510(k) submissions--the FDA will accept proof of compliance with standards as evidence of device safety and effectiveness. Manufacturers may submit declarations of conformity to standards instead of following the traditional review process. The International Electrotechnical Commission (IEC) 60601 series of consensus standards, which deals with many safety issues common to electrical medical devices, was the first to be chosen for regulatory review. Other standards developed by nationally or internationally recognized standards development organizations, such as AAMI, may be eligible for use to meet review requirements. In the following article, Marlowe and Phillips describe the FDA's plans to use standards in the device review process. The article focuses on the use of standards for medical device review, the development of the standards recognition process for reviewing devices, and the anticipated benefits of using standards to review devices. One important development has been the recent implementation of the FDA Modernization Act of 1997 (FDAMA), which advocates the use of standards in the device review process.
In implementing the legislation, the FDA published in the Federal Register a list of standards to which manufacturers may declare conformity. Visit AAMI's Web site at www.aami.org/news/fda.standards for a copy of the list and for information on nominating other standards for official recognition by the agency. The FDA expects that use of standards will benefit the agency and manufacturers alike: "We estimate that in time, reliance on declarations of conformity to recognized standards could save the agency considerable resources while reducing the regulatory obstacles to entry to domestic and international markets," state the authors.

  8. Approach to data exchange: the spatial data transfer standard

    USGS Publications Warehouse

    Rossmeissl, Hedy J.; Rugg, Robert D.

    1992-01-01

    Significant developments have taken place in the disciplines of cartography and geography in recent years with the advent of computer hardware and software that manipulate and process digital cartographic and geographic data more efficiently. The availability of inexpensive and powerful hardware and software systems offers the capability of displaying and analyzing spatial data to a growing number of users. As a result, developing and using existing digital cartographic databases are becoming very popular. However, the absence of uniform standards for the transfer of digital spatial data is hindering the exchange of data and increasing costs. Several agencies of the U.S. government and the academic community have been working hard over the last few years to develop a spatial data transfer standard that includes definitions of standard terminology, a spatial data transfer specification, recommendations on reporting digital cartographic data quality, and standard topographic and hydrographic entity terms and definitions. This proposed standard was published in the January 1988 issue of The American Cartographer. Efforts to test and promote this standard were coordinated by the U.S. Geological Survey. A Technical Review Board was appointed with representatives from the U.S. government, the private sector, and the academic community to complete the standard for submittal to the National Institute of Standards and Technology for approval as a Federal Information Processing Standard. The proposed standard was submitted in February 1992 for final approval.

  9. Compressed Sensing Quantum Process Tomography for Superconducting Quantum Gates

    NASA Astrophysics Data System (ADS)

    Rodionov, Andrey

    An important challenge in quantum information science and quantum computing is the experimental realization of high-fidelity quantum operations on multi-qubit systems. Quantum process tomography (QPT) is a procedure devised to fully characterize a quantum operation. We first present the results of the estimation of the process matrix for superconducting multi-qubit quantum gates using the full data set employing various methods: linear inversion, maximum likelihood, and least-squares. To alleviate the problem of exponential resource scaling needed to characterize a multi-qubit system, we next investigate a compressed sensing (CS) method for QPT of two-qubit and three-qubit quantum gates. Using experimental data for two-qubit controlled-Z gates, taken with both Xmon and superconducting phase qubits, we obtain estimates for the process matrices with reasonably high fidelities compared to full QPT, despite using significantly reduced sets of initial states and measurement configurations. We show that the CS method still works when the amount of data is so small that the standard QPT would have an underdetermined system of equations. We also apply the CS method to the analysis of the three-qubit Toffoli gate with simulated noise, and similarly show that the method works well for a substantially reduced set of data. For the CS calculations we use two different bases in which the process matrix is approximately sparse (the Pauli-error basis and the singular value decomposition basis), and show that the resulting estimates of the process matrices match with reasonably high fidelity. For both two-qubit and three-qubit gates, we characterize the quantum process by its process matrix and average state fidelity, as well as by the corresponding standard deviation defined via the variation of the state fidelity for different initial states. We calculate the standard deviation of the average state fidelity both analytically and numerically, using a Monte Carlo method. 
Overall, we show that CS QPT offers a significant reduction in the needed amount of experimental data for two-qubit and three-qubit quantum gates.
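The core idea the CS method exploits can be illustrated generically: when the unknown (here, the process matrix in a suitable basis) is sparse, an l1-regularized least-squares fit can recover it from far fewer equations than unknowns, even where standard QPT would face an underdetermined system. The toy sketch below uses iterative soft thresholding (ISTA) on a simulated sparse-recovery problem; it illustrates the principle only and is not the authors' reconstruction algorithm:

```python
import numpy as np

def ista(A, b, lam=0.01, iters=2000):
    """Iterative soft thresholding: minimize 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - (A.T @ (A @ x - b)) / L      # gradient step on the quadratic term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

# Toy underdetermined problem: 40 measurements of a 100-dim vector, 5 nonzeros.
rng = np.random.default_rng(1)
A = rng.normal(size=(40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[rng.choice(100, 5, replace=False)] = rng.normal(size=5) + 3.0
b = A @ x_true

x_hat = ista(A, b)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))  # small relative error
```

With 40 equations for 100 unknowns the linear system alone is underdetermined, yet the sparsity prior makes recovery possible, which is the same leverage CS QPT applies to sparse process matrices.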

  10. Influence of Short Austenitization Treatments on the Mechanical Properties of Low-Alloy Steels for Hot Forming Applications

    NASA Astrophysics Data System (ADS)

    Holzweissig, Martin Joachim; Lackmann, Jan; Konrad, Stefan; Schaper, Mirko; Niendorf, Thomas

    2015-07-01

    The current work demonstrates an improvement of the mechanical properties of tool-quenched low-alloy steel achieved by employing extremely short austenitization durations using a press-heating arrangement. Specifically, the influence of different austenitization treatments—involving austenitization durations ranging from 3 to 15 seconds—on the mechanical properties of low-alloy steel was examined in comparison to an industrial standard furnace process. A thorough set of experiments was conducted to investigate the role of different austenitization durations and temperatures on the resulting mechanical properties such as hardness, bending angle, tensile strength, and strain at fracture. The most important finding is that the hardness, bending angle, and tensile strength increase with shortened austenitization durations. Furthermore, the ductility of the steels exhibits almost no difference between the short austenitization durations and the standard furnace process. The enhancement of the mechanical properties imposed by the short heat treatments investigated is related to a refinement of microstructural features as compared to the standard furnace process.

  11. An Integrated Social, Economic, and Ecologic Conceptual (ISEEC) framework for considering rangeland sustainability

    USGS Publications Warehouse

    Fox, W.E.; McCollum, D.W.; Mitchell, J.E.; Swanson, L.E.; Kreuter, U.P.; Tanaka, J.A.; Evans, G.R.; Theodore, Heintz H.; Breckenridge, R.P.; Geissler, P.H.

    2009-01-01

    Currently, there is no standard method to assess the complex systems in rangeland ecosystems. Decision makers need baselines to create a common language of current rangeland conditions and standards for continued rangeland assessment. The Sustainable Rangeland Roundtable (SRR), a group of private and public organizations and agencies, has created a forum to discuss rangeland sustainability and assessment. The SRR has worked to integrate social, economic, and ecological disciplines related to rangelands and has identified a standard set of indicators that can be used to assess rangeland sustainability. As part of this process, SRR has developed a two-tiered conceptual framework from a systems perspective to study the validity of indicators and the relationships among them. The first tier categorizes rangeland characteristics into four states. The second tier defines processes affecting these states through time and space. The framework clearly shows that the processes affect and are affected by each other. © 2009 Taylor & Francis Group, LLC.

  12. WHO Expert Committee on Specifications for Pharmaceutical Preparations.

    PubMed

    2009-01-01

    The Expert Committee on Specifications for Pharmaceutical Preparations works towards clear, independent and practical standards and guidelines for the quality assurance of medicines. Standards are developed by the Committee through worldwide consultation and an international consensus-building process. The following new standards and guidelines were adopted and recommended for use: the current list of available International Chemical Reference Substances and International Infrared Reference Spectra; guidelines on stability testing of active pharmaceutical ingredients and finished pharmaceutical products; procedure for prequalification of pharmaceutical products; and the procedure for assessing the acceptability, in principle, of active pharmaceutical ingredients for use in pharmaceutical products.

  13. Aerogels Insulate Missions and Consumer Products

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Aspen Aerogels, of Northborough, Massachusetts, worked with NASA through an SBIR contract with Kennedy Space Center to develop a robust, flexible form of aerogel for cryogenic insulation for space shuttle launch applications. The company has since used the same manufacturing process developed under the SBIR award to expand its product offerings into more commercial realms, making the naturally fragile aerogel available for the first time in a form that can be handled and installed just like standard insulation.

  14. The Effects of Spatial Disorientation on Working Memory and Mathematical Processing

    DTIC Science & Technology

    2010-12-17

    Regulation 70-25 and USAMRMC Regulation 70-25 on use of volunteers in research. ...DeHart & Davis, 2002). Previc and Ercoline (2004) cite three reasons as to why formation flight in DVE is particularly conducive to SD: (1) pilots... "within normal limits" as defined by the American National Standards Institute. A power analysis indicated a total of 36 participants were needed for the study

  15. Computational Fluid Dynamics Assessment Associated with Transcatheter Heart Valve Prostheses: A Position Paper of the ISO Working Group.

    PubMed

    Wei, Zhenglun Alan; Sonntag, Simon Johannes; Toma, Milan; Singh-Gryzbon, Shelly; Sun, Wei

    2018-04-19

    The governing international standard for the development of prosthetic heart valves is International Organization for Standardization (ISO) 5840. This standard requires the assessment of the thrombus potential of transcatheter heart valve substitutes using an integrated thrombus evaluation. Besides experimental flow field assessment and ex vivo flow testing, computational fluid dynamics is a critical component of this integrated approach. This position paper is intended to provide and discuss best practices for the setup of a computational model, numerical solving, post-processing, data evaluation and reporting, as it relates to transcatheter heart valve substitutes. This paper is not intended to be a review of current computational technology; instead, it represents the position of the ISO working group consisting of experts from academia and industry with regards to considerations for computational fluid dynamic assessment of transcatheter heart valve substitutes.

  16. Leveraging electronic healthcare record standards and semantic web technologies for the identification of patient cohorts

    PubMed Central

    Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto; Marcos, Mar; Legaz-García, María del Carmen; Moner, David; Torres-Sospedra, Joaquín; Esteban-Gil, Angel; Martínez-Salvador, Begoña; Robles, Montserrat

    2013-01-01

    Background: The secondary use of electronic healthcare records (EHRs) often requires the identification of patient cohorts. In this context, an important problem is the heterogeneity of clinical data sources, which can be overcome with the combined use of standardized information models, virtual health records, and semantic technologies, since each of them contributes to solving aspects related to the semantic interoperability of EHR data. Objective: To develop methods allowing for a direct use of EHR data for the identification of patient cohorts leveraging current EHR standards and semantic web technologies. Materials and methods: We propose to take advantage of the best features of working with EHR standards and ontologies. Our proposal is based on our previous results and experience working with both technological infrastructures. Our main principle is to perform each activity at the abstraction level with the most appropriate technology available. This means that part of the processing will be performed using archetypes (ie, data level) and the rest using ontologies (ie, knowledge level). Our approach will start working with EHR data in proprietary format, which will be first normalized and elaborated using EHR standards and then transformed into a semantic representation, which will be exploited by automated reasoning. Results: We have applied our approach to protocols for colorectal cancer screening. The results comprise the archetypes, ontologies, and datasets developed for the standardization and semantic analysis of EHR data. Anonymized real data have been used and the patients have been successfully classified by the risk of developing colorectal cancer. Conclusions: This work provides new insights into how archetypes and ontologies can be effectively combined for EHR-driven phenotyping. The methodological approach can be applied to other problems provided that suitable archetypes, ontologies, and classification rules can be designed. PMID:23934950

  17. Leveraging electronic healthcare record standards and semantic web technologies for the identification of patient cohorts.

    PubMed

    Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto; Marcos, Mar; Legaz-García, María del Carmen; Moner, David; Torres-Sospedra, Joaquín; Esteban-Gil, Angel; Martínez-Salvador, Begoña; Robles, Montserrat

    2013-12-01

    The secondary use of electronic healthcare records (EHRs) often requires the identification of patient cohorts. In this context, an important problem is the heterogeneity of clinical data sources, which can be overcome with the combined use of standardized information models, virtual health records, and semantic technologies, since each of them contributes to solving aspects related to the semantic interoperability of EHR data. To develop methods allowing for a direct use of EHR data for the identification of patient cohorts leveraging current EHR standards and semantic web technologies. We propose to take advantage of the best features of working with EHR standards and ontologies. Our proposal is based on our previous results and experience working with both technological infrastructures. Our main principle is to perform each activity at the abstraction level with the most appropriate technology available. This means that part of the processing will be performed using archetypes (ie, data level) and the rest using ontologies (ie, knowledge level). Our approach will start working with EHR data in proprietary format, which will be first normalized and elaborated using EHR standards and then transformed into a semantic representation, which will be exploited by automated reasoning. We have applied our approach to protocols for colorectal cancer screening. The results comprise the archetypes, ontologies, and datasets developed for the standardization and semantic analysis of EHR data. Anonymized real data have been used and the patients have been successfully classified by the risk of developing colorectal cancer. This work provides new insights into how archetypes and ontologies can be effectively combined for EHR-driven phenotyping. The methodological approach can be applied to other problems provided that suitable archetypes, ontologies, and classification rules can be designed.
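The final step described above, automated reasoning over normalized records to assign patients to cohorts, can be caricatured as declarative rules applied to a common representation. The field names and thresholds below are hypothetical illustrations, not taken from the paper:

```python
def classify_risk(patient):
    """Assign a colorectal-cancer screening cohort from normalized EHR fields.

    All field names and cutoffs are invented for illustration; a real system
    would derive its rules from clinical protocols and the ontology layer.
    """
    if patient.get("prior_crc") or patient.get("polyposis_syndrome"):
        return "high"
    if patient.get("first_degree_relative_crc") and patient["age"] >= 40:
        return "intermediate"
    if patient["age"] >= 50:
        return "average"
    return "below-screening-age"

# Hypothetical normalized patient records.
patients = [
    {"age": 62, "prior_crc": False, "first_degree_relative_crc": False},
    {"age": 45, "first_degree_relative_crc": True},
    {"age": 58, "prior_crc": True},
]
cohorts = [classify_risk(p) for p in patients]
print(cohorts)  # → ['average', 'intermediate', 'high']
```

The point of the archetype/ontology pipeline is precisely that such rules can be written once against a standardized representation instead of against each proprietary source format.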

  18. Prevalence of low back pain and associated factors among farmers during the rice transplanting process

    PubMed Central

    Keawduangdee, Petcharat; Puntumetakul, Rungthip; Swangnetr, Manida; Laohasiriwong, Wongsa; Settheetham, Dariwan; Yamauchi, Junichiro; Boucaut, Rose

    2015-01-01

    [Purpose] The aim of this study was to investigate the prevalence of low back pain and associated factors in Thai rice farmers during the rice transplanting process. [Subjects and Methods] Three hundred and forty-four farmers, aged 20–59 years old, were asked to answer a questionnaire modified from the Standard Nordic Questionnaire (Thai version). The questionnaire sought demographic, back-related, and psychosocial data. [Results] The results showed that the prevalence of low back pain was 83.1%. Farmers younger than 45 years old who worked in the field fewer than six days were more likely to experience low back pain than those who worked for at least six days. Farmers with high stress levels were more likely to have low back pain. [Conclusion] In the rice transplanting process, the low back pain experienced by the farmers was associated with the weekly work duration and stress. PMID:26311961

  19. Standardized severe maternal morbidity review: rationale and process.

    PubMed

    Kilpatrick, Sarah J; Berg, Cynthia; Bernstein, Peter; Bingham, Debra; Delgado, Ana; Callaghan, William M; Harris, Karen; Lanni, Susan; Mahoney, Jeanne; Main, Elliot; Nacht, Amy; Schellpfeffer, Michael; Westover, Thomas; Harper, Margaret

    2014-08-01

    Severe maternal morbidity and mortality have been rising in the United States. To begin a national effort to reduce morbidity, a specific call to identify all pregnant and postpartum women experiencing admission to an intensive care unit or receipt of 4 or more units of blood for routine review has been made. While advocating for review of these cases, no specific guidance for the review process was provided. Therefore, the aim of this expert opinion is to present guidelines for a standardized severe maternal morbidity interdisciplinary review process to identify systems, professional, and facility factors that can be ameliorated, with the overall goal of improving institutional obstetric safety and reducing severe morbidity and mortality among pregnant and recently pregnant women. This opinion was developed by a multidisciplinary working group that included general obstetrician-gynecologists, maternal-fetal medicine subspecialists, certified nurse-midwives, and registered nurses all with experience in maternal mortality reviews. A process for standardized review of severe maternal morbidity addressing committee organization, review process, medical record abstraction and assessment, review culture, data management, review timing, and review confidentiality is presented. Reference is made to a sample severe maternal morbidity abstraction and assessment form.

  20. Phase-I monitoring of standard deviations in multistage linear profiles

    NASA Astrophysics Data System (ADS)

    Kalaei, Mahdiyeh; Soleimani, Paria; Niaki, Seyed Taghi Akhavan; Atashgar, Karim

    2018-03-01

    In most modern manufacturing systems, products are often the output of some multistage processes. In these processes, the stages are dependent on each other, where the output quality of each stage depends also on the output quality of the previous stages. This property is called the cascade property. Although there are many studies in multistage process monitoring, there are fewer works on profile monitoring in multistage processes, especially on the variability monitoring of a multistage profile in Phase I, for which no research is found in the literature. In this paper, a new methodology is proposed to monitor the standard deviation involved in a simple linear profile designed in Phase I to monitor multistage processes with the cascade property. To this aim, an autoregressive correlation model between the stages is considered first. Then, the effect of the cascade property on the performances of three types of T² control charts in Phase I with shifts in standard deviation is investigated. As we show that this effect is significant, a U statistic is next used to remove the cascade effect, based on which the investigated control charts are modified. Simulation studies reveal good performances of the modified control charts.
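The T² statistic at the heart of such Phase-I charts is the Hotelling distance of each profile's estimated parameters from the pooled mean. A minimal sketch on made-up (intercept, slope) estimates, without the cascade-removing U statistic the paper introduces:

```python
import numpy as np

def hotelling_t2(params):
    """Phase-I Hotelling T^2 statistic for each row of parameter estimates.

    params: (m, p) array, one row of profile parameter estimates
    (e.g. intercept, slope) per sampled profile.
    """
    mean = params.mean(axis=0)
    cov = np.cov(params, rowvar=False)        # sample covariance (ddof = 1)
    inv = np.linalg.inv(cov)
    d = params - mean
    return np.einsum("ij,jk,ik->i", d, inv, d)  # d_i^T S^{-1} d_i for each i

# Hypothetical Phase-I data: 30 profiles, (intercept, slope) pairs.
rng = np.random.default_rng(2)
params = rng.multivariate_normal([3.0, 2.0], [[0.04, 0.01], [0.01, 0.02]], size=30)
t2 = hotelling_t2(params)
print(t2.shape)  # → (30,)
```

Points whose T² exceeds a control limit are flagged; with estimated mean and covariance the statistics sum to (m−1)p, a useful sanity check.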

  1. HIPS: A new hippocampus subfield segmentation method.

    PubMed

    Romero, José E; Coupé, Pierrick; Manjón, José V

    2017-12-01

    The importance of the hippocampus in the study of several neurodegenerative diseases such as Alzheimer's disease makes it a structure of great interest in neuroimaging. However, few segmentation methods have been proposed to measure its subfields due to its complex structure and the lack of high resolution magnetic resonance (MR) data. In this work, we present a new pipeline for automatic hippocampus subfield segmentation using two available hippocampus subfield delineation protocols that can work with both high and standard resolution data. The proposed method is based on multi-atlas label fusion technology that benefits from a novel multi-contrast patch match search process (using high resolution T1-weighted and T2-weighted images). The proposed method also includes as post-processing a new neural network-based error correction step to minimize systematic segmentation errors. The method has been evaluated on both high and standard resolution images and compared to other state-of-the-art methods showing better results in terms of accuracy and execution time. Copyright © 2017 Elsevier Inc. All rights reserved.
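The label-fusion step at the core of multi-atlas segmentation can be illustrated by its simplest variant, per-voxel majority voting. The actual pipeline uses patch-based multi-contrast label fusion, which this sketch does not attempt to reproduce:

```python
import numpy as np

def majority_vote(label_maps):
    """Fuse candidate segmentations by per-voxel majority vote.

    label_maps: (n_atlases, ...) integer array of candidate label maps.
    Ties are broken in favor of the lowest label index (argmax behavior).
    """
    maps = np.asarray(label_maps)
    n_labels = maps.max() + 1
    # Count the votes each label receives at every voxel, then take the argmax.
    votes = np.stack([(maps == lab).sum(axis=0) for lab in range(n_labels)])
    return votes.argmax(axis=0)

# Toy example: three 2x2 "atlas" labelings of hypothetical subfields 0/1/2.
atlases = [
    [[0, 1], [2, 2]],
    [[0, 1], [1, 2]],
    [[0, 2], [2, 2]],
]
fused = majority_vote(atlases)
print(fused)  # each voxel takes the label most atlases agree on
```

Patch-based methods replace the flat vote with weights derived from local image similarity, which is where the multi-contrast (T1-/T2-weighted) search in the proposed method comes in.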

  2. Study Methods to Standardize Thermography NDE

    NASA Technical Reports Server (NTRS)

    Walker, James L.; Workman, Gary L.

    1998-01-01

    The purpose of this work is to develop thermographic inspection methods and standards for use in evaluating structural composites and aerospace hardware. Qualification techniques and calibration methods are investigated to standardize the thermographic method for use in the field. Along with the inspections of test standards and structural hardware, support hardware is designed and fabricated to aid in the thermographic process. Also, a standard operating procedure is developed for performing inspections with the Bales Thermal Image Processor (TIP). Inspections are performed on a broad range of structural composites. These materials include various graphite/epoxies, graphite/cyanide-ester, graphite/silicon-carbide, graphite phenolic and Kevlar/epoxy. Also metal honeycomb (titanium and aluminum faceplates over an aluminum honeycomb core) structures are investigated. Various structural shapes are investigated and the thicknesses of the structures vary from as few as 3 plies to as many as 80 plies. Special emphasis is placed on characterizing defects in attachment holes and bondlines, in addition to those resulting from impact damage and the inclusion of foreign matter. Image processing through statistical analysis and digital filtering is investigated to enhance the quality of the NDE thermal images and to quantify them when necessary.

  4. Algorithms and programming tools for image processing on the MPP:3

    NASA Technical Reports Server (NTRS)

    Reeves, Anthony P.

    1987-01-01

    This is the third and final report on the work done for NASA Grant 5-403 on Algorithms and Programming Tools for Image Processing on the MPP:3. All the work done for this grant is summarized in the introduction. Work done since August 1986 is reported in detail. Research for this grant falls under the following headings: (1) fundamental algorithms for the MPP; (2) programming utilities for the MPP; (3) the Parallel Pascal Development System; and (4) performance analysis. In this report, the results of two efforts are reported: region growing, and performance analysis of important characteristic algorithms. In each case, timing results from MPP implementations are included. A paper is included in which parallel algorithms for region growing on the MPP are discussed. These algorithms permit different-sized regions to be merged in parallel. Details on the implementation and performance of several important MPP algorithms are given. These include a number of standard permutations, the FFT, convolution, arbitrary data mappings, image warping, and pyramid operations, all of which have been implemented on the MPP. The permutation and image warping functions have been included in the standard development system library.
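The region-growing idea discussed in the report can be sketched serially: starting from a seed pixel, neighbors whose intensity is close enough to the seed's are absorbed into the region. The MPP versions merge regions in parallel, which this single-threaded illustration does not attempt:

```python
from collections import deque

def region_grow(image, seed, tol):
    """Grow a region from `seed`, absorbing 4-connected neighbors whose
    intensity lies within `tol` of the seed intensity (a serial sketch)."""
    rows, cols = len(image), len(image[0])
    base = image[seed[0]][seed[1]]
    region = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(image[nr][nc] - base) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

# Toy image: a bright patch (values ~50) next to a dark background (~10).
img = [
    [10, 11, 50, 52],
    [ 9, 12, 51, 53],
    [10, 10, 10, 55],
]
region = region_grow(img, (0, 0), tol=3)
print(sorted(region))  # the 7 background pixels, none of the bright ones
```

A parallel formulation instead lets every pixel (or small region) examine its neighbors simultaneously and merges compatible regions in rounds, which is what the MPP algorithms in the paper do.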

  5. Label free sensing of creatinine using a 6 GHz CMOS near-field dielectric immunosensor.

    PubMed

    Guha, S; Warsinke, A; Tientcheu, Ch M; Schmalz, K; Meliani, C; Wenger, Ch

    2015-05-07

    In this work we present a CMOS high-frequency direct immunosensor operating at 6 GHz (C-band) for label-free determination of creatinine. The sensor is fabricated in a standard 0.13 μm SiGe:C BiCMOS process. The report also demonstrates the ability to immobilize creatinine molecules on a Si3N4 passivation layer of the standard BiCMOS/CMOS process, thereby avoiding any need for cumbersome post-processing of the fabricated sensor chip. The sensor is based on capacitive detection of the amount of non-creatinine-bound antibodies binding to an immobilized creatinine layer on the passivated sensor. The chip-bound antibody amount in turn corresponds indirectly to the creatinine concentration used in the incubation phase. The determination of creatinine in the concentration range of 0.88-880 μM is successfully demonstrated in this work. A sensitivity of 35 MHz per 10-fold increase in creatinine concentration (during incubation) at the centre frequency of 6 GHz is achieved by the immunosensor. The results are compared with a standard optical measurement technique, and the dynamic range and sensitivity are of the order of the established optical indication technique. The C-band immunosensor chip, comprising an area of 0.3 mm², reduces the sensing area considerably, therefore requiring a sample volume as low as 2 μl. The small analyte sample volume and label-free approach also reduce the experimental costs in addition to the low fabrication costs offered by the batch fabrication technique of the CMOS/BiCMOS process.
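Given the reported log-linear sensitivity of 35 MHz per 10-fold concentration change, a measured frequency shift relative to a reference concentration can be inverted back to a concentration estimate. This is a back-of-the-envelope reading of the abstract, not the authors' calibration procedure:

```python
SENSITIVITY_HZ_PER_DECADE = 35e6  # reported: 35 MHz per 10-fold concentration change

def concentration_from_shift(df_hz, c_ref_um):
    """Invert a log-linear calibration: a shift of one sensitivity unit
    corresponds to a 10-fold concentration change relative to c_ref_um.
    (Hypothetical inversion; the actual calibration curve is not given
    in the abstract.)"""
    return c_ref_um * 10 ** (df_hz / SENSITIVITY_HZ_PER_DECADE)

# E.g. a 70 MHz shift relative to a 0.88 uM reference spans two decades:
print(concentration_from_shift(70e6, 0.88))  # → 88.0
```

The 0.88-880 μM range quoted in the abstract spans exactly three such decades, i.e. roughly a 105 MHz total frequency swing under this reading.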

  6. Using Lean methodologies to streamline processing of requests for durable medical equipment and supplies for children with complex conditions.

    PubMed

    Fields, Elise; Neogi, Smriti; Schoettker, Pamela J; Lail, Jennifer

    2017-12-12

    An improvement team from the Complex Care Center at our large pediatric medical center participated in a 60-day initiative to use Lean methodologies to standardize their processes, eliminate waste and improve the timely and reliable provision of durable medical equipment and supplies. The team used value stream mapping to identify processes needing improvement. Improvement activities addressed the initial processing of a request, provider signature on the form, returning the form to the sender, and uploading the completed documents to the electronic medical record. Data on lead time (time between receiving a request and sending the completed request to the Health Information Management department) and process time (amount of time the staff worked on the request) were collected via manual pre- and post-time studies. Following implementation of interventions, the median lead time for processing durable medical equipment and supply requests decreased from 50 days to 3 days (p < 0.0001). Median processing time decreased from 14 min to 9 min (p < 0.0001). The decrease in processing time realized annual cost savings of approximately $11,000. Collaborative leadership and multidisciplinary training in Lean methods allowed the CCC staff to incorporate common sense, standardize practices, and adapt their work environment to improve the timely and reliable provision of equipment and supplies that are essential for their patients. The application of Lean methodologies to processing requests for DME and supplies could also result in a natural spread to other paperwork and requests, thus avoiding delays and potential risk for clinical instability or deterioration. Copyright © 2017 Elsevier Inc. All rights reserved.
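The lead-time and process-time medians reported above come from manual time studies; the computation itself reduces to medians over per-request intervals. A sketch with invented records, where each entry holds the dates a request was received and sent on, plus minutes of staff work:

```python
from datetime import date
from statistics import median

# Hypothetical request log: (received, sent_to_HIM, minutes_worked).
requests = [
    (date(2017, 3, 1), date(2017, 3, 3), 9),
    (date(2017, 3, 2), date(2017, 3, 6), 12),
    (date(2017, 3, 5), date(2017, 3, 7), 8),
]

lead_days = [(sent - received).days for received, sent, _ in requests]
work_min = [minutes for *_, minutes in requests]
print(median(lead_days), median(work_min))  # → 2 9
```

Comparing such medians before and after the interventions, with an appropriate nonparametric test, yields exactly the kind of pre/post contrast (50 days vs. 3 days; 14 min vs. 9 min) the study reports.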

  7. To recognize the use of international standards for making harmonized regulation of medical devices in Asia-pacific.

    PubMed

    Anand, K; Saini, Ks; Chopra, Y; Binod, Sk

    2010-07-01

    'Medical devices' include everything from highly sophisticated computerized medical equipment right down to simple wooden tongue depressors. Regulations embody the public's expectations for how buildings and facilities are expected to perform and as such represent public policy. Regulators, who develop and enforce regulations, are empowered to act in the public's interest to set this policy and are ultimately responsible to the public in this regard. Standardization contributes to the basic infrastructure that underpins society, including health and the environment, while promoting sustainability and good regulatory practice. The international organizations that produce International Standards are the International Electrotechnical Commission (IEC), the International Organization for Standardization (ISO), and the International Telecommunication Union (ITU). With the increasing globalization of markets, International Standards (as opposed to regional or national standards) have become critical to the trading process, ensuring a level playing field for exports and ensuring that imports meet internationally recognized levels of performance and safety. Standards are developed in response to sectors and stakeholders that express a clearly established need for them. An industry sector or other stakeholder group typically communicates its requirement for standards to one of the national members. To be accepted for development, a proposed work item must receive majority support from the participating members, who verify the global relevance of the proposed item. The regulatory authority (RA) should provide a method for the recognition of international voluntary standards and for public notification of such recognition. The process of recognition may vary from country to country. Recognition may occur by periodic publication of lists of standards that a regulatory authority has found to meet the Essential Principles. 
In conclusion, International Standards, such as basic standards, group standards, and product standards, are a tool for harmonizing regulatory processes to assure the safety, quality, and performance of medical devices. Standards represent the opinion of experts from all interested parties, including industry, regulators, users, and others.

  8. The promise of cyborg intelligence.

    PubMed

    Brown, Michael F; Brown, Alexander A

    2017-03-01

    Yu et al. (2016) demonstrated that algorithms designed to find efficient routes in standard mazes can be integrated with the natural processes controlling rat navigation and spatial choices, and they pointed out the promise of such "cyborg intelligence" for biorobotic applications. Here, we briefly describe Yu et al.'s work, explore its relevance to the study of comparative cognition, and indicate how work involving cyborg intelligence would benefit from interdisciplinary collaboration between behavioral scientists and engineers.

  9. The Use of Nominal Group Technique to Determine Additional Support Needs for a Group of Victorian TAFE Managers and Senior Educators

    ERIC Educational Resources Information Center

    Bailey, Anthony

    2013-01-01

    The nominal group technique (NGT) is a structured process to gather information from a group. The technique was first described in 1975 and has since become a widely-used standard to facilitate working groups. The NGT is effective for generating large numbers of creative new ideas and for group priority setting. This paper describes the process of…

  10. Improving the safety and quality of nursing care through standardized operating procedures in Bosnia and Herzegovina.

    PubMed

    Ausserhofer, Dietmar; Rakic, Severin; Novo, Ahmed; Dropic, Emira; Fisekovic, Eldin; Sredic, Ana; Van Malderen, Greet

    2016-06-01

    We explored how selected 'positive deviant' healthcare facilities in Bosnia and Herzegovina approach the continuous development, adaptation, implementation, monitoring and evaluation of nursing-related standard operating procedures (SOPs). Standardized nursing care is internationally recognized as a critical element of safe, high-quality health care; yet very little research has examined one of its key instruments: nursing-related SOPs. Despite variability in Bosnia and Herzegovina's healthcare and nursing care quality, we assumed that some healthcare facilities would have developed effective strategies to elevate nursing quality and safety through the use of SOPs. Guided by the 'positive deviance' approach, we used a multiple-case study design to examine a criterion sample of four facilities (two primary healthcare centres and two hospitals), collecting data via focus groups and individual interviews. In each studied facility, certification/accreditation processes were crucial to the initiation of continuous development, adaptation, implementation, monitoring and evaluation of nursing-related SOPs. In one hospital and one primary healthcare centre, nurses working in advanced roles (i.e. quality coordinators) were responsible for developing and implementing nursing-related SOPs. Across the four studied institutions, we identified a consistent approach to SOP-related processes. The certification/accreditation process is enabling necessary changes in institutions' organizational cultures, empowering nurses to take on advanced roles in improving the safety and quality of nursing care. Standardizing nursing procedures is key to improving the safety and quality of nursing care. 
Nursing and health policy action is needed in Bosnia and Herzegovina to establish a functioning institutional framework, including regulatory bodies, educational systems for developing nurses' capacities, and the inclusion of nursing-related SOPs in certification/accreditation standards. © 2016 International Council of Nurses.

  11. Investigation of Natural Radioactivity in a Monazite Processing Plant in Japan.

    PubMed

    Iwaoka, Kazuki; Yajima, Kazuaki; Suzuki, Toshikazu; Yonehara, Hidenori; Hosoda, Masahiro; Tokonami, Shinji; Kanda, Reiko

    2017-09-01

    Monazite is a naturally occurring radioactive material that is processed for use in a variety of domestic applications. At present, there is little information available on potential radiation doses experienced by people working with monazite. The ambient dose rate and activity concentration of natural radionuclides in raw materials, products, and dust at work sites, as well as the 222Rn (radon) and 220Rn (thoron) concentrations at work sites, were measured in a monazite processing plant in Japan. Dose estimations for plant workers were also conducted. The activity concentration of the U series in raw materials and products for the monazite processing plant was found to be higher than the relevant values described in the International Atomic Energy Agency Safety Standards. The ambient dose rates in the raw material yard were higher than those in other work sites. Moreover, the activity concentrations of dust in the milling site were higher than those in other work sites. The 222Rn concentrations in all work sites were almost the same as those in regular indoor environments in Japan. The 220Rn concentrations in all work sites were much higher than those in regular indoor environments in Japan. The maximum value of the effective dose for workers was 0.62 mSv/y, which is lower than the reference level range (1-20 mSv/y) for abnormally high levels of natural background radiation published in International Commission on Radiological Protection Publication 103.

  12. Sentence processing and verbal working memory in a white-matter-disconnection patient.

    PubMed

    Meyer, Lars; Cunitz, Katrin; Obleser, Jonas; Friederici, Angela D

    2014-08-01

    The Arcuate Fasciculus/Superior Longitudinal Fasciculus (AF/SLF) is the white-matter bundle that connects posterior superior temporal and inferior frontal cortex. Its causal functional role in sentence processing and verbal working memory is currently under debate. While impairments of sentence processing and verbal working memory often co-occur in patients suffering from AF/SLF damage, it is unclear whether these impairments result from shared white-matter damage to the verbal-working-memory network. The present study sought to specify the behavioral consequences of focal AF/SLF damage for sentence processing and verbal working memory, which were assessed in a single patient suffering from a cleft-like lesion spanning the deep left superior temporal gyrus, sparing most surrounding gray matter. While tractography suggests that the ventral fronto-temporal white-matter bundle is intact in this patient, the AF/SLF was not visible to tractography. In line with the hypothesis that the AF/SLF is causally involved in sentence processing, the patient's performance was selectively impaired on sentences that jointly involve both complex word orders and long word-storage intervals. However, the patient was unimpaired on sentences that only involved long word-storage intervals without involving complex word orders. By contrast, the patient performed generally worse than a control group across standard verbal-working-memory tests. We conclude that the AF/SLF not only plays a causal role in sentence processing, linking regions of the left dorsal inferior frontal gyrus to the temporo-parietal region, but moreover plays a crucial role in verbal working memory, linking regions of the left ventral inferior frontal gyrus to the left temporo-parietal region. 
Together, the specific sentence-processing impairment and the more general verbal-working-memory impairment may imply that the AF/SLF subserves both sentence processing and verbal working memory, possibly with the AF and the SLF each supporting one of these functions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. [Output standard in the mental health services of Reggio Emilia, Italy. Methodological issues].

    PubMed

    Grassi, G

    2000-01-01

    The project Output Standards of the Mental Health Department (MHD) of Reggio Emilia sets out to define outputs and quality standards, to guarantee transparency, and to facilitate organizational improvement. The MHD started an interprofessional working group that defined the MHD outputs, as well as the process, quality characteristics, indicators and standards for each output. The MHD Director validated the group's results. The MHD defined 9 outputs with their indicators and standards, and consequently modified its data registration system, the way it supplies free and partially charged services, and its budget indicators. As a result, a new instrument for management and quality control has been provided. The author maintains that defining outputs, indicators and standards will allow the several services of the Department to be compared and made homogeneous, and will guarantee and improve quality.

  14. New IEEE standard enables data collection for medical applications.

    PubMed

    Kennelly, R J; Wittenber, J

    1994-01-01

    The IEEE has gone to ballot on a "Standard for Medical Device Communications", IEEE P1073. The lower-layer, hardware portions of the standard are expected to be approved by the IEEE Standards Board at their December 11-13, 1994 meeting. Other portions of the standard are in the initial stages of the IEEE ballot process. The intent of the standard is to allow hospitals and other users to interface medical electronic devices to host computer systems in a standard, interchangeable manner. The standard is optimized for acute care environments such as ICUs, operating rooms, and emergency rooms. [1] IEEE General Committee and Subcommittee work has been ongoing since 1984. Significant amounts of work have been done to discover and meet the needs of the patient care setting. Surveys performed in 1989 identified the following four key user requirements for medical device communications: 1) Frequent reconfiguration of the network. 2) "Plug and play" operation by users. 3) Association of devices with a specific bed and patient. 4) Support for a wide range of hospital computer system topologies. Additionally, the most critical difference in the acute care setting is patient safety, which has an overall effect on the standard. The standard that went to ballot meets these requirements. The standard is based on existing ISO standards. P1073 is compliant with the OSI seven-layer model. P1073 specifies the entire communication stack, from object-oriented software to hospital-unique connectors. The standard will be able to be put forward as a true international standard, much in the way that the IEEE 802.x family of standards (like Ethernet) were presented as draft ISO standards. (ABSTRACT TRUNCATED AT 250 WORDS)

  15. A legacy of struggle: the OSHA ergonomics standard and beyond, Part II.

    PubMed

    Delp, Linda; Mojtahedi, Zahra; Sheikh, Hina; Lemus, Jackie

    2014-11-01

    The OSHA ergonomics standard issued in 2000 was repealed within four months through a Congressional resolution that limits future ergonomics rulemaking. This section continues the conversation initiated in Part I, documenting a legacy of struggle for an ergonomics standard through the voices of eight labor, academic, and government key informants. Part I summarized important components of the standard; described the convergence of labor activism, research, and government action that laid the foundation for a standard; and highlighted the debates that characterized the rulemaking process. Part II explores the anti-regulatory political landscape of the 1990s, as well as the key opponents, power dynamics, and legal maneuvers that led to repeal of the standard. This section also describes the impact of the ergonomics struggle beyond the standard itself and ends with a discussion of creative state-level policy initiatives and coalition approaches to prevent work-related musculoskeletal disorders (WMSDs) in today's sociopolitical context.

  16. SEL's Software Process-Improvement Program

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose

    1995-01-01

    The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. After studying over 125 FDD projects, the results have guided the standards, management practices, technologies, and training within the division. The results of the studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified. The goals are now stated as: (1) Understand baseline processes and product characteristics, (2) Assess improvements that have been incorporated into the development projects, (3) Package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement and feedback to projects within the FDD environment. The SEL supports understanding of the process through studies of several process characteristics, including effort distribution and error detection rates. The SEL assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools and training.

  17. Low-temperature volume radiation annealing of cold-worked bands of Al-Li-Cu-Mg alloy by 20-40 keV Ar+ ion

    NASA Astrophysics Data System (ADS)

    Ovchinnikov, V. V.; Gushchina, N. V.; Mozharovsky, S. M.; Kaigorodova, L. I.

    2017-01-01

    The processes of a radiation-dynamic nature (in contrast to thermally activated processes) during short-term irradiation of 1 mm thick bands of cold-worked aluminum alloy 1441 (of the Al-Li-Cu-Mg system) with 20-40 keV Ar+ ions were studied. An effect of bulk low-temperature radiation annealing of the alloy, extending through the entire thickness of the metal bands and proceeding many times faster than conventional thermal annealing, was registered, even though the projected ranges of ions at these energies definitely do not exceed 0.1 μm. The processes of recrystallization and intermetallic structure change (occurring within a few seconds of Ar+ irradiation) share common features with, as well as differences from, the results of two-hour standard thermal annealing.

  18. Standards and Stories: The Interactional Work of Informed Choice in Ontario Midwifery Care

    PubMed Central

    Spoel, Philippa; Mckenzie, Pamela; James, Susan; Hobberlin, Jessica

    2013-01-01

    This paper uses a discourse-rhetorical approach to analyze how Ontario midwives and their clients interactionally accomplish the healthcare communicative process of "informed choice." Working with four excerpts from recorded visits between Ontario midwives and women, the analysis focuses on the discursive rendering during informed choice conversations of two contrasting kinds of evidence, professional standards and story-telling, related to potential interventions during labour. We draw on the concepts of discursive hybridity (Sarangi and Roberts 1999) and recontextualization (Linell 1998; Sarangi 1998) to trace the complex and creative ways in which the conversational participants reconstruct the meanings of these evidentiary sources to address their particular care contexts. This analysis shows how, though very different in their forms, both modes of evidence function as hybrid and flexible discursive resources that perform both instrumental and social-relational healthcare work. PMID:24289941

  19. Risk-Based Tailoring of the Verification, Validation, and Accreditation/Acceptance Processes (Adaptation fondee sur le risque, des processus de verification, de validation, et d’accreditation/d’acceptation)

    DTIC Science & Technology

    2012-04-01

    (No abstract indexed; the captured text is fragmentary front matter from the report, comprising acronym definitions (Systems Concepts and Integration; SET: Sensors and Electronics Technology; SISO: Simulation Interoperability Standards Organization; SIW: Simulation Interoperability Workshop) and a timeline of SISO and IEEE standards balloting milestones from 2006 to 2008.)

  20. Advancement of the Artificial Pancreas through the Development of Interoperability Standards

    PubMed Central

    Picton, Peter E.; Yeung, Melanie; Hamming, Nathaniel; Desborough, Lane; Dassau, Eyal; Cafazzo, Joseph A.

    2013-01-01

    Despite advancements in the development of the artificial pancreas, barriers in the form of proprietary data and communication protocols of diabetes devices have made the integration of these components challenging. The Artificial Pancreas Standards and Technical Platform Project is an initiative funded by the JDRF Canadian Clinical Trial Network with the goal of developing device communication standards for the interoperability of diabetes devices. Stakeholders from academia, industry, regulatory agencies, and medical and patient communities have been engaged in advancing this effort. In this article, we describe this initiative along with the process involved in working with the standards organizations and stakeholders that are key to ensuring effective standards are developed and adopted. Discussion from a special session of the 12th Annual Diabetes Technology Meeting is also provided. PMID:23911190

  1. Establishing Time for Professional Learning

    ERIC Educational Resources Information Center

    Journal of Staff Development, 2013

    2013-01-01

    Time for collaborative learning is an essential resource for educators working to implement college- and career-ready standards. The pages in this article include tools from the workbook "Establishing Time for Professional Learning." The tools support a complete process to help educators effectively find and use time. The following…

  2. Updating contextualized clinical practice guidelines on stroke rehabilitation and low back pain management using a novel assessment framework that standardizes decisions.

    PubMed

    Gambito, Ephraim D V; Gonzalez-Suarez, Consuelo B; Grimmer, Karen A; Valdecañas, Carolina M; Dizon, Janine Margarita R; Beredo, Ma Eulalia J; Zamora, Marcelle Theresa G

    2015-11-04

    Clinical practice guidelines need to be regularly updated with current literature in order to remain relevant. This paper reports on the updating approach taken by the Philippine Academy of Rehabilitation Medicine (PARM), which dovetails with the writing guide that underpinned PARM's foundational work in contextualizing guidelines for stroke and low back pain (LBP) in 2011. Working groups of Filipino rehabilitation physicians and allied health practitioners met to reconsider and modify, where indicated, the 'typical' Filipino patient care pathways established in the foundation guidelines. New clinical guidelines on stroke and LBP published internationally in the previous 3 years were identified through a search of electronic databases. The methodological quality of each guideline was assessed using the iCAHE Guideline Quality Checklist, and only those guidelines that provided full-text references, an evidence hierarchy and a quality appraisal of the included literature were included in the PARM update. Each of the PARM-endorsed recommendations was then reviewed in light of the new literature presented in the included clinical guidelines. A novel standard updating approach was developed based on the criteria reported by Johnston et al. (Int J Technol Assess Health Care 19(4):646-655, 2003) and then modified to incorporate wording from the foundational PARM writing guide. The new updating tool was debated, pilot-tested and agreed upon by the PARM working groups before being applied to the guideline updating process. Ten new guidelines on stroke and eleven on low back pain were identified. Guideline quality scores were moderate to good; however, not all guidelines comprehensively linked the body of evidence underpinning recommendations with the literature. Consequently, only five stroke and four low back pain guidelines were included. 
The modified PARM updating guide was applied by all working groups to ensure standardization of the wording of updated recommendations and the underpinning evidence bases. The updating tool provides a simple, standard and novel approach that incorporates evidence hierarchy and quality, and wordings of recommendations. It could be used efficiently by other guideline updaters particularly in developing countries, where resources for guideline development and updates are limited. When many people are involved in guideline writing, there is always the possibility of 'slippage' in use of wording and interpretation of evidence. The PARM updating tool provides a mechanism for maintaining a standard process for guideline updating processes that can be followed by clinicians with basic training in evidence-based practice principles.

  3. Standardization in gully erosion studies: methodology and interpretation of magnitudes from a global review

    NASA Astrophysics Data System (ADS)

    Castillo, Carlos; Gomez, Jose Alfonso

    2016-04-01

    Standardization is the process of developing common conventions or procedures to facilitate the communication, use, comparison and exchange of products or information among different parties. It has been a useful tool in fields ranging from industry to statistics, for technical, economic and social reasons. In science, the need for standardization has been recognised in the definition of methods as well as in publication formats. With respect to gully erosion, a number of initiatives have proposed common methodologies, for instance for gully delineation (Castillo et al., 2014) and geometrical measurements (Casalí et al., 2015). The main aims of this work are: 1) to examine previous proposals in the gully erosion literature implying standardization processes; 2) to contribute new approaches to improve the homogeneity of methodologies and the presentation of results, for better communication within the gully erosion community. For this purpose, we evaluated the basic information provided on environmental factors, discussed the delineation and measurement procedures proposed in previous works and, finally, analysed statistically the severity of degradation levels derived from different indicators at the world scale. As a result, we present suggestions intended to serve as guidance for survey design as well as for the interpretation of vulnerability levels and degradation rates in future gully erosion studies. References Casalí, J., Giménez, R., and Campo-Bescós, M. A.: Gully geometry: what are we measuring?, SOIL, 1, 509-513, doi:10.5194/soil-1-509-2015, 2015. Castillo C., Taguas E. V., Zarco-Tejada P., James M. R., and Gómez J. A. (2014), The normalized topographic method: an automated procedure for gully mapping using GIS, Earth Surf. Process. Landforms, 39, 2002-2015, doi: 10.1002/esp.3595

  4. Overlay metrology for double patterning processes

    NASA Astrophysics Data System (ADS)

    Leray, Philippe; Cheng, Shaunee; Laidler, David; Kandel, Daniel; Adel, Mike; Dinu, Berta; Polli, Marco; Vasconi, Mauro; Salski, Bartlomiej

    2009-03-01

    The double patterning (DPT) process is foreseen by the industry to be the main solution for the 32 nm technology node and even beyond. Meanwhile, process compatibility has to be maintained and the performance of overlay metrology has to improve. To achieve this for Image Based Overlay (IBO), the optics of overlay tools are usually improved. It was also demonstrated that these requirements are achievable with a Diffraction Based Overlay (DBO) technique named SCOL™ [1]. In addition, we believe that overlay measurements with respect to a reference grid are required to achieve the required overlay control [2]. This induces at least a three-fold increase in the number of measurements (2 for double patterned layers to the reference grid and 1 between the double patterned layers). The requirements of process compatibility, enhanced performance and a large number of measurements make the choice of overlay metrology for DPT very challenging. In this work we use different flavors of the standard overlay metrology technique (IBO) as well as the new technique (SCOL) to address these three requirements. The compatibility of the corresponding overlay targets with double patterning processes (Litho-Etch-Litho-Etch (LELE); Litho-Freeze-Litho-Etch (LFLE); spacer defined) is tested. The process impact on different target types is discussed (CD bias for LELE, contrast for LFLE). We compare standard imaging overlay metrology with non-standard imaging techniques dedicated to double patterning processes (multilayer imaging targets allowing one overlay target instead of three, very small imaging targets). In addition to standard designs already discussed [1], we investigate SCOL target designs specific to double patterning processes. The feedback to the scanner is determined using the different techniques, and the final overlay results obtained are compared accordingly. 
We conclude with the pros and cons of each technique and suggest the optimal metrology strategy for overlay control in double patterning processes.

  5. Improvement for enhancing effectiveness of universal power system (UPS) continuous testing process

    NASA Astrophysics Data System (ADS)

    Sriratana, Lerdlekha

    2018-01-01

    This experiment aims to enhance the effectiveness of the Universal Power System (UPS) continuous testing process of the Electrical and Electronic Institute by applying work scheduling and time study methods. Initially, the standard time of the testing process had not been established, which resulted in inaccurate testing targets and observable time waste. To monitor and reduce wasted time and so improve the efficiency of the testing process, a Yamazumi chart and job scheduling theory (the North West Corner Rule) were applied to develop a new work process. After the improvements, the overall efficiency of the process could increase from 52.8% to 65.6%. Moreover, wasted time could be reduced from 828.3 minutes to 653.6 minutes (a 21% reduction), while testing units per batch could increase from 3 to 4. The number of units tested per month would therefore increase from 12 to 20, which would also contribute to a 72% increase in the net income of the UPS testing process.
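The North West Corner Rule cited above is the classic heuristic for producing an initial feasible allocation in a transportation (scheduling) problem: start at the top-left cell and exhaust each supply row or demand column in turn. A minimal sketch, using made-up supply and demand figures rather than the paper's data:

```python
def north_west_corner(supply, demand):
    """Initial feasible allocation for a transportation problem:
    start at the north-west (top-left) cell, allocate as much as
    possible, then move right when a row's supply is exhausted is not,
    or down when it is."""
    supply, demand = list(supply), list(demand)  # work on copies
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        qty = min(supply[i], demand[j])
        alloc[i][j] = qty
        supply[i] -= qty
        demand[j] -= qty
        if supply[i] == 0:
            i += 1  # row exhausted: move down
        else:
            j += 1  # column satisfied: move right
    return alloc

# e.g. three test stations (supply) assigned to four batches (demand)
print(north_west_corner([20, 30, 25], [10, 25, 20, 20]))
# -> [[10, 10, 0, 0], [0, 15, 15, 0], [0, 0, 5, 20]]
```

The result satisfies every supply and demand exactly; it is only a starting point, which optimization steps (or, as here, manual scheduling judgment) can then refine.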

  6. DHM simulation in virtual environments: a case-study on control room design.

    PubMed

    Zamberlan, M; Santos, V; Streit, P; Oliveira, J; Cury, R; Negri, T; Pastura, F; Guimarães, C; Cid, G

    2012-01-01

    This paper will present the workflow developed for the application of serious games in the design of complex cooperative work settings. The project was based on ergonomic studies and the development of a control room within a participative design process. Our main concerns were the 3D virtual human representation acquired from 3D scanning, human interaction, workspace layout, and equipment designed to ergonomics standards. Using the Unity3D platform to build the virtual environment, the virtual human model can be controlled by users in a dynamic scenario in order to evaluate the new work settings and simulate work activities. The results obtained showed that this virtual technology can drastically change the design process by improving the level of interaction between final users, managers and the human factors team.

  7. Toward international collaboration on credentialing in health promotion and health education: the Galway Consensus Conference.

    PubMed

    Allegrante, John P; Barry, Margaret M; Auld, M Elaine; Lamarre, Marie-Claude; Taub, Alyson

    2009-06-01

    The interest in competencies, standards, and quality assurance in the professional preparation of public health professionals whose work involves health promotion and health education dates back several decades. In Australia, Europe, and North America, where the interest in credentialing has gained momentum, there have been rapidly evolving efforts to codify competencies and standards of practice as well as the processes by which quality and accountability can be ensured in academic professional preparation programs. The Galway Consensus Conference was conceived as a first step in an effort to explore the development of an international consensus regarding the core competencies of health education specialists and professionals in health promotion and the commonalities and differences in establishing uniform standards for the accreditation of academic professional preparation programs around the world. This article describes the purposes, objectives, and process of the Galway Consensus Conference and the background to the meeting that was convened.

  8. The caBIG Terminology Review Process

    PubMed Central

    Cimino, James J.; Hayamizu, Terry F.; Bodenreider, Olivier; Davis, Brian; Stafford, Grace A.; Ringwald, Martin

    2009-01-01

    The National Cancer Institute (NCI) is developing an integrated biomedical informatics infrastructure, the cancer Biomedical Informatics Grid (caBIG®), to support collaboration within the cancer research community. A key part of the caBIG architecture is the establishment of terminology standards for representing data. In order to evaluate the suitability of existing controlled terminologies, the caBIG Vocabulary and Data Elements Workspace (VCDE WS) working group has developed a set of criteria that serve to assess a terminology's structure, content, documentation, and editorial process. This paper describes the evolution of these criteria and the results of their use in evaluating four standard terminologies: the Gene Ontology (GO), the NCI Thesaurus (NCIt), the Common Terminology for Adverse Events (known as CTCAE), and the laboratory portion of the Logical Objects, Identifiers, Names and Codes (LOINC). The resulting caBIG criteria are presented as a matrix that may be applicable to any terminology standardization effort. PMID:19154797

  9. The need for GPS standardization

    NASA Technical Reports Server (NTRS)

    Lewandowski, Wlodzimierz W.; Petit, Gerard; Thomas, Claudine

    1992-01-01

    A desirable and necessary step for improvement of the accuracy of Global Positioning System (GPS) time comparisons is the establishment of common GPS standards. For this reason, the CCDS proposed the creation of a special group of experts with the objective of recommending procedures and models for operational time transfer by GPS common-view method. Since the announcement of the implementation of Selective Availability at the end of last spring, action has become much more urgent and this CCDS Group on GPS Time Transfer Standards has now been set up. It operates under the auspices of the permanent CCDS Working Group on TAI and works in close cooperation with the Sub-Committee on Time of the Civil GPS Service Interface Committee (CGSIC). Taking as an example the implementation of SA during the first week of July 1991, this paper illustrates the need to develop urgently at least two standardized procedures in GPS receiver software: monitoring GPS tracks with a common time scale and retaining broadcast ephemeris parameters throughout the duration of a track. Other matters requiring action are the adoption of common models for atmospheric delay, a common approach to hardware design and agreement about short-term data processing. Several examples of such deficiencies in standardization are presented.
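    The common-view method referred to above can be illustrated with a toy calculation: when two laboratories track the same satellite over the same standardized track, differencing their measurements cancels the satellite clock error (including SA dither), leaving only the difference between the local clocks. All numbers below are invented for illustration.

```python
def common_view_difference(meas_a, meas_b):
    """Each measurement is (local clock - satellite clock) in nanoseconds,
    taken over the same common-view track. Differencing removes the
    satellite clock term, leaving clock_a - clock_b."""
    return [a - b for a, b in zip(meas_a, meas_b)]

# Hypothetical numbers: the satellite clock error (including SA dither)
# appears identically in both stations' measurements for a common track.
sat_error = [120.0, -340.0, 275.0]     # ns, varies per track
clock_a, clock_b = 50.0, 12.5          # ns offsets from GPS time
meas_a = [clock_a + e for e in sat_error]
meas_b = [clock_b + e for e in sat_error]

print(common_view_difference(meas_a, meas_b))  # every entry is 37.5 ns
```

The cancellation only works if both receivers really do process the same track with the same ephemeris and time scale, which is exactly why the paper argues for standardized receiver software.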

  10. Process of prototyping coronary stents from biodegradable Fe-Mn alloys.

    PubMed

    Hermawan, Hendra; Mantovani, Diego

    2013-11-01

    Biodegradable stents are considered to be a recent innovation, and their feasibility and applicability have been proven in recent years. Research in this area has focused on materials development and biological studies, rather than on how to transform the developed biodegradable materials into the stent itself. Currently available stent technology, the laser cutting-based process, might be adapted to fabricate biodegradable stents. In this work, the fabrication, characterization and testing of biodegradable Fe-Mn stents are described. The standard process for fabricating and testing 316L stainless steel stents was used as a reference. The influence of process parameters on the physical, metallurgical and mechanical properties of the stents, and the quality of the produced stents, were investigated. It was found that some steps of the standard process, such as laser cutting, can be applied directly, but the annealing parameters need to be changed and an alternative to electropolishing is needed. Copyright © 2013 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  11. European Council of Legal Medicine (ECLM) accreditation of forensic pathology services in Europe.

    PubMed

    Mangin, P; Bonbled, F; Väli, M; Luna, A; Bajanowski, T; Hougen, H P; Ludes, B; Ferrara, D; Cusack, D; Keller, E; Vieira, N

    2015-03-01

    Forensic experts play a major role in the legal process as they offer professional expert opinion and evidence within the criminal justice system adjudicating on the innocence or alleged guilt of an accused person. In this respect, medico-legal examination is an essential part of the investigation process, determining in a scientific way the cause(s) and manner of unexpected and/or unnatural death or bringing clinical evidence in case of physical, psychological, or sexual abuse in living people. From a legal perspective, these types of investigation must meet international standards, i.e., they should be independent, effective, and prompt. Ideally, the investigations should be conducted by board-certified experts in forensic medicine, endowed with solid experience in this field, without any hierarchical relationship with the prosecuting authorities and with access to appropriate facilities in order to provide forensic reports of high quality. In this respect, any private or public national or international authority, including non-governmental organizations, seeking experts qualified in forensic medicine needs to have at its disposal a list of specialists working in accordance with high standards of professional performance within forensic pathology services that have successfully undergone an official accreditation/certification process using valid and acceptable criteria. To reach this goal, the National Association of Medical Examiners (NAME) has elaborated an accreditation/certification checklist, which should serve as decision-making support for inspectors appointed to evaluate applicants.
In the same spirit as the NAME Accreditation Standards, the European Council of Legal Medicine (ECLM) board decided to set up an ad hoc working group with the mission of elaborating an accreditation/certification procedure similar to NAME's, but taking into account the realities of forensic medicine practice in Europe and restricted to post-mortem investigations. This accreditation process applies to services, not to individual practitioners, by emphasizing policies and procedures rather than professional performance. In addition, the standards to be complied with should be considered the minimum standards needed to gain recognition as a performing and reliable forensic pathology service.

  12. A Standardized Framework for Transplant-Specific Competencies for Dietitians.

    PubMed

    Pieloch, Daniel; Friedman, Golnaz G; DiCecco, Sara; Ulerich, Linda; Beer, Stacey; Hasse, Jeanette

    2017-09-01

    Dietitians have extensive training and are considered the experts in medical nutrition therapy (MNT). Although dietitian competencies for MNT are well established, competencies that account for the expanded roles of dietitians working in transplantation have not been developed. These expanded roles require a better understanding of transplant processes, regulations, and even the business side of transplant, novel concepts to most dietitians. Therefore, we proposed a standardized framework of transplant-specific competencies for dietitians practicing in transplantation. These competencies can help improve and standardize initial and ongoing training for transplant dietitians moving forward, ultimately leading to improved patient care for transplant candidates, recipients, and donors.

  13. Procedures and Standards for Residential Ventilation System Commissioning: An Annotated Bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stratton, J. Chris; Wray, Craig P.

    2013-04-01

    Beginning with the 2008 version of Title 24, new homes in California must comply with ANSI/ASHRAE Standard 62.2-2007 requirements for residential ventilation. Where installed, the limited data available indicate that mechanical ventilation systems do not always perform optimally or even as many codes and forecasts predict. Commissioning such systems when they are installed or during subsequent building retrofits is a step towards eliminating deficiencies and optimizing the tradeoff between energy use and acceptable IAQ. Work funded by the California Energy Commission about a decade ago at Berkeley Lab documented procedures for residential commissioning, but did not focus on ventilation systems. Since then, standards and approaches for commissioning ventilation systems have been an active area of work in Europe. This report describes our efforts to collect new literature on commissioning procedures and to identify information that can be used to support the future development of residential-ventilation-specific procedures and standards. We recommend that a standardized commissioning process and a commissioning guide for practitioners be developed, along with a combined energy and IAQ benefit assessment standard and tool, and a diagnostic guide for estimating continuous pollutant emission rates of concern in residences (including a database that lists emission test data for commercially-available labeled products).

  14. Using ISO 25040 standard for evaluating electronic health record systems.

    PubMed

    Oliveira, Marília; Novaes, Magdala; Vasconcelos, Alexandre

    2013-01-01

    Quality of electronic health record systems (EHR-S) is one of the key points in the discussion about the safe use of this kind of system. It has stimulated the creation of technical standards and certifications that establish the minimum requirements expected of these systems. [1] On the other hand, EHR-S suppliers need to invest in the evaluation of their products in order to provide systems that meet these requirements. This work proposes using the ISO 25040 standard, which focuses on the evaluation of software products, to define a model for evaluating an EHR-S against the Brazilian Certification for Electronic Health Record Systems - SBIS-CFM Certification. The proposal instantiates the process described in the ISO 25040 standard using the set of requirements that forms the scope of the Brazilian certification. As first results, this research has produced an evaluation model and a scale for classifying an EHR-S by its level of compliance with the certification. This work in progress is part of the requirements for the degree of master in Computer Science at the Federal University of Pernambuco.
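    The abstract mentions a scale for classifying an EHR-S by its level of compliance with the certification requirements, but does not give the scale itself. The sketch below shows the general idea with entirely hypothetical thresholds and labels.

```python
def compliance_level(met, total):
    """Classify an EHR-S by the fraction of certification requirements met.
    Thresholds and labels are hypothetical, not from the paper."""
    ratio = met / total
    if ratio == 1.0:
        return "fully compliant"
    if ratio >= 0.8:
        return "largely compliant"
    if ratio >= 0.5:
        return "partially compliant"
    return "non-compliant"

print(compliance_level(90, 100))  # largely compliant
```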

  15. Work of the Web Weavers: Web Development in Academic Libraries

    ERIC Educational Resources Information Center

    Bundza, Maira; Vander Meer, Patricia Fravel; Perez-Stable, Maria A.

    2009-01-01

    Although the library's Web site has become a standard tool for seeking information and conducting research in academic institutions, there are a variety of ways libraries approach the often challenging--and sometimes daunting--process of Web site development and maintenance. Three librarians at Western Michigan University explored issues related…

  16. Facilitating the Transition to Postgraduate Attainment: The Experience of One Postgraduate, Pre-Registration Physiotherapy Programme

    ERIC Educational Resources Information Center

    Spearing, Rachel

    2014-01-01

    Students on the MSc Physiotherapy (pre-registration) programme at Manchester Metropolitan University work at postgraduate level, whilst studying to become physiotherapists. To facilitate the transition to postgraduate attainment, students participated in two sessions designed to inform them about assessment processes and standards. The hypothesis…

  17. The Dreaded "Work" Problems Revisited: Connections through Problem Solving from Basic Fractions to Calculus

    ERIC Educational Resources Information Center

    Shore, Felice S.; Pascal, Matthew

    2008-01-01

    This article describes several distinct approaches taken by preservice elementary teachers to solving a classic rate problem. Their approaches incorporate a variety of mathematical concepts, ranging from proportions to infinite series, and illustrate the power of all five NCTM Process Standards. (Contains 8 figures.)

  18. 40 CFR 415.66 - Pretreatment standards for new sources (PSNS).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the mercury cell process, which introduces pollutants into a publicly owned treatment works, must...): Subpart F—Chlor-Alkali-Mercury Cells Pollutant or pollutant property PSNS effluent limitations Maximum for any 1 day Average of daily values for 30 consecutive days Milligrams per liter Mercury (T) 0.11 0.048...

  19. The Language, Working Memory, and Other Cognitive Demands of Verbal Tasks

    ERIC Educational Resources Information Center

    Archibald, Lisa M. D.

    2013-01-01

    Purpose: To gain a better understanding of the cognitive processes supporting verbal abilities, the underlying structure and interrelationships between common verbal measures were investigated. Methods: An epidemiological sample (n = 374) of school-aged children completed standardized tests of language, intelligence, and short-term and working…

  20. 75 FR 12772 - Federal Labor Standards Payee Verification and Payment Processing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-17

    ... the Office of Management and Budget (OMB) for review, as required by the Paperwork Reduction Act. The... restitution payments on behalf of construction and maintenance workers who have been underpaid for work... be sent to: HUD Desk Officer, Office of Management and Budget, New Executive Office Building...

  1. Library Dream Machines: Helping Students Master Super Online Catalogs.

    ERIC Educational Resources Information Center

    Webb, T. D.

    1992-01-01

    Describes how automation has transformed the library and how super-catalogs have affected the process of doing research. Explains how faculty and librarians can work together to help students to use the available databases effectively, by teaching them Boolean logic, standard record formats, filing rules, etc. (DMM)

  2. It's the Teacher, Stupid

    ERIC Educational Resources Information Center

    Fletcher, Geoffrey H.

    2012-01-01

    The author has taken the liberty of borrowing from past candidate (and president) Bill Clinton's "War Room" mantra to suggest that those who are working toward preparing schools for online assessments of Common Core State Standards (CCSS) might be forgetting the most important element in the process: the classroom teacher. The author argues that…

  3. AMPAC as an intelligent communication core for printing process

    NASA Astrophysics Data System (ADS)

    Mishina, Hiromichi; Yuasa, Tomonori

    2000-12-01

    This paper analyzes the features of the conventional exchange formats used in the graphic arts field. The analysis makes clear that most standards defining transmission formats impose limitations too strict to adapt to the communication required in creative and flexible work.

  4. 15 CFR 296.20 - The selection process.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS TECHNOLOGY INNOVATION... award criteria listed in § 296.22. In some cases NIST may conduct oral reviews and/or site visits. The.... (e) NIST reserves the right to negotiate the cost and scope of the proposed work with the proposers...

  5. 15 CFR 296.20 - The selection process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS TECHNOLOGY INNOVATION... award criteria listed in § 296.22. In some cases NIST may conduct oral reviews and/or site visits. The.... (e) NIST reserves the right to negotiate the cost and scope of the proposed work with the proposers...

  6. 15 CFR 296.20 - The selection process.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS TECHNOLOGY INNOVATION... award criteria listed in § 296.22. In some cases NIST may conduct oral reviews and/or site visits. The.... (e) NIST reserves the right to negotiate the cost and scope of the proposed work with the proposers...

  7. 15 CFR 296.20 - The selection process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS TECHNOLOGY INNOVATION... award criteria listed in § 296.22. In some cases NIST may conduct oral reviews and/or site visits. The.... (e) NIST reserves the right to negotiate the cost and scope of the proposed work with the proposers...

  8. 15 CFR 296.20 - The selection process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS TECHNOLOGY INNOVATION... award criteria listed in § 296.22. In some cases NIST may conduct oral reviews and/or site visits. The.... (e) NIST reserves the right to negotiate the cost and scope of the proposed work with the proposers...

  9. Reinforcing the Afrocentric Paradigm: A Theoretical Project

    ERIC Educational Resources Information Center

    Sams, Timothy E.

    2010-01-01

    Thomas Kuhn's groundbreaking 1962 work, "The Structure of Scientific Revolutions," established the process for creating, and the components of, a disciplinary paradigm. This "scientific revolution" has evolved to become the standard for determining a field's claim to disciplinary status. In 2001 and 2003, Ama Mazama used Kuhn's model to establish the…

  10. Cataloging Guide for Instructional Materials Used in Livonia Public Schools Instructional Materials Centers.

    ERIC Educational Resources Information Center

    Livonia Public Schools, MI.

    This working guide for Livonia's Public Schools provides detailed instructions in preparing and handling catalog cards, a supplemental cataloging and classification guide, and typing rules for technical processing. Standard abbreviations are given for making classification entries, and separate cataloging instructions are given for charts,…

  11. New Perils for the Contract Ethnographer.

    ERIC Educational Resources Information Center

    Fetterman, David M.

    1981-01-01

    Conditions of contract research may lead some workers to ignore publication rights of colleagues whose reports are of limited circulation. The author presents a case example of how this process occurred with the use of his own work and argues for rigorous ethical standards in the publication of contract research results. (Author/GC)

  12. Working memory capacity and redundant information processing efficiency.

    PubMed

    Endres, Michael J; Houpt, Joseph W; Donkin, Chris; Finn, Peter R

    2015-01-01

    Working memory capacity (WMC) is typically measured by the amount of task-relevant information an individual can keep in mind while resisting distraction or interference from task-irrelevant information. The current research investigated the extent to which differences in WMC were associated with performance on a novel redundant memory probes (RMP) task that systematically varied the amount of to-be-remembered (targets) and to-be-ignored (distractor) information. The RMP task was designed to both facilitate and inhibit working memory search processes, as evidenced by differences in accuracy, response time, and Linear Ballistic Accumulator (LBA) model estimates of information processing efficiency. Participants (N = 170) completed standard intelligence tests and dual-span WMC tasks, along with the RMP task. As expected, accuracy, response-time, and LBA model results indicated memory search and retrieval processes were facilitated under redundant-target conditions, but also inhibited under mixed target/distractor and redundant-distractor conditions. Repeated measures analyses also indicated that, while individuals classified as high (n = 85) and low (n = 85) WMC did not differ in the magnitude of redundancy effects, groups did differ in the efficiency of memory search and retrieval processes overall. Results suggest that redundant information reliably facilitates and inhibits the efficiency or speed of working memory search, and these effects are independent of more general limits and individual differences in the capacity or space of working memory.

  13. [Formula: see text]Working memory and attention in pediatric brain tumor patients treated with and without radiation therapy.

    PubMed

    Raghubar, Kimberly P; Mahone, E Mark; Yeates, Keith Owen; Cecil, Kim M; Makola, Monwabisi; Ris, M Douglas

    2017-08-01

    Children are at risk for cognitive difficulties following the diagnosis and treatment of a brain tumor. Longitudinal studies have consistently demonstrated declines on measures of intellectual functioning, and recently it has been proposed that specific neurocognitive processes underlie these changes, including working memory, processing speed, and attention. However, a fine-grained examination of the affected neurocognitive processes is required to inform intervention efforts. Radiation therapy (RT) impacts white matter integrity, likely affecting those cognitive processes supported by distributed neural networks. This study examined working memory and attention in children during the early delayed stages of recovery following surgical resection and RT. The participants included 27 children diagnosed with pediatric brain tumor, treated with (n = 12) or without (n = 15) RT, who completed experimental and standardized measures of working memory and attention (n-back and digit span tasks). Children treated with radiation performed less well than those who did not receive radiation on the n-back measure, though performance at the 0-back level was considerably poorer than would be expected for both groups, perhaps suggesting difficulties with more basic processes such as vigilance. Along these lines, marginal differences were noted on digit span forward. The findings are discussed with respect to models of attention and working memory, and the interplay between the two.

  14. Information processing efficiency in patients with multiple sclerosis.

    PubMed

    Archibald, C J; Fisk, J D

    2000-10-01

    Reduced information processing efficiency, consequent to impaired neural transmission, has been proposed as underlying various cognitive problems in patients with Multiple Sclerosis (MS). This study employed two measures developed from experimental psychology that control for the potential confound of perceptual-motor abnormalities (Salthouse, Babcock, & Shaw, 1991; Sternberg, 1966, 1969) to assess the speed of information processing and working memory capacity in patients with mild to moderate MS. Although patients had significantly more cognitive complaints than neurologically intact matched controls, their performance on standard tests of immediate memory span did not differ from control participants and their word list learning was within normal limits. On the experimental measures, both relapsing-remitting and secondary-progressive patients exhibited significantly slowed information processing speed relative to controls. However, only the secondary-progressive patients had an additional decrement in working memory capacity. Depression, fatigue, or neurologic disability did not account for performance differences on these measures. While speed of information processing may be slowed early in the disease process, deficits in working memory capacity may appear only as there is progression of MS. It is these latter deficits, however, that may underlie the impairment of new learning that patients with MS demonstrate.

  15. Report from the First CERT-RMM Users Group Workshop Series

    DTIC Science & Technology

    2012-04-01

    Slide excerpts: deploy processes to support our programs; benchmark our programs to determine current gaps; complements current work in CMMI® and ISO 27001; benchmarking program performance through process analytics and Lean/Six Sigma activities to ensure Performance Excellence; provides ISO Standards Office support (www.cmu.edu/iso). Carnegie Mellon University: est. 1967 in Pittsburgh, PA; a global, private research university; ranked 22nd.

  16. Productivity analysis to overcome the limited availability of production time in SME FBS

    NASA Astrophysics Data System (ADS)

    Nurhasanah, N.; Jingga; Aribowo, B.; Gayatri, AM; Mardhika, DA; Tanjung, WN; Suri, QA; Safitri, R.; Supriyanto, A.

    2017-12-01

    Good industrial development should pay attention to the human factor as the main driver. The condition of work procedures, the work area, and the work environment can affect production results; if they are not optimal, production will run slowly. If the work system is suboptimal, so is productivity: operators work uncomfortably, tire easily, and may even suffer work accidents. Thus, an optimal and ergonomic arrangement of the overall work system and work environment design is required so that workers can work well, regularly, safely and comfortably, with the aim of improving work productivity. This research measures performance in a textile SME (Small and Medium Enterprise) located in Sukabumi, SME FBS, which produces children's clothing. The performance measurement aims to improve the competitiveness of this textile SME so that it can compete on equal terms with other SMEs or with textile industries already established in the market. Based on standard time and TOC calculations at two FBS CMT (Cut-Make-Trim) sites in Sukabumi, CMT Margaluyu Village and CMT Purabaya Village, the standard time for shirt work at CMT Margaluyu Village is less than that at CMT Purabaya Village, indicating that the process method is the more effective mode of SME FBS production.
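    The standard-time measurement above can be illustrated with the classic work-measurement formula: normal time = observed time × performance rating, and standard time = normal time × (1 + allowance). The figures below are hypothetical and are not taken from the SME FBS study.

```python
def standard_time(observed_min, rating, allowance):
    """Classic work-measurement formula: normal time = observed x rating;
    standard time = normal x (1 + allowance). Inputs here are illustrative,
    not values from the study."""
    normal = observed_min * rating
    return normal * (1 + allowance)

# e.g. 12 observed minutes per shirt, operator rated at 95%, 15% allowance
print(round(standard_time(12.0, 0.95, 0.15), 2))  # 13.11 minutes
```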

  17. Databases and coordinated research projects at the IAEA on atomic processes in plasmas

    NASA Astrophysics Data System (ADS)

    Braams, Bastiaan J.; Chung, Hyun-Kyung

    2012-05-01

    The Atomic and Molecular Data Unit at the IAEA works with a network of national data centres to encourage and coordinate production and dissemination of fundamental data for atomic, molecular and plasma-material interaction (A+M/PMI) processes that are relevant to the realization of fusion energy. The Unit maintains numerical and bibliographical databases and has started a Wiki-style knowledge base. The Unit also contributes to A+M database interface standards and provides a search engine that offers a common interface to multiple numerical A+M/PMI databases. Coordinated Research Projects (CRPs) bring together fusion energy researchers and atomic, molecular and surface physicists for joint work towards the development of new data and new methods. The databases and current CRPs on A+M/PMI processes are briefly described here.

  18. CERES: A new cerebellum lobule segmentation method.

    PubMed

    Romero, Jose E; Coupé, Pierrick; Giraud, Rémi; Ta, Vinh-Thong; Fonov, Vladimir; Park, Min Tae M; Chakravarty, M Mallar; Voineskos, Aristotle N; Manjón, Jose V

    2017-02-15

    The human cerebellum is involved in language, motor tasks and cognitive processes such as attention or emotional processing. Therefore, an automatic and accurate segmentation method is highly desirable to measure and understand the cerebellum role in normal and pathological brain development. In this work, we propose a patch-based multi-atlas segmentation tool called CERES (CEREbellum Segmentation) that is able to automatically parcellate the cerebellum lobules. The proposed method works with standard resolution magnetic resonance T1-weighted images and uses the Optimized PatchMatch algorithm to speed up the patch matching process. The proposed method was compared with related recent state-of-the-art methods showing competitive results in both accuracy (average DICE of 0.7729) and execution time (around 5 minutes). Copyright © 2016 Elsevier Inc. All rights reserved.
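    The Optimized PatchMatch algorithm mentioned above accelerates nearest-patch search; the brute-force baseline it speeds up is a plain sum-of-squared-differences (SSD) scan over candidate patches, sketched here on toy data (this is the generic technique, not the CERES implementation).

```python
import numpy as np

def nearest_patch_ssd(query, candidates):
    """Return the index of the candidate patch minimizing SSD to the query.
    Brute-force baseline; PatchMatch-style methods avoid this full scan."""
    ssds = [float(np.sum((query - c) ** 2)) for c in candidates]
    return int(np.argmin(ssds))

rng = np.random.default_rng(0)
candidates = [rng.random((3, 3)) for _ in range(10)]
query = candidates[4] + 0.01 * rng.random((3, 3))  # near-copy of patch 4
print(nearest_patch_ssd(query, candidates))
```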

  19. Implementation of a quality assurance process for non-therapeutic infant male circumcision providers in North West England.

    PubMed

    Whittaker, P J; Gollins, H J; Roaf, E J

    2014-03-01

    Infant male circumcision is practised by many groups for religious and cultural reasons. Prompted by a desire to minimize the complication rate and to help parents identify good quality providers, a quality assurance (QA) process for infant male circumcision providers has been developed in Greater Manchester. Local stakeholders agreed a set of minimum standards, and providers were invited to submit evidence of their practice in relation to these standards. In participation with parents, community groups, faith groups, healthcare staff and safeguarding partners, an information leaflet for parents was produced. Engagement work with local community groups, faith groups, providers and healthcare staff was vital to ensure that the resources are accessible to parents and that providers continue to engage in the process. Providers that met the QA standards have been listed on a local website. Details of the website are included in the information leaflet distributed by maternity services, health visitors, primary care and community and faith groups. The leaflet is available in seven languages. Local QA processes can be used to encourage and identify good practice and to support parents who need to access services outside the remit of the National Health Service.

  20. The Use of Dreams in Psychotherapy

    PubMed Central

    Schredl, Michael; Bohusch, Claudia; Kahl, Johanna; Mader, Andrea; Somesan, Alexandra

    2000-01-01

    Since the publication of Sigmund Freud's The Interpretation of Dreams, dream interpretation has been a standard technique often used in psychotherapy. However, empirical studies about the frequency of working on dreams in therapy are lacking. The present study elicited, via a self-developed questionnaire, various aspects of work on dreams applied by psychotherapists in private practice. The findings indicate that dreams were often used in therapy, especially in psychoanalysis. In addition, a significant relationship was found between the frequency of the therapists' working on their own dreams and frequency of work on dreams in therapy. Because work on dreams was rated as beneficial for the clients, further studies investigating the effectiveness and the process of working on dreams will be of interest. PMID:10793127

  1. New Directions in Space Operations Services in Support of Interplanetary Exploration

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.

    2005-01-01

    To gain access to the necessary operational processes and data in support of NASA's Lunar/Mars Exploration Initiative, new services, adequate levels of computing cycles and access to myriad forms of data must be provided to onboard spacecraft and ground based personnel/systems (earth, lunar and Martian) to enable interplanetary exploration by humans. These systems, cycles and access to vast amounts of development, test and operational data will be required to provide a new level of services not currently available to existing spacecraft, on board crews and other operational personnel. Although current voice, video and data systems in support of current space based operations have been adequate, new highly reliable and autonomous processes and services will be necessary for future space exploration activities. These services will range from the more mundane voice in LEO to voice in interplanetary travel, which because of the high latencies will require new voice processes and standards. New services, like component failure predictions based on data mining of significant quantities of data located at disparate locations, will be required. 3D or holographic representation of onboard components, systems or family members will greatly improve maintenance, operations and service restoration, not to mention crew morale. Current operational systems and standards, like the Internet Protocol, will not be able to provide the level of service required end to end, from an end point on the Martian surface, such as a scientific instrument, to a researcher at a university. Ground operations, whether earth, lunar or Martian, and in-flight operations to the moon and especially to Mars will require significant autonomy, which in turn will require access to highly reliable processing capabilities and data storage based on network storage technologies. Significant processing cycles will be needed onboard but could be borrowed from other locations, either ground based or onboard other spacecraft. 
Reliability will be a key factor, with onboard and distributed backup processing an absolutely necessary requirement. Current cluster processing/Grid technologies may provide the basis for these services. An overview of existing services, future services that will be required, and the technologies and standards that need to be developed will be presented. The purpose of this paper will be to initiate a technological roadmap, albeit at a high level, from current voice, video, data and network technologies and standards (which show promise for adaptation or evolution) to those that need to be redefined or adjusted, and the areas where new ones require development. The roadmap should begin to differentiate between unmanned and manned processes/services where applicable. The paper will be based in part on the activities of the CCSDS Monitor and Control working group, which is beginning to standardize these processes. Another element of the paper will be based on an analysis of current technologies supporting space flight processes and services at JSC, MSFC, GSFC and, to a lesser extent, KSC. Work being accomplished in areas such as Grid computing, data mining and network storage at ARC, IBM and the University of Alabama in Huntsville will be researched and analyzed.

  2. Interoperability in planetary research for geospatial data analysis

    NASA Astrophysics Data System (ADS)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
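The OGC services listed above are all invoked through simple parameterized HTTP requests. As a minimal sketch, a WMS GetMap call can be assembled as below; the endpoint and layer name are hypothetical, but the parameter names follow the OGC WMS 1.3.0 specification:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Build an OGC WMS 1.3.0 GetMap request URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # min/max coords per the CRS axis order
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical planetary WMS endpoint and mosaic layer name.
url = wms_getmap_url("https://example.org/mars/wms", "mars_mola_mosaic",
                     bbox=(-90, -180, 90, 180))
```

The same key-value pattern carries over to WMTS, WFS, and WCS requests, which is much of what makes these services interoperable across clients.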

  3. 45 CFR 2543.84 - Contract Work Hours and Safety Standards Act.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Contract Work Hours and Safety Standards Act. 2543... laborer on the basis of a standard work week of 40 hours. Work in excess of the standard work week is... pay for all hours worked in excess of 40 hours in the work week. Section 107 of the Act is applicable...

  4. Academy of nutrition and dietetics: revised 2014 standards of practice and standards of professional performance for registered dietitian nutritionists (competent, proficient, and expert) in sports nutrition and dietetics.

    PubMed

    Steinmuller, Patricia L; Kruskall, Laura J; Karpinski, Christine A; Manore, Melinda M; Macedonio, Michele A; Meyer, Nanna L

    2014-04-01

    Sports nutrition and dietetics addresses relationships of nutrition with physical activity, including weight management, exercise, and physical performance. Nutrition plays a key role in the prevention and treatment of obesity and chronic disease, in the maintenance of health, and in the ability to engage in physical activity, sports, and other aspects of physical performance. Thus, the Sports, Cardiovascular, and Wellness Nutrition Dietetic Practice Group, with guidance from the Academy of Nutrition and Dietetics Quality Management Committee, has developed the Revised 2014 Standards of Practice and Standards of Professional Performance as a resource for Registered Dietitian Nutritionists working in sports nutrition and dietetics to assess their current skill levels and to identify areas for further professional development in this emerging practice area. The revised document reflects advances in sports nutrition and dietetics practice since the original standards were published in 2009 and replaces those standards. The Standards of Practice represent the four steps in the Nutrition Care Process as applied to the care of patients/clients. The Standards of Professional Performance cover six standards of professional performance: quality in practice, competence and accountability, provision of services, application of research, communication and application of knowledge, and utilization and management of resources. Within each standard, specific indicators provide measurable action statements that illustrate how the standards can be applied to practice. The indicators describe three skill levels (competent, proficient, and expert) for Registered Dietitian Nutritionists working in sports nutrition and dietetics. The Standards of Practice and Standards of Professional Performance are complementary resources for Registered Dietitian Nutritionists in sports nutrition and dietetics practice. Copyright © 2014 Academy of Nutrition and Dietetics.
Published by Elsevier Inc. All rights reserved.

  5. "State of the Art" of technical protection measures in Austria and the effectiveness documented during bedload and debris flow events

    NASA Astrophysics Data System (ADS)

    Moser, Markus; Mehlhorn, Susanne; Rudolf-Miklau, Florian; Suda, Jürgen

    2017-04-01

    Since the beginning of systematic torrent control in Austria 130 years ago, barriers have been constructed for protection purposes. Until the end of the 1960s, solid barriers were built at the exits of depositional areas to prevent dangerous debris flows from reaching high-consequence areas. The development of solid barriers with large slots or slits to regulate sediment transport began with the use of reinforced concrete during the 1970s (Rudolf-Miklau, Suda 2011). In order to dissipate the energy of debris flows, debris flow breakers have been designed since the 1980s. By slowing and depositing the surge front of the debris flow, downstream reaches of the stream channel and settlement areas should be exposed to considerably lower dynamic impact. In the past, the technological development of these constructions was steered only by the experience of engineering practice, while an institutionalized process of standardization comparable to that in other engineering branches did not exist. In future, all structures will have to be designed and dimensioned according to the EUROCODE standards. This was the reason to establish an interdisciplinary working group (ON-K 256) at the Austrian Standards Institute (ASI), which has managed to develop comprehensive new technical standards for torrent control engineering, including load models, design, dimensioning, and life cycle assessment of torrent control works (the ONR 24800 series of technical standards). Extreme torrential events comprise four definable displacement processes: floods, fluvial solid transport, hyper-concentrated solid transport (debris floods), and debris flow (stony debris flow or mud-earth flow). As a rule, the design of torrential barriers has to follow their function (Kettl, 1984). Modern protection concepts in torrent control are scenario-oriented and try to optimize different functions in a chain of protection structures (function chain). 
The first step in designing the optimal construction type is to define the displacement processes for each torrent section; the criteria for each process are defined in Austria in the ONR 24800 series of technical standards. According to ONR 24800, the functions of torrential barriers can be divided into process-control functional types (retention; dosing and filtering; energy dissipation). The last step is the design of the construction type itself. Bedload and debris events in Austria have demonstrated the functionality of the barriers. On the basis of these findings and results, some recommendations were derived to improve the function fulfilment of the technical protection measures.

  6. Simulant Basis for the Standard High Solids Vessel Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Reid A.; Fiskum, Sandra K.; Suffield, Sarah R.

    The Waste Treatment and Immobilization Plant (WTP) is working to develop a Standard High Solids Vessel Design (SHSVD) process vessel. To support testing of this new design, WTP engineering staff requested that a Newtonian simulant and a non-Newtonian simulant be developed that would represent the Most Adverse Design Conditions (in development) with respect to mixing performance as specified by WTP. The majority of the simulant requirements are specified in 24590-PTF-RPT-PE-16-001, Rev. 0. The first step in this process is to develop the basis for these simulants. This document describes the basis for the properties of these two simulant types. The simulant recipes that meet this basis will be provided in a subsequent document.

  7. Recommendations for selecting drug-drug interactions for clinical decision support.

    PubMed

    Tilson, Hugh; Hines, Lisa E; McEvoy, Gerald; Weinstein, David M; Hansten, Philip D; Matuszewski, Karl; le Comte, Marianne; Higby-Baker, Stefanie; Hanlon, Joseph T; Pezzullo, Lynn; Vieson, Kathleen; Helwig, Amy L; Huang, Shiew-Mei; Perre, Anthony; Bates, David W; Poikonen, John; Wittie, Michael A; Grizzle, Amy J; Brown, Mary; Malone, Daniel C

    2016-04-15

    Recommendations for including drug-drug interactions (DDIs) in clinical decision support (CDS) are presented. A conference series was conducted to improve CDS for DDIs. A work group consisting of 20 experts in pharmacology, drug information, and CDS from academia, government agencies, health information vendors, and healthcare organizations was convened to address (1) the process to use for developing and maintaining a standard set of DDIs, (2) the information that should be included in a knowledge base of standard DDIs, (3) whether a list of contraindicated drug pairs can or should be established, and (4) how to more intelligently filter DDI alerts. We recommend a transparent, systematic, and evidence-driven process with graded recommendations by a consensus panel of experts and oversight by a national organization. We outline key DDI information needed to help guide clinician decision-making. We recommend judicious classification of DDIs as contraindicated and more research to identify methods to safely reduce repetitive and less-relevant alerts. An expert panel with a centralized organizer or convener should be established to develop and maintain a standard set of DDIs for CDS in the United States. The process should be evidence driven, transparent, and systematic, with feedback from multiple stakeholders for continuous improvement. The scope of the expert panel's work should be carefully managed to ensure that the process is sustainable. Support for research to improve DDI alerting in the future is also needed. Adoption of these steps may lead to consistent and clinically relevant content for interruptive DDIs, thus reducing alert fatigue and improving patient safety. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  8. Nurse practitioners as attending providers for workers with uncomplicated back injuries: using administrative data to evaluate quality and process of care.

    PubMed

    Sears, Jeanne M; Wickizer, Thomas M; Franklin, Gary M; Cheadle, Allen D; Berkowitz, Bobbie

    2007-08-01

    The objectives of this study were 1) to identify quality and process of care indicators available in administrative workers' compensation data and to document their association with work disability outcomes, and 2) to use these indicators to assess whether nurse practitioners (NPs), recently authorized to serve as attending providers for injured workers in Washington State, performed differently than did primary care physicians (PCPs). Quality and process of care indicators for NP and PCP back injury claims from Washington State were compared using direct standardization and logistic regression. This study found little evidence of differences between NP and PCP claims in case mix or quality of care. The process of care indicators that we identified were highly associated with the duration of work disability and have potential for further development to assess and promote quality improvement.
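Direct standardization, as used to compare the NP and PCP claim groups, weights each stratum-specific rate by a common standard population so that the groups become comparable despite differing case mixes. A minimal sketch, with hypothetical strata, rates, and weights rather than the study's data:

```python
def directly_standardized_rate(stratum_rates, standard_weights):
    """Directly standardized rate: each stratum-specific rate is
    weighted by the standard population's share of that stratum."""
    assert abs(sum(standard_weights.values()) - 1.0) < 1e-9
    return sum(rate * standard_weights[s] for s, rate in stratum_rates.items())

# Hypothetical age strata and disability rates for one claim group,
# weighted to a common (standard) age distribution.
np_rates = {"<40": 0.10, "40-59": 0.20, "60+": 0.30}
weights  = {"<40": 0.50, "40-59": 0.30, "60+": 0.20}
np_std = directly_standardized_rate(np_rates, weights)  # 0.05 + 0.06 + 0.06
```

Applying the same weights to a second group's stratum rates yields rates that can be compared head to head, which is the point of the technique.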

  9. The role of working memory and declarative memory in trace conditioning

    PubMed Central

    Connor, David A.; Gould, Thomas J.

    2017-01-01

    Translational assays of cognition that are similarly implemented in both lower and higher-order species, such as rodents and primates, provide a means to reconcile preclinical modeling of psychiatric neuropathology and clinical research. To this end, Pavlovian conditioning has provided a useful tool for investigating cognitive processes in both lab animal models and humans. This review focuses on trace conditioning, a form of Pavlovian conditioning typified by the insertion of a temporal gap (i.e., trace interval) between presentations of a conditioned stimulus (CS) and an unconditioned stimulus (US). This review aims to discuss pre-clinical and clinical work investigating the mnemonic processes recruited for trace conditioning. Much work suggests that trace conditioning involves unique neurocognitive mechanisms to facilitate formation of trace memories in contrast to standard Pavlovian conditioning. For example, the hippocampus and prefrontal cortex (PFC) appear to play critical roles in trace conditioning. Moreover, cognitive mechanistic accounts in human studies suggest that working memory and declarative memory processes are engaged to facilitate formation of trace memories. The aim of this review is to integrate cognitive and neurobiological accounts of trace conditioning from preclinical and clinical studies to examine involvement of working and declarative memory. PMID:27422017

  10. Bringing Standardized Processes in Atom-Probe Tomography: I Establishing Standardized Terminology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Ian M; Danoix, F; Forbes, Richard

    2011-01-01

    Defining standardized methods requires careful consideration of the entire field and its applications. The International Field Emission Society (IFES) has elected a Standards Committee, whose task is to determine the steps needed to establish atom-probe tomography as an accepted metrology technique. Specific tasks include developing protocols or standards for: terminology and nomenclature; metrology and instrumentation, including specifications for reference materials; test methodologies; modeling and simulations; and science-based health, safety, and environmental practices. The Committee is currently working on defining terminology related to atom-probe tomography, with the goal of including the terms in a document published by the International Organization for Standardization (ISO). Many terms also used in other disciplines have already been defined and will be discussed for adoption in the context of atom-probe tomography.

  11. CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.

    PubMed

    Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng

    2017-01-01

    Standardized terminology is a prerequisite for data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is therefore necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidence model to determine how terminologies are mapped to a standard system, such as ICD-10. The model uses mappings from different healthcare organizations and evaluates the diversity of the mappings to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping results and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.
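The consensus step of such a crowdsourced mapping can be sketched as a simple vote-share rule. This is only a stand-in for CrowdMapping's confidence model, whose details the abstract does not give; the organizations, local terms, and threshold below are hypothetical:

```python
from collections import Counter

def aggregate_mappings(submissions, min_share=0.6):
    """Aggregate (organization, local term, standard code) submissions;
    accept a code for a term only when a sufficient share of the
    submitters agree on it (simple stand-in for a confidence model)."""
    by_term = {}
    for _org, term, code in submissions:
        by_term.setdefault(term, []).append(code)
    consensus = {}
    for term, codes in by_term.items():
        code, votes = Counter(codes).most_common(1)[0]
        if votes / len(codes) >= min_share:
            consensus[term] = code
    return consensus

# Hypothetical submissions mapping local terms to ICD-10 codes.
subs = [("H1", "heart attack", "I21"),
        ("H2", "heart attack", "I21"),
        ("H3", "heart attack", "I25"),
        ("H1", "sugar disease", "E11"),
        ("H2", "sugar disease", "E10")]
result = aggregate_mappings(subs)  # "sugar disease" stays unmapped: no consensus
```

Terms that fail the threshold would be the ones routed back to users for rating and re-evaluation, in the spirit of the interactive loop the abstract describes.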

  12. Standards for contamination control

    NASA Astrophysics Data System (ADS)

    Borson, Eugene N.

    2004-10-01

    Standards are an important component of national and international trade. We depend upon standards to assure that manufactured parts will work together, wherever they are made, and that we speak the same technical language, no matter what language we speak. Understanding is important in order to know when to take exceptions to or tailor the standard to fit the job. Standards that are used in contamination control have increased in numbers over the years as more industries have had to improve their manufacturing processes to enhance reliability or yields of products. Some older standards have been revised to include new technologies, and many new standards have been developed. Some of the new standards were written for specific industries while others apply to many industries. Many government standards have been replaced with standards from nongovernmental standards organizations. This trend has been encouraged by U.S. law that requires the government to use commercial standards where possible. This paper reviews some of the more important standards for the aerospace industry, such as IEST-STD-CC1246 and ISO 14644-1, that have been published in recent years. Benefits, usage, and problems with some standards will be discussed. Some standards are referenced, and websites of some standards organizations are listed.

  13. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  14. Standardized patient feedback: making it work across disciplines.

    PubMed

    Dayer Berenson, Linda; Goodill, Sharon W; Wenger, Sarah

    2012-01-01

    In health professions education, feedback can be defined as the sharing of information about a student's performance. The most valuable learning occurs when students receive detailed feedback delivered in a way they can utilize it. In clinical simulations, feedback from a standardized patient (SP) offers a unique perspective. This article presents some of the underlying theory and research on feedback delivery with a particular emphasis on the role of non-verbal communication. We explore what feedback students need from SPs, how to provide feedback effectively as well as common challenges to the process. The authors, working from different health care disciplines, collaborated to develop a training workshop for the college's SPs designed to ensure a consistent approach to SP feedback delivery. We describe this workshop and its outcomes.

  15. [HL7 standard--features, principles, and methodology].

    PubMed

    Koncar, Miroslav

    2005-01-01

    The mission of the non-profit organization HL7 Inc. is to provide standards for the exchange, management, and integration of data that support clinical patient care and the management, delivery, and evaluation of healthcare services. As the standards developed by HL7 Inc. represent the world's most influential standardization efforts in the field of medical informatics, the HL7 family of standards has been recognized by the technical and scientific community as the foundation for the next generation of healthcare information systems. Versions 1 and 2 of the HL7 standard solved many issues, but also demonstrated the size and complexity of the health information sharing problem. As a solution, a completely new methodology was adopted, encompassed in the HL7 Version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all derived domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project in the Republic of Croatia in 2002, the decision was made to go directly to Version 3. The target scope of work includes clinical, financial, and administrative data management in the domain of healthcare processes. By using the standardized HL7v3 methodology we were able to map the Croatian primary healthcare domain completely to HL7v3 artefacts. Further refinement processes planned for the future will provide semantic interoperability and a detailed description of all elements in HL7 messages. Our HL7 Business Component is in a constant process of studying different legacy applications, laying a solid foundation for their integration into an HL7-enabled communication environment.

  16. N400 Event-Related Potential and Standardized Measures of Reading in Late Elementary School Children: Correlated or Independent?

    PubMed Central

    Coch, Donna; Benoit, Clarisse

    2015-01-01

    We investigated whether and how standardized behavioral measures of reading and electrophysiological measures of reading were related in 72 typically developing, late elementary school children. Behavioral measures included standardized tests of spelling, phonological processing, vocabulary, comprehension, naming speed, and memory. Electrophysiological measures were composed of the amplitude of the N400 component of the event-related potential waveform elicited by real words, pseudowords, nonpronounceable letter strings, and strings of letter-like symbols (false fonts). The only significant brain-behavior correlations were between standard scores on the vocabulary test and N400 mean amplitude to real words (r = −.272) and pseudowords (r = −.235). We conclude that, while these specific sets of standardized behavioral and electrophysiological measures both provide an index of reading, for the most part, they are independent and draw upon different underlying processing resources. "[T]o completely analyze what we do when we read… would be to describe very many of the most intricate workings of the human mind, as well as to unravel the tangled story of the most remarkable specific performance that civilization has learned in all its history" (Huey, 1908/1968, p. 3). PMID:26346715
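The reported brain-behavior relationships are Pearson correlations. A minimal sketch of the computation, using hypothetical vocabulary-score/N400-amplitude pairs rather than the study's data:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: higher vocabulary scores paired with more
# negative N400 amplitudes yield a negative r, as in the abstract.
vocab = [90, 95, 100, 105, 110]
n400 = [-2.0, -2.5, -3.0, -3.5, -4.0]
r = pearson_r(vocab, n400)
```

With real, noisy data the magnitude would of course be far smaller than in this perfectly linear illustration, as the reported r values of −.272 and −.235 show.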

  17. Using Lean Management to Reduce Emergency Department Length of Stay for Medicine Admissions.

    PubMed

    Allaudeen, Nazima; Vashi, Anita; Breckenridge, Julia S; Haji-Sheikhi, Farnoosh; Wagner, Sarah; Posley, Keith A; Asch, Steven M

    The practice of boarding admitted patients in the emergency department (ED) carries negative operational, clinical, and patient satisfaction consequences. Lean tools have been used to improve ED workflow. Interventions focused on reducing ED length of stay (LOS) for admitted patients are less explored. To evaluate a Lean-based initiative to reduce ED LOS for medicine admissions. Prospective quality improvement initiative performed at a single university-affiliated Department of Veterans Affairs (VA) medical center from February 2013 to February 2016. We performed a Lean-based multidisciplinary initiative beginning with a rapid process improvement workshop to evaluate current processes, identify root causes of delays, and develop countermeasures. Frontline staff developed standard work for each phase of the ED stay. Units developed a daily management system to reinforce, evaluate, and refine standard work. The primary outcome was the change in ED LOS for medicine admissions pre- and postintervention. ED LOS at the intervention site was compared with other similar VA facilities as controls over the same time period using a difference-in-differences approach. ED LOS for medicine admissions reduced 26.4%, from 8.7 to 6.4 hours. Difference-in-differences analysis showed that ED LOS for combined medicine and surgical admissions decreased from 6.7 to 6.0 hours (-0.7 hours, P = .003) at the intervention site compared with no change (5.6 hours, P = .2) at the control sites. We utilized Lean management to significantly reduce ED LOS for medicine admissions. Specifically, the development and management of standard work were key to sustaining these results.
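The difference-in-differences estimate quoted above reduces to subtracting the control sites' change from the intervention site's change. A sketch using the mean ED LOS values reported in the abstract:

```python
def diff_in_diff(pre_treat, post_treat, pre_ctrl, post_ctrl):
    """Difference-in-differences estimate: the pre-to-post change at
    the intervention site minus the change at the control sites."""
    return (post_treat - pre_treat) - (post_ctrl - pre_ctrl)

# Means from the abstract: intervention-site ED LOS fell from 6.7 to
# 6.0 hours, while control sites held steady at 5.6 hours.
effect = diff_in_diff(6.7, 6.0, 5.6, 5.6)  # -0.7 hours
```

Subtracting the control trend is what lets the analysis attribute the −0.7-hour change to the Lean initiative rather than to secular trends affecting all VA sites.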

  18. Development of a New Intelligent Joystick for People with Reduced Mobility.

    PubMed

    Mrabet, Makrem; Rabhi, Yassine; Fnaiech, Farhat

    2018-01-01

    Despite the diversity of electric wheelchairs, many people with physical limitations and seniors have difficulty using their standard joystick. As a result, they cannot meet their needs or ensure safe travel. Recent assistive technologies can help to give them autonomy and independence. This work deals with the real-time implementation of an artificial intelligence device to overcome these problems. Following a review of the literature from previous work, we present the methodology and process for implementing our intelligent control system on an electric wheelchair. The system is based on a neural algorithm that overcomes problems with standard joystick maneuvers such as the inability to move correctly in one direction. However, this implies the need for an appropriate methodology to map the position of the joystick handle. Experiments on a real wheelchair are carried out with real patients of the Mohamed Kassab National Institute Orthopedic, Physical and Functional Rehabilitation Hospital of Tunis. The proposed intelligent system gives good results compared to the use of a standard joystick.

  19. Development of a New Intelligent Joystick for People with Reduced Mobility

    PubMed Central

    Mrabet, Makrem; Fnaiech, Farhat

    2018-01-01

    Despite the diversity of electric wheelchairs, many people with physical limitations and seniors have difficulty using their standard joystick. As a result, they cannot meet their needs or ensure safe travel. Recent assistive technologies can help to give them autonomy and independence. This work deals with the real-time implementation of an artificial intelligence device to overcome these problems. Following a review of the literature from previous work, we present the methodology and process for implementing our intelligent control system on an electric wheelchair. The system is based on a neural algorithm that overcomes problems with standard joystick maneuvers such as the inability to move correctly in one direction. However, this implies the need for an appropriate methodology to map the position of the joystick handle. Experiments on a real wheelchair are carried out with real patients of the Mohamed Kassab National Institute Orthopedic, Physical and Functional Rehabilitation Hospital of Tunis. The proposed intelligent system gives good results compared to the use of a standard joystick. PMID:29765462

  20. Specialty Engineering Supplement to IEEE-15288.1

    DTIC Science & Technology

    2015-05-15

    receiver required to work in a dense EMI environment. (15) Any RF receiver with a burnout level of less than 30 dBm (1 mW). b. A summary of all...Context 2.1 ISO-IEC-IEEE-15288: 2015, Systems and Software Engineering — System life cycle processes ISO-IEC-IEEE 15288 is the DOD-adopted standard for...to ISO-15288 for application of systems engineering on defense programs that was developed by a joint services working group under the auspices of the

  1. Pleiades and OCO-2: Using Supercomputing Resources to Process OCO-2 Science Data

    NASA Technical Reports Server (NTRS)

    LaHaye, Nick

    2012-01-01

    For a period of ten weeks I had the opportunity to assist with research for the OCO-2 project on the Science Data Operations System Team. This research involved writing a prototype interface that would serve as a model for the system implemented for the project's operations, provided that the prototype, when tested, worked properly and met the team's standards. This paper gives the details of the research done and its results.

  2. A Survey of Real-Time Operating Systems and Virtualization Solutions for Space Systems

    DTIC Science & Technology

    2015-03-01

    probe, an unmanned spacecraft orbiting Mercury (“Messenger,” n.d.; “VxWorks Space,” n.d.). SpaceX , the private space travel company, uses an unspecified...VxWorks platform on its Dragon reusable spacecraft (“ SpaceX ,” n.d.). 5 Supports the 1003.1 standard but does not provide process creation...2013, March 6). ELC: SpaceX lessons learned. Retrieved from http://lwn.net/ Articles/540368/ 112 Embedded hardware. (n.d.). Retrieved

  3. Methodology of problem-based learning engineering and technology and of its implementation with modern computer resources

    NASA Astrophysics Data System (ADS)

    Lebedev, A. A.; Ivanova, E. G.; Komleva, V. A.; Klokov, N. M.; Komlev, A. A.

    2017-01-01

    The considered method of learning the basics of microelectronic amplifier circuits and systems enables one to understand electrical processes more deeply, to understand the relationship between static and dynamic characteristics and, finally, to bring the learning process closer to the cognitive process. The scheme of problem-based learning can be represented by the following sequence of procedures: the contradiction is perceived and revealed; cognitive motivation is provided by creating a problematic situation (a mental state of the student) that drives the desire to solve the problem and to ask the question "why?"; a hypothesis is made; searches for solutions are carried out; the answer is sought. Due to the complexity of the architectural schemes, modern methods of computer analysis and synthesis are considered in this work. Examples are given of analog circuits with improved performance engineered by students, within the framework of student research work, using standard software and software developed at the Department of Microelectronics of MEPhI.

  4. Active-passive bistatic surveillance for long range air defense

    NASA Astrophysics Data System (ADS)

    Wardrop, B.; Molyneux-Berry, M. R. B.

    1992-06-01

    A hypothetical mobile support receiver capable of working within existing and future air defense networks as a means to maintain essential surveillance functions is considered. It is shown how multibeam receiver architecture supported by digital signal processing can substantially improve surveillance performance against chaff and jamming threats. A dual-mode support receiver concept is proposed which is based on the state-of-the-art phased-array technology, modular processing in industry standard hardware and existing networks.

  5. NBS (National Bureau of Standards): Materials measurements. [space processing experiments

    NASA Technical Reports Server (NTRS)

    Manning, J. R.

    1983-01-01

    Work is reported that is directed toward the measurement of materials properties important to the design and interpretation of space processing experiments, and toward determining how the space environment may offer a unique opportunity for performing improved measurements and producing materials with improved properties. Surface tensions and their variation with temperature and impurities, convection during unidirectional solidification, and measurement of the high-temperature thermophysical properties of tungsten-group liquids and solids are discussed, and results are summarized.

  6. Materials for Heated Head Automated Thermoplastic Tape Placement

    NASA Technical Reports Server (NTRS)

    Jensen, Brian J.; Kinney, Megan C.; Cano, Roberto J.; Grimsley, Brian W.

    2012-01-01

    NASA Langley Research Center (LaRC) is currently pursuing multiple paths to develop out-of-autoclave (OOA) polymeric composite materials and processes. Polymeric composite materials development includes the synthesis of new and/or modified thermosetting and thermoplastic matrix resins designed for specific OOA processes. OOA processes currently under investigation include vacuum bag only (VBO) prepreg/composite fabrication, resin transfer molding (RTM), vacuum assisted resin transfer molding (VARTM) and heated head automated thermoplastic tape placement (HHATP). This paper will discuss the NASA Langley HHATP facility and capabilities and recent work on characterizing thermoplastic tape quality and requirements for quality part production. Samples of three distinct versions of APC-2 (AS4/PEEK) thermoplastic dry tape were obtained from two materials vendors, TENCATE, Inc. and CYTEC Engineered Materials (standard grade and an experimental batch). Random specimens were taken from each of these samples and subjected to photo-microscopy and surface profilometry. The CYTEC standard grade of APC-2 tape had the most voids and splits and the highest surface roughness and/or waviness. Since the APC-2 tape is composed of a thermoplastic matrix, it offers the flexibility of reprocessing to improve quality, and thereby improve the final quality of HHATP laminates. Discussions will also include potential research areas and future work that is required to advance the state of the art in the HHATP process for composite fabrication.

  7. Ergonomic risk assessment with DesignCheck to evaluate assembly work in different phases of the vehicle development process.

    PubMed

    Winter, Gabriele; Schaub, Karlheinz G; Großmann, Kay; Laun, Gerhard; Landau, Kurt; Bruder, Ralph

    2012-01-01

    Occupational hazards exist if the design of the work situation is not in accordance with ergonomic design principles. At assembly lines, ergonomics is applied to the design of work equipment and tasks and to work organisation. Ignoring ergonomic principles in the planning and design of assembly work leads to unfavourable working postures, action forces and material handling. Disorders of the musculoskeletal system are a common occurrence throughout Europe, and musculoskeletal disorders are a particular challenge against the background of workers with disabilities. Changes in a worker's capability have to be taken into account in the conception of redesigned and new assembly lines. In this way, ergonomics becomes progressively more important in the planning and design of vehicles: the objective of ergonomic design in the different stages of the vehicle development process is to achieve an optimal adaptation of the assembly work to the workers. Hence the ergonomic screening tool "Design Check" (DC) was developed to identify ergonomic deficits in workplace layouts. The screening tool is based on the current ergonomic state of the art in the design of physical work and relevant EU legal requirements. It was tested within a federal German research project at selected work stations at the assembly lines of Dr.-Ing. h.c. F. Porsche AG, Stuttgart. The application of the screening tool DC has since been transferred to other parts of Porsche AG, Stuttgart. It has also been established as a standard ergonomic method for assessing assembly work in different phases of the vehicle development process.

  8. Proposed Standards for Variable Harmonization Documentation and Referencing: A Case Study Using QuickCharmStats 1.1

    PubMed Central

    Winters, Kristi; Netscher, Sebastian

    2016-01-01

    Comparative statistical analyses often require data harmonization, yet the social sciences do not have clear operationalization frameworks that guide and homogenize variable coding decisions across disciplines. When faced with a need to harmonize variables, researchers often look for guidance from various international studies that employ output harmonization, such as the Comparative Survey of Election Studies, which offer recoding structures for the same variable (e.g. marital status). More problematically, there are no agreed documentation standards or journal requirements for reporting variable harmonization to facilitate a transparent replication process. We propose a conceptual and data-driven digital solution that creates harmonization documentation standards for publication and scholarly citation: QuickCharmStats 1.1. It is free and open-source software that allows for the organizing, documenting and publishing of data harmonization projects. QuickCharmStats starts at the conceptual level and its workflow ends with a variable recoding syntax. It is therefore flexible enough to reflect a variety of theoretical justifications for variable harmonization. Using the socio-demographic variable ‘marital status’, we demonstrate how the CharmStats workflow collates metadata while being guided by the scientific standards of transparency and replication. It encourages researchers to publish their harmonization work by providing researchers who complete the peer review process with a permanent identifier. Those who contribute original data harmonization work to their discipline can now be credited through citations. Finally, we propose peer-review standards for harmonization documentation, describe a route to online publishing, and provide a referencing format to cite harmonization projects. Although CharmStats products are designed for social scientists, our adherence to the scientific method ensures our products can be used by researchers across the sciences.
PMID:26859494

  9. Standardization of pitch-range settings in voice acoustic analysis.

    PubMed

    Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C

    2009-05-01

    Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
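The sex-dependent preset parameters described above can be sketched as a simple lookup; the floor/ceiling values below are illustrative conventions commonly used in pitch analysis, not the study's actual settings:

```python
# Hedged sketch: sex-specific preset pitch-analysis windows.
# The Hz values are illustrative conventions, not the study's actual presets.
PITCH_PRESETS_HZ = {
    "male": (75, 300),     # (pitch floor, pitch ceiling)
    "female": (100, 500),
}

def pitch_settings(group):
    """Return the preset analysis window for a speaker group."""
    try:
        floor, ceiling = PITCH_PRESETS_HZ[group]
    except KeyError:
        raise ValueError(f"no preset pitch window for group: {group!r}")
    return {"pitch_floor_hz": floor, "pitch_ceiling_hz": ceiling}
```

A batch script would call `pitch_settings` once per file instead of hand-tuning each recording, which is what collapses weeks of case-by-case analysis into hours.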

  10. Packet telemetry and packet telecommand - The new generation of spacecraft data handling techniques

    NASA Technical Reports Server (NTRS)

    Hooke, A. J.

    1983-01-01

    Because of the rising costs and reduced reliability associated with customized spacecraft and ground network hardware and software, the standardized Packet Telemetry and Packet Telecommand concepts are emerging as viable alternatives. Within each concept, autonomous packets of data are created by ground and space application processes using standard formatting techniques and are switched end-to-end through the space data network to their destination application processes using standard transfer protocols. Because the intermediate data networks can be designed to be completely mission-independent, this approach may facilitate a high degree of automation and interoperability. The goals of the Consultative Committee for Space Data Systems are the adoption of an international guideline for future space telemetry formatting based on the Packet Telemetry concept, and the advancement of the NASA-ESA Working Group's Packet Telecommand concept to a level of maturity parallel to that of Packet Telemetry. Both the Packet Telemetry and Packet Telecommand concepts are reviewed.

  11. Software tool for physics chart checks.

    PubMed

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

    The physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the author's radiation oncology clinic. During over 1 year of use, the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. It is potentially useful for any radiation oncology clinic that is either in the process of pursuing or maintaining American College of Radiology accreditation.

  12. [Enhancement of quality by employing qualification-oriented staff and team-oriented cooperation].

    PubMed

    Meyenburg-Altwarg, Iris; Tecklenburg, Andreas

    2010-01-01

    Taking three practical examples from a university hospital, the present article describes how quality can be improved by linking the deployment of qualification-oriented staff with team-oriented cooperation, especially with regard to the professional groups of physicians and nurses. In the first example, a cross-professional work group defined tasks which, in a legally acceptable manner, allow selected activities to be transferred from physicians to nurses, improving the work processes of all persons concerned. Work and duty profiles, training and modified work processes were created and implemented according to the PDCA cycle. The first evaluation took place after nine months using interviews, questionnaires (patients, physicians, and nurses) as well as CIRS. In the second example, emphasis was placed on offering supplementary services for private patients, lightening the workload of the nursing staff. These supplementary services are intended to enhance the wellbeing of the patients. Special external-service staff provide high-standard hotel services, which consistently receive high ratings from the patients. The methods used for introduction and evaluation are analogous to those used in the first example. The third example is concerned with extending nursing care and patient empowerment beyond the boundaries of ward and hospital. The guideline was the implementation of the national expert standard for discharge management according to the DNQP. The methods of introduction were analogous to those used in example 1. For the evaluation, interviews were conducted with all participating groups. In all examples, actual quantitative measures (key ratios) are not yet available; however, the data collected from the interviews and questionnaires of all the participants are promising.

  13. A Clinician-Centered Evaluation of the Usability of AHLTA and Automated Clinical Practice Guidelines at TAMC

    DTIC Science & Technology

    2011-03-31

    evidence based medicine into clinical practice. It will decrease costs and enable multiple stakeholders to work in an open content/source environment to exchange clinical content, develop and test technology and explore processes in applied CDS. Design: Comparative study between the KMR infrastructure and capabilities developed as an open source, vendor agnostic solution for aCPG execution within AHLTA and the current DoD/MHS standard evaluating: H1: An open source, open standard KMR and Clinical Decision Support Engine can enable organizations to share domain

  14. Towards a Quantitative Performance Measurement Framework to Assess the Impact of Geographic Information Standards

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, D.; Van Orshoven, J.; Vancauwenberghe, G.

    2012-12-01

    Over the last decades, the use of Geographic Information (GI) has gained importance, in the public as well as the private sector. But even if many spatial data sets and related information exist, these data sets are scattered over many organizations and departments. In practice it remains difficult to find the spatial data sets needed, and to access, obtain and prepare them for use in applications. Therefore, Spatial Data Infrastructures (SDIs) have been developed to enhance the access, use and sharing of GI. SDIs consist of a set of technological and non-technological components to reach this goal. Since the 1990s, many SDI initiatives have seen the light of day. Ultimately, all these initiatives aim to enhance the flow of spatial data between organizations (users as well as producers) involved in intra- and inter-organizational and even cross-country business processes. However, the flow of information and its re-use in different business processes requires technical and semantic interoperability: the first should guarantee that system components can interoperate and use the data, while the second should guarantee that data content is understood by all users in the same way. GI-standards within the SDI are necessary to make this happen. However, it is not known whether this is realized in practice. Therefore, the objective of the research is to develop a quantitative framework to assess the impact of GI-standards on the performance of business processes. For that purpose, indicators are defined and tested in several cases throughout Europe. The proposed research will build upon previous work carried out in the SPATIALIST project, which analyzed the impact of different technological and non-technological factors on the SDI-performance of business processes (Dessers et al., 2011). The current research aims to apply quantitative performance measurement techniques, which are frequently used to measure the performance of production processes (Anupindi et al., 2006).
Key to reaching the research objectives is a correct design of the test cases. The major challenges are to set up the analytical framework for analyzing the impact of GI-standards on process performance, to define the appropriate indicators, and to choose the right test cases. In order to do so, it is proposed to define the test cases as 8 pairs of organizations (see figure). The paper will present the state of the art of performance measurement in the context of work processes, propose a series of SMART indicators for describing the set-up and measuring the performance, define the test case set-up, and suggest criteria for the selection of the test cases, i.e. the organizational pairs. References: Anupindi, R., Chopra, S., Deshmukh, S.D., Van Mieghem, J.A., & Zemel, E. (2006). Managing Business Process Flows: Principles of Operations Management. New Jersey, USA: Prentice Hall. Dessers, D., Crompvoets, J., Janssen, K., Vancauwenberghe, G., Vandenbroucke, D. & Vanhaverbeke, L. (2011). SDI at work: The Spatial Zoning Plans Case. Leuven, Belgium: Katholieke Universiteit Leuven.

  15. Development of Field Methodology and Processes for Task Analysis and Training Feedback

    DTIC Science & Technology

    1978-10-31

    To evaluate technical ability and/or administration of shop supply element and... 5. If part is in, notifies Shop Office of the job status... Repairs are completed within a reasonable time frame consistent with prevailing conditions and published standards. 5. Completion of work must be

  16. Stability of Work Values: Individual Differences and Relationship with Decision Making.

    ERIC Educational Resources Information Center

    Ravlin, Elizabeth C.; And Others

    Values in the workplace have long been a topic of interest for researchers in organizational behavior and management practitioners alike. Values are believed to be deeply internalized standards for personal behavior because they are based on a person's experience. Relatively little attention has been paid to the processes relating to…

  17. Guided Science Inquiry Instruction with Students with Special Education Needs. R2Ed Working Paper 2015-1

    ERIC Educational Resources Information Center

    White, Andrew S.; Kunz, Gina M.; Whitham, Rebekah; Houston, Jim; Nugent, Gwen

    2015-01-01

    National and state educational mandates require students achieve proficiency in not only science content, but also "science inquiry", or those process skills associated with science (National Research Council, 2011; Next Generation Science Standards, 2013). Science inquiry instruction has been shown to improve student achievement and…

  18. Quality Assurance for Digital Learning Object Repositories: Issues for the Metadata Creation Process

    ERIC Educational Resources Information Center

    Currier, Sarah; Barton, Jane; O'Beirne, Ronan; Ryan, Ben

    2004-01-01

    Metadata enables users to find the resources they require; it is therefore an important component of any digital learning object repository. Much work has already been done within the learning technology community to assure metadata quality, focused on the development of metadata standards, specifications and vocabularies and their implementation…

  19. Thrown object hazards in forest operations

    Treesearch

    Robert Rummer; John Klepac

    2011-01-01

    Mechanized equipment for forest operations provides better operator protection in this hazardous work environment. However, operators of forestry cutting machines are now exposed to new hazards from the high-energy cutting devices used to cut trees and process logs. Anecdotal reports of thrown objects document a risk of injury and fatality. Two new ISO standards have...

  20. Integration of Mathematical and Natural-Science Knowledge in School Students' Project-Based Activity

    ERIC Educational Resources Information Center

    Luneeva, Olga L.; Zakirova, Venera G.

    2017-01-01

    The implementation of new educational standards prioritizes the project-based principle of training in school education. Therefore, consideration of educational activity only as a process of obtaining ready-made knowledge should be abandoned. Thus, the relevance of the studied problem is substantiated by the need to develop methodical works connected with the…

  1. DACUM: Bridging the Gap between Work and High Performance.

    ERIC Educational Resources Information Center

    Norton, Robert E.; McLennan, Krystyna S.

    The DACUM (Developing A Curriculum) occupational analysis process provides a systematic way to look at worker duties and tasks so that important knowledge, skills, standards, tools, and attitudes can be handed on to the next generation of workers. Revamped by The Ohio State University's Center on Education and Training for Employment, DACUM…

  2. Expanding horizons. Integrating environmental health in occupational health nursing.

    PubMed

    Rogers, B; Cox, A R

    1998-01-01

    1. Environmental hazards are ubiquitous. Many exist in the workplace or occur as a result of work process exposures. 2. Environmental health is a natural component of the expanding practice of occupational health nursing. 3. AAOHN's vision for occupational and environmental health will continue to set the standard and provide leadership in the specialty.

  3. Promoting Learning and Achievement through Self-Assessment

    ERIC Educational Resources Information Center

    Andrade, Heidi; Valtcheva, Anna

    2009-01-01

    Criteria-referenced self-assessment is a process during which students collect information about their own performance or progress; compare it to explicitly stated criteria, goals, or standards; and revise accordingly. The authors argue that self-assessment must be a formative type of assessment, done on drafts of works in progress: It should not…

  4. Open-Ended Questions and the Process Standards

    ERIC Educational Resources Information Center

    Sanchez, Wendy B.

    2013-01-01

    Open-ended questions, as discussed in this article, are questions that can be solved or explained in a variety of ways, that focus on conceptual aspects of mathematics, and that have the potential to expose students' understanding and misconceptions. When working with teachers who are using open-ended questions with their students for the…

  5. Group Work during International Disaster Outreach Projects: A Model to Advance Cultural Competence

    ERIC Educational Resources Information Center

    West-Olatunji, Cirecie; Henesy, Rachel; Varney, Melanie

    2015-01-01

    Given the rise in disasters worldwide, counselors will increasingly be called upon to respond. Current accreditation standards require that programs train students to become skillful in disaster/crisis interventions. Group processing to enhance self-awareness and improve conceptualization skills is an essential element of such training. This…

  6. VLSI design of lossless frame recompression using multi-orientation prediction

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Hsuan; You, Yi-Lun; Chen, Yi-Guo

    2016-01-01

    Pursuing an experience of high-end visual quality drives humans to demand higher display resolutions and higher frame rates. Hence, many powerful coding tools are aggregated in emerging video coding standards to improve coding efficiency. This also makes video coding standards suffer from two design challenges: heavy computation and tremendous memory bandwidth. The first issue can be properly solved by a careful hardware architecture design with advanced semiconductor processes. Nevertheless, the second one becomes a critical design bottleneck for a modern video coding system. In this article, a lossless frame recompression technique using multi-orientation prediction is proposed to overcome this bottleneck. This work is realised as a silicon chip in a TSMC 0.18 µm CMOS process. Its encoding capability can reach full-HD (1920 × 1080)@48 fps. The chip power consumption is 17.31 mW@100 MHz. Core area and chip area are 0.83 × 0.83 mm2 and 1.20 × 1.20 mm2, respectively. Experimental results demonstrate that this work exhibits an outstanding lossless compression ratio with competitive hardware performance.
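The general idea of prediction-based lossless recompression can be illustrated with a minimal sketch (hypothetical, and much simpler than the paper's multi-orientation scheme): each row is predicted either from its left neighbours or from the row above, whichever orientation leaves the smaller residual energy, and the decoder inverts the prediction exactly:

```python
import numpy as np

def encode(frame):
    """Per row, pick the prediction orientation ("L" = left neighbour,
    "T" = top neighbour) with the smaller total absolute residual."""
    frame = np.asarray(frame).astype(np.int32)
    h, w = frame.shape
    residuals = np.zeros_like(frame)
    modes = []
    prev_row = np.zeros(w, dtype=np.int32)
    for y in range(h):
        row = frame[y]
        left_pred = np.concatenate(([0], row[:-1]))  # predict from left pixel
        r_left = row - left_pred
        r_top = row - prev_row                       # predict from row above
        if np.abs(r_left).sum() <= np.abs(r_top).sum():
            residuals[y], mode = r_left, "L"
        else:
            residuals[y], mode = r_top, "T"
        modes.append(mode)
        prev_row = row
    return residuals, modes

def decode(residuals, modes):
    """Invert the prediction exactly: reconstruction is lossless."""
    frame = np.zeros_like(residuals)
    prev_row = np.zeros(residuals.shape[1], dtype=residuals.dtype)
    for y, mode in enumerate(modes):
        if mode == "T":
            frame[y] = prev_row + residuals[y]
        else:  # "L": pixel[x] = pixel[x-1] + residual[x] -> cumulative sum
            frame[y] = np.cumsum(residuals[y])
        prev_row = frame[y]
    return frame
```

Only the residuals and one mode flag per row would then be entropy-coded; because the prediction is inverted exactly, no pixel information is lost, which is the property a frame-recompression buffer needs.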

  7. International Federation of Nurse Anesthetists' anesthesia program approval process.

    PubMed

    Horton, B J; Anang, S P; Riesen, M; Yang, H-J; Björkelund, K B

    2014-06-01

    The International Federation of Nurse Anesthetists is improving anaesthesia patient care through a voluntary Anesthesia Program Approval Process (APAP) for schools and programmes. It is the result of a coordinated effort by anaesthesia leaders from many nations to implement a voluntary quality improvement system for education. These leaders firmly believe that meeting international education standards is an important way to improve anaesthesia, pain management and resuscitative care to patients worldwide. By 2013, 14 anaesthesia programmes from France, Iceland, Indonesia, Philippines, Sweden, Switzerland, Netherlands, Tunisia and the USA had successfully completed the process. Additional programmes were scheduled for review in 2014. Faculty from these programmes, who have successfully completed APAP, show how anaesthesia educators throughout the world seek to continually improve education and patient care by pledging to meet common education standards. As national governments, education ministers and heads of education institutions work to decrease shortages of healthcare workers, they would benefit from considering the value offered by quality improvement systems supported by professional organizations. When education programmes are measured against standards developed by experts in a profession, policy makers can be assured that the programmes have met certain standards of quality. They can also be confident that graduates of approved programmes are appropriately trained healthcare workers for their citizens. © 2014 International Council of Nurses.

  8. Data preprocessing methods of FT-NIR spectral data for the classification of cooking oil

    NASA Astrophysics Data System (ADS)

    Ruah, Mas Ezatul Nadia Mohd; Rasaruddin, Nor Fazila; Fong, Sim Siong; Jaafar, Mohd Zuli

    2014-12-01

    This work describes data pre-processing methods for FT-NIR spectroscopy datasets of cooking oil and its quality parameters using chemometric methods. Pre-processing of near-infrared (NIR) spectral data has become an integral part of chemometric modelling. Hence, this work is dedicated to investigating the utility and effectiveness of pre-processing algorithms, namely row scaling, column scaling and a single scaling process with Standard Normal Variate (SNV). The combinations of these scaling methods have an impact on exploratory analysis and classification via the Principal Component Analysis (PCA) plot. The samples were divided into palm oil and non-palm cooking oil. The classification model was built using FT-NIR cooking oil spectral datasets in absorbance mode in the range of 4000 cm-1 to 14000 cm-1. A Savitzky-Golay derivative was applied before developing the classification model. The data were then separated into a training set and a test set using the Duplex method, with the size of each class kept equal to 2/3 of the class with the minimum number of samples. The t-statistic was then employed as a variable selection method in order to select which variables are significant for the classification models. The data pre-processing was evaluated by examining the modified silhouette width (mSW), the PCA plot and the percentage correctly classified (%CC). The results show that different data pre-processing strategies lead to substantial differences in model performance. The effects of the several pre-processing schemes, i.e. row scaling, column scaling and the single scaling process with Standard Normal Variate, are indicated by mSW and %CC. With a two-PC model, all five classifiers gave a high %CC except quadratic distance analysis.
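The Standard Normal Variate step mentioned above scales each spectrum individually; a minimal sketch, assuming the spectra are stored as rows of a NumPy array:

```python
import numpy as np

def snv(spectra):
    """Standard Normal Variate: subtract each spectrum's own mean and
    divide by its own standard deviation (row-wise scaling)."""
    spectra = np.asarray(spectra, dtype=float)
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std
```

After SNV, every spectrum has zero mean and unit standard deviation, which removes multiplicative scatter differences between samples before the data reach PCA or a classifier.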

  9. Improved compliance by BPM-driven workflow automation.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using the methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can achieve and provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs of quality assurance. As an automation language controlling every kind of activity or subprocess, BPMN 2.0 addresses complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). This means that, with the BPMN standard, a method for communicating and sharing laboratory process knowledge is also available. © 2014 Society for Laboratory Automation and Screening.

  10. "GSFC FSB Application of Perspective-Based Inspections"

    NASA Technical Reports Server (NTRS)

    Shell, Elaine; Shull, Forrest

    2004-01-01

    The scope of work described in our proposal consisted of developing inspection standards targeted to Branch-specific types of defects (gained from analysis of Branch project defect histories), and including Branch-relevant perspectives and questions to guide defect detection. The tailored inspection guidelines were to be applied on real Branch projects with support as needed from the technology infusion team. This still accurately describes the scope of work performed. It was originally proposed that the Perspective-Based inspection standard would be applied on three projects within the Branch: GPM, JWST, and SDO. Rather than apply the proposed standard to all three, we inserted a new step, in which the standard was instead applied on a single pilot project, cFE (described above). This decision was a good match for the Branch goals since, due to the "design for reuse" nature of cFE, inspections played an even more crucial role than usual in that development process. Also, since cFE is being designed to provide general-purpose functionality, key representatives from our target projects were involved in inspections of cFE to provide perspectives from different missions. In this way, they could get some exposure to, and the chance to provide feedback on, the proposed standards before applying them on their own projects. The Branch-baselined standards will still be applied on GPM, JWST, and SDO, although outside the time frame of this funding. Finally, we originally proposed using the analysis of Branch defect sources to indicate in which phases Perspective-Based inspections could provide the best potential for future improvement, although experience on previous Branch projects suggested that our efforts would likely be focused on requirements and code inspections. In the actual work, we focused exclusively on requirements inspections, as this was the highest-priority work currently being done on our cFE pilot project.

  11. The Next Generation Science Standards: A potential revolution for geoscience education

    NASA Astrophysics Data System (ADS)

    Wysession, Michael E.

    2014-05-01

    The first and only set of nationally distributed U.S. K-12 science education standards has been adopted by many states across America, with the potential to be adopted by many more. Earth and space science plays a prominent role in the new standards, with particular emphasis on critical Earth issues such as climate change, sustainability, and human impacts on Earth systems. In the states that choose to adopt the Next Generation Science Standards (NGSS), American youth will have a rigorous practice-based formal education in these important areas. Much work needs to be done to ensure the adoption and adequate implementation of the NGSS by a majority of American states, however, and there are many things that Earth and space scientists can do to help facilitate the process.

  12. Development of noise emission measurement specifications for color printing multifunctional devices

    NASA Astrophysics Data System (ADS)

    Kimizuka, Ikuo

    2005-09-01

Color printing (including copying) is becoming a more popular application in homes as well as in offices. Existing de jure and/or industrial standards (such as ISO 7779, ECMA-74, the ANSI S12.10 series, etc.), however, specify only monochrome patterns, which are mainly intended for acoustic noise testing of mechanical-impact-type printers. This paper discusses the key issues, and the corresponding resolutions, in the development of color printing patterns for acoustic noise measurements. The results of this technical work will be published as JBMS-74 (a new JBMIA industry standard, within 2005) and will, it is hoped, form the technical basis for updating the other standards mentioned above. This paper also presents the development process and key features of the proposed patterns.

  13. The importance of production standard operating procedure in a family business company

    NASA Astrophysics Data System (ADS)

    Hongdiyanto, C.

    2017-12-01

The plastics industry is a growing sector, so UD X, which is engaged in this business, has great potential to grow as well. The problem faced by this family business company is that no standard operating procedure (SOP) is used, which leads to problems in the quality and quantity produced. This research aims to create a production standard operating procedure for UD X. Semi-structured interviews were used to gather information from respondents to help the writer create the SOPs. Four SOPs were created: a classifying SOP, a sorting SOP, a milling SOP and a packing SOP. Having SOPs will improve the effectiveness of production because employees already know how to work at each stage of the production process.

  14. Bidding cost evaluation with fuzzy methods on building project in Jakarta

    NASA Astrophysics Data System (ADS)

    Susetyo, Budi; Utami, Tin Budi

    2017-11-01

National construction companies today are required to become more competitive in the face of increasing competition. Every construction company, especially the contractor, must work better than ever. The ability to prepare a cost of work that reflects the efficiency and effectiveness of its execution is necessary to produce competitive costs. A project is considered successful if it meets its quality, cost and time targets. From the cost aspect, the project has been designed in accordance with certain technical criteria and is costed against standard costs. To ensure cost efficiency, the bidding process must follow fair and competitive rules. The research objective is to formulate a proper way to compare several bids against the standard cost of the work. A fuzzy technique is used as the evaluation method for decision making. The evaluation is not based merely on the lowest price; the method looks for the most valuable and reasonable price. The comparison is conducted to determine the most cost-competitive and reasonable bid as the winner of the bidding.
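The fuzzy comparison this abstract describes can be sketched in a minimal form. The membership function, spread, and bid values below are assumptions for illustration only, not taken from the paper: bids are scored with a triangular membership function centred on the standard cost, so that both unreasonably low and unreasonably high offers receive low scores.

```python
def triangular_score(bid, standard, spread=0.15):
    """Fuzzy 'reasonable price' score: 1.0 at the standard cost,
    falling linearly to 0.0 at +/- spread (as a fraction of the standard)."""
    deviation = abs(bid - standard) / standard
    return max(0.0, 1.0 - deviation / spread)

def pick_winner(bids, standard):
    """Return the bid with the highest fuzzy score, not the lowest price."""
    return max(bids, key=lambda b: triangular_score(b, standard))

# Hypothetical bids against a standard cost of 100 (arbitrary units):
bids = [78.0, 94.0, 99.0, 112.0]
print(pick_winner(bids, 100.0))  # 99.0 scores highest, though 78.0 is cheapest
```

Under this scoring, the unrealistically cheap bid of 78.0 falls outside the reasonable band and scores zero, which mirrors the paper's point that the winner is not merely the lowest price.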

  15. Interpreting international governance standards for health IT use within general medical practice.

    PubMed

    Mahncke, Rachel J; Williams, Patricia A H

    2014-01-01

General practices in Australia recognise the importance of comprehensive protective security measures. Some elements of information security governance are incorporated into recommended standards; however, the governance component of information security is still insufficiently addressed in practice. The International Organisation for Standardisation (ISO) released a new global standard in May 2013 entitled ISO/IEC 27014:2013 Information technology - Security techniques - Governance of information security. This standard, applicable to organisations of all sizes, offers a framework against which to assess and implement the governance components of information security. The standard demonstrates the relationship between governance and the management of information security, provides strategic principles and processes, and forms the basis for establishing a positive information security culture. An interpretive analysis of this standard for use in Australian general practice was performed. This work is unique, as such an interpretation for the Australian healthcare environment has not been undertaken before. It demonstrates an application of the standard at a strategic level to inform the existing development of an information security governance framework.

  16. Evaluation of the Neutron Data Standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, A. D.; Pronyaev, V. G.; Capote, R.

With the need for improving existing nuclear data evaluations, (e.g., ENDF/B-VIII.0 and JEFF-3.3 releases) the first step was to evaluate the standards for use in such a library. This new standards evaluation made use of improved experimental data and some developments in the methodology of analysis and evaluation. In addition to the work on the traditional standards, this work produced the extension of some energy ranges and includes new reactions that are called reference cross sections. Since the effort extends beyond the traditional standards, it is called the neutron data standards evaluation. This international effort has produced new evaluations of the following cross section standards: the H(n,n), 6Li(n,t), 10B(n,α), 10B(n,α 1γ), natC(n,n), Au(n,γ), 235U(n,f) and 238U(n,f). Also in the evaluation process the 238U(n,γ) and 239Pu(n,f) cross sections that are not standards were evaluated. Evaluations were also obtained for data that are not traditional standards: the Maxwellian spectrum averaged cross section for the Au(n,γ) cross section at 30 keV; reference cross sections for prompt γ-ray production in fast neutron-induced reactions; reference cross sections for very high energy fission cross sections; the 252Cf spontaneous fission neutron spectrum and the 235U prompt fission neutron spectrum induced by thermal incident neutrons; and the thermal neutron constants. The data and covariance matrices of the uncertainties were obtained directly from the evaluation procedure.

  17. Evaluation of the Neutron Data Standards

    DOE PAGES

    Carlson, A. D.; Pronyaev, V. G.; Capote, R.; ...

    2018-02-01

With the need for improving existing nuclear data evaluations, (e.g., ENDF/B-VIII.0 and JEFF-3.3 releases) the first step was to evaluate the standards for use in such a library. This new standards evaluation made use of improved experimental data and some developments in the methodology of analysis and evaluation. In addition to the work on the traditional standards, this work produced the extension of some energy ranges and includes new reactions that are called reference cross sections. Since the effort extends beyond the traditional standards, it is called the neutron data standards evaluation. This international effort has produced new evaluations of the following cross section standards: the H(n,n), 6Li(n,t), 10B(n,α), 10B(n,α 1γ), natC(n,n), Au(n,γ), 235U(n,f) and 238U(n,f). Also in the evaluation process the 238U(n,γ) and 239Pu(n,f) cross sections that are not standards were evaluated. Evaluations were also obtained for data that are not traditional standards: the Maxwellian spectrum averaged cross section for the Au(n,γ) cross section at 30 keV; reference cross sections for prompt γ-ray production in fast neutron-induced reactions; reference cross sections for very high energy fission cross sections; the 252Cf spontaneous fission neutron spectrum and the 235U prompt fission neutron spectrum induced by thermal incident neutrons; and the thermal neutron constants. The data and covariance matrices of the uncertainties were obtained directly from the evaluation procedure.

  18. Evaluation of the Neutron Data Standards

    NASA Astrophysics Data System (ADS)

    Carlson, A. D.; Pronyaev, V. G.; Capote, R.; Hale, G. M.; Chen, Z.-P.; Duran, I.; Hambsch, F.-J.; Kunieda, S.; Mannhart, W.; Marcinkevicius, B.; Nelson, R. O.; Neudecker, D.; Noguere, G.; Paris, M.; Simakov, S. P.; Schillebeeckx, P.; Smith, D. L.; Tao, X.; Trkov, A.; Wallner, A.; Wang, W.

    2018-02-01

    With the need for improving existing nuclear data evaluations, (e.g., ENDF/B-VIII.0 and JEFF-3.3 releases) the first step was to evaluate the standards for use in such a library. This new standards evaluation made use of improved experimental data and some developments in the methodology of analysis and evaluation. In addition to the work on the traditional standards, this work produced the extension of some energy ranges and includes new reactions that are called reference cross sections. Since the effort extends beyond the traditional standards, it is called the neutron data standards evaluation. This international effort has produced new evaluations of the following cross section standards: the H(n,n), 6Li(n,t), 10B(n,α), 10B(n,α1 γ), natC(n,n), Au(n,γ), 235U(n,f) and 238U(n,f). Also in the evaluation process the 238U(n,γ) and 239Pu(n,f) cross sections that are not standards were evaluated. Evaluations were also obtained for data that are not traditional standards: the Maxwellian spectrum averaged cross section for the Au(n,γ) cross section at 30 keV; reference cross sections for prompt γ-ray production in fast neutron-induced reactions; reference cross sections for very high energy fission cross sections; the 252Cf spontaneous fission neutron spectrum and the 235U prompt fission neutron spectrum induced by thermal incident neutrons; and the thermal neutron constants. The data and covariance matrices of the uncertainties were obtained directly from the evaluation procedure.
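The evaluation procedure summarized above combines correlated measurements into best estimates with covariance matrices. A minimal generalized least-squares sketch for two correlated measurements of a single quantity follows; the function name and all numbers are invented for illustration and are not from the evaluation:

```python
def gls_combine_2(y1, y2, v1, v2, c):
    """Generalized least-squares average of two correlated measurements
    of one quantity. v1, v2: variances; c: covariance between them.
    Returns (best estimate, variance of the estimate)."""
    det = v1 * v2 - c * c
    # Row sums of the inverse covariance matrix act as the weights.
    w1 = (v2 - c) / det
    w2 = (v1 - c) / det
    var = 1.0 / (w1 + w2)
    est = var * (w1 * y1 + w2 * y2)
    return est, var

# Two hypothetical cross-section measurements (barns) with a shared
# systematic component giving a covariance of 0.02**2:
est, var = gls_combine_2(1.02, 0.98, 0.04**2, 0.03**2, 0.02**2)
print(round(est, 3), round(var**0.5, 3))
```

The combined uncertainty comes out smaller than either input uncertainty, and the correlation shifts the weighting away from a naive inverse-variance average; real standards evaluations solve the same normal equations at full scale across many reactions and energies.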

  19. Ground control station software design for micro aerial vehicles

    NASA Astrophysics Data System (ADS)

    Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej

    2017-08-01

This article describes the process of designing the hardware part and the software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAVs). All the work was conducted on a quadrocopter model, a commonly available commercial construction. The article contains the characteristics of the research object, covers the basics of operating micro aerial vehicles (MAVs), and presents the components of the ground control station model. It also describes the communication standards used in building the station model. A further part of the work concerns the software of the product - the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the actions, communication and control processes of the UAV. The process of creating the software and the field tests of the station model are also presented in the article.

  20. [Automation and organization of technological process of urinalysis].

    PubMed

    Kolenkin, S M; Kishkun, A A; Kol'chenko, O L

    2000-12-01

The results of introducing into practice a working model of industrial technology for laboratory studies, together with KONE Specific Supra and Miditron M devices, are shown using clinical urinalysis as an example. This technology helps standardize all stages and operations, improves the efficiency of quality control of laboratory studies, rationally organizes the work at all stages of the process, and creates a system for the permanent improvement of the efficiency of investigations at the preanalytical, analytical, and postanalytical stages of the technological process of laboratory studies. As a result of the introduction of this technology into laboratory practice, violations of the quality criteria of clinical urinalysis decreased from 15% to 8% at the preanalytical stage and from 6% to 3% at the analytical stage. Automation of the analysis decreased reagent consumption 3-fold and improved productivity at the analytical stage 4-fold.

  1. KAPPA -- Kernel Application Package

    NASA Astrophysics Data System (ADS)

Currie, Malcolm J.; Berry, David S.

    KAPPA is an applications package comprising about 180 general-purpose commands for image processing, data visualisation, and manipulation of the standard Starlink data format---the NDF. It is intended to work in conjunction with Starlink's various specialised packages. In addition to the NDF, KAPPA can also process data in other formats by using the `on-the-fly' conversion scheme. Many commands can process data arrays of arbitrary dimension, and others work on both spectra and images. KAPPA operates from both the UNIX C-shell and the ICL command language. This document describes how to use KAPPA and its features. There is some description of techniques too, including a section on writing scripts. This document includes several tutorials and is illustrated with numerous examples. The bulk of this document comprises detailed descriptions of each command as well as classified and alphabetical summaries.

  2. 40 CFR 745.85 - Work practice standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Work practice standards. 745.85... Renovation § 745.85 Work practice standards. (a) Standards for renovation activities. Renovations must be... in § 745.90(b). (1) Occupant protection. Firms must post signs clearly defining the work area and...

  3. Systematic review: work-related stress and the HSE management standards.

    PubMed

    Brookes, K; Limbert, C; Deacy, C; O'Reilly, A; Scott, S; Thirlaway, K

    2013-10-01

    The Health and Safety Executive (HSE) has defined six management standards representing aspects of work that, if poorly managed, are associated with lower levels of employee health and productivity, and increased sickness absence. The HSE indicator tool aims to measure organizations' performance in managing the primary stressors identified by the HSE management standards. The aims of the study are to explore how the HSE indicator tool has been implemented within organizations and to identify contexts in which the tool has been used, its psychometric properties and relationships with alternative measures of well-being and stress. Studies that matched specific criteria were included in the review. Abstracts were considered by two researchers to ensure a reliable process. Full texts were obtained when abstracts met the inclusion criteria. Thirteen papers were included in the review. Using factor analysis and measures of reliability, the studies suggest that the HSE indicator tool is a psychometrically sound measure. The tool has been used to measure work-related stress across different occupational groups, with a clear relationship between the HSE tool and alternative measures of well-being. Limitations of the tool and recommendations for future research are discussed. The HSE indicator tool is a psychometrically sound measure of organizational performance against the HSE management standards. As such it can provide a broad overview of sources of work-related stress within organizations. More research is required to explore the use of the tool in the design of interventions to reduce stress, and its use in different contexts and with different cultural and gender groups.

  4. Repository-Based Software Engineering Program: Working Program Management Plan

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA) sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL) in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.

  5. The protocol and design of a randomised controlled study on training of attention within the first year after acquired brain injury.

    PubMed

    Bartfai, Aniko; Markovic, Gabriela; Sargenius Landahl, Kristina; Schult, Marie-Louise

    2014-05-08

To describe the design of the study aiming to examine intensive targeted cognitive rehabilitation of attention in the acute (<4 months) and subacute rehabilitation phases (4-12 months) after acquired brain injury and to evaluate the effects on function, activity and participation (return to work). Within a prospective, randomised, controlled study, 120 consecutive patients with stroke or traumatic brain injury were randomised to 20 hours of intensive attention training by Attention Process Training or by standard, activity-based training. Progress was evaluated by Statistical Process Control and by pre- and post-measurement of functional and activity levels. Return to work was also evaluated in the post-acute phase. Primary endpoints were the changes in the attention measure, the Paced Auditory Serial Addition Test, and changes in work ability. Secondary endpoints included measurement of cognitive functions, activity and return to work. There were 3-, 6- and 12-month follow-ups focussing on health economics. The study will provide information on rehabilitation of attention in the early phases after ABI and its effects on function, activity and return to work. Further, the application of Statistical Process Control might enable closer investigation of the cognitive changes after acquired brain injury and demonstrate the usefulness of process measures in rehabilitation. The study was registered at ClinicalTrials.gov: NCT02091453, registered 19 March 2014.
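Statistical Process Control, as used here to follow individual progress, can be sketched with an individuals (I) control chart; the limits below use the common moving-range estimate of process variation. The weekly scores are invented for illustration and are not study data:

```python
def control_limits(values):
    """Individuals-chart limits: mean +/- 2.66 * mean moving range
    (2.66 = 3 / d2, with d2 = 1.128 for subgroups of size 2)."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(a - b) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def out_of_control(values):
    """Indices of points falling outside the control limits,
    i.e. changes unlikely to be ordinary session-to-session noise."""
    lo, hi = control_limits(values)
    return [i for i, v in enumerate(values) if v < lo or v > hi]

# Hypothetical weekly attention-test scores for one patient; the final
# jump signals a change beyond normal variation.
scores = [34, 36, 35, 37, 36, 35, 48]
print(out_of_control(scores))  # [6]
```

Flagging individual out-of-limit points in this way is what lets SPC detect a genuine cognitive change for a single patient, rather than relying only on group-level pre/post comparisons.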

  6. MathWorks Simulink and C++ integration with the new VLT PLC-based standard development platform for instrument control systems

    NASA Astrophysics Data System (ADS)

    Kiekebusch, Mario J.; Di Lieto, Nicola; Sandrock, Stefan; Popovic, Dan; Chiozzi, Gianluca

    2014-07-01

    ESO is in the process of implementing a new development platform, based on PLCs, for upcoming VLT control systems (new instruments and refurbishing of existing systems to manage obsolescence issues). In this context, we have evaluated the integration and reuse of existing C++ libraries and Simulink models into the real-time environment of BECKHOFF Embedded PCs using the capabilities of the latest version of TwinCAT software and MathWorks Embedded Coder. While doing so the aim was to minimize the impact of the new platform by adopting fully tested solutions implemented in C++. This allows us to reuse the in house expertise, as well as extending the normal capabilities of the traditional PLC programming environments. We present the progress of this work and its application in two concrete cases: 1) field rotation compensation for instrument tracking devices like derotators, 2) the ESO standard axis controller (ESTAC), a generic model-based controller implemented in Simulink and used for the control of telescope main axes.

  7. Prick test: evolution towards automated reading.

    PubMed

    Justo, X; Díaz, I; Gil, J J; Gastaminza, G

    2016-08-01

The prick test is one of the most common medical methods for diagnosing allergies, and it has been carried out in a similar and laborious manner over many decades. In an attempt to standardize the reading of the test, many researchers have tried to automate the process of measuring the allergic reactions found, by developing systems and algorithms based on multiple technologies. This work reviews the techniques for automatic wheal measurement with the aim of pointing out their advantages and disadvantages and the progress in the field. Furthermore, it provides a classification scheme for the different technologies applied. The works discussed herein provide evidence that significant challenges still exist for the development of an automatic wheal measurement system that not only helps allergists in their medical practice but also allows for the standardization of the reading and of data exchange. As such, the aim of the work was to serve as a guideline for the development of a proper and feasible system. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Standardization of noncontact 3D measurement

    NASA Astrophysics Data System (ADS)

    Takatsuji, Toshiyuki; Osawa, Sonko; Sato, Osamu

    2008-08-01

As global R&D competition intensifies, faster measurement instruments are required both in laboratories and in production processes. In machinery areas, while contact-type coordinate measuring machines (CMMs) have been widely used, noncontact CMMs, which are capable of measuring an enormous number of points at once, are growing their market share. Nevertheless, since no industrial standard concerning an accuracy test of noncontact CMMs exists, each manufacturer states the accuracy of their product according to their own rules, and this situation confuses customers. The working group ISO/TC 213/WG 10 is trying to create a new ISO standard that stipulates an accuracy test for noncontact CMMs. The concept and the state of discussion of this new standard will be explained. At the National Metrology Institute of Japan (NMIJ), we are collecting measurement data to serve as a technical background for the standards, together with a consortium formed by users and manufacturers. This activity will also be presented.

  9. Standards for vision science libraries: 2014 revision.

    PubMed

    Motte, Kristin; Caldwell, C Brooke; Lamson, Karen S; Ferimer, Suzanne; Nims, J Chris

    2014-10-01

    This Association of Vision Science Librarians revision of the "Standards for Vision Science Libraries" aspires to provide benchmarks to address the needs for the services and resources of modern vision science libraries (academic, medical or hospital, pharmaceutical, and so on), which share a core mission, are varied by type, and are located throughout the world. Through multiple meeting discussions, member surveys, and a collaborative revision process, the standards have been updated for the first time in over a decade. While the range of types of libraries supporting vision science services, education, and research is wide, all libraries, regardless of type, share core attributes, which the standards address. The current standards can and should be used to help develop new vision science libraries or to expand the growth of existing libraries, as well as to support vision science librarians in their work to better provide services and resources to their respective users.

  10. Standards for vision science libraries: 2014 revision

    PubMed Central

    Motte, Kristin; Caldwell, C. Brooke; Lamson, Karen S.; Ferimer, Suzanne; Nims, J. Chris

    2014-01-01

    Objective: This Association of Vision Science Librarians revision of the “Standards for Vision Science Libraries” aspires to provide benchmarks to address the needs for the services and resources of modern vision science libraries (academic, medical or hospital, pharmaceutical, and so on), which share a core mission, are varied by type, and are located throughout the world. Methods: Through multiple meeting discussions, member surveys, and a collaborative revision process, the standards have been updated for the first time in over a decade. Results: While the range of types of libraries supporting vision science services, education, and research is wide, all libraries, regardless of type, share core attributes, which the standards address. Conclusions: The current standards can and should be used to help develop new vision science libraries or to expand the growth of existing libraries, as well as to support vision science librarians in their work to better provide services and resources to their respective users. PMID:25349547

  11. The Energy Industry Profile of ISO/DIS 19115-1: Facilitating Discovery and Evaluation of, and Access to Distributed Information Resources

    NASA Astrophysics Data System (ADS)

    Hills, S. J.; Richard, S. M.; Doniger, A.; Danko, D. M.; Derenthal, L.; Energistics Metadata Work Group

    2011-12-01

A diverse group of organizations representative of the international community involved in disciplines relevant to the upstream petroleum industry - energy companies; suppliers and publishers of information to the energy industry; vendors of software applications used by the industry; and partner government and academic organizations - has engaged in the Energy Industry Metadata Standards Initiative. This Initiative envisions the use of standard metadata within the community to enable significant improvements in the efficiency with which users discover, evaluate, and access distributed information resources. The metadata standard needed to realize this vision is the initiative's primary deliverable. In addition to developing the metadata standard, the initiative is promoting its adoption to accelerate realization of the vision, and publishing metadata exemplars conformant with the standard. Implementation of the standard by community members, in the form of published metadata which document the information resources each organization manages, will allow the use of tools requiring consistent metadata for efficient discovery and evaluation of, and access to, information resources. While metadata are expected to be widely accessible, access to the associated information resources may be more constrained. The initiative is being conducted by Energistics' Metadata Work Group, in collaboration with the USGIN Project. Energistics is a global standards group in the oil and natural gas industry. The Work Group determined early in the initiative, based on input solicited from 40+ organizations and on an assessment of existing metadata standards, to develop the target metadata standard as a profile of a revised version of ISO 19115, formally the "Energy Industry Profile of ISO/DIS 19115-1 v1.0" (EIP). The Work Group is participating on the ISO/TC 211 project team responsible for the revision of ISO 19115, now ready for "Draft International Standard" (DIS) status.
With ISO 19115 an established, capability-rich, open standard for geographic metadata, EIP v1 is expected to be widely acceptable within the community and readily sustainable over the long term. The EIP design, also per community requirements, will enable discovery and evaluation of, and access to, the types of information resources considered important to the community, including structured and unstructured digital resources, and physical assets such as hardcopy documents and material samples. This presentation will briefly review the development of this initiative as well as the current and planned Work Group activities. More time will be spent providing an overview of EIP v1, including the requirements it prescribes, the design efforts made to enable automated metadata capture and processing, and the structure and content of its documentation, which was written to minimize ambiguity and facilitate implementation. The Work Group considers EIP v1 a solid initial design for interoperable metadata, and a first step toward the vision of the Initiative.

  12. Finding-specific display presets for computed radiography soft-copy reading.

    PubMed

    Andriole, K P; Gould, R G; Webb, W R

    1999-05-01

Much work has been done to optimize the display of cross-sectional modality imaging examinations for soft-copy reading (i.e., window/level tissue presets, and format presentations such as tile and stack modes, four-on-one, nine-on-one, etc.). Less attention has been paid to the display of digital forms of the conventional projection x-ray. The purpose of this study is to assess the utility of providing presets for computed radiography (CR) soft-copy display, based not on window/level settings, but on processing applied to the image optimized for visualization of specific findings, pathologies, etc. (e.g., pneumothorax, tumor, tube location). It is felt that digital display of CR images based on finding-specific processing presets has the potential to: speed reading of digital projection x-ray examinations on soft copy; improve diagnostic efficacy; standardize display across examination type, clinical scenario, important key findings, and significant negatives; facilitate image comparison; and improve confidence in and acceptance of soft-copy reading. Clinical chest images are acquired using an Agfa-Gevaert (Mortsel, Belgium) ADC 70 CR scanner and Fuji (Stamford, CT) 9000 and AC2 CR scanners. Those demonstrating pertinent findings are transferred over the clinical picture archiving and communications system (PACS) network to a research image processing station (Agfa PS5000), where the optimal image-processing settings per finding, pathologic category, etc., are developed in conjunction with a thoracic radiologist, by manipulating the multiscale image contrast amplification (Agfa MUSICA) algorithm parameters. Soft-copy display of images processed with finding-specific settings are compared with the standard default image presentation for 50 cases of each category. 
Comparison is scored using a 5-point scale with the positive scale denoting the standard presentation is preferred over the finding-specific processing, the negative scale denoting the finding-specific processing is preferred over the standard presentation, and zero denoting no difference. Processing settings have been developed for several findings including pneumothorax and lung nodules, and clinical cases are currently being collected in preparation for formal clinical trials. Preliminary results indicate a preference for the optimized-processing presentation of images over the standard default, particularly by inexperienced radiology residents and referring clinicians.

  13. Defining the Path Forward: Guidance for Laboratory Medicine Guidelines

    PubMed Central

    Jones, Patricia M.; Chin, Alex C.; Christenson, Robert H.

    2015-01-01

The National Academy of Clinical Biochemistry (NACB) has developed consensus-based guidelines for the laboratory evaluation and monitoring of patients with specified disorders for two decades. In 1997, the NACB recognized the need to standardize the process of guideline development and promulgated its first Standard Operating Procedure (SOP) for this purpose. In 2010, the American Association of Clinical Chemistry (AACC) and NACB created the Evidence-Based Laboratory Medicine Committee (EBLMC). Among other roles, this group was given responsibility to provide oversight of clinical practice guideline development in accordance with SOP guidance and using currently accepted good practices. In 2011, the U.S. Institute of Medicine (IOM) published two reports of relevance: ‘Clinical Practice Guidelines We Can Trust’ and ‘Finding What Works in Health Care – Standards for Systematic Reviews.’ These reports were created as part of a response to a legislative mandate from the U.S. Congress requesting that steps be taken to implement recommendations from IOM’s report on ‘Knowing What Works in Health Care’ (2008). The latest revision of the laboratory medicine practice guidelines (LMPG) SOP was in part driven by these reports. NACB continues to develop LMPGs at a rate of roughly one per year through standard processes detailed in its 2014 revision of the SOP. This article describes the NACB and EBLMC experience in developing LMPGs with a focus on the evolution and use of the latest SOP. AACC and NACB have established a solid track record in collaboratively working with many clinical societies and professional organizations on clinical practice guideline development. Presently, three LMPGs are in various stages of development, all with the collaboration of other clinical/professional groups. 
The practices and tools being used for current LMPGs in progress are also highlighted in the context of the challenges that presently exist for effective clinical practice guideline development in the U.S. PMID:27683491

  14. Standardization of shape memory alloy test methods toward certification of aerospace applications

    NASA Astrophysics Data System (ADS)

    Hartl, D. J.; Mabe, J. H.; Benafan, O.; Coda, A.; Conduit, B.; Padan, R.; Van Doren, B.

    2015-08-01

The response of shape memory alloy (SMA) components employed as actuators has enabled a number of adaptable aero-structural solutions. However, there are currently no industry- or government-accepted standardized test methods for SMA materials used as actuators, and this gap has hindered their transition to commercialization and production. This brief fast track communication introduces to the community a recently initiated collaborative and pre-competitive SMA specification and standardization effort that is expected to deliver the first regulatory-agency-accepted material specification and test standards for SMAs employed as actuators for commercial and military aviation applications. In the first phase of this effort, described herein, the team is working to review past efforts and deliver a set of agreed-upon properties to be included in future material certification specifications, as well as the associated experiments needed to obtain them in a consistent manner. Essential for the success of this project are the participation and input of a number of organizations and individuals, including engineers and designers working in materials and processing development, application design, SMA component fabrication, and testing at the material, component, and system level. Going forward, strong consensus among this diverse body of participants and the SMA research community at large is needed to advance standardization concepts for universal adoption by the greater aerospace community and especially by regulatory bodies. It is expected that the development and release of public standards will be done in collaboration with an established standards development organization.

  15. Nature and origins of mathematics difficulties in very preterm children: a different etiology than developmental dyscalculia.

    PubMed

    Simms, Victoria; Gilmore, Camilla; Cragg, Lucy; Clayton, Sarah; Marlow, Neil; Johnson, Samantha

    2015-02-01

Children born very preterm (<32 wk) are at high risk for mathematics learning difficulties that are out of proportion to their other academic and cognitive deficits. However, the etiology of mathematics difficulties in very preterm children is unknown. We sought to identify the nature and origins of preterm children's mathematics difficulties. One hundred and fifteen very preterm children aged 8-10 y were assessed in school alongside a control group of 77 term-born classmates. Achievement in mathematics, working memory, visuospatial processing, inhibition, and processing speed were assessed using standardized tests. Numerical representations and specific mathematics skills were assessed using experimental tests. Very preterm children had significantly poorer mathematics achievement, working memory, and visuospatial skills than term-born controls. Although preterm children had poorer performance in specific mathematics skills, there was no evidence of imprecise numerical representations. Difficulties in mathematics were associated with deficits in visuospatial processing and working memory. Mathematics difficulties in very preterm children are thus associated with deficits in working memory and visuospatial processing, not numerical representations. Very preterm children's mathematics difficulties are therefore different in nature from those of children with developmental dyscalculia. Interventions targeting general cognitive problems, rather than numerical representations, may improve very preterm children's mathematics achievement.

  16. [A medical consumable material management information system].

    PubMed

    Tang, Guoping; Hu, Liang

    2014-05-01

Medical consumables are essential supplies for carrying out medical work; they come in a wide range of varieties and are used in large quantities. How to manage them feasibly and efficiently has been a topic of widespread concern. This article discusses the design of a medical consumable material management information system with a set of standardized processes that brings together medical supplies administrators, suppliers, and clinical departments. An advanced management mode, enterprise resource planning (ERP), was applied throughout the system design process.

  17. High Strength P/M Gears for Vehicle Transmissions - Phase 2

    DTIC Science & Technology

    2008-08-15

and while it was considered amenable to standard work material transfer ("blue steel" chutes, for example) from other P/M processing equipment, no...depend on the machine design but should be kept to a minimum in order to minimize part transfer times. Position control of the linear axis is...Establish design of ausform gear finishing machine for P/M gears: The "Focus" part identified in Phase I (New Process Planet gear P/N 17864, component

  18. The INPE handouts to the 6th LANDSAT Technical Working Group (LTWG) Meeting

    NASA Technical Reports Server (NTRS)

    Debarrosaguirre, J. L. (Principal Investigator); Parada, L. E. M.; Depaulapereira, S.

    1984-01-01

The LANDSAT receiving and processing system, in its present configuration and status, is described, as is the experience already obtained with LANDSATs 4 and 5. The revised table of station plans for TM reception and products, and the implementation schedule for data formats employing superstructure conventions, are updated. Standardization of the worldwide reference systems is proposed. The INPE preliminary TM products price list is included. A TM image received and processed is shown to illustrate the appearance of the products offered.

  19. A Format for Phylogenetic Placements

    PubMed Central

    Matsen, Frederick A.; Hoffman, Noah G.; Gallagher, Aaron; Stamatakis, Alexandros

    2012-01-01

    We have developed a unified format for phylogenetic placements, that is, mappings of environmental sequence data (e.g., short reads) into a phylogenetic tree. We are motivated to do so by the growing number of tools for computing and post-processing phylogenetic placements, and the lack of an established standard for storing them. The format is lightweight, versatile, extensible, and is based on the JSON format, which can be parsed by most modern programming languages. Our format is already implemented in several tools for computing and post-processing parsimony- and likelihood-based phylogenetic placements and has worked well in practice. We believe that establishing a standard format for analyzing read placements at this early stage will lead to a more efficient development of powerful and portable post-analysis tools for the growing applications of phylogenetic placement. PMID:22383988
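The placement format the abstract describes is JSON-based, so a record can be built and consumed with any standard JSON library. The sketch below is a minimal, illustrative Python example of such a document and of the kind of lookup a post-processing tool might perform; the key names (`tree`, `fields`, `placements`, `p`, `n`) follow the published "jplace" convention but should be checked against the specification, and the tree, read name, and numeric values are invented for illustration.

```python
import json

# A minimal placement document in the spirit of the format described above:
# JSON with a reference tree (edge numbers in braces), a "fields" list naming
# the columns of each placement record, and one placement per query read.
placement_doc = {
    "version": 3,
    "tree": "((A:0.2{0},B:0.1{1}):0.3{2},C:0.4{3}){4};",
    "fields": ["edge_num", "likelihood", "like_weight_ratio"],
    "placements": [
        {
            "p": [[2, -1234.5, 0.9], [3, -1236.7, 0.1]],  # candidate edges
            "n": ["read_0001"],                            # query read name(s)
        }
    ],
    "metadata": {"invocation": "illustrative example only"},
}

# Serialize and re-parse, as a downstream post-analysis tool would.
text = json.dumps(placement_doc)
parsed = json.loads(text)

# Pick the best placement for each read by its like_weight_ratio column.
lwr_index = parsed["fields"].index("like_weight_ratio")
for placement in parsed["placements"]:
    best = max(placement["p"], key=lambda row: row[lwr_index])
    print(placement["n"][0], "-> edge", best[0])  # read_0001 -> edge 2
```

Because the "fields" array names the columns, tools can add or reorder per-placement values without breaking existing parsers, which is part of what makes the format extensible.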

  20. A format for phylogenetic placements.

    PubMed

    Matsen, Frederick A; Hoffman, Noah G; Gallagher, Aaron; Stamatakis, Alexandros

    2012-01-01

    We have developed a unified format for phylogenetic placements, that is, mappings of environmental sequence data (e.g., short reads) into a phylogenetic tree. We are motivated to do so by the growing number of tools for computing and post-processing phylogenetic placements, and the lack of an established standard for storing them. The format is lightweight, versatile, extensible, and is based on the JSON format, which can be parsed by most modern programming languages. Our format is already implemented in several tools for computing and post-processing parsimony- and likelihood-based phylogenetic placements and has worked well in practice. We believe that establishing a standard format for analyzing read placements at this early stage will lead to a more efficient development of powerful and portable post-analysis tools for the growing applications of phylogenetic placement.
