The standards process: X3 information processing systems
NASA Technical Reports Server (NTRS)
Emard, Jean-Paul
1993-01-01
The topics are presented in viewgraph form and include the following: International Organization for Standardization (ISO); International Electrotechnical Commission (IEC); ISO/IEC Joint Technical Committee 1 (JTC-1); U.S. interface to JTC-1; ANSI; national organizations; U.S. standards development processes; national and international standards-developing organizations; regional organizations; and X3 information processing systems.
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the third of five volumes on Information System Life-Cycle and Documentation Standards, which present a well-organized, easily used standard for providing the technical information needed for developing information systems, components, and related processes. This volume states the Software Management and Assurance Program documentation standard for a product specification document and for data item descriptions. The framework can be applied to any NASA information system, its software, hardware, and operational procedures components, and related processes.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-30
...-03] NIST Federal Information Processing Standard (FIPS) 140-3 (Second Draft), Security Requirements... Technology (NIST), Commerce. ACTION: Notice and Request for Comments. SUMMARY: The National Institute of Standards and Technology (NIST) seeks additional comments on specific sections of Federal Information...
Information system life-cycle and documentation standards, volume 1
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
The Software Management and Assurance Program (SMAP) Information System Life-Cycle and Documentation Standards Document describes the Version 4 standard information system life-cycle in terms of processes, products, and reviews. The description of the products includes detailed documentation standards. The standards in this document set can be applied to the life-cycle, i.e., to each phase in the system's development, and to the documentation of all NASA information systems. This provides consistency across the agency as well as visibility into the completeness of the information recorded. An information system is software-intensive, but consists of any combination of software, hardware, and operational procedures required to process, store, or transmit data. This document defines a standard life-cycle model and content for associated documentation.
CMMI(Registered) for Acquisition, Version 1.3. CMMI-ACQ, V1.3
2010-11-01
and Software Engineering – System Life Cycle Processes [ISO 2008b] ISO/IEC 27001:2005 Information technology – Security techniques – Information...International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 27001 Information Technology – Security Techniques...International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) body of standards. CMMs focus on improving processes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-07
... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0521] Agency Information Collection (Credit Underwriting Standards and Procedures for Processing VA Guaranteed Loans) Activity Under OMB Review AGENCY... information abstracted below to the Office of Management and Budget (OMB) for review and comment. The PRA...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-29
... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0521] Proposed Information Collection (Credit Underwriting Standards and Procedures for Processing VA Guaranteed Loans) Activity: Comment Request AGENCY... comment on the proposed collection of certain information by the agency. Under the Paperwork Reduction Act...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-01
... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0521] Proposed Information Collection (Credit Underwriting Standards and Procedures for Processing VA Guaranteed Loans) Activity: Comment Request AGENCY... comment on the proposed collection of certain information by the agency. Under the Paperwork Reduction Act...
ERIC Educational Resources Information Center
Radack, Shirley M.
1994-01-01
Examines the role of the National Institute of Standards and Technology (NIST) in the development of the National Information Infrastructure (NII). Highlights include the standards process; voluntary standards; Open Systems Interconnection problems; Internet Protocol Suite; consortia; government's role; and network security. (16 references) (LRW)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-06
... hash algorithms in many computer network applications. On February 11, 2011, NIST published a notice in... Information Security Management Act (FISMA) of 2002 (Pub. L. 107-347), the Secretary of Commerce is authorized to approve Federal Information Processing Standards (FIPS). NIST activities to develop computer...
Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung
2014-08-01
Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from HISs' compliance with different healthcare standards. Overcoming it demands a mediation system that accurately interprets data in different heterogeneous formats. We propose an adaptive mediation system, the AdapteR Interoperability ENgine (ARIEN), that arbitrates between HISs compliant with different healthcare standards to achieve accurate and seamless information exchange and thus data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and the Personalized-Detailed Clinical Model (P-DCM) approach to guarantee mapping accuracy. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of transformations among different standard formats. We evaluated the proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. Using a pattern-oriented approach drawing on the MBO, the transformation between CDA and vMR achieved over 90% accuracy. The proposed mediation system improves the overall communication process between HISs, providing accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
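The mediation idea above can be pictured with a small sketch. The mapping table and element paths below are hypothetical illustrations of what the MBO might store, not the actual ontology content; Python is used only for exposition.

    # Minimal sketch of standard-to-standard field mediation, in the spirit
    # of the ARIEN approach. Mappings and paths are hypothetical, not the
    # actual MBO content.
    # Mappings as (source standard, source path) -> (target standard, target path).
    MAPPINGS = {
        ("CDA", "patient/name/given"): ("vMR", "evaluatedPerson/name/givenName"),
        ("CDA", "patient/name/family"): ("vMR", "evaluatedPerson/name/familyName"),
        ("CDA", "observation/code"): ("vMR", "observationResult/observationFocus"),
    }

    def transform(record: dict, source: str, target: str) -> dict:
        """Translate a flat source-standard record into the target standard,
        keeping only fields for which a mapping is known."""
        out = {}
        for path, value in record.items():
            mapped = MAPPINGS.get((source, path))
            if mapped and mapped[0] == target:
                out[mapped[1]] = value
        return out

    cda_record = {
        "patient/name/given": "Jane",
        "patient/name/family": "Doe",
        "observation/code": "271649006",  # illustrative code value
    }
    print(transform(cda_record, "CDA", "vMR"))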
This data set consists of Census Designated Place and Federal Information Processing Standard (FIPS) Populated Place boundaries for the State of Arizona which were extracted from the 1992 U.S. Census Bureau TIGER line files.
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the second of five volumes of the Information System Life-Cycle and Documentation Standards. This volume provides a well-organized, easily used standard for management plans used in acquiring, assuring, and developing information systems and software, hardware, and operational procedures components, and related processes.
ERIC Educational Resources Information Center
Haritonov, R. P.
1971-01-01
An important feature of standardization work in the Soviet Union is the preparation and establishment of State standards enabling unified systems to be introduced for documentation, classification, coding and technical and economic information, as well as standards for all kinds of information storage media. (Author/MM)
Federal information processing standards codes [for places around the world
DOT National Transportation Integrated Search
1993-05-01
Intended as a companion to the International Travel and Tourism CD-ROM, this document provides the provisional list of the Federal Information Processing Standards. It is based on the FIPS Pub 10-4 and contains Change Notices #1-15, and is current as...
Army Communicator. Volume 37, Number 2, Summer 2012
2012-01-01
solution will have to meet four criteria: FIPS 140-2 validated crypto; approved data-at-rest; Common Access Card enablement; and enterprise management...Information Grid. Common Access Cards, Federal Information Processing Standard 140-2 certifications, and software compliance are just a few of the...and Evaluation Command BMC – Brigade Modernization Command CAC – Common Access Card FIPS – Federal Information Processing Standard GIG – Global
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-11
... before May 12, 2011. ADDRESSES: Written comments may be sent to: Chief, Computer Security Division... FURTHER INFORMATION CONTACT: Elaine Barker, Computer Security Division, National Institute of Standards... Quynh Dang, Computer Security Division, National Institute of Standards and Technology, Gaithersburg, MD...
An overview of the National Space Science data Center Standard Information Retrieval System (SIRS)
NASA Technical Reports Server (NTRS)
Shapiro, A.; Blecher, S.; Verson, E. E.; King, M. L. (Editor)
1974-01-01
A general overview is given of the National Space Science Data Center (NSSDC) Standard Information Retrieval System. Described, in general terms, are the information system that contains the data files and the software system that processes and manipulates the files maintained at the Data Center. Emphasis is placed on providing users with an overview of the capabilities and uses of the NSSDC Standard Information Retrieval System (SIRS). The examples given are taken from the files at the Data Center. Detailed information about NSSDC data files is documented in a set of File Users Guides, with one guide prepared for each file processed by SIRS. Detailed information about SIRS itself is presented in the SIRS Users Guide.
78 FR 47785 - Notice of Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... Standards and Technology (NIST) Federal Information Processing Standard (FIPS) 201: Personal Identity...), address, employment history, biometric identifiers (e.g. fingerprints), signature, digital photograph... use of other forms of information technology. Comments submitted in response to this notice will be...
78 FR 47784 - Notice of Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... Standards and Technology (NIST) Federal Information Processing Standard (FIPS) 201: Personal Identity...), address, employment history, biometric identifiers (e.g. fingerprints), signature, digital photograph... collection techniques or the use of other forms of information technology. Comments submitted in response to...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-10
... Technology (NIST) requests comments on revisions to Federal Information Processing Standard (FIPS) 186-3... http://csrc.nist.gov/publications/PubsDrafts.html . DATES: Comments must be received on or before... [address]@nist.gov, with "186-3 Change Notice" in the subject line. FOR FURTHER...
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the fourth of five volumes on Information System Life-Cycle and Documentation Standards. This volume provides a well-organized, easily used standard for assurance documentation for information systems and software, hardware, and operational procedures components, and related processes. The specifications are developed in conjunction with the corresponding management plans specifying the assurance activities to be performed.
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the fifth of five volumes on Information System Life-Cycle and Documentation Standards. This volume provides a well-organized, easily used standard for management control and status reports used in monitoring and controlling the management, development, and assurance of information systems and software, hardware, and operational procedures components, and related processes.
Standards Setting and Federal Information Policy: The Escrowed Encryption Standard (EES).
ERIC Educational Resources Information Center
Gegner, Karen E.; Veeder, Stacy B.
1994-01-01
Examines the standards process used for developing the Escrowed Encryption Standard (EES) and its possible impact on national communication and information policies. Discusses the balance between national security and law enforcement concerns versus privacy rights and economic competitiveness in the area of foreign trade and export controls. (67…
Expansion and Compression of Time Correlate with Information Processing in an Enumeration Task.
Wutz, Andreas; Shukla, Anuj; Bapi, Raju S; Melcher, David
2015-01-01
Perception of temporal duration is subjective and is influenced by factors such as attention and context. For example, unexpected or emotional events are often experienced as if time subjectively expands, suggesting that the amount of information processed in a unit of time can be increased. Time dilation effects have been measured with an oddball paradigm in which an infrequent stimulus is perceived to last longer than standard stimuli in the rest of the sequence. Likewise, time compression for the oddball occurs when the duration of the standard items is relatively brief. Here, we investigated whether the amount of information processing changes when time is perceived as distorted. On each trial, an oddball stimulus of varying numerosity (1-14 items) and duration was presented along with standard items that were either short (70 ms) or long (1050 ms). Observers were instructed to count the number of dots within the oddball stimulus and to judge its relative duration with respect to the standards on that trial. Consistent with previous results, oddballs were reliably perceived as temporally distorted: expanded for longer standard stimuli blocks and compressed for shorter standards. The occurrence of these distortions of time perception correlated with perceptual processing; i.e. enumeration accuracy increased when time was perceived as expanded and decreased with temporal compression. These results suggest that subjective time distortions are not epiphenomenal, but reflect real changes in sensory processing. Such short-term plasticity in information processing rate could be evolutionarily advantageous in optimizing perception and action during critical moments.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-31
... Information Collection: Federal Labor Standards Payee Verification and Payment Processing AGENCY: Office of the Chief Information Officer, HUD. ACTION: Notice. SUMMARY: HUD has submitted the proposed information collection requirement described below to the Office of Management and Budget (OMB) for review, in...
Rethinking the 2000 ACRL Standards: Some Things to Consider
ERIC Educational Resources Information Center
Kuhlthau, Carol C.
2013-01-01
I propose three "rethinks" to consider in recasting the ACRL Standards for information literacy for the coming decades. First, rethink the concept of information need. Second, rethink the notion that information literacy is composed of a set of abilities for "extracting information." Third, rethink the holistic process of…
CMMI(Registered) for Development, Version 1.3
2010-11-01
ISO/IEC 15288:2008 Systems and Software Engineering – System Life Cycle Processes [ISO 2008b] ISO/IEC 27001:2005 Information technology – Security...IEC 2005 International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 27001 Information Technology...International Electrotechnical Commission (ISO/IEC) body of standards. CMMs focus on improving processes in an organization. They contain the
Industrial Process Cooling Towers: National Emission Standards for Hazardous Air Pollutants
Standards limiting discharge of chromium compound air emissions from industrial process cooling towers (IPCTs). Includes rule history, Federal Register citations, implementation information, and additional resources.
Strategic, Organizational and Standardization Aspects of Integrated Information Systems. Volume 6.
1987-12-01
...reasons (such as the desired level of processing power and the amount of storage space), organizational reasons (such as each department obtaining its...of processing power falls, Abbott can afford to subordinate efficient processing for organizational effectiveness. 4. Steps in an Analytical Process
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
... Proposed Information Collection to OMB "Logic Model" Grant Performance Report Standard AGENCY: Office of... proposal. Applicants of HUD Federal Financial Assistance are required to indicate intended results and impacts. Grant recipients report against their baseline performance standards. This process standardizes...
Interpreting international governance standards for health IT use within general medical practice.
Mahncke, Rachel J; Williams, Patricia A H
2014-01-01
General practices in Australia recognise the importance of comprehensive protective security measures. Some elements of information security governance are incorporated into recommended standards; however, the governance component of information security is still insufficiently addressed in practice. The International Organisation for Standardisation (ISO) released a new global standard in May 2013 entitled ISO/IEC 27014:2013 Information technology - Security techniques - Governance of information security. This standard, applicable to organisations of all sizes, offers a framework against which to assess and implement the governance components of information security. The standard demonstrates the relationship between governance and the management of information security, provides strategic principles and processes, and forms the basis for establishing a positive information security culture. An interpretive analysis of this standard for use in Australian general practice was performed. This work is unique, as such an interpretation for the Australian healthcare environment has not been undertaken before. It demonstrates an application of the standard at a strategic level to inform the existing development of an information security governance framework.
Retinal Information Processing for Minimum Laser Lesion Detection and Cumulative Damage
1992-09-17
...possible beneficial visual function of the small retinal image movements. B. Visual System Models Prior models of visual system information processing have...against standard secondary sources whose calibrations can be traced to the National Bureau of Standards. B. Electrophysiological Techniques Extracellular
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-18
... Glatz, Division of Policy and Planning, Office of Information Technology, Consumer Product Safety... appropriate, and other forms of information technology. Title: Safety Standard for Bicycle Helmets--16 CFR... and process for Commission acceptance of accreditation of third party conformity assessment bodies for...
2017-07-31
Telemetry Standards, IRIG Standard 106-17 Annex A.2, May 2017. ANNEX A.2 Magnetic Tape Recorder and Reproducer Information and Use Criteria. 1. Other Instrumentation Magnetic Tape...(http://webstore.ansi.org). Documentation applicable to this annex is identified in the following bullets. • ISO 1860 (1986), Information Processing
Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service
Hatano, Kenji; Ohe, Kazuhiko
2003-01-01
An information retrieval system for the Japanese Standard Disease-Code Master using XML Web Service has been developed. XML Web Service is a new distributed processing approach built on standard Internet technologies. With the seamless remote method invocation of XML Web Service, users can obtain the latest disease-code master information from their rich desktop applications or internet web sites that refer to this service.
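For illustration, a minimal client-side exchange with a service of this kind might look as follows. The operation name, namespace, element names, and code value are assumptions for the sketch; a real client would follow the service's WSDL contract and POST the request envelope over HTTP.

    # Illustrative request/response handling for an XML web service.
    # All names and values below are hypothetical.
    import xml.etree.ElementTree as ET

    request_envelope = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <GetDiseaseCode xmlns="urn:example:disease-master">
          <Name>influenza</Name>
        </GetDiseaseCode>
      </soap:Body>
    </soap:Envelope>"""

    # A canned response standing in for what the service might return.
    response = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <GetDiseaseCodeResponse xmlns="urn:example:disease-master">
          <Code>8842126</Code>
        </GetDiseaseCodeResponse>
      </soap:Body>
    </soap:Envelope>"""

    tree = ET.fromstring(response)
    code = tree.find(".//{urn:example:disease-master}Code")
    print(code.text)  # hypothetical disease-code value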
ERIC Educational Resources Information Center
Hofstader, Robert; Chapman, Kenneth
This document discusses the Voluntary Industry Standards for Chemical Process Industries Technical Workers Project and issues of relevance to the education and employment of chemical laboratory technicians (CLTs) and process technicians (PTs). Section 1 consists of the following background information: overview of the chemical process industries,…
ERIC Educational Resources Information Center
Hack, David
This report on telephone networks and computer networks in a global context focuses on the processes and organizations through which the standards that make this possible are set. The first of five major sections presents descriptions of the standardization process, including discussions of the various kinds of standards, advantages and…
NASA Astrophysics Data System (ADS)
Nikolaev, V. N.; Titov, D. V.; Syryamkin, V. I.
2018-05-01
A comparative assessment is made of the channel capacity of different variants of the structural organization of automated information processing systems. A model for assessing information processing time, depending on the type of standard elements and their structural organization, is developed.
Emergency healthcare process automation using mobile computing and cloud services.
Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G
2012-10-01
Emergency care is basically concerned with the provision of pre-hospital and in-hospital medical and/or paramedical services, and it typically involves a wide variety of interdependent and distributed activities that can be interconnected to form emergency care processes within and between Emergency Medical Service (EMS) agencies and hospitals. Hence, in developing an information system for emergency care processes, it is essential to support individual process activities and to satisfy collaboration and coordination needs by providing ready access to patient and operational information regardless of location and time. Filling this information gap by enabling the provision of the right information, to the right people, at the right time fosters new challenges, including the specification of a common information format, interoperability among heterogeneous institutional information systems, and the development of new, ubiquitous trans-institutional systems. This paper is concerned with the development of integrated computer support for emergency care processes by evolving and cross-linking institutional healthcare systems. To this end, an integrated EMS cloud-based architecture has been developed that allows authorized users to access emergency case information in standardized document form, as proposed by the Integrating the Healthcare Enterprise (IHE) profile, uses the Organization for the Advancement of Structured Information Standards (OASIS) Emergency Data Exchange Language (EDXL) Hospital Availability Exchange (HAVE) standard for exchanging operational data with hospitals, and incorporates an intelligent module that supports triaging and selecting the most appropriate ambulances and hospitals for each case.
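As a rough sketch of the operational-data exchange described above, the snippet below assembles a hospital-availability message. The element names only loosely echo EDXL-HAVE and are assumptions for illustration; a conforming message must follow the published OASIS schema.

    # Sketch of assembling a hospital-availability message in the spirit of
    # OASIS EDXL-HAVE. Element names are indicative only.
    import xml.etree.ElementTree as ET

    root = ET.Element("HospitalStatus")
    hospital = ET.SubElement(root, "Hospital")
    org = ET.SubElement(hospital, "Organization")
    ET.SubElement(org, "OrganizationName").text = "Example General Hospital"
    beds = ET.SubElement(hospital, "BedCapacity")
    ET.SubElement(beds, "BedType").text = "AdultICU"
    ET.SubElement(beds, "AvailableCount").text = "4"

    print(ET.tostring(root, encoding="unicode"))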
Solar industrial process heat systems: An assessment of standards for materials and components
NASA Astrophysics Data System (ADS)
Rossiter, W. J.; Shipp, W. E.
1981-09-01
A study was conducted to obtain information on the performance of materials and components in operational solar industrial process heat (IPH) systems, and to provide recommendations for the development of standards, including evaluative test procedures, for materials and components. An assessment of the needs for standards for evaluating the long-term performance of materials and components of IPH systems was made. The assessment was based on the availability of existing standards and on information obtained from a field survey of operational systems, the literature, and discussions with individuals in the industry. Field inspections of 10 operational IPH systems were performed.
Twenty new ISO standards on dosimetry for radiation processing
NASA Astrophysics Data System (ADS)
Farrar, H., IV
2000-03-01
Twenty standards on essentially all aspects of dosimetry for radiation processing were published as new ISO standards in December 1998. The standards are based on 20 standard practices and guides developed over the past 14 years by Subcommittee E10.01 of the American Society for Testing and Materials (ASTM). The transformation to ISO standards using the 'fast track' process under ISO Technical Committee 85 (ISO/TC85) commenced in 1995 and resulted in some overlap of technical information between three of the new standards and the existing ISO Standard 11137 Sterilization of health care products — Requirements for validation and routine control — Radiation sterilization. Although the technical information in these four standards was consistent, compromise wording in the scopes of the three new ISO standards to establish precedence for use were adopted. Two of the new ISO standards are specifically for food irradiation applications, but the majority apply to all forms of gamma, X-ray, and electron beam radiation processing, including dosimetry for sterilization of health care products and the radiation processing of fruit, vegetables, meats, spices, processed foods, plastics, inks, medical wastes, and paper. Most of the standards provide exact procedures for using individual dosimetry systems or for characterizing various types of irradiation facilities, but one covers the selection and calibration of dosimetry systems, and another covers the treatment of uncertainties using the new ISO Type A and Type B evaluations. Unfortunately, nine of the 20 standards just adopted by the ISO are not the most recent versions of these standards and are therefore already out of date. To help solve this problem, efforts are being made to develop procedures to coordinate the ASTM and ISO development and revision processes for these and future ASTM-originating dosimetry standards. In the meantime, an additional four dosimetry standards have recently been published by the ASTM but have not yet been submitted to the ISO, and six more dosimetry standards are under development.
Kropf, Stefan; Chalopin, Claire; Lindner, Dirk; Denecke, Kerstin
2017-06-28
Access to patient data within a hospital or between hospitals is still problematic, since a variety of information systems is in use applying different vendor-specific terminologies and underlying knowledge models. Moreover, the development of electronic health record systems (EHRSs) is time- and resource-consuming. Thus, there is a substantial need for a development strategy for standardized EHRSs. We apply a reuse-oriented process model and demonstrate its feasibility and realization on a practical medical use case: an EHRS holding all relevant data arising in the context of treatment of tumors of the sella region. In this paper, we describe the development process and our practical experiences. Requirements for the development of the EHRS were collected through interviews with a neurosurgeon and through patient data analysis. For modelling of patient data, we selected openEHR as the standard and exploited the software tools provided by the openEHR foundation. The patient information model forms the core of the development process, which comprises the EHR generation and the implementation of an EHRS architecture. Moreover, a reuse-oriented process model from the business domain was adapted to the development of the EHRS; it provides a suitable abstraction of both the modeling and the development of an EHR-centralized EHRS. The information modeling process resulted in 18 archetypes that were aggregated in a template and formed the basis of the model-driven development. The EHRs and the EHRS were developed using openEHR and W3C standards, tightly supported by well-established XML techniques. The GUI of the final EHRS integrates and visualizes information from various examinations, medical reports, findings, and laboratory test results. We conclude that the development of a standardized overarching EHR and an EHRS is feasible using openEHR and W3C standards, enabling a high degree of semantic interoperability. The standardized representation visualizes data and can in this way support clinicians' decision processes.
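The archetype-based modelling described above can be caricatured in a few lines: data points are validated against declarative constraints before entering the EHR. The constraint table below is an illustration, not real ADL, although the archetype identifier follows the openEHR naming convention.

    # Toy illustration of archetype-style validation. The constraint
    # definition is hypothetical and stands in for an openEHR archetype.
    CONSTRAINTS = {
        "openEHR-EHR-OBSERVATION.body_weight.v1/weight": {
            "type": float, "min": 0.0, "max": 1000.0, "units": "kg",
        },
    }

    def validate(path: str, value, units: str) -> bool:
        """Check a data point against its archetype-style constraint."""
        c = CONSTRAINTS[path]
        return (isinstance(value, c["type"])
                and c["min"] <= value <= c["max"]
                and units == c["units"])

    print(validate("openEHR-EHR-OBSERVATION.body_weight.v1/weight", 72.5, "kg"))  # True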
DEC Personnel Preparation Standards: Revision 2005-2008
ERIC Educational Resources Information Center
Lifter, Karin; Chandler, Lynette K.; Cochran, Deborah C.; Dinnebeil, Laurie A.; Gallagher, Peggy A.; Christensen, Kimberly A.; Stayton, Vicki D.
2011-01-01
The revision and process of validation of standards for early childhood special education (ECSE) and early intervention (EI) personnel at the initial and advanced levels of preparation, which occurred during 2005-2008, are described to provide a record of the process and to inform future cycles of standards revision. Central components focus on…
Integrated flexible manufacturing program for manufacturing automation and rapid prototyping
NASA Technical Reports Server (NTRS)
Brooks, S. L.; Brown, C. W.; King, M. S.; Simons, W. R.; Zimmerman, J. J.
1993-01-01
The Kansas City Division of Allied Signal Inc., as part of the Integrated Flexible Manufacturing Program (IFMP), is developing an integrated manufacturing environment. Several systems are being developed to produce standards and automation tools for specific activities within the manufacturing environment. The Advanced Manufacturing Development System (AMDS) is concentrating on information standards (STEP) and product data transfer; the Expert Cut Planner system (XCUT) is concentrating on machining operation process planning standards and automation capabilities; the Advanced Numerical Control system (ANC) is concentrating on NC data preparation standards and NC data generation tools; the Inspection Planning and Programming Expert system (IPPEX) is concentrating on inspection process planning, coordinate measuring machine (CMM) inspection standards and CMM part program generation tools; and the Intelligent Scheduling and Planning System (ISAPS) is concentrating on planning and scheduling tools for a flexible manufacturing system environment. All of these projects are working together to address information exchange, standardization, and information sharing to support rapid prototyping in a Flexible Manufacturing System (FMS) environment.
2015-12-04
This final rule will extend enhanced funding for Medicaid eligibility systems as part of a state's mechanized claims processing system, and will update conditions and standards for such systems, including adding to and updating current Medicaid Management Information Systems (MMIS) conditions and standards. These changes will allow states to improve customer service and support the dynamic nature of Medicaid eligibility, enrollment, and delivery systems.
Software Formal Inspections Guidebook
NASA Technical Reports Server (NTRS)
1993-01-01
The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.
2012-07-01
Information Modeling (BIM) is the process of generating and managing building data during a facility's entire life cycle. New BIM standards for...cycle Building Information Modeling (BIM) as a new standard for building information data repositories can serve as the foundation for automation and... Building Information Modeling (BIM) is defined as "a digital representation of physical and functional
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-05
... comments and additional information on the rulemaking process, see section IV of this document (Public... process of reviewing the changes to ASHRAE Standard 90.1, EPCA directs DOE to publish in the Federal...
Chen, Elizabeth S.; Maloney, Francine L.; Shilmayster, Eugene; Goldberg, Howard S.
2009-01-01
A systematic and standard process for capturing information within free-text clinical documents could facilitate opportunities for improving quality and safety of patient care, enhancing decision support, and advancing data warehousing across an enterprise setting. At Partners HealthCare System, the Medical Language Processing (MLP) services project was initiated to establish a component-based architectural model and processes to facilitate putting MLP functionality into production for enterprise consumption, promote sharing of components, and encourage reuse. Key objectives included exploring the use of an open-source framework called the Unstructured Information Management Architecture (UIMA) and leveraging existing MLP-related efforts, terminology, and document standards. This paper describes early experiences in defining the infrastructure and standards for extracting, encoding, and structuring clinical observations from a variety of clinical documents to serve enterprise-wide needs.
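The UIMA style of processing mentioned above can be suggested with a toy pipeline: annotators successively add typed, offset-based annotations over a shared document. This is a plain-Python analogy for exposition, not the actual UIMA API, and the annotator rules are invented.

    # Toy annotator pipeline in the spirit of UIMA: each annotator appends
    # (type, start, end) annotations to a shared list.
    import re

    def section_annotator(text, annotations):
        # Mark section headers such as "Medications:".
        for m in re.finditer(r"(?m)^[A-Z][A-Za-z ]+:", text):
            annotations.append(("Section", m.start(), m.end()))

    def med_annotator(text, annotations):
        # Mark a tiny, hypothetical medication lexicon.
        for m in re.finditer(r"\b(aspirin|warfarin)\b", text, re.I):
            annotations.append(("Medication", m.start(), m.end()))

    note = "Medications: aspirin 81 mg daily.\nAssessment: stable."
    annotations = []
    for annotator in (section_annotator, med_annotator):
        annotator(note, annotations)
    print(annotations)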
Wiltz, Jennifer L; Blanck, Heidi M; Lee, Brian; Kocot, S Lawrence; Seeff, Laura; McGuire, Lisa C; Collins, Janet
2017-10-26
Electronic information technology standards facilitate high-quality, uniform collection of data for improved delivery and measurement of health care services. Electronic information standards also aid information exchange between secure systems that link health care and public health for better coordination of patient care and better-informed population health improvement activities. We developed international data standards for healthy weight that provide common definitions for electronic information technology. The standards capture healthy weight data on the "ABCDs" of a visit to a health care provider that addresses initial obesity prevention and care: assessment, behaviors, continuity, identify resources, and set goals. The process of creating healthy weight standards consisted of identifying needs and priorities, developing and harmonizing standards, testing the exchange of data messages, and demonstrating use-cases. Healthy weight products include 2 message standards, 5 use-cases, 31 LOINC (Logical Observation Identifiers Names and Codes) question codes, 7 healthy weight value sets, 15 public-private engagements with health information technology implementers, and 2 technical guides. A logic model and action steps outline activities toward better data capture, interoperable systems, and information use. Sharing experiences and leveraging this work in the context of broader priorities can inform the development of electronic information standards for similar core conditions and guide strategic activities in electronic systems.
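A sketch of what LOINC-coded healthy-weight data capture might look like follows. The record layout is an illustrative assumption, though the LOINC codes shown (8302-2 body height, 29463-7 body weight, 39156-5 body mass index) are published codes.

    # Illustrative healthy-weight observations coded with LOINC.
    observations = [
        {"code": "8302-2",  "display": "Body height", "value": 175.0, "unit": "cm"},
        {"code": "29463-7", "display": "Body weight", "value": 70.0,  "unit": "kg"},
    ]

    # Derive a BMI observation from height and weight.
    height_m = observations[0]["value"] / 100.0
    bmi = {"code": "39156-5", "display": "Body mass index",
           "value": round(observations[1]["value"] / height_m ** 2, 1),
           "unit": "kg/m2"}
    print(bmi)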
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stayner, L.T.; Meinhardt, T.; Hardin, B.
Under the Occupational Safety and Health, and Mine Safety and Health Acts, the National Institute for Occupational Safety and Health (NIOSH) is charged with development of recommended occupational safety and health standards, and with conducting research to support the development of these standards. Thus, NIOSH has been actively involved in the analysis of risk associated with occupational exposures, and in the development of research information that is critical for the risk assessment process. NIOSH research programs and other information resources relevant to the risk assessment process are described in this paper. Future needs for information resources are also discussed.
Relatively fast! Efficiency advantages of comparative thinking.
Mussweiler, Thomas; Epstude, Kai
2009-02-01
Comparisons are a ubiquitous process in information processing. Seven studies examine whether, how, and when comparative thinking increases the efficiency of judgment and choice. Studies 1-4 demonstrate that procedurally priming participants to engage in more vs. less comparison influences how they process information about a target. Specifically, they retrieve less information about the target (Studies 1A, 1B), think more about an information-rich standard (Study 2) about which they activate judgment-relevant information (Study 3), and use this information to compensate for missing target information (Study 4). Studies 2-5 demonstrate the ensuing efficiency advantages. Participants who are primed on comparative thinking are faster in making a target judgment (Studies 2A, 2B, 4, 5) and have more residual processing capacities for a secondary task (Study 5). Studies 6 and 7 establish two boundary conditions by demonstrating that comparative thinking holds efficiency advantages only if target and standard are partly characterized by alignable features (Study 6) that are difficult to evaluate in isolation (Study 7). These findings indicate that comparative thinking may often constitute a useful mechanism to simplify information processing.
Needed: A Standard Information Processing Model of Learning and Learning Processes.
ERIC Educational Resources Information Center
Carifio, James
One strategy to prevent confusion as new paradigms emerge is to have professionals in the area develop and use a standard model of the phenomenon in question. The development and use of standard models in physics, genetics, archaeology, and cosmology have been very productive. The cognitive revolution in psychology and education has produced a…
RMP Guidance for Warehouses - Appendix D: OSHA Guidance on PSM
This text is taken directly from OSHA's appendix C to the Process Safety Management standard (29 CFR 1910.119). Compiled information required by this standard, including material safety data sheets (MSDS), is essential to process hazards analysis (PHA).
75 FR 18751 - FBI Criminal Justice Information Services Division User Fees
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-13
... Standards (SFFAS-4): Managerial Cost Accounting Concepts and Standards for the Federal Government; and other relevant financial management directives, BearingPoint developed a cost accounting methodology and related... management process that provides information about the relationships between inputs (costs) and outputs...
48 CFR 352.239-71 - Standard for encryption language.
Code of Federal Regulations, 2011 CFR
2011-10-01
... language. 352.239-71 Section 352.239-71 Federal Acquisition Regulations System HEALTH AND HUMAN SERVICES... Information Processing Standard (FIPS) 140-2-compliant encryption (Security Requirements for Cryptographic Module, as amended) to protect all instances of HHS sensitive information during storage and transmission...
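As an illustration of the kind of encryption such a clause calls for, the sketch below uses AES-GCM via the Python cryptography package. Note that FIPS 140-2 compliance is a property of a validated cryptographic module, not of the algorithm choice alone, so this is indicative only.

    # Illustrative AES-GCM protection of sensitive data at rest/in transit.
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    import os

    key = AESGCM.generate_key(bit_length=256)  # AES-256, a FIPS-approved cipher
    nonce = os.urandom(12)                     # must be unique per message
    aesgcm = AESGCM(key)

    ciphertext = aesgcm.encrypt(nonce, b"HHS sensitive information", None)
    plaintext = aesgcm.decrypt(nonce, ciphertext, None)
    assert plaintext == b"HHS sensitive information"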
A standard-driven approach for electronic submission to pharmaceutical regulatory authorities.
Lin, Ching-Heng; Chou, Hsin-I; Yang, Ueng-Cheng
2018-03-01
Using standards is not only useful for data interchange during the process of a clinical trial, but also useful for analyzing data in a review process. Any step that speeds up approval of new drugs may benefit patients. As a result, adopting standards for regulatory submission has become mandatory in some countries. However, preparing standard-compliant documents, such as an annotated case report form (aCRF), requires a great deal of knowledge and experience. The process is complex and labor-intensive. Therefore, there is a need to use information technology to facilitate this process. Instead of standardizing data after the completion of a clinical trial, this study proposed a standard-driven approach, achieved by implementing a computer-assisted "standard-driven pipeline" (SDP) in an existing clinical data management system. The SDP used CDISC standards to drive all processes of a clinical trial, such as design, data acquisition, and tabulation. A completed phase I/II trial was used to prove the concept and to evaluate the effects of this approach. By using the CDISC-compliant question library, aCRFs were generated automatically when the eCRFs were completed. For comparison purposes, the data collection process was simulated and the collected data were transformed by the SDP. This new approach reduced the number of missing data fields from 62 to 8 and the number of controlled-term mismatch fields from 8 to 0 during data tabulation. The standard-driven approach accelerated CRF annotation and assured data tabulation integrity. The benefits of this approach include an improvement in the use of standards during the clinical trial and a reduction in missing and unexpected data during tabulation. The standard-driven approach is an advanced design idea that can be used for future clinical information system development.
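The standard-driven idea can be sketched briefly: if every eCRF question is drawn from a library that already carries its CDISC annotation, the annotated CRF falls out automatically. The library entries and SDTM targets below are illustrative assumptions, not the paper's actual library.

    # Sketch of a CDISC-annotated question library driving aCRF generation.
    QUESTION_LIBRARY = {
        "systolic_bp": {"prompt": "Systolic blood pressure",
                        "sdtm": "VS.VSORRES where VSTESTCD=SYSBP"},
        "birth_date":  {"prompt": "Date of birth",
                        "sdtm": "DM.BRTHDTC"},
    }

    def annotate_crf(question_ids):
        """Return (prompt, SDTM annotation) pairs for an eCRF page."""
        return [(QUESTION_LIBRARY[q]["prompt"], QUESTION_LIBRARY[q]["sdtm"])
                for q in question_ids]

    for prompt, annotation in annotate_crf(["birth_date", "systolic_bp"]):
        print(f"{prompt}  ->  {annotation}")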
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-19
... correction of wording and typographical errors, and further aligns the FIPS with Key Cryptography Standard... Cryptography Standard (PKCS) 1. NIST published a Federal Register Notice (77 FR 21538) on April 10, 2012 to...
Information Processing in Memory Tasks.
ERIC Educational Resources Information Center
Johnston, William A.
The intensity of information processing engendered in different phases of standard memory tasks was examined in six experiments. Processing intensity was conceptualized as system capacity consumed, and was measured via a divided-attention procedure in which subjects performed a memory task and a simple reaction-time (RT) task concurrently. The…
Code of Federal Regulations, 2010 CFR
2010-07-01
... being used will be based on information available to the Administrator, which may include, but is not... techniques, or the control system and process monitoring equipment during a malfunction in a manner... the process and control system monitoring equipment, and shall include a standardized checklist to...
Code of Federal Regulations, 2011 CFR
2011-07-01
... being used will be based on information available to the Administrator, which may include, but is not... techniques, or the control system and process monitoring equipment during a malfunction in a manner... the process and control system monitoring equipment, and shall include a standardized checklist to...
Qualitative Differences in Real-Time Solution of Standardized Figural Analogies.
ERIC Educational Resources Information Center
Schiano, Diane J.; And Others
Performance on standardized figural analogy tests is considered highly predictive of academic success. While information-processing models of analogy solution attribute performance differences to quantitative differences in processing parameters, the problem-solving literature suggests that qualitative differences in problem representation and…
Designing the Information Literacy Competency Standards for nursing.
Phelps, Sue F
2013-01-01
This column documents the rationale for creating information literacy competency standards for nursing based on the Association of College and Research Libraries (ACRL) "Information Literacy Competency Standards for Higher Education" and the three documents from the American Association of Colleges of Nursing (AACN) on essential skills for nurses in baccalaureate, master's, and doctoral level education and practice. It chronicles the process of the task force that is designing the discipline-specific skills and predicts the value of their use, once they are published.
Quigley, Matthew; Dillon, Michael P; Fatone, Stefania
2018-02-01
Shared decision making is a consultative process designed to encourage patient participation in decision making by providing accurate information about the treatment options and supporting deliberation with clinicians about those options. The process can be supported by resources such as decision aids and discussion guides designed to inform and facilitate often difficult conversations. As this process increases in use, there is opportunity to raise awareness of shared decision making and of the international standards used to guide the development of quality resources for use in areas of prosthetic/orthotic care. To describe the process used to develop shared decision-making resources, using an illustrative example focused on decisions about the level of dysvascular partial foot amputation or transtibial amputation. Development process: The International Patient Decision Aid Standards were used to guide the development of the decision aid and discussion guide focused on decisions about the level of dysvascular partial foot amputation or transtibial amputation. Examples from these shared decision-making resources help illuminate the stages of development, including scoping and design, research synthesis, iterative development of a prototype, and preliminary testing with patients and clinicians not involved in the development process. Lessons learnt through the process, such as using the International Patient Decision Aid Standards checklist and development guidelines, may help inform others wanting to develop similar shared decision-making resources, given the applicability of shared decision making to many areas of prosthetic-/orthotic-related practice. Clinical relevance: Shared decision making is a process designed to guide conversations that help patients make an informed decision about their healthcare. Raising awareness of shared decision making and the international standards for development of high-quality decision aids and discussion guides is important as the approach is introduced in prosthetic-/orthotic-related practice.
Terminal Information Processing System (TIPS) Consolidated CAB Display (CCD) Comparative Analysis.
1982-04-01
Barometric pressure 3. Center field wind speed, direction and gusts 4. Runway visual range 5. Low-level wind shear 6. Vortex advisory 7. Runway equipment...PASSWORD Command (standard user) u. PAUSE Command (standard user) v. PMSG Command (standard user) w. PPD Command (standard user) x. PURGE Command (standard
Standardization of XML Database Exchanges and the James Webb Space Telescope Experience
NASA Technical Reports Server (NTRS)
Gal-Edd, Jonathan; Detter, Ryan; Jones, Ron; Fatig, Curtis C.
2007-01-01
Personnel from the National Aeronautics and Space Administration (NASA) James Webb Space Telescope (JWST) Project have been working with various standards communities, such as the Object Management Group (OMG) and the Consultative Committee for Space Data Systems (CCSDS), to assist in the definition of a common eXtensible Markup Language (XML) database exchange format. The CCSDS and OMG standards are intended for the exchange of core command and telemetry information, not for all database information needed to exercise a NASA space mission. The mission-specific database, containing all the information needed for a space mission, is translated from/to the standard using a translator. The standard is meant to provide a system that encompasses 90% of the information needed for command and telemetry processing. This paper discusses standardization of the XML database exchange format, the tools used, and the JWST experience, as well as future work with XML standards groups, both commercial and government.
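A translator of the kind described might, in outline, read mission-specific database rows and emit a standards-based XML document. The element names below are placeholders for illustration, not the actual CCSDS/OMG exchange schema.

    # Sketch of translating a mission-specific telemetry database into a
    # standard XML exchange document. Element names are hypothetical.
    import xml.etree.ElementTree as ET

    mission_db = [
        {"mnemonic": "BATT_V", "type": "float", "units": "V",
         "description": "Battery bus voltage"},
    ]

    root = ET.Element("TelemetryDictionary")
    for row in mission_db:
        p = ET.SubElement(root, "Parameter",
                          name=row["mnemonic"], dataType=row["type"])
        ET.SubElement(p, "Units").text = row["units"]
        ET.SubElement(p, "Description").text = row["description"]

    print(ET.tostring(root, encoding="unicode"))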
7 CFR 868.262 - Grade designation and other certificate information.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Brown Rice for Processing Principles Governing Application of Standards § 868.262 Grade designation and other certificate information. (a) Brown rice for processing. The grade designation for all classes of Brown rice for processing shall be included on the certificate grade-line in the following order: (1...
7 CFR 868.262 - Grade designation and other certificate information.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Brown Rice for Processing Principles Governing Application of Standards § 868.262 Grade designation and other certificate information. (a) Brown rice for processing. The grade designation for all classes of Brown rice for processing shall be included on the certificate grade-line in the following order: (1...
7 CFR 868.262 - Grade designation and other certificate information.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Brown Rice for Processing Principles Governing Application of Standards § 868.262 Grade designation and other certificate information. (a) Brown rice for processing. The grade designation for all classes of Brown rice for processing shall be included on the certificate grade-line in the following order: (1...
7 CFR 868.262 - Grade designation and other certificate information.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Brown Rice for Processing Principles Governing Application of Standards § 868.262 Grade designation and other certificate information. (a) Brown rice for processing. The grade designation for all classes of Brown rice for processing shall be included on the certificate grade-line in the following order: (1...
7 CFR 868.262 - Grade designation and other certificate information.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Brown Rice for Processing Principles Governing Application of Standards § 868.262 Grade designation and other certificate information. (a) Brown rice for processing. The grade designation for all classes of Brown rice for processing shall be included on the certificate grade-line in the following order: (1...
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common Information Systems Development practice, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
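A deliberately simple sketch of the extraction step: task names of verb-object form in a BPMN model yield candidate SBVR terms and verb concepts. The heuristic and the sample model are illustrative assumptions, not the authors' algorithm.

    # Sketch of business-vocabulary extraction from a BPMN model.
    import xml.etree.ElementTree as ET

    BPMN = """<definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL">
      <bpmn:process>
        <bpmn:task name="Register order"/>
        <bpmn:task name="Approve invoice"/>
      </bpmn:process>
    </definitions>"""

    ns = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}
    terms, verbs = set(), set()
    for task in ET.fromstring(BPMN).iterfind(".//bpmn:task", ns):
        verb, _, obj = task.get("name").partition(" ")
        verbs.add(verb.lower())
        terms.add(obj.lower())

    print("candidate terms:", terms)          # {'order', 'invoice'}
    print("candidate verb concepts:", verbs)  # {'register', 'approve'}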
[HL7 standard--features, principles, and methodology].
Koncar, Miroslav
2005-01-01
The mission of the HL7 Inc. non-profit organization is to provide standards for the exchange, management, and integration of data that support clinical patient care and the management, delivery, and evaluation of healthcare services. As the standards developed by HL7 Inc. represent the world's most influential standardization efforts in the field of medical informatics, the HL7 family of standards has been recognized by the technical and scientific community as the foundation for the next generation of healthcare information systems. Versions 1 and 2 of the HL7 standard solved many issues, but also demonstrated the size and complexity of the health information sharing problem. As the solution, a completely new methodology has been adopted, encompassed in the HL7 Version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all derived domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project in the Republic of Croatia in 2002, the decision was made to go directly to Version 3. The target scope of work includes clinical, financial, and administrative data management in the domain of healthcare processes. By using the standardized HL7v3 methodology, we were able to completely map the Croatian primary healthcare domain to HL7v3 artefacts. Further refinement processes planned for the future will provide semantic interoperability and a detailed description of all elements in HL7 messages. Our HL7 Business Component is constantly studying different legacy applications, laying a solid foundation for their integration into an HL7-enabled communication environment.
This page contains a December 2007 fact sheet with information regarding the National Emissions Standards for Hazardous Air Pollutants (NESHAP) for Clay Ceramics Manufacturing, Glass Manufacturing, and Secondary Nonferrous Metals Processing Area Sources
Tait, Alan R; Voepel-Lewis, Terri; Malviya, Shobha; Philipson, Sandra J
2005-04-01
To examine whether a consent document modified to conform with the federal guidelines for readability and processability would result in greater parental understanding compared with a standard form. Randomized clinical study. The preoperative waiting area of a large tertiary care children's hospital. A total of 305 parents of children scheduled for minor elective surgical procedures. Parents were randomized to receive information about a clinical study in 1 of 4 ways: (1) standard consent form alone, (2) standard consent form with verbal disclosure, (3) modified form alone (standard form modified to meet the federal guidelines for readability and processability), and (4) modified form with verbal disclosure. Parents were interviewed to determine their understanding of 11 elements of consent, including study purpose, protocol, risks, benefits to child (direct), benefit to others (indirect), freedom to withdraw, alternatives, duration of study, voluntariness, confidentiality, and whom to contact. Their responses were scored by 2 independent assessors. Understanding of the protocol, study duration, risks, and direct benefits, together with overall understanding, was greater among parents who received the modified form (P<.001). Additionally, parents reported that the modified form had greater clarity (P = .009) and improved layout compared with the standard form (P<.001). When parents were shown both forms, 81.2% preferred the modified version. Results suggest that a consent form written according to federal guidelines for readability and processability can improve parental understanding and thus will be important in enhancing the informed consent process.
45 CFR 164.308 - Administrative safeguards.
Code of Federal Regulations, 2012 CFR
2012-10-01
...)(i) Standard: Security management process. Implement policies and procedures to prevent, detect... this subpart for the entity. (3)(i) Standard: Workforce security. Implement policies and procedures to...) Standard: Information access management. Implement policies and procedures for authorizing access to...
45 CFR 164.308 - Administrative safeguards.
Code of Federal Regulations, 2011 CFR
2011-10-01
...)(i) Standard: Security management process. Implement policies and procedures to prevent, detect... this subpart for the entity. (3)(i) Standard: Workforce security. Implement policies and procedures to...) Standard: Information access management. Implement policies and procedures for authorizing access to...
An Approach for Implementation of Project Management Information Systems
NASA Astrophysics Data System (ADS)
Běrziša, Solvita; Grabis, Jānis
Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on an analysis of typical project management concepts and processes and on existing XML-based representations of project management. A demonstration example of configuring a project management information system is provided.
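A hedged sketch of the specification-to-configuration idea follows; the XML vocabulary and the configuration keys are invented, and the chapter's actual framework and transformation mechanisms may differ.

```python
# Hedged illustration: transform an XML methodology specification into a
# project management information system (PMIS) configuration dictionary.
import xml.etree.ElementTree as ET

SPEC = """<methodology name="ExampleMethod">
  <phase name="Initiation"><artifact>Charter</artifact></phase>
  <phase name="Planning"><artifact>Schedule</artifact><artifact>Budget</artifact></phase>
</methodology>"""

def spec_to_config(spec_xml: str) -> dict:
    """Derive a workflow and document templates from the specification."""
    root = ET.fromstring(spec_xml)
    return {
        "workflow": [phase.get("name") for phase in root.iter("phase")],
        "document_templates": [a.text for a in root.iter("artifact")],
    }

print(spec_to_config(SPEC))
# {'workflow': ['Initiation', 'Planning'],
#  'document_templates': ['Charter', 'Schedule', 'Budget']}
```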
Information use skills in the engineering programme accreditation criteria of four countries
NASA Astrophysics Data System (ADS)
Bradley, Cara
2014-01-01
The need for twenty-first century information skills in engineering practice, combined with the importance for engineering programmes to meet accreditation requirements, suggests that it may be worthwhile to explore the potential for closer alignment between librarians and their work with information literacy competencies to assist in meeting accreditation standards and graduating students with high-level information skills. This article explores whether and how information use skills are reflected in engineering programme accreditation standards of four countries: Canada, the USA, the UK, and Australia. Results indicate that there is significant overlap between the information use skills required of students by engineering accreditation processes and librarians' efforts to develop information literacy competencies in students, despite differences in terms used to describe these skills. Increased collaboration between engineering faculty and librarians has the potential to raise student information literacy levels and fulfil the information use-related requirements of accreditation processes.
Specifications for a Federal Information Processing Standard Data Dictionary System
NASA Technical Reports Server (NTRS)
Goldfine, A.
1984-01-01
The development of a software specification that Federal agencies may use in evaluating and selecting data dictionary systems (DDS) is discussed. To supply the flexibility needed by widely different applications and environments in the Federal Government, the Federal Information Processing Standard (FIPS) specifies a core DDS together with an optional set of modules. The focus and status of the development project are described. Functional specifications for the FIPS DDS are examined for the dictionary, the dictionary schema, and the dictionary processing system. The DDS user interfaces and DDS software interfaces are discussed, as is dictionary administration.
Making the connection: the VA-Regenstrief project.
Martin, D K
1992-01-01
The Regenstrief Automated Medical Record System is a well-established clinical information system with powerful facilities for querying and decision support. My colleagues and I introduced this system into the Indianapolis Veterans Affairs (VA) Medical Center by interfacing it to the institution's automated data-processing system, the Decentralized Hospital Computer Program (DHCP), using a recently standardized method for clinical data interchange. This article discusses some of the challenges encountered in that process, including the translation of vocabulary terms and maintenance of the software interface. Efforts such as these demonstrate the importance of standardization in medical informatics and the need for data standards at all levels of information exchange.
46 CFR 67.13 - Incorporation by reference.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., National Technical Information Service, Springfield, VA 22181 Federal Information Processing Standards...). For information on the availability of this material at NARA, call 202-741-6030, or go to: http://www...
75 FR 57328 - Petitions for Exemption; Summary of Petitions Received
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-20
... more information on the rulemaking process, see the SUPPLEMENTARY INFORMATION section of this document..., except Federal holidays. FOR FURTHER INFORMATION CONTACT: Jan Thor, (425-227-2127), Standardization...
Gathering Information from Transport Systems for Processing in Supply Chains
NASA Astrophysics Data System (ADS)
Kodym, Oldřich; Unucka, Jakub
2016-12-01
This paper deals with a complex system for processing information from means of transport acting as parts of a train (rail or road). It focuses on automated information gathering using AutoID technology, information transmission via Internet of Things networks, and the use of the information in the information systems of logistics firms to support selected processes at the MES and ERP levels. Different kinds of information gathered across the whole transport chain are discussed, and compliance with existing standards is addressed. Security of the information over its full life cycle is an integral part of the presented system. A design for a fully equipped system based on synthesized functional nodes is presented.
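As a rough illustration of packaging one AutoID read for transmission toward MES/ERP systems, a sketch under stated assumptions: the payload layout, field names, and example values are invented for demonstration.

```python
# Sketch only: serialize one RFID/AutoID read as a JSON message that an
# IoT network could carry upstream to logistics information systems.
import json
import time

def build_transport_event(tag_id: str, reader_id: str, position: str) -> str:
    """Package a single AutoID read as a JSON event message."""
    event = {
        "tag_id": tag_id,          # wagon or container identifier
        "reader_id": reader_id,    # fixed gate or on-board reader
        "position": position,      # e.g., station name or GPS cell
        "timestamp": time.time(),  # epoch seconds at the moment of the read
    }
    return json.dumps(event)

print(build_transport_event("EPC-3034F87AB", "gate-07", "Ostrava hl.n."))
```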
ERIC Educational Resources Information Center
Kontos, Pia C.; Miller, Karen-Lee; Mitchell, Gail J.
2010-01-01
Purpose: The Resident Assessment Instrument-Minimum Data Set (RAI/MDS) is an interdisciplinary standardized process that informs care plan development in nursing homes. This standardized process has failed to consistently result in individualized care planning, which may suggest problems with content and planning integrity. We examined the…
Effectiveness of the Department of Defense Information Assurance Accreditation Process
2013-03-01
meeting the requirements of ISO 27001, Information Security Management System. ISO 27002 provides "security techniques" or best practices that can be...efforts to the next level and implement a recognized standard such as the International Organization for Standardization (ISO) 27000 series of standards...implemented by an organization as part of their certification effort. Most likely, the main motivation a company would have for achieving an ISO
Unified System Of Data On Materials And Processes
NASA Technical Reports Server (NTRS)
Key, Carlo F.
1989-01-01
Wide-ranging sets of data for the aerospace industry are described. The document describes the Materials and Processes Technical Information System (MAPTIS), a computerized set of integrated databases for use by NASA and the aerospace industry. It stores information in a standard format for fast retrieval in searches and surveys of data, helps engineers select materials and verify their properties, and promotes standardized nomenclature as well as standardized tests and presentation of data. The document is in the format of photographic projection slides used in lectures and presents examples of reports from the various databases.
Library Standards: Evidence of Library Effectiveness and Accreditation.
ERIC Educational Resources Information Center
Ebbinghouse, Carol
1999-01-01
Discusses accreditation standards for libraries based on experiences in an academic law library. Highlights include the accreditation process; the impact of distance education and remote technologies on accreditation; and a list of Internet sources of standards and information. (LRW)
75 FR 37879 - Petitions for Exemption; Summary of Petitions Received
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-30
... more information on the rulemaking process, see the SUPPLEMENTARY INFORMATION section of this document... Federal holidays. FOR FURTHER INFORMATION CONTACT: Jan Thor, (425-227-2127), Standardization Branch, ANM...
75 FR 15771 - Petitions for Exemption; Summary of Petitions Received
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-30
... more information on the rulemaking process, see the SUPPLEMENTARY INFORMATION section of this document... Federal holidays. FOR FURTHER INFORMATION CONTACT: Jan Thor, (425-227-2127), Standardization Branch, ANM...
75 FR 26843 - Petition for Exemption; Summary of Petition Received
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-12
... more information on the rulemaking process, see the SUPPLEMENTARY INFORMATION section of this document... Federal holidays. FOR FURTHER INFORMATION CONTACT: Jan Thor, (425-227-2127), Standardization Branch, ANM...
Saqaeian Nejad Isfahani, Sakineh; Mirzaeian, Razieh; Habibi, Mahbobe
2013-01-01
In supporting a therapeutic approach and medication therapy management, the pharmacy information system acts as one of the central pillars of the hospital information system. This ensures that medication therapy is supported and evaluated with an optimal level of safety and quality, as other treatments and services are. This research aims to evaluate the performance of the pharmacy information system in three types of hospitals: teaching, private, and Social Services-affiliated. The present study is an applied, descriptive, and analytical study conducted on the pharmacy information systems in use in the selected hospitals. The research population included all users of pharmacy information systems in the selected hospitals, and the research sample is the same as the research population. The researchers collected data using a self-designed checklist developed following the guidelines of the American Society of Health-System Pharmacists, the Pharmaceutical Society of Australia, and the therapeutic guidelines of the Drug Commission of the German Medical Association. The checklist's validity was assessed by the research supervisors and by pharmacy information system pharmacists and users. Besides observation, data were collected through questionnaires distributed among pharmacy information system pharmacists and users. Finally, the data were analyzed using the SPSS software. The pharmacy information system was found to be semi-automated in 16 hospitals and automated in 3. Regarding the standards in the guidelines issued by the societies of pharmacists, the highest rank in meeting the input standards belonged to the Social Services-affiliated hospitals, with a mean score of 32.75, while the teaching hospitals gained the highest scores in both processing standards (mean 29.15) and output standards (mean 43.95), and the private hospitals had the lowest mean scores (23.32, 17.78, and 24.25 in input, processing, and output standards, respectively). Based on the findings, the studied hospitals had minimal compliance with the input, processing, and output standards for the pharmacy information system. It is suggested that establishing a team composed of operational managers, computer experts, health information managers, pharmacists, and physicians may help promote the capabilities of the pharmacy information system so that it can focus on health care practitioners' and users' requirements.
ERIC Educational Resources Information Center
Cravens, Xiu Chen; Goldring, Ellen B.; Porter, Andrew C.; Polikoff, Morgan S.; Murphy, Joseph; Elliott, Stephen N.
2013-01-01
Purpose: Performance evaluation informs professional development and helps school personnel improve student learning. Although psychometric literature indicates that a rational, sound, and coherent standard-setting process adds to the credibility of an assessment, few studies have empirically examined the decision-making process. This article…
NASA-STD-6016 Standard Materials and Processes Requirements for Spacecraft
NASA Technical Reports Server (NTRS)
Hirsch, David B.
2009-01-01
The standards for materials and processes for spacecraft are discussed. The presentation focuses on minimum requirements for Materials and Processes (M&P) used in the design, fabrication, and testing of flight components for NASA manned, unmanned, robotic, launch vehicle, lander, in-space, and surface systems, and spacecraft program/project hardware elements. Included is information on flammability, offgassing, and compatibility requirements and processes; both metallic and non-metallic materials are covered.
1996 DOE technical standards program workshop: Proceedings
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-07-01
The workshop theme is "The Strategic Standardization Initiative - A Technology Exchange and Global Competitiveness Challenge for DOE." The workshop goal is to inform the DOE technical standards community of strategic standardization activities taking place in the Department, other Government agencies, standards developing organizations, and industry. Individuals working on technical standards will be challenged to improve cooperation and communications with the involved organizations in response to the initiative. Workshop sessions include presentations by representatives from various Government agencies that focus on coordination among and participation of Government personnel in the voluntary standards process; reports by standards organizations, industry, and DOE representatives on current technology exchange programs; and how the road ahead appears for "information superhighway" standardization. Another session highlights successful standardization case studies selected from several sites across the DOE complex. The workshop concludes with a panel discussion on the goals and objectives of the DOE Technical Standards Program as envisioned by senior DOE management. The annual workshop on technical standards has proven to be an effective medium for communicating information related to standards throughout the DOE community. Technical standards are used to transfer technology and standardize work processes to produce consistent, acceptable results. They provide a practical solution to the Department's challenge to protect the environment and the health and safety of the public and workers during all facility operations. Through standards, the technologies of industries and governments worldwide are available to DOE. The DOE Technical Standards Program, a Department-wide effort that crosscuts all organizations and disciplines, links the Department to those technologies.
Overview of Computer Security Certification and Accreditation. Final Report.
ERIC Educational Resources Information Center
Ruthberg, Zella G.; Neugent, William
Primarily intended to familiarize ADP (automatic data processing) policy and information resource managers with the approach to computer security certification and accreditation found in "Guideline to Computer Security Certification and Accreditation," Federal Information Processing Standards Publications (FIPS-PUB) 102, this overview…
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2011-12-01
Services that preserve and enable future access to scientific data are necessary to ensure that the data that are being collected today will be available for use by future generations of scientists. Many data centers, archives, and other digital repositories are working to improve their ability to serve as long-term stewards of scientific data. Trust in sustainable data management and preservation capabilities of digital repositories can influence decisions to use these services to deposit or obtain scientific data. Building on the Open Archival Information System (OAIS) Reference Model developed by the Consultative Committee for Space Data Systems (CCSDS) and adopted by the International Organization for Standardization as ISO 14721:2003, new standards are being developed to improve long-term data management processes and documentation. The Draft Information Standard ISO/DIS 16363, "Space data and information transfer systems - Audit and certification of trustworthy digital repositories" offers the potential to evaluate digital repositories objectively in terms of their trustworthiness as long-term stewards of digital resources. In conjunction with this, the CCSDS and ISO are developing another draft standard for the auditing and certification process, ISO/DIS 16919, "Space data and information transfer systems - Requirements for bodies providing audit and certification of candidate trustworthy digital repositories". Six test audits were conducted of scientific data centers and archives in Europe and the United States to test the use of these draft standards and identify potential improvements for the standards and for the participating digital repositories. We present a case study of the test audit conducted on the NASA Socioeconomic Data and Applications Center (SEDAC) and describe the preparation, the audit process, recommendations received, and next steps to obtain certification as a trustworthy digital repository, after approval of the ISO/DIS standards.
48 CFR 239.7201 - Solicitation requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Information Processing Standards are incorporated into solicitations. [71 FR 39011, July 11, 2006] ... SYSTEM, DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY...
Technical Standards for Command and Control Information Systems (CCISs) and Information Technology
1994-02-01
formatting, transmitting, receiving, and processing imagery and imagery-related information. The NITFS is in essence the suite of individual standards...also known as Limited Operational Capability-Europe) and the German Joint Analysis System Military Intelligence (JASMIN). Among the approaches being...essence, the other systems utilize a one-level address space where addressing consists of identifying the fire support unit. However, AFATDS utilizes a two
Memory Effects in Visual Spatial Information Processing.
ERIC Educational Resources Information Center
Fishbein, Harold D.
1978-01-01
Eight-, ten-, and twelve-year-old children were tested on a novel procedure involving the successive presentation of standard and comparison stimuli. Two hypotheses were evaluated: one dealing with memory effects, and the other with children's pretesting of choice responses in spatial information processing. (Editor/RK)
[Development and clinical evaluation of an anesthesia information management system].
Feng, Jing-yi; Chen, Hua; Zhu, Sheng-mei
2010-09-21
To study the design, implementation, and clinical evaluation of an anesthesia information management system. To record, process, and store peri-operative patient data automatically, all bedside monitoring equipment is connected into the system using information-integration technology; after statistical analysis of the patient data by data-mining technology, patient status can be evaluated automatically against a risk-prediction standard and a decision support system, so that the anesthetist can perform reasonable and safe clinical processes; and with clinical processes electronically recorded, standard record tables can be generated and the clinical workflow optimized as well. With the system, various kinds of patient data can be collected, stored, analyzed, and archived, various anesthesia documents can be generated, and patient status can be evaluated to support clinical decisions. The anesthesia information management system is useful for improving anesthesia quality, decreasing risk to patients and clinicians, and helping to provide clinical evidence.
76 FR 23714 - Railroad Safety Appliance Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-28
... would include each of the elements that would be necessary to allow it to make an informed decision on a... for the review and approval of existing industry standards. This process will permit railroad industry.... Section 238.230 borrows the process set out in Sec. 238.21. It allows a recognized representative of the...
45 CFR 303.72 - Requests for collection of past-due support by Federal tax refund offset.
Code of Federal Regulations, 2010 CFR
2010-10-01
... the accuracy of the past-due support amount. If the State IV-D agency has verified this information...) The amount of past-due support owed; (iv) The State codes as contained in the Federal Information Processing Standards (FIPS) publication of the National Bureau of Standards and also promulgated by the...
National Education Standards: The Complex Challenge for Educational Leaders.
ERIC Educational Resources Information Center
Faidley, Ray; Musser, Steven
1991-01-01
National standards for education are important elements in the excellence process, but standards imposed by a central authority simply do not work in the Information Era. It would be wise to increase teachers' decision-making role in establishing and implementing local level excellence standards and train teachers to employ the Japanese "kaizen"…
An Extensible Information Grid for Risk Management
NASA Technical Reports Server (NTRS)
Maluf, David A.; Bell, David G.
2003-01-01
This paper describes recent work on developing an extensible information grid for risk management at NASA - a RISK INFORMATION GRID. This grid is being developed by integrating information grid technology with risk management processes for a variety of risk-related applications. To date, RISK GRID applications are being developed for three main NASA processes: risk management - a closed-loop iterative process for explicit risk management; program/project management - a proactive process that includes risk management; and mishap management - a feedback loop for learning from historical risks that escaped other processes. This is enabled through an architecture involving an extensible database, structuring of information with XML, schemaless mapping of XML, and secure server-mediated communication using standard protocols.
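A hedged sketch of the "schemaless mapping of XML" idea: each element becomes a generic row, so documents of new shapes require no schema change. The table layout and names are assumptions of this illustration, not the system's actual design.

```python
# Hedged sketch of schemaless XML storage: every element becomes a generic
# (id, parent, tag, text) row, so new document shapes need no schema change.
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE nodes (id INTEGER PRIMARY KEY, parent INTEGER, tag TEXT, text TEXT)"
)

def store(element: ET.Element, parent: int | None = None) -> None:
    """Recursively decompose an XML element tree into generic node rows."""
    cur = conn.execute(
        "INSERT INTO nodes (parent, tag, text) VALUES (?, ?, ?)",
        (parent, element.tag, (element.text or "").strip()),
    )
    for child in element:
        store(child, cur.lastrowid)

store(ET.fromstring(
    "<risk><title>Valve failure</title><severity>high</severity></risk>"))
for row in conn.execute("SELECT * FROM nodes"):
    print(row)  # (1, None, 'risk', ''), (2, 1, 'title', 'Valve failure'), ...
```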
Exploring the Use of Enterprise Content Management Systems in Unification Types of Organizations
NASA Astrophysics Data System (ADS)
Izza Arshad, Noreen; Mehat, Mazlina; Ariff, Mohamed Imran Mohamed
2014-03-01
The aim of this paper is to better understand how highly standardized and integrated businesses, known as unification types of organizations, use Enterprise Content Management Systems (ECMS) to support their business processes. A multiple case study approach was used to study the ways two unification organizations use their ECMS in daily work practices. Arising from these case studies are insights into the differing ways in which ECMS is used to support businesses. Based on a comparison of the two cases, this study proposes that unification organizations may use ECMS in four ways: (1) for collaboration, (2) for information sharing that supports a standardized process structure, (3) for building custom workflows that support integrated and standardized processes, and (4) for providing links and access to information systems. These findings may guide organizations that are highly standardized and integrated to achieve their intended ECMS use, to understand the reasons for ECMS failures and underutilization, and to exploit technology investments.
Using a logical information model-driven design process in healthcare.
Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen
2011-01-01
A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.
ERIC Educational Resources Information Center
Arnold, Jeffery E.
2010-01-01
The purpose of this study was to determine the effect of four different design layouts of the New York State elementary science learning standards on user processing time and preference. Three newly developed layouts contained the same information as the standards core curriculum. In this study, the layout of the core guide is referred to as Book.…
Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José
2012-07-01
This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE11073, Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, simple object access protocol, extensible markup language, or business process execution language. Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.
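To give a flavor of the web-service style the paper builds on, a sketch that wraps ECG metadata in a SOAP 1.1 envelope by hand; the operation and element names inside the body are invented for illustration, not taken from the paper's architecture.

```python
# Illustrative only: build a SOAP 1.1 envelope carrying ECG metadata, to show
# the web-service exchange style; body element names are hypothetical.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_NS)

envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
ecg = ET.SubElement(body, "storeECG")            # hypothetical operation
ET.SubElement(ecg, "patientId").text = "12345"
ET.SubElement(ecg, "format").text = "SCP-ECG"    # one format the paper covers
ET.SubElement(ecg, "sampleRateHz").text = "500"

print(ET.tostring(envelope, encoding="unicode"))
```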
Health level 7 development framework for medication administration.
Kim, Hwa Sun; Cho, Hune
2009-01-01
We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture, using the Health Level 7 Development Framework process based on object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings, and a standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the Framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. It should become a fundamental conceptual model that helps healthcare professionals and nursing practitioners understand international-standard methodology, with the objective of modeling healthcare information systems.
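As a rough illustration of what a CDA-style document for this activity might contain, a simplified sketch; real CDA requires many more header elements, templates, and coded values than shown here, and the drug text below is a placeholder.

```python
# Simplified, hedged sketch of a CDA-style document recording one medication
# administration; not a conformant CDA instance.
import xml.etree.ElementTree as ET

NS = "urn:hl7-org:v3"  # HL7 v3 / CDA namespace
ET.register_namespace("", NS)

doc = ET.Element(f"{{{NS}}}ClinicalDocument")
ET.SubElement(doc, f"{{{NS}}}title").text = "Medication Administration Record"
admin = ET.SubElement(doc, f"{{{NS}}}substanceAdministration")
ET.SubElement(admin, f"{{{NS}}}effectiveTime", value="20230101T0900")
consumable = ET.SubElement(admin, f"{{{NS}}}consumable")
ET.SubElement(consumable, f"{{{NS}}}manufacturedLabeledDrug").text = "Amoxicillin 500 mg"

print(ET.tostring(doc, encoding="unicode"))
```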
7 CFR 52.24 - Where to file for an appeal inspection and information required.
Code of Federal Regulations, 2010 CFR
2010-01-01
... STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1 Regulations Governing Inspection and Certification...
Standard development at the Human Variome Project.
Smith, Timothy D; Vihinen, Mauno
2015-01-01
The Human Variome Project (HVP) is a world organization working towards facilitating the collection, curation, interpretation and free and open sharing of genetic variation information. A key component of HVP activities is the development of standards and guidelines. HVP Standards are systems, procedures and technologies that the HVP Consortium has determined must be used by HVP-affiliated data sharing infrastructure and should be used by the broader community. HVP guidelines are considered to be beneficial for HVP affiliated data sharing infrastructure and the broader community to adopt. The HVP also maintains a process for assessing systems, processes and tools that implement HVP Standards and Guidelines. Recommended System Status is an accreditation process designed to encourage the adoption of HVP Standards and Guidelines. Here, we describe the HVP standards development process and discuss the accepted standards, guidelines and recommended systems as well as those under acceptance. Certain HVP Standards and Guidelines are already widely adopted by the community and there are committed users for the others. © The Author(s) 2015. Published by Oxford University Press.
Uniform National Discharge Standards (UNDS) for Vessels of the Armed Forces
The Uniform National Discharge Standards homepage links to a description of the EPA's rulemaking process, provides information to the public on outreach efforts, and answers some frequently asked questions.
CMMI (Trademark) for Development, Version 1.2
2006-08-01
IEC TR 12207 Information Technology—Software Life Cycle Processes, 1995. http://www.jtc1-sc7.org. ISO 1998 International Organization for...We also consult other standards as needed, including the following: • ISO 9000 [ISO 1987] • ISO/IEC 12207 [ISO 1995] • ISO/IEC 15504 [ISO 2006...ISO/IEC) body of standards. CMMs focus on improving processes in an organization. They contain the essential elements of effective processes for one
Government Open Systems Interconnection Profile (GOSIP) transition strategy
NASA Astrophysics Data System (ADS)
Laxen, Mark R.
1993-09-01
This thesis analyzes the Government Open Systems Interconnection Profile (GOSIP) and the requirements of Federal Information Processing Standard (FIPS) Publication 146-1. It begins by examining the International Organization for Standardization (ISO) Open Systems Interconnection (OSI) architecture and protocol suites and the distinctions between GOSIP versions one and two. Additionally, it explores some of the GOSIP protocol details and discusses the process by which standards organizations have developed their recommendations. Implementation considerations from both government and vendor perspectives illustrate the barriers and requirements faced by information systems managers, as well as basic transition strategies. The thesis concludes that the transition must proceed through an extended and coordinated period of coexistence, owing to extensive legacy systems and the unavailability of GOSIP products. Recommendations for GOSIP protocol standards to include capabilities outside the OSI model are also presented.
ERIC Educational Resources Information Center
Bloom, Robert; And Others
A study of the processes for establishing the principles and policies of measurement and disclosure in preparing financial reports examines differences in these processes in the United States, Canada, and England. Information was drawn from international accounting literature on standard setting. The differences and similarities in the…
Code of Federal Regulations, 2010 CFR
2010-01-01
... the USPPI in the form of a power of attorney or written authorization, thus making them authorized... agencies participating in the AES postdeparture filing review process. Failure to meet the standards of the...) calendar days of receipt of the postdeparture filing application by the Census Bureau, or if a decision...
ERIC Educational Resources Information Center
Ryan, David L.
2010-01-01
While research in academic and professional information technology (IT) journals address the need for strategic alignment and defined IT processes, there is little research about what factors should be considered when implementing specific IT hardware standards in an organization. The purpose of this study was to develop a set of factors for…
NASA Operational Environment Team (NOET) - NASA's key to environmental technology
NASA Technical Reports Server (NTRS)
Cook, Beth
1993-01-01
NOET is a NASA-wide team which supports the research and development community by sharing information both in person and via a computerized network, assisting in specification and standard revisions, developing cleaner propulsion systems, and exploring environmentally compliant alternatives to current processes. NOET's structure, dissemination of materials, electronic information, EPA compliance, specifications and standards, and environmental research and development are discussed.
Application of the Near Miss Strategy and Edit Distance to Handle Dirty Data
NASA Astrophysics Data System (ADS)
Varol, Cihan; Bayrak, Coskun; Wagner, Rick; Goff, Dana
In today’s information age, processing customer information in a standardized and accurate manner is known to be a difficult task. Data collection methods vary from source to source by format, volume, and media type. Therefore, it is advantageous to deploy customized data hygiene techniques to standardize the data for meaningfulness and usefulness based on the organization.
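To make the edit-distance component concrete, a minimal sketch; the candidate list and the near-miss distance threshold are illustrative assumptions.

```python
# Minimal sketch of the edit-distance idea behind near-miss matching.
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance (insert/delete/substitute)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

def nearest_standard_form(dirty: str, standards: list[str], max_dist: int = 2):
    """Return the closest standardized value, or None if no near miss."""
    best = min(standards, key=lambda s: levenshtein(dirty.lower(), s.lower()))
    return best if levenshtein(dirty.lower(), best.lower()) <= max_dist else None

print(nearest_standard_form("Sprngfield", ["Springfield", "Little Rock"]))
# -> Springfield (one deletion away, within the near-miss threshold)
```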
41 CFR 102-118.35 - What definitions apply to this part?
Code of Federal Regulations, 2010 CFR
2010-07-01
... published formats and codes as authorized by the applicable Federal Information Processing Standards... techniques for carrying out transportation transactions using electronic transmissions of the information...
Increasing patient safety and efficiency in transfusion therapy using formal process definitions.
Henneman, Elizabeth A; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Andrzejewski, Chester; Merrigan, Karen; Cobleigh, Rachel; Frederick, Kimberly; Katz-Bassett, Ethan; Henneman, Philip L
2007-01-01
The administration of blood products is a common, resource-intensive, and potentially problem-prone area that may place patients at elevated risk in the clinical setting. Much of the emphasis in transfusion safety has been targeted toward quality control measures in laboratory settings where blood products are prepared for administration as well as in automation of certain laboratory processes. In contrast, the process of transfusing blood in the clinical setting (ie, at the point of care) has essentially remained unchanged over the past several decades. Many of the currently available methods for improving the quality and safety of blood transfusions in the clinical setting rely on informal process descriptions, such as flow charts and medical algorithms, to describe medical processes. These informal descriptions, although useful in presenting an overview of standard processes, can be ambiguous or incomplete. For example, they often describe only the standard process and leave out how to handle possible failures or exceptions. One alternative to these informal descriptions is to use formal process definitions, which can serve as the basis for a variety of analyses because these formal definitions offer precision in the representation of all possible ways that a process can be carried out in both standard and exceptional situations. Formal process definitions have not previously been used to describe and improve medical processes. The use of such formal definitions to prospectively identify potential error and improve the transfusion process has not previously been reported. The purpose of this article is to introduce the concept of formally defining processes and to describe how formal definitions of blood transfusion processes can be used to detect and correct transfusion process errors in ways not currently possible using existing quality improvement methods.
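To make the contrast with informal flowcharts concrete, here is a hedged sketch, not the authors' formalism (which is a dedicated process-definition language), of a process step whose exception path is an explicit, analyzable branch rather than an omission.

```python
# Hedged sketch: encode a transfusion step so that the exception path
# (e.g., an ID mismatch) is explicit and analyzable, which is the property
# formal process definitions add over flowcharts. Step names are invented.
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    on_success: str | None    # next step in the standard path
    on_exception: str | None  # explicit handler, never left implicit

PROCESS = {
    "verify_patient_id": Step("verify_patient_id", "check_blood_unit", "stop_and_notify"),
    "check_blood_unit": Step("check_blood_unit", "administer", "stop_and_notify"),
    "administer": Step("administer", None, "stop_and_notify"),
    "stop_and_notify": Step("stop_and_notify", None, None),
}

def run(start: str, outcomes: dict[str, bool]) -> list[str]:
    """Walk the process; 'outcomes' simulates success/failure of each step."""
    trace, current = [], start
    while current is not None:
        trace.append(current)
        step = PROCESS[current]
        current = step.on_success if outcomes.get(current, True) else step.on_exception
    return trace

print(run("verify_patient_id", {"check_blood_unit": False}))
# ['verify_patient_id', 'check_blood_unit', 'stop_and_notify']
```

Because every step must name its exception handler, an analyzer can verify mechanically that no failure is left unhandled, which is not possible with an informal flow chart.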
This December 6, 2016 letter from EPA provides information about the petition submission process for small refineries seeking exemptions from their 2016 RFS obligations, including financial and other information necessary for evaluation.
Taylor, H E; Bramley, D E P
2012-11-01
The provision of written information is a component of the informed consent process for research participants. We conducted a readability analysis to test the hypothesis that the language used in patient information and consent forms in anaesthesia research in Australia and New Zealand does not meet the readability standards or expectations of the Good Clinical Practice Guidelines, the National Health and Medical Research Council in Australia and the Health Research Council of New Zealand. We calculated readability scores for 40 patient information and consent forms using the Simple Measure of Gobbledygook and Flesch-Kincaid formulas. The mean grade level of patient information and consent forms when using the Simple Measure of Gobbledygook and Flesch-Kincaid readability formulas was 12.9 (standard deviation of 0.8, 95% confidence interval 12.6 to 13.1) and 11.9 (standard deviation 1.1, 95% confidence interval 11.6 to 12.3), respectively. This exceeds the average literacy and comprehension of the general population in Australia and New Zealand. Complex language decreases readability and negatively impacts on the informed consent process. Care should be exercised when providing written information to research participants to ensure language and readability is appropriate for the audience.
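The two published formulas used in the study can be computed as follows; the syllable counter here is a crude vowel-group heuristic and an assumption of this sketch (validated readability tools use dictionary-based syllabification).

```python
# Flesch-Kincaid grade and SMOG grade, as used in the readability analysis.
import math
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count vowel groups, minimum one syllable per word.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> tuple[float, float]:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
    # Published formulas:
    fk = 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59
    smog = 1.0430 * math.sqrt(polysyllables * 30 / sentences) + 3.1291
    return fk, smog

fk, smog = readability("You are invited to participate in a randomized study.")
print(f"Flesch-Kincaid grade {fk:.1f}, SMOG grade {smog:.1f}")
```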
Specifications of Standards in Systems and Synthetic Biology.
Schreiber, Falk; Bader, Gary D; Golebiewski, Martin; Hucka, Michael; Kormeier, Benjamin; Le Novère, Nicolas; Myers, Chris; Nickerson, David; Sommer, Björn; Waltemath, Dagmar; Weise, Stephan
2015-09-04
Standards shape our everyday life. From nuts and bolts to electronic devices and technological processes, standardised products and processes are all around us. Standards have technological and economic benefits, such as making information exchange, production, and services more efficient. However, novel, innovative areas often either lack proper standards, or documents about standards in these areas are not available from a centralised platform or formal body (such as the International Standardisation Organisation). Systems and synthetic biology is a relatively novel area, and it is only in the last decade that the standardisation of data, information, and models related to systems and synthetic biology has become a community-wide effort. Several open standards have been established and are under continuous development as a community initiative. COMBINE, the ‘COmputational Modeling in BIology’ NEtwork, has been established as an umbrella initiative to coordinate and promote the development of the various community standards and formats for computational models. There are two yearly meetings: HARMONY (Hackathons on Resources for Modeling in Biology), hackathon-type meetings with a focus on developing support for the standards, and COMBINE forums, workshop-style events with oral presentations, discussions, posters, and breakout sessions for further developing the standards. For more information see http://co.mbine.org/. So far, the different standards have been published and made accessible through the standards' webpages or preprint services. The aim of this special issue is to provide a single, easily accessible, and citable platform for the publication of standards in systems and synthetic biology. This special issue is intended to serve as a central access point to standards and related initiatives in systems and synthetic biology, and it will be published annually to provide an opportunity for standard development groups to communicate updated specifications.
Lee, Sungkee; Do, Hyoungho
2018-01-01
Increasing use of medical devices outside of healthcare facilities inevitably requires connectivity and interoperability between medical devices and healthcare information systems. To this end, standards have been developed and used to provide interoperability between personal health devices (PHDs) and external systems. ISO/IEEE 11073 standards and IHE PCD-01 standard messages have been used the most in the exchange of observation data from health devices. Recently, transmitting observation data using the HL7 FHIR standard has been devised under the name DoF (Devices on FHIR) and adopted very quickly. We compare and analyze these standards and suggest which standard will work best in different environments of device usage. We generated each message/resource of the three standards for vital signs observed from a blood pressure monitor and a thermometer; the size, contents, and exchange processes of these messages were then compared and analyzed. The ISO/IEEE 11073 standard message has the smallest data size, but it cannot carry the key information, patient information. PCD-01 messages and FHIR resources, on the other hand, have fields for patient information. HL7 DoF provides reuse of information units known as resources, and it is relatively easy to parse DoF messages since they use the widely known XML and JSON formats. ISO/IEEE 11073 standards are suitable for devices with very small computing power. IHE PCD-01 and HL7 DoF messages can be used for devices that need to be connected to hospital information systems requiring patient information. When information reuse is frequent, DoF is advantageous over PCD-01.
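For a feel of the FHIR side of the comparison, a minimal sketch of an Observation resource as JSON; the identifiers and values are fabricated, and real DoF profiles constrain far more (device references, categories, profiles).

```python
# Minimal, hedged sketch of a FHIR Observation for a body-temperature reading.
import json

observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "8310-5",            # LOINC: body temperature
            "display": "Body temperature",
        }]
    },
    "subject": {"reference": "Patient/example"},  # the field 11073 lacks
    "valueQuantity": {
        "value": 36.8,
        "unit": "Cel",
        "system": "http://unitsofmeasure.org",
        "code": "Cel",
    },
}

print(json.dumps(observation, indent=2))
```

The explicit `subject` reference illustrates the trade-off the study reports: FHIR and PCD-01 carry patient context at the cost of larger messages than bare ISO/IEEE 11073 frames.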
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-08
... FIPS 201-2 AGENCY: National Institute of Standards and Technology (NIST), Commerce. ACTION: Notice and request for comments. SUMMARY: The National Institute of Standards and Technology (NIST) publishes this... 201-1, and specific changes requested by Federal agencies and implementers. NIST has received numerous...
The Government Information Locator Service (GILS)
Christian, E.
1996-01-01
In coordination with the Information Infrastructure Task Force (IITF), the Office of Management and Budget (OMB) is promoting the establishment of an agency-based Government Information Locator Service (GILS) to help the public locate and access information throughout the Federal Government. This report presents a vision of how GILS will be implemented. Working primarily with OMB and the Locator Subgroup of the Interagency Working Group on Public Access, Eliot Christian of the US Geological Survey prepared this report under the auspices of the IITF Committee on Information Policy. This vision of GILS has also received extensive review by various Federal agencies and other interested parties, including some non-Federal organizations and by the general public through notices in both the Federal Register and the Commerce Business Daily and at a public meeting held in December, 1993. As part of the Federal role in the National Information Infrastructure, GILS will identify and describe information resources throughout the Federal government, and provide assistance in obtaining the information. It will be decentralized and will supplement other agency and commercial information dissemination mechanisms. The public will use GILS directly or through intermediaries, such as the Government Printing Office, the National Technical Information Service, the Federal depository libraries, other public libraries, and private sector information services. Direct users will have access to a GILS Core accessible on the Internet without charge. Intermediate access may include kiosks, "800 numbers", electronic mail, bulletin boards, fax, and off-line media such as floppy disks, CD-ROM, and printed works. GILS will use standard network technology and the American National Standards Institute Z39.50 standard for information search and retrieval so that information can be retrieved in a variety of ways. Direct users will eventually have access to many other Federal and non-Federal information resources, linkages to data systems, and electronic delivery of information products. Development of this report proceeded in tandem with a GILS Profile development project that produced an Implementors Agreement in the voluntary standards process. The National Institute of Standards and Technology is now establishing a Federal Information Processing Standard referencing the GILS Profile Implementors Agreement and making mandatory its application for Federal agencies establishing locators for government information. Existing law and policy, as articulated in OMB Circular A-130, the Records Disposal Act, and the Freedom of Information Act, require agencies to create and maintain an inventory of their information systems and information dissemination products. Although compliance with these requirements varies greatly, the incremental cost of making those inventories accessible through GILS is expected to be minimal. Accordingly, participation in establishing and maintaining GILS may be accomplished as a collective effort executed within existing funds and authorities. OMB will publish in 1994 a Bulletin following on Circular A-130 that will specify agency responsibilities in GILS and set implementation schedules. A process for ongoing evaluation will also be established to evaluate the degree to which GILS meets the information needs of the public.
NWR SAME: FIPS (Federal Information Processing Standards) code changes and/or SAME location code changes.
Proposal for a security management in cloud computing for health care.
Haufe, Knut; Dzombeta, Srdan; Brandis, Knud
2014-01-01
Cloud computing is currently one of the most popular topics of information systems research. Considering the nature of the information processed, health care organizations in particular need to assess and treat the specific risks of cloud computing in their information security management system (ISMS). Therefore, in this paper we propose a framework that includes the most important security processes regarding cloud computing in the health care sector. Starting with a framework of general information security management processes derived from standards of the ISO 27000 family, the most important information security processes for health care organizations using cloud computing are identified, considering the main risks of cloud computing and the type of information processed. The identified processes will help a health care organization using cloud computing to focus on the most important ISMS processes and to establish and operate them at an appropriate level of maturity given limited resources.
Using a Multimedia Presentation to Enhance Informed Consent in a Pediatric Emergency Department.
Spencer, Sandra P; Stoner, Michael J; Kelleher, Kelly; Cohen, Daniel M
2015-08-01
Informed consent is an ethical process for ensuring patient autonomy. Multimedia presentations (MMPs) often aid the informed consent process for research studies. Thus, it follows that MMPs would improve informed consent in clinical settings. The aim of this study was to determine if an MMP for the informed consent process for ketamine sedation improves parental satisfaction and comprehension as compared with standard practice. This 2-phase study compared 2 methods of informed consent for ketamine sedation of pediatric patients. Phase 1 was a randomized, prospective study that compared the standard verbal consent to an MMP. Phase 2 implemented the MMP into daily work flow to validate the previous year's results. Parents completed a survey evaluating their satisfaction of the informed consent process and assessing their knowledge of ketamine sedation. Primary outcome measures were parental overall satisfaction with the informed consent process and knowledge of ketamine sedation. One hundred eighty-four families from a free-standing, urban, tertiary pediatric emergency department with over 85,000 annual visits were enrolled. Different demographics were not associated with a preference for the MMP or improved scores on the content quiz. Intervention families were more likely "to feel involved in the decision to use ketamine" and to understand that "they had the right to refuse the ketamine" as compared with control families. The intervention group scored significantly higher overall on the content section than the control group. Implementation and intervention families responded similarly to all survey sections. Multimedia presentation improves parental understanding of ketamine sedation, whereas parental satisfaction with the informed consent process remains unchanged. Use of MMP in the emergency department for informed consent shows potential for both patients and providers.
NIST: Information Management in the AMRF
NASA Technical Reports Server (NTRS)
Callaghan, George (Editor)
1991-01-01
The information management strategies developed for the NIST Automated Manufacturing Research Facility (AMRF), a prototype small-batch manufacturing facility used for integration and measurement-related standards research, are outlined in this video. The five major manufacturing functions (design, process planning, off-line programming, shop floor control, and materials processing) are explained and their applications demonstrated.
Brown, Jesslyn; Howard, Daniel M.; Wylie, Bruce K.; Friesz, Aaron M.; Ji, Lei; Gacke, Carolyn
2015-01-01
Monitoring systems benefit from high temporal frequency image data collected from the Moderate Resolution Imaging Spectroradiometer (MODIS) system. Because of near-daily global coverage, MODIS data are beneficial to applications that require timely information about vegetation condition related to drought, flooding, or fire danger. Rapid satellite data streams in operational applications have clear benefits for monitoring vegetation, especially when information can be delivered as fast as changing surface conditions. An “expedited” processing system called “eMODIS” operated by the U.S. Geological Survey provides rapid MODIS surface reflectance data to operational applications in less than 24 h offering tailored, consistently-processed information products that complement standard MODIS products. We assessed eMODIS quality and consistency by comparing to standard MODIS data. Only land data with known high quality were analyzed in a central U.S. study area. When compared to standard MODIS (MOD/MYD09Q1), the eMODIS Normalized Difference Vegetation Index (NDVI) maintained a strong, significant relationship to standard MODIS NDVI, whether from morning (Terra) or afternoon (Aqua) orbits. The Aqua eMODIS data were more prone to noise than the Terra data, likely due to differences in the internal cloud mask used in MOD/MYD09Q1 or compositing rules. Post-processing temporal smoothing decreased noise in eMODIS data.
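The index compared throughout the study is straightforward to compute; a small sketch with fabricated reflectance values standing in for the MODIS 250 m red and near-infrared bands.

```python
# NDVI = (NIR - Red) / (NIR + Red), the vegetation index compared between
# eMODIS and standard MODIS products; the arrays below are fabricated.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Return NDVI in [-1, 1] from red and near-infrared reflectance."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-9)  # guard zero division

red = np.array([[0.05, 0.10], [0.08, 0.30]])
nir = np.array([[0.40, 0.45], [0.35, 0.32]])
print(ndvi(red, nir))  # dense vegetation approaches 1; bare soil near 0
```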
ERIC Educational Resources Information Center
Pappas, Marjorie L.
2003-01-01
Explains how to develop lesson plans to help students become effective researchers using electronic searching tools. Uses a unit developed for Kansas landmarks to discuss information skills, competency standards, inquiry, technology use, information literacy and process skills, finding information, and an example of a research log. (LRW)
MO/DSD online information server and global information repository access
NASA Technical Reports Server (NTRS)
Nguyen, Diem; Ghaffarian, Kam; Hogie, Keith; Mackey, William
1994-01-01
Often in the past, standards and new technology information have been available only in hardcopy form, with reproduction and mailing costs proving rather significant. In light of NASA's current budget constraints and in the interest of efficient communications, the Mission Operations and Data Systems Directorate (MO&DSD) New Technology and Data Standards Office recognizes the need for an online information server (OLIS). This server would allow: (1) dissemination of standards and new technology information throughout the Directorate more quickly and economically; (2) online browsing and retrieval of documents that have been published for and by MO&DSD; and (3) searching for current and past study activities on related topics within NASA before issuing a task. This paper explores a variety of available information servers and searching tools, their current capabilities and limitations, and the application of these tools to MO&DSD. Most importantly, the discussion focuses on the way this concept could be easily applied toward improving dissemination of standards and new technologies and improving documentation processes.
Open Standards in Practice: An OGC China Forum Initiative
NASA Astrophysics Data System (ADS)
Yue, Peng; Zhang, Mingda; Taylor, Trevor; Xie, Jibo; Zhang, Hongping; Tong, Xiaochong; Yu, Jinsongdi; Huang, Juntao
2016-11-01
Open standards like OGC standards can be used to improve interoperability and support machine-to-machine interaction over the Web. In the Big Data era, standard-based data and processing services from various vendors can be combined to automate the extraction of information and knowledge from heterogeneous and large volumes of geospatial data. This paper introduces an ongoing OGC China forum initiative, which will demonstrate how OGC standards can benefit the interaction among multiple organizations in China. The ability to share data and processing functions across organizations using standard services could change traditional manual interactions in their business processes and provide on-demand decision support results through online service integration. In the initiative, six organizations are involved in two "MashUp" scenarios on disaster management: one derives flood maps for Poyang Lake, Jiangxi, and the other generates turbidity maps on demand for East Lake, Wuhan. The two scenarios engage different organizations from the Chinese community by integrating their sensor observations, data, and processing services, and improve the automation of the data analysis process using open standards.
A model-driven approach to information security compliance
NASA Astrophysics Data System (ADS)
Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena
2017-06-01
The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be approached holistically, combining assets that support corporate systems in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the basis for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.
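As a rough illustration of the first transformation step (domain model to platform independent model), the sketch below encodes a toy security vocabulary and mechanically derives class-like structures from it. The concepts, attributes, and rule are invented for illustration and are far simpler than the paper's UML-based models:

    # Toy computation independent model (CIM): ISO/IEC 27001-style vocabulary.
    # The terms, attributes, and rule below are illustrative assumptions.
    CIM = {
        "Asset":   {"attrs": ["owner", "classification"]},
        "Control": {"attrs": ["objective", "status"],
                    "rule": "every Control must reference at least one Asset"},
    }

    def derive_pim(cim):
        """Derive a platform independent model (PIM): one class per domain
        concept, with conformance rules carried along as constraints."""
        pim = {}
        for concept, spec in cim.items():
            pim[concept] = {
                "fields": {attr: "str" for attr in spec["attrs"]},
                "constraints": [spec["rule"]] if "rule" in spec else [],
            }
        return pim

    for name, cls in derive_pim(CIM).items():
        print(name, cls)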
ERIC Educational Resources Information Center
Klein, Gary M.
1994-01-01
Online public access catalogs from 67 libraries using NOTIS software were searched using Internet connections to determine the positional operators selected as the default keyword operator on each catalog. Results indicate the lack of a processing standard for keyword searches. Five tables provide information. (Author/AEF)
Implementing PAT with Standards
NASA Astrophysics Data System (ADS)
Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.
2016-02-01
Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent in the inconsistent representation of business processes, and the interoperability issues, that arise in cap-and-trade mechanisms like PAT, especially when scaled. Studies by various agencies have highlighted that as the mechanism evolves to include more industrial sectors and industries in its ambit, implementation will become more challenging. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing and verifying energy-saving reports, and providing technical support and guidance to stakeholders) and how the aforesaid challenges affect them. Though current technologies can handle these challenges to an extent, standardization activities for PAT implementation have been scanty, and this work attempts to evolve them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems, are addressed. This paper proposes the adoption of two standards into PAT: Business Process Model and Notation for maintaining consistency in business process modelling, and the Common Information Model (IEC 61970, 61968, and 62325 combined) for information exchange. Detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.
Riley, William; Begun, James W; Meredith, Les; Miller, Kristi K; Connolly, Kathy; Price, Rebecca; Muri, Janet H; McCullough, Mac; Davis, Stanley
2016-12-01
To improve safety practices and reduce adverse events in perinatal units of acute care hospitals. Primary data collected from perinatal units of 14 hospitals participating in the intervention between 2008 and 2012. Baseline secondary data collected from the same hospitals between 2006 and 2007. A prospective study involving 342,754 deliveries was conducted using a quality improvement collaborative that supported three primary interventions. Primary measures include adoption of three standardized care processes and four measures of outcomes. Chart audits were conducted to measure the implementation of standardized care processes. Outcome measures were collected and validated by the National Perinatal Information Center. The hospital perinatal units increased use of all three care processes, raising consolidated overall use from 38 to 81 percent between 2008 and 2012. The harms measured by the Adverse Outcome Index decreased 14 percent, and a run chart analysis revealed two special causes associated with the interventions. This study demonstrates the ability of hospital perinatal staff to implement efforts to reduce perinatal harm using a quality improvement collaborative. Findings help inform the relationship between the use of standardized care processes, teamwork training, and improved perinatal outcomes, and suggest that a multiplicity of integrated strategies, rather than a single intervention, may be essential to achieve high reliability. © Health Research and Educational Trust.
Kreimeyer, Kory; Foster, Matthew; Pandey, Abhishek; Arya, Nina; Halford, Gwendolyn; Jones, Sandra F; Forshee, Richard; Walderhaug, Mark; Botsis, Taxiarchis
2017-09-01
We followed a systematic approach based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses to identify existing clinical natural language processing (NLP) systems that generate structured information from unstructured free text. Seven literature databases were searched with a query combining the concepts of natural language processing and structured data capture. Two reviewers screened all records for relevance during two screening phases, and information about clinical NLP systems was collected from the final set of papers. A total of 7149 records (after removing duplicates) were retrieved and screened, and 86 were determined to fit the review criteria. These papers contained information about 71 different clinical NLP systems, which were then analyzed. The NLP systems address a wide variety of important clinical and research tasks. Certain tasks are well addressed by the existing systems, while others remain as open challenges that only a small number of systems attempt, such as extraction of temporal information or normalization of concepts to standard terminologies. This review has identified many NLP systems capable of processing clinical free text and generating structured output, and the information collected and evaluated here will be important for prioritizing development of new approaches for clinical NLP. Copyright © 2017 Elsevier Inc. All rights reserved.
45 CFR 170.504 - Reconsideration process for requests for ONC-AA status.
Code of Federal Regulations, 2011 CFR
2011-10-01
... status. 170.504 Section 170.504 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY Permanent Certification Program for HIT...
45 CFR 170.504 - Reconsideration process for requests for ONC-AA status.
Code of Federal Regulations, 2012 CFR
2012-10-01
... status. 170.504 Section 170.504 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY Permanent Certification Program for HIT...
75 FR 52701 - Approval and Promulgation of Implementation Plans; State of Missouri
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-27
... information claimed to be Confidential Business Information (CBI) or other information whose disclosure is.... Ventilation Limits 5. Ongoing Ventilation Testing and Reporting Requirements 6. Winter Construction Work..., including building enclosure and ventilation projects, implementation of work practice standards, process...
Saghaeiannejad-Isfahani, Sakineh; Mirzaeian, Razieh; Jannesari, Hasan; Ehteshami, Asghar; Feizi, Awat; Raeisi, Ahmadreza
2014-01-01
Supporting a therapeutic approach and medication therapy management, the pharmacy information system (PIS) acts as one of the pillars of the hospital information system, ensuring that medication therapy is supported with an optimal level of safety and quality, similar to other treatments and services. The present study is an applied, cross-sectional study conducted on the PIS in use in selected hospitals. The research population included all users of the PIS, and the research sample was the same as the research population. The data collection instrument was a self-designed checklist developed from the guidelines of the American Society of Health-System Pharmacists, the Pharmaceutical Society of Australia, and the therapeutic guidelines of the Drug Commission of the German Medical Association. The checklist validity was assessed by research supervisors, PIS users, and pharmacists. The findings of this study revealed that, regarding the degree of compliance with the standards given in these guidelines, the highest rank in observing input standards belonged to Social Services hospitals, with a mean score of 32.75. Although teaching hospitals gained the highest scores in both process standards (mean 29.15) and output standards (mean 43.95), the private hospitals had the lowest mean scores of 23.32, 17.78, and 24.25 in input, process, and output standards, respectively. Based on the findings, it can be claimed that the studied hospitals had minimal compliance with the input, output, and processing standards related to the PIS.
Milner, Rafał; Rusiniak, Mateusz; Lewandowska, Monika; Wolak, Tomasz; Ganc, Małgorzata; Piątkowska-Janko, Ewa; Bogorodzki, Piotr; Skarżyński, Henryk
2014-01-01
Background: The neural underpinnings of auditory information processing have often been investigated using the odd-ball paradigm, in which infrequent sounds (deviants) are presented within a regular train of frequent stimuli (standards). Traditionally, this paradigm has been applied using either high temporal resolution (EEG) or high spatial resolution (fMRI, PET). However, used separately, these techniques cannot provide information on both the location and time course of particular neural processes. The goal of this study was to investigate the neural correlates of auditory processes with fine spatio-temporal resolution. A simultaneous auditory evoked potentials (AEP) and functional magnetic resonance imaging (fMRI) technique (AEP-fMRI), together with an odd-ball paradigm, was used. Material/Methods: Six healthy volunteers, aged 20–35 years, participated in an odd-ball simultaneous AEP-fMRI experiment. AEPs in response to acoustic stimuli were used to model bioelectric intracerebral generators, and electrophysiological results were integrated with fMRI data. Results: fMRI activation evoked by standard stimuli was found to occur mainly in the primary auditory cortex. Activity in these regions overlapped with intracerebral bioelectric sources (dipoles) of the N1 component. Dipoles of the N1/P2 complex in response to standard stimuli were also found in the auditory pathway between the thalamus and the auditory cortex. Deviant stimuli induced fMRI activity in the anterior cingulate gyrus, insula, and parietal lobes. Conclusions: The present study showed that neural processes evoked by standard stimuli occur predominantly in subcortical and cortical structures of the auditory pathway. Deviants activate areas non-specific for auditory information processing. PMID:24413019
Construct Maps as a Foundation for Standard Setting
ERIC Educational Resources Information Center
Wyse, Adam E.
2013-01-01
Construct maps are tools that display how the underlying achievement construct upon which one is trying to set cut-scores is related to other information used in the process of standard setting. This article reviews what construct maps are, uses construct maps to provide a conceptual framework to view commonly used standard-setting procedures (the…
LANDSAT: Non-US standard catalog no. N-33
NASA Technical Reports Server (NTRS)
1975-01-01
A catalog used for dissemination of information regarding the availability of LANDSAT imagery is presented. The Image Processing Facility of the Goddard Space Flight Center publishes a U.S. and a Non-U.S. Standard Catalog on a monthly schedule; the catalogs identify imagery which has been processed and input to the data files during the referenced month. The U.S. Standard Catalog includes imagery covering the continental United States, Alaska and Hawaii; the Non-U.S. Catalog identifies all the remaining coverage. Imagery adjacent to the continental U.S. and Alaska borders is included in the U.S. Standard Catalog.
Another HISA--the new standard: health informatics--service architecture.
Klein, Gunnar O; Sottile, Pier Angelo; Endsleff, Frederik
2007-01-01
In addition to its meaning as the Health Informatics Society of Australia, HISA is the acronym used for the new European Standard: Health Informatics - Service Architecture. This EN 12967 standard has been developed by CEN, the federation of 29 national standards bodies in Europe. The standard defines the essential elements of a Service Oriented Architecture and a methodology for localization particularly useful for large healthcare organizations. It is based on the Open Distributed Processing (ODP) framework from ISO 10746 and contains the following parts: Part 1: Enterprise viewpoint; Part 2: Information viewpoint; Part 3: Computational viewpoint. This standard is now also the starting point for consideration of an International Standard in ISO/TC 215. The basic principles, with a set of health-specific middleware services as a common platform for various applications for regional health information systems or large integrated hospital information systems, are well established following a previous prestandard. Examples of large-scale deployments in Sweden, Denmark and Italy are described.
45 CFR 164.308 - Administrative safeguards.
Code of Federal Regulations, 2013 CFR
2013-10-01
... REQUIREMENTS SECURITY AND PRIVACY Security Standards for the Protection of Electronic Protected Health... accordance with § 164.306: (1)(i) Standard: Security management process. Implement policies and procedures to... to the confidentiality, integrity, and availability of electronic protected health information held...
45 CFR 164.308 - Administrative safeguards.
Code of Federal Regulations, 2014 CFR
2014-10-01
... REQUIREMENTS SECURITY AND PRIVACY Security Standards for the Protection of Electronic Protected Health... accordance with § 164.306: (1)(i) Standard: Security management process. Implement policies and procedures to... to the confidentiality, integrity, and availability of electronic protected health information held...
Problems of Automation and Management Principles Information Flow in Manufacturing
NASA Astrophysics Data System (ADS)
Grigoryuk, E. N.; Bulkin, V. V.
2017-07-01
Automated process control systems are complex systems characterized by elements with a common purpose, by the systemic nature of the algorithms implemented for the exchange and processing of information, and by a large number of functional subsystems. The article gives examples of automatic control systems and automated process control systems, draws parallels between them by identifying their strengths and weaknesses, and proposes a non-standard process control system.
A data types profile suitable for use with ISO EN 13606.
Sun, Shanghua; Austin, Tony; Kalra, Dipak
2012-12-01
ISO EN 13606 is a five-part International Standard specifying how Electronic Healthcare Record (EHR) information should be communicated between different EHR systems and repositories. Part 1 of the standard defines an information model for representing the EHR information itself, including the representation of types of data value. A later International Standard, ISO 21090:2010, defines a comprehensive set of models for data types needed by all health IT systems. This latter standard is vast and duplicates some of the functions already handled by ISO EN 13606 Part 1. A profile (sub-set) of ISO 21090 would therefore be expected to provide EHR system vendors with a more specifically tailored set of data types to implement, and to avoid the risk of providing more than one modelling option for representing the same information properties. This paper describes the process and design decisions made in developing a data types profile for EHR interoperability.
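A profile of this kind keeps a small, implementable core of each data type. As a purely illustrative sketch, the class below trims a coded-value type down to three fields; the field selection is an assumption for illustration, not the profile actually defined in the paper:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class CD:
        """A deliberately small 'concept descriptor' (coded value), in the
        spirit of profiling a large data type down to what an EHR extract
        needs. The exact field selection here is an illustrative assumption."""
        code: str
        code_system: str                  # OID identifying the terminology
        display_name: Optional[str] = None
        # Dropped in this toy profile: translations, qualifiers,
        # code system version, original text, and similar extras.

    # The code value is made up; the OID is SNOMED CT's.
    finding = CD(code="12345", code_system="2.16.840.1.113883.6.96",
                 display_name="example finding")
    print(finding)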
Applying Use Cases to Describe the Role of Standards in e-Health Information Systems
NASA Astrophysics Data System (ADS)
Chávez, Emma; Finnie, Gavin; Krishnan, Padmanabhan
Individual health records (IHRs) contain a person's lifetime records of their key health history and care within a health system (National E-Health Transition Authority, Retrieved Jan 12, 2009 from http://www.nehta.gov.au/coordinated-care/whats-in-iehr, 2004). This information can be processed and stored in different ways. The record should be available electronically to authorized health care providers and the individual anywhere, anytime, to support high-quality care. Many organizations provide a diversity of solutions for e-health and its services. Standards play an important role to enable these organizations to support information interchange and improve efficiency of health care delivery. However, there are numerous standards to choose from and not all of them are accessible to the software developer. This chapter proposes a framework to describe the e-health standards that can be used by software engineers to implement e-health information systems.
Tombaugh, Tom N; Rees, Laura; Stormer, Peter; Harrison, Allyson G; Smith, Andra
2007-01-01
In spite of the fact that reaction time (RT) measures are sensitive to the effects of traumatic brain injury (TBI), few RT procedures have been developed for use in standard clinical evaluations. The computerized test of information processing (CTIP) [Tombaugh, T. N., & Rees, L. (2000). Manual for the computerized tests of information processing (CTIP). Ottawa, Ont.: Carleton University] was designed to measure the degree to which TBI decreases the speed at which information is processed. The CTIP consists of three computerized programs that progressively increase the amount of information that is processed. Results of the current study demonstrated that RT increased as the difficulty of the CTIP tests increased (known as the complexity effect), and as severity of injury increased (from mild to severe TBI). The current study also demonstrated the importance of selecting a non-biased measure of variability. Overall, findings suggest that the CTIP is an easy to administer and sensitive measure of information processing speed.
45 CFR 170.504 - Reconsideration process for requests for ONC-AA status.
Code of Federal Regulations, 2014 CFR
2014-10-01
... status. 170.504 Section 170.504 Public Welfare Department of Health and Human Services HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY ONC HIT Certification Program § 170.504...
45 CFR 170.504 - Reconsideration process for requests for ONC-AA status.
Code of Federal Regulations, 2013 CFR
2013-10-01
... status. 170.504 Section 170.504 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY ONC HIT Certification Program § 170.504...
The Swedish strategy and method for development of a national healthcare information architecture.
Rosenälv, Jessica; Lundell, Karl-Henrik
2012-01-01
"We need a precise framework of regulations in order to maintain appropriate and structured health care documentation that ensures that the information maintains a sufficient level of quality to be used in treatment, in research and by the actual patient. The users shall be aided by clearly and uniformly defined terms and concepts, and there should be an information structure that clarifies what to document and how to make the information more useful. Most of all, we need to standardize the information, not just the technical systems." (eHälsa - nytta och näring, Riksdag report 2011/12:RFR5, p. 37). In 2010, the Swedish Government adopted the National e-Health - the national strategy for accessible and secure information in healthcare. The strategy is a revision and extension of the previous strategy from 2006, which was used as input for the most recent efforts to develop a national information structure utilizing business-oriented generic models. A national decision on healthcare informatics standards was made by the Swedish County Councils, which decided to follow and use EN/ISO 13606 as a standard for the development of a universally applicable information structure, including archetypes and templates. The overall aim of the Swedish strategy for development of National Healthcare Information Architecture is to achieve high level semantic interoperability for clinical content and clinical contexts. High level semantic interoperability requires consistently structured clinical data and other types of data with coherent traceability to be mapped to reference clinical models. Archetypes that are formal definitions of the clinical and demographic concepts and some administrative data were developed. Each archetype describes the information structure and content of overarching core clinical concepts. Information that is defined in archetypes should be used for different purposes. Generic clinical process model was made concrete and analyzed. For each decision-making step in the process where information is processed, the amount and type of information and its structure were defined in terms of reference templates. Reference templates manage clinical, administrative and demographic types of information in a specific clinical context. Based on a survey of clinical processes at the reference level, the identification of specific clinical processes such as diabetes and congestive heart failure in adults were made. Process-specific templates were defined by using reference templates and populated with information that was relevant to each health problem in a specific clinical context. Throughout this process, medical data for knowledge management were collected for each health problem. Parallel with the efforts to define archetypes and templates, terminology binding work is on-going. Different strategies are used depending on the terminology binding level.
40 CFR 63.7570 - Who implements and enforces this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Industrial, Commercial, and Institutional Boilers and Process Heaters Other Requirements and Information § 63.7570 Who implements and...
5 CFR 293.103 - Recordkeeping standards.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 293.103 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PERSONNEL RECORDS Basic Policies on Maintenance of Personnel Records § 293.103 Recordkeeping standards. (a) The head..., processing, use, or maintenance of personnel records are informed of pertinent recordkeeping regulations and...
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method, and suggest the necessary tooling, for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered as such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIM specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages, such as tooling support, graphical notation, exchangeability, extensibility, and semi-automatic code generation. The approach presented is also applicable for harmonizing different standard specifications.
RMP Guidance for Chemical Distributors - Appendix D: OSHA Guidance on PSM
Guidance on the Process Safety Management standard says that information (including MSDSs) about chemicals, including process intermediates, must enable accurate assessment of fire and explosion characteristics, reactivity hazards, and corrosion/erosion effects.
Speech Recognition as a Transcription Aid: A Randomized Comparison With Standard Transcription
Mohr, David N.; Turner, David W.; Pond, Gregory R.; Kamath, Joseph S.; De Vos, Cathy B.; Carpenter, Paul C.
2003-01-01
Objective. Speech recognition promises to reduce information entry costs for clinical information systems. It is most likely to be accepted across an organization if physicians can dictate without concerning themselves with real-time recognition and editing; assistants can then edit and process the computer-generated document. Our objective was to evaluate the use of speech-recognition technology in a randomized controlled trial using our institutional infrastructure. Design. Clinical note dictations from physicians in two specialty divisions were randomized to either a standard transcription process or a speech-recognition process. Secretaries and transcriptionists also were assigned randomly to each of these processes. Measurements. The duration of each dictation was measured. The amount of time spent processing a dictation to yield a finished document also was measured. Secretarial and transcriptionist productivity, defined as hours of secretary work per minute of dictation processed, was determined for speech recognition and standard transcription. Results. Secretaries in the endocrinology division were 87.3% (confidence interval, 83.3%, 92.3%) as productive with the speech-recognition technology as implemented in this study as they were using standard transcription. Psychiatry transcriptionists and secretaries were similarly less productive. Author, secretary, and type of clinical note were significant (p < 0.05) predictors of productivity. Conclusion. When implemented in an organization with an existing document-processing infrastructure (which included training and interfaces of the speech-recognition editor with the existing document entry application), speech recognition did not improve the productivity of secretaries or transcriptionists. PMID:12509359
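The productivity metric in this trial is a simple ratio, so the headline 87.3% figure can be illustrated directly. A minimal sketch follows (the totals below are invented numbers, not the study's data):

    def productivity(work_hours, dictation_minutes):
        # Hours of secretary work per minute of dictation processed.
        return work_hours / dictation_minutes

    # Invented illustrative totals for the two arms of the trial.
    standard = productivity(work_hours=100.0, dictation_minutes=3000.0)
    speech   = productivity(work_hours=100.0, dictation_minutes=2619.0)

    # With speech recognition, the same work hours process fewer dictation
    # minutes, so the hours-per-minute ratio rises; relative productivity
    # is the standard ratio divided by the speech-recognition ratio.
    print(f"relative productivity = {standard / speech:.1%}")   # -> 87.3%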
A survey of nursing documentation, terminologies and standards in European countries
Thoroddsen, Asta; Ehrenberg, Anna; Sermeus, Walter; Saranto, Kaija
2012-01-01
A survey was carried out to describe the current state of the art in the use of nursing documentation, terminologies, standards and education. Key informants in European countries were targeted by the Association for Common European Nursing Diagnoses, Interventions and Outcomes (ACENDIO). Replies were received from key informants in 20 European countries. Results show that the nursing process was most often used to structure nursing documentation. Many standardized nursing terminologies were used in Europe, with NANDA, NIC, NOC and ICF most frequently used. In 70% of the countries minimum requirements were available for electronic health records (EHRs), but nursing was not addressed specifically. Standards in use for nursing terminologies and information systems were lacking. These results should be a major concern to the nursing community in Europe. As a European platform, ACENDIO can play a role in enhancing standardization activities, and should develop its role accordingly. PMID:24199130
Personal Privacy in an Information Society. Final Report.
ERIC Educational Resources Information Center
Privacy Protection Study Commission, Washington, DC.
This report of the Privacy Protection Study Commission was prepared in response to a Congressional mandate to study data banks, automatic data processing programs, and information systems of governmental, regional and private organizations to determine standards and procedures in force for the protection of personal information. Recommendations…
Jagannathan, V; Mullett, Charles J; Arbogast, James G; Halbritter, Kevin A; Yellapragada, Deepthi; Regulapati, Sushmitha; Bandaru, Pavani
2009-04-01
We assessed the current state of commercial natural language processing (NLP) engines for their ability to extract medication information from textual clinical documents. Two thousand de-identified discharge summaries and family practice notes were submitted to four commercial NLP engines with the request to extract all medication information. The four sets of returned results were combined to create a comparison standard which was validated against a manual, physician-derived gold standard created from a subset of 100 reports. Once validated, the individual vendor results for medication names, strengths, route, and frequency were compared against this automated standard with precision, recall, and F measures calculated. Compared with the manual, physician-derived gold standard, the automated standard was successful at accurately capturing medication names (F measure=93.2%), but performed less well with strength (85.3%) and route (80.3%), and relatively poorly with dosing frequency (48.3%). Moderate variability was seen in the strengths of the four vendors. The vendors performed better with the structured discharge summaries than with the clinic notes in an analysis comparing the two document types. Although automated extraction may serve as the foundation for a manual review process, it is not ready to automate medication lists without human intervention.
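The vendor comparison relies on standard information-extraction metrics. Below is a minimal, self-contained sketch of precision, recall, and F-measure against a gold standard (the medication sets are invented examples):

    def prf(gold: set, extracted: set):
        tp = len(gold & extracted)              # correctly extracted items
        precision = tp / len(extracted) if extracted else 0.0
        recall = tp / len(gold) if gold else 0.0
        f = (2 * precision * recall / (precision + recall)
             if precision + recall else 0.0)
        return precision, recall, f

    gold      = {"lisinopril", "metformin", "atorvastatin"}
    extracted = {"lisinopril", "metformin", "aspirin"}
    p, r, f = prf(gold, extracted)
    print(f"precision={p:.1%} recall={r:.1%} F={f:.1%}")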
Mapping Perinatal Nursing Process Measurement Concepts to Standard Terminologies.
Ivory, Catherine H
2016-07-01
The use of standard terminologies is an essential component for using data to inform practice and conduct research; perinatal nursing data standardization is needed. This study explored whether 76 distinct process elements important for perinatal nursing were present in four American Nurses Association-recognized standard terminologies. The 76 process elements were taken from a valid paper-based perinatal nursing process measurement tool. Using terminology-supported browsers, the elements were manually mapped to the selected terminologies by the researcher. A five-member expert panel validated 100% of the mapping findings. The majority of the process elements (n = 63, 83%) were present in SNOMED-CT, 28% (n = 21) in LOINC, 34% (n = 26) in ICNP, and 15% (n = 11) in CCC. SNOMED-CT and LOINC are terminologies currently recommended for use to facilitate interoperability in the capture of assessment and problem data in certified electronic medical records. Study results suggest that SNOMED-CT and LOINC contain perinatal nursing process elements and are useful standard terminologies to support perinatal nursing practice in electronic health records. Terminology mapping is the first step toward incorporating traditional paper-based tools into electronic systems.
Building the United States National Vegetation Classification
Franklin, S.B.; Faber-Langendoen, D.; Jennings, M.; Keeler-Wolf, T.; Loucks, O.; Peet, R.; Roberts, D.; McKerrow, A.
2012-01-01
The Federal Geographic Data Committee (FGDC) Vegetation Subcommittee, the Ecological Society of America Panel on Vegetation Classification, and NatureServe have worked together to develop the United States National Vegetation Classification (USNVC). The current standard was accepted in 2008 and fosters consistency across Federal agencies and non-federal partners for the description of each vegetation concept and its hierarchical classification. The USNVC is structured as a dynamic standard, where changes to types at any level may be proposed at any time as new information comes in. But, because much information already exists from previous work, the NVC partners first established methods for screening existing types to determine their acceptability with respect to the 2008 standard. Current efforts include a screening process to assign confidence to Association and Group level descriptions, and a review of the upper three levels of the classification. For the upper levels especially, the expectation is that the review process includes international scientists. Immediate future efforts include the review of remaining levels and the development of a proposal review process.
Boiler MACT Technical Assistance (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-03-01
Fact sheet describing changes to U.S. Environmental Protection Agency (EPA) process standards. The EPA is expected to finalize the reconsideration process for its Clean Air Act pollution standards, National Emission Standards for Hazardous Air Pollutants for Major Sources: Industrial, Commercial, and Institutional Boilers and Process Heaters (known as Boiler Maximum Achievable Control Technology (MACT)), in Spring 2012. This rule applies to large and small boilers in a wide range of industrial facilities and institutions. The U.S. Department of Energy (DOE) will offer technical assistance to ensure that major sources burning coal or oil have information on cost-effective clean energy strategies for compliance, including combined heat and power, and to promote cleaner, more efficient boilers to cut harmful pollution and reduce operational costs.
Science Data Preservation: Implementation and Why It Is Important
NASA Technical Reports Server (NTRS)
Kempler, Steven J.; Moses, John F.; Gerasimov, Irina V.; Johnson, James E.; Vollmer, Bruce E.; Theobald, Michael L.; Ostrenga, Dana M.; Ahmad, Suraiya; Ramapriyan, Hampapuram K.; Khayat, Mohammad G.
2013-01-01
Remote sensing data generation by NASA to study Earth's geophysical processes was initiated in 1960 with the launch of the first Television Infrared Observation Satellite Program (TIROS), to develop a meteorological satellite information system. Though they would be deemed primitive data sets by today's standards, early Earth science missions were the foundation upon which today's remote sensing instruments have built their scientific success, and tomorrow's instruments will yield science not yet imagined. NASA Scientific Data Stewardship requirements have been documented to ensure the long-term preservation and usability of remote sensing science data. In recent years, the Federation of Earth Science Information Partners and NASA's Earth Science Data System Working Groups have organized committees that specifically examine standards, processes, and ontologies that can best be employed for the preservation of remote sensing data, supporting documentation, and data provenance information. This presentation describes the activities, issues, and implementations, guided by the NASA Earth Science Data Preservation Content Specification (423-SPEC-001), for preserving instrument characteristics and the data processing and science information generated for 20 Earth science instruments, spanning 40 years of geophysical measurements, at NASA's Goddard Earth Sciences Data and Information Services Center (GES DISC). In addition, unanticipated preservation and implementation questions and issues encountered in the implementation process are presented.
LANDSAT: Non-US standard catalog no. N-36. [LANDSAT imagery for August, 1975
NASA Technical Reports Server (NTRS)
1975-01-01
Information regarding the availability of LANDSAT imagery processed and input to the data files by the NASA Data Processing Facility is published on a monthly basis. The U.S. Standard Catalog includes imagery covering the continental United States, Alaska, and Hawaii. The Non-U.S. Standard Catalog identifies all the remaining coverage. Sections 1 and 2 describe the contents and format for the catalogs and the associated microfilm. Section 3 provides a cross reference defining the beginning and ending dates for LANDSAT cycles.
LANDSAT 2 world standard catalog, 1 May - 31 July 1978. [LANDSAT imagery for May through July 1978
NASA Technical Reports Server (NTRS)
1978-01-01
Information regarding the availability of LANDSAT imagery processed and input to the data files by the NASA Data Processing Facility is published on a monthly basis. The U.S. Standard Catalog includes imagery covering the continental United States, Alaska and Hawaii. The Non-U.S. Standard Catalog identifies all the remaining coverage. Sections 1 and 2 describe the contents and format for the catalogs and the associated microfilm. Section 3 provides a cross-reference defining the beginning and ending dates for LANDSAT cycles.
LANDSAT: Non-US standard catalog no. N-30. [LANDSAT imagery for February, 1975
NASA Technical Reports Server (NTRS)
1975-01-01
Information regarding the availability of LANDSAT imagery processed and input to the data files by the NASA Data Processing Facility is published on a monthly basis. The U.S. Standard Catalog includes imagery covering the continental United States, Alaska, and Hawaii. The Non-U.S. Standard Catalog identifies all the remaining coverage. Sections 1 and 2 describe the contents and format for the catalogs and the associated microfilm. Section 3 provides a cross-reference defining the beginning and ending dates for LANDSAT cycles.
Encoding Standards for Linguistic Corpora.
ERIC Educational Resources Information Center
Ide, Nancy
The demand for extensive reusability of large language text collections for natural languages processing research requires development of standardized encoding formats. Such formats must be capable of representing different kinds of information across the spectrum of text types and languages, capable of representing different levels of…
There's gold in them thar' databases.
Gillespie, G
2000-11-01
Some health care organizations are using sophisticated data mining applications to unearth hidden truths buried in their online clinical and financial information. But the lack of a standard clinical vocabulary and standard work processes is an obstacle CIOs must blast through to reach their treasure.
Blazona, Bojan; Koncar, Miroslav
2007-12-01
Integration based on open standards, in order to achieve communication and information interoperability, is one of the key aspects of modern health care information systems. However, this requirement represents one of the major challenges for Information and Communication Technology (ICT) solutions, as systems today use diverse technologies, proprietary protocols, and communication standards which are often not interoperable. Radiology Information Systems (RIS) are among the main producers of clinical information in healthcare settings; they communicate using the widely adopted DICOM (Digital Imaging and COmmunications in Medicine) standard, but in very few cases can they efficiently integrate information of interest with other systems. In this context we identified the HL7 standard as the world's leading medical ICT standard, envisioned to provide the umbrella for medical data semantic interoperability, which amongst other things represents the cornerstone of Croatia's National Integrated Healthcare Information System (IHCIS). The aim was to explore the ability to integrate and exchange RIS-originated data with Hospital Information Systems based on HL7's CDA (Clinical Document Architecture) standard. We explored the ability of the HL7 CDA specifications and methodology to address the need to integrate RIS with HL7-based healthcare information systems. We introduced the use of a WADO service interconnected to IHCIS and, finally, CDA rendering in widely used web browsers. The outcome of our pilot work supports our original assumption that the HL7 standard is able to bring radiology data into integrated healthcare systems. Uniform DICOM-to-CDA translation scripts and business processes within IHCIS are desirable and cost-effective given the use of supporting IHCIS services aligned to SOA.
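A minimal sketch of the kind of DICOM-to-CDA translation the paper describes, assuming pydicom for reading the DICOM object; the element choices and the skeletal XML are illustrative assumptions, not the IHCIS translation scripts:

    import xml.etree.ElementTree as ET
    import pydicom

    def dicom_to_cda_stub(path):
        """Map a few DICOM header elements onto a skeletal CDA-like XML
        document. Field choices and the XML skeleton are illustrative."""
        ds = pydicom.dcmread(path)
        doc = ET.Element("ClinicalDocument", xmlns="urn:hl7-org:v3")
        target = ET.SubElement(doc, "recordTarget")
        ET.SubElement(target, "id", extension=str(ds.PatientID))
        ET.SubElement(target, "name").text = str(ds.PatientName)
        component = ET.SubElement(doc, "component")
        ET.SubElement(component, "text").text = getattr(ds, "StudyDescription", "")
        return ET.tostring(doc, encoding="unicode")

    print(dicom_to_cda_stub("report.dcm"))  # placeholder path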
Total quality management: It works for aerospace information services
NASA Technical Reports Server (NTRS)
Erwin, James; Eberline, Carl; Colquitt, Wanda
1993-01-01
Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvements techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal, to identify processes and procedures that can be improved and new technologies that can be integrated with the processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. This Program seeks to establish quality through an iterative defect prevention approach that is based on the incorporation of standards and measurements into the processing cycle. Four projects are described that utilize cross-functional, problem-solving teams for identifying requirements and defining tasks and task standards, management participation, attention to critical processes, and measurable long-term goals. The implementation of these projects provides the customer with measurably improved access to information that is provided through several channels: the NASA STI Database, document requests for microfiche and hardcopy, and the Centralized Help Desk.
Helping the public find information the U.S. Government Information Locator Service (GILS)
Christian, E.J.
1994-01-01
As part of the National Information Infrastructure, the U.S. federal government is establishing a Government Information Locator Service (GILS). GILS will identify and describe public information resources throughout the federal government and provide assistance in obtaining the information. It will be decentralized and will supplement other agency and commercial information dissemination mechanisms. The public will use GILS directly or through intermediaries, including the Government Printing Office and the National Technical Information Service, as well as federal depository libraries, other public libraries, and private sector information services. Direct users will have access to a GILS Core accessible on the Internet without charge. Intermediate access may include kiosks, 800 numbers, electronic mail, bulletin boards, FAX, and offline media such as floppy disks, CD-ROM, and printed works. GILS will use network technology and the American National Standards Institute Z39.50 standard for information search and retrieval so that information can be retrieved in a variety of ways. Direct users may have access to many other major federal and nonfederal information resources, linkages to data systems, and electronic delivery of information products. An Office of Management and Budget Bulletin in 1994 will provide implementing guidance to agencies. The National Institute of Standards and Technology will also establish a Federal Information Processing Standard specifying a GILS Profile and its application for agencies establishing information locators. © 1994.
A Novel College Network Resource Management Method using Cloud Computing
NASA Astrophysics Data System (ADS)
Lin, Chen
At present, college information construction mainly comprises campus networks and management information systems, and many problems arise during the informatization process. Cloud computing is a development of distributed processing, parallel processing, and grid computing: data are stored in the cloud, software and services are placed in the cloud and built on top of various standards and protocols, and they can be accessed through all kinds of devices. This article introduces cloud computing and its functions, then analyzes the existing problems of college network resource management; cloud computing technology and methods are then applied to the construction of a college information sharing platform.
Enama, Mary E.; Hu, Zonghui; Gordon, Ingelise; Costner, Pamela; Ledgerwood, Julie E.; Grady, Christine
2012-01-01
Background: Consent to participate in research is an important component of the conduct of ethical clinical trials. Current consent practices are largely policy-driven. This study was conducted to assess comprehension of study information and satisfaction with the consent form between subjects randomized to concise or standard informed consent forms, as one approach to developing evidence-based consent practices. Methods: Participants (N=111) who enrolled into two Phase I investigational influenza vaccine protocols (VRC 306 and VRC 307) at the NIH Clinical Center were randomized to one of two IRB-approved consent forms: either a standard or a concise form. Concise consents had an average of 63% fewer words. All other aspects of the consent process were the same. Questionnaires about the study and the consent process were completed at enrollment and at the last visit in both studies. Results: Subjects using concise consent forms scored as well as those using standard-length consents in measures of comprehension (7 versus 7, p=0.79 and 20 versus 21, p=0.13); however, the trend was for the concise consent group to report feeling better informed. Both groups thought the length and detail of the consent form were appropriate. Conclusions: Randomization of study subjects to IRB-approved consent forms of different lengths, as one method for developing evidence-based consent practices, resulted in no differences in study comprehension or satisfaction with the consent form. A concise consent form may be used ethically in the context of a consent process conducted by well-trained staff with opportunities for discussion and education throughout the study. PMID:22542645
29 CFR 4041.27 - Notice of annuity information.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Notice of annuity information. 4041.27 Section 4041.27... TERMINATION OF SINGLE-EMPLOYER PLANS Standard Termination Process § 4041.27 Notice of annuity information. (a... irrevocable commitments (annuity contracts); (2) Change in identity of insurers. A statement that if the plan...
76 FR 43693 - Standard Operating Procedure for “Notice to Industry” Letters
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-21
... the Center for Devices and Radiological Health's (CDRH) process to clarify and more quickly inform stakeholders when CDRH has changed its expectations relating to, or otherwise has new scientific information... scientific information changes CDRH's regulatory thinking, it has been challenging for the Center to...
Expectation, information processing, and subjective duration.
Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth
2018-01-01
In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that entire sequences that contain oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing, that is, whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker-accumulator and coding efficiency hypotheses of time perception, and dynamic attending theory are discussed.
Apply creative thinking of decision support in electrical nursing record.
Hao, Angelica Te-Hui; Hsu, Chien-Yeh; Li-Fang, Huang; Jian, Wen-Shan; Wu, Li-Bin; Kao, Ching-Chiu; Lu, Mei-Show; Chang, Her-Kung
2006-01-01
The nursing process consists of five interrelated steps: assessment, diagnosis, planning, intervention, and evaluation. In the nursing process, the nurse collects a great deal of data and information, and the amount may exceed what the nurse can process efficiently and correctly. Thus, the nurse needs assistance to become proficient in the planning of nursing care, given the difficulty of simultaneously processing a large set of information. Computer systems are viewed as tools to expand the capabilities of the nurse's mind. Using computer technology to support clinicians' decision making may provide high-quality, patient-centered, and efficient healthcare. Although some existing nursing information systems aid in the nursing process, they provide only the most fundamental decision support, i.e., standard care plans associated with common nursing diagnoses. Such a computerized decision support system helps the nurse develop a care plan step by step, but it does not assist the nurse in the decision-making process: the decisions about how to generate nursing diagnoses from data and how to individualize the care plans still remain with the nurse. The purpose of this study is to develop a pilot structure for an electronic nursing record system, integrated with international nursing standards, to improve the proficiency and accuracy of care planning in the clinical pathway process. The proposed pilot system assists not only student nurses and nurses who are novices in nursing practice, but also experts who need to work in a practice area with which they are not familiar.
Intelligent Information Systems.
ERIC Educational Resources Information Center
Zabezhailo, M. I.; Finn, V. K.
1996-01-01
An Intelligent Information System (IIS) uses data warehouse technology to facilitate the cycle of data and knowledge processing, including input, standardization, storage, representation, retrieval, calculation, and delivery. This article provides an overview of IIS products and artificial intelligence systems, illustrates examples of IIS…
ACOG Committee Opinion No. 306. Informed refusal.
2004-12-01
Informed refusal is a fundamental component of the informed consent process. Informed consent laws have evolved to the "materiality or patient viewpoint" standard. A physician must disclose to the patient the risks, benefits, and alternatives that a reasonable person in the patient's position would want to know to make an informed decision. Throughout this process, the patient's autonomy, level of health literacy, and cultural background should be respected. The subsequent election by the patient to forgo an intervention that has been recommended by the physician constitutes informed refusal. Documentation of the informed refusal process is essential. It should include a notation that the need for the intervention, as well as risks, benefits, and alternatives to the intervention, and possible consequences of refusal, have been explained. The patient's reason for refusal also should be documented.
40 CFR 63.7565 - What parts of the General Provisions apply to me?
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Industrial, Commercial, and Institutional Boilers and Process Heaters Other Requirements and Information § 63.7565 What parts...
48 CFR 9.203 - QPL's, QML's, and QBL's.
Code of Federal Regulations, 2010 CFR
2010-10-01
...) Qualification and listing in a QPL, QML, or QBL is the process by which products are obtained from manufacturers... and Standardization Information System (ASSIST) at (http://assist.daps.dla.mil). (c) Instructions concerning qualification procedures are included in the following publications: (1) Federal Standardization...
NASA Astrophysics Data System (ADS)
Zhang, Baocheng; Cai, Qing-yu; You, Li; Zhan, Ming-sheng
2009-05-01
Using a standard statistical method, we discover the existence of correlations among Hawking radiations (of tunneled particles) from a black hole. The information carried by such correlations is quantified by the mutual information between sequential emissions. Through a careful counting of the entropy taken out by the emitted particles, we show that black hole radiation as tunneling is an entropy conservation process. While information is leaked out through the radiation, the total entropy is conserved. Thus, we conclude that the black hole evaporation process is unitary.
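For readers unfamiliar with the quantities involved, the following sketch shows the standard information-theoretic bookkeeping the abstract invokes; the notation is assumed here, not quoted from the paper.

```latex
% Correlation between two sequential emissions E_1 and E_2, measured by
% their mutual information (notation assumed, not quoted from the paper):
\[
  I(E_2 : E_1) \;=\; S(E_2) - S(E_2 \mid E_1)
\]
% Entropy conservation for a hole of initial entropy S_{BH} emitting
% quanta E_1, E_2, \dots, E_n then reads:
\[
  S(E_1) + S(E_2 \mid E_1) + \cdots + S(E_n \mid E_1, \dots, E_{n-1}) \;=\; S_{BH}
\]
```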
Structuring Legacy Pathology Reports by openEHR Archetypes to Enable Semantic Querying.
Kropf, Stefan; Krücken, Peter; Mueller, Wolf; Denecke, Kerstin
2017-05-18
Clinical information is often stored as free text, e.g. in discharge summaries or pathology reports. These documents are semi-structured using section headers, numbered lists, items and classification strings. However, it is still challenging to retrieve relevant documents since keyword searches applied on complete unstructured documents result in many false positive retrieval results. We are concentrating on the processing of pathology reports as an example for unstructured clinical documents. The objective is to transform reports semi-automatically into an information structure that enables an improved access and retrieval of relevant data. The data is expected to be stored in a standardized, structured way to make it accessible for queries that are applied to specific sections of a document (section-sensitive queries) and for information reuse. Our processing pipeline comprises information modelling, section boundary detection and section-sensitive queries. For enabling a focused search in unstructured data, documents are automatically structured and transformed into a patient information model specified through openEHR archetypes. The resulting XML-based pathology electronic health records (PEHRs) are queried by XQuery and visualized by XSLT in HTML. Pathology reports (PRs) can be reliably structured into sections by a keyword-based approach. The information modelling using openEHR allows saving time in the modelling process since many archetypes can be reused. The resulting standardized, structured PEHRs allow accessing relevant data by retrieving data matching user queries. Mapping unstructured reports into a standardized information model is a practical solution for a better access to data. Archetype-based XML enables section-sensitive retrieval and visualisation by well-established XML techniques. Focussing the retrieval to particular sections has the potential of saving retrieval time and improving the accuracy of the retrieval.
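As a rough illustration of the keyword-based section boundary detection described above (not the authors' pipeline, which uses openEHR archetypes, XQuery, and XSLT), a Python sketch with hypothetical header keywords:

```python
import re

# Hypothetical pathology-report section headers; the study's keyword list
# is not reproduced here.
SECTION_HEADERS = ["clinical information", "macroscopy", "microscopy", "diagnosis"]
HEADER_RE = re.compile(
    r"^(%s)\s*:" % "|".join(SECTION_HEADERS), re.IGNORECASE | re.MULTILINE
)

def split_sections(report: str) -> dict[str, str]:
    """Split a free-text report into sections keyed by detected header."""
    sections: dict[str, str] = {}
    matches = list(HEADER_RE.finditer(report))
    for i, m in enumerate(matches):
        end = matches[i + 1].start() if i + 1 < len(matches) else len(report)
        sections[m.group(1).lower()] = report[m.end():end].strip()
    return sections

report = "Clinical information: suspected lesion.\nDiagnosis: benign."
print(split_sections(report))

# A section-sensitive query is then just a lookup on the detected section:
print(split_sections(report).get("diagnosis"))
```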
This page contains a February 2003 fact sheet with information regarding the National Emissions Standards for Hazardous Air Pollutants (NESHAP) for Asphalt Processing and Asphalt Roofing Manufacturing.
NASA Technical Reports Server (NTRS)
Dehghani, Navid; Tankenson, Michael
2006-01-01
This paper details an architectural description of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission ground data processing system whose components provide uplink, downlink, and data management capabilities, and which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed from a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry-standard messaging bus is used to transfer information among system components. Components generate standard messages, which capture system information and serve as triggers supporting the event-driven architecture of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data and for supporting automation of many mission operations processes.
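To illustrate the message-bus pattern the abstract describes (MPCS itself is implemented in Java on an industry-standard bus; the topic names below are illustrative), a minimal publish/subscribe sketch:

```python
from collections import defaultdict
from typing import Callable

class MessageBus:
    """Toy stand-in for an industry-standard messaging bus."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)  # each subscribed component reacts to the event

bus = MessageBus()
# A downlink component publishes telemetry events; other components react.
bus.subscribe("telemetry.downlink", lambda m: print("archiving frame", m["id"]))
bus.publish("telemetry.downlink", {"id": 42, "payload": b"\x00\x01"})
```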
FPGA implementation of sparse matrix algorithm for information retrieval
NASA Astrophysics Data System (ADS)
Bojanic, Slobodan; Jevtic, Ruzica; Nieto-Taladriz, Octavio
2005-06-01
Information text data retrieval requires a tremendous amount of processing time because of the size of the data and the complexity of information retrieval algorithms. In this paper, a solution to this problem is proposed via hardware-supported information retrieval algorithms. Reconfigurable computing may accommodate frequent hardware modifications through its tailorable hardware, and exploits parallelism for a given application through reconfigurable and flexible hardware units. The degree of parallelism can be tuned for the data. In this work we implemented the standard BLAS (basic linear algebra subprogram) sparse matrix algorithm named Compressed Sparse Row (CSR), which is shown to be more efficient in terms of storage space requirements and query-processing time than other sparse matrix algorithms for information retrieval applications. Although the inverted index algorithm has been treated as the de facto standard for information retrieval for years, an alternative approach that stores the index of a text collection in a sparse matrix structure is gaining more attention. This approach performs query processing using sparse matrix-vector multiplication and, due to parallelization, achieves substantial efficiency gains over the sequential inverted index. The parallel implementations of the information retrieval kernel presented in this work target the Virtex II Field Programmable Gate Array (FPGA) board from Xilinx. A recent development in scientific applications is the use of FPGAs to achieve high-performance results. Computational results are compared to implementations on other platforms. The design achieves a high level of parallelism for the overall function while retaining highly optimised hardware within the processing unit.
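To make the CSR layout concrete, here is a minimal software sketch of the representation and the sparse matrix-vector product used for query processing (the paper's implementation targets a Virtex II FPGA; the matrix and query below are illustrative):

```python
# Term-document matrix (rows = documents, columns = terms) in CSR form.
values  = [1.0, 2.0, 3.0, 1.0]  # nonzero term weights
col_idx = [0,   2,   1,   2]    # column (term) index of each nonzero
row_ptr = [0,   2,   3,   4]    # row i spans values[row_ptr[i]:row_ptr[i+1]]

def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x for a CSR matrix A; y[i] is document i's query score."""
    y = [0.0] * (len(row_ptr) - 1)
    for i in range(len(y)):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

query = [0.0, 1.0, 1.0]  # query vector over the 3-term vocabulary
print(csr_matvec(values, col_idx, row_ptr, query))  # -> [2.0, 3.0, 1.0]
```

Each row's inner loop is independent of the others, which is the parallelism an FPGA implementation exploits.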
NASA Astrophysics Data System (ADS)
Barber, Jeffrey; Greca, Joseph; Yam, Kevin; Weatherall, James C.; Smith, Peter R.; Smith, Barry T.
2017-05-01
In 2016, the millimeter wave (MMW) imaging community initiated the formation of a standard for millimeter wave image quality metrics. This new standard, American National Standards Institute (ANSI) N42.59, will apply to active MMW systems for security screening of humans. The Electromagnetic Signatures of Explosives Laboratory at the Transportation Security Laboratory is supporting the ANSI standards process via the creation of initial prototypes for round-robin testing with MMW imaging system manufacturers and experts. Results obtained for these prototypes will be used to inform the community and lead to consensus objective standards amongst stakeholders. Images collected with laboratory systems are presented along with results of preliminary image analysis. Future directions for object design, data collection and image processing are discussed.
NASA Astrophysics Data System (ADS)
Gholibeigian, Hassan
In my vision, there are four animated sub-particles (matter, plant, animal and human sub-particles) as the origin of life and the creator of momentum in each fundamental particle (string). They communicate with the dimension of information, which is nested with space-time, to get a package of information in each Planck time. They are the link-point between the dimension of information and space-time. The sub-particle, which identifies its fundamental particle, processes the package of information to find its next step. Processed information is always carried by fundamental particles as the history of the universe and enhances its entropy. My proposed formula for calculating the number of packages is I = t_P^{-1} · τ, where t_P is the Planck time and τ is the fundamental particle's lifetime. For example, a photon needs to process about 1.8 × 10^43 packages of information to find its path in one second. The duration of each process is faster than the speed of light. In our bodies, human sub-particles (sub-strings) communicate with the dimension of information and get packages of information, including standard ethics, to process and find their next step. The processed information transforms into knowledge in our mind. This knowledge is always carried by us. Knowledge, as the Result of the Processed Information by Human's Sub-particles (sub-strings)/Mind in our Brain.
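As a check of the quoted figure under the stated definitions:

```latex
% With I = t_P^{-1}\tau, Planck time t_P \approx 5.39\times10^{-44}\,\mathrm{s},
% and lifetime \tau = 1\,\mathrm{s}:
\[
  I \;=\; \frac{\tau}{t_P}
    \;\approx\; \frac{1\,\mathrm{s}}{5.39\times10^{-44}\,\mathrm{s}}
    \;\approx\; 1.8\times10^{43},
\]
% consistent with the abstract's 1.8 x 10^43 packages of information per second.
```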
Black, Stephanie Winkeljohn; Pössel, Patrick
2013-08-01
Adolescents who develop depression have worse interpersonal and affective experiences and are more likely to develop substance problems and/or suicidal ideation than adolescents who do not develop depression. This study examined the combined effects of negative self-referent information processing and rumination (i.e., brooding and reflection) on adolescent depressive symptoms. It was hypothesized that the interaction of negative self-referent information processing and brooding would significantly predict depressive symptoms, while the interaction of negative self-referent information processing and reflection would not. Adolescents (n = 92; 13-15 years; 34.7% female) participated in a 6-month longitudinal study. Self-report instruments measured depressive symptoms and rumination; a cognitive task measured information processing. Data were analyzed with path modelling in Amos 19.0. The interaction of negative information processing and brooding significantly predicted an increase in depressive symptoms 6 months later. The interaction of negative information processing and reflection did not significantly predict depression; however, the model did not meet a priori standards for accepting the null hypothesis. Results suggest clinicians working with adolescents at risk for depression should consider focusing on the reduction of brooding and negative information processing to reduce long-term depressive symptoms.
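As an illustration of the moderation hypothesis being tested (the study used path modelling in Amos 19.0, not this code; all column names and values below are hypothetical), a regression sketch with an interaction term:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: neg_info = negative self-referent information
# processing, brooding = rumination subscale, cdi_0m/cdi_6m = depressive
# symptoms at baseline and six months later.
df = pd.DataFrame({
    "neg_info": [0.2, 0.5, 0.8, 0.4, 0.9, 0.1],
    "brooding": [1.0, 2.0, 3.0, 2.0, 3.0, 1.0],
    "cdi_0m":   [5, 9, 14, 8, 16, 4],
    "cdi_6m":   [6, 11, 18, 9, 21, 4],
})

# "neg_info * brooding" expands to both main effects plus their
# interaction, controlling for baseline symptoms.
model = smf.ols("cdi_6m ~ neg_info * brooding + cdi_0m", data=df).fit()
print(model.params)
```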
Common Approach to Geoprocessing of Uav Data across Application Domains
NASA Astrophysics Data System (ADS)
Percivall, G. S.; Reichardt, M.; Taylor, T.
2015-08-01
UAVs are a disruptive technology bringing new geographic data and information to many application domains. UASs are similar to other geographic imagery systems, so existing frameworks are applicable; but the diversity of UAV platforms, along with the diversity of available sensors, presents challenges in the processing and creation of geospatial products. Efficient processing and dissemination of the data is achieved using software and systems that implement open standards. The challenges identified point to the need both to use existing standards and to extend them. Results from the use of the OGC Sensor Web Enablement suite of standards are presented. Next steps in the progress of UAVs and UASs may follow the path of open data, open source and open standards.
Method and Apparatus for Processing UDP Data Packets
NASA Technical Reports Server (NTRS)
Murphy, Brandon M. (Inventor)
2017-01-01
A method and apparatus for processing a plurality of data packets. A data packet is received. A determination is made as to whether a portion of the data packet follows a selected digital recorder standard protocol based on a header of the data packet. Raw data in the data packet is converted into human-readable information in response to a determination that the portion of the data packet follows the selected digital recorder standard protocol.
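A minimal sketch of the core step as the abstract describes it: inspect the header for the selected digital recorder standard protocol, and convert the raw payload only on a match. The 4-byte magic number and payload layout here are hypothetical, not taken from the patent.

```python
import struct

RECORDER_MAGIC = 0x43483130  # hypothetical protocol identifier

def process_packet(packet: bytes) -> str | None:
    """Decode the payload only if the header matches the expected protocol."""
    if len(packet) < 8:
        return None
    magic, length = struct.unpack("!II", packet[:8])
    if magic != RECORDER_MAGIC:
        return None                        # not the selected standard protocol
    raw = packet[8:8 + length]
    return raw.decode("ascii", "replace")  # raw data -> human-readable text

pkt = struct.pack("!II", RECORDER_MAGIC, 5) + b"hello"
print(process_packet(pkt))  # -> "hello"
```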
Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards
NASA Technical Reports Server (NTRS)
Conroy, Mike; Gill, Paul; Hill, Bradley; Ibach, Brandon; Jones, Corey; Ungar, David; Barch, Jeffrey; Ingalls, John; Jacoby, Joseph; Manning, Josh;
2014-01-01
The TDI project (TDI) investigates trending technical data standards for applicability to NASA vehicles, space stations, payloads, facilities, and equipment. TDI tested COTS software compatible with a suite of related industry standards, assessing both the capabilities of the individual tools and their interoperability. These standards not only enable Information Technology (IT) efficiencies, but also address efficient structures and standard content for business processes. We used source data from generic industry samples as well as NASA and European Space Agency (ESA) data from space systems.
75 FR 16186 - Petitions for Modification
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-31
... CFR Part 44 govern the application, processing, and disposition of petitions for modification. This... listed above. FOR FURTHER INFORMATION CONTACT: Barbara Barron, Office of Standards, Regulations and... not toll-free numbers]. SUPPLEMENTARY INFORMATION: I. Background Section 101(c) of the Federal Mine...
Jiang, Tao; Yu, Ping; Hailey, David; Ma, Jun; Yang, Jie
2016-09-01
To obtain indications of the influence of electronic health records (EHR) in managing risks and meeting information system accreditation standards in Australian residential aged care (RAC) homes. The hypothesis to be tested is that the RAC homes using EHR have better performance in meeting information system standards in aged care accreditation than their counterparts using only paper records for information management. Content analysis of aged care accreditation reports from the Aged Care Standards and Accreditation Agency produced between April 2011 and December 2013. Items identified included types of information systems, compliance with accreditation standards, and indicators of failure to meet an expected outcome for information systems. The Chi-square test was used to identify differences between the RAC homes that used EHR systems and those that used paper records in not meeting aged care accreditation standards. 1,031 (37.4%) of 2,754 RAC homes had adopted EHR systems. Although the proportion of homes that met all accreditation standards was significantly higher for those with EHR than for homes with paper records, only 13 RAC homes did not meet one or more expected outcomes; 12 used paper records, and nine of these failed the expected outcome for information systems. The overall contribution of EHR to meeting aged care accreditation standards in Australia was very small. Risk indicators for not meeting the information system standard were no access to accurate and appropriate information, failure in monitoring mechanisms, not reporting clinical incidents, insufficient recording of residents' clinical changes, not providing accurate care plans, and communication process failures. The study has provided indications that use of EHR provides small, yet significant, advantages for RAC homes in Australia in managing risks for information management and in meeting accreditation requirements. The implication of the study for introducing technology innovation in RAC in Australia is discussed.
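The counts in the abstract permit a rough reconstruction of the reported Chi-square comparison; the exact contingency table the authors used is an assumption here.

```python
from scipy.stats import chi2_contingency

# From the abstract: 1,031 of 2,754 homes used EHR; 13 homes failed one or
# more expected outcomes, 12 of them paper-record homes (2754 - 1031 = 1723).
#            met all     failed >= 1
table = [
    [1031 - 1,  1],    # EHR homes
    [1723 - 12, 12],   # paper-record homes
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```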
Computer Science and Technology Publications. NBS Publications List 84.
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.
This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…
Saidel-Goley, Isaac N; Albiero, Erin E; Flannery, Kathleen A
2012-02-01
Dissociation is a mental process resulting in the disruption of memory, perception, and sometimes identity. At a nonclinical level, only mild dissociative experiences occur. The nature of nonclinical dissociation is disputed in the literature, with some asserting that it is a beneficial information processing style and others positing that it is a psychopathological phenomenon. The purpose of this study was to further the understanding of nonclinical dissociation with respect to memory and attention, by including a more ecologically valid virtual reality (VR) memory task along with standard neuropsychological tasks. Forty-five undergraduate students from a small liberal arts college in the northeast participated for course credit. The participants completed a battery of tasks including two standard memory tasks, a standard attention task, and an experimental VR memory task; the VR task included immersion in a virtual apartment, followed by incidental object-location recall for objects in the virtual apartment. Support for the theoretical model portraying nonclinical dissociation as a beneficial information processing style was found in this study. Dissociation scores were positively correlated with working memory scores and attentional processing scores on the standard neuropsychological tasks. In terms of the VR task, dissociation scores were positively correlated with more false positive memories that could be the result of a tendency of nonclinical highly dissociative individuals to create more elaborative schemas. This study also demonstrates that VR paradigms add to the prediction of cognitive functioning in testing protocols using standard neuropsychological tests, while simultaneously increasing ecological validity.
Facilitating Stewardship of scientific data through standards based workflows
NASA Astrophysics Data System (ADS)
Bastrakova, I.; Kemp, C.; Potter, A. K.
2013-12-01
There are three main suites of standards that can be used to define the fundamental scientific methodology of data, methods and results. These are, firstly, metadata standards to enable discovery of the data (ISO 19115); secondly, the Sensor Web Enablement (SWE) suite of standards, which includes the O&M and SensorML standards; and thirdly, ontologies that provide vocabularies to define scientific concepts and the relationships between them. All three types of standards have to be utilised by the practicing scientist so that those who ultimately steward the data can ensure it is preserved, curated, reused and repurposed. Additional benefits of this approach include transparency of scientific processes from data acquisition to the creation of scientific concepts and models, and provision of context to inform data use. Collecting and recording metadata is the first step in the scientific data flow. The primary role of metadata is to provide details of geographic extent, availability and a high-level description of data suitable for its initial discovery through common search engines. The SWE suite provides standardised patterns to describe observations and measurements taken for these data, capture detailed information about observational or analytical methods and the instruments used, and define quality determinations. This information standardises browsing capability over discrete data types. The standardised patterns of the SWE standards simplify the aggregation of observation and measurement data, enabling scientists to turn disaggregated data into scientific concepts. The first two steps provide a necessary basis for reasoning about concepts of 'pure' science, building relationships between concepts of different domains (linked data), and identifying domain classifications and vocabularies. Geoscience Australia is re-examining its marine data flows, including metadata requirements and business processes, to achieve a clearer link between scientific data acquisition and analysis requirements and effective interoperable data management and delivery. This includes participating in national and international dialogue on the development of standards, embedding data management activities in business processes, and developing scientific staff as effective data stewards. A similar approach is applied to geophysical data. By ensuring the geophysical datasets at GA strictly follow metadata and industry standards, we are able to implement a provenance-based workflow in which the data is easily discoverable, geophysical processing can be applied to it, and results can be stored. The provenance-based workflow enables metadata records for the results to be produced automatically from the input dataset metadata.
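As a concrete illustration of the discovery-level (ISO 19115) step, a minimal metadata record sketched in Python; the field names follow ISO 19115 concepts loosely and the record itself is illustrative:

```python
metadata_record = {
    "fileIdentifier": "ga-marine-survey-0001",          # hypothetical ID
    "title": "Marine sediment samples, survey 0001",
    "abstract": "High-level description suitable for initial discovery.",
    "geographicExtent": {                               # WGS84 bounding box
        "westBoundLongitude": 146.0,
        "eastBoundLongitude": 148.5,
        "southBoundLatitude": -19.5,
        "northBoundLatitude": -17.0,
    },
    "onlineResource": "https://example.org/data/0001",  # availability
}

# A catalogue search engine can match on any discovery field, e.g.:
hits = [metadata_record] if "marine" in metadata_record["title"].lower() else []
print(len(hits), "record(s) found")
```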
Streamlining environmental product declarations: a stage model
NASA Astrophysics Data System (ADS)
Lefebvre, Elisabeth; Lefebvre, Louis A.; Talbot, Stephane; Le Hen, Gael
2001-02-01
General public environmental awareness and education are increasing, stimulating the demand for reliable, objective and comparable information about products' environmental performance. The recently published standard series ISO 14040 and ISO 14025 normalize the preparation of Environmental Product Declarations (EPDs) containing comprehensive information relevant to a product's environmental impact during its life cycle. So far, only a few environmentally leading manufacturing organizations, mostly from Europe, have experimented with the preparation of EPDs, demonstrating their great potential as a marketing weapon. However, the preparation of EPDs is a complex process, requiring the collection and analysis of massive amounts of information coming from disparate sources (suppliers, sub-contractors, etc.). In the foreseeable future, streamlining the EPD preparation process will require product manufacturers to adapt their information systems (ERP, MES, SCADA) to make them capable of gathering and transmitting the appropriate environmental information. It also requires strong functional integration all along the product supply chain to ensure that all the information is made available in a standardized and timely manner. The goal of the present paper is twofold: first, to propose a transitional model towards green supply chain management and EPD preparation; second, to identify key technologies and methodologies that allow the EPD process, and subsequently the transition toward sustainable product development, to be streamlined.
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
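To illustrate the SPC step of tracking a regression coefficient over repeated check standard tests, a minimal individuals-chart sketch (the coefficient values are made up; the limits use the standard individuals-chart constant 2.66, i.e. 3/d2 with d2 = 1.128 for moving ranges of two):

```python
coeffs = [0.512, 0.508, 0.515, 0.511, 0.509, 0.514, 0.507, 0.513]

center = sum(coeffs) / len(coeffs)
moving_ranges = [abs(b - a) for a, b in zip(coeffs, coeffs[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

ucl = center + 2.66 * mr_bar  # upper control limit
lcl = center - 2.66 * mr_bar  # lower control limit

for i, c in enumerate(coeffs):
    flag = "OUT OF CONTROL" if not (lcl <= c <= ucl) else ""
    print(f"run {i}: {c:.3f} {flag}")
print(f"center {center:.4f}, limits [{lcl:.4f}, {ucl:.4f}]")
```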
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC.
These guidelines provide a handbook for use by federal organizations in structuring physical security and risk management programs for their automatic data processing facilities. This publication discusses security analysis, natural disasters, supporting utilities, system reliability, procedural measures and controls, off-site facilities,…
Wu, Frances M; Shortell, Stephen M; Rundall, Thomas G; Bloom, Joan R
To be successful, accountable care organizations (ACOs) must effectively manage patient care. Health information technology (HIT) can support care delivery by providing various degrees of coordination. Few studies have examined the role of HIT functionalities or the role of different levels of coordination enabled by HIT on care management processes. We examine HIT functionalities in ACOs, categorized by the level of coordination they enable in terms of information and work flow, to determine which specific HIT functionalities and levels of coordination are most strongly associated with care management processes. Retrospective cross-sectional analysis was done using 2012 data from the National Survey of Accountable Care Organizations. HIT functionalities are categorized into coordination levels: information capture, the lowest level, which coordinates through standardization; information provision, which supports unidirectional activities; and information exchange, which reflects the highest level of coordination allowing for bidirectional exchange. The Care Management Process index (CMP index) includes 13 questions about the extent to which care is planned, monitored, and supported by providers and patients. Multiple regressions adjusting for organizational and ACO contractual factors are used to assess relationships between HIT functionalities and the CMP index. HIT functionality coordinating the most complex interdependences (information exchange) was associated with a 0.41 standard deviation change in the CMP index (β = .41, p < .001), but the associations for information capture (β = -.01, p = .97) and information provision (β = .15, p = .48) functionalities were not significant. The current study has shed some light on the relationship between HIT and care management processes by specifying the coordination roles that HIT may play and, in particular, the importance of information exchange functionalities. Although these represent early findings, further research can help policy makers and clinical leaders understand how to prioritize HIT development given resource constraints.
2013-01-01
In 2003, the International Patient Decision Aid Standards (IPDAS) Collaboration was established to enhance the quality and effectiveness of patient decision aids by establishing an evidence-informed framework for improving their content, development, implementation, and evaluation. Over this 10-year period, the Collaboration has established: a) the background document on 12 core dimensions to inform the original modified Delphi process to establish the IPDAS checklist (74 items); b) the valid and reliable IPDAS instrument (47 items); and c) the IPDAS qualifying (6 items), certifying (6 items + 4 items for screening), and quality criteria (28 items). The objective of this paper is to describe the evolution of the IPDAS Collaboration and discuss the standardized process used to update the background documents on the theoretical rationales, evidence and emerging issues underlying the 12 core dimensions for assessing the quality of patient decision aids. PMID:24624947
NASA Astrophysics Data System (ADS)
Hwang, Darryl H.; Ma, Kevin; Yepes, Fernando; Nadamuni, Mridula; Nayyar, Megha; Liu, Brent; Duddalwar, Vinay; Lepore, Natasha
2015-12-01
A conventional radiology report primarily consists of a large amount of unstructured text, and lacks clear, concise, consistent and content-rich information. Hence, an area of unmet clinical need is the development of better ways to communicate radiology findings and information specific to each patient. Here, we design a new workflow and reporting system that combines and integrates advances in engineering technology with those from the medical sciences, the Multidimensional Interactive Radiology Report and Analysis (MIRRA). Until recently, clinical standards have primarily relied on 2D images for the purpose of measurement, but with the advent of 3D processing, many of the manually measured metrics can be automated, leading to better reproducibility and less subjective measurement placement. Hence, we make use of this newly available 3D processing in our workflow. Our pipeline is used here to standardize the labeling, tracking, and quantifying of metrics for renal masses.
Government information resource catalog and its service system realization
NASA Astrophysics Data System (ADS)
Gui, Sheng; Li, Lin; Wang, Hong; Peng, Zifeng
2007-06-01
The process of informatization produces a great deal of information resources. In order to manage these information resources and use them to serve business management, government decision-making and public life, it is necessary to establish a transparent and dynamic information resource catalog and its service system. This paper takes land-house management information resources as an example. Given the characteristics of this kind of information, the paper classifies, identifies and describes land-house information using a uniform specification and method; establishes a land-house information resource catalog classification system, metadata standard, identification standard and land-house thematic thesaurus; and, in the internet environment, enables users to search for and obtain the information of interest conveniently. Moreover, under the network environment, the system achieves speedy positioning, querying, exploring and acquisition of various types of land-house management information, and satisfies the needs of sharing, exchanging, applying and maintaining land-house management information resources.
Implementation of a formulary management process.
Karel, Lauren I; Delisle, Dennis R; Anagnostis, Ellena A; Wordell, Cindy J
2017-08-15
The application of lean methodology in an initiative to redesign the formulary maintenance process at an academic medical center is described. Maintaining a hospital formulary requires clear communication and coordination among multiple members of the pharmacy department. Using principles of lean methodology, pharmacy department personnel within a multihospital health system launched a multifaceted initiative to optimize formulary management systemwide. The ongoing initiative began with creation of a formulary maintenance redesign committee consisting of pharmacy department personnel with expertise in informatics, automation, purchasing, drug information, and clinical pharmacy services. The committee met regularly and used lean methodology to design a standardized process for management of formulary additions and deletions and changes to medications' formulary status. Through value stream analysis, opportunities for process and performance improvement were identified; staff suggestions on process streamlining were gathered during a series of departmental kaizen events. A standardized template for development and dissemination of monographs associated with formulary additions and status changes was created. In addition, a shared Web-based checklist was developed to facilitate information sharing and timely initiation and completion of tasks involved in formulary status changes, and a permanent formulary maintenance committee was established to monitor and refine the formulary management process. A clearly defined, standardized process within the pharmacy department was developed for tracking necessary steps in enacting formulary changes to encourage safe and efficient workflow. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
LANDSAT non-US standard catalog, 1 May 1977 - 31 May 1977
NASA Technical Reports Server (NTRS)
1977-01-01
Information regarding the availability of LANDSAT imagery processed and input to the data files by the NASA Data Processing Facility is published on a monthly basis. The U.S. Standard Catalog includes imagery covering the continental United States, Alaska, and Hawaii. The Non-U.S. Standard Catalog identifies all the remaining coverage. Sections 1 and 2 describe the contents and format of the catalogs and associated microfilm. Section 3 provides a cross-reference defining the beginning and ending dates for LANDSAT cycles. Sections 4 and 5 cover LANDSAT-1 and LANDSAT-2 coverage, respectively.
Detwiller, Maureen; Petillion, Wendy
2014-06-01
Moving a large healthcare organization from an old, nonstandardized clinical information system to a new user-friendly, standards-based system was much more than an upgrade to technology. This project to standardize terminology, optimize key processes, and implement a new clinical information system was a large change initiative over 4 years that affected clinicians across the organization. Effective change management and engagement of clinical stakeholders were critical to the success of the initiative. The focus of this article was to outline the strategies and methodologies used and the lessons learned.
Isfahani, Sakineh Saghaeiannejad; Khajouei, Reza; Jahanbakhsh, Maryan; Mirmohamadi, Mahboubeh
2014-01-01
Introduction: Nowadays, modern laboratories are faced with a huge volume of information. One of the goals of the Laboratory Information Management System (LIMS) is to assist in the management of the information generated in the laboratory. This study intends to evaluate the LIMS based on the standards of the American National Standards Institute (ANSI). Materials and Methods: This research is a descriptive-analytical study, conducted in 2011, on the LIMSs in use in the teaching and private hospitals in Isfahan. The data collecting instrument was a checklist, which was made by evaluating three groups of information components, namely 'system capabilities', 'work list functions', and 'reporting', based on LIS8-A. Data were analyzed using SPSS 20, with (relative) frequencies and percentages. To compare the data, the following statistical tests were used: Levene test, t-test, and Analysis of Variance (ANOVA). Results: The results of the study indicated that the LIMSs had a low conformity (30%) with LIS8-A (P = 0.001), with no difference between teaching and private hospitals (P = 0.806). The ANOVA revealed that in terms of conformity with the LIS8-A standard, there was a significant difference between the systems produced by different vendors (P = 0.023). According to the results, the Kowsar system, with more than 57% conformity in the three groups of information components, had better conformity to the standard compared to the other systems. Conclusions: This study indicated that none of the LIMSs had good conformity to the standard. It seems that system providers did not pay sufficient attention to many of the information components required by the standards when designing and developing their systems. It is suggested that standards from certified organizations and institutions be followed in the design and development process of health information systems. PMID:25077154
On standardization of basic datasets of electronic medical records in traditional Chinese medicine.
Zhang, Hong; Ni, Wandong; Li, Jing; Jiang, Youlin; Liu, Kunjing; Ma, Zhaohui
2017-12-24
Standardization of electronic medical records, so as to enable resource sharing and information exchange among medical institutions, has become inevitable in view of the ever-increasing volume of medical information. The current research is an effort towards the standardization of the basic dataset of electronic medical records in traditional Chinese medicine. In this work, an outpatient clinical information model and an inpatient clinical information model are created to adequately depict the diagnosis processes and treatment procedures of traditional Chinese medicine. To be backward compatible with the existing dataset standard created for western medicine, the new standard shall be a superset of the existing standard. Thus, the two models are checked against the existing standard in conjunction with 170,000 medical record cases. If a case cannot be covered by the existing standard due to the particularity of Chinese medicine, then either an existing data element is expanded with some Chinese medicine content or a new data element is created. Some dataset subsets are also created to group and record Chinese medicine special diagnoses and treatments such as acupuncture. The outcome of this research is a proposal for standardized traditional Chinese medicine medical record datasets. The proposal has been verified successfully in three medical institutions with hundreds of thousands of medical records. A new dataset standard for traditional Chinese medicine is proposed in this paper. The proposed standard, covering traditional Chinese medicine as well as western medicine, is expected to be approved soon by the authority. Widespread adoption of this proposal will enable traditional Chinese medicine hospitals and institutions to easily exchange information and share resources. Copyright © 2017. Published by Elsevier B.V.
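The reconciliation rule described above can be sketched as follows; the data elements and case fields are illustrative, not drawn from the proposed standard.

```python
# Existing (western-medicine) data elements, each with a value set.
standard_elements = {
    "diagnosis_method": {"inspection", "laboratory test"},
}

def reconcile(case: dict[str, str], elements: dict[str, set[str]]) -> None:
    """Expand an existing data element or create a new one per uncovered item."""
    for field, value in case.items():
        if field not in elements:
            elements[field] = {value}   # create a new data element
        elif value not in elements[field]:
            elements[field].add(value)  # expand the existing element

# A TCM case with content the existing standard does not cover:
reconcile({"diagnosis_method": "pulse-taking", "treatment": "acupuncture"},
          standard_elements)
print(standard_elements)
```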
Information Quality in Regulatory Decision Making: Peer Review versus Good Laboratory Practice.
McCarty, Lynn S; Borgert, Christopher J; Mihaich, Ellen M
2012-07-01
There is an ongoing discussion on the provenance of toxicity testing data regarding how best to ensure its validity and credibility. A central argument is whether journal peer-review procedures are superior to Good Laboratory Practice (GLP) standards employed for compliance with regulatory mandates. We sought to evaluate the rationale for regulatory decision making based on peer-review procedures versus GLP standards. We examined pertinent published literature regarding how scientific data quality and validity are evaluated for peer review, GLP compliance, and development of regulations. Some contend that peer review is a coherent, consistent evaluative procedure providing quality control for experimental data generation, analysis, and reporting sufficient to reliably establish relative merit, whereas GLP is seen as merely a tracking process designed to thwart investigator corruption. This view is not supported by published analyses pointing to subjectivity and variability in peer-review processes. Although GLP is not designed to establish relative merit, it is an internationally accepted quality assurance, quality control method for documenting experimental conduct and data. Neither process is completely sufficient for establishing relative scientific soundness. However, changes occurring both in peer-review processes and in regulatory guidance resulting in clearer, more transparent communication of scientific information point to an emerging convergence in ensuring information quality. The solution to determining relative merit lies in developing a well-documented, generally accepted weight-of-evidence scheme to evaluate both peer-reviewed and GLP information used in regulatory decision making where both merit and specific relevance inform the process.
75 FR 57304 - Periodic Reporting Proposals
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-20
.... Proposal Seven would introduce a mailflow-based model of mail processing costs for Standard Mail Parcels... and invites public comment. DATES: Comments are due October 8, 2010. FOR FURTHER INFORMATION CONTACT: Stephen L. Sharfman, General Counsel, [email protected] or 202-789-6820. SUPPLEMENTARY INFORMATION...
Clear as glass: transparent financial reporting.
Valletta, Robert M
2005-08-01
To be transparent, financial information needs to be easily accessible, timely, content-rich, and narrative. Not-for-profit hospitals and health systems should report detailed financial information quarterly. They need internal controls to reduce the level of complexity throughout the organization by creating standardized processes.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-19
... assist the office in processing your requests. See the SUPPLEMENTARY INFORMATION section for electronic... considerations for standardization of image acquisition, image interpretation methods, and other procedures to help ensure imaging data quality. The draft guidance describes two categories of image acquisition and...
LANDSAT 2 world standard catalog, 1-31 December 1978
NASA Technical Reports Server (NTRS)
1978-01-01
The World Standard Catalog lists imagery acquired by LANDSAT 2 which was processed and input to the data files during the referenced period. Information on cloud cover and image quality is given for each scene. The microfilm roll and frame on which the scene may be found is presented.
LANDSAT 3 world standard catalog, 1-31 December 1978
NASA Technical Reports Server (NTRS)
1978-01-01
The World Standard Catalog lists imagery acquired by LANDSAT 3 which was processed and input to the data files during the referenced period. Information on cloud cover and image quality is given for each scene. The microfilm roll and frame on which the scene may be found is given.
Audit and Certification Process for Science Data Digital Repositories
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Giaretta, D.; Ambacher, B.; Ashley, K.; Conrad, M.; Downs, R. R.; Garrett, J.; Guercio, M.; Lambert, S.; Longstreth, T.; Sawyer, D. M.; Sierman, B.; Tibbo, H.; Waltz, M.
2011-12-01
Science data digital repositories are entrusted to ensure that a science community's data are available and useful to users both today and in the future. Part of the challenge in meeting this responsibility is identifying the standards, policies and procedures required to accomplish effective data preservation. Subsequently, a repository should be evaluated on whether or not it is effective in its data preservation efforts. This poster will outline the process by which digital repositories are being formally evaluated in terms of their ability to preserve the digitally encoded information with which they have been entrusted. The ISO standards on which this is based will be identified, and the relationship of these standards to the Open Archival Information System (OAIS) reference model will be shown. Six test audits have been conducted, with three repositories in Europe and three in the USA. Some of the major lessons learned from these test audits will be briefly described. An assessment of the possible impact of this type of audit and certification on the practice of preserving digital information will also be provided.
Whittaker, P J; Gollins, H J; Roaf, E J
2014-03-01
Infant male circumcision is practised by many groups for religious and cultural reasons. Prompted by a desire to minimize the complication rate and to help parents identify good quality providers, a quality assurance (QA) process for infant male circumcision providers has been developed in Greater Manchester. Local stakeholders agreed a set of minimum standards, and providers were invited to submit evidence of their practice in relation to these standards. In participation with parents, community groups, faith groups, healthcare staff and safeguarding partners, an information leaflet for parents was produced. Engagement work with local community groups, faith groups, providers and healthcare staff was vital to ensure that the resources are accessible to parents and that providers continue to engage in the process. Providers that met the QA standards have been listed on a local website. Details of the website are included in the information leaflet distributed by maternity services, health visitors, primary care and community and faith groups. The leaflet is available in seven languages. Local QA processes can be used to encourage and identify good practice and to support parents who need to access services outside the remit of the National Health Service.
The Effect of Standardized Interviews on Organ Donation.
Corman Dincer, Pelin; Birtan, Deniz; Arslantas, Mustafa Kemal; Tore Altun, Gulbin; Ayanoglu, Hilmi Omer
2018-03-01
Organ donation is the most important stage for organ transplant. Studies reveal that the attitudes of families of brain-dead patients toward donation play a significant role in their decision. We hypothesized that supporting family awareness about the meaning of organ donation, including saving lives while losing a loved one, combined with being informed about brain death and the donation process, must be maintained by intensive care unit physicians through standardized interviews and questionnaires to increase the donation rate. We retrospectively evaluated the final decisions of the families of 52 brain-dead donors treated at our institution between 2014 and 2017. Data underwent descriptive analyses. The standard interview content was generated after literature search results were reviewed by the authors. Previously, we examined the impact of standardized interviews conducted by intensive care unit physicians with relatives of potential brain-dead donors on decisions to donate and reasons for refusing organ donation. After termination of that study, interviews were conducted according to the intensivist's orientation, resulting in significantly decreased donation rates. Standardized interviews were then started again, resulting in increased donation rates. Of 17 families who participated in standardized interviews, 5 families (29.4%) agreed to donate the organs of their brain-dead relatives. In the other group of 35 families, intensivists informed the families about donation without standardized interviews; 5 families (14.3%) approved organ donation. The decision regarding whether to agree to organ donation was statistically different between the 2 family groups (P < .05). Conducting a standard interview between the relatives of brain-dead donors and the intensivists, facilitating visits between relatives and the brain-dead patients, and informing relatives about the donation process resulted in an increased rate of organ donation compared with routine protocols.
Automated process planning system
NASA Technical Reports Server (NTRS)
Mann, W.
1978-01-01
Program helps process engineers set up manufacturing plans for machined parts. System allows one to develop and store library of similar parts characteristics, as related to particular facility. Information is then used in interactive system to help develop manufacturing plans that meet required standards.
Ensuring respect for persons in COMPASS: a cluster randomised pragmatic clinical trial.
Andrews, Joseph E; Moore, J Brian; Weinberg, Richard B; Sissine, Mysha; Gesell, Sabina; Halladay, Jacquie; Rosamond, Wayne; Bushnell, Cheryl; Jones, Sara; Means, Paula; King, Nancy M P; Omoyeni, Diana; Duncan, Pamela W
2018-05-02
Cluster randomised clinical trials present unique challenges in meeting ethical obligations to those who are treated at a randomised site. Obtaining informed consent for research within the context of clinical care is one such challenge. In order to solve this problem, it is important that an informed consent process be effective and efficient, and that it does not impede the research or the healthcare. The innovative approach to informed consent employed in the COMPASS study demonstrates the feasibility of upholding ethical standards without imposing undue burden on clinical workflows, staff members or patients who may participate in the research by virtue of their presence in a cluster randomised facility. The COMPASS study included 40 randomised sites and compared the effectiveness of a postacute stroke intervention with standard care. Each site provided either the comprehensive postacute stroke intervention or standard care according to the randomisation assignment. Working together, the study team, institutional review board and members of the community designed an ethically appropriate and operationally reasonable consent process, which was carried out successfully at all randomised sites. This achievement is noteworthy because it demonstrates how to effectively conduct appropriate informed consent in cluster randomised trials, and because it provides a model that can easily be adapted for other pragmatic studies. With this innovative approach to informed consent, patients have access to the information they need about research occurring where they are seeking care, and medical researchers can conduct their studies without ethical concerns or unreasonable logistical impediments. NCT02588664, recruiting. This article covers the development of the consent process that is currently being employed in the study. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Sharko, Marianne; Wilcox, Lauren; Hong, Matthew K; Ancker, Jessica S
2018-05-17
Medical privacy policies, which are clear-cut for adults and young children, become ambiguous during adolescence. Yet medical organizations must establish unambiguous rules about patient and parental access to electronic patient portals. We conducted a national interview study to characterize the diversity in adolescent portal policies across a range of institutions and determine the factors influencing decisions about these policies. Within a sampling framework that ensured diversity of geography and medical organization type, we used purposive and snowball sampling to identify key informants. Semi-structured interviews were conducted and analyzed with inductive thematic analysis, followed by a member check. We interviewed informants from 25 medical organizations. Policies established different degrees of adolescent access (from none to partial to complete), access ages (from 10 to 18 years), degrees of parental access, and types of information considered sensitive. Federal and state law did not dominate policy decisions. Other factors in the decision process were: technology capabilities; differing patient population needs; resources; community expectations; balance between information access and privacy; balance between promoting autonomy and promoting family shared decision-making; and tension between teen privacy and parental preferences. Some informants believed that clearer standards would simplify policy-making; others worried that standards could restrict high-quality polices. In the absence of universally accepted standards, medical organizations typically undergo an arduous decision-making process to develop teen portal policies, weighing legal, economic, social, clinical, and technological factors. As a result, portal access policies are highly inconsistent across the United States and within individual states.
Electronic Data Interchange in Defense Transportation
1987-10-01
entry into a nearly paperless transportation environment. • Prescribe DoD's use of the EDI standards developed by the transportation industry and lead...information into a format for internal use so that it can be processed. • Key Entry Costs. Data will no longer need to be entered manually into a terminal or...that commercial standards cannot meet, DoD must create standards. A vehicle for creating those DoD-unique standards now exists. That vehicle, the
Nelson, Victoria; Nelson, Victoria Ruth; Li, Fiona; Green, Susan; Tamura, Tomoyoshi; Liu, Jun-Min; Class, Margaret
2008-11-06
The Walter Reed National Surgical Quality Improvement Program Data Transfer web module integrates with medical and surgical information systems, and leverages outside standards, such as the National Library of Medicine's RxNorm, to process surgical and risk assessment data. Key components of the project included a needs assessment with nurse reviewers and a data analysis for federated data sources (standards were locally controlled). The resulting interface streamlines nurse reviewer workflow by integrating related tasks and data.
Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B
2015-01-01
Objectives: To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Materials and methods: Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Results: Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations excluded were due to identified data quality issues in the source system; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. Discussion: The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of the protocol's inclusion criteria and identified differences in patient characteristics and coding practices across databases. Conclusion: Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. PMID:25670757
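As a rough illustration of the source-code-to-standard-vocabulary mapping step (the tiny table below stands in for the OMOP standard vocabulary; the codes and concept IDs are examples, not authoritative):

```python
# Hypothetical slice of a source-to-concept mapping table.
SOURCE_TO_CONCEPT = {
    ("ICD9CM", "250.00"): 201826,  # example standard concept_id
    ("ICD9CM", "401.9"):  320128,  # example standard concept_id
}

def map_condition(vocab: str, code: str) -> int | None:
    """Return the standard concept_id, or None if the source code is unmapped."""
    return SOURCE_TO_CONCEPT.get((vocab, code))

records = [("ICD9CM", "250.00"), ("ICD9CM", "401.9"), ("ICD9CM", "999.9")]
mapped = [r for r in records if map_condition(*r) is not None]
print(f"mapped {len(mapped)}/{len(records)} condition records")
```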
NASA Technical Reports Server (NTRS)
Lowman, Douglas S.; Withers, B. Edward; Shagnea, Anita M.; Dent, Leslie A.; Hayhurst, Kelly J.
1990-01-01
A variety of instructions to be used in the development of implementations of software for the Guidance and Control Software (GCS) project is described. This document fulfills the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, 'Software Considerations in Airborne Systems and Equipment Certification' requirements for document No. 4, which specifies the information necessary for understanding and programming the host computer, and document No. 12, which specifies the software design and implementation standards that are applicable to the software development and testing process. Information on the following subjects is contained: activity recording, communication protocol, coding standards, change management, error handling, design standards, problem reporting, module testing logs, documentation formats, accuracy requirements, and programmer responsibilities.
Yeh, Ching-Hua; Hartmann, Monika; Hirsch, Stefan
2018-06-01
The presentation of credence attributes, such as a product's origin or production method, has a significant influence on consumers' food purchase decisions. The dataset includes survey responses from a discrete choice experiment with 1309 food shoppers in Taiwan using the example of sweet pepper. The survey was carried out in 2014 in the three largest Taiwanese cities. It evaluates the impact that providing information on the equality of organic standards has on consumers' preferences. Equality of organic standards implies that, regardless of a product's country-of-origin (COO), organic certifications are based on the same production regulations and managerial processes. Respondents were randomly allocated to the information treatment or the control group. The dataset contains the product choices of participants in both groups, as well as their sociodemographic information.
77 FR 50112 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-20
... the UPD helps to protect the integrity of ACF's award selection process. All ACF discretionary grant... instructions; the Standard Form 424 series, which requests basic information, budget information, and... Planning, Research and Evaluation, 370 L'Enfant Promenade SW., Washington, DC 20447, Attn: ACF Reports...
The Role of Metadata Standards in EOSDIS Search and Retrieval Applications
NASA Technical Reports Server (NTRS)
Pfister, Robin
1999-01-01
Metadata standards play a critical role in data search and retrieval systems. Metadata tie software to data so the data can be processed, stored, searched, retrieved, and distributed; without metadata, these actions are not possible. Populating metadata to describe science data is an important service to the end-user community, so that a user who is unfamiliar with the data can easily find and learn about a particular dataset before making an order decision. Once a good set of standards is in place, the accuracy with which data search can be performed depends on the degree to which the metadata standards are adhered to during product definition. NASA's Earth Observing System Data and Information System (EOSDIS) provides examples of how metadata standards are used in data search and retrieval.
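As a minimal, self-contained illustration of the point that metadata make search possible at all, the sketch below filters a toy catalog on standardized metadata fields; the field names and records are invented for the example and are not drawn from the EOSDIS metadata model.

```python
# Minimal illustration of metadata-driven dataset search: records are
# discoverable only through their metadata fields. Field names here are
# invented for the example, not drawn from the EOSDIS metadata model.
CATALOG = [
    {"dataset": "SST_L3_MONTHLY", "instrument": "MODIS", "start": "2000-03"},
    {"dataset": "OZONE_PROFILE",  "instrument": "OMI",   "start": "2004-10"},
]

def search(**criteria):
    """Return records whose metadata match every given field exactly."""
    return [rec for rec in CATALOG
            if all(rec.get(k) == v for k, v in criteria.items())]

print(search(instrument="MODIS"))
```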
Convergence Toward Common Standards in Machine-Readable Cataloging
Gull, C. D.
1969-01-01
The adoption of the MARC II format for the communication of bibliographic information by the three National Libraries of the U.S.A. makes it possible for those libraries to converge on the remaining necessary common standards for machine-readable cataloging. Three levels of standards are identified: fundamental, the character set; intermediate, MARC II; and detailed, the codes for identifying data elements. The convergence on these standards implies that the National Libraries can create and operate a Joint Bibliographic Data Bank requiring standard book numbers and universal serial numbers for identifying monographs and serials and that the system will thoroughly process contributed catalog entries before adding them to the Data Bank. There is reason to hope that the use of the MARC II format will facilitate catalogers' decision processes. PMID:5782261
Business Model for the Security of a Large-Scale PACS, Compliance with ISO/27002:2013 Standard.
Gutiérrez-Martínez, Josefina; Núñez-Gaona, Marco Antonio; Aguirre-Meneses, Heriberto
2015-08-01
Data security is a critical issue in an organization; a proper information security management (ISM) is an ongoing process that seeks to build and maintain programs, policies, and controls for protecting information. A hospital is one of the most complex organizations, where patient information has not only legal and economic implications but, more importantly, an impact on the patient's health. Imaging studies include medical images, patient identification data, and proprietary information of the study; these data are contained in the storage device of a picture archiving and communication system (PACS). This system must preserve the confidentiality, integrity, and availability of patient information. There are techniques such as firewalls, encryption, and data encapsulation that contribute to the protection of information. In addition, the Digital Imaging and Communications in Medicine (DICOM) standard and the requirements of the Health Insurance Portability and Accountability Act (HIPAA) regulations are also used to protect the patient clinical data. However, these techniques are not systematically applied to the PACS in most cases and are not sufficient to ensure the integrity of the images and associated data during transmission. The ISO/IEC 27001:2013 standard has been developed to improve the ISM. Currently, health institutions lack effective ISM processes that enable reliable interorganizational activities. In this paper, we present a business model that accomplishes the controls of the ISO/IEC 27002:2013 standard and the security and privacy criteria from DICOM and HIPAA to improve the ISM of a large-scale PACS. The methodology associated with the model can monitor the flow of data in a PACS, facilitating the detection of unauthorized access to images and other abnormal activities.
Recommendations for Selecting Drug-Drug Interactions for Clinical Decision Support
Tilson, Hugh; Hines, Lisa E.; McEvoy, Gerald; Weinstein, David M.; Hansten, Philip D.; Matuszewski, Karl; le Comte, Marianne; Higby-Baker, Stefanie; Hanlon, Joseph T.; Pezzullo, Lynn; Vieson, Kathleen; Helwig, Amy L.; Huang, Shiew-Mei; Perre, Anthony; Bates, David W.; Poikonen, John; Wittie, Michael A.; Grizzle, Amy J.; Brown, Mary; Malone, Daniel C.
2016-01-01
Purpose To recommend principles for including drug-drug interactions (DDIs) in clinical decision support. Methods A conference series was conducted to improve clinical decision support (CDS) for DDIs. The Content Workgroup met monthly by webinar from January 2013 to February 2014, with two in-person meetings to reach consensus. The workgroup consisted of 20 experts in pharmacology, drug information, and CDS from academia, government agencies, health information technology (IT) vendors, and healthcare organizations. Workgroup members addressed four key questions: (1) What process should be used to develop and maintain a standard set of DDIs?; (2) What information should be included in a knowledgebase of standard DDIs?; (3) Can/should a list of contraindicated drug pairs be established?; and (4) How can DDI alerts be more intelligently filtered? Results To develop and maintain a standard set of DDIs for CDS in the United States, we recommend a transparent, systematic, and evidence-driven process with graded recommendations by a consensus panel of experts and oversight by a national organization. We outline key DDI information needed to help guide clinician decision-making. We recommend judicious classification of DDIs as contraindicated, as only a small set of drug combinations are truly contraindicated. Finally, we recommend more research to identify methods to safely reduce repetitive and less relevant alerts. Conclusion A systematic ongoing process is necessary to select DDIs for alerting clinicians. We anticipate that our recommendations can lead to consistent and clinically relevant content for interruptive DDIs, and thus reduce alert fatigue and improve patient safety. PMID:27045070
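Question 4 above concerns intelligent filtering of DDI alerts. A toy sketch of severity-threshold filtering against a knowledgebase follows; the drug names, pair classifications, and severity scale are hypothetical stand-ins, not the workgroup's content.

```python
# Toy sketch of severity-based DDI alert filtering: only interactions
# classified at or above a threshold interrupt the clinician. The drug
# pairs and severity labels are hypothetical, not from the workgroup.
SEVERITY_RANK = {"contraindicated": 3, "major": 2, "moderate": 1}

DDI_KNOWLEDGEBASE = {
    frozenset({"drugA", "drugB"}): "contraindicated",  # hypothetical
    frozenset({"drugA", "drugC"}): "moderate",         # hypothetical
}

def interruptive_alerts(med_list, threshold="major"):
    """Yield drug pairs whose severity meets the interrupt threshold."""
    meds = list(med_list)
    for i, d1 in enumerate(meds):
        for d2 in meds[i + 1:]:
            sev = DDI_KNOWLEDGEBASE.get(frozenset({d1, d2}))
            if sev and SEVERITY_RANK[sev] >= SEVERITY_RANK[threshold]:
                yield (d1, d2, sev)

print(list(interruptive_alerts(["drugA", "drugB", "drugC"])))
```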
Opposite Effects of Context on Immediate Structural and Lexical Processing.
ERIC Educational Resources Information Center
Harris, John W.
The testing of a number of hypotheses about the effect of hearing a prior context sentence on immediate processing of a subsequent target sentence is described. According to the standard deep structure model, higher level processing (e.g. semantic interpretation, integration of context-target information) does not occur immediately as speech is…
NASA Astrophysics Data System (ADS)
Maksimov, N. V.; Tikhomirov, G. V.; Golitsyna, O. L.
2017-01-01
The main problems and circumstances that influence the creation of effective knowledge management systems are described. These problems include, in particular, the great diversity of instruments for knowledge representation and the lack of adequate lingware, including formal representations of semantic relationships. To develop semantic data descriptions, a conceptual model of the subject area and a conceptual-lexical system should be designed following the proposals of the ISO 15926 standard. It is proposed to integrate information from educational and production processes on the basis of information systems technologies. The integrated knowledge management system's information environment combines traditional information resources with subject-domain-specific resources, including task context and implicit/tacit knowledge.
Electronic Photography at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Holm, Jack; Judge, Nancianne
1995-01-01
An electronic photography facility has been established in the Imaging & Photographic Technology Section, Visual Imaging Branch, at the NASA Langley Research Center (LaRC). The purpose of this facility is to provide the LaRC community with access to digital imaging technology. In particular, capabilities have been established for image scanning, direct image capture, optimized image processing for storage, image enhancement, and optimized device dependent image processing for output. Unique approaches include: evaluation and extraction of the entire film information content through scanning; standardization of image file tone reproduction characteristics for optimal bit utilization and viewing; education of digital imaging personnel on the effects of sampling and quantization to minimize image processing related information loss; investigation of the use of small kernel optimal filters for image restoration; characterization of a large array of output devices and development of image processing protocols for standardized output. Currently, the laboratory has a large collection of digital image files which contain essentially all the information present on the original films. These files are stored at 8-bits per color, but the initial image processing was done at higher bit depths and/or resolutions so that the full 8-bits are used in the stored files. The tone reproduction of these files has also been optimized so the available levels are distributed according to visual perceptibility. Look up tables are available which modify these files for standardized output on various devices, although color reproduction has been allowed to float to some extent to allow for full utilization of output device gamut.
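One of the capabilities described above, device-dependent output via look-up tables, can be illustrated in a few lines. The sketch below applies an 8-bit LUT to an image array with NumPy, using a simple gamma curve as a stand-in for the lab's measured device characterizations.

```python
# Minimal sketch of 8-bit look-up-table (LUT) output mapping, with a
# simple gamma curve standing in for a measured device characterization.
import numpy as np

# Build a 256-entry LUT; gamma 1/2.2 is an illustrative choice only.
levels = np.arange(256) / 255.0
lut = np.clip(255.0 * levels ** (1 / 2.2), 0, 255).astype(np.uint8)

image = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)  # stand-in scan
output = lut[image]  # indexing applies the LUT to every pixel at once
print(output)
```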
Informatics in clinical research in oncology: current state, challenges, and a future perspective.
Chahal, Amar P S
2011-01-01
The informatics landscape of clinical trials in oncology has changed significantly in the last 10 years. The current state of the infrastructure for clinical trial management, execution, and data management is reviewed. The systems, their functionality, the users, and the standards available to researchers are discussed from the perspective of the oncologist-researcher. Challenges in complexity and in the processing of information are outlined. These challenges include the lack of communication and information interchange between systems, the lack of simplified standards, and the lack of implementation of and adherence to the standards that are available. The National Cancer Institute's Common Terminology Criteria for Adverse Events (CTCAE) are cited as a successful standard in oncology, and HTTP on the Internet is referenced for its simplicity. Differences in the management of information standards between industries are discussed. Possible future advances in oncology clinical research informatics are addressed. These advances include strategic policy review of standards and the implementation of actions to make standards free, ubiquitous, simple, and easily interpretable; the need to change from a local data-capture- or transaction-driven model to a large-scale data-interpretation model that provides higher value to the oncologist and the patient; and the need for information technology investment in a readily available digital educational model for clinical research in oncology that is customizable for individual studies. These new approaches, with changes in information delivery to mobile platforms, will set the stage for the next decade in clinical research informatics.
XML — an opportunity for
NASA Astrophysics Data System (ADS)
Houlding, Simon W.
2001-08-01
Extensible markup language (XML) is a recently introduced meta-language standard on the Web. It provides the rules for development of metadata (markup) standards for information transfer in specific fields. XML allows development of markup languages that describe what information is rather than how it should be presented. This allows computer applications to process the information in intelligent ways. In contrast, hypertext markup language (HTML), which fuelled the initial growth of the Web, is a metadata standard concerned exclusively with presentation of information. Besides its potential for revolutionizing Web activities, XML provides an opportunity for development of meaningful data standards in specific application fields. The rapid endorsement of XML by science, industry and e-commerce has already spawned new metadata standards in such fields as mathematics, chemistry, astronomy, multi-media and Web micro-payments. Development of XML-based data standards in the geosciences would significantly reduce the effort currently wasted on manipulating and reformatting data between different computer platforms and applications and would ensure compatibility with the new generation of Web browsers. This paper explores the evolution, benefits and status of XML and related standards in the more general context of Web activities and uses this as a platform for discussion of its potential for development of data standards in the geosciences. Some of the advantages of XML are illustrated by a simple, browser-compatible demonstration of XML functionality applied to a borehole log dataset. The XML dataset and the associated stylesheet and schema declarations are available for FTP download.
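The paper's borehole-log demonstration is not reproduced in the abstract, so the sketch below uses a hypothetical minimal markup of its own design, parsed with Python's standard library, to illustrate the core claim: markup that describes what the information is lets a program process it directly.

```python
# Hypothetical minimal borehole-log markup (the paper's actual schema is
# not reproduced in the abstract), parsed with the standard library to
# show that self-describing XML is directly machine-processable.
import xml.etree.ElementTree as ET

BOREHOLE_XML = """
<boreholeLog id="BH-01">
  <interval topDepthM="0.0" baseDepthM="3.5" lithology="clay"/>
  <interval topDepthM="3.5" baseDepthM="9.2" lithology="sand"/>
</boreholeLog>
"""

root = ET.fromstring(BOREHOLE_XML)
for interval in root.findall("interval"):
    thickness = float(interval.get("baseDepthM")) - float(interval.get("topDepthM"))
    print(interval.get("lithology"), f"{thickness:.1f} m")
```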
NASA Astrophysics Data System (ADS)
Pesquer, Lluís; Jirka, Simon; van de Giesen, Nick; Masó, Joan; Stasch, Christoph; Van Nooyen, Ronald; Prat, Ester; Pons, Xavier
2015-04-01
This work describes the strategy of the European Horizon 2020 project WaterInnEU. Its vision is to enhance the exploitation of EU-funded ICT models, tools, protocols, and policy briefs related to the water sector and to establish suitable conditions for new market opportunities based on these offerings. The main goals are:
• Connect the research results and developments of previous EU-funded activities with the data already available at the European level, and with the companies that are able to offer products and services based on these tools and data.
• Offer an independent marketplace platform, complemented by technical and commercial expertise as a service, that allows users to access the products and services best fitting their priorities, capabilities, and procurement processes.
One of the pillars of WaterInnEU is to stimulate and prioritize the application of international standards in ICT tools and policy briefs. The standardization of formats, services, and processes will allow for harmonized water management across different sectors, fragmented areas, and scales (local, regional, or international). Several levels of interoperability will be addressed:
• Syntactic: connecting systems and tools together. Syntactic interoperability allows client and service tools to automatically discover, access, and process data and information (query and exchange parts of a database) and to connect to each other in process chains. The discovery of water-related data is achieved using metadata cataloguing standards, in particular the one adopted by the INSPIRE directive: the OGC Catalogue Service for the Web (CSW); see the sketch after this list.
• Semantic: sharing a pan-European conceptual framework. This is the ability of computer systems to exchange data with unambiguous, shared meaning. The project therefore addresses not only the packaging of data (syntax) but also the simultaneous transmission of its meaning (semantics). This is accomplished by linking each data element to a controlled, shared vocabulary. In Europe, INSPIRE defines a shared vocabulary and its associated links to an ontology; for hydrographical information this can be used as a baseline.
• Organizational: harmonizing policy aspects. This level of interoperability deals with the operational methodologies and procedures that organizations use to administrate their own data and processing capabilities and to share those capabilities with others. This layer is addressed by the adoption of common policy briefs that facilitate both robust protocols and the flexibility to interact with others.
• Data visualization: making data easy to see. The WMS and WMTS standards are the most commonly used geographic information visualization standards for sharing information in web portals. Our solution will incorporate a quality extension of these standards for visualizing data quality as nested layers linked to the different datasets.
In the presented approach, the use of standards is twofold: the tools and products should leverage standards wherever possible to ensure interoperability between solution providers, and the platform itself must utilize standards as much as possible, for example to allow integration with other systems through open APIs or the description of available items.
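As a concrete illustration of the syntactic level (catalogue discovery via OGC CSW, referenced in the first interoperability bullet above), the sketch below assembles a CSW 2.0.2 GetRecords request in key-value-pair encoding. The endpoint URL is hypothetical and the parameter names follow the CSW specification as commonly implemented; both should be checked against the target catalogue.

```python
# Sketch of an OGC CSW 2.0.2 GetRecords discovery request using
# key-value-pair encoding. The endpoint URL is hypothetical; parameter
# names follow the CSW specification as commonly implemented and should
# be checked against the target catalogue.
import urllib.parse

ENDPOINT = "https://catalogue.example.eu/csw"  # hypothetical service
params = {
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetRecords",
    "typeNames": "csw:Record",
    "resultType": "results",
    "elementSetName": "summary",
    "constraintLanguage": "CQL_TEXT",
    "constraint_language_version": "1.1.0",
    "constraint": "AnyText LIKE '%hydrography%'",
}
print(ENDPOINT + "?" + urllib.parse.urlencode(params))
```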
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spickett, Jeffery, E-mail: J.Spickett@curtin.edu.au; Faculty of Health Sciences, School of Public Health, Curtin University, Perth, Western Australia; Katscherian, Dianne
The approaches used for setting or reviewing air quality standards vary from country to country. The purpose of this research was to consider the potential to improve decision-making through integration of HIA into the processes used to review and set air quality standards in Australia. To assess the value of HIA in this policy process, its strengths and weaknesses were evaluated alongside a review of international processes for setting air quality standards. Air quality standard-setting programmes elsewhere have either used HIA or have amalgamated and incorporated factors normally found within HIA frameworks. They clearly demonstrate the value of a formalised HIA process for setting air quality standards in Australia. The following elements should be taken into consideration when using HIA in standard setting. (a) The adequacy of a mainly technical approach in current standard-setting procedures to consider social determinants of health. (b) The importance of risk assessment criteria and information within the HIA process. The assessment of risk should consider equity, the distribution of variations in air quality in different locations, and the potential impacts on health. (c) The uncertainties in extrapolating evidence from one population to another or to subpopulations, especially the more vulnerable, due to differing environmental factors and population variables. (d) The significance of communication with all potential stakeholders on issues associated with the management of air quality. In Australia there is also an opportunity for HIA to be used in conjunction with the NEPM to develop local air quality standard measures. The outcomes of this research indicated that the use of HIA for air quality standard setting at the national and local levels would prove advantageous. Highlights: • Health Impact Assessment framework has been applied to a policy development process. • HIA process was evaluated for application in air quality standard setting. • Advantages of HIA in the air quality standard setting process are demonstrated.
Protected interoperability of telecommunications and digital products
NASA Astrophysics Data System (ADS)
Hampel, Viktor E.; Cartier, Gene N.; Craft, James P.
1994-11-01
New federal standards for the protection of sensitive data now make it possible to ensure the authenticity, integrity and confidentiality of digital products, and non-repudiation of digital telecommunications. Under review and comment since 1991, the new Federal standards were confirmed this year and provide standard means for the protection of voice and data communications from accidental and wilful abuse. The standards are initially tailored to protect only `sensitive-but-unclassified' (SBU) data in compliance with the Computer Security Act of 1987. These data represent the majority of transactions in electronic commerce, including sensitive procurement information, trade secrets, financial data, product definitions, and company-proprietary information classified as `intellectual property.' Harmonization of the new standards with international requirements is in progress. In the United States, the confirmation of the basic standards marks the beginning of a long-range program to assure discretionary and mandatory access controls to digital resources. Upwards compatibility into the classified domain with multi-level security is a core requirement of the National Information Infrastructure. In this report we review the powerful capabilities of standard Public-Key-Cryptology, the availability of commercial and Federal products for data protection, and make recommendations for their cost-effective use to assure reliable telecommunications and process controls.
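To make the signature capabilities discussed above concrete (authenticity, integrity, non-repudiation), here is a sketch using DSA, the algorithm behind the era's Digital Signature Standard, via the third-party Python `cryptography` package. This is a modern illustration of the principle, not the federal products reviewed in the report.

```python
# Digital-signature sketch illustrating authenticity, integrity, and
# non-repudiation with DSA (the algorithm behind the era's Digital
# Signature Standard). Uses the third-party `cryptography` package as a
# modern stand-in; an illustration, not the systems reviewed above.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import dsa

private_key = dsa.generate_private_key(key_size=2048)
message = b"sensitive-but-unclassified procurement record"

signature = private_key.sign(message, hashes.SHA256())  # sender signs

try:  # any recipient holding the public key can verify
    private_key.public_key().verify(signature, message, hashes.SHA256())
    print("signature valid: message is authentic and unaltered")
except InvalidSignature:
    print("signature check failed: message or signature was tampered with")
```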
Building gold standard corpora for medical natural language processing tasks.
Deleger, Louise; Li, Qi; Lingren, Todd; Kaiser, Megan; Molnar, Katalin; Stoutenborough, Laura; Kouril, Michal; Marsolo, Keith; Solti, Imre
2012-01-01
We present the construction of three annotated corpora to serve as gold standards for medical natural language processing (NLP) tasks. Clinical notes from the medical record, clinical trial announcements, and FDA drug labels are annotated. We report high inter-annotator agreement (overall F-measures between 0.8467 and 0.9176) for the annotation of Personal Health Information (PHI) elements for a de-identification task and of medications, diseases/disorders, and signs/symptoms for an information extraction (IE) task. The annotated corpora of clinical trials and FDA labels will be publicly released, and, to facilitate translational NLP tasks that require cross-corpora interoperability (e.g., clinical trial eligibility screening), their annotation schemas are aligned with a large-scale, NIH-funded clinical text annotation project.
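The inter-annotator agreement figures quoted above are F-measures computed between annotator pairs. A minimal sketch of that computation, assuming exact-match scoring of (start, end, label) spans with one annotator treated as reference, follows; the spans shown are invented.

```python
# Minimal sketch of inter-annotator agreement as an F-measure: one
# annotator's spans are treated as reference, the other's as response,
# and exact (start, end, label) matches count as true positives.
def f_measure(annotator_a, annotator_b):
    a, b = set(annotator_a), set(annotator_b)
    tp = len(a & b)
    if tp == 0:
        return 0.0
    precision = tp / len(b)
    recall = tp / len(a)
    return 2 * precision * recall / (precision + recall)

spans_a = {(0, 7, "MEDICATION"), (12, 20, "DISEASE"), (25, 31, "SYMPTOM")}
spans_b = {(0, 7, "MEDICATION"), (12, 20, "DISEASE"), (40, 44, "SYMPTOM")}
print(f"F = {f_measure(spans_a, spans_b):.4f}")  # F = 0.6667
```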
LANDSAT 2 world standard catalog, 1-31 January 1979
NASA Technical Reports Server (NTRS)
1979-01-01
The World Standard Catalog lists imagery acquired by LANDSAT 2 which was processed and input to the data files during the referenced period. Information such as cloud cover and image quality is given for each scene. The microfilm roll and frame on which the scene may be found is also given.
LANDSAT 2 world standard catalog, 1-30 November 1978
NASA Technical Reports Server (NTRS)
1978-01-01
The World Standard Catalog lists imagery acquired by LANDSAT 2 which was processed and input to the data files during the referenced period. Information such as cloud cover and image quality is given for each scene. The microfilm roll and frame on which the scene may be found is also given.
LANDSAT 2 world standard catalog, 1-31 October 1978
NASA Technical Reports Server (NTRS)
1978-01-01
The World Standard Catalog lists imagery acquired by LANDSAT 2 which was processed and input to the data files during the referenced period. Information such as cloud cover and image quality is given for each scene. The microfilm roll and frame on which the scene may be found is also given.
Designing a Bridging Discourse: Re-Mediation of a Mathematical Learning Disability
ERIC Educational Resources Information Center
Lewis, Katherine E.
2017-01-01
Students with disabilities present a unique instructional design challenge. These students often have qualitatively different ways of processing information, meaning that standard instructional approaches may not be effective. In this study I present a case study of a student with a mathematical learning disability for whom standard instruction on…
E-Business Reporting: Towards a Global Standard for Financial Reporting Systems Using XBRL
ERIC Educational Resources Information Center
Long, Margaret J.
2013-01-01
Reporting systems can provide transparency into financial markets necessary for a sustainable, prosperous global economy. The most widely used global platform for exchanging electronic information about companies to regulatory bodies is XBRL. Standards for this platform are in the process of becoming legally harmonized, but not all countries are…
77 FR 21778 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-11
... the integrity of ACF's award selection process. All ACF discretionary grant programs are required to... Standard Form 424 series, which requests basic information, budget information, and assurances; the Project... Administration for Children and Families, Office of Planning, Research and Evaluation, 370 L'Enfant Promenade SW...
National Photonics Skills Standard for Technicians.
ERIC Educational Resources Information Center
Center for Occupational Research and Development, Inc., Waco, TX.
This document defines "photonics" as the generation, manipulation, transport, detection, and use of light information and energy whose quantum unit is the photon. The range of applications of photonics extends from energy generation to detection to communication and information processing. Photonics is at the heart of today's…
CrossTalk. The Journal of Defense Software Engineering. Volume 25, Number 3
2012-06-01
OMG) standard Business Process Modeling and Notation (BPMN) [6] graphical notation. I will address each of these: identify and document steps...to a value stream map using BPMN and textual process narratives. The resulting process narratives or process metadata include key information...objectives. Once the processes are identified, we can graphically document them, capturing the process using BPMN (see Figure 1). The BPMN models
Software Quality Assurance and Controls Standard
2010-04-27
Software Quality Assurance and Controls Standard. Sue Carroll, Principal Software Quality Analyst, SAS; John Walz, VP Technology and... What is in a Software Life Cycle (SLC) process? • What is in a SQA Process? • Where are SQA Controls? • What is the SQA standards history? • What is changing in SQA?
The Units Ontology: a tool for integrating units of measurement in science
Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert
2012-01-01
Units are basic scientific tools that render meaning to numerical data. Their standardization and formalization support the reporting, exchange, processing, reproducibility, and integration of quantitative measurements. Ontologies are means that facilitate the integration of data and knowledge, allowing interoperability and semantic information processing across diverse biomedical resources and domains. Here, we present the Units Ontology (UO), an ontology currently used in many scientific resources for the standardized description of units of measurement. PMID:23060432
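As a sketch of what "standardized description of units" looks like in practice, the snippet below attaches a UO term to a measurement using rdflib. The OBO-style IRI pattern is genuine, but the specific UO identifier and the measurement properties are hypothetical placeholders.

```python
# Sketch of annotating a measurement with a Units Ontology term using
# rdflib. The OBO-style IRI pattern is real, but the specific UO ID and
# the measurement properties below are hypothetical placeholders.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

OBO = Namespace("http://purl.obolibrary.org/obo/")
EX = Namespace("http://example.org/lab/")  # hypothetical namespace

g = Graph()
measurement = EX.measurement1
g.add((measurement, RDF.type, EX.MassMeasurement))
g.add((measurement, EX.hasValue, Literal("12.5", datatype=XSD.decimal)))
g.add((measurement, EX.hasUnit, OBO.UO_0000021))  # hypothetical UO ID

print(g.serialize(format="turtle"))
```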
Recommendations for selecting drug-drug interactions for clinical decision support.
Tilson, Hugh; Hines, Lisa E; McEvoy, Gerald; Weinstein, David M; Hansten, Philip D; Matuszewski, Karl; le Comte, Marianne; Higby-Baker, Stefanie; Hanlon, Joseph T; Pezzullo, Lynn; Vieson, Kathleen; Helwig, Amy L; Huang, Shiew-Mei; Perre, Anthony; Bates, David W; Poikonen, John; Wittie, Michael A; Grizzle, Amy J; Brown, Mary; Malone, Daniel C
2016-04-15
Recommendations for including drug-drug interactions (DDIs) in clinical decision support (CDS) are presented. A conference series was conducted to improve CDS for DDIs. A work group consisting of 20 experts in pharmacology, drug information, and CDS from academia, government agencies, health information vendors, and healthcare organizations was convened to address (1) the process to use for developing and maintaining a standard set of DDIs, (2) the information that should be included in a knowledge base of standard DDIs, (3) whether a list of contraindicated drug pairs can or should be established, and (4) how to more intelligently filter DDI alerts. We recommend a transparent, systematic, and evidence-driven process with graded recommendations by a consensus panel of experts and oversight by a national organization. We outline key DDI information needed to help guide clinician decision-making. We recommend judicious classification of DDIs as contraindicated and more research to identify methods to safely reduce repetitive and less-relevant alerts. An expert panel with a centralized organizer or convener should be established to develop and maintain a standard set of DDIs for CDS in the United States. The process should be evidence driven, transparent, and systematic, with feedback from multiple stakeholders for continuous improvement. The scope of the expert panel's work should be carefully managed to ensure that the process is sustainable. Support for research to improve DDI alerting in the future is also needed. Adoption of these steps may lead to consistent and clinically relevant content for interruptive DDIs, thus reducing alert fatigue and improving patient safety. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Chou, Ann F; Yano, Elizabeth M; McCoy, Kimberly D; Willis, Deanna R; Doebbeling, Bradley N
2008-01-01
To address increases in the incidence of infection with antimicrobial-resistant pathogens, the National Foundation for Infectious Diseases and Centers for Disease Control and Prevention proposed two sets of strategies to (a) optimize antibiotic use and (b) prevent the spread of antimicrobial resistance and control transmission. However, little is known about the implementation of these strategies. Our objective is to explore organizational structural and process factors that facilitate the implementation of National Foundation for Infectious Diseases/Centers for Disease Control and Prevention strategies in U.S. hospitals. We surveyed 448 infection control professionals from a national sample of hospitals. Clinically anchored in the Donabedian model that defines quality in terms of structural and process factors, with the structural domain further informed by a contingency approach, we modeled the degree to which National Foundation for Infectious Diseases and Centers for Disease Control and Prevention strategies were implemented as a function of formalization and standardization of protocols, centralization of decision-making hierarchy, information technology capabilities, culture, communication mechanisms, and interdepartmental coordination, controlling for hospital characteristics. Formalization, standardization, centralization, institutional culture, provider-management communication, and information technology use were associated with optimal antibiotic use and enhanced implementation of strategies that prevent and control antimicrobial resistance spread (all p < .001). However, interdepartmental coordination for patient care was inversely related with antibiotic use in contrast to antimicrobial resistance spread prevention and control (p < .0001). Formalization and standardization may eliminate staff role conflict, whereas centralized authority may minimize ambiguity. Culture and communication likely promote internal trust, whereas information technology use helps integrate and support these organizational processes. These findings suggest concrete strategies for evaluating current capabilities to implement effective practices and foster and sustain a culture of patient safety.
Particle Pollution Designations
This area provides information on the process EPA, the states, and the tribes follow to designate areas as attainment (meeting) or nonattainment (not meeting) the particle pollution air quality standards.
Sulfur Dioxide Designations
This area provides information on the process EPA, the states, and the tribes follow to designate areas as attainment (meeting) or nonattainment (not meeting) the sulfur dioxide air quality standards.
NOx SOx Secondary NAAQS: Integrated Review Plan - CASAC Advisory
The NOx SOx Secondary NAAQS Integrated Review Plan is the first document generated as part of the National Ambient Air Quality Standards (NAAQS) review process. The Plan presents background information, the schedule for the review, the process to be used in conducting the review,...
Standards to support information systems integration in anatomic pathology.
Daniel, Christel; García Rojo, Marcial; Bourquard, Karima; Henin, Dominique; Schrader, Thomas; Della Mea, Vincenzo; Gilbertson, John; Beckwith, Bruce A
2009-11-01
Integrating anatomic pathology information (text and images) into electronic health care records is a key challenge for enhancing clinical information exchange between anatomic pathologists and clinicians. The aim of the Integrating the Healthcare Enterprise (IHE) international initiative is precisely to ensure interoperability of clinical information systems by using existing widespread industry standards such as Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7). The objective was to define standards-based informatics transactions to integrate anatomic pathology information into the Healthcare Enterprise. We used the methodology of the IHE initiative. Working groups from IHE, HL7, and DICOM, with special interest in anatomic pathology, defined consensual technical solutions to provide end-users with improved access to consistent information across multiple information systems. The IHE anatomic pathology technical framework describes a first integration profile, "Anatomic Pathology Workflow," dedicated to the diagnostic process, including basic image acquisition and reporting solutions. This integration profile relies on 10 transactions based on HL7 or DICOM standards. A common specimen model was defined to consistently identify and describe specimens in both HL7 and DICOM transactions. The IHE anatomic pathology working group has defined standards-based informatics transactions to support the basic diagnostic workflow in anatomic pathology laboratories. In further stages, the technical framework will be completed to manage whole-slide images and semantically rich structured reports in the diagnostic workflow, and to integrate systems used for patient care with those used for research activities (such as tissue bank databases or tissue microarrayers).
Sirvent, Mariola; Victoria Calvo, María; Sagalés, María; Rodríguez-Penin, Isaura; Cervera, Mercedes; Piñeiro, Guadalupe; García-Rodicio, Sonsoles; Gomis, Pilar; Caba, Isabel; Vazquez, Amparo; Gomez, María E; Pedraza, Luis
2013-01-01
To identify and develop monitoring indicators for the process of specialized nutritional support that allow measuring the level of adherence to the established practice standards. Practice standards considered key elements of the process were selected for the development of performance indicators. The construction of these indicators combined scientific evidence with expert opinion. Key goals were identified within each standard such that their achievement would increase fulfillment of the standard. Specific improvement initiatives associated with each key goal were generated. Lastly, monitoring indicators were defined to follow up the implementation of the improvement initiatives and to assess the level of achievement of the identified key goals. Nineteen practice standards were selected as representative of the critical points of the process. A strategic map for each standard was defined, identifying 43 key goals. To achieve these key goals, a portfolio of 56 improvement actions was generated. Finally, 44 monitoring indicators were defined, grouped into three categories: 1. Numeric: assessing the level of goal achievement; 2. Dichotomous (yes/no): reporting on the execution of the improvement actions; 3. Results of the practice audits. These monitoring indicators allow assessing the level of adherence to the practice standards of the specialized nutritional support process and the impact of implementing improvement actions within the process. Copyright © 2013 SEFH. Published by AULA MEDICA. All rights reserved.
Automatic processing of pragmatic information in the human brain: a mismatch negativity study.
Zhao, Ming; Liu, Tao; Chen, Feiyan
2018-05-23
Language comprehension involves pragmatic information processing, which allows world knowledge to influence the interpretation of a sentence. This study explored whether pragmatic information can be processed automatically during spoken sentence comprehension. The experiment adopted the mismatch negativity (MMN) paradigm to capture neurophysiological indicators of the automatic processing of spoken sentences. Pragmatically incorrect ('Foxes have wings') and correct ('Butterflies have wings') sentences were used as the experimental stimuli. In condition 1, the pragmatically correct sentence was the deviant and the pragmatically incorrect sentence was the standard stimulus; the assignment was reversed in condition 2. Compared with the condition in which the pragmatically correct sentence was the deviant stimulus, the condition in which the pragmatically incorrect sentence was the deviant induced MMN effects within 60-120 and 220-260 ms. The results indicate that the human brain can monitor for incorrect pragmatic information in the inattentive state and can automatically process pragmatic information at the beginning of spoken sentence comprehension.
Semantic Technologies and Bio-Ontologies.
Gutierrez, Fernando
2017-01-01
As the information available through data repositories constantly grows, the need for automated mechanisms for linking, querying, and sharing data has become a relevant factor in both research and industry. This situation is most evident in research fields such as the life sciences, where new experiments by different research groups constantly generate new information on a wide variety of related study objects. However, current methods for representing information and knowledge are not suited to machine processing. The Semantic Technologies are a set of standards and protocols that provide methods for representing and handling data that encourage reusability of information and are machine-readable. In this chapter, we provide a brief introduction to the Semantic Technologies and describe how these protocols and standards have been incorporated into the life sciences to facilitate the dissemination of and access to information.
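A minimal sketch of the machine-readability the chapter describes: parse a few RDF statements (Turtle) and query them with SPARQL using the Python rdflib package. The gene and predicate names are invented for illustration.

```python
# Minimal sketch of machine-readable data with Semantic Web standards:
# parse RDF (Turtle) and query it with SPARQL via rdflib. The gene and
# predicate names are invented for illustration.
from rdflib import Graph

TURTLE = """
@prefix ex: <http://example.org/bio/> .
ex:geneA ex:associatedWith ex:disease1 .
ex:geneB ex:associatedWith ex:disease1 .
"""

g = Graph()
g.parse(data=TURTLE, format="turtle")

query = """
PREFIX ex: <http://example.org/bio/>
SELECT ?gene WHERE { ?gene ex:associatedWith ex:disease1 . }
"""
for row in g.query(query):
    print(row.gene)
```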
Treatment BMP technology report.
DOT National Transportation Integrated Search
2006-04-01
The Treatment BMP Technology Report consolidates and standardizes information on storm water quality technologies that are part of the California Department of Transportation's (Department's) BMP identification and evaluation process describ...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-22
... information that is used to assess inherent risks and internal control processes. Such activities include... management and information systems; and internal controls. The financial condition rating is supported by... appropriate standards of capitalization, liquidity, and risk management consistent with the principles of...
75 FR 82406 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-30
... interstate forms. 45 CFR 303.7 also requires CSE IV-D agencies to transmit child support case information on standard interstate forms when referring cases to other States and Territories for processing. During the... instructions have been clarified by highlighting policy information that was included with the instructions so...
Systems Management of Air Force Standard Communications-Computer systems: There is a Better Way
1988-04-01
upgrade or replacement of systems. AFR 700-6, Information Systems Operation Management; AFR 700-7, Information Processing Center Operations Management; ...and AFR 700-8, Telephone Systems Operation Management provide USAF guidance, policy, and procedures governing this phase. 2. 800-Series Regulations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-28
... information as part of the research needed to write a NIST Special Publication (SP) to help Computer Security.... The NIST SP will identify technical standards, methodologies, procedures, and processes that facilitate prompt and effective response. This RFI requests information regarding technical best practices...
28 CFR 51.28 - Supplemental contents.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Information Processing Standards (FIPS-55) code. (v) Census tracts shall be left justified, and census blocks shall be left justified and blank filled if less than four characters. (vi) Unused plan fields shall be... affected, containing the following information: (1) The prior and new boundaries of the voting unit or...
A randomized trial comparing concise and standard consent forms in the START trial
Touloumi, Giota; Walker, A. Sarah; Smolskis, Mary; Sharma, Shweta; Babiker, Abdel G.; Pantazis, Nikos; Tavel, Jorge; Florence, Eric; Sanchez, Adriana; Hudson, Fleur; Papadopoulos, Antonios; Emanuel, Ezekiel; Clewett, Megan; Munroe, David; Denning, Eileen
2017-01-01
Background Improving the effectiveness and efficiency of research informed consent is a high priority. Some express concern about longer, more complex, written consent forms creating barriers to participant understanding. A recent meta-analysis concluded that randomized comparisons were needed. Methods We conducted a cluster-randomized non-inferiority comparison of a standard versus concise consent form within a multinational trial studying the timing of starting antiretroviral therapy in HIV+ adults (START). Interested sites were randomized to standard or concise consent forms for all individuals signing START consent. Participants completed a survey measuring comprehension of study information and satisfaction with the consent process. Site personnel reported usual site consent practices. The primary outcome was comprehension of the purpose of randomization (pre-specified 7.5% non-inferiority margin). Results 77 sites (2429 participants) were randomly allocated to use standard consent and 77 sites (2000 participants) concise consent, for an evaluable cohort of 4229. Site and participant characteristics were similar for the two groups. The concise consent was non-inferior to the standard consent on comprehension of randomization (80.2% versus 82%, site adjusted difference: 0.75% (95% CI -3.8%, +5.2%)); and the two groups did not differ significantly on total comprehension score, satisfaction, or voluntariness (p>0.1). Certain independent factors, such as education, influenced comprehension and satisfaction but not differences between consent groups. Conclusions An easier to read, more concise consent form neither hindered nor improved comprehension of study information nor satisfaction with the consent process among a large number of participants. This supports continued efforts to make consent forms more efficient. Trial registration Informed consent substudy was registered as part of START study in clinicaltrials.gov #NCT00867048, and EudraCT # 2008-006439-12 PMID:28445471
Do surgeons and patients discuss what they document on consent forms?
Hall, Daniel E; Hanusa, Barbara H; Fine, Michael J; Arnold, Robert M
2015-07-01
Previous studies of surgeon behavior report that surgeons rarely meet basic standards of informed consent, raising concerns that current practice requires urgent remediation. We wondered if the Veterans Affairs Healthcare System's recent implementation of standardized, procedure-specific consent forms might produce a better practice of informed consent than has been reported previously. Our goal was to determine how the discussions shared between surgeons and patients correspond to the VA's standardized consent forms. We enrolled a prospective cohort of patients presenting for possible cholecystectomy or inguinal herniorrhaphy and the surgical providers for those patients. Audio recordings captured the clinical encounter(s) culminating in a decision to have surgery. Each patient's informed consent was documented using a standardized, computer-generated form. We abstracted and compared the information documented with the information discussed. Of 75 consecutively enrolled patients, 37 eventually decided to have surgery and signed the standardized consent form. Patients and providers discussed 37% (95% confidence interval, 0.07-0.67) and 33% (95% confidence interval, 0.21-0.43) of the information found on the cholecystectomy and herniorrhaphy consent forms, respectively. However, the patient-provider discussions frequently included relevant details nowhere documented on the standardized forms, culminating in discussions that included a median 27.5 information items for cholecystectomy and 20 items for herniorrhaphy. Fully, 80% of cholecystectomy discussions and 76% of herniorrhaphy discussions mentioned at least one risk, benefit or alternative, indication for, and description of the procedure. The patients and providers observed here collaborated in a detailed process of informed consent that challenges the initial reports suggesting the need to remediate surgeon's practice of informed consent. However, because the discrepancy between the information documented and discussed exposes legal and ethical liability, there is an opportunity to improve the iMed system so that it better reflects what surgeons discuss and more frequently includes all the information patients need. Published by Elsevier Inc.
From IHE Audit Trails to XES Event Logs Facilitating Process Mining.
Paster, Ferdinand; Helm, Emmanuel
2015-01-01
Recently, business intelligence approaches such as process mining have been applied to the healthcare domain. The goal of process mining is to gain knowledge about processes, their compliance, and room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails makes these methods applicable to all IHE-based information systems.
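A sketch of the transformation the paper describes, under simplifying assumptions: the ATNA audit messages are reduced to a few pre-parsed fields, and events are grouped into a single XES trace. The XES attribute keys "concept:name" and "time:timestamp" are the standard XES extension keys; the audit record structure here is illustrative only.

```python
# Sketch of turning simplified ATNA-style audit records into an XES
# event log with the standard library. The audit fields are reduced to
# a minimum for illustration; the XES keys "concept:name" and
# "time:timestamp" are the standard XES extension attributes.
import xml.etree.ElementTree as ET

audit_records = [  # simplified stand-ins for parsed ATNA audit messages
    {"case": "patient-42", "event": "PatientRecordRead", "time": "2015-01-01T08:00:00"},
    {"case": "patient-42", "event": "ImageRetrieved",    "time": "2015-01-01T08:05:00"},
]

log = ET.Element("log")
trace = ET.SubElement(log, "trace")
ET.SubElement(trace, "string", key="concept:name", value="patient-42")
for rec in audit_records:
    event = ET.SubElement(trace, "event")
    ET.SubElement(event, "string", key="concept:name", value=rec["event"])
    ET.SubElement(event, "date", key="time:timestamp", value=rec["time"])

print(ET.tostring(log, encoding="unicode"))
```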
Proposed Computer System for Library Catalog Maintenance. Part II: System Design.
ERIC Educational Resources Information Center
Stein (Theodore) Co., New York, NY.
The logic of the system presented in this report is divided into six parts for computer processing and manipulation. They are: (1) processing of Library of Congress copy, (2) editing of input into standard format, (3) processing of information into and out from the authority files, (4) creation of the catalog records, (5) production of the…
Voss, Erica A; Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B
2015-05-01
To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Transformation to the CDM resulted in minimal information loss across all 6 databases. The patients and observations excluded were due to identified data quality issues in the source systems; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of inclusion criteria when applying the protocol and identified differences in patient characteristics and coding practices across databases. Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
Barriers to success: physical separation optimizes event-file retrieval in shared workspaces.
Klempova, Bibiana; Liepelt, Roman
2017-07-08
Sharing tasks with other persons can simplify our work and life, but seeing and hearing other people's actions may also be very distracting. The joint Simon effect (JSE) is a standard measure of referential response coding when two persons share a Simon task. Sequential modulations of the joint Simon effect (smJSE) are interpreted as a measure of event-file processing, containing stimulus information, response information, and information about the currently relevant control state in a given social situation. This study tested the effects of physical (Experiment 1) and virtual (Experiment 2) separation of shared workspaces on referential coding and event-file processing using a joint Simon task. In Experiment 1, participants performed this task in individual (go/no-go), joint, and standard Simon task conditions with and without a transparent curtain (physical separation) placed along the imagined vertical midline of the monitor. In Experiment 2, participants performed the same tasks with and without background music (virtual separation). For response times, physical separation enhanced event-file retrieval, indicated by an enlarged smJSE in the joint Simon task with the curtain than without it (Experiment 1), but did not change referential response coding. In line with this, we also found evidence for enhanced event-file processing through physical separation in the joint Simon task for error rates. Virtual separation affected neither event-file processing nor referential coding, but generally slowed down response times in the joint Simon task. For errors, virtual separation hampered event-file processing in the joint Simon task. For the cognitively more demanding standard two-choice Simon task, we found music to have a degrading effect on event-file retrieval for response times. Our findings suggest that adding a physical separation optimizes event-file processing in shared workspaces, while music seems to lead to a more relaxed task-processing mode under shared task conditions. In addition, music had an interfering impact on joint error processing and, more generally, when dealing with a more complex task in isolation.
40 CFR 63.1306 - Reporting requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Standards for Hazardous Air Pollutants for Flexible Polyurethane Foam Production § 63.1306 Reporting... the information listed in paragraph (d)(4) of this section for molded foam processes and in paragraph (d)(5) for rebond foam processes. (1) A list of diisocyanate storage vessels, along with a record of...
Satisfaction Formation Processes in Library Users: Understanding Multisource Effects
ERIC Educational Resources Information Center
Shi, Xi; Holahan, Patricia J.; Jurkat, M. Peter
2004-01-01
This study explores whether disconfirmation theory can explain satisfaction formation processes in library users. Both library users' needs and expectations are investigated as disconfirmation standards. Overall library user satisfaction is predicted to be a function of two independent sources--satisfaction with the information product received…
Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study
2007-06-01
Modeling Notation (BPMN) • Business Process Definition Metamodel (BPDM). A Business Process (BP) is a defined sequence of steps to be executed in...enterprise applications, to evaluate the capabilities of suppliers, and to compare against the competition. BPMN standardizes flowchart diagrams that
The SGML Standardization Framework and the Introduction of XML
Grütter, Rolf
2000-01-01
Extensible Markup Language (XML) is on its way to becoming a global standard for the representation, exchange, and presentation of information on the World Wide Web (WWW). More than that, XML is creating a standardization framework, in terms of an open network of meta-standards and mediators that allows for the definition of further conventions and agreements in specific business domains. Such an approach is particularly needed in the healthcare domain; XML promises to especially suit the particularities of patient records and their lifelong storage, retrieval, and exchange. At a time when change rather than steadiness is becoming the faithful feature of our society, standardization frameworks which support a diversified growth of specifications that are appropriate to the actual needs of the users are becoming more and more important; and efforts should be made to encourage this new attempt at standardization to grow in a fruitful direction. Thus, the introduction of XML reflects a standardization process which is neither exclusively based on an acknowledged standardization authority, nor a pure market standard. Instead, a consortium of companies, academic institutions, and public bodies has agreed on a common recommendation based on an existing standardization framework. The consortium's process of agreeing to a standardization framework will doubtlessly be successful in the case of XML, and it is suggested that it should be considered as a generic model for standardization processes in the future. PMID:11720931
The SGML standardization framework and the introduction of XML.
Fierz, W; Grütter, R
2000-01-01
Extensible Markup Language (XML) is on its way to becoming a global standard for the representation, exchange, and presentation of information on the World Wide Web (WWW). More than that, XML is creating a standardization framework, in terms of an open network of meta-standards and mediators that allows for the definition of further conventions and agreements in specific business domains. Such an approach is particularly needed in the healthcare domain; XML promises to especially suit the particularities of patient records and their lifelong storage, retrieval, and exchange. At a time when change rather than steadiness is becoming the faithful feature of our society, standardization frameworks which support a diversified growth of specifications that are appropriate to the actual needs of the users are becoming more and more important; and efforts should be made to encourage this new attempt at standardization to grow in a fruitful direction. Thus, the introduction of XML reflects a standardization process which is neither exclusively based on an acknowledged standardization authority, nor a pure market standard. Instead, a consortium of companies, academic institutions, and public bodies has agreed on a common recommendation based on an existing standardization framework. The consortium's process of agreeing to a standardization framework will doubtlessly be successful in the case of XML, and it is suggested that it should be considered as a generic model for standardization processes in the future.
[Overview of the US policies for health information technology and lessons learned for Israel].
Topaz, Maxim; Ash, Nachman
2013-05-01
The healthcare system in the United States (U.S.) faces a number of significant changes aimed at improving the quality and availability of medical services and reducing costs. Implementation of health information technologies, especially Electronic Health Records (EHR), is central to achieving these goals. Several recent legislative efforts in the U.S. aim at defining standards and promoting wide-scale "Meaningful Use" of the novel technologies. In Israel, the majority of healthcare providers adopted EHR throughout the last decade. Unlike the U.S., the process of EHR adoption occurred spontaneously, without governmental control or the definition of standards. In this article, we review the U.S. health information technology policies and standards and suggest potential lessons learned for Israel. First, we present the three-staged Meaningful Use regulations that require eligible healthcare practitioners to use EHR in their practice. We also describe the standards for EHR certification and national efforts to create interoperable health information technology networks. Finally, we provide a brief overview of the Israeli regulation in the field of EHR. Although the adoption of health information technology is wider in Israel, the lack of technology standards and governmental control has led to large technology gaps between providers. The example of the U.S. legislation urges the adoption of several critical steps to further enhance the quality and efficiency of the Israeli healthcare system, in particular: strengthening health information technology regulation; developing licensure criteria for health information technology; bridging the digital gap between healthcare organizations; defining quality measures; and improving the accessibility of health information for patients.
NASA Technical Reports Server (NTRS)
Jester, Peggy L.; Lee, Jeffrey; Zukor, Dorothy J. (Technical Monitor)
2001-01-01
This document addresses the software requirements of the Geoscience Laser Altimeter System (GLAS) Standard Data Software (SDS) supporting the GLAS instrument on the EOS ICESat Spacecraft. This Software Requirements Document represents the initial collection of the technical engineering information for the GLAS SDS. This information is detailed within the second of four main volumes of the Standard documentation, the Product Specification volume. This document is a "roll-out" from the governing volume outline containing the Concept and Requirements sections.
Safe patient care - safety culture and risk management in otorhinolaryngology.
St Pierre, Michael
2013-12-13
Safety culture is positioned at the heart of an organization's vulnerability to error because of its role in framing organizational awareness to risk and in providing and sustaining effective strategies of risk management. Safety-related attitudes of leadership and management play a crucial role in the development of a mature safety culture ("top-down process"). A type marker for organizational culture and thus a predictor for an organization's maturity in respect to safety is information flow and in particular an organization's general way of coping with information that suggests anomaly. As all values and beliefs, relationships, learning, and other aspects of organizational safety culture are about sharing and processing information, safety culture has been termed "informed culture". An informed culture is free of blame and open for information provided by incidents. "Incident reporting systems" are the backbone of a reporting culture, where good information flow is likely to support and encourage other kinds of cooperative behavior, such as problem solving, innovation, and inter-departmental bridging. Another facet of an informed culture is the free flow of information during perioperative patient care. The World Health Organization's "safe surgery checklist" is the most prevalent example of a standardized information exchange aimed at preventing patient harm due to information deficit. In routine tasks mandatory standard operating procedures have gained widespread acceptance in guaranteeing the highest possible process quality. Technical and non-technical skills of healthcare professionals are the decisive human resource for an efficient and safe delivery of patient care and the avoidance of errors. The systematic enhancement of staff qualification by providing training opportunities can be a major investment in patient safety. In recent years several otorhinolaryngology departments have started to incorporate simulation-based team training into their curriculum.
The standard of care: a case report and ethical analysis.
La Puma, J; Schiedermayer, D L; Toulmin, S; Miles, S H; McAtee, J A
1988-01-01
Physicians increasingly allow their perceived legal responsibilities to displace their clinical judgment. Misunderstandings that surround the term "standard of care" have encouraged fears of liability and have led to the practice of defensive medicine. Physicians may consider the standard of care to be a technical or legal obligation, but an optimal standard would be one based on detailed knowledge of a patient's medical history and personal condition. It would include the physician's clinical judgment, which integrates specific technical and legal information with clinical experience in caring for patients. Occasionally, such judgment may conflict with the rulings of a court, which considers technical and legal information without the benefit of clinical judgment. Physicians must be prepared to be advocates for their patients, especially when legal proceedings are flawed or injurious. Systematic processes of examination and analysis, such as those used by ethics consultants, can help resolve questions about the standard of care.
Wirshing, Donna A; Sergi, Mark J; Mintz, Jim
2005-01-01
This study evaluated a brief educational video designed to enhance the informed consent process for people with serious mental and medical illnesses who are considering participating in treatment research. Individuals with schizophrenia who were being recruited for ongoing clinical trials, medical patients without self-reported psychiatric comorbidity, and university undergraduates were randomly assigned to view either a highly structured instructional videotape about the consent process in treatment research or a control videotape that presented only general information about bioethical issues in human research. Knowledge about informed consent was measured before and after viewing. Viewing the experimental videotape resulted in larger gains in knowledge about informed consent. Standardized effect sizes were large in all groups. The videotape was thus an effective teaching tool across diverse populations, ranging from individuals with severe chronic mental illness to university undergraduates.
Laser ablation ICP-MS applications using the timescales of geologic and biologic processes
NASA Astrophysics Data System (ADS)
Ridley, W. I.
2003-04-01
Geochemists commonly examine geologic processes on timescales of 10^4 to 10^9 years, and accept that often age relations, e.g., chemical zoning in minerals, can only be measured in a relative sense. The progression of a geologic process that involves geochemical changes may be assessed using trace element microbeam techniques, because the textural, and therefore spatial context, of the analytical scheme can be preserved. However, quantification requires appropriate calibration standards. Laser ablation ICP-MS (LA-ICP-MS) is proving particularly useful now that appropriate standards are becoming available. For instance, trace element zoning patterns in primary sulfides (e.g., pyrite, sphalerite, chalcopyrite, galena) and secondary phases can be inverted to examine relative changes in fluid composition during cycles of hydrothermal mineralization. In turn such information provides insights into fluid sources, migration pathways and depositional processes. These studies have only become possible with the development of appropriate sulfide calibration standards. Another example, made possible with the development of appropriate silicate calibration standards, is the quantitative spatial mapping of REE variations in amphibolite-grade garnets. The recognition that the trace and major elements are decoupled provides a better understanding of the various sources of elements during metamorphic re-equilibration. There is also a growing realization that LA-ICP-MS has potential in biochemical studies, and geochemists have begun to turn their attention in this direction, working closely with biologists. Unlike many geologic processes, the timescales of biologic processes are measured in years to centuries and are frequently amenable to absolute dating. Examples that can be cited where LA-ICP-MS has been applied include annual trace metal variations in tree rings, corals, teeth, bones, bird feathers and various animal vibrissae (sea lion, walrus, wolf). The aim of such studies is to correlate trace element variations with changes in environmental variables. Such studies are proving informative in climate change and habitat management. Again, such variations have been quantified with the availability of appropriate organic, carbonate and phosphate calibration standards.
Spatial Information Processing: Standards-Based Open Source Visualization Technology
NASA Astrophysics Data System (ADS)
Hogan, P.
2009-12-01
Spatial information intelligence is a global issue that will increasingly affect our ability to survive as a species. Collectively we must better appreciate the complex relationships that make life on Earth possible. Providing spatial information in its native context can accelerate our ability to process that information. To maximize this ability to process information, three basic elements are required: data delivery (server technology), data access (client technology), and data processing (information intelligence). NASA World Wind provides open source client and server technologies based on open standards. The possibilities for data processing and data sharing are enhanced by this inclusive infrastructure for geographic information. It is interesting that this open source and open standards approach, unfettered by proprietary constraints, simultaneously provides for entirely proprietary use of this same technology. 1. WHY WORLD WIND? NASA World Wind began as a single program with specific functionality, to deliver NASA content. But as the possibilities for virtual globe technology became more apparent, we found that while enabling a new class of information technology, we were also getting in the way. Researchers, developers and even users expressed their desire for World Wind functionality in ways that would service their specific needs. They want it in their web pages. They want to add their own features. They want to manage their own data. They told us that only with this kind of flexibility could their objectives and the potential for this technology be truly realized. World Wind client technology is a set of development tools, a software development kit (SDK) that allows a software engineer to create applications requiring geographic visualization technology. 2. MODULAR COMPONENTRY Accelerated evolution of a technology requires that the essential elements of that technology be modular components such that each can advance independent of the other elements. World Wind therefore changed its mission from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating a single program, World Wind is a suite of components that can be selectively used in any number of programs. World Wind technology can be a part of any application, or it can be a window in a web page. Or it can be extended with additional functionalities by application and web developers. World Wind makes it possible to include virtual globe visualization and server technology in support of any objective. The world community can continually benefit from advances made in the technology by NASA in concert with the world community. 3. OPEN SOURCE AND OPEN STANDARDS NASA World Wind is NASA Open Source software. This means that the source code is fully accessible for anyone to freely use, even in association with proprietary technology. Imagery and other data provided by the World Wind servers reside in the public domain, including the data server technology itself. This allows others to deliver their own geospatial data and to provide custom solutions based on users' specific needs.
Toward a Qualitative Analysis of Standardized Tests Using an Information Processing Model.
ERIC Educational Resources Information Center
Armour-Thomas, Eleanor
The use of standardized tests and test data to detect and address differences in cognitive styles is advocated here. To this end, the paper describes the componential theory of intelligence addressed by Sternberg et al. This theory defines the components of intelligence by function and level of generality, including: (1) metacomponents: higher…
Coordination and standardization of federal sedimentation activities
Glysson, G. Douglas; Gray, John R.
1997-01-01
- precipitation information critical to water resources management. Memorandum M-92-01 covers primarily freshwater bodies and includes activities, such as "development and distribution of consensus standards, field-data collection and laboratory analytical methods, data processing and interpretation, data-base management, quality control and quality assurance, and water- resources appraisals, assessments, and investigations." Research activities are not included.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-05
... its virtual contact interface be made mandatory as soon as possible for the many beneficial features... messaging and the virtual contact interface in the Standard, some Federal departments and agencies have... Laboratory Programs. [FR Doc. 2013-21491 Filed 9-4-13; 8:45 am] BILLING CODE 3510-13-P ...
LANDSAT 3 world standard catalog, 6 March - 31 July 1978
NASA Technical Reports Server (NTRS)
1978-01-01
The World Standard Catalog lists imagery acquired by LANDSAT 3 which was processed and input to the data files during the referenced period. Information such as date of entry, cloud cover, and image quality is given for each scene. The microfilm roll and frame on which the scene may be found is also indicated.
Survey of existing performance requirements in codes and standards for light-frame construction
G. E. Sherwood
1980-01-01
Present building codes and standards are a combination of specifications and performance criteria. Where specifications prevail, the introduction of new materials or methods can be a long, cumbersome process. To facilitate the introduction of new technology, performance requirements are becoming more prevalent. In some areas, there is a lack of information on which to...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Society for Testing and Materials (ASTM) or International Standards Organization (ISO), shall be approved... description of the technology and/or instrumentation that makes the method functional. (2) Information... part 51. Anyone may purchase copies of this standard from the American Society for Testing and...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Society for Testing and Materials (ASTM) or International Standards Organization (ISO), shall be approved... description of the technology and/or instrumentation that makes the method functional. (2) Information... part 51. Anyone may purchase copies of this standard from the American Society for Testing and...
Defense ADP Acquisition Study.
1981-11-30
Logistics ALS - Advanced Logistics System AMP - ADPS Master Plan ANSI - American National Standards Institute APR - Agency Procurement Request ASD(C...Computers IRM - Information Resources Management ISO - International Standards Organization L LCC - Life Cycle Costs LCM - Life Cycle Management LE...management in the process * Lack of a mission orientation * Lack of systems management and life cycle perspectives * Lack of effective leadership
ERIC Educational Resources Information Center
Bentley, Michael L.; Ebert, Edward S., II; Ebert, Christine
2007-01-01
Good teachers know that science is more than just a collection of facts in a textbook and that teaching science goes beyond the mere transmission of information. Actively engaging students in the learning process is critical to building their knowledge base, assessing progress, and meeting science standards. This book shows teachers how to…
ERIC Educational Resources Information Center
Abdel-Messih, Ibrahim Adib; El-Setouhy, Maged; Crouch, Michael M.; Earhart, Kenneth C.
2008-01-01
Research is conducted in a variety of cultural settings. Ethical standards developed in Europe and the Americas are increasingly applied in these settings, many of which are culturally different from the countries in which these standards originated. To overcome these cultural differences, investigators may be tempted to deviate from ethical…
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCord, Jason
WLS gathers all known relevant contextual data along with standard event log information, processes it into an easily consumable format for analysis by 3rd party tools, and forwards the logs to any compatible log server.
Pathak, Jyotishman; Bailey, Kent R; Beebe, Calvin E; Bethard, Steven; Carrell, David S; Chen, Pei J; Dligach, Dmitriy; Endle, Cory M; Hart, Lacey A; Haug, Peter J; Huff, Stanley M; Kaggal, Vinod C; Li, Dingcheng; Liu, Hongfang; Marchant, Kyle; Masanz, James; Miller, Timothy; Oniki, Thomas A; Palmer, Martha; Peterson, Kevin J; Rea, Susan; Savova, Guergana K; Stancl, Craig R; Sohn, Sunghwan; Solbrig, Harold R; Suesse, Dale B; Tao, Cui; Taylor, David P; Westberg, Les; Wu, Stephen; Zhuo, Ning; Chute, Christopher G
2013-01-01
Research objective: To develop scalable informatics infrastructure for normalization of both structured and unstructured electronic health record (EHR) data into a unified, concept-based model for high-throughput phenotype extraction. Materials and methods: Software tools and applications were developed to extract information from EHRs. Representative and convenience samples of both structured and unstructured data from two EHR systems—Mayo Clinic and Intermountain Healthcare—were used for development and validation. Extracted information was standardized and normalized to meaningful use (MU) conformant terminology and value set standards using Clinical Element Models (CEMs). These resources were used to demonstrate semi-automatic execution of MU clinical-quality measures modeled using the Quality Data Model (QDM) and an open-source rules engine. Results: Using CEMs and open-source natural language processing and terminology services engines—namely, Apache clinical Text Analysis and Knowledge Extraction System (cTAKES) and Common Terminology Services (CTS2)—we developed a data-normalization platform that ensures data security, end-to-end connectivity, and reliable data flow within and across institutions. We demonstrated the applicability of this platform by executing a QDM-based MU quality measure that determines the percentage of patients between 18 and 75 years with diabetes whose most recent low-density lipoprotein cholesterol test result during the measurement year was <100 mg/dL on a randomly selected cohort of 273 Mayo Clinic patients. The platform identified 21 and 18 patients for the denominator and numerator of the quality measure, respectively. Validation results indicate that all identified patients meet the QDM-based criteria. Conclusions: End-to-end automated systems for extracting clinical information from diverse EHR systems require extensive use of standardized vocabularies and terminologies, as well as robust information models for storing, discovering, and processing that information. This study demonstrates the application of modular and open-source resources for enabling secondary use of EHR data through normalization into standards-based, comparable, and consistent format for high-throughput phenotyping to identify patient cohorts. PMID:24190931
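The measure logic described in this abstract can be made concrete with a small sketch. The Python below is a minimal, hypothetical stand-in for the QDM-based computation (denominator: patients aged 18-75 with diabetes; numerator: most recent LDL in the measurement year < 100 mg/dL); the record structure is invented, and a real platform would operate over normalized CEM instances rather than plain objects.

```python
# Toy version of the quality-measure logic described above.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Patient:
    age: int
    has_diabetes: bool
    ldl_results: list = field(default_factory=list)  # (date, mg/dL) pairs

def in_denominator(p: Patient) -> bool:
    return 18 <= p.age <= 75 and p.has_diabetes

def in_numerator(p: Patient) -> bool:
    if not in_denominator(p) or not p.ldl_results:
        return False
    most_recent = max(p.ldl_results, key=lambda r: r[0])
    return most_recent[1] < 100

cohort = [
    Patient(60, True, [(date(2013, 3, 1), 130), (date(2013, 9, 1), 95)]),
    Patient(45, True, [(date(2013, 6, 1), 120)]),
    Patient(30, False, [(date(2013, 6, 1), 90)]),
]
den = [p for p in cohort if in_denominator(p)]
num = [p for p in den if in_numerator(p)]
print(len(den), len(num))  # -> 2 1
```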
A process for developing standards to promote quality in general practice.
Khoury, Julie; Krejany, Catherine J; Versteeg, Roald W; Lodewyckx, Michaela A; Pike, Simone R; Civil, Michael S; Jiwa, Moyez
2018-06-02
Since 1991, the Royal Australian College of General Practitioners' (RACGP) Standards for General Practices (the Standards) have provided a framework for quality care, risk management and best practice in the operation of Australian general practices. The Standards are also linked to incentives for general practice remuneration. These Standards were revised in 2017. The objective of this study is to describe the process undertaken to develop the fifth edition Standards published in 2017 to inform future standards development both nationally and internationally. A modified Delphi process was deployed to develop the fifth edition Standards. Development was directed by the RACGP and led by an expert panel of GPs and representatives of stakeholder groups who were assisted and facilitated by a team from RACGP. Each draft was released for stakeholder feedback and tested twice before the final version was submitted for approval by the RACGP board. Four rounds of consultation and two rounds of piloting were carried out over 32 months. The Standards were redrafted after each round. One hundred and fifty-two individuals and 225 stakeholder groups participated in the development of the Standards. Twenty-three new indicators were recommended and grouped into three sections in a new modular structure that was different from the previous edition. The Standards represent the consensus view of national stakeholders on the indicators of quality and safety in Australian general practice and primary care.
Sulfur Dioxide (SO2) Primary NAAQS Review: Integrated Review Plan - Advisory with CASAC
The SO2 Integrated Review Plan is the first document generated as part of the National Ambient Air Quality Standards (NAAQS) review process. The Plan presents background information, the schedule for the review, the process to be used in conducting the review, and the key policy-...
Personalized Learning Path Based on Metadata Standards
ERIC Educational Resources Information Center
Colace, Francesco; De Santo, Massimo; Vento, Mi
2005-01-01
Thanks to the technological improvements of recent years, distance education represents a real alternative or support to the traditional formative processes. The Internet allows the design of contents, which are able to raise the quality of the traditional formative process. However, the amount of information students can obtain from the Internet…
Proceduralism and Bureaucracy: Due Process in the School Setting
ERIC Educational Resources Information Center
Kirp, David L.
1976-01-01
The likely consequences of applying traditional due process standards, expecially formal adversary hearings, to the public school are examined. The ruling in Goss v. Lopez suggests that fair treatment can still be expected if the hearings are treated as opportunities for candid and informal exchange rather than prepunishment ceremonies. (LBH)
Process monitoring in modern safeguards applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ehinger, M.H.
1989-11-01
From the safeguards standpoint, regulatory requirements are finally moving into the modern world of communication and information processing. Gone are the days when the accountant with the green eye shade and arm bands made judgments on the material balance a month after the balance was closed. The most recent Nuclear Regulatory Commission (NRC) regulations and U.S. Department of Energy (DOE) orders have very strict standards for timeliness and sensitivity to loss or removal of material. The latest regulations recognize that plant operators have a lot of information on and control over the location and movement of material within their facilities. This information goes beyond that traditionally reported under accountability requirements. These new regulations allow facility operators to take credit for many of the more informal process controls.
Antoniou, A; Marmai, K; Qasem, F; Cherry, R; Jones, P M; Singh, S
2018-05-01
Informed consent is required before placing an epidural. At our hospital, teaching of residents about this is done informally at the bedside. This study aimed to assess the ability of anesthesia residents to acquire and retain knowledge required when seeking informed consent for epidural labor analgesia. It assessed how well this knowledge was translated to clinical ability, by assessing the verbal consent process during an interaction with a standardized patient. Twenty anesthesia residents were randomized to a 'didactic group' or a 'simulation group'. Each resident was presented with a written scenario and asked to document the informed consent process, as they normally would do (pre-test). The didactic group then had a presentation about informed consent, while the simulation group members interviewed a simulated patient, the scenarios focusing on different aspects of consent. All residents then read a scenario and documented their informed consent process (post-test). Six weeks later all residents interviewed a standardized patient in labor and documented the consent from this interaction (six-week test). There was no significant difference in the baseline performance of the two groups. Both groups showed significant improvement in their written consent documentation at the immediate time point, the improvement in the didactic group being greater. The didactic group performed better at both the immediate time point and the six-week time point. In this small study, a didactic teaching method proved better than simulation-based teaching in helping residents to gain knowledge needed to obtain informed consent for epidural labor analgesia. Copyright © 2017 Elsevier Ltd. All rights reserved.
Extracting and standardizing medication information in clinical text - the MedEx-UIMA system.
Jiang, Min; Wu, Yonghui; Shah, Anushi; Priyanka, Priyanka; Denny, Joshua C; Xu, Hua
2014-01-01
Extraction of medication information embedded in clinical text is important for research using electronic health records (EHRs). However, most current medication information extraction systems identify drug and signature entities without mapping them to a standard representation. In this study, we introduced the open source Java implementation of MedEx, an existing high-performance medication information extraction system, based on the Unstructured Information Management Architecture (UIMA) framework. In addition, we developed new encoding modules in the MedEx-UIMA system, which mapped an extracted drug name/dose/form to both generalized and specific RxNorm concepts and translated drug frequency information to an ISO standard. We processed 826 documents with both systems and verified that MedEx-UIMA and MedEx (the Python version) performed similarly by comparing the two sets of results. Using two manually annotated test sets that contained 300 drug entries from medication lists and 300 drug entries from narrative reports, the MedEx-UIMA system achieved F-measures of 98.5% and 97.5% respectively for encoding drug names to corresponding RxNorm generic drug ingredients, and F-measures of 85.4% and 88.1% respectively for mapping drug names/dose/form to the most specific RxNorm concepts. It also achieved an F-measure of 90.4% for normalizing frequency information to the ISO standard. The open source MedEx-UIMA system is freely available online at http://code.google.com/p/medex-uima/.
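To illustrate the frequency-normalization step the abstract mentions, here is a toy Python sketch; the mapping of textual frequencies to ISO 8601 duration strings is an assumption for illustration and does not reproduce MedEx-UIMA's actual encoding rules.

```python
# Hypothetical mapping of free-text drug frequencies to ISO 8601 durations.
from typing import Optional

FREQUENCY_TO_ISO = {
    "qd": "P1D",        # once daily        -> every 1 day
    "b.i.d.": "PT12H",  # twice daily       -> every 12 hours
    "t.i.d.": "PT8H",   # three times daily -> every 8 hours
    "q6h": "PT6H",      # every 6 hours
}

def normalize_frequency(text: str) -> Optional[str]:
    """Return an ISO 8601 duration for a known frequency expression."""
    return FREQUENCY_TO_ISO.get(text.strip().lower())

print(normalize_frequency("B.I.D."))  # -> PT12H
```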
The Future of Geospatial Standards
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Simonis, I.
2016-12-01
The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium including standards and testbeds, where we can extract a trend for the future of geospatial standards. We see a number of key elements in focus, but simultaneously a broadening of standards to address particular communities' needs.
Measuring health care process quality with software quality measures.
Yildiz, Ozkan; Demirörs, Onur
2012-01-01
Existing quality models focus on specific diseases, clinics, or clinical areas. Although they contain structure, process, or output measures, no model measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model was developed from software quality measures. We adopted the ISO/IEC 9126 software quality standard for health care processes, and JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements were added to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After application, it was concluded that the model identifies weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of other organizations.
Promotion of cooperation in evolutionary game dynamics with local information.
Liu, Xuesong; Pan, Qiuhui; He, Mingfeng
2018-01-21
In this paper, we propose a strategy-updating rule driven by local information, which is called Local process. Unlike the standard Moran process, the Local process does not require global information about the strategic environment. By analyzing the dynamical behavior of the system, we explore how the local information influences the fixation of cooperation in two-player evolutionary games. Under weak selection, the decreasing local information leads to an increase of the fixation probability when natural selection does not favor cooperation replacing defection. In the limit of sufficiently large selection, the analytical results indicate that the fixation probability increases with the decrease of the local information, irrespective of the evolutionary games. Furthermore, for the dominance of defection games under weak selection and for coexistence games, the decreasing of local information will lead to a speedup of a single cooperator taking over the population. Overall, to some extent, the local information is conducive to promoting the cooperation. Copyright © 2017 Elsevier Ltd. All rights reserved.
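The fixation-probability quantity at the heart of this abstract can be illustrated with a short simulation. The Python sketch below estimates the fixation probability of a single cooperator under a standard frequency-dependent Moran process with fitness 1 - w + w·payoff; the paper's Local process replaces the global information in this update with a local sample, a variant not reproduced here, and the payoff values are example assumptions.

```python
# Hedged sketch: fixation probability of one cooperator in a
# frequency-dependent Moran process (global information).
import random

R, S, T, P = 3.0, 0.0, 5.0, 1.0  # example Prisoner's Dilemma payoffs
N = 20                            # population size
w = 0.1                           # selection intensity (weak selection)

def payoffs(i):
    """Average payoffs to a cooperator and a defector with i cooperators."""
    pc = (R * (i - 1) + S * (N - i)) / (N - 1)
    pd = (T * i + P * (N - i - 1)) / (N - 1)
    return pc, pd

def fixation_probability(trials=5000):
    fixed = 0
    for _ in range(trials):
        i = 1  # start from a single cooperator
        while 0 < i < N:
            pc, pd = payoffs(i)
            fc, fd = 1 - w + w * pc, 1 - w + w * pd  # fitnesses
            # Birth: reproduce proportional to fitness; death: uniform.
            if random.random() < i * fc / (i * fc + (N - i) * fd):
                if random.random() < (N - i) / N:
                    i += 1  # a cooperator replaces a defector
            elif random.random() < i / N:
                i -= 1      # a defector replaces a cooperator
        fixed += (i == N)
    return fixed / trials

print(fixation_probability())  # compare with the neutral baseline 1/N
```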
ITIL{sup ®} and information security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jašek, Roman; Králík, Lukáš; Popelka, Miroslav
2015-03-10
This paper discusses the ITIL framework in the context of information security management. It is a summary study: the first part focuses on security objectives in connection with the ITIL framework, in particular the ITIL Information Security Management (ISM) process, its principles, and its system of management. The conclusion addresses the link between security-related standards and the ITIL framework.
PepsNMR for 1H NMR metabolomic data pre-processing.
Martin, Manon; Legat, Benoît; Leenders, Justine; Vanwinsberghe, Julien; Rousseau, Réjane; Boulanger, Bruno; Eilers, Paul H C; De Tullio, Pascal; Govaerts, Bernadette
2018-08-17
In the analysis of biological samples, control over experimental design and data acquisition procedures alone cannot ensure well-conditioned 1H NMR spectra with maximal information recovery for data analysis. A third major element affects the accuracy and robustness of results: the data pre-processing/pre-treatment, to which not enough attention is usually devoted, in particular in metabolomic studies. The usual approach is to use proprietary software provided by the analytical instruments' manufacturers to conduct the entire pre-processing strategy. This widespread practice has a number of advantages, such as a user-friendly interface with graphical facilities, but it involves non-negligible drawbacks: a lack of methodological information and automation, a dependency on subjective human choices, only standard processing possibilities, and an absence of objective quality criteria to evaluate pre-processing quality. This paper introduces PepsNMR to meet these needs: an R package dedicated to the whole processing chain prior to multivariate data analysis, including, among other tools, solvent signal suppression, internal calibration, phase, baseline and misalignment corrections, bucketing and normalisation. Methodological aspects are discussed and the package is compared to the gold standard procedure with two metabolomic case studies. The use of PepsNMR on these data shows better information recovery and predictive power based on objective and quantitative quality criteria. Other key assets of the package are workflow processing speed, reproducibility, reporting and flexibility, graphical outputs and documented routines. Copyright © 2018 Elsevier B.V. All rights reserved.
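Since PepsNMR itself is an R package, the Python/numpy sketch below only illustrates, under simplifying assumptions, two steps of the kind of pre-processing chain described (a crude polynomial baseline correction and normalisation to constant total intensity); it is a stand-in, not the package's actual algorithms.

```python
# Simplified stand-ins for two spectral pre-processing steps.
import numpy as np

def correct_baseline(spectrum, degree=3):
    """Subtract a low-order polynomial fit as a crude baseline estimate."""
    x = np.arange(spectrum.size)
    coeffs = np.polyfit(x, spectrum, degree)
    return spectrum - np.polyval(coeffs, x)

def normalise_total_intensity(spectrum):
    """Scale so the summed absolute intensity equals 1."""
    total = np.sum(np.abs(spectrum))
    return spectrum / total if total > 0 else spectrum

rng = np.random.default_rng(0)
raw = np.exp(-0.5 * ((np.arange(1000) - 500) / 10) ** 2)   # one peak
raw += 1e-4 * np.arange(1000) + rng.normal(0, 0.01, 1000)  # drift + noise
processed = normalise_total_intensity(correct_baseline(raw))
```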
Workflow management systems in radiology
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim
1998-07-01
In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely re-design their processes. Although information technology is agreed to potentially contribute to cost reduction and efficiency improvement, the real success factors are the re-definition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. In this paper, we discuss the need for and the benefits of such an approach. The separation of workflow management system and application systems is emphasized, as are the consequences that arise for the architecture of workflow oriented information systems. This includes an appropriate workflow terminology, and the definition of standard interfaces for workflow aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited for a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process- oriented and application independent, appear suitable for use in radiology.
Sonication standard laboratory module
Beugelsdijk, Tony; Hollen, Robert M.; Erkkila, Tracy H.; Bronisz, Lawrence E.; Roybal, Jeffrey E.; Clark, Michael Leon
1999-01-01
A standard laboratory module for automatically producing a solution of contaminants from a soil sample. A sonication tip agitates a solution containing the soil sample in a beaker while a stepper motor rotates the sample. An aspirator tube, connected to a vacuum, draws the upper layer of solution from the beaker through a filter and into another beaker. This beaker can thereafter be removed for analysis of the solution. The standard laboratory module encloses an embedded controller providing process control, status feedback information, and maintenance procedures for the equipment and operations within the standard laboratory module.
Biomedical journal title changes: reasons, trends, and impact.
Afes, V B; Wrynn, P E
1993-01-01
A study was conducted to document the impact of biomedical journal title changes on medical libraries and to increase awareness of the reasons titles are changed. The study consisted of two parts: a survey of academic health sciences libraries in the United States and Canada and an analysis of title changes from two different years. The survey response rate was 83%. The majority of respondents commented on difficulties in identifying and processing title changes, often resulting in the delay or loss of information. The analysis revealed that a third of title changes were not justified by the journal. The study results substantiate the need to standardize title change reporting by publishers. A standard developed by the National Information Standards Organization requires publishers to conform to standardized practices for notification. This standard precisely reflects the concerns reflected in both the survey and the study of title changes, and librarians are urged to ensure that the standard is implemented by publishers. PMID:8428189
NASA Astrophysics Data System (ADS)
Leibovici, D. G.; Pourabdollah, A.; Jackson, M.
2011-12-01
Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that represents a workflow [2]. Managing tools, dealing with qualitative and quantitative metadata measures of the quality associated with a workflow, are, therefore, required for the modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing for example one to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions when looking at workflows, particularly in the context of geoprocesses metadata. We propose to fill this gap (i) at first through the provision of a metadata profile for the quality of processes, and (ii) through providing a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on the visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the outputs of the workflow, once run, is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with the visualization pointing out the need to improve the workflow with better data or better processes on the workflow graph itself. [1] Leibovici, DG, Hobona, G, Stock, K, Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5 [2] Leibovici, DG, Pourabdollah, A (2010a) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France [3] OGC (2011) www.opengeospatial.org [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008 [5] Leibovici, DG, Pourabdollah, A, Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria [6] Pourabdollah, A, Leibovici, DG, Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK
The laterality effect: myth or truth?
Cohen Kadosh, Roi
2008-03-01
Tzelgov and colleagues [Tzelgov, J., Meyer, J., and Henik, A. (1992). Automatic and intentional processing of numerical information. Journal of Experimental Psychology: Learning, Memory and Cognition, 18, 166-179.] proposed the existence of the laterality effect as a post-hoc explanation for their results. According to this effect, numbers are classified automatically as small/large versus a standard point under autonomous processing of numerical information. However, the validity of the laterality effect was never examined directly, or was confounded with the numerical distance effect. In the current study, I controlled the numerical distance effect and observed that the laterality effect does exist and affects the automatic processing of numerical information. The current results suggest that the laterality effect should be taken into account when using paradigms that require automatic numerical processing, such as Stroop-like or priming tasks.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-09
... may be sent to: Chief, Computer Security Division, Information Technology Laboratory, ATTN: Comments... introduces the concept of a virtual contact interface, over which all functionality of the PIV Card is... Laboratory Programs. [FR Doc. 2012-16725 Filed 7-6-12; 8:45 am] BILLING CODE 3510-13-P ...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-05
... Devices and Radiological Health's (CDRH's or the Center's) draft process to clarify and more quickly inform stakeholders when CDRH has changed its expectations relating to, or otherwise has new scientific... scientific information changes CDRH's regulatory thinking, it has been challenging for the Center to...
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
NASA Astrophysics Data System (ADS)
Green, Tim; Faulkner, Andrew; Rosen, Stuart; Macherey, Olivier
2005-07-01
Standard continuous interleaved sampling processing, and a modified processing strategy designed to enhance temporal cues to voice pitch, were compared on tests of intonation perception and vowel perception, both in implant users and in acoustic simulations. In standard processing, 400 Hz low-pass envelopes modulated either pulse trains (implant users) or noise carriers (simulations). In the modified strategy, slow-rate envelope modulations, which convey dynamic spectral variation crucial for speech understanding, were extracted by low-pass filtering (32 Hz). In addition, during voiced speech, higher-rate temporal modulation in each channel was provided by 100% amplitude-modulation by a sawtooth-like waveform whose periodicity followed the fundamental frequency (F0) of the input. Channel levels were determined by the product of the lower- and higher-rate modulation components. Both in acoustic simulations and in implant users, the ability to use intonation information to identify sentences as question or statement was significantly better with modified processing. However, while there was no difference in vowel recognition in the acoustic simulation, implant users performed worse with modified processing both in vowel recognition and in formant frequency discrimination. It appears that, while enhancing pitch perception, modified processing harmed the transmission of spectral information.
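The modified strategy's channel computation, as described above, can be sketched directly. In the hedged Python below, a 32 Hz low-pass envelope is multiplied by a 100%-depth sawtooth-like modulator at F0; the sample rate, filter order, and F0 value are illustrative assumptions rather than the study's exact parameters.

```python
# Sketch of one channel of the modified strategy described above.
import numpy as np
from scipy import signal

fs = 16000                      # sample rate (Hz), assumed
t = np.arange(fs) / fs          # 1 s of time
channel = np.random.randn(fs)   # stand-in for one band-passed channel

# Slow-rate envelope: rectify, then low-pass at 32 Hz.
b, a = signal.butter(4, 32 / (fs / 2))
slow_env = signal.filtfilt(b, a, np.abs(channel))

# Higher-rate modulator: sawtooth at F0 (here a fixed 100 Hz), scaled
# to [0, 1] so the modulation depth is 100%.
f0 = 100.0
saw = (signal.sawtooth(2 * np.pi * f0 * t) + 1) / 2

# Channel level = product of the two modulation components.
modified_level = slow_env * saw
```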
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2013-12-01
Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using Formal Concept Analysis (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
A concept to standardize raw biosignal transmission for brain-computer interfaces.
Breitwieser, Christian; Neuper, Christa; Müller-Putz, Gernot R
2011-01-01
With this concept we introduce TiA, an attempt at a standardized interface for transmitting raw biosignals. TiA is able to deal with multirate and block-oriented data transmission. Data is distinguished by different signal types (e.g., EEG, EOG, NIRS, …), and those signals can be acquired at the same time from different acquisition devices. TiA is built on a client-server model: multiple clients can connect to one server, and information is exchanged via a control connection and a separate data connection. Control commands and meta information are transmitted over the control connection; raw biosignal data is delivered over the data connection in a unidirectional way. For this purpose a standardized handshaking protocol and raw data packet format have been developed. Thus, an abstraction layer between hardware devices and data processing was developed, facilitating standardization.
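The control/data separation that TiA uses can be sketched schematically. In the Python below, every message name and framing detail is hypothetical, chosen only to show the pattern of a bidirectional control connection plus a unidirectional data stream; it is not the actual TiA wire protocol.

```python
# Schematic client for a control/data split like the one described above.
# All commands and replies are hypothetical illustrations.
import socket

def connect(host: str = "localhost", ctrl_port: int = 9000):
    # Control connection: bidirectional, carries commands and meta info.
    ctrl = socket.create_connection((host, ctrl_port))
    ctrl.sendall(b"GET_DATA_PORT\n")        # hypothetical control command
    data_port = int(ctrl.recv(64).strip())  # hypothetical server reply
    # Data connection: the server streams raw biosignal packets one way.
    data = socket.create_connection((host, data_port))
    return ctrl, data
```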
Fornwall, M.; Gisiner, R.; Simmons, S. E.; Moustahfid, Hassan; Canonico, G.; Halpin, P.; Goldstein, P.; Fitch, R.; Weise, M.; Cyr, N.; Palka, D.; Price, J.; Collins, D.
2012-01-01
The US Integrated Ocean Observing System (IOOS) has recently adopted standards for biological core variables in collaboration with the US Geological Survey/Ocean Biogeographic Information System (USGS/OBIS-USA) and other federal and non-federal partners. In this Community White Paper (CWP) we provide a process to bring into IOOS a rich new source of biological observing data, visual line transect surveys, and to establish quality data standards for visual line transect observations, an important source of at-sea bird, turtle and marine mammal observation data. The processes developed through this exercise will be useful for other similar biogeographic observing efforts, such as passive acoustic point and line transect observations, tagged animal data, and mark-recapture (photo-identification) methods. Furthermore, we suggest that the processes developed through this exercise will serve as a catalyst for broadening involvement by the larger marine biological data community within the goals and processes of IOOS.
Topaz, Maxim; Lai, Kenneth; Dowding, Dawn; Lei, Victor J; Zisberg, Anna; Bowles, Kathryn H; Zhou, Li
2016-12-01
Electronic health records are being increasingly used by nurses, with up to 80% of the health data recorded as free text. However, only a few studies have developed nursing-relevant tools that help busy clinicians to identify information they need at the point of care. This study developed and validated one of the first automated natural language processing applications to extract wound information (wound type, pressure ulcer stage, wound size, anatomic location, and wound treatment) from free text clinical notes. First, two human annotators manually reviewed a purposeful training sample (n=360) and random test sample (n=1100) of clinical notes (including 50% discharge summaries and 50% outpatient notes), identified wound cases, and created a gold standard dataset. We then trained and tested our natural language processing system (known as MTERMS) to process the wound information. Finally, we assessed our automated approach by comparing system-generated findings against the gold standard. We also compared the prevalence of wound cases identified from free-text data with coded diagnoses in the structured data. The testing dataset included 101 notes (9.2%) with wound information. The overall system performance was good (F-measure, a composite measure of system accuracy, of 92.7%), with best results for wound treatment (F-measure=95.7%) and poorest results for wound size (F-measure=81.9%). Only 46.5% of wound notes had a structured code for a wound diagnosis. The natural language processing system achieved good performance on a subset of randomly selected discharge summaries and outpatient notes. In more than half of the wound notes, there were no coded wound diagnoses, which highlights the significance of using natural language processing to enrich clinical decision making. Our future steps will include expansion of the application's information coverage to other relevant wound factors and validation of the model with external data. Copyright © 2016 Elsevier Ltd. All rights reserved.
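The evaluation step described above, comparing system output against a gold standard, reduces to precision, recall, and F-measure. The toy Python sketch below computes these over invented example records; it does not reflect MTERMS itself.

```python
# Precision / recall / F-measure over invented wound-extraction records.
def prf(system: set, gold: set):
    tp = len(system & gold)
    precision = tp / len(system) if system else 0.0
    recall = tp / len(gold) if gold else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f

gold = {("note1", "pressure ulcer", "stage II"),
        ("note2", "surgical wound", None)}
system = {("note1", "pressure ulcer", "stage II"),
          ("note3", "laceration", None)}
print(prf(system, gold))  # -> (0.5, 0.5, 0.5)
```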
[A medical consumable material management information system].
Tang, Guoping; Hu, Liang
2014-05-01
Medical consumable materials are essential supplies for carrying out medical work; they come in a wide range of varieties and are used in large volumes. How to manage them feasibly and efficiently has been a topic of wide concern. This article discusses how to design a medical consumable material management information system with a set of standardized processes that brings together medical supplies administrators, suppliers, and clinical departments. An advanced management mode, enterprise resource planning (ERP), was applied throughout the system design process.
Truck driver informational overload, fiscal year 1992. Final report, 1 July 1991-30 September 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacAdam, C.C.
1992-09-01
The document represents the final project report for a study entitled 'Truck Driver Informational Overload' sponsored by the Motor Vehicle Manufacturers Association through its Motor Truck Research Committee and associated Operations/Performance Panels. As stated in an initial project statement, the objective of the work was to provide guidance for developing methods for measuring driving characteristics during information processing tasks. The contents of the report contain results from two basic project activities: (1) a literature review on multiple-task performance and driver information overload, and (2) a description of driving simulator side-task experiments and a discussion of findings from tests conducted with eight subjects. Two of the key findings from a set of disturbance-input tests conducted with the simulator and the eight test subjects were that: (1) standard deviations of vehicle lateral position and heading (yaw) angle measurements showed the greatest sensitivity to the presence of side-task activities during basic information processing tasks, and (2) corresponding standard deviations of driver steering activity, vehicle yaw rate, and lateral acceleration measurements were seen to be largely insensitive indicators of side-task activity.
Earth Resources Technology Satellite: US standard catalog No. U-12
NASA Technical Reports Server (NTRS)
1973-01-01
To provide dissemination of information regarding the availability of Earth Resources Technology Satellite (ERTS) imagery, a U.S. Standard Catalog is published on a monthly schedule. The catalogs identify imagery which has been processed and input to the data files during the preceding month. The U.S. Standard Catalog includes imagery covering the Continental United States, Alaska, and Hawaii. As a supplement to these catalogs, an inventory of ERTS imagery on 16 millimeter microfilm is available. The catalogs consist of four parts: (1) annotated maps which graphically depict the geographic areas covered by the imagery listed in the current catalog, (2) a computer-generated listing organized by observation identification number (ID) with pertinent information on each image, (3) a computer listing of observations organized by longitude and latitude, and (4) observations which have had changes made in their catalog information since the original entry in the data base.
PACS-Based Computer-Aided Detection and Diagnosis
NASA Astrophysics Data System (ADS)
Huang, H. K. (Bernie); Liu, Brent J.; Le, Anh HongTu; Documet, Jorge
The ultimate goal of Picture Archiving and Communication System (PACS)-based Computer-Aided Detection and Diagnosis (CAD) is to integrate CAD results into daily clinical practice so that it becomes a second reader to aid the radiologist's diagnosis. Integration of CAD and Hospital Information System (HIS), Radiology Information System (RIS) or PACS requires certain basic ingredients from Health Level 7 (HL7) standard for textual data, Digital Imaging and Communications in Medicine (DICOM) standard for images, and Integrating the Healthcare Enterprise (IHE) workflow profiles in order to comply with the Health Insurance Portability and Accountability Act (HIPAA) requirements to be a healthcare information system. Among the DICOM standards and IHE workflow profiles, DICOM Structured Reporting (DICOM-SR); and IHE Key Image Note (KIN), Simple Image and Numeric Report (SINR) and Post-processing Work Flow (PWF) are utilized in CAD-HIS/RIS/PACS integration. These topics with examples are presented in this chapter.
Software Design Document MCC CSCI (1). Volume 1 Sections 1.0-2.18
1991-06-01
[Data-dictionary fragment: AssociationUserProtocol, defined in /simnet/common/include/protocol/p_assoc.h; Primitive, a standard C long type.] 2.2.1.4.2 ProcessMessage: ProcessMessage processes a message from another process; its type describes the message as either one-way, synchronous, or ... Macintosh Consoles. This is sometimes necessary due to normal clock skew, so that operations among the MCC components will remain synchronized. This
ERIC Educational Resources Information Center
Australian Government Department of Education and Training, 2016
2016-01-01
The Government welcomes the Higher Education Standards Panel's report on how the admissions policies and processes of higher education providers can be made clearer, easier to access and more useful, to inform the choices and decisions of prospective students and their families. The Panel's recommendations provide a comprehensive and well thought…
Haroon, Shamil; Wooldridge, Darren; Hoogewerf, Jan; Nirantharakumar, Krishnarajah; Williams, John; Martino, Lina; Bhala, Neeraj
2018-06-07
Alcohol misuse is an important cause of premature disability and death. While it is recommended that clinicians ask patients about alcohol use and provide brief interventions and specialist referral, this is poorly implemented in routine practice. We undertook a national consultation to ascertain the appropriateness of proposed standards for recording information about alcohol use in electronic health records (EHRs) in the UK and to identify potential barriers and facilitators to their implementation in practice. A wide range of stakeholders in the UK were consulted about the appropriateness of proposed information standards for recording alcohol use in EHRs via a multi-disciplinary stakeholder workshop and online survey. Responses to the survey were thematically analysed using the Consolidated Framework for Implementation Research. Thirty-one stakeholders participated in the workshop and 100 in the online survey. This included patients and carers, healthcare professionals, researchers, public health specialists, informaticians, and clinical information system suppliers. There was broad consensus that the Alcohol Use Disorders Identification Test (AUDIT) and AUDIT-Consumption (AUDIT-C) questionnaires were appropriate standards for recording alcohol use in EHRs but that the standards should also address interventions for alcohol misuse. Stakeholders reported a number of factors that might influence implementation of the standards, including having clear care pathways and an implementation guide, sharing information about alcohol use between health service providers, adequately resourcing the implementation process, integrating alcohol screening with existing clinical pathways, having good clinical information systems and IT infrastructure, providing financial incentives, having sufficient training for healthcare workers, and clinical leadership and engagement. Implementation of the standards would need to ensure patients are not stigmatised and that patient confidentiality is robustly maintained. A wide range of stakeholders agreed that use of AUDIT-C and AUDIT are appropriate standards for recording alcohol use in EHRs in addition to recording interventions for alcohol misuse. The findings of this consultation will be used to develop an appropriate information model and implementation guide. Further research is needed to pilot the standards in primary and secondary care.
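For illustration, a minimal sketch of AUDIT-C scoring as it might be captured in an EHR; each item is scored 0-4, and the positive-screen thresholds shown are commonly cited assumptions, not part of the consultation:

```python
def audit_c_score(q1: int, q2: int, q3: int) -> int:
    """Sum the three AUDIT-C items, each scored 0-4."""
    for q in (q1, q2, q3):
        if not 0 <= q <= 4:
            raise ValueError("each AUDIT-C item is scored 0-4")
    return q1 + q2 + q3

def is_positive_screen(score: int, sex: str) -> bool:
    # Commonly cited thresholds (an assumption; local guidance may differ):
    # >= 4 for men, >= 3 for women.
    return score >= (4 if sex == "male" else 3)

print(is_positive_screen(audit_c_score(2, 1, 1), "female"))  # True
```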
Goss, Foster R.; Plasek, Joseph M.; Lau, Jason J.; Seger, Diane L.; Chang, Frank Y.; Zhou, Li
2014-01-01
Emergency department (ED) visits due to allergic reactions are common. Allergy information is often recorded in free-text provider notes; however, this domain has not yet been widely studied by the natural language processing (NLP) community. We developed an allergy module built on the MTERMS NLP system to identify and encode food, drug, and environmental allergies and allergic reactions. The module included updates to our lexicon using standard terminologies, and novel disambiguation algorithms. We developed an annotation schema and annotated 400 ED notes that served as a gold standard for comparison to MTERMS output. MTERMS achieved an F-measure of 87.6% for the detection of allergen names and no known allergies, 90% for identifying true reactions in each allergy statement where true allergens were also identified, and 69% for linking reactions to their allergen. These preliminary results demonstrate the feasibility of using NLP to extract and encode allergy information from clinical notes. PMID:25954363
Medina-Aunon, J. Alberto; Martínez-Bartolomé, Salvador; López-García, Miguel A.; Salazar, Emilio; Navajas, Rosana; Jones, Andrew R.; Paradela, Alberto; Albar, Juan P.
2011-01-01
The development of the HUPO-PSI's (Proteomics Standards Initiative) standard data formats and MIAPE (Minimum Information About a Proteomics Experiment) guidelines should improve proteomics data sharing within the scientific community. Proteomics journals have encouraged the use of these standards and guidelines to improve the quality of experimental reporting and ease the evaluation and publication of manuscripts. However, there is an evident lack of bioinformatics tools specifically designed to create and edit standard file formats and reports, or embed them within proteomics workflows. In this article, we describe a new web-based software suite (The ProteoRed MIAPE web toolkit) that performs several complementary roles related to proteomic data standards. First, it can verify that the reports fulfill the minimum information requirements of the corresponding MIAPE modules, highlighting inconsistencies or missing information. Second, the toolkit can convert several XML-based data standards directly into human readable MIAPE reports stored within the ProteoRed MIAPE repository. Finally, it can also perform the reverse operation, allowing users to export from MIAPE reports into XML files for computational processing, data sharing, or public database submission. The toolkit is thus the first application capable of automatically linking the PSI's MIAPE modules with the corresponding XML data exchange standards, enabling bidirectional conversions. This toolkit is freely available at http://www.proteored.org/MIAPE/. PMID:21983993
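A minimal sketch of the kind of minimum-information check such a toolkit performs, verifying that an XML report contains required elements; the element names here are hypothetical, not the actual MIAPE schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical required elements; the real MIAPE modules define their own fields.
REQUIRED = ["instrument", "sample", "protocol"]

def check_minimum_information(xml_path: str) -> list:
    """Return the names of required elements missing from the report."""
    root = ET.parse(xml_path).getroot()
    return [name for name in REQUIRED if root.find(f".//{name}") is None]

missing = check_minimum_information("report.xml")  # hypothetical file
if missing:
    print("Report is missing:", ", ".join(missing))
```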
Managing Interoperability for GEOSS - A Report from the SIF
NASA Astrophysics Data System (ADS)
Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.
2009-04-01
The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of GEOSS.
Using normalization 3D model for automatic clinical brain quantitative analysis and evaluation
NASA Astrophysics Data System (ADS)
Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping
2003-05-01
Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain, and has been broadly used in diagnosing brain disorders by clinically quantitative analysis for many years. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the correspondent functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Therefore, standardizing the analysis procedure is fundamental and important in improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D-brain model to achieve precise quantitative analysis. In the normalization process, the mutual information registration technique was applied for realigning functional medical images to standard structural medical images. Then, the standard 3D-brain model that shows well-defined brain regions was used, replacing the manual ROIs in the objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in practical clinical evaluation. The results show that the quantitative analysis outcomes obtained from this automated method are in agreement with the clinical diagnosis evaluation score, with less than 3% error on average. To sum up, the method obtains precise VOI information automatically from the well-defined standard 3-D brain model, sparing the traditional procedure of manually drawing ROIs slice by slice on structural medical images. That is, the method not only provides precise analysis results, but also improves the processing rate for large volumes of medical images in clinical use.
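The registration step above maximizes mutual information between the functional and structural images; a minimal sketch of the mutual information metric computed from a joint intensity histogram (binning and inputs are illustrative assumptions):

```python
import numpy as np

def mutual_information(img_a: np.ndarray, img_b: np.ndarray, bins: int = 32) -> float:
    """Mutual information of two aligned images from their joint histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)  # marginal of image A
    py = pxy.sum(axis=0, keepdims=True)  # marginal of image B
    nz = pxy > 0                         # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

A registration routine would then search over spatial transforms for the one that maximizes this value.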
NASA Astrophysics Data System (ADS)
Holland, M.; Hoggarth, A.; Nicholson, J.
2016-04-01
The quantity of information generated by survey sensors for ocean and coastal zone mapping has reached the “Big Data” age. This is driven by the number of survey sensors available to conduct a survey, high data resolution, commercial availability, and the increased use of autonomous platforms. The number of users of sophisticated survey information is also growing with the increase in data volume. This is leading to greater demand for, and broader use of, the processed results in applications including marine archeology, disaster response, and many others. Data processing and exchange techniques are evolving to ensure that this increased accuracy in acquired data meets user demand and leads to an improved understanding of the ocean environment. This includes the use of automated processing, models that maintain the best possible representation of varying resolution data to reduce duplication, as well as data plug-ins and interoperability standards. Through the adoption of interoperable standards, data can be exchanged between stakeholders and used many times in any GIS to support an even wider range of activities. The growing importance of Marine Spatial Data Infrastructure (MSDI) is also contributing to increased access to marine information to support sustainable use of ocean and coastal environments. This paper offers an industry perspective on trends in hydrographic surveying and processing, and the increased use of marine spatial data.
40 CFR 63.11565 - What general provisions sections apply to this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Area Sources: Asphalt Processing and Asphalt Roofing Manufacturing Other Requirements and Information § 63.11565 What general...
40 CFR 63.11565 - What general provisions sections apply to this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Area Sources: Asphalt Processing and Asphalt Roofing Manufacturing Other Requirements and Information § 63.11565 What general...
40 CFR 63.11565 - What general provisions sections apply to this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Area Sources: Asphalt Processing and Asphalt Roofing Manufacturing Other Requirements and Information § 63.11565 What general...
40 CFR 63.11565 - What general provisions sections apply to this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Area Sources: Asphalt Processing and Asphalt Roofing Manufacturing Other Requirements and Information § 63.11565 What general...
40 CFR 63.11565 - What general provisions sections apply to this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Area Sources: Asphalt Processing and Asphalt Roofing Manufacturing Other Requirements and Information § 63.11565 What general...
76 FR 33400 - Petition for Exemption; Summary of Petition Received
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-08
...., Monday through Friday, except Federal holidays. For more information on the rulemaking process, see the...: Jan Thor, (425-227-2127), Standardization Branch, ANM-113, Federal Aviation Administration, 1601 Lind...
Concepts for a global resources information system
NASA Technical Reports Server (NTRS)
Billingsley, F. C.; Urena, J. L.
1984-01-01
The objective of the Global Resources Information System (GRIS) is to establish an effective and efficient information management system to meet the data access requirements of NASA and NASA-related scientists conducting large-scale, multi-disciplinary, multi-mission scientific investigations. Using standard interfaces and operating guidelines, diverse data systems can be integrated to provide the capabilities to access and process multiple geographically dispersed data sets and to develop the necessary procedures and algorithms to derive global resource information.
2017-04-05
[Table of contents fragment; page numbers omitted] Information Technology at Nationwide; Abstract; 1 Business Imperatives; 1.1 Deliver the Right Work; 1.2 Deliver the Right Way; 1.3 Deliver with an Engaged Workforce; 2 Challenges and Opportunities; 2.1 Responding to Demand; 2.2 Standards and Capabilities; 2.3 Information Technology ... release and unlimited distribution. Information Technology at Nationwide: Nationwide Information Technology (IT) is comprised of seven offices
A method for tailoring the information content of a software process model
NASA Technical Reports Server (NTRS)
Perkins, Sharon; Arend, Mark B.
1990-01-01
The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products, to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.
Pathfinder. Volume 8, Number 6, November/December 2010
2010-12-01
transferring information between multiple systems. Nevertheless, without an end-to-end TCPED process and the associated standards, policies and equipment in...products with partners whose information technology systems vary and are not compatible with those of the NSG, NGA and the U.S. Department of...Pacific. ARF DReaMS is based on Web service technology, where traditional maps, data and any relevant geospatial information are made available
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
... standards, eligibility determination process, and applicant information verification process. a. Definitions... section 4(d) of the Indian Self-Determination and Education Assistance Act (ISDEAA) (Pub. L. 93-638, 88... attestation may be made by the applicant (self-attestation), an application filer, or in cases in which an...
Striking a Balance: Students' Tendencies to Oversimplify or Overcomplicate in Mathematical Modeling
ERIC Educational Resources Information Center
Gould, Heather; Wasserman, Nicholas H.
2014-01-01
With the adoption of the "Common Core State Standards for Mathematics" (CCSSM), the process of mathematical modeling has been given increased attention in mathematics education. This article reports on a study intended to inform the implementation of modeling in classroom contexts by examining students' interactions with the process of…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-13
... Inspection, Certification, Standards, and Audit Services for Fresh Fruits, Vegetables, and Other Products--7... Certification of Processed Fruits and Vegetables and Related Products-- 7 CFR part 52. This notice also combines... Regulations Governing Inspections and Certification of Processed Fruits and Vegetables and Related Products--7...
Do Chinese Readers Follow the National Standard Rules for Word Segmentation during Reading?
Liu, Ping-Ping; Li, Wei-Jun; Lin, Nan; Li, Xing-Shan
2013-01-01
We conducted a preliminary study to examine whether Chinese readers' spontaneous word segmentation processing is consistent with the national standard rules of word segmentation based on the Contemporary Chinese language word segmentation specification for information processing (CCLWSSIP). Participants were asked to segment Chinese sentences into individual words according to their prior knowledge of words. The results showed that Chinese readers did not follow the segmentation rules of the CCLWSSIP, and their word segmentation processing was influenced by the syntactic categories of consecutive words. In many cases, the participants did not consider the auxiliary words, adverbs, adjectives, nouns, verbs, numerals and quantifiers as single word units. Generally, Chinese readers tended to combine function words with content words to form single word units, indicating they were inclined to chunk single words into large information units during word segmentation. Additionally, the “overextension of monosyllable words” hypothesis was tested and may need to be corrected to some degree, implying that word length has an implicit influence on Chinese readers' segmentation processing. Implications of these results for models of word recognition and eye movement control are discussed. PMID:23408981
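For context, a minimal sketch of forward maximum matching, a classic dictionary-based segmentation heuristic; the tiny lexicon is hypothetical and this is not the CCLWSSIP procedure:

```python
# Forward maximum matching: greedily take the longest dictionary word
# starting at each position, falling back to a single character.
LEXICON = {"中国", "人民", "中国人", "人", "民"}  # hypothetical tiny lexicon
MAX_LEN = max(len(w) for w in LEXICON)

def forward_max_match(text: str) -> list:
    words, i = [], 0
    while i < len(text):
        for j in range(min(len(text), i + MAX_LEN), i, -1):
            if text[i:j] in LEXICON or j == i + 1:
                words.append(text[i:j])
                i = j
                break
    return words

print(forward_max_match("中国人民"))  # ['中国人', '民'] -- greedy, not always correct
```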
Content standards for medical image metadata
NASA Astrophysics Data System (ADS)
d'Ornellas, Marcos C.; da Rocha, Rafael P.
2003-12-01
Medical images are at the heart of healthcare diagnostic procedures. They have provided not only a noninvasive means to view anatomical cross-sections of internal organs but also a means for physicians to evaluate the patient's diagnosis and monitor the effects of the treatment. For a Medical Center, the emphasis may shift from the generation of images to post-processing and data management, since the medical staff may generate even more processed images and other data from the original image after various analyses and post-processing. A medical image data repository for health care information systems is becoming a critical need. This data repository would contain comprehensive patient records, including information such as clinical data, related diagnostic images, and post-processed images. Due to the large volume and complexity of the data as well as the diversified user access requirements, the implementation of the medical image archive system will be a complex and challenging task. This paper discusses content standards for medical image metadata. In addition, it also focuses on the evaluation of image metadata content and metadata quality management.
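As an illustration of accessing image metadata, a minimal sketch using the pydicom library (the paper does not prescribe a toolkit, and the file path is hypothetical):

```python
# Read a few standard DICOM metadata attributes from an image file.
from pydicom import dcmread

ds = dcmread("image.dcm")  # hypothetical file path

print("Patient ID: ", ds.PatientID)
print("Modality:   ", ds.Modality)
print("Study date: ", ds.StudyDate)
print("Rows x Cols:", ds.Rows, "x", ds.Columns)
```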
Closing the data gap: Creating an open data environment
NASA Astrophysics Data System (ADS)
Hester, J. R.
2014-02-01
Poor data management brought on by increasing volumes of complex data undermines both the integrity of the scientific process and the usefulness of datasets. Researchers should endeavour both to make their data citeable and to cite data whenever possible. The reusability of datasets is improved by community adoption of comprehensive metadata standards and public availability of reversibly reduced data. Where standards are not yet defined, as much information as possible about the experiment and samples should be preserved in datafiles written in a standard format.
Data standardization. The key to effective management
Wagner, C. Russell
1991-01-01
Effective management of the nation's water resources is dependent upon accurate and consistent hydrologic information. Before the emergence of environmental concerns in the 1960's, most hydrologic information was collected by the U.S. Geological Survey and other Federal agencies that used fairly consistent methods and equipment. In the past quarter century, however, increased environmental awareness has resulted in an expansion of hydrologic data collection not only by Federal agencies, but also by state and municipal governments, university investigators, and private consulting firms. The acceptance and use of standard methods of collecting and processing hydrologic data would contribute to cost savings and to greater credibility of flow information vital to responsible assessment and management of the nation's water resources. This paper traces the evolution of the requirements and uses of open-channel flow information in the U.S., and the sequence of efforts to standardize the methods used to obtain this information in the future. The variable nature of naturally flowing rivers results in continually changing hydraulic properties of their channels. Those persons responsible for measurement of water flowing in open channels (streamflow) must use a large amount of judgement in the selection of appropriate equipment and technique to obtain accurate flow information. Standardization of the methods used in the measurement of streamflow is essential to assure consistency of data, but must also allow considerable latitude for individual judgement to meet constantly changing field conditions.
Constellation's Command, Control, Communications and Information (C3I) Architecture
NASA Technical Reports Server (NTRS)
Breidenthal, Julian C.
2007-01-01
Operations concepts are highly effective for: 1) Developing consensus; 2) Discovering stakeholder needs, goals, objectives; 3) Defining behavior of system components (especially emergent behaviors). An interoperability standard can provide an excellent lever to define the capabilities needed for system evolution. Two categories of architectures are needed in a program of this size: 1) Generic - Needed for planning, design and construction standards; 2) Specific - Needed for detailed requirement allocations, interface specs. A wide variety of architectural views are needed to address stakeholder concerns, including: 1) Physical; 2) Information (structure, flow, evolution); 3) Processes (design, manufacturing, operations); 4) Performance; 5) Risk.
40 CFR 63.8696 - What parts of the General Provisions apply to me?
Code of Federal Regulations, 2014 CFR
2014-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Asphalt Processing and Asphalt Roofing Manufacturing Other Requirements and Information § 63.8696 What parts of the General...
40 CFR 63.8696 - What parts of the General Provisions apply to me?
Code of Federal Regulations, 2013 CFR
2013-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Asphalt Processing and Asphalt Roofing Manufacturing Other Requirements and Information § 63.8696 What parts of the General...
40 CFR 63.8696 - What parts of the General Provisions apply to me?
Code of Federal Regulations, 2010 CFR
2010-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Asphalt Processing and Asphalt Roofing Manufacturing Other Requirements and Information § 63.8696 What parts of the General...
40 CFR 63.8696 - What parts of the General Provisions apply to me?
Code of Federal Regulations, 2011 CFR
2011-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Asphalt Processing and Asphalt Roofing Manufacturing Other Requirements and Information § 63.8696 What parts of the General...
40 CFR 63.8696 - What parts of the General Provisions apply to me?
Code of Federal Regulations, 2012 CFR
2012-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Asphalt Processing and Asphalt Roofing Manufacturing Other Requirements and Information § 63.8696 What parts of the General...
Air Quality Implementation Plans
States must develop plans to attain and maintain air quality standards. These plans, known as SIPs, are submitted to EPA for approval. This web site contains information about this process and the current status of the submittals.
Dogac, Asuman; Kabak, Yildiray; Namli, Tuncay; Okcan, Alper
2008-11-01
Integrating healthcare enterprise (IHE) specifies integration profiles describing selected real-world use cases to facilitate the interoperability of healthcare information resources. When realizing a complex real-world scenario, IHE profiles are combined by grouping the related IHE actors. Grouping IHE actors implies that the associated business processes (IHE profiles) that the actors are involved in must be combined, that is, the choreography of the resulting collaborative business process must be determined by deciding on the execution sequence of transactions coming from different profiles. There are many IHE profiles, and each user or vendor may support a different set of IHE profiles that fits its business needs. However, determining the precedence of all the involved transactions manually for each possible combination of the profiles is a very tedious task. In this paper, we describe how to obtain the overall business process automatically when IHE actors are grouped. For this purpose, we represent the IHE profiles through a standard, machine-processable language, namely, the Organization for the Advancement of Structured Information Standards (OASIS) ebusiness eXtensible Markup Language (ebXML) Business Process Specification (ebBP) Language. We define the precedence rules among the transactions of the IHE profiles, again, in a machine-processable way. Then, through a graphical tool, we allow users to select the actors to be grouped and automatically produce the overall business process in a machine-processable format.
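A minimal sketch, under the assumption that the precedence rules form a directed acyclic graph, of deriving an overall execution order from machine-processable precedence rules by topological sorting; the transaction names are hypothetical and this is not the authors' ebBP tooling:

```python
# Model transactions as graph nodes and precedence rules as edges,
# then topologically sort to obtain a valid overall execution order.
from graphlib import TopologicalSorter

# Hypothetical transactions from two grouped IHE profiles;
# each entry maps a transaction to the transactions it must follow.
precedence = {
    "RegisterDocumentSet": {"ProvideDocumentSet"},
    "RetrieveDocument": {"RegisterDocumentSet"},
    "ProvideDocumentSet": set(),
}

order = list(TopologicalSorter(precedence).static_order())
print(order)  # ['ProvideDocumentSet', 'RegisterDocumentSet', 'RetrieveDocument']
```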
Full-wave and half-wave rectification in second-order motion perception
NASA Technical Reports Server (NTRS)
Solomon, J. A.; Sperling, G.
1994-01-01
Microbalanced stimuli are dynamic displays which do not stimulate motion mechanisms that apply standard (Fourier-energy or autocorrelational) motion analysis directly to the visual signal. In order to extract motion information from microbalanced stimuli, Chubb and Sperling [(1988) Journal of the Optical Society of America, 5, 1986-2006] proposed that the human visual system performs a rectifying transformation on the visual signal prior to standard motion analysis. The current research employs two novel types of microbalanced stimuli: half-wave stimuli preserve motion information following half-wave rectification (with a threshold) but lose motion information following full-wave rectification; full-wave stimuli preserve motion information following full-wave rectification but lose motion information following half-wave rectification. Additionally, Fourier stimuli, ordinary square-wave gratings, were used to stimulate standard motion mechanisms. Psychometric functions (direction discrimination vs stimulus contrast) were obtained for each type of stimulus when presented alone, and when masked by each of the other stimuli (presented as moving masks and also as nonmoving, counterphase-flickering masks). RESULTS: given sufficient contrast, all three types of stimulus convey motion. However, only one-third of the population can perceive the motion of the half-wave stimulus. Observers are able to process the motion information contained in the Fourier stimulus slightly more efficiently than the information in the full-wave stimulus but are much less efficient in processing half-wave motion information. Moving masks are more effective than counterphase masks at hampering direction discrimination, indicating that some of the masking effect is interference between motion mechanisms, and some occurs at earlier stages. When either full-wave and Fourier or half-wave and Fourier gratings are presented simultaneously, there is a wide range of relative contrasts within which the motion directions of both gratings are easily determinable. Conversely, when half-wave and full-wave gratings are combined, the direction of only one of these gratings can be determined with high accuracy. CONCLUSIONS: the results indicate that three motion computations are carried out, any two in parallel: one standard ("first order") and two non-Fourier ("second-order") computations that employ full-wave and half-wave rectification.
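For clarity, a minimal sketch of the two rectifying transforms contrasted above, applied to a toy contrast waveform (illustrative only, not the authors' code):

```python
import numpy as np

def full_wave(signal: np.ndarray) -> np.ndarray:
    """Full-wave rectification: absolute value of the signal."""
    return np.abs(signal)

def half_wave(signal: np.ndarray, threshold: float = 0.0) -> np.ndarray:
    """Half-wave rectification with a threshold: keep only excursions above it."""
    return np.maximum(signal - threshold, 0.0)

x = np.sin(np.linspace(0, 4 * np.pi, 256))  # toy contrast waveform
fw, hw = full_wave(x), half_wave(x, threshold=0.2)
```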
NASA Technical Reports Server (NTRS)
Borisenko, V. I.; Chesalin, L. S.
1980-01-01
The algorithm, block diagram, complete text, and instructions are given for the use of a computer program to separate formations whose spectral characteristics are constant on the average. The initial material for operating the computer program presented is video information in a standard color-superposition format.
Implementing the DoD Joint Operation Planning Process for Private Industry Enterprise Security
2011-09-01
Standards Organization's (ISO) ISO 27001 (ISO 27002 defines the controls), and the IT Service Management Forum's Information Technology Infrastructure...27001 certification. [Footnote 24: Alberto Bastos and Rosangela Caubit, ISO 27001 and 27002: Information...] ...includes: 90,000 records lost from Booz Allen Hamilton; 90,000,000... [Footnote 26: ISO/IEC 27002, 19 December...]
Audio-visual presentation of information for informed consent for participation in clinical trials.
Synnot, Anneliese; Ryan, Rebecca; Prictor, Megan; Fetherstonhaugh, Deirdre; Parker, Barbara
2014-05-09
Informed consent is a critical component of clinical research. Different methods of presenting information to potential participants of clinical trials may improve the informed consent process. Audio-visual interventions (presented, for example, on the Internet or on DVD) are one such method. We updated a 2008 review of the effects of these interventions for informed consent for trial participation. To assess the effects of audio-visual information interventions regarding informed consent compared with standard information or placebo audio-visual interventions regarding informed consent for potential clinical trial participants, in terms of their understanding, satisfaction, willingness to participate, and anxiety or other psychological distress. We searched: the Cochrane Central Register of Controlled Trials (CENTRAL), The Cochrane Library, issue 6, 2012; MEDLINE (OvidSP) (1946 to 13 June 2012); EMBASE (OvidSP) (1947 to 12 June 2012); PsycINFO (OvidSP) (1806 to June week 1 2012); CINAHL (EbscoHOST) (1981 to 27 June 2012); Current Contents (OvidSP) (1993 Week 27 to 2012 Week 26); and ERIC (Proquest) (searched 27 June 2012). We also searched reference lists of included studies and relevant review articles, and contacted study authors and experts. There were no language restrictions. We included randomised and quasi-randomised controlled trials comparing audio-visual information alone, or in conjunction with standard forms of information provision (such as written or verbal information), with standard forms of information provision or placebo audio-visual information, in the informed consent process for clinical trials. Trials involved individuals or their guardians asked to consider participating in a real or hypothetical clinical study. (In the earlier version of this review we only included studies evaluating informed consent interventions for real studies). Two authors independently assessed studies for inclusion and extracted data. We synthesised the findings using meta-analysis, where possible, and narrative synthesis of results. We assessed the risk of bias of individual studies and considered the impact of the quality of the overall evidence on the strength of the results. We included 16 studies involving data from 1884 participants. Nine studies included participants considering real clinical trials, and eight included participants considering hypothetical clinical trials, with one including both. All studies were conducted in high-income countries. There is still much uncertainty about the effect of audio-visual informed consent interventions on a range of patient outcomes. However, when considered across comparisons, we found low to very low quality evidence that such interventions may slightly improve knowledge or understanding of the parent trial, but may make little or no difference to the rate of participation or willingness to participate. Audio-visual presentation of informed consent may improve participant satisfaction with the consent information provided. However, its effect on satisfaction with other aspects of the process is not clear. There is insufficient evidence to draw conclusions about anxiety arising from audio-visual informed consent. We found conflicting, very low quality evidence about whether audio-visual interventions took more or less time to administer.
No study measured researcher satisfaction with the informed consent process, nor ease of use. The evidence from real clinical trials was rated as low quality for most outcomes, and for hypothetical studies, very low. We note, however, that this was in large part due to poor study reporting, the hypothetical nature of some studies and low participant numbers, rather than inconsistent results between studies or confirmed poor trial quality. We do not believe that any studies were funded by organisations with a vested interest in the results. The value of audio-visual interventions as a tool for helping to enhance the informed consent process for people considering participating in clinical trials remains largely unclear, although trends are emerging with regard to improvements in knowledge and satisfaction. Many relevant outcomes have not been evaluated in randomised trials. Triallists should continue to explore innovative methods of providing information to potential trial participants during the informed consent process, mindful of the range of outcomes that the intervention should be designed to achieve, and balancing the resource implications of intervention development and delivery against the purported benefits of any intervention. More trials, adhering to CONSORT standards, and conducted in settings and populations underserved in this review, i.e. low- and middle-income countries and people with low literacy, would strengthen the results of this review and broaden its applicability. Assessing process measures, such as time taken to administer the intervention and researcher satisfaction, would inform the implementation of audio-visual consent materials.
Eating out: consumer perceptions of food safety.
Worsfold, Denise
2006-06-01
The purpose of this investigation was to improve the understanding of the public's perception of hygiene standards in eating places and their knowledge of the inspection system. A telephone survey found that despite many claiming experience of food poisoning, and a widely held belief that using eating places may result in illness, people continue to eat out or purchase takeaways regularly. Nearly all respondents claimed that the standard of food hygiene was important to them when deciding where to eat out. Assessments of hygiene standards were mainly based on aesthetics. A minority had concerns/complaints about the hygiene standards of eating places they had used. People do not appear to be well informed about the role of the local authorities in protecting food safety and how the food safety laws are enforced. They believe that they have the right to know the result of a hygiene inspection. Half of them thought that it was difficult to find information on the hygiene standards of eating places. If access to information was easier, some consumers would eat out more often. The public will need to be educated on the inspection and enforcement process if 'scores on doors' is adopted as the main method of raising the confidence of the public in the standards of the food industry.
Extracting and standardizing medication information in clinical text – the MedEx-UIMA system
Jiang, Min; Wu, Yonghui; Shah, Anushi; Priyanka, Priyanka; Denny, Joshua C.; Xu, Hua
2014-01-01
Extraction of medication information embedded in clinical text is important for research using electronic health records (EHRs). However, most current medication information extraction systems identify drug and signature entities without mapping them to a standard representation. In this study, we introduced the open source Java implementation of MedEx, an existing high-performance medication information extraction system, based on the Unstructured Information Management Architecture (UIMA) framework. In addition, we developed new encoding modules in the MedEx-UIMA system, which mapped an extracted drug name/dose/form to both generalized and specific RxNorm concepts and translated drug frequency information to the ISO standard. We processed 826 documents with both systems and verified that MedEx-UIMA and MedEx (the Python version) performed similarly by comparing both results. Using two manually annotated test sets that contained 300 drug entries from medication lists and 300 drug entries from narrative reports, the MedEx-UIMA system achieved F-measures of 98.5% and 97.5% respectively for encoding drug names to corresponding RxNorm generic drug ingredients, and F-measures of 85.4% and 88.1% respectively for mapping drug names/dose/form to the most specific RxNorm concepts. It also achieved an F-measure of 90.4% for normalizing frequency information to the ISO standard. The open source MedEx-UIMA system is freely available online at http://code.google.com/p/medex-uima/. PMID:25954575
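A minimal sketch of frequency normalization in the spirit of the encoding module described above; the mapping table and ISO 8601 interval choices are assumptions, not MedEx-UIMA's actual rules:

```python
from typing import Optional

# Hypothetical mapping table; real systems use richer lexicons and rules.
FREQ_TO_ISO = {
    "qd": "PT24H", "daily": "PT24H", "once daily": "PT24H",
    "bid": "PT12H", "twice daily": "PT12H",
    "tid": "PT8H", "three times daily": "PT8H",
    "qid": "PT6H", "four times daily": "PT6H",
    "q8h": "PT8H", "every 8 hours": "PT8H",
}

def normalize_frequency(text: str) -> Optional[str]:
    """Map a free-text frequency to an ISO 8601 dosing interval, if known."""
    key = text.lower().strip().replace(".", "")
    return FREQ_TO_ISO.get(key)

print(normalize_frequency("B.I.D."))  # PT12H
```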
Tipotsch-Maca, Saskia M; Varsits, Ralph M; Ginzel, Christian; Vecsei-Marlovits, Pia V
2016-01-01
To assess whether a multimedia-assisted preoperative informed consent procedure has an effect on patients' knowledge concerning cataract surgery, satisfaction with the informed consent process, and reduction in anxiety levels. Hietzing Hospital, Vienna, Austria. Prospective randomized controlled clinical trial. Patients participated in an informed consent procedure for age-related cataract surgery that included the standard approach only (reading the information brochure and having a standardized face-to-face discussion) or supplemented with a computer-animated video. The main outcome was information retention assessed by a questionnaire. Further outcome measures used were the State-Trait Anxiety Inventory, the Visual Function-14 score, and an assessment of satisfaction. The study included 123 patients (64 in standard-only group; 59 in computer-animated video group). Both groups scored well on the questionnaire; however, patients who watched the video performed better (82% retention versus 72%) (P = .002). Scores tended to decrease with increasing age (r = -0.25, P = .005); however, this decrease was smaller in the group that watched the video. Both groups had elevated anxiety levels (means in video group: anxiety concerning the current situation [S-anxiety] = 63.8 ± 9.6 [SD], general tendency toward anxiety [T-anxiety] = 65.5 ± 7.9; means in control group: S-anxiety = 61.9 ± 10.3, T-anxiety = 66.2 ± 7.8). A high level of information retention was achieved using an informed consent procedure consisting of an information brochure and a standardized face-to-face discussion. A further increase in information retention was achieved, even with increasing patient age, by adding a multimedia presentation. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Advanced Map For Real-Time Process Control
NASA Astrophysics Data System (ADS)
Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto
1987-10-01
MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of a real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicasting communication voluntarily and periodically in the priority order of data to be exchanged.
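A minimal sketch, as an assumption rather than the paper's design, of periodic multicast publication in priority order of the data to be exchanged:

```python
# Periodically publish process variables to a multicast group,
# highest-priority data first (illustrative; group/port are hypothetical).
import socket
import time

GROUP, PORT = "224.0.0.250", 5007

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)

# Process variables to publish; smaller numbers mean higher priority.
points = [("loop_setpoint", 2), ("alarm_state", 1), ("trend_sample", 3)]

for _ in range(10):  # a few publication cycles for the demo
    for name, priority in sorted(points, key=lambda p: p[1]):
        sock.sendto(f"{priority}:{name}".encode(), (GROUP, PORT))
    time.sleep(0.1)  # publication period
```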
Cutting to the chase: what physician executives need to know about HIPAA.
Fitzmaurice, J M; Rose, J S
2000-01-01
All health care providers, plans, and clearinghouses will be affected by the federally mandated uniform standards for administrative transactions. This article presents distilled core information about the Health Insurance Portability and Accountability Act (HIPAA) legislation--the standards, penalties for violations, and status of final rules. It also raises several key unsolved issues of which clinicians, executives, and health care providers must be aware so they can prepare and plan for the upcoming changes. HIPAA is intended to improve the efficiency and effectiveness of the health care system, as well as to increase the protection and confidentiality of individually identifiable health information. The costs of making the transition to the legislated standards and processes remain a worrisome factor. Although there are two years before these standards must be implemented, and cost and compliance issues resolved, work has already begun in many health institutions to identify and address them.
DICOM static and dynamic representation through unified modeling language
NASA Astrophysics Data System (ADS)
Martinez-Martinez, Alfonso; Jimenez-Alaniz, Juan R.; Gonzalez-Marquez, A.; Chavez-Avelar, N.
2004-04-01
The DICOM standard, as all standards, specifies in a generic way the management of digital medical images and their related information in network and storage media environments. However, understanding the specifications for a particular implementation is not a trivial task. Thus, this work is about understanding and modelling parts of the DICOM standard using object-oriented methodologies, as part of software development processes. This has offered different static and dynamic views, in accordance with the standard specifications, and the resultant models have been represented through the Unified Modelling Language (UML). The modelled parts are related to the network conformance claim: Network Communication Support for Message Exchange, Message Exchange, Information Object Definitions, Service Class Specifications, Data Structures and Encoding, and the Data Dictionary. The resultant models have given a better understanding of the DICOM parts and have opened the possibility of creating a software library to develop DICOM-conformant PACS applications.
Patterns-Based IS Change Management in SMEs
NASA Astrophysics Data System (ADS)
Makna, Janis; Kirikova, Marite
The majority of information systems change management guidelines and standards are either too abstract or too bureaucratic to be easily applicable in small enterprises. This chapter proposes the approach, the method, and the prototype that are designed especially for information systems change management in small and medium enterprises. The approach is based on proven patterns of changes in the set of information systems elements. The set of elements was obtained by theoretical analysis of information systems and business process definitions and enterprise architectures. The patterns were evolved from a number of information systems theories and tested in 48 information systems change management projects. The prototype presents and helps to handle three basic change patterns, which help to anticipate the overall scope of changes related to particular elementary changes in an enterprise information system. The use of the prototype requires just basic knowledge of organizational business processes and information management.
ERIC Educational Resources Information Center
Vick, Matthew
2016-01-01
Science teaching continues to move away from teaching science as merely a body of facts and figures to be memorized to a process of exploring and drawing conclusions. The Next Generation Science Standards (NGSS) emphasize eight science and engineering practices that ask students to apply scientific and engineering reasoning and explanation. This…
The curation of genetic variants: difficulties and possible solutions.
Pandey, Kapil Raj; Maden, Narendra; Poudel, Barsha; Pradhananga, Sailendra; Sharma, Amit Kumar
2012-12-01
The curation of genetic variants from biomedical articles is required for various clinical and research purposes. Nowadays, establishment of variant databases that include overall information about variants is becoming quite popular. These databases have immense utility, serving as a user-friendly information storehouse of variants for information seekers. While manual curation is the gold standard method for curation of variants, it can turn out to be time-consuming on a large scale, thus necessitating the need for automation. Curation of variants described in biomedical literature may not be straightforward, mainly due to various nomenclature and expression issues. Though current trends in paper writing on variants are inclined toward the standard nomenclature such that variants can easily be retrieved, we have a massive store of variants in the literature that are present as non-standard names, and the online search engines that are predominantly used may not be capable of finding them. For effective curation of variants, knowledge about the overall process of curation, the nature and types of difficulties in curation, and ways to tackle the difficulties during the task are crucial. Only by effective curation can variants be correctly interpreted. This paper presents the process and difficulties of curation of genetic variants with possible solutions and suggestions from our work experience in the field, including literature support. The paper also highlights aspects of interpretation of genetic variants and the importance of writing papers on variants following standard and retrievable methods. Copyright © 2012. Published by Elsevier Ltd.
Criteria used by nurses to evaluate practice-related information on the World Wide Web.
Cader, Raffik; Campbell, Steve; Watson, Don
2003-01-01
Existing criteria used to evaluate information on the World Wide Web often are not related to nursing, especially in relation to clinical and evidence-based practice. Published criteria have been found to be oriented toward health consumers, medicine, or general information. In this study, the process by which nurses evaluate practice-related information and the associated evaluative nursing criteria were investigated using a grounded theory approach. In the first stage of this ongoing investigation, semistructured interviews were used to collect data from UK postregistration nursing students. The findings from this initial study provided indications of the process and the criteria for evaluating information on the World Wide Web. Participating students identified intuition as part of the evaluative process. They identified some criteria similar to existing standards but, critically, also additional criteria that are related to nursing practice. Because these new criteria are significant for evaluating nursing information, further refinement of these findings is being undertaken through the next stage of the research program.
Framework for Design of Traceability System on Organic Rice Certification
NASA Astrophysics Data System (ADS)
Purwandoko, P. B.; Seminar, K. B.; Sutrisno; Sugiyanta
2018-05-01
Nowadays, preference for organic products such as organic rice has increased, because people's awareness of consuming healthy and eco-friendly food products has grown. Therefore, it is very important to ensure the organic quality of the product to be produced. Certification is a series of processes conducted to ensure that the quality of products meets all criteria of organic standards. Currently, a traceability information system for organic rice certification is not available; the existing system is still operated manually, causing loss of information during the storage process. This paper aims to develop a traceability framework for the organic rice certification process. First, the main issues discussed are the organic certification process. Second, the unified modeling language (UML) is used to model user requirements in order to develop a traceability system for all actors in the certification process. Furthermore, the information capture model along the certification process is explained in this paper. The model shows the information flow that has to be recorded for each actor. Finally, the challenges in implementing the system are discussed in this paper.
Information processing efficiency in patients with multiple sclerosis.
Archibald, C J; Fisk, J D
2000-10-01
Reduced information processing efficiency, consequent to impaired neural transmission, has been proposed as underlying various cognitive problems in patients with Multiple Sclerosis (MS). This study employed two measures developed from experimental psychology that control for the potential confound of perceptual-motor abnormalities (Salthouse, Babcock, & Shaw, 1991; Sternberg, 1966, 1969) to assess the speed of information processing and working memory capacity in patients with mild to moderate MS. Although patients had significantly more cognitive complaints than neurologically intact matched controls, their performance on standard tests of immediate memory span did not differ from control participants and their word list learning was within normal limits. On the experimental measures, both relapsing-remitting and secondary-progressive patients exhibited significantly slowed information processing speed relative to controls. However, only the secondary-progressive patients had an additional decrement in working memory capacity. Depression, fatigue, or neurologic disability did not account for performance differences on these measures. While speed of information processing may be slowed early in the disease process, deficits in working memory capacity may appear only as there is progression of MS. It is these latter deficits, however, that may underlie the impairment of new learning that patients with MS demonstrate.
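For context, memory-scanning speed in such paradigms is commonly estimated by regressing response time on memory set size, with the slope estimating per-item processing time; a minimal sketch with hypothetical data:

```python
import numpy as np

# Hypothetical data: memory set sizes and mean response times (ms).
set_sizes = np.array([1, 2, 4, 6])
mean_rt = np.array([450, 490, 565, 640])

# Linear fit: RT = intercept + slope * set_size.
slope, intercept = np.polyfit(set_sizes, mean_rt, 1)
print(f"~{slope:.0f} ms per item scanned (intercept {intercept:.0f} ms)")
```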
Dissemination of metabolomics results: role of MetaboLights and COSMOS.
Salek, Reza M; Haug, Kenneth; Steinbeck, Christoph
2013-05-17
With ever-increasing amounts of metabolomics data produced each year, there is an even greater need to disseminate data and knowledge in a standard and reproducible way. To assist with this, a general-purpose, open source metabolomics repository, MetaboLights, was launched in 2012. To promote a community standard, building on earlier efforts that culminated in the Metabolomics Standards Initiative (MSI), the COordination of Standards in MetabOlomicS (COSMOS) initiative was introduced. COSMOS aims to link life science e-infrastructures within the worldwide metabolomics community and to develop and maintain open source exchange formats for raw and processed data, ensuring better flow of metabolomics information.
Earth Resources Technology Satellite: Non-US standard catalog No. N-13
NASA Technical Reports Server (NTRS)
1973-01-01
To provide dissemination of information regarding the availability of Earth Resources Technology Satellite (ERTS) imagery, a Non-U.S. Standard Catalog is published on a monthly schedule. The catalogs identify imagery which has been processed and input to the data files during the preceding month. The Non-U.S. Standard Catalog includes imagery covering all areas except that of the United States, Hawaii, and Alaska. Imagery adjacent to the Continental U.S. and Alaska borders will normally appear in the U.S. Standard Catalog. As a supplement to these catalogs, an inventory of ERTS imagery on 16 millimeter microfilm is available. The catalogs consist of four parts: (1) annotated maps which graphically depict the geographic areas covered by the imagery listed in the current catalog, (2) a computer-generated listing organized by observation identification number (ID) with pertinent information for each image, (3) a computer listing of observations organized by longitude and latitude, and (4) observations which have had changes made in their catalog information since the original entry in the data base.
McDonough, Ian M; Bui, Dung C; Friedman, Michael C; Castel, Alan D
2015-10-01
The perceived value of information can influence one's motivation to successfully remember that information. This study investigated how information value can affect memory search and evaluation processes (i.e., retrieval monitoring). In Experiment 1, participants studied unrelated words associated with low, medium, or high values. Subsequent memory tests required participants to selectively monitor retrieval for different values. False memory effects were smaller when searching memory for high-value than low-value words, suggesting that people more effectively monitored more important information. In Experiment 2, participants studied semantically-related words, and the need for retrieval monitoring was reduced at test by using inclusion instructions (i.e., endorsement of any word related to the studied words) compared with standard instructions. Inclusion instructions led to increases in false recognition for low-value, but not for high-value words, suggesting that under standard-instruction conditions retrieval monitoring was less likely to occur for important information. Experiment 3 showed that words retrieved with lower confidence were associated with more effective retrieval monitoring, suggesting that the quality of the retrieved memory influenced the degree and effectiveness of monitoring processes. Ironically, unless encouraged to do so, people were less likely to carefully monitor important information, even though people want to remember important memories most accurately. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Susanto, Arif; Mulyono, Nur Budi
2018-02-01
The revision of the environmental management system standard to its latest version, ISO 14001:2015, may change the data and information needed for decision making and for achieving objectives across the organization. Information management is the organization's responsibility for ensuring effectiveness and efficiency from the creation, storage, and processing of information through to its distribution, in support of operations and effective decision making in environmental performance management. The objective of this research was to set up an information management program, with technology adopted as its supporting component, at the PTFI Concentrating Division, so that the program aligns with the organization's environmental management objectives under the ISO 14001:2015 environmental management system standard. Materials and methods covered the technical aspects of information management, namely web-based application development using usage-centered design. The results showed that the use of Single Sign On made it easier for users to interact with the environmental management system. The web-based application was developed by creating an entity relationship diagram (ERD) and performing information extraction focused on attributes, keys, and the determination of constraints; the ERD was derived from the relational database schemas of several environmental performance databases in the Concentrating Division, as illustrated in the schema sketch below.
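To make the ERD-to-schema step concrete, here is a minimal Python/SQLite sketch of how attributes, keys, and constraints from an entity relationship diagram translate into relational tables; the table and column names are hypothetical, since the paper does not publish its schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Hypothetical entities: monitoring points and the measurements taken at them.
conn.execute("""
CREATE TABLE monitoring_point (
    point_id   INTEGER PRIMARY KEY,          -- key identified in the ERD
    location   TEXT NOT NULL                 -- attribute
)""")
conn.execute("""
CREATE TABLE measurement (
    measurement_id INTEGER PRIMARY KEY,
    point_id       INTEGER NOT NULL,
    parameter      TEXT NOT NULL,            -- e.g., pH, turbidity (assumed)
    value          REAL NOT NULL,
    CHECK (value >= 0),                      -- constraint carried over from the ERD
    FOREIGN KEY (point_id) REFERENCES monitoring_point(point_id)
)""")

conn.execute("INSERT INTO monitoring_point VALUES (1, 'tailings outlet')")
conn.execute("INSERT INTO measurement VALUES (1, 1, 'pH', 7.4)")
print(conn.execute("SELECT COUNT(*) FROM measurement").fetchone()[0])
```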
SAE J2735 standard : applying the systems engineering process.
DOT National Transportation Integrated Search
1998-11-01
As part of the U.S. Department of Transportation's Intelligent Vehicle Initiative (IVI) program, the Federal Highway Administration investigated the human factors research needs for integrating in-vehicle safety and driver information technologies ...
GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research
NASA Astrophysics Data System (ADS)
Bera, R.; Squividant, H.; Le Henaff, G.; Pichelin, P.; Ruiz, L.; Launay, J.; Vanhouteghem, J.; Aurousseau, P.; Cudennec, C.
2015-05-01
To date, the most common way to deal with geographical information and processing still appears to be consuming local resources, i.e. locally stored data processed on a local desktop or server. The maturity and subsequent growing use of OGC standards to exchange data on the World Wide Web, reinforced in Europe by the INSPIRE Directive, is bound to change the way people (among them research scientists, especially in environmental sciences) use and manage spatial data. A clever use of OGC standards can help scientists better store, share and use data, in particular for modelling. We propose a framework for online processing that makes intensive use of OGC standards. We illustrate it using the Spatial Data Infrastructure (SDI) GéoSAS, the SDI set up for researchers' needs in our department. It is based on the existing open source, modular and interoperable Spatial Data Architecture geOrchestra.
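As an illustration of the "clever use of OGC standards" mentioned above, the following Python sketch builds a standard WFS GetFeature request; the endpoint URL and layer name are placeholders, not GéoSAS's actual services.

```python
import requests

# Placeholder endpoint and layer; any OGC-compliant WFS accepts this request form.
WFS_URL = "https://example.org/geoserver/wfs"   # hypothetical endpoint
params = {
    "service": "WFS",                # OGC-mandated parameters
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "demo:catchments",   # hypothetical layer name
    "maxFeatures": 10,
    "outputFormat": "application/json",
}

response = requests.get(WFS_URL, params=params, timeout=30)
response.raise_for_status()
features = response.json()["features"]
print(f"Fetched {len(features)} features over the standard WFS interface")
```

Because the request is pure OGC, the same client code works against any conformant server, which is what makes data sharing across institutions practical.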
Automated realtime data import for the i2b2 clinical data warehouse: introducing the HL7 ETL cell.
Majeed, Raphael W; Röhrig, Rainer
2012-01-01
Clinical data warehouses are used to consolidate all available clinical data from one or multiple organizations. They represent an important source for clinical research, quality management and controlling. Since its introduction, the data warehouse i2b2 gathered a large user base in the research community. Yet, little work has been done on the process of importing clinical data into data warehouses using existing standards. In this article, we present a novel approach of utilizing the clinical integration server as data source, commonly available in most hospitals. As information is transmitted through the integration server, the standardized HL7 message is immediately parsed and inserted into the data warehouse. Evaluation of import speeds suggest feasibility of the provided solution for real-time processing of HL7 messages. By using the presented approach of standardized data import, i2b2 can be used as a plug and play data warehouse, without the hurdle of customized import for every clinical information system or electronic medical record. The provided solution is available for download at http://sourceforge.net/projects/histream/.
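To illustrate the kind of parsing the HL7 ETL cell performs, here is a minimal sketch of decoding a pipe-delimited HL7 v2 message into observation facts; the sample message and field mapping are simplified assumptions, not the HIStream implementation itself.

```python
# HL7 v2 messages are pipe-delimited segments, separated by carriage returns.
SAMPLE_HL7 = (
    "MSH|^~\\&|LAB|HOSP|I2B2|HOSP|20120101120000||ORU^R01|123|P|2.3\r"
    "PID|1||PAT0001\r"
    "OBX|1|NM|GLU^Glucose||5.4|mmol/L|||||F\r"
)

def parse_observations(message: str):
    """Extract (patient_id, code, value, unit) tuples from OBX segments."""
    patient_id = None
    facts = []
    for segment in filter(None, message.split("\r")):
        fields = segment.split("|")
        if fields[0] == "PID":
            patient_id = fields[3]                 # PID-3: patient identifier
        elif fields[0] == "OBX":
            code = fields[3].split("^")[0]         # OBX-3: observation code
            facts.append((patient_id, code, fields[5], fields[6]))
    return facts

# Each tuple would become one row in the i2b2 observation_fact table.
for fact in parse_observations(SAMPLE_HL7):
    print(fact)
```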
Improving Analysis: Dealing with Information Processing Errors
2006-11-01
obviating this issue, psychological test data provides information that is normed and scored in a common standardized metric (e.g., a z score. A z score is a...to take these into account when interpreting psychological test information. Clinicians are not alone in their relative inability to outperform...(1980); M. Snyder and B. Campbell, "Testing hypotheses about other people: The role of the hypothesis," Personality and Social Psychology Bulletin, No. 6
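The snippet's truncated definition refers to the standard z score; for completeness (a basic statistical fact, not recovered from the report), the formula is

```latex
z = \frac{x - \mu}{\sigma}
```

so a raw score x one standard deviation above the mean maps to z = 1, placing results from different instruments on a common metric.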
Sherman, Kerry A; Shaw, Laura-Kate E; Winch, Caleb J; Harcourt, Diana; Boyages, John; Cameron, Linda D; Brown, Paul; Lam, Thomas; Elder, Elisabeth; French, James; Spillane, Andrew
2016-10-01
Deciding whether or not to have breast reconstruction following breast cancer diagnosis is a complex decision process. This randomized controlled trial assessed the impact of an online decision aid [Breast RECONstruction Decision Aid (BRECONDA)] on breast reconstruction decision-making. Women (n = 222) diagnosed with breast cancer or ductal carcinoma in situ, and eligible for reconstruction following mastectomy, completed an online baseline questionnaire. They were then assigned randomly to receive either standard online information about breast reconstruction (control) or standard information plus access to BRECONDA (intervention). Participants then completed questionnaires at 1 and 6 months after randomization. The primary outcome was participants' decisional conflict 1 month after exposure to the intervention. Secondary outcomes included decisional conflict at 6 months, satisfaction with information at 1 and 6 months, and 6-month decisional regret. Linear mixed-model analyses revealed that 1-month decisional conflict was significantly lower in the intervention group (27.18) compared with the control group (35.5). This difference was also sustained at the 6-month follow-up. Intervention participants reported greater satisfaction with information at 1- and 6-month follow-up, and there was a nonsignificant trend for lower decisional regret in the intervention group at 6-month follow-up. Intervention participants' ratings for BRECONDA demonstrated high user acceptability and overall satisfaction. Women who accessed BRECONDA benefited by experiencing significantly less decisional conflict and being more satisfied with information regarding the reconstruction decisional process than women receiving standard care alone. These findings support the efficacy of BRECONDA in helping women to arrive at their breast reconstruction decision.
Good eggs? Evaluating consent forms for egg donation.
Cattapan, Alana Rose
2016-07-01
Beyond gaps in the provision of information, the informed consent process for egg donation is complicated by conflicts of interest, payment and a lack of longitudinal data about physiological and psychological risks. Recent scholarship has suggested that egg donation programmes could improve the informed consent process by revising consent documents. At a minimum, these documents should include information about eight key criteria: the nature and objectives of treatment; the benefits, risks and inconveniences of egg donation; the privacy of donors and their anonymity (where applicable); disclosure that participation is voluntary (withdrawal); the availability of counselling; financial considerations; the possibility of an unsuccessful cycle and potential uses of the eggs retrieved. This study evaluates the incorporation of these minimum criteria in consent forms for egg donation, obtained through requests to Canadian fertility clinics. Even when clinics were considered to have met criteria simply by mentioning them, among the eight consent forms assessed, none met the minimum standards. Only half of clinics addressed privacy/anonymity concerns, financial issues and the possibility of a future cycle. Improving the quality of consent documentation to meet the minimum standards established by this study may not be an onerous task. For some, this will include re-evaluating how they include one or two elements of disclosure, and for others, this will require a substantial overhaul. Using the criteria provided by this study as the minimum standard for consent could ensure that donors have the basic information they need to make informed decisions.
Concept of Integrated Information Systems of Rail Transport
NASA Astrophysics Data System (ADS)
Siergiejczyk, Mirosław; Gago, Stanisław
This paper presents the need to create integrated information systems for rail transport and their links with other means of public transportation. The IT standards expected to underpin such integrated information systems are discussed. The main tasks of centralized information systems are also presented, together with the concept of their architecture, their business processes and implementation, and the proposed measures to secure data. A method is proposed for implementing a system to inform participants of rail transport under Polish conditions.
Predictors and Effects of Knowledge Management in U.S. Colleges and Schools of Pharmacy
NASA Astrophysics Data System (ADS)
Watcharadamrongkun, Suntaree
Public demands for accountability in higher education have placed increasing pressure on institutions to document their achievement of critical outcomes. These demands also have had wide-reaching implications for the development and enforcement of accreditation standards, including those governing pharmacy education. The knowledge management (KM) framework provides perspective for understanding how organizations evaluate themselves and guidance for how to improve their performance. In this study, we explore knowledge management processes, how these processes are affected by organizational structure and by information technology resources, and how these processes affect organizational performance. This is done in the context of Accreditation Standards and Guidelines for the Professional Program in Pharmacy Leading to the Doctor of Pharmacy Degree (Standards 2007). Data were collected using an online census survey of 121 U.S. Colleges and Schools of Pharmacy and supplemented with archival data. A key informant method was used with CEO Deans and Assessment leaders serving as respondents. The survey yielded a 76.0% (92/121) response rate. Exploratory factor analysis was used to construct scales describing core KM processes: Knowledge Acquisition, Knowledge Integration, and Institutionalization; all scale reliabilities were found to be acceptable. Analysis showed that, as expected, greater Knowledge Acquisition predicts greater Knowledge Integration and greater Knowledge Integration predicts greater Institutionalization. Predictive models were constructed using hierarchical multiple regression and path analysis. Overall, information technology resources had stronger effects on KM processes than did characteristics of organizational structure. Greater Institutionalization predicted better outcomes related to direct measures of performance (i.e., NAPLEX pass rates, Accreditation actions) but Institutionalization was unrelated to an indirect measure of performance (i.e., USNWR ratings). Several organizational structure characteristics (i.e., size, age, and being part of an academic health center) were significant predictors of organizational performance; in contrast, IT resources had no direct effects on performance. Findings suggest that knowledge management processes, organizational structures and IT resources are related to better performance for Colleges and Schools of Pharmacy. Further research is needed to understand mechanisms through which specific knowledge management processes translate into better performance and, relatedly, to establish how enhancing KM processes can be used to improve institutional quality.
The long-term ecological research community metadata standardisation project: a progress report
Inigo San Gil; Karen Baker; John Campbell; Ellen G. Denny; Kristin Vanderbilt; Brian Riordan; Rebecca Koskela; Jason Downing; Sabine Grabner; Eda Melendez; Jonathan M. Walsh; Masib Kortz; James Conners; Lynn Yarmey; Nicole Kaplan; Emery R. Boose; Linda Powell; Corinna Gries; Robin Schroeder; Todd Ackerman; Ken Ramsey; Barbara Benson; Jonathan Chipman; James Laundre; Hap Garritt; Don Henshaw; Barrie Collins; Christopher Gardner; Sven Bohm; Margaret O' Brien; Jincheng Gao; Wade Sheldon; Stephanie Lyon; Dan Bahauddin; Mark Servilla; Duane Costa; James Brunt
2009-01-01
We describe the process by which the Long-Term Ecological Research (LTER) Network standardized their metadata through the adoption of the Ecological Metadata Language (EML). We describe the strategies developed to improve motivation and to complement the information technology resources available at the LTER sites. EML implementation is presented as a mapping process...
The Power of Teacher Selection to Improve Education. Evidence Speaks Reports, Vol 1, #12
ERIC Educational Resources Information Center
Jacob, Brian A.
2016-01-01
This report describes the findings from a new study of the teacher selection process in Washington, DC public schools. In 2009, the district created a centralized application process to streamline hiring by screening out less desirable candidates. Following the collection of standard information, applicants are asked to complete up to three…
Parameter Variability and Distributional Assumptions in the Diffusion Model
ERIC Educational Resources Information Center
Ratcliff, Roger
2013-01-01
If the diffusion model (Ratcliff & McKoon, 2008) is to account for the relative speeds of correct responses and errors, it is necessary that the components of processing identified by the model vary across the trials of a task. In standard applications, the rate at which information is accumulated by the diffusion process is assumed to be normally…
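As a sketch of the mechanism the abstract describes, the following Python simulation draws the drift rate anew on each trial from a normal distribution, which is the across-trial variability that lets the diffusion model produce different speeds for correct responses and errors; the parameter values are illustrative, not Ratcliff's.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_trial(v_mean=0.3, eta=0.15, a=1.0, z=0.5, s=0.3, dt=0.001):
    """One diffusion trial with across-trial drift variability (illustrative values)."""
    v = rng.normal(v_mean, eta)          # drift rate varies from trial to trial
    x, t = z * a, 0.0                    # start evidence at fraction z of boundary a
    while 0.0 < x < a:                   # accumulate until a boundary is hit
        x += v * dt + s * np.sqrt(dt) * rng.normal()
        t += dt
    return t, x >= a                     # (response time, correct?)

results = [simulate_trial() for _ in range(2000)]
rts = np.array([r[0] for r in results])
correct = np.array([r[1] for r in results])
print(f"accuracy={correct.mean():.2f}, "
      f"mean RT correct={rts[correct].mean():.3f}s, "
      f"mean RT error={rts[~correct].mean():.3f}s")
```

With drift variability (eta > 0), errors tend to come from low-drift trials and are therefore slower on average, which is the empirical pattern the variability assumption is meant to capture.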
Information content versus word length in random typing
NASA Astrophysics Data System (ADS)
Ferrer-i-Cancho, Ramon; Moscoso del Prado Martín, Fermín
2011-12-01
Recently, it has been claimed that a linear relationship between a measure of information content and word length is expected from word length optimization and it has been shown that this linearity is supported by a strong correlation between information content and word length in many languages (Piantadosi et al 2011 Proc. Nat. Acad. Sci. 108 3825). Here, we study in detail some connections between this measure and standard information theory. The relationship between the measure and word length is studied for the popular random typing process where a text is constructed by pressing keys at random from a keyboard containing letters and a space behaving as a word delimiter. Although this random process does not optimize word lengths according to information content, it exhibits a linear relationship between information content and word length. The exact slope and intercept are presented for three major variants of the random typing process. A strong correlation between information content and word length can simply arise from the units making a word (e.g., letters) and not necessarily from the interplay between a word and its context as proposed by Piantadosi and co-workers. In itself, the linear relation does not entail the results of any optimization process.
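A quick way to see the claimed linearity is to simulate the random typing process directly. The sketch below (my assumptions: a 5-letter alphabet and a space key pressed with the same probability as any letter) estimates each word's information content from its empirical probability and reports the near-linear growth with length.

```python
import math
import random
from collections import Counter

random.seed(1)
KEYS = "abcde "                      # 5 letters plus a space as word delimiter
text = "".join(random.choice(KEYS) for _ in range(500_000))
words = [w for w in text.split(" ") if w]

counts = Counter(words)
total = sum(counts.values())

# Mean information content, -log2 p(word), grouped by word length.
by_length = {}
for word, n in counts.items():
    by_length.setdefault(len(word), []).append(-math.log2(n / total))

for length in sorted(by_length)[:6]:
    vals = by_length[length]
    print(length, round(sum(vals) / len(vals), 2))
# The increments between successive lengths are roughly constant
# (about log2(6) per extra letter here): information content grows
# linearly with word length even though nothing is being optimized.
```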
Materials, processes, and environmental engineering network
NASA Technical Reports Server (NTRS)
White, Margo M.
1993-01-01
The Materials, Processes, and Environmental Engineering Network (MPEEN) was developed as a central holding facility for materials testing information generated by the Materials and Processes Laboratory. It contains information from other NASA centers and outside agencies, and also includes the NASA Environmental Information System (NEIS) and Failure Analysis Information System (FAIS) data. Environmental replacement materials information is a newly developed focus of MPEEN. This database is the NASA Environmental Information System, NEIS, which is accessible through MPEEN. Environmental concerns are addressed regarding materials identified by the NASA Operational Environment Team, NOET, to be hazardous to the environment. An environmental replacement technology database is contained within NEIS. Environmental concerns about materials are identified by NOET, and control or replacement strategies are formed. This database also contains the usage and performance characteristics of these hazardous materials. In addition to addressing environmental concerns, MPEEN contains one of the largest materials databases in the world. Over 600 users access this network on a daily basis. There is information available on failure analysis, metals and nonmetals testing, materials properties, standard and commercial parts, foreign alloy cross-reference, Long Duration Exposure Facility (LDEF) data, and Materials and Processes Selection List data.
Bodner, Todd E.
2017-01-01
Wilkinson and Task Force on Statistical Inference (1999) recommended that researchers include information on the practical magnitude of effects (e.g., using standardized effect sizes) to distinguish between the statistical and practical significance of research results. To date, however, researchers have not widely incorporated this recommendation into the interpretation and communication of the conditional effects and differences in conditional effects underlying statistical interactions involving a continuous moderator variable where at least one of the involved variables has an arbitrary metric. This article presents a descriptive approach to investigate two-way statistical interactions involving continuous moderator variables where the conditional effects underlying these interactions are expressed in standardized effect size metrics (i.e., standardized mean differences and semi-partial correlations). This approach permits researchers to evaluate and communicate the practical magnitude of particular conditional effects and differences in conditional effects using conventional and proposed guidelines, respectively, for the standardized effect size and therefore provides the researcher important supplementary information lacking under current approaches. The utility of this approach is demonstrated with two real data examples and important assumptions underlying the standardization process are highlighted. PMID:28484404
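A minimal numeric sketch of this idea follows, under my assumption (not the article's exact procedure) that in the model y = b0 + b1·x + b2·z + b3·xz the conditional effect of x at moderator value z is b1 + b3·z, standardized here by the residual standard deviation.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
x = rng.binomial(1, 0.5, n)             # binary predictor (e.g., treatment)
z = rng.normal(0, 1, n)                 # continuous moderator
y = 0.2 + 0.5 * x + 0.1 * z + 0.4 * x * z + rng.normal(0, 1, n)

# Fit y = b0 + b1*x + b2*z + b3*x*z by ordinary least squares.
X = np.column_stack([np.ones(n), x, z, x * z])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid_sd = np.std(y - X @ b, ddof=X.shape[1])

# Conditional effect of x at chosen moderator values, expressed as a
# standardized mean difference (raw conditional effect / residual SD) --
# one plausible reading of the standardization step, for illustration.
for z0 in (-1.0, 0.0, 1.0):
    d = (b[1] + b[3] * z0) / resid_sd
    print(f"z = {z0:+.1f}: standardized conditional effect d = {d:.2f}")
```

Expressing each conditional effect in a standardized metric lets it be judged against conventional effect-size guidelines, which is the practical-significance reporting the abstract advocates.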
Paperless Procurement: The Impact of Advanced Automation
1992-09-01
System. POPS = Paperless Order Processing System; RADMIS = Research and Development Management Information System; SAACONS = Standard Army Automated... order processing system, which then updates the contractor's production (or delivery) scheduling and contract accounting applications. In return, the...used by the DLA's POPS...into an EDI delivery order and pass it directly to the distributor's or manufacturer's order processing system. That
Freedom of information: a case study.
Worsfold, Denise
2006-09-01
The purpose of this case study was to find out how easy it was to access information on the hygiene standards of eating places open to the public. Using the Freedom of Information (FOI) Act 2000, four adjacent local authorities in South Wales were asked to provide the last food hygiene report of an eating place in their area. The disclosed reports were assessed to determine how useful they would be to an individual seeking more information on a food premises. It was relatively easy to obtain information from two authorities and difficult if not impossible with the others. One local authority refused to release information despite the intervention of the FOI Commissioner. The quality of the information released was variable. This ranged from a completed comprehensive inspection protocol to a hand-written, illegible, incomplete report that failed to adequately differentiate between requirements and recommendations. Without some training in food law and food hygiene it would be difficult to interpret the reports. There was no evidence from the information provided of inspection scoring. The case study raises concerns about the effectiveness of the Act for consumers who wish to obtain information about the hygiene standards of food premises. While the specialist information provided by hygiene inspection reports may be useful to businesses it is not helpful for the lay public. Consumers must be prepared to exercise patience and tenacity if they want this information. Concerns must be raised about the consistency of the inspection process and about the willingness of some local authorities to be transparent about the inspection and enforcement process.
National Practice Patterns of Obtaining Informed Consent for Stroke Thrombolysis.
Mendelson, Scott J; Courtney, D Mark; Gordon, Elisa J; Thomas, Leena F; Holl, Jane L; Prabhakaran, Shyam
2018-03-01
No standard approach to obtaining informed consent for stroke thrombolysis with tPA (tissue-type plasminogen activator) currently exists. We aimed to assess current nationwide practice patterns of obtaining informed consent for tPA. An online survey was developed and distributed by e-mail to clinicians involved in acute stroke care. Multivariable logistic regression analyses were performed to determine independent factors contributing to always obtaining informed consent for tPA. Among 268 respondents, 36.7% reported always obtaining informed consent and 51.8% reported the informed consent process caused treatment delays. Being an emergency medicine physician (odds ratio, 5.8; 95% confidence interval, 2.9-11.5) and practicing at a nonacademic medical center (odds ratio, 2.1; 95% confidence interval, 1.0-4.3) were independently associated with always requiring informed consent. The most commonly cited cause of delay was waiting for a patient's family to reach consensus about treatment. Most clinicians always or often require informed consent for stroke thrombolysis. Future research should focus on standardizing content and delivery of tPA information to reduce delays. © 2018 American Heart Association, Inc.
How to (and how not to) think about top-down influences on visual perception.
Teufel, Christoph; Nanay, Bence
2017-01-01
The question of whether cognition can influence perception has a long history in neuroscience and philosophy. Here, we outline a novel approach to this issue, arguing that it should be viewed within the framework of top-down information-processing. This approach leads to a reversal of the standard explanatory order of the cognitive penetration debate: we suggest studying top-down processing at various levels without preconceptions of perception or cognition. Once a clear picture has emerged about which processes have influences on those at lower levels, we can re-address the extent to which they should be considered perceptual or cognitive. Using top-down processing within the visual system as a model for higher-level influences, we argue that the current evidence indicates clear constraints on top-down influences at all stages of information processing; it does, however, not support the notion of a boundary between specific types of information-processing as proposed by the cognitive impenetrability hypothesis. Copyright © 2016 Elsevier Inc. All rights reserved.
Parent educational materials regarding the newborn hearing screening process.
Krishnan, Lata A; Lawler, Breanne; Van Hyfte, Shannon
2017-04-01
Newborn hearing screening (NHS) procedures and implementation vary from state to state in the US. The purpose of this study was to evaluate the content and nature of information provided to parents about their infant's NHS across states to answer two questions: 1) what information is included in each state's parent information brochure? and 2) do the brochures include educational information requested by parents that may help reduce parental anxiety, improve satisfaction, and decrease the potential for misunderstandings? Each state's parent brochures and educational resources provided to parents were accessed via the National Center for Hearing Assessment and Management (NCHAM) website, categorized, and reviewed for content. Results indicate that the information provided to parents varies considerably across states and many brochures do not contain important information that is desired by parents. NHS procedures may be improved by providing standardized information regarding the process to parents in all states. Copyright © 2017 Elsevier B.V. All rights reserved.
Marginally specified priors for non-parametric Bayesian estimation
Kessler, David C.; Hoff, Peter D.; Dunson, David B.
2014-01-01
Prior specification for non-parametric Bayesian inference involves the difficult task of quantifying prior knowledge about a parameter of high, often infinite, dimension. A statistician is unlikely to have informed opinions about all aspects of such a parameter but will have real information about functionals of the parameter, such as the population mean or variance. The paper proposes a new framework for non-parametric Bayes inference in which the prior distribution for a possibly infinite dimensional parameter is decomposed into two parts: an informative prior on a finite set of functionals, and a non-parametric conditional prior for the parameter given the functionals. Such priors can be easily constructed from standard non-parametric prior distributions in common use and inherit the large support of the standard priors on which they are based. Additionally, posterior approximations under these informative priors can generally be made via minor adjustments to existing Markov chain approximation algorithms for standard non-parametric prior distributions. We illustrate the use of such priors in the context of multivariate density estimation using Dirichlet process mixture models, and in the modelling of high dimensional sparse contingency tables. PMID:25663813
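In symbols, the decomposition the abstract describes can be written as follows (my notation, not the paper's), where ψ(θ) collects the finite set of functionals:

```latex
\pi(\theta) \;=\; \underbrace{\pi_{1}\bigl(\psi(\theta)\bigr)}_{\text{informative prior on functionals}}
\;\times\;
\underbrace{\pi_{0}\bigl(\theta \mid \psi(\theta)\bigr)}_{\text{non-parametric conditional prior}}
```

so the statistician specifies real prior information only through π1, while π0 is inherited from a standard non-parametric prior.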
Information security concepts and practices: the case of a provincial multi-specialty hospital.
Cavalli, Enrico; Mattasoglio, Andrea; Pinciroli, Francesco; Spaggiari, Piergiorgio
2004-03-31
In recent years, major and widely accepted information security understandings and achievements confirm that the problem is complex. They clarify that technologies are fundamental tools, but that management processes have even greater relevance, as dossiers in prestigious international magazines have recently made clear. Such media attention underlines the wide impact the subject has on watchful decision makers. ISO17799 is an emerging standard in information security. In principle there is no reason to consider it inapplicable to the health care sector. In practice, because of both the purely conceptual level of the standard and the peculiarities of health care data and institutions, considerable analysis and design work must be invested whenever a health care institution decides to deal with the subject. CEN/ENV 12924 is another emerging standard, certainly more specific to health care; nevertheless, it too requires evident further investigation. The practical case of information security design, implementation, management, and auditing in a multi-specialty provincial Italian hospital is described.
Total quality management - It works for aerospace information services
NASA Technical Reports Server (NTRS)
Erwin, James; Eberline, Carl; Colquitt, Wanda
1993-01-01
Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvements techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal, to identify processes and procedures that can be improved and new technologies that can be integrated with the processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. This Program seeks to establish quality through an iterative defect prevention approach that is based on the incorporation of standards and measurements into the processing cycle.
[Development of a medical equipment support information system based on PDF portable document].
Cheng, Jiangbo; Wang, Weidong
2010-07-01
In line with the organizational structure and management system of hospital medical engineering support, the medical engineering support workflow was integrated to ensure that medical engineering data are collected effectively, accurately, and comprehensively and are kept in electronic archives. The workflow of medical equipment support was analyzed, and all work processes were recorded in portable electronic documents. Using XML middleware technology and a SQL Server database, the system implements process management, data calculation, submission, storage, and other functions. Practical application shows that the medical equipment support information system optimizes the existing work process, making it standardized, digital, automated, efficient, orderly, and controllable. A medical equipment support information system based on portable electronic documents can effectively optimize and improve hospital medical engineering support work, improve performance, reduce costs, and provide complete and accurate digital data.
ERIC Educational Resources Information Center
Penton, John
Designed to provide information about reading in New Zealand, this report offers an overview of theory and practice in that area. Among the topics discussed are: the current concern about reading standards; developmental reading; effective methods of reading instruction; research into the nature of the reading process; preparation of teachers of…
Social scientist's viewpoint on conflict management
Ertel, Madge O.
1990-01-01
Social scientists can bring to the conflict-management process objective, reliable information needed to resolve increasingly complex issues. Engineers need basic training in the principles of the social sciences and in strategies for public involvement. All scientists need to be sure that the information they provide is unbiased by their own value judgments and that fair standards and open procedures govern its use.
Library Information-Processing System
NASA Technical Reports Server (NTRS)
1985-01-01
System works with Library of Congress MARC II format. System composed of subsystems that provide wide range of library information-processing capabilities. Format is American National Standards Institute (ANSI) format for machine-readable bibliographic data. Adaptable to any medium-to-large library.
Energy-Saving Opportunities for Manufacturing Enterprises (International English Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This fact sheet provides information about the Industrial Technologies Program Save Energy Now energy audit process, software tools, training, energy management standards, and energy efficient technologies to help U.S. companies identify energy cost savings.
Citrin, Rebecca; Horowitz, Joseph P; Reilly, Anne F; Li, Yimei; Huang, Yuan-Shung; Getz, Kelly D; Seif, Alix E; Fisher, Brian T; Aplenc, Richard
2017-01-01
Mature B-cell non-Hodgkin lymphoma (B-NHL) constitutes a collection of relatively rare pediatric malignancies. In order to utilize administrative data to perform large-scale epidemiologic studies within this population, a two-step process was used to assemble a 12-year cohort of B-NHL patients treated between 2004 and 2015 within the Pediatric Health Information System database. Patients were identified by ICD-9 codes, and their chemotherapy data were then manually reviewed against standard B-NHL treatment regimens. A total of 1,409 patients were eligible for cohort inclusion. This process was validated at a single center, utilizing both an institutional tumor registry and medical record review as the gold standards. The validation demonstrated appropriate sensitivity (91.5%) and positive predictive value (95.1%) to allow for the future use of this cohort for epidemiologic and comparative effectiveness research.
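As a sketch of the first, code-based step of this two-step process (the ICD-9 screen that precedes manual chemotherapy review), a hypothetical pandas filter is shown below; the code list and column names are illustrative, not the study's actual specification.

```python
import pandas as pd

# Hypothetical admissions extract; columns are assumed for illustration.
admissions = pd.DataFrame({
    "patient_id": [101, 102, 103, 104],
    "icd9_dx":    ["200.20", "486", "200.00", "204.00"],
    "year":       [2004, 2010, 2015, 2012],
})

# Step 1: screen on ICD-9 codes (illustrative mature B-NHL code prefixes).
B_NHL_PREFIXES = ("200.0", "200.2")
candidates = admissions[
    admissions["icd9_dx"].str.startswith(B_NHL_PREFIXES)
    & admissions["year"].between(2004, 2015)
]

# Step 2 (manual in the study): review each candidate's chemotherapy data
# against standard B-NHL regimens before final cohort inclusion.
print(candidates["patient_id"].tolist())
```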
The Role of Healthcare Technology Management in Facilitating Medical Device Cybersecurity.
Busdicker, Mike; Upendra, Priyanka
2017-09-02
This article discusses the role of healthcare technology management (HTM) in medical device cybersecurity and outlines concepts that are applicable to HTM professionals at a healthcare delivery organization or at an integrated delivery network, regardless of size. It provides direction for HTM professionals who are unfamiliar with the security aspects of managing healthcare technologies but are familiar with standards from The Joint Commission (TJC). It provides a useful set of recommendations, including relevant references for incorporating good security practices into HTM practice. Recommendations for policies, procedures, and processes referencing TJC standards are easily applicable to HTM departments with limited resources and to those with no resource concerns. The authors outline processes from their organization as well as best practices learned through information sharing at AAMI, National Health Information Sharing and Analysis Center (NH-ISAC), and Medical Device Innovation, Safety, and Security Consortium (MDISS) conferences and workshops.
Developing SoilML as a global standard for the collation and transfer of soil data and information.
NASA Astrophysics Data System (ADS)
Montanarella, Luca; Wilson, Peter; Cox, Simon; McBratney, Alex; Ahamed, Sonya; McMillan, Bob; Jacquier, David; Fortner, Jim
2010-05-01
There is an increasing need to collect, collate and share soil data and information within countries, across regions and globally. Timely access to consistent and authoritative data and information is critical to issues related to food production, climate change, water management, energy production and biodiversity. Soil data and information are managed by numerous agencies and organisations using a plethora of processes, scales and standards. A number of national and international activities and projects are currently dealing with the issues associated with the collation of disparate data sets. Standards are being developed for data storage, transfer and collation, for example in the GlobalSoilMap.net project, e-SOTER and the EU INSPIRE GS-SOIL. Individually these will not provide a single internationally recognised and adopted standard for soil data and information exchange. A recent GlobalSoilMap.net meeting held in Wageningen, The Netherlands, discussed the needs of a harmonized information model for collation of a global 90 metre grid of key soil attributes (organic carbon, soil texture, pH, depth to bedrock/impeding layer, and predictions of bulk density and available water capacity) at six specified depth increments. The meeting considered a number of existing database implementations (such as ASRIS, NASIS, WISE, SOTER) as well as emerging abstract information models that are being expressed in UML (such as e-SOTER). It examined related information models, such as GeoSciML, and the lessons learnt in developing and implementing such community-agreed models, features and vocabularies. There is a need to develop a global soil information standard, to be called SoilML, that would allow access and use of data across a broad range of international initiatives (such as GEOSS and INSPIRE) as well as supporting national, regional and local data interoperability and integration. The meeting agreed to adopt the interoperability approaches of formalising the information model in UML with XML encoding for data transfer as well as re-using existing features and patterns where appropriate, such as those found in GeoSciML and Observations and Measurements. It has been proposed to establish a formal Working Group on Soil Information Standards under the International Union of Soil Science to give the SoilML information model both scientific credibility and international standing. A number of meetings and workshops are being planned to progress the draft SoilML information model.
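Since SoilML is to be formalized in UML with an XML encoding for data transfer, a sketch of what an encoded record might look like is given below; the element names and attribute set are hypothetical illustrations keyed to the soil attributes listed above (the actual schema was still in draft).

```python
import xml.etree.ElementTree as ET

# Hypothetical SoilML-like record for one grid cell and one depth increment.
site = ET.Element("SoilObservation", id="cell-000001")
ET.SubElement(site, "depthIncrement", unit="cm", top="0", bottom="5")
for name, value, unit in [
    ("organicCarbon", "12.3", "g/kg"),
    ("pH", "6.1", "pH"),
    ("depthToBedrock", "85", "cm"),
]:
    prop = ET.SubElement(site, "property", name=name, unit=unit)
    prop.text = value

print(ET.tostring(site, encoding="unicode"))
```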
2008-07-01
cycle Evolution of a system, product, service, project or other human-made entity from conception through retirement [ISO 12207]. Logical line of...012 [ISO 1995] International Organization for Standardization. ISO/IEC 12207:1995—Information technology—Software life cycle processes. http...definitions, authors were asked to use or align with already existing standards such as those available through ISO and IEEE when possible. Literature
A Model for Joint Software Reviews
1998-10-01
CEPMAN 1, 1996; Gabb, 1997], and with the growing popularity of outsourcing, they are becoming more important in the commercial sector [ISO/IEC 12207...technical and management reviews [MIL-STD-498, 1996; ISO/IEC 12207, 1995]. Management reviews occur after technical reviews, and are focused on the cost...characteristics, Standard (No. ISO/IEC 9126-1). [ISO/IEC 12207, 1995] Information Technology Software Life Cycle Processes, Standard (No. ISO/IEC 12207
The bottom line on wound care standardization.
McNees, Patrick; Kueven, John A
2011-03-01
North Mississippi Medical Center learned four important lessons in its effort to standardize wound care products and processes of care: Designate clinical champions. Limit the number of vendors that provide products for a specialty. Reduce the number of similar-function wound care products that are available for use, with decisions guided by clinicians. Provide transparent information related to cost, the intentions of the initiative, usage patterns, and the criteria for evaluation.
Sauer, Vernon B.
2002-01-01
Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U.S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system easily can monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods are interactive processes between the user and the electronic processing system, with much of this processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, control structures, and unsteady-flow models, such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stage-discharge, velocity-index, slope, rate-of-change in stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs. Documentation, review, finalization, and publication of records are provided for with the electronic processing system, as well as archiving, quality assurance, and quality control.
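One of the computational building blocks mentioned above is the rating curve. The sketch below fits the conventional power-law stage-discharge form Q = C(h − e)^b by linear regression in log space, using made-up measurement data and an assumed gage-height offset e; it is an illustration of the technique, not the USGS system's implementation.

```python
import numpy as np

# Made-up field measurements: gage height h (ft) and discharge Q (ft^3/s).
h = np.array([2.1, 2.8, 3.5, 4.4, 5.6, 7.0])
q = np.array([12.0, 35.0, 75.0, 150.0, 290.0, 540.0])
e = 1.5  # assumed effective gage height of zero flow

# Q = C * (h - e)^b  =>  log Q = log C + b * log(h - e), a straight line.
b_exp, log_c = np.polyfit(np.log(h - e), np.log(q), 1)
c = np.exp(log_c)
print(f"fitted rating: Q = {c:.1f} * (h - {e})^{b_exp:.2f}")
print(f"predicted Q at h = 5.0 ft: {c * (5.0 - e) ** b_exp:.0f} ft^3/s")
```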
Localized analysis of paint-coat drying using dynamic speckle interferometry
NASA Astrophysics Data System (ADS)
Sierra-Sosa, Daniel; Tebaldi, Myrian; Grumel, Eduardo; Rabal, Hector; Elmaghraby, Adel
2018-07-01
Paint coating is part of several industrial processes, including the automotive industry, architectural coatings, and machinery and appliances. These paint coatings must comply with high quality standards; for this reason, evaluation techniques for paint coatings are in constant development. One important factor in the paint-coating process is drying, as it influences the quality of the final result. In this work we present an assessment technique based on optical dynamic speckle interferometry that allows the temporal activity of the paint-coating drying process to be evaluated, providing localized information on drying. This localized information is relevant for addressing drying homogeneity, optimal drying, and quality control. The technique relies on the definition of a new temporal history of the speckle patterns to obtain the local activity; this information is then clustered to provide a convenient indication of the different drying process stages. The experimental results presented were validated using gravimetric drying curves.
Security Risk Assessment Process for UAS in the NAS CNPC Architecture
NASA Technical Reports Server (NTRS)
Iannicca, Dennis C.; Young, Dennis P.; Thadani, Suresh K.; Winter, Gilbert A.
2013-01-01
This informational paper discusses the risk assessment process conducted to analyze Control and Non-Payload Communications (CNPC) architectures for integrating civil Unmanned Aircraft Systems (UAS) into the National Airspace System (NAS). The assessment employs the National Institute of Standards and Technology (NIST) Risk Management framework to identify threats, vulnerabilities, and risks to these architectures and recommends corresponding mitigating security controls. This process builds upon earlier work performed by RTCA Special Committee (SC) 203 and the Federal Aviation Administration (FAA) to roadmap the risk assessment methodology and to identify categories of information security risks that pose a significant impact to aeronautical communications systems. A description of the deviations from the typical process is described in regards to this aeronautical communications system. Due to the sensitive nature of the information, data resulting from the risk assessment pertaining to threats, vulnerabilities, and risks is beyond the scope of this paper.
Quality and efficiency successes leveraging IT and new processes.
Chaiken, Barry P; Christian, Charles E; Johnson, Liz
2007-01-01
Today, healthcare annually invests billions of dollars in information technology, including clinical systems, electronic medical records and interoperability platforms. While continued investment and parallel development of standards are critical to secure exponential benefits from clinical information technology, intelligent and creative redesign of processes through path innovation is necessary to deliver meaningful value. Reports from two organizations included in this report review the steps taken to reinvent clinical processes that best leverage information technology to deliver safer and more efficient care. Good Samaritan Hospital, Vincennes, Indiana, implemented electronic charting, point-of-care bar coding of medications prior to administration, and integrated clinical documentation for nursing, laboratory, radiology and pharmacy. Tenet Healthcare, during its implementation and deployment of multiple clinical systems across several hospitals, focused on planning that included team-based process redesign. In addition, Tenet constructed valuable and measurable metrics that link outcomes with its strategic goals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The purpose of this inventory of power plants is to provide a ready reference for planners whose focus is on the state, standard Federal region, and/or national level. Thus the inventory is compiled alphabetically by state within standard Federal regions. The units are listed alphabetically within electric utility systems which in turn are listed alphabetically within states. The locations are identified to county level according to the Federal Information Processing Standards Publication Counties and County Equivalents of the States of the United States. Data compiled include existing and projected electrical generation units, jointly owned units, and projected construction units.
Interpreting and Reporting Radiological Water-Quality Data
McCurdy, David E.; Garbarino, John R.; Mullin, Ann H.
2008-01-01
This document provides information to U.S. Geological Survey (USGS) Water Science Centers on interpreting and reporting radiological results for samples of environmental matrices, most notably water. The information provided is intended to be broadly useful throughout the United States, but it is recommended that scientists who work at sites containing radioactive hazardous wastes need to consult additional sources for more detailed information. The document is largely based on recognized national standards and guidance documents for radioanalytical sample processing, most notably the Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP), and on documents published by the U.S. Environmental Protection Agency and the American National Standards Institute. It does not include discussion of standard USGS practices including field quality-control sample analysis, interpretive report policies, and related issues, all of which shall always be included in any effort by the Water Science Centers. The use of 'shall' in this report signifies a policy requirement of the USGS Office of Water Quality.
Lin, Yanhua; Staes, Catherine J; Shields, David E; Kandula, Vijay; Welch, Brandon M; Kawamoto, Kensaku
2015-01-01
When coupled with a common information model, a common terminology for clinical decision support (CDS) and electronic clinical quality measurement (eCQM) could greatly facilitate the distributed development and sharing of CDS and eCQM knowledge resources. To enable such scalable knowledge authoring and sharing, we systematically developed an extensible and standards-based terminology for CDS and eCQM in the context of the HL7 Virtual Medical Record (vMR) information model. The development of this terminology entailed three steps: (1) systematic, physician-curated concept identification from sources such as the Health Information Technology Standards Panel (HITSP) and the SNOMED-CT CORE problem list; (2) concept de-duplication leveraging the Unified Medical Language System (UMLS) MetaMap and Metathesaurus; and (3) systematic concept naming using standard terminologies and heuristic algorithms. This process generated 3,046 concepts spanning 68 domains. Evaluation against representative CDS and eCQM resources revealed approximately 50-70% concept coverage, indicating the need for continued expansion of the terminology.
Bidgood, W. Dean; Bray, Bruce; Brown, Nicolas; Mori, Angelo Rossi; Spackman, Kent A.; Golichowski, Alan; Jones, Robert H.; Korman, Louis; Dove, Brent; Hildebrand, Lloyd; Berg, Michael
1999-01-01
Objective: To support clinically relevant indexing of biomedical images and image-related information based on the attributes of image acquisition procedures and the judgments (observations) expressed by observers in the process of image interpretation. Design: The authors introduce the notion of “image acquisition context,” the set of attributes that describe image acquisition procedures, and present a standards-based strategy for utilizing the attributes of image acquisition context as indexing and retrieval keys for digital image libraries. Methods: The authors' indexing strategy is based on an interdependent message/terminology architecture that combines the Digital Imaging and Communication in Medicine (DICOM) standard, the SNOMED (Systematized Nomenclature of Human and Veterinary Medicine) vocabulary, and the SNOMED DICOM microglossary. The SNOMED DICOM microglossary provides context-dependent mapping of terminology to DICOM data elements. Results: The capability of embedding standard coded descriptors in DICOM image headers and image-interpretation reports improves the potential for selective retrieval of image-related information. This favorably affects information management in digital libraries. PMID:9925229
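A minimal sketch, assuming the pydicom library, of embedding a coded descriptor of the kind described (the code value, coding scheme designator, and code meaning triplet DICOM uses) in an image header attribute; the particular code shown is illustrative and not drawn from the SNOMED DICOM microglossary.

```python
from pydicom.dataset import Dataset

# Build a coded concept: the (value, scheme, meaning) triplet DICOM uses.
code_item = Dataset()
code_item.CodeValue = "T-28000"               # illustrative SNOMED-style code
code_item.CodingSchemeDesignator = "SRT"      # SNOMED designator used by DICOM
code_item.CodeMeaning = "Lung"

ds = Dataset()
ds.AnatomicRegionSequence = [code_item]       # embed in an image header attribute

# The coded entry can now serve as a standardized indexing/retrieval key.
print(ds.AnatomicRegionSequence[0].CodeMeaning)
```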
2018-04-01
referred to as “defense in depth” and has been the standard model of information security management for at least a decade. Concepts such as mandatory...instrumentation into the system and monitoring this instrumentation with appropriate reports and alerts (e.g., security information event management tools or...Coalition Battle Management Language (C-BML) (NATO 2012) define information (orders, plans, reports, requests, etc.) that can be readily processed by
NASA Handbook for Models and Simulations: An Implementation Guide for NASA-STD-7009
NASA Technical Reports Server (NTRS)
Steele, Martin J.
2013-01-01
The purpose of this Handbook is to provide technical information, clarification, examples, processes, and techniques to help institute good modeling and simulation practices in the National Aeronautics and Space Administration (NASA). As a companion guide to NASA-STD-7009, Standard for Models and Simulations, this Handbook provides a broader scope of information than may be included in a Standard and promotes good practices in the production, use, and consumption of NASA modeling and simulation products. NASA-STD-7009 specifies what a modeling and simulation activity shall or should do (in the requirements) but does not prescribe how the requirements are to be met, which varies with the specific engineering discipline, or who is responsible for complying with the requirements, which depends on the size and type of project. A guidance document, which is not constrained by the requirements of a Standard, is better suited to address these additional aspects and provide necessary clarification. This Handbook stems from the Space Shuttle Columbia Accident Investigation (2003), which called for Agency-wide improvements in the "development, documentation, and operation of models and simulations" that subsequently elicited additional guidance from the NASA Office of the Chief Engineer to include "a standard method to assess the credibility of the models and simulations." General methods applicable across the broad spectrum of model and simulation (M&S) disciplines were sought to help guide the modeling and simulation processes within NASA and to provide for consistent reporting of M&S activities and analysis results. From this, the standardized process for the M&S activity was developed. The major contents of this Handbook are the implementation details of the general M&S requirements of NASA-STD-7009, including explanations, examples, and suggestions for improving the credibility assessment of an M&S-based analysis.
Harmonization in laboratory medicine: Requests, samples, measurements and reports.
Plebani, Mario
2016-01-01
In laboratory medicine, the terms "standardization" and "harmonization" are frequently used interchangeably as the final goal is the same: the equivalence of measurement results among different routine measurement procedures over time and space according to defined analytical and clinical quality specifications. However, the terms define two distinct, albeit closely linked, concepts based on traceability principles. The word "standardization" is used when results for a measurement are equivalent and traceable to the International System of Units (SI) through a high-order primary reference material and/or a reference measurement procedure (RMP). "Harmonization" is generally used when results are equivalent, but neither a high-order primary reference material nor a reference measurement procedure is available. Harmonization is a fundamental aspect of quality in laboratory medicine as its ultimate goal is to improve patient outcomes through the provision of accurate and actionable laboratory information. Patients, clinicians and other healthcare professionals assume that clinical laboratory tests performed by different laboratories at different times on the same sample and specimen can be compared, and that results can be reliably and consistently interpreted. Unfortunately, this is not necessarily the case, because many laboratory test results are still highly variable and poorly standardized and harmonized. Although the initial focus was mainly on harmonizing and standardizing analytical processes and methods, the scope of harmonization now also includes all other aspects of the total testing process (TTP), such as terminology and units, report formats, reference intervals and decision limits as well as tests and test profiles, requests and criteria for interpretation. Several projects and initiatives aiming to improve standardization and harmonization in the testing process are now underway. Laboratory professionals should therefore step up their efforts to provide interchangeable and comparable laboratory information in order to ultimately assure better diagnosis and treatment in patient care.
Toward the First Data Acquisition Standard in Synthetic Biology.
Sainz de Murieta, Iñaki; Bultelle, Matthieu; Kitney, Richard I
2016-08-19
This paper describes the development of a new data acquisition standard for synthetic biology. This comprises the creation of a methodology designed to capture all the data, metadata, and protocol information associated with biopart characterization experiments. The new standard, called DICOM-SB, is based on the highly successful Digital Imaging and Communications in Medicine (DICOM) standard in medicine. A data model is described which has been specifically developed for synthetic biology. It is a modular, extensible model of the experimental process that can optimize data storage for large amounts of data. DICOM-SB also includes services oriented toward the automatic exchange of data and information between modalities and repositories. DICOM-SB has been developed in the context of systematic design in synthetic biology, which is based on the engineering principles of modularity, standardization, and characterization. The systematic design approach utilizes the design, build, test, and learn design cycle paradigm. DICOM-SB has been designed to be compatible with, and complementary to, other standards in synthetic biology, including SBOL. In this regard, the software provides effective interoperability. The new standard has been tested by experiments and data exchange between Nanyang Technological University in Singapore and Imperial College London.
Architectural approaches for HL7-based health information systems implementation.
López, D M; Blobel, B
2010-01-01
Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology, approaching different aspects of systems architecture such as the business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue in any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model, offered by HIS-DF and supported by HL7 v3 artifacts, is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standard-based health information systems.
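The mediator model can be pictured with a short sketch (here in Python; the message type and payload are hypothetical): each system subscribes to the mediator instead of interfacing with every other system directly, so n systems need n connections rather than n·(n-1) point-to-point interfaces.

    class Mediator:
        def __init__(self):
            self.handlers = {}   # message type -> list of subscriber callbacks

        def subscribe(self, msg_type, handler):
            self.handlers.setdefault(msg_type, []).append(handler)

        def publish(self, msg_type, payload):
            # Route one inbound message to every interested system.
            for handler in self.handlers.get(msg_type, []):
                handler(payload)

    bus = Mediator()
    bus.subscribe("ADT^A01", lambda msg: print("surveillance system got admit:", msg))
    bus.publish("ADT^A01", {"patient_id": "12345"})   # hypothetical HL7 v2 admit event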
Fulton, James L.
1992-01-01
Spatial data analysis has become an integral component of many surface and sub-surface hydrologic investigations within the U.S. Geological Survey (USGS). Currently, one of the largest costs in applying spatial data analysis is the cost of developing the needed spatial data. Therefore, guidelines and standards are required for the development of spatial data in order to allow for data sharing and reuse, eliminating costly redevelopment. To attain this goal, the USGS is expanding efforts to identify guidelines and standards for the development of spatial data for hydrologic analysis. Because of the variety of project and database needs, the USGS has concentrated on developing standards for documenting spatial data sets to aid in the assessment of data set quality and the compatibility of different data sets. An interim data set documentation standard (1990) has been developed that provides a mechanism for associating a wide variety of information with a data set, including data about source material, data automation and editing procedures used, projection parameters, data statistics, descriptions of features and feature attributes, information on organizational contacts, lists of operations performed on the data, and free-form comments and notes about the data made at various times in the evolution of the data set. The interim data set documentation standard has been automated using a commercial geographic information system (GIS) and data set documentation software developed by the USGS. Where possible, USGS-developed software is used to enter data into the data set documentation file automatically. The GIS software closely associates a data set with its data set documentation file; the documentation file is retained with the data set whenever it is modified, copied, or transferred to another computer system. The Water Resources Division of the USGS is continuing to develop spatial data and data processing standards, with emphasis on standards needed to support hydrologic analysis, hydrologic data processing, and the publication of hydrologic thematic maps. There is a need for the GIS vendor community to develop data set documentation tools similar to those developed by the USGS, or to incorporate USGS-developed tools in their software.
[Breast cancer screening process indicators in Mexico: a case study].
Uscanga-Sánchez, Santos; Torres-Mejía, Gabriela; Ángeles-Llerenas, Angélica; Domínguez-Malpica, Raúl; Lazcano-Ponce, Eduardo
2014-01-01
To identify, measure and compare performance indicators of productivity, effective access and service quality for the breast cancer early detection program in Mexico. By means of a case study based on the 2011 Women's Cancer Information System (SICAM), the indicators were measured and compared with the Mexican official standard NOM-041-SSA2-2011 and with international standards. The analysis showed insufficient installed capacity (37%), low coverage in screening (15%), diagnostic evaluation (16%), biopsy (44%) and treatment (57%), and very low effectiveness in terms of confirmed cases relative to the total number of screening mammograms performed (0.04%). No information was available from SICAM to estimate the remaining proposed indicators. Efficient health information systems are required in order to monitor indicators and generate performance observatories for screening programs.
Edge-Based Image Compression with Homogeneous Diffusion
NASA Astrophysics Data System (ADS)
Mainberger, Markus; Weickert, Joachim
It is well-known that edges contain semantically important image information. In this paper we present a lossy compression method for cartoon-like images that exploits information at image edges. These edges are extracted with the Marr-Hildreth operator followed by hysteresis thresholding. Their locations are stored in a lossless way using JBIG. Moreover, we encode the grey or colour values at both sides of each edge by applying quantisation, subsampling and PAQ coding. In the decoding step, information outside these encoded data is recovered by solving the Laplace equation, i.e. we inpaint with the steady state of a homogeneous diffusion process. Our experiments show that the suggested method outperforms the widely-used JPEG standard and can even beat the advanced JPEG2000 standard for cartoon-like images.
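A minimal numpy sketch of the decoding step, assuming the encoded edge pixels are given in a boolean mask (wrap-around boundaries are used here purely for brevity):

    import numpy as np

    def homogeneous_diffusion_inpaint(values, known, n_iter=5000):
        # Solve the Laplace equation by Jacobi iteration: repeatedly replace
        # each unknown pixel by the average of its four neighbours while the
        # encoded edge pixels (known == True) stay fixed.
        u = values.astype(float)
        for _ in range(n_iter):
            avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                          + np.roll(u, 1, 1) + np.roll(u, -1, 1))
            u[~known] = avg[~known]
        return u

The fixed pixels act as boundary conditions, so the iteration converges to the steady state of the homogeneous diffusion process described above.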
DOT National Transportation Integrated Search
2001-01-22
Federal Aviation Regulation (FAR) Part 36, Noise Standards: Aircraft Type and Airworthiness Certification, requires that measured aircraft noise certification data be corrected to a nominal reference-day condition. This correction process...
RMP Guidance for Warehouses - Chapter 6: Prevention Program (Program 2)
If substances you have above threshold are not covered by OSHA's PSM standard, you have a Program 2 process. Your prevention program must include safety information, hazard review, SOPs, training, maintenance, compliance audits, and incident investigation.
CERT Resilience Management Model, Version 1.0
2010-05-01
practice such as ISO 27000, COBIT, or ITIL. If you are a member of an established process improvement community, particularly one centered on CMMI...Systems Audit and Control Association ISO International Organization for Standardization ISSA Information Systems Security Association IT
[The standardization of medical care and the training of medical personnel].
Korbut, V B; Tyts, V V; Boĭshenko, V A
1997-09-01
The medical specialist training at all levels (medical orderly, doctor's assistant, general practitioner, doctor) should be based on medical care standards. Preliminary studies in the field of military medicine standards have demonstrated that the medical service of the Armed Forces of Russia needs standards for medical resources, for structure and organization, and for technology. Standards for military medical service resources should reflect the requirements for the qualification of all medical specialists and for the equipment and materiel of medical set-ups, field medical systems, drugs, etc. Standards for structure and organization should include requirements for command and control systems in the medical services of military formations and task forces and their information support; for health-care and evacuation functions; and for sanitary control, anti-epidemic measures and personnel health protection. The development of technology standards could improve and regulate health care procedures in the process of evacuation. The development of standards will help to solve the problem of a database for the military medical education system and for medical research.
An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.
2013-09-01
Environmental monitoring systems deal with time-sensitive issues that require quick responses in emergency situations. Handling sensor observations in near real time and extracting valuable information from them are challenging issues in these systems from a technical and scientific point of view. Ever-increasing population growth in urban areas has caused problems in developing countries that have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in mega cities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, employing sensor web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionality of a geospatial information system as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share, and process air quality sensor data and to disseminate air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station, where the data are analysed and processed. The extracted air quality status is examined for emergency situations and, if necessary, air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure, presenting an interoperable air quality monitoring system that supports disaster management with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system also provides capabilities to retrieve SOS observations using WPS in a cascaded service-chaining pattern for monitoring trends in sensor observations.
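For concreteness, a request for CO observations in such an SWE-based system might look like the following hedged sketch (the endpoint and identifiers are hypothetical; the key/value binding follows the OGC SOS 2.0 specification):

    import requests

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "urn:example:tehran:station01",      # assumed offering id
        "observedProperty": "urn:example:property:CO",   # assumed property id
    }
    resp = requests.get("http://example.org/sos", params=params, timeout=30)
    resp.raise_for_status()
    # The O&M response would then be parsed, the CO sub-index computed,
    # and a warning e-mail sent to registered users if thresholds are exceeded.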
Towards a service bus for distributed manufacturing
NASA Astrophysics Data System (ADS)
Delgado-Gomes, Vasco; Oliveira-Lima, José A.; Martins, João F.; Jardim-Gonçalves, Ricardo
2013-10-01
The electronic exchange of data between industrial equipment and the manufacturing and information systems of companies is becoming increasingly important with the current trend of shortening product life cycles, a wide range of diversified products, and the need to answer the specific needs of each consumer. In this context, the quality, time and costs involved in integrating information over a company's internal processes, and in the interaction of these processes with customers, suppliers and other business partners, are, in many sectors, far beyond what current technology and communications solutions enable. This paper presents a communication infrastructure to integrate several companies from different sectors of the supply chain, allowing them to exchange their heterogeneous information using a data model composed of different standards.
Organizing Space Shuttle parametric data for maintainability
NASA Technical Reports Server (NTRS)
Angier, R. C.
1983-01-01
A model of organization and management of Space Shuttle data is proposed. Shuttle avionics software is parametrically altered by a reconfiguration process for each flight. As the flight rate approaches an operational level, current methods of data management would become increasingly complex. An alternative method is introduced, using modularized standard data, and its implications for data collection, integration, validation, and reconfiguration processes are explored. Information modules are cataloged for later use, and may be combined in several levels for maintenance. For each flight, information modules can then be selected from the catalog at a high level. These concepts take advantage of the reusability of Space Shuttle information to reduce the cost of reconfiguration as flight experience increases.
NASA Astrophysics Data System (ADS)
Venkrbec, Vaclav; Bittnerova, Lucie
2017-12-01
Building information modeling (BIM) can support effectiveness in many activities in the AEC industry, including the processing of a construction-technological project. This paper presents an approach to using a building information model in higher education, especially during work on a diploma thesis and its supervision. The diploma thesis is project-based work that aims to compile a construction-technological project for a selected construction. The paper describes the use of the input data and the work with them, and compares this process with standard input data such as printed design documentation. The effectiveness of using a building information model as input data for a construction-technological project is described in the conclusion.
Continuous Codes and Standards Improvement (CCSI)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rivkin, Carl H; Burgess, Robert M; Buttner, William J
2015-10-21
As of 2014, the majority of the codes and standards required to initially deploy hydrogen technologies infrastructure in the United States have been promulgated. These codes and standards will be field tested through their application to actual hydrogen technologies projects. Continuous codes and standards improvement (CCSI) is a process of identifying code issues that arise during project deployment and then developing code solutions to these issues. These solutions would typically be proposed amendments to codes and standards. The process is continuous because, as technology and the state of safety knowledge develop, there will be a need to monitor the application of codes and standards and improve them based on information gathered during their application. This paper will discuss code issues that have surfaced through hydrogen technologies infrastructure project deployment and potential code changes that would address these issues. The issues that this paper will address include (1) setback distances for bulk hydrogen storage, (2) code mandated hazard analyses, (3) sensor placement and communication, (4) the use of approved equipment, and (5) system monitoring and maintenance requirements.
Preliminary design review package for the solar heating and cooling central data processing system
NASA Technical Reports Server (NTRS)
1976-01-01
The Central Data Processing System (CDPS) is designed to transform the raw data collected at remote sites into performance evaluation information for assessing the performance of solar heating and cooling systems. Software requirements for the CDPS are described. The programming standards to be used in development, documentation, and maintenance of the software are discussed along with the CDPS operations approach in support of daily data collection and processing.
Defense Acquisitions: How and Where DOD Spends Its Contracting Dollars
2015-04-30
process. GSA is undertaking a multi-year effort to improve the reliability and usefulness of the information contained in FPDS and other federal... Improve FPDS According to GSA, a number of data systems, including FPDS, are undergoing a significant overhaul. This overhaul is a multi-year process ...data accuracy and completeness, then initiating a process to ensure that these standards are met, would improve data accuracy and completeness." U.S
Information risk and security modeling
NASA Astrophysics Data System (ADS)
Zivic, Predrag
2005-03-01
This research paper presentation features current frameworks for addressing risk and security modeling and metrics. The paper analyzes technical-level risk and security metrics from Common Criteria/ISO 15408, the Center for Internet Security guidelines and NSA configuration guidelines, and the metrics used at this level. The IT operational standards view on security metrics, such as GMITS/ISO 13335 and ITIL/ITMS, and architectural guidelines such as ISO 7498-2, is explained. Business process level standards such as ISO 17799, COSO and CobiT are presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO 21827, the NSA Infosec Assessment and CobiT are explored and reviewed. For each defined level of security metrics, the presentation explores the appropriate usage of these standards and discusses the standards' approaches to conducting risk and security metrics. The research findings demonstrate the need for a common baseline for both risk and security metrics. The paper shows the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics, and it is shown that such an approach spans all of the mentioned standards. The proposed 3D visual presentation of the approach and the development of the Information Security Model are analyzed and postulated. The presentation clearly demonstrates the benefits of the proposed attribute-based approach and of a defined risk and security space for modeling and measurement.
A frequency standard via spectrum analysis and direct digital synthesis
NASA Astrophysics Data System (ADS)
Li, Dawei; Shi, Daiting; Hu, Ermeng; Wang, Yigen; Tian, Lu; Zhao, Jianye; Wang, Zhong
2014-11-01
We demonstrated a frequency standard based on a detuned coherent population beating phenomenon. In this phenomenon, the beat between the radio frequency used for laser modulation and the hyperfine splitting can be obtained by digital signal processing technology. After analyzing the spectrum of the beat frequency, the fluctuation information is obtained and applied to compensate for the frequency shift, generating the standard frequency by the digital synthesis method. A frequency instability of 2.6 × 10⁻¹² at 1000 s is observed in our preliminary experiment. By eliminating the phase-locking loop, the method will enable us to achieve a fully digital frequency standard with remarkable stability.
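The spectrum-analysis step can be pictured with a short numpy sketch (an illustration of the general idea only, not the authors' implementation): locate the dominant beat component and derive the offset to be compensated through direct digital synthesis.

    import numpy as np

    def beat_offset(signal, fs, f_nominal):
        # Estimate the beat frequency as the peak of the windowed spectrum,
        # then return its deviation from the nominal value; this offset would
        # be fed back to the DDS to generate the standard frequency.
        spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return f_nominal - freqs[np.argmax(spectrum)]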
Weinstock, Deborah; Failey, Tara
2014-11-01
In the United States, unions, sometimes joined by worker advocacy groups (e.g., Public Citizen and the American Public Health Association), have played a critical role in strengthening worker safety and health protections. They have sought to improve standards that protect workers by participating in the rulemaking process, through written comments and involvement in hearings; lobbying decision-makers; petitioning the Department of Labor; and defending improved standards in court. Their efforts have culminated in more stringent exposure standards, access to information about the presence of potentially hazardous toxic chemicals, and improved access to personal protective equipment, further improving working conditions in the United States.
Dissemination of metabolomics results: role of MetaboLights and COSMOS
2013-01-01
With ever-increasing amounts of metabolomics data produced each year, there is an even greater need to disseminate data and knowledge produced in a standard and reproducible way. To assist with this, a general purpose, open source metabolomics repository, MetaboLights, was launched in 2012. To promote a community standard, building on earlier efforts that culminated in the Metabolomics Standards Initiative (MSI), the COordination of Standards in MetabOlomicS (COSMOS) initiative was introduced. COSMOS aims to link life science e-infrastructures within the worldwide metabolomics community as well as to develop and maintain open source exchange formats for raw and processed data, ensuring a better flow of metabolomics information. PMID:23683662
ERIC Educational Resources Information Center
Evans, Marsha Ann Johnson
2012-01-01
Open Access (OA) to scholarly communications is a critical component in providing equitable admission to scholarly information and a key vehicle toward the achievement of global access to research in the knowledge building process. A standard and universally accepted process for guaranteeing OA permits complimentary access to knowledge, research…
ERIC Educational Resources Information Center
Gifford, Bernard R.
The purpose of this study was to develop a plan for alleviating problems in the collection, processing, and dissemination of educational data as they affect the New York City Board of Education information requirements. The Data Base Management concept was used to analyze three topics: administration, structure, and standards. The study found that…
Development of Electro-Optical Standard Processes for Application
2011-11-01
Defines the process of ...
Exploiting salient semantic analysis for information retrieval
NASA Astrophysics Data System (ADS)
Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui
2016-11-01
Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification and information retrieval. Among these methods, salient semantic analysis (SSA) has been proven to be an effective way to generate conceptual representations for words or documents. However, its feasibility and effectiveness in information retrieval are mostly unknown. In this paper, we study how to use SSA efficiently to improve information retrieval performance, and propose an SSA-based retrieval method under the language model framework. First, the SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations are used in combination to estimate the language models of queries and documents. The proposed method is evaluated on several standard Text REtrieval Conference (TREC) collections; the experimental results show that the proposed models consistently outperform existing Wikipedia-based retrieval methods.
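The combination of representations can be read as an interpolated query-likelihood model; the sketch below (the mixing weight and probability tables are assumptions for illustration, not values from the paper) scores a document by mixing its bag-of-words and SSA concept language models:

    import math

    def interpolated_score(query_terms, p_bow, p_ssa, lam=0.5):
        # log P(q|d) with each term probability interpolated between the
        # bag-of-words model (p_bow) and the SSA concept model (p_ssa);
        # both map a query term to its probability under the document model.
        return sum(math.log(lam * p_bow[t] + (1.0 - lam) * p_ssa[t])
                   for t in query_terms)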
WaterML, an Information Standard for the Exchange of in-situ hydrological observations
NASA Astrophysics Data System (ADS)
Valentine, D.; Taylor, P.; Zaslavsky, I.
2012-04-01
The WaterML 2.0 Standards Working Group (SWG), working within the Open Geospatial Consortium (OGC) and in cooperation with the joint OGC-World Meteorological Organization (WMO) Hydrology Domain Working Group (HDWG), has developed an open standard for the exchange of water observation data: WaterML 2.0. The focus of the standard is time-series data, commonly generated by in-situ monitoring. Such data are of high value for hydrological applications such as flood forecasting, environmental reporting and the support of hydrological infrastructure (e.g. dams, supply systems); they are commonly exchanged, but a lack of standards inhibits efficient reuse and automation. Developing WaterML required a harmonization analysis of existing formats to identify overlapping concepts and come to agreement on harmonized definitions. Generally, the formats captured similar requirements, all with subtle differences, such as how time-series point metadata was handled. The in-progress standard WaterML 2.0 incorporates the semantics of the hydrologic information (location, procedure, and observations) and is implemented as an application schema of the Geography Markup Language version 3.2.1, making use of the OGC Observations & Measurements standards. WaterML 2.0 is designed as an extensible schema to allow the encoding of data for a variety of exchange scenarios. Example areas of usage are: exchange of data for operational hydrological monitoring programs; supporting the operation of infrastructure (e.g. dams, supply systems); cross-border exchange of observational data; release of data for public dissemination; enhancing disaster management through data exchange; and exchange in support of national reporting. The first phase of WaterML 2.0 focused on structural definitions allowing the transfer of time series, with less work on the harmonization of vocabulary items such as quality codes. Vocabularies from different organizations tend to be specific, and agreement on them takes time; this work will be continued by the HDWG, along with extending the information model to cover additional types of hydrologic information: rating and gauging information, and water quality. Rating curves, gaugings and river cross-sections are commonly exchanged in addition to standard time-series data, to convey information relating to conversions such as river level to discharge; members of the HDWG plan to initiate this work in early 2012. Water quality data are varied in the way they are processed and in the number of phenomena they measure, and will require specific extensions to the WaterML 2.0 model, most likely making use of the specimen types within O&M and extensive use of controlled vocabularies. Other future work involves different target encodings for the WaterML 2.0 conceptual model: encodings such as JSON, netCDF and CSV are optimized for particular needs, such as efficiency in the size of the encoding and the parsing of structure, but may not be capable of representing the full extent of the WaterML 2.0 information model. Certain encodings are best matched to particular needs, and the community has begun investigating when and how best to implement them.
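As a small consumer-side sketch, time/value pairs can be pulled out of a WaterML 2.0 document with nothing more than the Python standard library (the file name is hypothetical; the element names follow the published WaterML 2.0 schema):

    import xml.etree.ElementTree as ET

    WML2 = "http://www.opengis.net/waterml/2.0"

    def read_points(path):
        # Collect (time, value) pairs from every MeasurementTVP element
        # of a WaterML 2.0 time-series document.
        root = ET.parse(path).getroot()
        pairs = []
        for tvp in root.iter("{%s}MeasurementTVP" % WML2):
            t = tvp.find("{%s}time" % WML2).text
            v = float(tvp.find("{%s}value" % WML2).text)
            pairs.append((t, v))
        return pairs

    print(read_points("discharge_timeseries.xml")[:5])   # hypothetical file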
NASA Astrophysics Data System (ADS)
Hoffman, Kenneth J.
1995-10-01
Few information systems create a standardized clinical patient record in which there are discrete and concise observations of patient problems and their resolution. Clinical notes are usually narratives that do not support aggregate and systematic outcome analysis. Many programs collect information on diagnoses and coded procedures but are not focused on patient problems. Integrated definition (IDEF) methodology has been accepted by the Department of Defense as part of the Corporate Information Management Initiative and serves as the foundation that establishes a need for automation. We used IDEF modeling to describe present and idealized patient care activities. A logical IDEF data model was created to support those activities. The modeling process allows for accurate cost estimates based upon performed activities, efficient collection of relevant information, and outputs which allow real-time assessments of process and outcomes. This model forms the foundation for a prototype automated clinical information system (ACIS).
Hankin, Steven C.; Blower, Jon D.; Carval, Thierry; Casey, Kenneth S.; Donlon, Craig; Lauret, Olivier; Loubrieu, Thomas; Srinivasan, Ashwanth; Trinanes, Joaquin; Godøy, Øystein; Mendelssohn, Roy; Signell, Richard P.; de La Beaujardiere, Jeff; Cornillon, Peter; Blanc, Frederique; Rew, Russ; Harlan, Jack; Hall, Julie; Harrison, D.E.; Stammer, Detlef
2010-01-01
It is generally recognized that meeting society's emerging environmental science and management needs will require the marine data community to provide simpler, more effective and more interoperable access to its data. There is broad agreement, as well, that data standards are the bedrock upon which interoperability will be built. The path that would bring the marine data community to agree upon and utilize such standards, however, is often elusive. In this paper we examine the trio of standards 1) netCDF files; 2) the Climate and Forecast (CF) metadata convention; and 3) the OPeNDAP data access protocol. These standards taken together have brought our community a high level of interoperability for "gridded" data such as model outputs, satellite products and climatological analyses, and they are gaining rapid acceptance for ocean observations. We will provide an overview of the scope of the contribution that has been made. We then step back from the information technology considerations to examine the community or "social" process by which the successes were achieved. We contrast the path by which the World Meteorological Organization (WMO) has advanced the Global Telecommunications System (GTS) - netCDF/CF/OPeNDAP exemplifying a "bottom up" standards process whereas GTS is "top down". Both of these standards are tales of success at achieving specific purposes, yet each is hampered by technical limitations. These limitations sometimes lead to controversy over whether alternative technological directions should be pursued. Finally we draw general conclusions regarding the factors that affect the success of a standards development effort - the likelihood that an IT standard will meet its design goals and will achieve community-wide acceptance. We believe that a higher level of thoughtful awareness by the scientists, program managers and technology experts of the vital role of standards and the merits of alternative standards processes can help us as a community to reach our interoperability goals faster.
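The practical effect of the netCDF/CF/OPeNDAP trio is that remote gridded data can be subset as if it were a local file. A minimal sketch with the netCDF4 Python library (the URL and variable name are hypothetical; the library must be built with DAP support):

    from netCDF4 import Dataset

    url = "http://example.org/thredds/dodsC/sst_analysis.nc"  # assumed endpoint
    ds = Dataset(url)                  # opens the OPeNDAP URL like a local file
    sst = ds.variables["sst"]          # assumed CF-described variable
    print(sst.units, sst.shape)        # CF metadata travels with the variable
    block = sst[0, 100:110, 200:210]   # only this slab crosses the network

Because the CF convention standardizes names, units and coordinate metadata, a client can interpret the subset without side channels, which is precisely the interoperability the authors describe.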
Final Report of the NASA Office of Safety and Mission Assurance Agile Benchmarking Team
NASA Technical Reports Server (NTRS)
Wetherholt, Martha
2016-01-01
With the software industry rapidly transitioning from waterfall to Agile processes, Terry Wilcutt, Chief, Safety and Mission Assurance, Office of Safety and Mission Assurance (OSMA), established the Agile Benchmarking Team (ABT) to ensure that the NASA Safety and Mission Assurance (SMA) community remains in a position to perform reliable Software Assurance (SA) on NASA's critical software (SW) systems. The Team's tasks were: 1. Research background literature on current Agile processes; 2. Perform benchmark activities with other organizations that are involved in software Agile processes to determine best practices; 3. Collect information on Agile-developed systems to enable improvements to the current NASA standards and processes to enhance their ability to perform reliable software assurance on NASA Agile-developed systems; 4. Suggest additional guidance and recommendations for updates to those standards and processes, as needed. The ABT's findings and recommendations for software management, engineering and software assurance are addressed herein.
Neural activity in superior parietal cortex during rule-based visual-motor transformations.
Hawkins, Kara M; Sayegh, Patricia; Yan, Xiaogang; Crawford, J Douglas; Sergio, Lauren E
2013-03-01
Cognition allows for the use of different rule-based sensorimotor strategies, but the neural underpinnings of such strategies are poorly understood. The purpose of this study was to compare neural activity in the superior parietal lobule during a standard (direct interaction) reaching task, with two nonstandard (gaze and reach spatially incongruent) reaching tasks requiring the integration of rule-based information. Specifically, these nonstandard tasks involved dissociating the planes of reach and vision or rotating visual feedback by 180°. Single unit activity, gaze, and reach trajectories were recorded from two female Macaca mulattas. In all three conditions, we observed a temporal discharge pattern at the population level reflecting early reach planning and on-line reach monitoring. In the plane-dissociated task, we found a significant overall attenuation in the discharge rate of cells from deep recording sites, relative to standard reaching. We also found that cells modulated by reach direction tended to be significantly tuned either during the standard or the plane-dissociated task but rarely during both. In the standard versus feedback reversal comparison, we observed some cells that shifted their preferred direction by 180° between conditions, reflecting maintenance of directional tuning with respect to the reach goal. Our findings suggest that the superior parietal lobule plays an important role in processing information about the nonstandard nature of a task, which, through reciprocal connections with precentral motor areas, contributes to the accurate transformation of incongruent sensory inputs into an appropriate motor output. Such processing is crucial for the integration of rule-based information into a motor act.
Digital document imaging systems: An overview and guide
NASA Technical Reports Server (NTRS)
1990-01-01
This is an aid to NASA managers in planning the selection of a Digital Document Imaging System (DDIS) as a possible solution for document information processing and storage. Intended to serve as a manager's guide, this document contains basic information on digital imaging systems, technology, equipment standards, issues of interoperability and interconnectivity, and issues related to selecting appropriate imaging equipment based upon well defined needs.
NASA Astrophysics Data System (ADS)
Sheldon, W.
2013-12-01
Managing data for a large, multidisciplinary research program such as a Long Term Ecological Research (LTER) site is a significant challenge, but it also presents unique opportunities for data stewardship. LTER research is conducted within multiple organizational frameworks (i.e. a specific LTER site as well as the broader LTER network) and addresses both specific goals defined in an NSF proposal and broader goals of the network; therefore, every LTER data set can be linked to rich contextual information to guide interpretation and comparison. The challenge is how to link the data to this wealth of contextual metadata. At the Georgia Coastal Ecosystems LTER we developed an integrated information management system (GCE-IMS) to manage, archive and distribute data, metadata and other research products, as well as to manage project logistics, administration and governance. This system allows us to store all project information in one place and to provide dynamic links through web applications and services to ensure content is always up to date on the web as well as in data set metadata. The database model supports tracking changes over time in personnel roles, projects and governance decisions, allowing these databases to serve as canonical sources of project history. Storing project information in a central database has also allowed us to standardize both the formatting and content of critical project information, including personnel names, roles, keywords, place names, attribute names, units, and instrumentation, providing consistency and improving data and metadata comparability. Lookup services for these standard terms also simplify data entry in web and database interfaces. We have also coupled the GCE-IMS to our MATLAB- and Python-based data processing tools (i.e. through database connections) to automate metadata generation and the packaging of tabular and GIS data products for distribution. Data processing history is automatically tracked throughout the data lifecycle, from initial import through quality control, revision and integration, by our data processing system (the GCE Data Toolbox for MATLAB) and included in the metadata for versioned data products. This high level of automation and system integration has proven very effective in managing the chaos and scalability of our information management program.
Todt, Oliver; Luján, José Luis
2016-06-01
To identify the various types of evidence, as well as their relative importance in European health claims regulation, in order to analyze the consequences for consumer protection of the requirements for scientific substantiation in this regulation. Qualitative analysis of various documents relevant to the regulatory process, particularly as to the implications of the standards of proof for the functional food market, as well as consumer behavior. European regulation defines a hierarchy of evidence that turns randomized controlled trials into a necessary and sufficient condition for health claim authorizations. Consumer protection can be interpreted in different manners. High standards of proof protect consumers from false information about the health outcomes of functional foods, while lower standards lead to more, albeit less accurate information about such outcomes being available to consumers.
Psychiatric and addiction consultation for patients in critical care.
Kaiser, Susan
2012-03-01
Practicing within the paradigm of compartmentalized specialty treatment without collaborative practice is ineffective for the chemical dependency and dual diagnosis population. Chemical dependency is not well understood as a disease, as evidenced by barriers cited in the 2005 Survey on Drug Use and Health. Recovery from addiction and dual diagnosis logically demands an integrated and science-based treatment approach with unified standards of care and improved educational standards for the preparation of care providers. Consultation and collaboration with addiction and psychiatric specialists are needed to establish consistency in standards for treatment and holistic care, essential for comorbidity. Continued learning and research about the complexity of the addiction process and comorbidity will provide accurate information about the harmful effects of alcoholism and drug abuse, which in turn will empower individuals to make informed choices and result in better treatment and social policies.
On the creation of a clinical gold standard corpus in Spanish: Mining adverse drug reactions.
Oronoz, Maite; Gojenola, Koldo; Pérez, Alicia; de Ilarraza, Arantza Díaz; Casillas, Arantza
2015-08-01
The advances achieved in Natural Language Processing make it possible to automatically mine information from electronically created documents. Many Natural Language Processing methods that extract information from texts make use of annotated corpora, but these are scarce in the clinical domain due to legal and ethical issues. In this paper we present the creation of the IxaMed-GS gold standard composed of real electronic health records written in Spanish and manually annotated by experts in pharmacology and pharmacovigilance. The experts mainly annotated entities related to diseases and drugs, but also relationships between entities indicating adverse drug reaction events. To help the experts in the annotation task, we adapted a general corpus linguistic analyzer to the medical domain. The quality of the annotation process in the IxaMed-GS corpus has been assessed by measuring the inter-annotator agreement, which was 90.53% for entities and 82.86% for events. In addition, the corpus has been used for the automatic extraction of adverse drug reaction events using machine learning.
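A simple way to picture the agreement measurement is pairwise percent agreement over shared annotation targets (a sketch under an assumed data layout, not the authors' exact protocol):

    def percent_agreement(ann_a, ann_b):
        # ann_a and ann_b map a text span to the label each expert assigned.
        shared = set(ann_a) & set(ann_b)
        if not shared:
            return 0.0
        hits = sum(ann_a[s] == ann_b[s] for s in shared)
        return 100.0 * hits / len(shared)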
Park, Yu Rang; Yoon, Young Jo; Koo, HaYeong; Yoo, Soyoung; Choi, Chang-Min; Beck, Sung-Ho
2018-01-01
Background: Clinical trials pose potential risks in both communications and management due to the various stakeholders involved when performing clinical trials. The academic medical center has a responsibility and obligation to conduct and manage clinical trials while maintaining a sufficiently high level of quality; it is therefore necessary to build an information technology system to support standardized clinical trial processes and comply with the relevant regulations. Objective: The objective of the study was to address the challenges identified while performing clinical trials at an academic medical center, Asan Medical Center (AMC) in Korea, by developing and utilizing a clinical trial management system (CTMS) that complies with standardized processes from multiple departments or units, controlled vocabularies, and security and privacy regulations. Methods: This study describes the methods, considerations, and recommendations for the development and utilization of the CTMS as a consolidated research database in an academic medical center. A task force was formed to define and standardize the clinical trial performance process at the site level. On the basis of the agreed standardized process, the CTMS was designed and developed as an all-in-one system complying with privacy and security regulations. Results: In this study, the processes and standard mapped vocabularies of a clinical trial were established at the academic medical center. On the basis of these processes and vocabularies, a CTMS was built which interfaces with the existing trial systems such as the electronic institutional review board, the health information system, enterprise resource planning, and the barcode system. To protect patient data, the CTMS implements data governance and access rules, and excludes 21 personal health identifiers according to the Health Insurance Portability and Accountability Act (HIPAA) privacy rule and Korean privacy laws. Since December 2014, the CTMS has been successfully implemented and used by 881 internal and external users for managing 11,645 studies and 146,943 subjects. Conclusions: The CTMS was introduced at the Asan Medical Center to manage the large amounts of data involved in clinical trial operations. Inter- and intra-unit control of data and resources can easily be conducted through the CTMS. To our knowledge, this is the first CTMS developed in-house at an academic medical center that can enhance the efficiency of clinical trial management in compliance with privacy and security laws. PMID:29691212
Envisioning Transformation in VA Mental Health Services Through Collaborative Site Visits.
Kearney, Lisa K; Schaefer, Jeanne A; Dollar, Katherine M; Iwamasa, Gayle Y; Katz, Ira; Schmitz, Theresa; Schohn, Mary; Resnick, Sandra G
2018-04-16
This column reviews the unique contributions of multiple partners in establishing a standardized site visit process to promote quality improvement in mental health care at the Veterans Health Administration. Working as a team, leaders in policy and operations, staff of research centers, and regional- and facility-level mental health leaders developed a standardized protocol for evaluating mental health services at each site and using the data to help implement policy goals. The authors discuss the challenges experienced and lessons learned in this systemwide process and how this information can be part of a framework for improving mental health services on a national level.
Festinger, David S; Dugosh, Karen L; Croft, Jason R; Arabia, Patricia L; Marlowe, Douglas B
2011-01-01
We examined the efficacy of including a research intermediary (RI) during the consent process in reducing participants' perceptions of coercion to enroll in a research study. Eighty-four drug court clients being recruited into an ongoing study were randomized to receive a standard informed consent process alone (standard condition) or with an RI (intermediary condition). Before obtaining consent, RIs met with clients individually to discuss remaining concerns. Findings provided preliminary evidence that RIs reduced client perceptions that their participation might influence how clinical and judicial staff view them. This suggests that using RIs may improve participant autonomy in clinical studies.
The Agent of extracting Internet Information with Lead Order
NASA Astrophysics Data System (ADS)
Mo, Zan; Huang, Chuliang; Liu, Aijun
In order to carry out e-commerce better, advanced technologies for accessing business information are urgently needed. An agent is described that deals with the problems of extracting Internet information caused by the non-standard and inconsistent structure of Chinese websites. The agent comprises three modules, each responsible for one stage of the extraction process. An HTTP-tree method and a Lead algorithm are proposed to generate a lead order, with which the required web pages can be retrieved easily. How to transform the extracted natural-language information into a structured form is also discussed.
Open source cardiology electronic health record development for DIGICARDIAC implementation
NASA Astrophysics Data System (ADS)
Dugarte, Nelson; Medina, Rubén.; Huiracocha, Lourdes; Rojas, Rubén.
2015-12-01
This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of structured algorithms designed under Health Level 7 (HL7) international standards. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools, and telecardiology tools. The acquisition tools manage and control the functions of the DIGICARDIAC electrocardiograph. The processing tools support HRECG signal analysis in search of patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing the time needed to access patient information. The CEHR system was developed entirely with open source software. Preliminary results of process validation showed the system's efficiency.
Integration and Interoperability: An Analysis to Identify the Attributes for System of Systems
2008-09-01
divisions of the enterprise. Examples of the current I2 are: • a nightly feed of elearning information is captured through an automated and...standardized process throughout the enterprise and • the LMS has been integrated with SkillSoft, a third party elearning software system, (http...Command (JITC) is responsible to test all programs that utilize standard interfaces to specific global nets or systems. Many times programs that
Federal COBOL Compiler Testing Service Compiler Validation Request Information.
1977-05-09
background of the Federal COBOL Compiler Testing Service which was set up by a memorandum of agreement between the National Bureau of Standards and the...Federal Standard, and the requirement of COBOL compiler validation in the procurement process. It also contains a list of all software products...produced by the software Development Division in support of the FCCTS as well as the Validation Summary Reports produced as a result of discharging the
Comparison as a Universal Learning Action
ERIC Educational Resources Information Center
Merkulova, T. V.
2016-01-01
This article explores "comparison" as a universal metasubject learning action, a key curricular element envisaged by the Russian Federal State Educational Standards. Representing the modern learner's fundamental pragmatic skill embedding such core capacities as information processing, critical thinking, robust decision-making, and…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-30
..., IPAC and MNU blocks support billing and processing enhancements. The printed ORI number is no longer necessary because SF 87 forms are converted to images and transmitted to the FBI electronically. The Public...
48 CFR 4.1300 - Scope of subpart.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ADMINISTRATIVE MATTERS Personal Identity Verification 4.1300 Scope of subpart. This subpart provides policy and procedures associated with Personal Identity Verification as required by— (a) Federal Information Processing Standards Publication (FIPS PUB) Number 201, “Personal Identity Verification of Federal Employees and...
Combining Low-Level Perception and Expectations in Conceptual Learning
2004-01-12
human representation and processing of visual information. San Francisco: W. H. Freeman, 1982. U. Neisser, Cognitive Psychology. New York: Appleton...expectation that characters are from a standard alphabet enables correct identification of characters which would otherwise be ambiguous (Neisser, 1966
NASA Astrophysics Data System (ADS)
Vandenbroucke, D.; Van Orshoven, J.; Vancauwenberghe, G.
2012-12-01
Over the last decennia, the use of Geographic Information (GI) has gained importance in the public as well as the private sector. But even though many spatial data sets and much related information exist, they are scattered over many organizations and departments. In practice it remains difficult to find the spatial data sets needed, and to access, obtain and prepare them for use in applications. Spatial Data Infrastructures (SDI) have therefore been developed to enhance the access, use and sharing of GI. SDIs consist of a set of technological and non-technological components to reach this goal. Since the nineties, many SDI initiatives have seen the light. Ultimately, all these initiatives aim to enhance the flow of spatial data between the organizations (users as well as producers) involved in intra- and inter-organizational and even cross-country business processes. However, the flow of information and its re-use in different business processes requires technical and semantic interoperability: the first should guarantee that system components can interoperate and use the data, while the second should guarantee that data content is understood by all users in the same way. GI-standards within the SDI are necessary to make this happen. However, it is not known whether this is realized in practice. The objective of the research is therefore to develop a quantitative framework to assess the impact of GI-standards on the performance of business processes. For that purpose, indicators are defined and tested in several cases throughout Europe. The proposed research builds upon previous work carried out in the SPATIALIST project, which analyzed the impact of different technological and non-technological factors on the SDI-performance of business processes (Dessers et al., 2011). The current research aims to apply quantitative performance measurement techniques, which are frequently used to measure the performance of production processes (Anupindi et al., 2005). Key to reaching the research objectives is a correct design of the test cases. The major challenges are to set up the analytical framework for analyzing the impact of GI-standards on process performance, to define the appropriate indicators, and to choose the right test cases. To do so, it is proposed to define the test cases as eight pairs of organizations. The paper will present the state of the art of performance measurement in the context of work processes, propose a series of SMART indicators for describing the set-up and measuring the performance, define the test case set-up, and suggest criteria for the selection of the test cases, i.e. the organizational pairs. References: Anupindi, R., Chopra, S., Deshmukh, S.D., Van Mieghem, J.A., & Zemel, E. (2006). Managing Business Process Flows: Principles of Operations Management. New Jersey, USA: Prentice Hall. Dessers, D., Crompvoets, J., Janssen, K., Vancauwenberghe, G., Vandenbroucke, D., & Vanhaverbeke, L. (2011). SDI at work: The Spatial Zoning Plans Case. Leuven, Belgium: Katholieke Universiteit Leuven.
López, Diego M; Blobel, Bernd; Gonzalez, Carolina
2010-01-01
Requirement analysis, design, implementation, evaluation, use, and maintenance of semantically interoperable Health Information Systems (HIS) have to be based on eHealth standards. HIS-DF is a comprehensive approach to HIS architectural development based on standard information models and vocabulary. The empirical validity of HIS-DF had not previously been demonstrated. Through an empirical experiment, the paper demonstrates that, using HIS-DF and HL7 information models, the semantic quality of an HIS architecture can be improved compared to architectures developed using the traditional RUP process. Semantic quality of the architecture was measured in terms of the model's completeness and validity metrics. The experimental results demonstrated an increase in completeness of 14.38% and an increase in validity of 16.63% when using HIS-DF and HL7 information models in a sample HIS development project. Quality assurance of the system architecture in the earlier stages of HIS development suggests an increased quality of the final HIS, and thus an indirect positive impact on patient care.
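The abstract does not define its completeness and validity metrics; one common reading treats the architecture model and the reference domain as sets of required statements. A minimal Python sketch under that assumption, with invented statement sets:

# Hypothetical sketch: the architecture model and the domain are sets of
# required statements; the statement names below are invented.
domain = {"patient", "encounter", "observation", "order", "provider"}
model = {"patient", "encounter", "observation", "billing"}

# Validity: share of model statements that also belong to the domain.
validity = len(model & domain) / len(model)
# Completeness: share of domain statements covered by the model.
completeness = len(model & domain) / len(domain)
print(f"validity={validity:.2%}, completeness={completeness:.2%}")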
Digital health technology and trauma: development of an app to standardize care.
Hsu, Jeremy M
2015-04-01
Standardized practice results in less variation, thereby reducing errors and improving outcomes. Optimal trauma care is achieved through standardization, as evidenced by the widespread adoption of the Advanced Trauma Life Support approach. The challenge for an individual institution is how to educate staff in these standardized processes and promulgate them widely and efficiently. In today's world, digital health technology must be considered in that process. The aim of this study was to describe the process of developing an app containing standardized trauma algorithms. The objective of the app was to allow easy, real-time access to trauma algorithms and therefore reduce omissions and errors. A set of trauma algorithms, relevant to the local setting, was derived from the best available evidence. After grant funding was obtained, a collaborative endeavour was undertaken with an external specialist app development company. The process required 6 months to translate the existing trauma algorithms into an app. The app contains 32 separate trauma algorithms, each formatted as a single-page flow diagram. It utilizes specific smartphone features such as 'pinch to zoom', jump-words, and pop-ups to allow rapid access to the desired information. Improvements in trauma care outcomes result from reducing variation. By incorporating digital health technology, a trauma app has been developed, allowing easy and intuitive access to evidence-based algorithms. © 2015 Royal Australasian College of Surgeons.
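The app's internal representation is not described in the paper; purely as a hypothetical sketch, each single-page algorithm could be modeled as a small decision graph whose jump-words map tap targets to nodes:

# Hypothetical data model for one single-page algorithm; node texts and
# jump-words are invented for illustration, not taken from the app.
algorithm = {
    "title": "Example chest trauma pathway",
    "nodes": {
        "start": {"text": "Primary survey complete?", "next": ["imaging"]},
        "imaging": {"text": "Chest imaging performed", "next": ["decision"]},
        "decision": {"text": "Intervention versus observation", "next": []},
    },
    # Jump-words map tap targets directly to a step in the flow diagram.
    "jump_words": {"imaging": "imaging", "observe": "decision"},
}

def walk(alg, node_id="start", depth=0):
    """Print the flow from a node, indenting one level per step."""
    node = alg["nodes"][node_id]
    print("  " * depth + node["text"])
    for nxt in node["next"]:
        walk(alg, nxt, depth + 1)

walk(algorithm)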
FDA recognition of consensus standards in the premarket notification program.
Marlowe, D E; Phillips, P J
1998-01-01
"The FDA has long advocated the use of standards as a significant contributor to safety and effectiveness of medical devices," Center for Devices and Radiological Health's (CDRH) Donald E. Marlowe and Philip J. Phillips note in the following article, highlighting the latest U.S. Food and Drug Administration (FDA) plans for use of standards. They note that the important role standards can play has been reinforced as part of FDA reengineering efforts undertaken in anticipation of an increased regulatory work-load and declining agency resources. As part of its restructuring effort, the FDA announced last spring that it would recognize some consensus standards for use in the device approval process. Under the new 510(k) paradigm--the FDA's proposal to streamline premarket review, which includes incorporating the use of standards in the review of 510(k) submissions--the FDA will accept proof of compliance with standards as evidence of device safety and effectiveness. Manufacturers may submit declarations of conformity to standards instead of following the traditional review process. The International Electrotechnical Commission (IEC) 60601 series of consensus standards, which deals with many safety issues common to electrical medical devices, was the first to be chosen for regulatory review. Other standards developed by nationally or internationally recognized standards development organizations, such as AAMI, may be eligible for use to ensure review requirements. In the following article, Marlowe and Phillips describe the FDA's plans to use standards in the device review process. The article focuses on the use of standards for medical device review, the development of the standards recognition process for reviewing devices, and the anticipated benefits of using standards to review devices. One important development has been the recent implementation of the FDA Modernization Act of 1997 (FDAMA), which advocates the use of standards in the device review process. In implementing the legislation, the FDA published in the Federal Register a list of standards to which manufacturers may declare conformity. Visit AAMI's Web site at www.aami.org/news/fda.standards for a copy of the list and for information on nominating other standards for official recognition by the agency. The FDA expects that use of standards will benefit the agency and manufacturers alike: "We estimate that in time, reliance on declarations of conformity to recognized standards could save the agency considerable resources while reducing the regulatory obstacles to entry to domestic and international markets," state the authors.
Handoffs in care--can we make them safer?
Streitenberger, Kim; Breen-Reid, Karen; Harris, Cheryl
2006-12-01
In today's complex and rapidly changing health care environments, patient harm may result if important patient information is not communicated from one health care provider to another during handoffs in care. Issues involving communication, continuity of care, and care planning are cited as a root cause in more than 80% of reported sentinel events. In light of the inherent risks associated with handoffs in care, strategies that reduce the impact of human factors on effective communication and standardize the communication process are essential to ensure appropriate communication of patient information and continuation of the plan of care throughout the process.
Kibbe, David C; McLaughlin, Curtis P
2008-01-01
Expert panels and policy analysts have often ignored potential contributions to health information technology (IT) from the Internet and Web-based applications. Perhaps they are among the "unmentionables" of health IT. Ignoring those unmentionables and relying on established industry experts has left us with a standards process that is complex and burdened by diverse goals, easy for entrenched interests to dominate, and reluctant to deal with potentially disruptive technologies. We need a health IT planning process that is more dynamic in its technological forecasting and inclusive of IT experts from outside the industry.
PHASE I MATERIALS PROPERTY DATABASE DEVELOPMENT FOR ASME CODES AND STANDARDS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju; Lin, Lianshan
2013-01-01
To support the ASME Boiler and Pressure Vessel Code (BPVC) in the modern information era, development of a web-based materials property database has been initiated under the supervision of the ASME Committee on Materials. For efficiency, the project draws heavily upon experience from development of the Gen IV Materials Handbook and the Nuclear System Materials Handbook. The effort is divided into two phases. Phase I is planned to deliver a materials data file warehouse that offers a depository for various files containing raw data and background information; Phase II will provide a relational digital database with advanced features facilitating digital data processing and management. Population of the database will start with materials property data for nuclear applications and expand to cover the entire ASME Codes and Standards, including the piping codes, as the database structure is continuously optimized. The ultimate goal of the effort is to establish a sound cyber infrastructure that supports ASME Codes and Standards development and maintenance.
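Phase II's relational design is not specified in the abstract; a minimal sketch of what a materials-property table might look like, using SQLite with hypothetical column names and an illustrative (not authoritative) data row:

import sqlite3

# Sketch of a Phase II-style table; column names and the data row are
# illustrative assumptions, not the committee's actual schema or values.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE material_property (
        material_id   TEXT,  -- e.g., an alloy designation
        property_name TEXT,  -- e.g., 'yield_strength'
        temperature_c REAL,  -- test temperature
        value         REAL,
        units         TEXT,
        source_file   TEXT   -- link back to the Phase I file warehouse
    )
""")
conn.execute(
    "INSERT INTO material_property VALUES (?, ?, ?, ?, ?, ?)",
    ("Alloy-617", "yield_strength", 650.0, 186.0, "MPa", "raw_data_001.pdf"),
)
# Query pattern a codes committee might run: all records for one property.
for row in conn.execute(
        "SELECT material_id, temperature_c, value, units FROM material_property "
        "WHERE property_name = ?", ("yield_strength",)):
    print(row)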
Engineering Lessons Learned and Systems Engineering Applications
NASA Technical Reports Server (NTRS)
Gill, Paul S.; Garcia, Danny; Vaughan, William W.
2005-01-01
Systems Engineering is fundamental to good engineering, which in turn depends on the integration and application of engineering lessons learned. Thus, good Systems Engineering also depends on systems engineering lessons learned from within the aerospace industry being documented and applied. About ten percent of the engineering lessons learned documented in the NASA Lessons Learned Information System are directly related to Systems Engineering. A key issue associated with lessons learned datasets is the communication and incorporation of this information into engineering processes. As part of the NASA Technical Standards Program activities, engineering lessons learned datasets have been identified from a number of sources. These are being searched and screened for those having a relation to Technical Standards. This paper will address some of these Systems Engineering Lessons Learned and how they are being related to Technical Standards within the NASA Technical Standards Program, including linking to the Agency's Interactive Engineering Discipline Training Courses and the life cycle for a flight vehicle development program.
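A hedged sketch of the kind of search-and-screen step described here; the lesson records and the keyword list are hypothetical:

# Sketch: screen lessons-learned records for a relation to technical
# standards; the records and the keyword list are invented.
lessons = [
    {"id": 1, "text": "Connector torque specification not followed during integration."},
    {"id": 2, "text": "Schedule slip traced to late vendor delivery."},
]
standards_terms = ("standard", "specification", "requirement")

related = [rec for rec in lessons
           if any(term in rec["text"].lower() for term in standards_terms)]
print([rec["id"] for rec in related])  # candidates to link to Technical Standards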
NASA Technical Reports Server (NTRS)
Mayor, Antoinette C.
1999-01-01
The Chemical Management Team is responsible for ensuring compliance with the OSHA Laboratory Standard. The program at Lewis Research Center (LeRC) evolved over many years to include training, developing Standard Operating Procedures (SOPs) for each laboratory process, coordinating with other safety and health organizations and teams at the Center, and issuing an SOP binder. The Chemical Hygiene Policy was first established for the Center. The Chemical Hygiene Plan was then established and reviewed by technical, laboratory, and management staff for viability and applicability to the Center. A risk assessment was conducted for each laboratory, and the laboratories were prioritized by order of risk, higher risk taking priority. A Chemical Management Team staff member interviewed the lead researcher for each laboratory process to gather the information needed to develop the SOP for the process. Each laboratory received a binder containing the Chemical Hygiene Plan, the SOPs, a map of the laboratory identifying the personal protective equipment and best egress, and glove guides, as well as other guides for safety and health. Each laboratory process has been captured in the form of an SOP. The chemicals used in each procedure have been identified, and that information is used to reduce the number of chemicals in the lab. The Chemical Hygiene Plan binder is used as a training tool for new employees. The program was designed to comply with the OSHA standard, and LeRC is in compliance. In the process, we have been able to assess the usage of chemicals in the laboratories, as well as reduce or relocate the chemicals being stored in them. Our researchers are trained on the hazards of the materials they work with and have a better understanding of the hazards of the process and what is needed to prevent any incident. From the SOP process, we have been able to reduce our chemical inventory, determine and implement better hygiene procedures and equipment in the laboratories, and provide specific training to our employees. As a result of this program, we are adding labeling to the laboratories for emergency responders and initiating a certified chemical user program.
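As a hypothetical sketch of how chemical lists gathered through the SOP process might be consolidated to spot inventory-reduction candidates (lab names and chemicals are invented):

# Sketch: aggregate chemicals across SOPs to find reduction candidates;
# lab names and chemicals are invented.
from collections import Counter

sop_chemicals = {
    "lab_101": ["acetone", "toluene"],
    "lab_102": ["acetone", "xylene"],
}
usage = Counter(c for chems in sop_chemicals.values() for c in chems)
# Chemicals used in only one lab are candidates for consolidation or removal.
singletons = [c for c, n in usage.items() if n == 1]
print("usage:", dict(usage), "| single-use candidates:", singletons)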
Compliance with minimum information guidelines in public metabolomics repositories
Spicer, Rachel A.; Salek, Reza; Steinbeck, Christoph
2017-01-01
The Metabolomics Standards Initiative (MSI) guidelines were first published in 2007. These guidelines provided reporting standards for all stages of metabolomics analysis: experimental design, biological context, chemical analysis, and data processing. Since 2012, a series of public metabolomics databases and repositories, which accept the deposition of metabolomic datasets, have arisen. In this study, the compliance of 399 public data sets, from four major metabolomics data repositories, with the biological-context MSI reporting standards was evaluated. None of the reporting standards was complied with in every publicly available study, and adherence rates varied greatly, from 0 to 97%. The plant minimum reporting standards were the most complied with, and the microbial and in vitro standards were the least. Our results indicate the need for reassessment and revision of the existing MSI reporting standards. PMID:28949328
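A minimal sketch of the compliance computation implied here; the required fields and toy records are hypothetical, while the real study evaluated 399 data sets across four repositories:

# Sketch: per-field compliance rate across deposited studies; field names
# and the toy records are invented.
required_fields = ["organism", "sample_type", "extraction_method"]
studies = [
    {"organism": "A. thaliana", "sample_type": "leaf", "extraction_method": None},
    {"organism": "E. coli", "sample_type": None, "extraction_method": None},
]

for field in required_fields:
    reported = sum(1 for s in studies if s.get(field) is not None)
    print(f"{field}: {reported / len(studies):.0%} of studies report it")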