Sample records for observation document analysis

  1. Automated document analysis system

    NASA Astrophysics Data System (ADS)

    Black, Jeffrey D.; Dietzel, Robert; Hartnett, David

    2002-08-01

    A software application has been developed to aid law enforcement and government intelligence gathering organizations in the translation and analysis of foreign language documents with potential intelligence content. The Automated Document Analysis System (ADAS) provides the capability to search (data or text mine) documents in English and the most commonly encountered foreign languages, including Arabic. Hardcopy documents are scanned by a high-speed scanner and processed with optical character recognition (OCR). Documents obtained in an electronic format bypass the OCR step and are copied directly to a working directory. For translation and analysis, the script and the language of the documents are first determined. If a document is not in English, it is machine translated into English. The documents are searched for keywords and key features in either the native language or the translated English. The user can quickly review the document to determine whether it has any intelligence content and whether detailed, verbatim human translation is required. The documents and document content are cataloged for potential future analysis. The system allows non-linguists to evaluate foreign language documents and allows for the quick analysis of a large quantity of documents. All document processing can be performed manually or automatically on a single document or a batch of documents.
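
    For illustration only, the triage pipeline described above (language identification, optional machine translation, then keyword search) might be sketched as follows. This is a minimal, hypothetical sketch: detect_language, translate_to_english and the keyword list are invented placeholders, not components of ADAS.

      # Hypothetical triage sketch; not the actual ADAS implementation.
      from dataclasses import dataclass, field

      KEYWORDS = {"weapon", "transfer", "meeting"}      # illustrative keywords only

      @dataclass
      class DocumentRecord:
          doc_id: str
          text: str
          language: str = "unknown"
          translated: str = ""
          hits: list = field(default_factory=list)

      def detect_language(text: str) -> str:
          # Placeholder: a real system would use script statistics or a trained
          # language-identification model.
          return "en" if all(ord(c) < 128 for c in text) else "other"

      def translate_to_english(text: str) -> str:
          # Placeholder for the machine-translation step.
          return text

      def triage(doc: DocumentRecord) -> DocumentRecord:
          doc.language = detect_language(doc.text)
          doc.translated = doc.text if doc.language == "en" else translate_to_english(doc.text)
          doc.hits = [kw for kw in KEYWORDS if kw in doc.translated.lower()]
          return doc

      batch = [DocumentRecord("001", "Meeting arranged for the transfer next week.")]
      for record in map(triage, batch):
          print(record.doc_id, record.language, record.hits)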

  2. Discourse analysis of an 'observation levels' nursing policy.

    PubMed

    Horsfall, J; Cleary, M

    2000-11-01

    The practice of special observation (or constant observation) is widely used in inpatient psychiatric facilities for the care of people who are suicidal. In this study, the policy of special observation was examined using a discourse analysis method to discern prevailing ideas and practices highlighted within the policy. After reading, studying and analysing the special observation nursing policy, the authors briefly describe the document and outline the terms and phrases prevalent within the document. These recurrent ideas are then organized into five categories: professional responsibilities, suicidality, the patient's immediate context, the patient's observable behaviour and the nursing checklist. In discussion of the policy document, the invisibility of the authors, target audience and patients is noted. The authors attempt to elicit evidence for the therapeutic nurse-patient relationship in the document. In the analysis of patient, nurse and doctor roles and responsibilities, it is evident that the policy document reinforces the traditional medical hierarchy of power relations. Some assumptions that underpin the document are postulated. Questions regarding the nature of risk assessment and the evidence base for the medical prescription of special observation are raised. As well as ideas and themes evident in the document, the absence of some relevant issues is explored. While the need for succinctness and clarity in policy documents is acknowledged, the fact that patient rights, therapeutic processes and ethical dilemmas are absent is deemed significant.

  3. Document reconstruction by layout analysis of snippets

    NASA Astrophysics Data System (ADS)

    Kleber, Florian; Diem, Markus; Sablatnig, Robert

    2010-02-01

    Document analysis is done to analyze entire forms (e.g. intelligent form analysis, table detection) or to describe the layout/structure of a document. Skew detection of scanned documents is also performed to support OCR algorithms that are sensitive to skew. In this paper document analysis is applied to snippets of torn documents to calculate features for their reconstruction. Documents may be destroyed deliberately to make the printed content unavailable (e.g. tax fraud investigation, business crime) or may degrade over time (e.g. ancient documents kept under bad storage conditions). Current reconstruction methods for manually torn documents deal with the shape, inpainting and texture synthesis techniques. In this paper we show how document analysis techniques applied to snippets can support the matching algorithm with additional features, namely a rotational analysis, a color analysis and a line detection. As future work it is planned to extend the feature set with the paper type (blank, checked, lined), the type of writing (handwritten vs. machine printed) and the text layout of a snippet (text size, line spacing). Preliminary results show that these pre-processing steps can be performed reliably on a real dataset consisting of 690 snippets.
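
    As an illustration of the kind of per-snippet features mentioned above (rotational and colour analysis), a minimal sketch using only NumPy is given below; the thresholds and the gradient-based orientation estimate are assumptions for demonstration, not the authors' method.

      # Illustrative snippet-feature sketch: dominant edge orientation plus
      # simple colour statistics for one snippet image (H x W x 3 array).
      import numpy as np

      def snippet_features(rgb: np.ndarray) -> dict:
          gray = rgb.mean(axis=2)                        # crude luminance proxy
          gy, gx = np.gradient(gray)
          mag = np.hypot(gx, gy)
          ang = np.degrees(np.arctan2(gy, gx)) % 180.0   # orientation in [0, 180)
          strong = mag > np.percentile(mag, 90)          # keep the strongest edges
          hist, edges = np.histogram(ang[strong], bins=36, range=(0, 180))
          return {
              "dominant_orientation_deg": float(edges[np.argmax(hist)]),
              "mean_rgb": rgb.reshape(-1, 3).mean(axis=0).tolist(),
              "std_rgb": rgb.reshape(-1, 3).std(axis=0).tolist(),
          }

      demo = np.random.rand(64, 64, 3) * 255.0           # stand-in for a scanned snippet
      print(snippet_features(demo))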

  4. 10 CFR 830.204 - Documented safety analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Documented safety analysis. 830.204 Section 830.204 Energy DEPARTMENT OF ENERGY NUCLEAR SAFETY MANAGEMENT Safety Basis Requirements § 830.204 Documented safety analysis... approval from DOE for the methodology used to prepare the documented safety analysis for the facility...

  5. Document analysis with neural net circuits

    NASA Technical Reports Server (NTRS)

    Graf, Hans Peter

    1994-01-01

    Document analysis is one of the main applications of machine vision today and offers great opportunities for neural net circuits. Despite more and more data processing with computers, the number of paper documents is still increasing rapidly. A fast translation of data from paper into electronic format is needed almost everywhere, and when done manually, this is a time consuming process. Markets range from small scanners for personal use to high-volume document analysis systems, such as address readers for the postal service or check processing systems for banks. A major concern with present systems is the accuracy of the automatic interpretation. Today's algorithms fail miserably when noise is present, when print quality is poor, or when the layout is complex. A common approach to circumvent these problems is to restrict the variations of the documents handled by a system. In our laboratory, we had the best luck with circuits implementing basic functions, such as convolutions, that can be used in many different algorithms. To illustrate the flexibility of this approach, three applications of the NET32K circuit are described in this short viewgraph presentation: locating address blocks, cleaning document images by removing noise, and locating areas of interest in personal checks to improve image compression. Several of the ideas realized in this circuit that were inspired by neural nets, such as analog computation with a low resolution, resulted in a chip that is well suited for real-world document analysis applications and that compares favorably with alternative, 'conventional' circuits.

  6. Cultural diversity: blind spot in medical curriculum documents, a document analysis.

    PubMed

    Paternotte, Emma; Fokkema, Joanne P I; van Loon, Karsten A; van Dulmen, Sandra; Scheele, Fedde

    2014-08-22

    Cultural diversity among patients presents specific challenges to physicians. Therefore, cultural diversity training is needed in medical education. In cases where strategic curriculum documents form the basis of medical training it is expected that the topic of cultural diversity is included in these documents, especially if these have been recently updated. The aim of this study was to assess the current formal status of cultural diversity training in the Netherlands, which is a multi-ethnic country with recently updated medical curriculum documents. In February and March 2013, a document analysis was performed of strategic curriculum documents for undergraduate and postgraduate medical education in the Netherlands. All text phrases that referred to cultural diversity were extracted from these documents. Subsequently, these phrases were sorted into objectives, training methods or evaluation tools to assess how they contributed to adequate curriculum design. Of a total of 52 documents, 33 documents contained phrases with information about cultural diversity training. Cultural diversity aspects were more prominently described in the curriculum documents for undergraduate education than in those for postgraduate education. The most specific information about cultural diversity was found in the blueprint for undergraduate medical education. In the postgraduate curriculum documents, attention to cultural diversity differed among specialties and was mainly superficial. Cultural diversity is an underrepresented topic in the Dutch documents that form the basis for actual medical training, although the documents have been updated recently. Attention to the topic is thus not guaranteed. This situation does not fit the demand of a multi-ethnic society for doctors with cultural diversity competences. Multi-ethnic countries should be critical of the content of the bases for their medical educational curricula.
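
    To make the sorting step concrete, a hedged sketch is shown below: extracted phrases are assigned to objectives, training methods or evaluation tools by simple keyword rules. The cue words and example phrases are invented for demonstration and are not taken from the study.

      # Illustrative keyword-rule classifier for curriculum phrases.
      RULES = {
          "objective": ("students will", "competence", "is able to"),
          "training method": ("lecture", "workshop", "clinical rotation"),
          "evaluation tool": ("assessment", "exam", "portfolio", "feedback"),
      }

      def classify_phrase(phrase: str) -> str:
          lowered = phrase.lower()
          for label, cues in RULES.items():
              if any(cue in lowered for cue in cues):
                  return label
          return "unclassified"

      phrases = [
          "Students will communicate effectively with patients from diverse backgrounds.",
          "A workshop on intercultural communication is offered in year two.",
          "Portfolio assessment of diversity-related consultations.",
      ]
      for p in phrases:
          print(classify_phrase(p), "|", p)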

  7. Cultural diversity: blind spot in medical curriculum documents, a document analysis

    PubMed Central

    2014-01-01

    Background Cultural diversity among patients presents specific challenges to physicians. Therefore, cultural diversity training is needed in medical education. In cases where strategic curriculum documents form the basis of medical training it is expected that the topic of cultural diversity is included in these documents, especially if these have been recently updated. The aim of this study was to assess the current formal status of cultural diversity training in the Netherlands, which is a multi-ethnic country with recently updated medical curriculum documents. Methods In February and March 2013, a document analysis was performed of strategic curriculum documents for undergraduate and postgraduate medical education in the Netherlands. All text phrases that referred to cultural diversity were extracted from these documents. Subsequently, these phrases were sorted into objectives, training methods or evaluation tools to assess how they contributed to adequate curriculum design. Results Of a total of 52 documents, 33 documents contained phrases with information about cultural diversity training. Cultural diversity aspects were more prominently described in the curriculum documents for undergraduate education than in those for postgraduate education. The most specific information about cultural diversity was found in the blueprint for undergraduate medical education. In the postgraduate curriculum documents, attention to cultural diversity differed among specialties and was mainly superficial. Conclusions Cultural diversity is an underrepresented topic in the Dutch documents that form the basis for actual medical training, although the documents have been updated recently. Attention to the topic is thus not guaranteed. This situation does not fit the demand of a multi-ethnic society for doctors with cultural diversity competences. Multi-ethnic countries should be critical of the content of the bases for their medical educational curricula. PMID:25150546

  8. Graph-based layout analysis for PDF documents

    NASA Astrophysics Data System (ADS)

    Xu, Canhui; Tang, Zhi; Tao, Xin; Li, Yun; Shi, Cao

    2013-03-01

    To increase the flexibility and enrich the reading experience of e-books on small portable screens, a graph-based method is proposed to perform layout analysis on Portable Document Format (PDF) documents. Digitally born documents have inherent advantages, such as representing texts and fractional images in explicit form, which can be straightforwardly exploited. To integrate traditional image-based document analysis with the inherent metadata provided by a PDF parser, the page primitives including text, image and path elements are processed to produce text and non-text layers for separate analysis. The graph-based method is developed at the superpixel representation level, and page text elements corresponding to vertices are used to construct an undirected graph. Euclidean distance between adjacent vertices is applied in a top-down manner to cut the graph tree formed by Kruskal's algorithm, and edge orientation is then used in a bottom-up manner to extract text lines from each subtree. Non-textual objects are segmented by connected component analysis. For each segmented text and non-text composite, a 13-dimensional feature vector is extracted for labelling purposes. Experimental results on selected pages from PDF books are presented.
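
    A simplified sketch of the grouping idea is given below: a minimum spanning tree is built over text-element centres with Kruskal's algorithm, long edges are cut (top-down), and roughly horizontal edges are kept to form text lines (bottom-up). The thresholds and toy coordinates are assumptions for illustration; this is not the paper's implementation.

      # Kruskal MST over text-element centres, then distance and orientation cuts.
      import math
      from itertools import combinations

      def kruskal_mst(points):
          parent = list(range(len(points)))
          def find(i):
              while parent[i] != i:
                  parent[i] = parent[parent[i]]
                  i = parent[i]
              return i
          edges = sorted(
              (math.dist(points[i], points[j]), i, j)
              for i, j in combinations(range(len(points)), 2)
          )
          mst = []
          for w, i, j in edges:
              ri, rj = find(i), find(j)
              if ri != rj:
                  parent[ri] = rj
                  mst.append((w, i, j))
          return mst

      def text_lines(points, max_dist=35.0, max_slope_deg=15.0):
          keep = []
          for w, i, j in kruskal_mst(points):
              dx = abs(points[i][0] - points[j][0])
              dy = abs(points[i][1] - points[j][1])
              if w <= max_dist and math.degrees(math.atan2(dy, dx)) <= max_slope_deg:
                  keep.append((i, j))                    # short, roughly horizontal edge
          groups = {i: {i} for i in range(len(points))}
          for i, j in keep:                              # merge endpoints into lines
              merged = groups[i] | groups[j]
              for k in merged:
                  groups[k] = merged
          return {frozenset(g) for g in groups.values()}

      centres = [(10, 100), (40, 101), (70, 99), (15, 160), (45, 162)]
      print(text_lines(centres))                         # two candidate text lines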

  9. 10 CFR 830.206 - Preliminary documented safety analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Preliminary documented safety analysis. 830.206 Section 830.206 Energy DEPARTMENT OF ENERGY NUCLEAR SAFETY MANAGEMENT Safety Basis Requirements § 830.206 Preliminary documented safety analysis. If construction begins after December 11, 2000, the contractor...

  10. USA National Phenology Network observational data documentation

    USGS Publications Warehouse

    Rosemartin, Alyssa H.; Denny, Ellen G.; Gerst, Katharine L.; Marsh, R. Lee; Posthumus, Erin E.; Crimmins, Theresa M.; Weltzin, Jake F.

    2018-04-25

    The goals of the USA National Phenology Network (USA-NPN, www.usanpn.org) are to advance science, inform decisions, and communicate and connect with the public regarding phenology and species’ responses to environmental variation and climate change. The USA-NPN seeks to advance the science of phenology and facilitate ecosystem stewardship by providing phenological information freely and openly. To accomplish these goals, the USA-NPN National Coordinating Office (NCO) delivers observational data on plant and animal phenology in several formats, including minimally processed status and intensity datasets and derived phenometrics for individual plants, sites, and regions. This document describes the suite of observational data products delivered by the USA National Phenology Network, covering the period 2009–present for the United States and accessible via the Phenology Observation Portal (http://dx.doi.org/10.5066/F78S4N1V) and via an Application Programming Interface. The data described here have been used in diverse research and management applications, including over 30 publications in fields such as remote sensing, plant evolution, and resource management.

  11. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

    This document provided the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, "Canister Storage Building Final Safety Analysis Report". All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  12. A Conceptual Model for Multidimensional Analysis of Documents

    NASA Astrophysics Data System (ADS)

    Ravat, Franck; Teste, Olivier; Tournier, Ronan; Zurfluh, Gilles

    Data warehousing and OLAP are mainly used for the analysis of transactional data. Nowadays, with the evolution of the Internet and the development of semi-structured data exchange formats (such as XML), it is possible to consider entire fragments of data, such as documents, as analysis sources. As a consequence, an adapted multidimensional analysis framework needs to be provided. In this paper, we introduce an OLAP multidimensional conceptual model without facts. This model is based on the unique concept of dimensions and is adapted for multidimensional document analysis. We also provide a set of manipulation operations.
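
    As a hedged illustration of what a fact-less (dimension-only) analysis can look like, the toy sketch below crosses two document dimensions and uses a simple document count as the implicit measure; the schema and sample data are invented and do not reproduce the authors' formal model.

      # Toy dimension-only analysis: count documents by Keyword x Time.
      from collections import Counter
      from dataclasses import dataclass

      @dataclass(frozen=True)
      class Dimension:
          name: str
          levels: tuple            # ordered hierarchy levels, coarse to fine

      documents = [
          {"keyword": "OLAP", "year": 2007, "section": "Introduction"},
          {"keyword": "XML", "year": 2007, "section": "Related Work"},
          {"keyword": "OLAP", "year": 2008, "section": "Model"},
      ]

      keyword_dim = Dimension("Keyword", ("keyword",))
      time_dim = Dimension("Time", ("year",))

      def analyse(docs, dims):
          # Cross the finest level of each chosen dimension; the "measure" is a count.
          key = lambda d: tuple(d[dim.levels[-1]] for dim in dims)
          return Counter(key(d) for d in docs)

      print(analyse(documents, [keyword_dim, time_dim]))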

  13. Planning, Conducting, and Documenting Data Analysis for Program Improvement

    ERIC Educational Resources Information Center

    Winer, Abby; Taylor, Cornelia; Derrington, Taletha; Lucas, Anne

    2015-01-01

    This 2015 document was developed to help technical assistance (TA) providers and state staff define and limit the scope of data analysis for program improvement efforts, including the State Systemic Improvement Plan (SSIP); develop a plan for data analysis; document alternative hypotheses and additional analyses as they are generated; and…

  14. Advective transport observations with MODPATH-OBS--documentation of the MODPATH observation process

    USGS Publications Warehouse

    Hanson, R.T.; Kauffman, L.K.; Hill, M.C.; Dickinson, J.E.; Mehl, S.W.

    2013-01-01

    Though the program name MODPATH-OBS specifically refers to observations, the program also can be used to calculate model predictions of observations. MODPATH-OBS is primarily intended for use with separate programs that conduct sensitivity analysis, data needs assessment, parameter estimation, and uncertainty analysis, such as UCODE_2005 and PEST. In many circumstances, refined grids in selected parts of a model are important to simulated hydraulics, detailed inflows and outflows, or other system characteristics. MODFLOW-LGR and MODPATH-LGR support accurate local grid refinement in which both mass (flows) and energy (head) are conserved across the local grid boundary. MODPATH-OBS is designed to take advantage of these capabilities. For example, particles tracked between a pumping well and a nearby stream, which are simulated poorly if the river and well are located in a single large grid cell, can be simulated with improved accuracy using a locally refined grid in MODFLOW-LGR, MODPATH-LGR, and MODPATH-OBS. The locally-refined-grid approach can provide more accurate simulated equivalents to observed transport between the well and the river. The documentation presented here includes a brief discussion of previous work, a description of the methods, and detailed descriptions of the required input files and how the output files are typically used.

  15. Page layout analysis and classification for complex scanned documents

    NASA Astrophysics Data System (ADS)

    Erkilinc, M. Sezer; Jaber, Mustafa; Saber, Eli; Bauer, Peter; Depalov, Dejan

    2011-09-01

    A framework for region/zone classification in color and gray-scale scanned documents is proposed in this paper. The algorithm includes modules for extracting text, photo, and strong edge/line regions. First, a text detection module based on wavelet analysis and a Run Length Encoding (RLE) technique is employed. Local and global energy maps in the high-frequency bands of the wavelet domain are generated and used as initial text maps, and further analysis using RLE yields a final text map. The second module detects image/photo and pictorial regions in the input document. A block-based classifier using basis vector projections is employed to identify photo candidate regions. A final photo map is then obtained by applying a probabilistic model based on Markov random field (MRF) maximum a posteriori (MAP) optimization with iterated conditional modes (ICM). The final module detects lines and strong edges using the Hough transform and edge-linkage analysis, respectively. The text, photo, and strong edge/line maps are combined to generate a page layout classification of the scanned target document. Experimental results and objective evaluation show that the proposed technique performs very effectively on a variety of simple and complex scanned document types obtained from the MediaTeam Oulu document database. The proposed page layout classifier can be used in systems for efficient document storage, content-based document retrieval, optical character recognition, mobile phone imagery, and augmented reality.
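
    For illustration, a greatly simplified stand-in for the text-detection module is sketched below: local high-frequency energy (approximated here by block variance rather than wavelet coefficients) is thresholded and then smoothed with a run-length pass. All parameters are assumptions; this is not the authors' implementation.

      # Block-variance "energy" map plus horizontal run-length smoothing.
      import numpy as np

      def local_energy(gray: np.ndarray, k: int = 8) -> np.ndarray:
          h, w = gray.shape
          hh, ww = h // k, w // k
          blocks = gray[: hh * k, : ww * k].reshape(hh, k, ww, k)
          return blocks.std(axis=(1, 3))             # busy blocks are likely text

      def run_length_smooth(mask: np.ndarray, max_gap: int = 2) -> np.ndarray:
          out = mask.copy()
          for row in out:
              gap = 0
              for j, v in enumerate(row):
                  if v:
                      if 0 < gap <= max_gap:
                          row[j - gap:j] = True      # bridge short horizontal gaps
                      gap = 0
                  else:
                      gap += 1
          return out

      gray = np.random.rand(64, 64) * 255.0          # stand-in for a scanned page
      text_map = run_length_smooth(local_energy(gray) > 40.0)
      print(text_map.astype(int))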

  16. Artificial neural networks for document analysis and recognition.

    PubMed

    Marinai, Simone; Gori, Marco; Soda, Giovanni; Society, Computer

    2005-01-01

    Artificial neural networks have been extensively applied to document analysis and recognition. Most efforts have been devoted to the recognition of isolated handwritten and printed characters, with widely recognized successful results. However, many other document processing tasks, like preprocessing, layout analysis, character segmentation, word recognition, and signature verification, have also been effectively tackled, with very promising results. This paper surveys the most significant problems in the area of offline document image processing where connectionist-based approaches have been applied. Similarities and differences between approaches belonging to different categories are discussed. Particular emphasis is given to the crucial role of prior knowledge in the conception of both appropriate architectures and learning algorithms. Finally, the paper provides a critical analysis of the reviewed approaches and depicts the most promising research guidelines in the field. In particular, a second generation of connectionist-based models is foreseen, based on appropriate graphical representations of the learning environment.

  17. Method of determining the necessary number of observations for video stream documents recognition

    NASA Astrophysics Data System (ADS)

    Arlazarov, Vladimir V.; Bulatov, Konstantin; Manzhikov, Temudzhin; Slavin, Oleg; Janiszewski, Igor

    2018-04-01

    This paper discusses the task of document recognition on a sequence of video frames. In order to optimize processing speed, the stability of recognition results obtained from several video frames is estimated. Considering identity document (Russian internal passport) recognition on a mobile device, it is shown that a significant decrease in the number of observations necessary for obtaining a precise recognition result is possible.
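
    A hedged sketch of one possible stopping rule is given below: per-frame recognition results for a document field are combined by majority vote, and observation stops once the vote has been stable for a chosen number of consecutive frames. The rule and the sample values are illustrative assumptions, not the paper's exact criterion.

      # Stop observing a video stream once the integrated result is stable.
      from collections import Counter

      def frames_needed(per_frame_results, patience=3):
          history, stable, current = [], 0, None
          for n, value in enumerate(per_frame_results, start=1):
              history.append(value)
              best = Counter(history).most_common(1)[0][0]   # majority vote so far
              stable = stable + 1 if best == current else 1
              current = best
              if stable >= patience:
                  return n, current            # frames used, accepted field value
          return len(history), current

      stream = ["4510 123456", "4510 128456", "4510 123456", "4510 123456", "4510 123456"]
      print(frames_needed(stream))             # stops before using every frame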

  18. Analysis of line structure in handwritten documents using the Hough transform

    NASA Astrophysics Data System (ADS)

    Ball, Gregory R.; Kasiviswanathan, Harish; Srihari, Sargur N.; Narayanan, Aswin

    2010-01-01

    In the analysis of handwriting in documents a central task is that of determining line structure of the text, e.g., number of text lines, location of their starting and end-points, line-width, etc. While simple methods can handle ideal images, real world documents have complexities such as overlapping line structure, variable line spacing, line skew, document skew, noisy or degraded images etc. This paper explores the application of the Hough transform method to handwritten documents with the goal of automatically determining global document line structure in a top-down manner which can then be used in conjunction with a bottom-up method such as connected component analysis. The performance is significantly better than other top-down methods, such as the projection profile method. In addition, we evaluate the performance of skew analysis by the Hough transform on handwritten documents.
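
    A minimal Hough-transform sketch is shown below for illustration: ink pixels vote in a (rho, theta) accumulator and the strongest peak gives a candidate text line, whose angle also indicates skew. The discretization is an assumption for demonstration and this is not the paper's system.

      # Tiny Hough accumulator over the ink pixels of a binary image.
      import numpy as np

      def hough_peak(binary: np.ndarray, n_theta: int = 180):
          ys, xs = np.nonzero(binary)
          thetas = np.deg2rad(np.arange(n_theta) - 90)        # -90 .. 89 degrees
          diag = int(np.ceil(np.hypot(*binary.shape)))
          acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
          for x, y in zip(xs, ys):
              rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
              acc[rhos, np.arange(n_theta)] += 1              # one vote per angle
          r, t = np.unravel_index(acc.argmax(), acc.shape)
          return r - diag, float(np.rad2deg(thetas[t]))       # (rho, theta in degrees)

      img = np.zeros((100, 100), dtype=bool)
      img[40, 10:90] = True                                   # one horizontal text line
      print(hough_peak(img))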

  19. 43 CFR 46.135 - Incorporation of referenced documents into NEPA analysis.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... the analysis at hand. (b) Citations of specific information or analysis from other source documents... NEPA analysis. 46.135 Section 46.135 Public Lands: Interior Office of the Secretary of the Interior... Quality § 46.135 Incorporation of referenced documents into NEPA analysis. (a) The Responsible Official...

  20. 43 CFR 46.135 - Incorporation of referenced documents into NEPA analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the analysis at hand. (b) Citations of specific information or analysis from other source documents... NEPA analysis. 46.135 Section 46.135 Public Lands: Interior Office of the Secretary of the Interior... Quality § 46.135 Incorporation of referenced documents into NEPA analysis. (a) The Responsible Official...

  21. 43 CFR 46.135 - Incorporation of referenced documents into NEPA analysis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... the analysis at hand. (b) Citations of specific information or analysis from other source documents... NEPA analysis. 46.135 Section 46.135 Public Lands: Interior Office of the Secretary of the Interior... Quality § 46.135 Incorporation of referenced documents into NEPA analysis. (a) The Responsible Official...

  22. 43 CFR 46.135 - Incorporation of referenced documents into NEPA analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the analysis at hand. (b) Citations of specific information or analysis from other source documents... NEPA analysis. 46.135 Section 46.135 Public Lands: Interior Office of the Secretary of the Interior... Quality § 46.135 Incorporation of referenced documents into NEPA analysis. (a) The Responsible Official...

  23. 43 CFR 46.135 - Incorporation of referenced documents into NEPA analysis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... the analysis at hand. (b) Citations of specific information or analysis from other source documents... NEPA analysis. 46.135 Section 46.135 Public Lands: Interior Office of the Secretary of the Interior... Quality § 46.135 Incorporation of referenced documents into NEPA analysis. (a) The Responsible Official...

  24. A Qualitative Analysis Evaluating The Purposes And Practices Of Clinical Documentation

    PubMed Central

    Ho, Y.-X.; Gadd, C. S.; Kohorst, K.L.; Rosenbloom, S.T.

    2014-01-01

    Summary Objectives An important challenge for biomedical informatics researchers is determining the best approach for healthcare providers to use when generating clinical notes in settings where electronic health record (EHR) systems are used. The goal of this qualitative study was to explore healthcare providers’ and administrators’ perceptions about the purpose of clinical documentation and their own documentation practices. Methods We conducted seven focus groups with a total of 46 subjects composed of healthcare providers and administrators to collect knowledge, perceptions and beliefs about documentation from those who generate and review notes, respectively. Data were analyzed using inductive analysis to probe and classify impressions collected from focus group subjects. Results We observed that both healthcare providers and administrators believe that documentation serves five primary domains: clinical, administrative, legal, research, education. These purposes are tied closely to the nature of the clinical note as a document shared by multiple stakeholders, which can be a source of tension for all parties who must use the note. Most providers reported using a combination of methods to complete their notes in a timely fashion without compromising patient care. While all administrators reported relying on computer-based documentation tools to review notes, they expressed a desire for a more efficient method of extracting relevant data. Conclusions Although clinical documentation has utility, and is valued highly by its users, the development and successful adoption of a clinical documentation tool largely depends on its ability to be smoothly integrated into the provider’s busy workflow, while allowing the provider to generate a note that communicates effectively and efficiently with multiple stakeholders. PMID:24734130

  25. Analysis of a document/reporting system

    NASA Technical Reports Server (NTRS)

    Narrow, B.

    1971-01-01

    An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. It is believed that this is the first documented study which utilizes quantitative measures for full-scale system analysis. The quantitative measures and techniques for collecting and qualifying the basic data, as described, are applicable to any information system. Therefore this report is considered to be of interest to any persons concerned with the management, design, analysis or evaluation of information systems.

  26. Content Recognition and Context Modeling for Document Analysis and Retrieval

    ERIC Educational Resources Information Center

    Zhu, Guangyu

    2009-01-01

    The nature and scope of available documents are changing significantly in many areas of document analysis and retrieval as complex, heterogeneous collections become accessible to virtually everyone via the web. The increasing level of diversity presents a great challenge for document image content categorization, indexing, and retrieval.…

  27. The use of fingerprints available on the web in false identity documents: Analysis from a forensic intelligence perspective.

    PubMed

    Girelli, Carlos Magno Alves

    2016-05-01

    Fingerprints present in false identity documents were found on the web. In some cases, laterally reversed (mirrored) images of the same fingerprint were observed in different documents. In the present work, 100 fingerprint images downloaded from the web, as well as their reversals obtained by image editing, were compared among themselves and against the database of the Brazilian Federal Police AFIS, in order to better understand trends in this kind of forgery in Brazil. Several image-editing effects were observed in the analyzed fingerprints: addition of artifacts (such as watermarks), image rotation, image stylization, lateral reversal and tonal reversal. A discussion of the detection of lateral reversals is presented in this article, as well as suggestions to reduce errors due to missed HIT decisions between reversed fingerprints. The present work aims to highlight the importance of fingerprint analysis when performing document examination, especially when only copies of documents are available, something very common in Brazil. Besides the intrinsic features of the fingermarks considered in three levels of detail by the ACE-V methodology, some visual features of the fingerprint images can be helpful to identify sources of forgeries and modus operandi, such as: limits and image contours, gaps in the friction ridges caused by excess or lack of inking, and the presence of watermarks and artifacts arising from the background. Based on the agreement of such features in fingerprints present in different identity documents, and also on the analysis of the time and location where the documents were seized, it is possible to highlight potential links between apparently unconnected crimes. Fingerprints therefore have the potential to reduce linkage blindness, and the present work suggests the analysis of fingerprints when profiling false identity documents, as well as the inclusion of fingerprint features in the profile of the documents. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
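
    To illustrate the lateral-reversal check on copies, the hedged sketch below compares a query image against a candidate and its mirrored version using plain normalized correlation on equally sized grayscale arrays; real AFIS searching works on minutiae rather than raw pixels, and the threshold is an assumption.

      # Flag probable mirrored duplicates among fingerprint images.
      import numpy as np

      def ncc(a: np.ndarray, b: np.ndarray) -> float:
          a = (a - a.mean()) / (a.std() + 1e-9)
          b = (b - b.mean()) / (b.std() + 1e-9)
          return float((a * b).mean())

      def mirrored_match(query: np.ndarray, candidate: np.ndarray, thresh: float = 0.9):
          direct = ncc(query, candidate)
          mirrored = ncc(query, np.fliplr(candidate))         # lateral reversal
          if mirrored > max(direct, thresh):
              return "likely lateral reversal", mirrored
          if direct > thresh:
              return "likely same orientation", direct
          return "no match", max(direct, mirrored)

      rng = np.random.default_rng(0)
      print_a = rng.random((64, 64))
      print(mirrored_match(print_a, np.fliplr(print_a)))      # detected as a reversal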

  28. Spectrum analysis on quality requirements consideration in software design documents.

    PubMed

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
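
    A hedged, much-simplified sketch of the idea is given below: a document's "spectrum" is approximated by normalized counts of quality-attribute cue words, and the requirements and design spectra are compared with cosine similarity. The attributes, cue words and sample sentences are invented for illustration and do not reproduce the paper's technique.

      # Keyword-based quality-requirement spectrum and a cosine comparison.
      import math
      import re

      CUES = {
          "security": ("encrypt", "authentication", "access control"),
          "performance": ("response time", "throughput", "latency"),
          "usability": ("usable", "learnability", "user interface"),
      }

      def spectrum(text: str) -> dict:
          lowered = text.lower()
          counts = {q: sum(len(re.findall(c, lowered)) for c in cues)
                    for q, cues in CUES.items()}
          total = sum(counts.values()) or 1
          return {q: n / total for q, n in counts.items()}

      def cosine(s1: dict, s2: dict) -> float:
          dot = sum(s1[q] * s2[q] for q in CUES)
          n1 = math.sqrt(sum(v * v for v in s1.values()))
          n2 = math.sqrt(sum(v * v for v in s2.values()))
          return dot / (n1 * n2) if n1 and n2 else 0.0

      requirements = "Login requires authentication; response time under 2 seconds."
      design = "The design specifies an authentication service and encrypts all traffic."
      print(cosine(spectrum(requirements), spectrum(design)))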

  29. Full-scale system impact analysis: Digital document storage project

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Digital Document Storage Full Scale System can provide cost effective electronic document storage, retrieval, hard copy reproduction, and remote access for users of NASA Technical Reports. The desired functionality of the DDS system is highly dependent on the assumed requirements for remote access used in this Impact Analysis. It is highly recommended that NASA proceed with a phased, communications requirement analysis to ensure that adequate communications service can be supplied at a reasonable cost in order to validate recent working assumptions upon which the success of the DDS Full Scale System is dependent.

  30. Policies on Protecting Vulnerable People During Disasters in Iran: A Document Analysis

    PubMed Central

    Abbasi Dolatabadi, Zahra; Seyedin, Hesam; Aryankhesal, Aidin

    2016-01-01

    Context Developing official protection policies for disasters is a main strategy in protecting vulnerable people. The aim of this study was to analyze official documents concerning policies on protecting vulnerable people during disasters. Evidence Acquisition This study was conducted by the qualitative document analysis method. Documents were gathered by searching websites and referring to the organizations involved in disaster management. The documents were assessed by a researcher-made data collection form. A directed content analysis approach was used to analyze the retrieved documents regarding the protection policies and legislation for vulnerable people. Results A total of 22 documents were included in the final analysis. Most of the documents referred to women, children, elderly people, the poor, and villagers as vulnerable people. Moreover, the documents did not provide information regarding official measures for protecting vulnerable people during different phases of disaster management. Conclusions A clear and comprehensive definition of "vulnerable people" needs to be established, and official policies to protect them need to be formulated. Given the high prevalence of disasters in Iran, policy makers need to develop effective context-based policies to protect vulnerable people during disasters. PMID:27921019

  31. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3). Volume 3, Part 4.

    DTIC Science & Technology

    1983-09-01

    General Electromagnetic Model for the Analysis of Complex Systems (GEMACS), Computer Code Documentation, Version 3. Final Technical Report, February 1981 - July 1983, prepared by the BDM Corporation. Only a fragment of the method description is recoverable from the scanned report form: the electric field at a segment observation point due to a source patch j is given in terms of the t1 and t2 directions on the source patch.

  32. Policy Document on Earth Observation for Urban Planning and Management: State of the Art and Recommendations for Application of Earth Observation in Urban Planning

    NASA Technical Reports Server (NTRS)

    Nichol, Janet; King, Bruce; Xiaoli, Ding; Dowman, Ian; Quattrochi, Dale; Ehlers, Manfred

    2007-01-01

    A policy document on earth observation for urban planning and management resulting from a workshop held in Hong Kong in November 2006 is presented. The aim of the workshop was to provide a forum for researchers and scientists specializing in earth observation to interact with practitioners working in different aspects of city planning, in a complex and dynamic city, Hong Kong. A summary of the current state of the art, limitations, and recommendations for the use of earth observation in urban areas is presented here as a policy document.

  33. A ward-based time study of paper and electronic documentation for recording vital sign observations.

    PubMed

    Wong, David; Bonnici, Timothy; Knight, Julia; Gerry, Stephen; Turton, James; Watkinson, Peter

    2017-07-01

    To investigate time differences in recording observations and an early warning score using traditional paper charts and a novel e-Obs system in clinical practice. Researchers observed the process of recording observations and early warning scores across 3 wards in 2 university teaching hospitals immediately before and after introduction of the e-Obs system. The process of recording observations included both measurement and documentation of vital signs. Interruptions were timed and subtracted from the measured process duration. Multilevel modeling was used to compensate for potential confounding factors. In all, 577 nurse events were observed (281 paper, 296 e-Obs). The geometric mean time to take a complete set of vital signs was 215 s (95% confidence interval [CI], 177 s-262 s) on paper, and 150 s (95% CI, 130 s-172 s) electronically. The treatment effect ratio was 0.70 (95% CI, 0.57-0.85, P  < .001). The treatment effect ratio in ward 1 was 0.37 (95% CI, 0.26-0.53), in ward 2 was 0.98 (95% CI, 0.70-1.38), and in ward 3 was 0.93 (95% CI, 0.66-1.33). Introduction of an e-Obs system was associated with a statistically significant reduction in overall time to measure and document vital signs electronically compared to paper documentation. The reductions in time varied among wards and were of clinical significance on only 1 of 3 wards studied. Our results suggest that introduction of an e-Obs system could lower nursing workload as well as increase documentation quality. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
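
    As a worked illustration of the summary statistic reported above, the sketch below computes geometric mean times on the log scale and their ratio; the timings are invented example data, and the multilevel modelling and confidence intervals used in the study are omitted.

      # Geometric means of timing data and the electronic/paper ratio.
      import math
      import statistics

      def geometric_mean(seconds):
          return math.exp(statistics.mean(math.log(s) for s in seconds))

      paper_times = [180, 240, 200, 260, 210]          # seconds per observation set (invented)
      electronic_times = [140, 160, 150, 170, 130]

      gm_paper = geometric_mean(paper_times)
      gm_electronic = geometric_mean(electronic_times)
      ratio = gm_electronic / gm_paper                 # < 1 means electronic is faster

      print(f"paper {gm_paper:.0f} s, electronic {gm_electronic:.0f} s, ratio {ratio:.2f}")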

  34. Innovations in the Analysis of Chandra-ACIS Observations

    NASA Astrophysics Data System (ADS)

    Broos, Patrick S.; Townsley, Leisa K.; Feigelson, Eric D.; Getman, Konstantin V.; Bauer, Franz E.; Garmire, Gordon P.

    2010-05-01

    As members of the instrument team for the Advanced CCD Imaging Spectrometer (ACIS) on NASA's Chandra X-ray Observatory and as Chandra General Observers, we have developed a wide variety of data analysis methods that we believe are useful to the Chandra community, and have constructed a significant body of publicly available software (the ACIS Extract package) addressing important ACIS data and science analysis tasks. This paper seeks to describe these data analysis methods for two purposes: to document the data analysis work performed in our own science projects and to help other ACIS observers judge whether these methods may be useful in their own projects (regardless of what tools and procedures they choose to implement those methods). The ACIS data analysis recommendations we offer here address much of the workflow in a typical ACIS project, including data preparation, point source detection via both wavelet decomposition and image reconstruction, masking point sources, identification of diffuse structures, event extraction for both point and diffuse sources, merging extractions from multiple observations, nonparametric broadband photometry, analysis of low-count spectra, and automation of these tasks. Many of the innovations presented here arise from several, often interwoven, complications that are found in many Chandra projects: large numbers of point sources (hundreds to several thousand), faint point sources, misaligned multiple observations of an astronomical field, point source crowding, and scientifically relevant diffuse emission.

  35. Documentation and Validation of the Goddard Earth Observing System (GEOS) Data Assimilation System, Version 4

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); daSilva, Arlindo; Dee, Dick; Bloom, Stephen; Bosilovich, Michael; Pawson, Steven; Schubert, Siegfried; Wu, Man-Li; Sienkiewicz, Meta; Stajner, Ivanka

    2005-01-01

    This document describes the structure and validation of a frozen version of the Goddard Earth Observing System Data Assimilation System (GEOS DAS): GEOS-4.0.3. Significant features of GEOS-4 include: version 3 of the Community Climate Model (CCM3) with the addition of a finite volume dynamical core; version two of the Community Land Model (CLM2); the Physical-space Statistical Analysis System (PSAS); and an interactive retrieval system (iRET) for assimilating TOVS radiance data. Upon completion of the GEOS-4 validation in December 2003, GEOS-4 became operational on 15 January 2004. Products from GEOS-4 have been used in supporting field campaigns and for reprocessing several years of data for CERES.

  36. Third Annual Symposium on Document Analysis and Information Retrieval

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This document presents papers of the Third Annual Symposium on Document Analysis and Information Retrieval at the Information Science Research Institute at the University of Nevada, Las Vegas (UNLV/ISRI). Of the 60 papers submitted, 25 were accepted for oral presentation and 9 as poster papers. Both oral presentations and poster papers are included in these Proceedings. The individual papers have been cataloged separately.

  37. Every document and picture tells a story: using internal corporate document reviews, semiotics, and content analysis to assess tobacco advertising.

    PubMed

    Anderson, S J; Dewhirst, T; Ling, P M

    2006-06-01

    In this article we present communication theory as a conceptual framework for conducting documents research on tobacco advertising strategies, and we discuss two methods for analysing advertisements: semiotics and content analysis. We provide concrete examples of how we have used tobacco industry documents archives and tobacco advertisement collections iteratively in our research to yield a synergistic analysis of these two complementary data sources. Tobacco promotion researchers should consider adopting these theoretical and methodological approaches.

  38. Forensic document analysis using scanning microscopy

    NASA Astrophysics Data System (ADS)

    Shaffer, Douglas K.

    2009-05-01

    The authentication and identification of the source of a printed document(s) can be important in forensic investigations involving a wide range of fraudulent materials, including counterfeit currency, travel and identity documents, business and personal checks, money orders, prescription labels, travelers checks, medical records, financial documents and threatening correspondence. The physical and chemical characterization of document materials - including paper, writing inks and printed media - is becoming increasingly relevant for law enforcement agencies, with the availability of a wide variety of sophisticated commercial printers and copiers which are capable of producing fraudulent documents of extremely high print quality, rendering these difficult to distinguish from genuine documents. This paper describes various applications and analytical methodologies using scanning electron microscopy/energy dispersive (x-ray) spectroscopy (SEM/EDS) and related technologies for the characterization of fraudulent documents, and illustrates how their morphological and chemical profiles can be compared to (1) authenticate and (2) link forensic documents with a common source(s) in their production history.

  39. Intelligent Document Gateway: A Service System Case Study and Analysis

    NASA Astrophysics Data System (ADS)

    Krishna, Vikas; Lelescu, Ana

    In today's fast-paced world, it is necessary to process business documents expediently, accurately, and diligently. In other words, processing has to be fast, errors must be prevented (or caught and corrected quickly), and documents cannot be lost or misplaced. The failure to meet these criteria, depending on the type and purpose of the documents, can have serious business, legal, or safety consequences. In this paper, we evaluated a B2B order placement service system that allows clients to place orders for products and services over a network. We describe the order placement service before and after deploying the Intelligent Document Gateway (IDG), a document-centric business process automation technology from IBM Research. Using a service science perspective and service systems frameworks, we provide an analysis of how IDG improved the value proposition for both the service providers and service clients.

  40. Every document and picture tells a story: using internal corporate document reviews, semiotics, and content analysis to assess tobacco advertising

    PubMed Central

    Anderson, S J; Dewhirst, T; Ling, P M

    2006-01-01

    In this article we present communication theory as a conceptual framework for conducting documents research on tobacco advertising strategies, and we discuss two methods for analysing advertisements: semiotics and content analysis. We provide concrete examples of how we have used tobacco industry documents archives and tobacco advertisement collections iteratively in our research to yield a synergistic analysis of these two complementary data sources. Tobacco promotion researchers should consider adopting these theoretical and methodological approaches. PMID:16728758

  41. Communications data delivery system analysis : public workshop read-ahead document.

    DOT National Transportation Integrated Search

    2012-04-09

    This document presents an overview of work conducted to date around development and analysis of communications data delivery systems for supporting transactions in the connected vehicle environment. It presents the results of technical analysis of co...

  42. Failures in communication through documents and documentation across the perioperative pathway.

    PubMed

    Braaf, Sandra; Riley, Robin; Manias, Elizabeth

    2015-07-01

    To explore how communication failures occur in documents and documentations across the perioperative pathway in nurses' interactions with other nurses, surgeons and anaesthetists. Documents and documentation are used to communicate vital patient and procedural information among nurses, and in nurses' interactions with surgeons and anaesthetists, across the perioperative pathway. Previous research indicates that communication failure regularly occurs in the perioperative setting. A qualitative study was undertaken. The study was conducted over three hospitals in Melbourne, Australia. One hundred and twenty-five healthcare professionals from the disciplines of surgery, anaesthesia and nursing participated in the study. Data collection commenced in January 2010 and concluded in October 2010. Data were generated through 350 hours of observation, two focus groups and 20 semi-structured interviews. A detailed thematic analysis was undertaken. Communication failure occurred owing to a reliance on documents and documentation to transfer information at patient transition points, poor quality documents and documentation, and problematic access to information. Institutional ruling practices of professional practice, efficiency and productivity, and fiscal constraint dominated the coordination of nurses', surgeons' and anaesthetists' communication through documents and documentation. These governing practices configured communication to be incongruous with reliably meeting safety and quality objectives. Communication failure occurred because important information was sometimes buried in documents, insufficient, inaccurate, out-of-date or not verbally reinforced. Furthermore, busy nurses were not always able to access information they required in a timely manner. Patient safety was affected, which led to delays in treatment and at times inadequate care. Organisational support needs to be provided to nurses, surgeons and anaesthetists so they have sufficient time to complete

  43. Continuous Improvement in Online Education: Documenting Teaching Effectiveness in the Online Environment through Observations

    ERIC Educational Resources Information Center

    Purcell, Jennifer W.; Scott, Heather I.; Mixson-Brookshire, Deborah

    2017-01-01

    Teaching observations are commonly used among educators to document and improve teaching effectiveness. Unfortunately, the necessary protocols and supporting infrastructure are not consistently available for faculty who teach online. This paper presents a brief literature review and reflective narratives of educators representing online education…

  44. A Document Analysis of Teacher Evaluation Systems Specific to Physical Education

    ERIC Educational Resources Information Center

    Norris, Jason M.; van der Mars, Hans; Kulinna, Pamela; Kwon, Jayoun; Amrein-Beardsley, Audrey

    2017-01-01

    Purpose: The purpose of this document analysis study was to examine current teacher evaluation systems, understand current practices, and determine whether the instrumentation is a valid measure of teaching quality as reflected in teacher behavior and effectiveness specific to physical education (PE). Method: An interpretive document analysis…

  45. An observational study of the accuracy and completeness of an anesthesia information management system: recommendations for documentation system changes.

    PubMed

    Wilbanks, Bryan A; Moss, Jacqueline A; Berner, Eta S

    2013-08-01

    Anesthesia information management systems must often be tailored to fit the environment in which they are implemented. Extensive customization necessitates that systems be analyzed for both accuracy and completeness of documentation design to ensure that the final record is a true representation of practice. The purpose of this study was to determine the accuracy of a recently installed system in the capture of key perianesthesia data. This study used an observational design and was conducted using a convenience sample of nurse anesthetists. Observational data of the nurse anesthetists' delivery of anesthesia care were collected using a touch-screen tablet computer utilizing an Access database customized observational data collection tool. A questionnaire was also administered to these nurse anesthetists to assess perceived accuracy, completeness, and satisfaction with the electronic documentation system. The major sources of data not documented in the system were anesthesiologist presence (20%) and placement of intravenous lines (20%). The major sources of inaccuracies in documentation were gas flow rates (45%), medication administration times (30%), and documentation of neuromuscular function testing (20%); all of the sources of inaccuracies were related to the use of charting templates that were not altered to reflect the actual interventions performed.

  46. Restoration of recto-verso colour documents using correlated component analysis

    NASA Astrophysics Data System (ADS)

    Tonazzini, Anna; Bedini, Luigi

    2013-12-01

    In this article, we consider the problem of removing see-through interferences from pairs of recto-verso documents acquired either in grayscale or RGB modality. The see-through effect is a typical degradation of historical and archival documents or manuscripts, and is caused by transparency or seeping of ink from the reverse side of the page. We formulate the problem as one of separating two individual texts, overlapped in the recto and verso maps of the colour channels through a linear convolutional mixing operator, where the mixing coefficients are unknown, while the blur kernels are assumed known a priori or estimated off-line. We exploit statistical techniques of blind source separation to estimate both the unknown model parameters and the ideal, uncorrupted images of the two document sides. We show that recently proposed correlated component analysis techniques improve on the already satisfactory performance of independent component analysis techniques and colour decorrelation, even when the two texts are appreciably correlated.
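
    To make the mixing model concrete, the toy sketch below simulates the linear see-through model on synthetic images and separates the sides by inverting a (noisily known) mixing matrix; the blur kernels and the correlated component analysis estimation step are omitted, and the coefficients are invented for illustration.

      # Toy linear recto-verso mixing and least-squares unmixing.
      import numpy as np

      rng = np.random.default_rng(1)
      recto_ideal = rng.random((32, 32))
      verso_ideal = rng.random((32, 32))

      A = np.array([[1.0, 0.35],      # 35% of the (flipped) verso bleeds into the recto
                    [0.30, 1.0]])     # 30% of the recto shows through on the verso
      sources = np.stack([recto_ideal.ravel(), np.fliplr(verso_ideal).ravel()])
      observed = A @ sources          # rows: observed recto scan, observed verso scan

      A_hat = A + rng.normal(0.0, 0.02, A.shape)   # stand-in for an estimated mixing matrix
      separated = np.linalg.solve(A_hat, observed)
      recto_restored = separated[0].reshape(32, 32)
      print(np.corrcoef(recto_restored.ravel(), recto_ideal.ravel())[0, 1])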

  47. Overview: The Design, Adoption, and Analysis of a Visual Document Mining Tool for Investigative Journalists.

    PubMed

    Brehmer, Matthew; Ingram, Stephen; Stray, Jonathan; Munzner, Tamara

    2014-12-01

    For an investigative journalist, a large collection of documents obtained from a Freedom of Information Act request or a leak is both a blessing and a curse: such material may contain multiple newsworthy stories, but it can be difficult and time consuming to find relevant documents. Standard text search is useful, but even if the search target is known it may not be possible to formulate an effective query. In addition, summarization is an important non-search task. We present Overview, an application for the systematic analysis of large document collections based on document clustering, visualization, and tagging. This work contributes to the small set of design studies which evaluate a visualization system "in the wild", and we report on six case studies where Overview was voluntarily used by self-initiated journalists to produce published stories. We find that the frequently-used language of "exploring" a document collection is both too vague and too narrow to capture how journalists actually used our application. Our iterative process, including multiple rounds of deployment and observations of real world usage, led to a much more specific characterization of tasks. We analyze and justify the visual encoding and interaction techniques used in Overview's design with respect to our final task abstractions, and propose generalizable lessons for visualization design methodology.

  48. Quality Evaluation of Nursing Observation Based on a Survey of Nursing Documents Using NursingNAVI.

    PubMed

    Tsuru, Satoko; Omori, Miho; Inoue, Manami; Wako, Fumiko

    2016-01-01

    We have identified three foci of nursing observation and three of nursing action. Using these frameworks, we developed structured knowledge models for a number of diseases and medical interventions, building the NursingNAVI® contents in collaboration with several quality-centred hospitals. The authors analysed the nursing care documentation of post-gastrectomy patients against the standardized nursing care plan in NursingNAVI® and revealed "failure to observe" and "failure to document", which led to the loss of data about patients' conditions and situations. This loss should have been avoided if nurses had employed a standardized nursing care plan. We therefore developed a thinking-process support system for planning, delivering, recording and evaluating daily nursing care using the NursingNAVI® contents. Because it is important to identify where patient data, conditions and situations go unrecorded, we developed a survey tool of nursing documents using the NursingNAVI® contents for the quality evaluation of nursing observation and recommended its use to hospitals. Fifteen hospitals participated in the survey using this tool, allowing the extent of unrecorded observations to be estimated. A hospital that did not participate in the survey learned of the results and decided to use the NursingNAVI® contents in its hospital information system. The findings suggest that the system is useful for nursing on-the-job training and for reducing the time spent on planning and recording while avoiding unrecorded observations.

  49. Old document image segmentation using the autocorrelation function and multiresolution analysis

    NASA Astrophysics Data System (ADS)

    Mehri, Maroua; Gomez-Krämer, Petra; Héroux, Pierre; Mullot, Rémy

    2013-01-01

    Recent progress in the digitization of heterogeneous collections of ancient documents has rekindled new challenges in information retrieval in digital libraries and document layout analysis. Therefore, in order to control the quality of historical document image digitization and to meet the need for a characterization of their content using intermediate-level metadata (between image and document structure), we propose a fast automatic layout segmentation of old document images based on five descriptors. Those descriptors, based on the autocorrelation function, are obtained by multiresolution analysis and used afterwards in a specific clustering method. The method proposed in this article has the advantage that it is performed without any hypothesis on the document structure, either about the document model (physical structure) or the typographical parameters (logical structure). It is also parameter-free since it automatically adapts to the image content. In this paper, we first detail our proposal to characterize the content of old documents by extracting the autocorrelation features in the different areas of a page and at several resolutions. Then, we show that it is possible to automatically find the homogeneous regions defined by similar indices of autocorrelation, without knowledge about the number of clusters, using adapted hierarchical ascendant classification and consensus clustering approaches. To assess our method, we apply our algorithm on 316 old document images, which encompass six centuries (1200-1900) of French history, in order to demonstrate the performance of our proposal in terms of segmentation and characterization of heterogeneous corpus content. Moreover, we define a new evaluation metric, the homogeneity measure, which aims at evaluating the segmentation and characterization accuracy of our methodology. We find a mean homogeneity accuracy of 85%. Those results help to represent a document by a hierarchy of layout structure and content, and to
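
    For illustration, the hedged sketch below computes a small FFT-based autocorrelation descriptor for each page block at two resolutions and groups the blocks by hierarchical (ascendant) clustering; the block size, lags and two-cluster cut are assumptions, and the consensus-clustering step is omitted.

      # Autocorrelation descriptors per block, then hierarchical clustering.
      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      def autocorr_descriptor(block: np.ndarray) -> np.ndarray:
          f = np.fft.fft2(block - block.mean())
          ac = np.real(np.fft.ifft2(f * np.conj(f)))        # autocorrelation via FFT
          ac /= ac.flat[0] + 1e-9                           # normalise by the zero lag
          return np.concatenate([ac[0, 1:4], ac[1:4, 0]])   # a few horizontal/vertical lags

      def block_features(page: np.ndarray, size: int = 32) -> np.ndarray:
          feats = []
          for i in range(0, page.shape[0] - size + 1, size):
              for j in range(0, page.shape[1] - size + 1, size):
                  blk = page[i:i + size, j:j + size]
                  coarse = blk[::2, ::2]                    # a second, coarser resolution
                  feats.append(np.concatenate([autocorr_descriptor(blk),
                                               autocorr_descriptor(coarse)]))
          return np.array(feats)

      page = np.random.rand(128, 128)                       # stand-in for a page image
      features = block_features(page)
      labels = fcluster(linkage(features, method="ward"), t=2, criterion="maxclust")
      print(labels)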

  10. Analysis of Private Returns to Vocational Education and Training: Support Document

    ERIC Educational Resources Information Center

    Lee, Wang-Sheng; Coelli, Michael

    2010-01-01

    This document is an appendix that is meant to accompany the main report, "Analysis of Private Returns to Vocational Education and Training". Included here are the detailed regression results that correspond to Tables 4 to 59 of the main report. This document was produced by the authors based on their research for the main report, and is…

  11. Explanation and Elaboration Document for the STROBE-Vet Statement: Strengthening the Reporting of Observational Studies in Epidemiology - Veterinary Extension.

    PubMed

    O'Connor, A M; Sargeant, J M; Dohoo, I R; Erb, H N; Cevallos, M; Egger, M; Ersbøll, A K; Martin, S W; Nielsen, L R; Pearl, D L; Pfeiffer, D U; Sanchez, J; Torrence, M E; Vigre, H; Waldner, C; Ward, M P

    2016-12-01

    The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement was first published in 2007 and again in 2014. The purpose of the original STROBE was to provide guidance for authors, reviewers and editors to improve the comprehensiveness of reporting; however, STROBE has a unique focus on observational studies. Although much of the guidance provided by the original STROBE document is directly applicable, it was deemed useful to map those statements to veterinary concepts, provide veterinary examples and highlight unique aspects of reporting in veterinary observational studies. Here, we present the examples and explanations for the checklist items included in the STROBE-Vet Statement. Thus, this is a companion document to the STROBE-Vet Statement Methods and process document, which describes the checklist and how it was developed. © 2016 The Authors. Zoonoses and Public Health published by Blackwell Verlag GmbH.

  12. [Psychoanalysis and Psychiatrie-Enquete: expert interviews and document analysis].

    PubMed

    Söhner, Felicitas Petra; Fangerau, Heiner; Becker, Thomas

    2017-12-01

    Background The purpose of this paper is to analyse the perception of the role of psychoanalysis and psychoanalysts in the coming about of the Psychiatrie-Enquete in the Federal Republic of Germany (West Germany). Methods We performed a qualitative content analysis of expert interviews with persons involved in the Enquete (or witnessing the events as mental health professionals active at the time), a selective literature review and an analysis of documents on the Enquete process. Results Expert interviews, relevant literature and documents point to a role of psychoanalysis in the Enquete process. Psychoanalysts were considered to have been effective in the run-up to the Enquete and the work of the commission. Conclusion Psychoanalysis and a small number of psychoanalysts were perceived as being relevant in the overall process of the Psychiatrie-Enquete in West Germany. Georg Thieme Verlag KG Stuttgart · New York.

  13. Model-based document categorization employing semantic pattern analysis and local structure clustering

    NASA Astrophysics Data System (ADS)

    Fume, Kosei; Ishitani, Yasuto

    2008-01-01

    We propose a document categorization method based on a document model that can be defined externally for each task and that categorizes Web content or business documents into a target category in accordance with the similarity of the model. The main feature of the proposed method consists of two aspects of semantics extraction from an input document. The semantics of terms are extracted by the semantic pattern analysis and implicit meanings of document substructure are specified by a bottom-up text clustering technique focusing on the similarity of text line attributes. We have constructed a system based on the proposed method for trial purposes. The experimental results show that the system achieves more than 80% classification accuracy in categorizing Web content and business documents into 15 or 70 categories.

  14. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1993-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X DataSlice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.

  15. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1992-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X Data Slice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.

  16. Explanation and Elaboration Document for the STROBE-Vet Statement: Strengthening the Reporting of Observational Studies in Epidemiology-Veterinary Extension.

    PubMed

    O'Connor, A M; Sargeant, J M; Dohoo, I R; Erb, H N; Cevallos, M; Egger, M; Ersbøll, A K; Martin, S W; Nielsen, L R; Pearl, D L; Pfeiffer, D U; Sanchez, J; Torrence, M E; Vigre, H; Waldner, C; Ward, M P

    2016-11-01

    The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement was first published in 2007 and again in 2014. The purpose of the original STROBE was to provide guidance for authors, reviewers, and editors to improve the comprehensiveness of reporting; however, STROBE has a unique focus on observational studies. Although much of the guidance provided by the original STROBE document is directly applicable, it was deemed useful to map those statements to veterinary concepts, provide veterinary examples, and highlight unique aspects of reporting in veterinary observational studies. Here, we present the examples and explanations for the checklist items included in the STROBE-Vet statement. Thus, this is a companion document to the STROBE-Vet statement methods and process document ("Methods and Processes of Developing the Strengthening the Reporting of Observational Studies in Epidemiology-Veterinary (STROBE-Vet) Statement"), which describes the checklist and how it was developed. Copyright © 2016 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.

  17. Blood pressure documentation in the emergency department

    PubMed Central

    Daniel, Ana Carolina Queiroz Godoy; Machado, Juliana Pereira; Veiga, Eugenia Velludo

    2017-01-01

    ABSTRACT Objective To analyze the frequency of blood pressure documentation performed by nursing professionals in an emergency department. Methods This is a cross-sectional, observational, descriptive, and analytical study, which included medical records of adult patients admitted to the observation ward of an emergency department, between March and May 2014. Data were obtained through a collection instrument divided into three parts: patient identification, triage data, and blood pressure documentation. For statistical analysis, Pearson’s correlation coefficient was used, with a significance level of α<0.05. Results One hundred fifty-seven records and 430 blood pressure measurements were analyzed with an average of three measurements per patient. Of these measures, 46.5% were abnormal. The mean time from admission to documentation of the first blood pressure measurement was 2.5 minutes, with 42 minutes between subsequent measures. There is no correlation between the systolic blood pressure values and the mean time interval between blood pressure documentations: 0.173 (p=0.031). Conclusion The present study found no correlation between frequency of blood pressure documentation and blood pressure values. The frequency of blood pressure documentation increased according to the severity of the patient and decreased during the length of stay in the emergency department. PMID:28444085
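
    As a hedged illustration of the statistic reported above, the snippet below computes Pearson's correlation coefficient with SciPy on invented blood pressure and interval values; it reproduces the method, not the study's data or result.

```python
# Illustration only (made-up values): Pearson's correlation between systolic blood
# pressure readings and the interval since the previous documentation, as in the study.
from scipy.stats import pearsonr

systolic_mmHg = [118, 142, 160, 101, 135, 150, 127, 171]   # hypothetical readings
interval_min  = [40, 35, 55, 60, 38, 45, 50, 30]           # hypothetical intervals

r, p = pearsonr(systolic_mmHg, interval_min)
print(f"r = {r:.3f}, p = {p:.3f}")   # significance judged against alpha = 0.05
```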

  18. Assessment of documentation requirements under DOE 5481.1, Safety Analysis and Review System (SARS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Browne, E.T.

    1981-03-01

    This report assesses the requirements of DOE Order 5481.1, Safety Analysis and Review System for DOE Operations (SARS) in regard to maintaining SARS documentation. Under SARS, all pertinent details of the entire safety analysis and review process for each DOE operation are to be traceable from the initial identification of a hazard. This report is intended to provide assistance in identifying the points in the SARS cycle at which documentation is required, what type of documentation is most appropriate, and where it ultimately should be maintained.

  19. Analysis of Document Authentication Technique using Soft Magnetic Fibers

    NASA Astrophysics Data System (ADS)

    Aoki, Ayumi; Ikeda, Takashi; Yamada, Tsutomu; Takemura, Yasushi; Matsumoto, Tsutomu

    An artifact-metric system using magnetic fibers can be applied to the authentication of stock certificates, bills, passports, plastic cards and other documents. The security of the system rests on the difficulty of copying it. This authentication system is based on magnetic fibers randomly dispersed and embedded in documents. In this paper, a theoretical analysis was performed in order to evaluate this system. The positions of the magnetic fibers were determined by a conventional random number generator. By measuring the output waveforms with a magnetoresistance (MR) sensor, a false match rate (FMR) could be calculated. The density of the magnetic fibers and the dimensions of the MR sensor were optimized.
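
    The false-match-rate idea can be sketched with a toy Monte Carlo simulation; the fiber layout and sensor response below are simplifications assumed for illustration, not the authors' MR sensor model.

```python
# Hedged Monte Carlo sketch of the false-match-rate idea (not the authors' sensor model):
# each document is a random fiber layout; a "waveform" is the fiber density seen along a
# scan line; two unrelated documents falsely match if their waveforms correlate strongly.
import numpy as np

rng = np.random.default_rng(0)

def waveform(n_fibers=50, length=256, sensor_width=8):
    positions = rng.uniform(0, length, n_fibers)           # random fiber positions
    signal = np.zeros(length)
    for p in positions:
        lo, hi = int(max(p - sensor_width, 0)), int(min(p + sensor_width, length))
        signal[lo:hi] += 1.0                               # crude sensor response
    return signal

def false_match_rate(trials=2000, threshold=0.5):
    hits = 0
    for _ in range(trials):
        a, b = waveform(), waveform()                      # two unrelated documents
        r = np.corrcoef(a, b)[0, 1]
        hits += r > threshold
    return hits / trials

print("estimated FMR:", false_match_rate())
```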

  20. Tobacco document research reporting

    PubMed Central

    Carter, S

    2005-01-01

    Design: Interpretive analysis of published research. Sample: 173 papers indexed in Medline between 1995 and 2004 that cited tobacco industry documents. Analysis: Information about year published, journal and author, and a set of codes relating to methods reporting, were managed in N*Vivo. This coding formed the basis of an interpretation of tobacco document research reporting. Results: Two types of papers were identified. The first used tobacco documents as the primary data source (A-papers). The second was dedicated to another purpose but cited a small number of documents (B-papers). In B-papers documents were used either to provide a specific example or to support an expansive contention. A-papers contained information about purpose, sources, searching, analysis, and limitations that differed by author and journal and over time. A-papers had no clear methodological context, but used words from three major traditions—interpretive research, positivist research, and history—to describe analysis. Interpretation: A descriptive mainstream form of tobacco document reporting is proposed, initially typical but decreasing, and a continuum of positioning of the researcher, from conduit to constructor. Reporting practices, particularly from experienced researchers, appeared to evolve towards researcher as constructor, with later papers showing more complex purposes, diverse sources, and detail of searching and analysis. Tobacco document research could learn from existing research traditions: a model for planning and evaluating tobacco document research is presented. PMID:16319359

  1. Content analysis to detect high stress in oral interviews and text documents

    NASA Technical Reports Server (NTRS)

    Thirumalainambi, Rajkumar (Inventor); Jorgensen, Charles C. (Inventor)

    2012-01-01

    A system of interrogation to estimate whether a subject of interrogation is likely experiencing high stress, emotional volatility and/or internal conflict in the subject's responses to an interviewer's questions. The system applies one or more of four procedures, a first statistical analysis, a second statistical analysis, a third analysis and a heat map analysis, to identify one or more documents containing the subject's responses for which further examination is recommended. Words in the documents are characterized in terms of dimensions representing different classes of emotions and states of mind, in which the subject's responses that manifest high stress, emotional volatility and/or internal conflict are identified. A heat map visually displays the dimensions manifested by the subject's responses in different colors, textures, geometric shapes or other visually distinguishable indicia.
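
    A toy version of the heat-map step might look as follows; the emotion-dimension lexicons and scoring are hypothetical stand-ins, not the patented dimensions or analyses.

```python
# Toy sketch of the heat-map idea (hypothetical mini-lexicons, not the patented method):
# score each response document on a few emotion/state-of-mind dimensions and display a grid.
import matplotlib.pyplot as plt
import numpy as np

lexicons = {                                   # hypothetical dimension word lists
    "stress":   {"pressure", "deadline", "afraid", "worried"},
    "conflict": {"but", "however", "deny", "refuse"},
    "anger":    {"furious", "hate", "angry"},
}

responses = [
    "I was worried about the deadline but I refuse to say more",
    "Everything was fine, no pressure at all",
]

scores = np.array([[sum(w in lex for w in r.lower().split()) for lex in lexicons.values()]
                   for r in responses])

plt.imshow(scores, cmap="hot")
plt.xticks(range(len(lexicons)), lexicons.keys())
plt.yticks(range(len(responses)), [f"doc {i}" for i in range(len(responses))])
plt.colorbar(label="dimension score")
plt.show()
```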

  2. An Analysis of Community Health Nurses Documentation: The Best Approach to Computerization

    PubMed Central

    Chalmers, M.

    1988-01-01

    The study explored and analyzed the actual patient-related documentation performed by a sample of community health nurses working in voluntary home health agencies. The outcome of the study was a system flow chart of that documentation and included: common components of the documentation, where in the existing systems they are recorded, when they are recorded by the nurse and why they are used by the nurses and administrative personnel in the agencies. The flow chart is suitable for use as a prototype for the development of a computer software package for the computerization of the patient-related documentation by community health nurses. General System and communication theories were used as a framework for this study. A thorough analysis of the documentation resulted in a complete and exhaustive explication of the documentation by community health nurses, as well as the identification of what parts of that documentation lend themselves most readily to computerization and what areas, if any, may not readily adapt to computerization.

  3. Web document ranking via active learning and kernel principal component analysis

    NASA Astrophysics Data System (ADS)

    Cai, Fei; Chen, Honghui; Shu, Zhen

    2015-09-01

    Web document ranking arises in many information retrieval (IR) applications, such as search engines, recommendation systems and online advertising. A challenging issue is how to select representative query-document pairs and informative features for better learning and for exploring new ranking models that produce an acceptable ranking list of candidate documents for each query. In this study, we propose an active sampling (AS) plus kernel principal component analysis (KPCA) based ranking model, viz. AS-KPCA Regression, to study document ranking for a retrieval system, i.e. how to choose representative query-document pairs and features for learning. More precisely, we gradually add to the training set, via AS, the documents that would incur the highest expected DCG loss if left unselected. Then, KPCA is performed by projecting the selected query-document pairs onto p principal components in the feature space to complete the regression. Hence, we can cut down the computational overhead and reduce the impact of noise simultaneously. To the best of our knowledge, we are the first to perform document ranking via dimension reduction along two dimensions simultaneously, namely the number of documents and the number of features. Our experiments demonstrate that the performance of our approach is better than that of the baseline methods on the public LETOR 4.0 datasets. Our approach yields an improvement of nearly 20% over RankBoost and other baselines in terms of the MAP metric, with smaller improvements in P@K and NDCG@K. Moreover, our approach is particularly suitable for document ranking on noisy datasets in practice.
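
    The KPCA-plus-regression portion can be sketched with scikit-learn; the active sampling criterion below (greedily adding the pairs with the largest current prediction error) is a simplified stand-in for the paper's expected-DCG-loss rule, and all parameter values are illustrative.

```python
# Sketch of the KPCA + regression part, with a simplified stand-in for active sampling.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

def as_kpca_rank(X, y, n_components=10, initial=50, rounds=5, batch=25):
    """X: feature matrix of query-document pairs; y: relevance labels."""
    rng = np.random.default_rng(0)
    selected = list(rng.choice(len(X), initial, replace=False))
    model = make_pipeline(KernelPCA(n_components=n_components, kernel="rbf"), Ridge())
    for _ in range(rounds):
        model.fit(X[selected], y[selected])
        errors = np.abs(model.predict(X) - y)     # proxy for the expected loss criterion
        errors[selected] = -np.inf                # never re-select a pair
        selected += list(np.argsort(errors)[-batch:])   # most informative pairs
    model.fit(X[selected], y[selected])
    return model                                  # predicts relevance scores for ranking

# Usage: scores = as_kpca_rank(features, relevance).predict(features); rank docs by score.
```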

  4. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  5. Comparison of approaches for mobile document image analysis using server supported smartphones

    NASA Astrophysics Data System (ADS)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-03-01

    With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcome these limitations is performing resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource-consuming process is the Optical Character Recognition (OCR) process, which is used to extract text from images captured with mobile phones. In this study, our goal is to compare the in-phone and the remote-server processing approaches for mobile document image analysis in order to explore their trade-offs. For the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. On the other hand, in the remote-server approach, the core OCR process runs on the remote server and other processes run on the mobile phone. Results of the experiments show that the remote-server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote-server approach overall outperforms the in-phone approach in terms of the selected speed and recognition-accuracy metrics, if the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote-server approach performs better than the in-phone approach in terms of both speed and acceptable recognition accuracy.

  6. Public health human resources: a comparative analysis of policy documents in two Canadian provinces

    PubMed Central

    2014-01-01

    Background Amidst concerns regarding the capacity of the public health system to respond rapidly and appropriately to threats such as pandemics and terrorism, along with changing population health needs, governments have focused on strengthening public health systems. A key factor in a robust public health system is its workforce. As part of a nationally funded study of public health renewal in Canada, a policy analysis was conducted to compare public health human resources-relevant documents in two Canadian provinces, British Columbia (BC) and Ontario (ON), as they each implement public health renewal activities. Methods A content analysis of policy and planning documents from government and public health-related organizations was conducted by a research team comprised of academics and government decision-makers. Documents published between 2003 and 2011 were accessed (BC = 27; ON = 20); documents were either publicly available or internal to government and excerpted with permission. Documentary texts were deductively coded using a coding template developed by the researchers based on key health human resources concepts derived from two national policy documents. Results Documents in both provinces highlighted the importance of public health human resources planning and policies; this was particularly evident in early post-SARS documents. Key thematic areas of public health human resources identified were: education, training, and competencies; capacity; supply; intersectoral collaboration; leadership; public health planning context; and priority populations. Policy documents in both provinces discussed the importance of an educated, competent public health workforce with the appropriate skills and competencies for the effective and efficient delivery of public health services. Conclusion This policy analysis identified progressive work on public health human resources policy and planning with early documents providing an inventory of issues to be

  7. Public health human resources: a comparative analysis of policy documents in two Canadian provinces.

    PubMed

    Regan, Sandra; MacDonald, Marjorie; Allan, Diane E; Martin, Cheryl; Peroff-Johnston, Nancy

    2014-02-24

    Amidst concerns regarding the capacity of the public health system to respond rapidly and appropriately to threats such as pandemics and terrorism, along with changing population health needs, governments have focused on strengthening public health systems. A key factor in a robust public health system is its workforce. As part of a nationally funded study of public health renewal in Canada, a policy analysis was conducted to compare public health human resources-relevant documents in two Canadian provinces, British Columbia (BC) and Ontario (ON), as they each implement public health renewal activities. A content analysis of policy and planning documents from government and public health-related organizations was conducted by a research team comprised of academics and government decision-makers. Documents published between 2003 and 2011 were accessed (BC = 27; ON = 20); documents were either publicly available or internal to government and excerpted with permission. Documentary texts were deductively coded using a coding template developed by the researchers based on key health human resources concepts derived from two national policy documents. Documents in both provinces highlighted the importance of public health human resources planning and policies; this was particularly evident in early post-SARS documents. Key thematic areas of public health human resources identified were: education, training, and competencies; capacity; supply; intersectoral collaboration; leadership; public health planning context; and priority populations. Policy documents in both provinces discussed the importance of an educated, competent public health workforce with the appropriate skills and competencies for the effective and efficient delivery of public health services. This policy analysis identified progressive work on public health human resources policy and planning with early documents providing an inventory of issues to be addressed and later documents providing

  8. Global Nursing Issues and Development: Analysis of World Health Organization Documents.

    PubMed

    Wong, Frances Kam Yuet; Liu, Huaping; Wang, Hui; Anderson, Debra; Seib, Charrlotte; Molasiotis, Alex

    2015-11-01

    To analyze World Health Organization (WHO) documents to identify global nursing issues and development. Qualitative content analysis. Documents published by the six WHO regions between 2007 and 2012 and with key words related to nurse/midwife or nursing/midwifery were included. Themes, categories, and subcategories were derived. The final coding reached 80% agreement among three independent coders, and the final coding for the discrepant coding was reached by consensus. Thirty-two documents from the regions of Europe (n = 19), the Americas (n = 6), the Western Pacific (n = 4), Africa (n = 1), the Eastern Mediterranean (n = 1), and Southeast Asia (n = 1) were examined. A total of 385 units of analysis dispersed in 31 subcategories under four themes were derived. The four themes derived (number of units of analysis, %) were Management & Leadership (206, 53.5), Practice (75, 19.5), Education (70, 18.2), and Research (34, 8.8). The key nursing issues of concern at the global level are workforce, the impacts of nursing in health care, professional status, and education of nurses. International alliances can help advance nursing, but the visibility of nursing in the WHO needs to be strengthened. Organizational leadership is important in order to optimize the use of nursing competence in practice and inform policy makers regarding the value of nursing to promote people's health. © 2015 Sigma Theta Tau International.

  9. Organ donation in the ICU: A document analysis of institutional policies, protocols, and order sets.

    PubMed

    Oczkowski, Simon J W; Centofanti, John E; Durepos, Pamela; Arseneau, Erika; Kelecevic, Julija; Cook, Deborah J; Meade, Maureen O

    2018-04-01

    To better understand how local policies influence organ donation rates. We conducted a document analysis of our ICU organ donation policies, protocols and order sets. We used a systematic search of our institution's policy library to identify documents related to organ donation. We used Mindnode software to create a publication timeline, basic statistics to describe document characteristics, and qualitative content analysis to extract document themes. Documents were retrieved from Hamilton Health Sciences, an academic hospital system with a high volume of organ donation, from database inception to October 2015. We retrieved 12 active organ donation documents, including six protocols, two policies, two order sets, and two unclassified documents, a majority (75%) after the introduction of donation after circulatory death in 2006. Four major themes emerged: organ donation process, quality of care, patient and family-centred care, and the role of the institution. These themes indicate areas where documented institutional standards may be beneficial. Further research is necessary to determine the relationship of local policies, protocols, and order sets to actual organ donation practices, and to identify barriers and facilitators to improving donation rates. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Earth Observing System (EOS) Advanced Microwave Sounding Unit-A (AMSU-A): Instrumentation interface control document

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This Interface Control Document (ICD) defines the specific details of the complete accommodation information between the Earth Observing System (EOS) PM Spacecraft and the Advanced Microwave Sounding Unit (AMSU-A) Instrument. This is the first submittal of the ICD; it will be updated periodically throughout the life of the program. The next update is planned prior to the Critical Design Review (CDR).

  11. Prison-Based Educational Programs: A Content Analysis of Government Documents

    ERIC Educational Resources Information Center

    Piotrowski, Chris; Lathrop, Peter J.

    2012-01-01

    The literature provides limited, constructive, consensus-based information to correctional officials and administrators on the efficacy of prison-based programs. This study reports an analysis of 8 review government documents, that surveyed the research literature from 1980-2008, on the topic of educational rehabilitation programs available to…

  12. Analysis of Informed Consent Document Utilization in a Minimal-Risk Genetic Study

    PubMed Central

    Desch, Karl; Li, Jun; Kim, Scott; Laventhal, Naomi; Metzger, Kristen; Siemieniak, David; Ginsburg, David

    2012-01-01

    Background The signed informed consent document certifies that the process of informed consent has taken place and provides research participants with comprehensive information about their role in the study. Despite efforts to optimize the informed consent document, only limited data are available about the actual use of consent documents by participants in biomedical research. Objective To examine the use of online consent documents in a minimal-risk genetic study. Design Prospective sibling cohort enrolled as part of a genetic study of hematologic and common human traits. Setting University of Michigan Campus, Ann Arbor, Michigan. Participants Volunteer sample of healthy persons with 1 or more eligible siblings aged 14 to 35 years. Enrollment was through targeted e-mail to student lists. A total of 1209 persons completed the study. Measurements Time taken by participants to review a 2833-word online consent document before indicating consent and identification of a masked hyperlink near the end of the document. Results The minimum predicted reading time was 566 seconds. The median time to consent was 53 seconds. A total of 23% of participants consented within 10 seconds, and 93% of participants consented in less than the minimum predicted reading time. A total of 2.5% of participants identified the masked hyperlink. Limitation The online consent process was not observed directly by study investigators, and some participants may have viewed the consent document more than once. Conclusion Few research participants thoroughly read the consent document before agreeing to participate in this genetic study. These data suggest that current informed consent documents, particularly for low-risk studies, may no longer serve the intended purpose of protecting human participants, and the role of these documents should be reassessed. Primary Funding Source National Institutes of Health. PMID:21893624
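
    The reported 566-second minimum is consistent with an assumed reading rate of about 300 words per minute for the 2833-word document; the short calculation below is a reconstruction, not a figure stated in the abstract.

```python
# Reconstruction of the minimum predicted reading time (assumed rate, not stated in the abstract)
words = 2833
assumed_wpm = 300
min_read_s = words / assumed_wpm * 60
print(round(min_read_s))   # ~567 seconds, in line with the reported 566-second minimum
```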

  13. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  14. Analysis of a risk prevention document using dependability techniques: a first step towards an effectiveness model

    NASA Astrophysics Data System (ADS)

    Ferrer, Laetitia; Curt, Corinne; Tacnet, Jean-Marc

    2018-04-01

    Major hazard prevention is a major challenge given that it is specifically based on information communicated to the public. In France, preventive information is notably provided by way of local regulatory documents. Unfortunately, the law requires only a few specifications concerning their content; one can therefore question how the way the document is actually produced affects the general population. Ergo, the purpose of our work is to propose an analytical methodology to evaluate the effectiveness of preventive risk communication documents. The methodology is based on dependability approaches and is applied in this paper to the Document d'Information Communal sur les Risques Majeurs (DICRIM; in English, Municipal Information Document on Major Risks). DICRIM has to be produced by mayors and addressed to the public to provide information on major hazards affecting their municipalities. An analysis of the document's legal compliance is carried out through the identification of regulatory detection elements. These are applied to a database of 30 DICRIMs. This analysis leads to a discussion of points such as the usefulness of the missing elements. External and internal function analysis permits identification of the form and content requirements and of the service and technical functions of the document and its components (here, its sections). These results are used to carry out an FMEA (failure modes and effects analysis), which allows us to define the failures and to identify detection elements. This permits evaluation of the effectiveness of the form and content of each component of the document. The outputs are validated by experts from the different fields investigated. These results will be used to build, in future work, a decision support model for the municipality (or specialised consulting firms) in charge of drawing up such documents.
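
    The FMEA step can be illustrated generically; the document sections and severity/occurrence/detection ratings below are invented for demonstration and compute the standard risk priority number, not the authors' actual scoring.

```python
# Generic FMEA scoring sketch (illustrative ratings, not the authors' actual analysis):
# each document section gets a risk priority number = severity x occurrence x detection.
sections = {
    # section: (severity, occurrence, detection) on 1-10 scales
    "alert signals":     (8, 4, 6),
    "evacuation routes": (9, 3, 5),
    "risk map":          (7, 5, 7),
}

for name, (sev, occ, det) in sorted(sections.items(),
                                    key=lambda kv: -(kv[1][0] * kv[1][1] * kv[1][2])):
    print(f"{name:20s} RPN = {sev * occ * det}")
```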

  15. Analysis of Documentation Speed Using Web-Based Medical Speech Recognition Technology: Randomized Controlled Trial.

    PubMed

    Vogel, Markus; Kaisers, Wolfgang; Wassmuth, Ralf; Mayatepek, Ertan

    2015-11-03

    Clinical documentation has undergone a change due to the usage of electronic health records. The core element is to capture clinical findings and document therapy electronically. Health care personnel spend a significant portion of their time on the computer. Alternatives to self-typing, such as speech recognition, are currently believed to increase documentation efficiency and quality, as well as satisfaction of health professionals while accomplishing clinical documentation, but few studies in this area have been published to date. This study describes the effects of using a Web-based medical speech recognition system for clinical documentation in a university hospital on (1) documentation speed, (2) document length, and (3) physician satisfaction. Reports of 28 physicians were randomized to be created with (intervention) or without (control) the assistance of a Web-based system of medical automatic speech recognition (ASR) in the German language. The documentation was entered into a browser's text area and the time to complete the documentation including all necessary corrections, correction effort, number of characters, and mood of participant were stored in a database. The underlying time comprised text entering, text correction, and finalization of the documentation event. Participants self-assessed their moods on a scale of 1-3 (1=good, 2=moderate, 3=bad). Statistical analysis was done using permutation tests. The number of clinical reports eligible for further analysis stood at 1455. Out of 1455 reports, 718 (49.35%) were assisted by ASR and 737 (50.65%) were not assisted by ASR. Average documentation speed without ASR was 173 (SD 101) characters per minute, while it was 217 (SD 120) characters per minute using ASR. The overall increase in documentation speed through Web-based ASR assistance was 26% (P=.04). Participants documented an average of 356 (SD 388) characters per report when not assisted by ASR and 649 (SD 561) characters per report when assisted
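
    The permutation test mentioned above can be sketched as follows; the speed values are simulated from the reported means and standard deviations purely to make the example self-contained.

```python
# Generic two-sample permutation test of the kind described (illustrative data only):
# does the mean documentation speed differ between ASR-assisted and unassisted reports?
import numpy as np

rng = np.random.default_rng(0)
speed_asr   = rng.normal(217, 120, 718)   # characters/min, simulated to match reported stats
speed_noasr = rng.normal(173, 101, 737)

observed = speed_asr.mean() - speed_noasr.mean()
pooled = np.concatenate([speed_asr, speed_noasr])
n = len(speed_asr)

count = 0
for _ in range(10_000):
    rng.shuffle(pooled)
    diff = pooled[:n].mean() - pooled[n:].mean()
    count += abs(diff) >= abs(observed)

print("two-sided p =", count / 10_000)
```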

  16. SUSHI: an exquisite recipe for fully documented, reproducible and reusable NGS data analysis.

    PubMed

    Hatakeyama, Masaomi; Opitz, Lennart; Russo, Giancarlo; Qi, Weihong; Schlapbach, Ralph; Rehrauer, Hubert

    2016-06-02

    Next generation sequencing (NGS) produces massive datasets consisting of billions of reads and up to thousands of samples. Subsequent bioinformatic analysis is typically done with the help of open source tools, where each application performs a single step towards the final result. This situation leaves the bioinformaticians with the tasks to combine the tools, manage the data files and meta-information, document the analysis, and ensure reproducibility. We present SUSHI, an agile data analysis framework that relieves bioinformaticians from the administrative challenges of their data analysis. SUSHI lets users build reproducible data analysis workflows from individual applications and manages the input data, the parameters, meta-information with user-driven semantics, and the job scripts. As distinguishing features, SUSHI provides an expert command line interface as well as a convenient web interface to run bioinformatics tools. SUSHI datasets are self-contained and self-documented on the file system. This makes them fully reproducible and ready to be shared. With the associated meta-information being formatted as plain text tables, the datasets can be readily further analyzed and interpreted outside SUSHI. SUSHI provides an exquisite recipe for analysing NGS data. By following the SUSHI recipe, SUSHI makes data analysis straightforward and takes care of documentation and administration tasks. Thus, the user can fully dedicate his time to the analysis itself. SUSHI is suitable for use by bioinformaticians as well as life science researchers. It is targeted for, but by no means constrained to, NGS data analysis. Our SUSHI instance is in productive use and has served as data analysis interface for more than 1000 data analysis projects. SUSHI source code as well as a demo server are freely available.

  17. Bite mark documentation and analysis: the forensic 3D/CAD supported photogrammetry approach.

    PubMed

    Thali, M J; Braun, M; Markwalder, Th H; Brueschweiler, W; Zollinger, U; Malik, Naseem J; Yen, K; Dirnhofer, R

    2003-08-12

    Bite mark identification is based on the individuality of a dentition, which is used to match a bite mark to a suspected perpetrator. This matching is based on a tooth-by-tooth and arch-to-arch comparison utilising parameters of size, shape and alignment. The most common methods used to analyse bite marks are carried out in 2D space. That means that the 3D information is preserved only two-dimensionally, with distortions. This paper presents a new 3D documentation, analysis and visualisation approach based on forensic 3D/CAD supported photogrammetry (FPHG) and the use of a 3D surface scanner. Our photogrammetric approach and the visualisation method used are, to the best of our knowledge, the first 3D approach for bite mark analysis in an actual case. The documentation has no distortion artifacts as can be found with standard photography. All the data are documented with metric 3D measurement, orientation and subsequent analysis in 3D space. Besides the metrical analysis between bite mark and cast, it is possible using our method to utilise the topographical 3D features of each individual tooth. This means that the 3D features of the biting surfaces and edges of each tooth are respected, which is, as shown in our case, very important, especially for the front teeth, which make the first contact with the skin. Based upon the detailed 3D representation of the cast with the 3D topographic characteristics of the teeth, the interaction with the 3D-documented skin can be visualised and analysed on the computer screen.

  18. Analysis of Documents Published in Scopus Database on Foreign Language Learning through Mobile Learning: A Content Analysis

    ERIC Educational Resources Information Center

    Uzunboylu, Huseyin; Genc, Zeynep

    2017-01-01

    The purpose of this study is to determine the recent trends in foreign language learning through mobile learning. The study was conducted employing document analysis and related content analysis among the qualitative research methodology. Through the search conducted on Scopus database with the key words "mobile learning and foreign language…

  19. Hurricane Sandy: observations and analysis of coastal change

    USGS Publications Warehouse

    Sopkin, Kristin L.; Stockdon, Hilary F.; Doran, Kara S.; Plant, Nathaniel G.; Morgan, Karen L.M.; Guy, Kristy K.; Smith, Kathryn E.L.

    2014-01-01

    Hurricane Sandy, the largest Atlantic hurricane on record, made landfall on October 29, 2012, and impacted a long swath of the U.S. Atlantic coastline. The barrier islands were breached in a number of places and beach and dune erosion occurred along most of the Mid-Atlantic coast. As a part of the National Assessment of Coastal Change Hazards project, the U.S. Geological Survey collected post-Hurricane Sandy oblique aerial photography and lidar topographic surveys to document the changes that occurred as a result of the storm. Comparisons of post-storm photographs to those collected prior to Sandy’s landfall were used to characterize the nature, magnitude, and spatial variability of hurricane-induced coastal changes. Analysis of pre- and post-storm lidar elevations was used to quantify magnitudes of change in shoreline position, dune elevation, and beach width. Erosion was observed along the coast from North Carolina to New York; however, as would be expected over such a large region, extensive spatial variability in storm response was observed.

  20. Swarm Intelligence in Text Document Clustering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Xiaohui; Potok, Thomas E

    2008-01-01

    Social animals or insects in nature often exhibit a form of emergent collective behavior. The research field that attempts to design algorithms or distributed problem-solving devices inspired by the collective behavior of social insect colonies is called Swarm Intelligence. Compared to traditional algorithms, swarm algorithms are usually flexible, robust, decentralized and self-organized. These characteristics make swarm algorithms suitable for solving complex problems, such as document collection clustering. The major challenge of today's information society is being overwhelmed with information on any topic being searched for. Fast and high-quality document clustering algorithms play an important role in helping users to effectively navigate, summarize, and organize the overwhelming information. In this chapter, we introduce three nature-inspired swarm intelligence clustering approaches for document clustering analysis. These clustering algorithms use stochastic and heuristic principles discovered from observing bird flocks, fish schools and ant food foraging.

  1. Centroid-Based Document Classification Algorithms: Analysis & Experimental Results

    DTIC Science & Technology

    2000-03-06

    [Abstract not indexed. The available full-text fragments of this report discuss centroid-based classification of news documents, for example separating political stories (frequent terms such as "Clinton" and "Lewinsky") from sports stories (terms such as "baseball", "football", "basketball", "Olympics"), and list the top-weighted stemmed terms associated with individual centroids.]
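
    Centroid-based document classification of the kind analyzed in this report can be demonstrated with scikit-learn's NearestCentroid over TF-IDF features; the toy documents below are invented and echo the report's politics-versus-sports example.

```python
# Minimal centroid-based text classification (invented toy documents): each class is
# represented by the mean of its TF-IDF vectors; a new document gets the nearest centroid.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestCentroid
from sklearn.pipeline import make_pipeline

train_docs = [
    "clinton lewinsky hearing testimony",       # politics
    "senate vote clinton impeachment",          # politics
    "baseball pitcher world series win",        # sports
    "football quarterback touchdown olympics",  # sports
]
labels = ["politics", "politics", "sports", "sports"]

clf = make_pipeline(TfidfVectorizer(), NearestCentroid())
clf.fit(train_docs, labels)
print(clf.predict(["football game tonight"]))   # -> ['sports']
```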

  2. 75 FR 24718 - Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the...

  3. Revenue-Generating Language Programs at Canadian Post-Secondary Institutions: Emerging Themes from a Documentation Analysis

    ERIC Educational Resources Information Center

    Eaton, Sarah Elaine; Goddard, J. Tim

    2008-01-01

    This presentation identifies emerging themes in a study combining documentation analysis (Atkinson & Coffey, 2004) and interviews that examine policy statements, promotional materials and various institutional documents from selected English as a Second Language (ESL) programs at one Canadian University. It looks at how and why ESL programs…

  4. Document boundary determination using structural and lexical analysis

    NASA Astrophysics Data System (ADS)

    Taghva, Kazem; Cartright, Marc-Allen

    2009-01-01

    The document boundary determination problem is the process of identifying individual documents in a stack of papers. In this paper, we report on a classification system for automation of this process. The system employs features based on document structure and lexical content. We also report on experimental results to support the effectiveness of this system.

  5. Document co-citation analysis to enhance transdisciplinary research

    PubMed Central

    Trujillo, Caleb M.; Long, Tammy M.

    2018-01-01

    Specialized and emerging fields of research infrequently cross disciplinary boundaries and would benefit from frameworks, methods, and materials informed by other fields. Document co-citation analysis, a method developed by bibliometric research, is demonstrated as a way to help identify key literature for cross-disciplinary ideas. To illustrate the method in a useful context, we mapped peer-recognized scholarship related to systems thinking. In addition, three procedures for validation of co-citation networks are proposed and implemented. This method may be useful for strategically selecting information that can build consilience about ideas and constructs that are relevant across a range of disciplines. PMID:29308433
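
    A minimal version of document co-citation counting is shown below; the reference identifiers are invented, and a tool such as networkx could turn the resulting pair counts into a weighted network.

```python
# Minimal co-citation counting (toy reference lists): two works are co-cited whenever
# they appear together in the same paper's reference list; counts give edge weights.
from itertools import combinations
from collections import Counter

reference_lists = [                      # one list of cited works per citing paper
    ["Forrester1961", "Meadows1972", "Senge1990"],
    ["Meadows1972", "Senge1990"],
    ["Senge1990", "Sterman2000", "Meadows1972"],
]

co_citations = Counter()
for refs in reference_lists:
    for a, b in combinations(sorted(set(refs)), 2):
        co_citations[(a, b)] += 1

for pair, weight in co_citations.most_common():
    print(pair, weight)   # e.g. ('Meadows1972', 'Senge1990') 3
```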

  6. Security analysis for biometric data in ID documents

    NASA Astrophysics Data System (ADS)

    Schimke, Sascha; Kiltz, Stefan; Vielhauer, Claus; Kalker, Ton

    2005-03-01

    In this paper we analyze chances and challenges with respect to the security of using biometrics in ID documents. We identify goals for ID documents set by national and international authorities and discuss the degree of security that is obtainable with the inclusion of biometrics into documents such as passports. Starting from classical techniques for manual authentication of ID card holders, we expand our view towards automatic methods based on biometrics. We do so by reviewing different human biometric attributes by modality, as well as by discussing possible techniques for storing and handling the particular biometric data on the document. Further, we explore possible vulnerabilities of potential biometric passport systems. Based on the findings of that discussion, we expand upon two exemplary approaches for including digital biometric data in the context of ID documents and present potential risks and attack scenarios along with technical aspects such as capacity and robustness.

  7. An historical document analysis of the introduction of the Baby Friendly Hospital Initiative into the Australian setting.

    PubMed

    Atchan, Marjorie; Davis, Deborah; Foureur, Maralyn

    2017-02-01

    Breastfeeding has many known benefits, yet its support across Australian health systems was suboptimal throughout the 20th century. The World Health Organization launched a global health promotion strategy to help create a 'breastfeeding culture'. Research on the programme has revealed multiple barriers since implementation. To analyse the sociopolitical challenges associated with implementing a global programme in a national setting via an examination of the influences on the early period of implementation of the Baby Friendly Hospital Initiative in Australia. A focused historical document analysis was undertaken as part of an instrumental case study. A purposeful sampling strategy obtained a comprehensive sample of public and private documents related to the introduction of the BFHI in Australia. Analysis was informed by a 'documents as commentary' approach to gain insight into individual and collective social practices not otherwise observable. Four major themes were identified: "a breastfeeding culture"; "resource implications"; "ambivalent support for breastfeeding and the BFHI"; and "business versus advocacy". "A breastfeeding culture" included several subthemes. No tangible support for breastfeeding generally, or the Baby Friendly Hospital Initiative specifically, was identified. Australian policy did not follow international recommendations. There were no financial or policy incentives for BFHI implementation. Key stakeholders' decisions negatively impacted the Baby Friendly Hospital Initiative at a crucial time in its implementation in Australia. The potential impact of the programme was not realised, representing a missed opportunity to establish and provide sustainable, standardised breastfeeding support to Australian women and their families. Copyright © 2016 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  8. Rapid Exploitation and Analysis of Documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buttler, D J; Andrzejewski, D; Stevens, K D

    Analysts are overwhelmed with information. They have large archives of historical data, both structured and unstructured, and continuous streams of relevant messages and documents that they need to match to current tasks, digest, and incorporate into their analysis. The purpose of the READ project is to develop technologies to make it easier to catalog, classify, and locate relevant information. We approached this task from multiple angles. First, we tackle the issue of processing large quantities of information in reasonable time. Second, we provide mechanisms that allow users to customize their queries based on latent topics exposed from corpus statistics. Third, we assist users in organizing query results, adding localized expert structure over results. Fourth, we use word sense disambiguation techniques to increase the precision of matching user-generated keyword lists with terms and concepts in the corpus. Fifth, we enhance co-occurrence statistics with latent topic attribution, to aid entity relationship discovery. Finally, we quantitatively analyze the quality of three popular latent modeling techniques to examine under which circumstances each is useful.
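
    One ingredient mentioned above, latent topics exposed from corpus statistics, can be sketched with standard tools; the snippet below uses scikit-learn's LDA on invented toy documents and is not the READ project's actual pipeline.

```python
# Sketch of latent topic exposure with standard tools (not the READ pipeline):
# fit LDA on a tiny toy corpus and print the top words per topic.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "oil pipeline shipment customs port",
    "bank transfer account wire funds",
    "port cargo shipment manifest customs",
    "funds account laundering bank transfer",
]

vec = CountVectorizer()
X = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[-4:][::-1]
    print(f"topic {k}:", ", ".join(terms[i] for i in top))
```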

  9. Textual blocks rectification method based on fast Hough transform analysis in identity documents recognition

    NASA Astrophysics Data System (ADS)

    Bezmaternykh, P. V.; Nikolaev, D. P.; Arlazarov, V. L.

    2018-04-01

    Textual blocks rectification or slant correction is an important stage of document image processing in OCR systems. This paper considers existing methods and introduces an approach for the construction of such algorithms based on Fast Hough Transform analysis. A quality measurement technique is proposed and obtained results are shown for both printed and handwritten textual blocks processing as a part of an industrial system of identity documents recognition on mobile devices.
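
    A simple deskewing routine in this spirit can be written with OpenCV's standard Hough transform as a stand-in for the Fast Hough Transform used in the paper; threshold values and the angle filter below are illustrative assumptions.

```python
# Hedged sketch: estimate and correct the slant of a text block using OpenCV's standard
# Hough transform (a simple stand-in for the paper's Fast Hough Transform analysis).
import cv2
import numpy as np

def deskew(gray):
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, threshold=100)
    if lines is None:
        return gray
    angles = []
    for rho, theta in lines[:, 0]:
        a = np.degrees(theta) - 90          # deviation of a detected line from horizontal
        if abs(a) < 45:                     # ignore near-vertical strokes
            angles.append(a)
    if not angles:
        return gray
    skew = float(np.median(angles))
    h, w = gray.shape
    M = cv2.getRotationMatrix2D((w / 2, h / 2), skew, 1.0)
    return cv2.warpAffine(gray, M, (w, h), flags=cv2.INTER_LINEAR,
                          borderMode=cv2.BORDER_REPLICATE)

# Usage: corrected = deskew(cv2.imread("block.png", cv2.IMREAD_GRAYSCALE))
```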

  10. Documentation: Records and Reports.

    PubMed

    Akers, Michael J

    2017-01-01

    This article deals with documentation, including how documentation begins, the requirements for Good Manufacturing Practice reports and records, and the steps that can be taken to minimize Good Manufacturing Practice documentation problems. It is important to remember that documentation for 503a compounding involves the Formulation Record, Compounding Record, Standard Operating Procedures, Safety Data Sheets, etc. For 503b outsourcing facilities, compliance with Current Good Manufacturing Practices is required, so this article is applicable to them. For 503a pharmacies, one can see the development and modification of Good Manufacturing Practice, observe changes as they occur in 503a documentation requirements, and anticipate that changes will probably continue to occur. Copyright© by International Journal of Pharmaceutical Compounding, Inc.

  11. Degraded document image enhancement

    NASA Astrophysics Data System (ADS)

    Agam, G.; Bal, G.; Frieder, G.; Frieder, O.

    2007-01-01

    Poor quality documents are obtained in various situations such as historical document collections, legal archives, security investigations, and documents found in clandestine locations. Such documents are often scanned for automated analysis, further processing, and archiving. Due to the nature of such documents, degraded document images are often hard to read, have low contrast, and are corrupted by various artifacts. We describe a novel approach for the enhancement of such documents based on probabilistic models which increases the contrast, and thus, readability of such documents under various degradations. The enhancement produced by the proposed approach can be viewed under different viewing conditions if desired. The proposed approach was evaluated qualitatively and compared to standard enhancement techniques on a subset of historical documents obtained from the Yad Vashem Holocaust museum. In addition, quantitative performance was evaluated based on synthetically generated data corrupted under various degradation models. Preliminary results demonstrate the effectiveness of the proposed approach.
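
    The authors' probabilistic enhancement model is not reproduced here; as a commonly used baseline for such material, the sketch below applies local contrast enhancement followed by Sauvola adaptive binarization with scikit-image.

```python
# Not the authors' probabilistic model: a common baseline for degraded document images,
# using local contrast enhancement (CLAHE) followed by Sauvola adaptive binarization.
from skimage import exposure, filters, io, util

def enhance(path):
    gray = util.img_as_float(io.imread(path, as_gray=True))
    boosted = exposure.equalize_adapthist(gray, clip_limit=0.02)   # local contrast boost
    threshold = filters.threshold_sauvola(boosted, window_size=25)
    return boosted > threshold                                     # binarized page

# Usage: binary = enhance("degraded_page.png")
```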

  12. 23 CFR 1340.5 - Documentation requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE § 1340.5 Documentation requirements. All sample design, data collection, and estimation procedures used in State surveys conducted in accordance with this part must be well documented. At a minimum, the documentation must: (a) For sample design— (1) Define all...

  13. 23 CFR 1340.5 - Documentation requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE § 1340.5 Documentation requirements. All sample design, data collection, and estimation procedures used in State surveys conducted in accordance with this part must be well documented. At a minimum, the documentation must: (a) For sample design— (1) Define all...

  14. Lattice algebra approach to multispectral analysis of ancient documents.

    PubMed

    Valdiviezo-N, Juan C; Urcid, Gonzalo

    2013-02-01

    This paper introduces a lattice algebra procedure that can be used for the multispectral analysis of historical documents and artworks. Assuming the presence of linearly mixed spectral pixels captured in a multispectral scene, the proposed method computes the scaled min- and max-lattice associative memories to determine the purest pixels that best represent the spectra of single pigments. The estimation of fractional proportions of pure spectra at each image pixel is used to build pigment abundance maps that can be used for subsequent restoration of damaged parts. Application examples include multispectral images acquired from the Archimedes Palimpsest and a Mexican pre-Hispanic codex.
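
    The scaled min- and max-lattice associative memories are specific to the paper; as a standard stand-in for the abundance-map step, the sketch below estimates non-negative fractional abundances per pixel by least squares given endmember (pure pigment) spectra.

```python
# Stand-in for the abundance-map step (not the paper's lattice associative memories):
# given endmember spectra for pure pigments, estimate non-negative fractional abundances
# per pixel by least squares, then reshape into per-pigment abundance maps.
import numpy as np
from scipy.optimize import nnls

def abundance_maps(cube, endmembers):
    """cube: (rows, cols, bands) multispectral image; endmembers: (n_pigments, bands)."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands)
    E = endmembers.T                                   # (bands, n_pigments)
    abund = np.array([nnls(E, p)[0] for p in pixels])  # non-negative unmixing per pixel
    abund /= abund.sum(axis=1, keepdims=True) + 1e-9    # normalise to fractional proportions
    return abund.reshape(rows, cols, endmembers.shape[0])
```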

  15. System for the Analysis of Global Energy Markets - Vol. I, Model Documentation

    EIA Publications

    2003-01-01

    Documents the objectives and the conceptual and methodological approach used in the development of projections for the International Energy Outlook. The first volume of this report describes the System for the Analysis of Global Energy Markets (SAGE) methodology and provides an in-depth explanation of the equations of the model.

  16. Shoulder dystocia documentation: an evaluation of a documentation training intervention.

    PubMed

    LeRiche, Tammy; Oppenheimer, Lawrence; Caughey, Sharon; Fell, Deshayne; Walker, Mark

    2015-03-01

    To evaluate the quality and content of nurse and physician shoulder dystocia delivery documentation before and after MORE training in shoulder dystocia management skills and documentation. Approximately 384 charts at the Ottawa Hospital General Campus involving a diagnosis of shoulder dystocia between the years of 2000 and 2006 excluding the training year of 2003 were identified. The charts were evaluated for 14 key components derived from a validated instrument. The delivery notes were then scored based on these components by 2 separate investigators who were blinded to delivery note author, date, and patient identification to further quantify delivery record quality. Approximately 346 charts were reviewed for physician and nurse delivery documentation. The average score for physician notes was 6 (maximum possible score of 14) both before and after the training intervention. The nurses' average score was 5 before and after the training intervention. Negligible improvement was observed in the content and quality of shoulder dystocia documentation before and after nurse and physician training.

  17. [Interrelationship among "NANDA, NOC and NIC". A pilot study and an evaluation of a nursing document].

    PubMed

    Gomez de Segura Navarro, Carlota; Esain Larrambe, Ainhoa; Tina Majuelo, Pilar; Guembe Ibáñez, Irene; Fernández Perea, Laura; Narvaiza Solís, M Jesús

    2006-01-01

    (a) To determine the effectiveness of a nursing document which integrates nursing diagnoses, nursing treatments/actions (NIC) and results (NOC); (b) to verify the application of this document in a hospitalization unit. A descriptive, cross-sectional, observational study. Nursing documents (NANDA, NIC and NOC taxonomies). PHASES: 1st: analysis of the content of the nursing documentation for 23 patients with pneumonia and selection of the nursing diagnoses and most frequent interdependent problems. 2nd: selection of outcomes and nursing treatments/actions. 3rd: elaboration of the document and description of the Likert scales used to define the state of the indicators for each outcome. 4th: a pilot study of the document applied to 12 patients. The application of the document makes it possible to identify the real status of a patient; to establish specific objectives; to improve the recording of data; to observe the effectiveness of treatment; to include educational activities; and to give greater continuity and quality to a treatment plan.

  18. In-service documentation tools and statements on palliative sedation in Germany--do they meet the EAPC framework recommendations? A qualitative document analysis.

    PubMed

    Stiel, Stephanie; Heckel, Maria; Christensen, Britta; Ostgathe, Christoph; Klein, Carsten

    2016-01-01

    Numerous (inter-)national guidelines and frameworks have been developed to provide recommendations for the application of palliative sedation (PS). However, they are still not widely known, and large variations in PS clinical practice can be found. This study aims to collect and describe contents from documents used in clinical practice and to compare to what extent they match the European Association for Palliative Care (EAPC) framework recommendations. In a national survey on PS in Germany in 2012, participants were asked to upload their in-service templates, assessment tools, specific protocols, and in-service statements for the application and documentation of PS. These documents are analyzed using systematic structured content analysis. Three hundred seven content units of 52 provided documents were coded. The analyzed templates are very heterogeneous and also contain items not mentioned in the EAPC framework. Among 11 scales for the evaluation of sedation level, the Ramsay Sedation Score (n = 5) and the Richmond Agitation-Sedation Scale (n = 2) were found most often. For symptom assessment, three different scales were each provided once. In all six PS statements, the common core elements were possible indications for PS, instructions on dose titration, patient monitoring, and care. Wide congruency exists for physical and psychological indications. Most documents coincide on midazolam as a preferred drug and basic monitoring at regular intervals. Aspects such as pre-emptive discussion of the potential role of sedation, informational needs of relatives, and care for the medical professionals are mentioned rarely. The analyzed templates do neglect some points of the EAPC recommendations; however, they expand the ten-point scheme of the framework in some details. The findings may facilitate the development of a standardized consensus draft for documentation and monitoring as an operational statement.

  19. Objective analysis of observational data from the FGGE observing systems

    NASA Technical Reports Server (NTRS)

    Baker, W.; Edelmann, D.; Iredell, M.; Han, D.; Jakkempudi, S.

    1981-01-01

    An objective analysis procedure for updating the GLAS second and fourth order general atmospheric circulation models using observational data from the First GARP Global Experiment (FGGE) is described. The objective analysis procedure is based on a successive corrections method, and the model is updated in a data assimilation cycle. Preparation of the observational data for analysis and the objective analysis scheme are described. The organization of the program and descriptions of the required data sets are presented. The program logic and detailed descriptions of each subroutine are given.
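
    For readers unfamiliar with successive corrections, the following minimal sketch (an assumption-laden Python illustration, not the GLAS code) shows the core idea: the gridded background field is corrected in repeated scans, each scan spreading observation-minus-analysis increments to nearby grid points with distance-dependent weights over a shrinking influence radius.

    ```python
    # Minimal single-variable successive-corrections update; grid geometry,
    # radii and the Cressman-type weights are illustrative assumptions.
    import numpy as np

    def successive_corrections(grid, grid_xy, obs, obs_xy, radii=(5.0, 3.0, 1.5)):
        """grid: (m,) background values at grid points grid_xy (m, 2);
        obs: (p,) observed values at obs_xy (p, 2). Returns the corrected grid."""
        analysis = grid.copy()
        d_og = np.linalg.norm(obs_xy[:, None, :] - grid_xy[None, :, :], axis=2)
        nearest = d_og.argmin(axis=1)                     # nearest grid point per observation
        d_go = np.linalg.norm(grid_xy[:, None, :] - obs_xy[None, :, :], axis=2)
        for R in radii:                                   # successively smaller influence radii
            increments = obs - analysis[nearest]          # observation-minus-analysis at obs points
            w = np.clip((R**2 - d_go**2) / (R**2 + d_go**2), 0.0, None)
            wsum = np.maximum(w.sum(axis=1), 1e-12)
            analysis += (w * increments).sum(axis=1) / wsum
        return analysis
    ```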

  20. Systematic documentation and analysis of human genetic variation in hemoglobinopathies using the microattribution approach.

    PubMed

    Giardine, Belinda; Borg, Joseph; Higgs, Douglas R; Peterson, Kenneth R; Philipsen, Sjaak; Maglott, Donna; Singleton, Belinda K; Anstee, David J; Basak, A Nazli; Clark, Barnaby; Costa, Flavia C; Faustino, Paula; Fedosyuk, Halyna; Felice, Alex E; Francina, Alain; Galanello, Renzo; Gallivan, Monica V E; Georgitsi, Marianthi; Gibbons, Richard J; Giordano, Piero C; Harteveld, Cornelis L; Hoyer, James D; Jarvis, Martin; Joly, Philippe; Kanavakis, Emmanuel; Kollia, Panagoula; Menzel, Stephan; Miller, Webb; Moradkhani, Kamran; Old, John; Papachatzopoulou, Adamantia; Papadakis, Manoussos N; Papadopoulos, Petros; Pavlovic, Sonja; Perseu, Lucia; Radmilovic, Milena; Riemer, Cathy; Satta, Stefania; Schrijver, Iris; Stojiljkovic, Maja; Thein, Swee Lay; Traeger-Synodinos, Jan; Tully, Ray; Wada, Takahito; Waye, John S; Wiemann, Claudia; Zukic, Branka; Chui, David H K; Wajcman, Henri; Hardison, Ross C; Patrinos, George P

    2011-03-20

    We developed a series of interrelated locus-specific databases to store all published and unpublished genetic variation related to hemoglobinopathies and thalassemia and implemented microattribution to encourage submission of unpublished observations of genetic variation to these public repositories. A total of 1,941 unique genetic variants in 37 genes, encoding globins and other erythroid proteins, are currently documented in these databases, with reciprocal attribution of microcitations to data contributors. Our project provides the first example of implementing microattribution to incentivise submission of all known genetic variation in a defined system. It has demonstrably increased the reporting of human variants, leading to a comprehensive online resource for systematically describing human genetic variation in the globin genes and other genes contributing to hemoglobinopathies and thalassemias. The principles established here will serve as a model for other systems and for the analysis of other common and/or complex human genetic diseases.

  1. Creating history: documents and patient participation in nurse-patient interviews.

    PubMed

    Jones, Aled

    2009-09-01

    Strongly worded directives regarding the need for increased patient participation during nursing interaction with patients have recently appeared in a range of 'best-practice' documents. This paper focuses on one area of nurse-patient communication, the hospital admission interview, which has been put forward as an ideal arena for increased patient participation. It uses data from a total of 27 admission interviews, extensive periods of participant observation and analysis of nursing records to examine how hospital admission interviews are performed by nurses and patients. Analysis shows that topics discussed during admission closely follow the layout of the admission document which nurses complete during the interview. Whilst it is tempting to describe the admission document as a 'super technological power' in influencing the interaction and restricting patient participation, this analysis attempts a more rounded reading of the data. Findings demonstrate that, whilst opportunities for patient participation were rare, admission interviews are complex interactional episodes that often belie simplistic or prescriptive guidance regarding interaction between nurses and patients. In particular, issue is taken with the lack of contextual and conceptual clarity with which best-practice guidelines are written.

  2. Computer program for design and performance analysis of navigation-aid power systems. Program documentation. Volume 1: Software requirements document

    NASA Technical Reports Server (NTRS)

    Goltz, G.; Kaiser, L. M.; Weiner, H.

    1977-01-01

    A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U.S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document establishes the software requirements for the DSPA computer program, discusses the processing that occurs within the program, and defines the necessary interfaces for operation.

  3. Document similarity measures and document browsing

    NASA Astrophysics Data System (ADS)

    Ahmadullin, Ildus; Fan, Jian; Damera-Venkata, Niranjan; Lim, Suk Hwan; Lin, Qian; Liu, Jerry; Liu, Sam; O'Brien-Strain, Eamonn; Allebach, Jan

    2011-03-01

    Managing large document databases is an important task today. Being able to automatically compare document layouts and classify and search documents with respect to their visual appearance proves to be desirable in many applications. We measure single page documents' similarity with respect to distance functions between three document components: background, text, and saliency. Each document component is represented as a Gaussian mixture distribution; and distances between different documents' components are calculated as probabilistic similarities between corresponding distributions. The similarity measure between documents is represented as a weighted sum of the components' distances. Using this document similarity measure, we propose a browsing mechanism operating on a document dataset. For these purposes, we use a hierarchical browsing environment which we call the document similarity pyramid. It allows the user to browse a large document dataset and to search for documents in the dataset that are similar to the query. The user can browse the dataset on different levels of the pyramid, and zoom into the documents that are of interest.
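
    A hedged sketch of the weighted-sum similarity idea follows. For brevity each layout component (background, text, saliency) is summarized here by a single Gaussian rather than the full mixture used in the paper, and compared with a symmetric KL divergence; the feature arrays and weights are placeholders, not the authors' implementation.

    ```python
    # Illustrative distance between two documents as a weighted sum of
    # per-component Gaussian distances (single Gaussian stands in for a mixture).
    import numpy as np

    def gaussian_fit(features):
        """features: (n, d) samples for one component of one document."""
        mu = features.mean(axis=0)
        cov = np.cov(features, rowvar=False) + 1e-6 * np.eye(features.shape[1])
        return mu, cov

    def sym_kl(g1, g2):
        """Symmetric KL divergence between two Gaussians given as (mu, cov)."""
        def kl(a, b):
            mu_a, S_a = a
            mu_b, S_b = b
            d = mu_a.size
            S_b_inv = np.linalg.inv(S_b)
            diff = mu_b - mu_a
            return 0.5 * (np.trace(S_b_inv @ S_a) + diff @ S_b_inv @ diff
                          - d + np.log(np.linalg.det(S_b) / np.linalg.det(S_a)))
        return kl(g1, g2) + kl(g2, g1)

    def document_distance(doc_a, doc_b,
                          weights={"background": 0.3, "text": 0.5, "saliency": 0.2}):
        """doc_*: dict mapping component name -> (n, d) feature array."""
        return sum(w * sym_kl(gaussian_fit(doc_a[c]), gaussian_fit(doc_b[c]))
                   for c, w in weights.items())
    ```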

  4. Equity in public health standards: a qualitative document analysis of policies from two Canadian provinces

    PubMed Central

    2012-01-01

    Introduction Promoting health equity is a key goal of many public health systems. However, little is known about how equity is conceptualized in such systems, particularly as standards of public health practice are established. As part of a larger study examining the renewal of public health in two Canadian provinces, Ontario and British Columbia (BC), we undertook an analysis of relevant public health documents related to equity. The aim of this paper is to discuss how equity is considered within documents that outline standards for public health. Methods A research team consisting of policymakers and academics identified key documents related to the public health renewal process in each province. The documents were analyzed using constant comparative analysis to identify key themes related to the conceptualization and integration of health equity as part of public health renewal in Ontario and BC. Documents were coded inductively with higher levels of abstraction achieved through multiple readings. Sets of questions were developed to guide the analysis throughout the process. Results In both sets of provincial documents health inequities were defined in a similar fashion, as the consequence of unfair or unjust structural conditions. Reducing health inequities was an explicit goal of the public health renewal process. In Ontario, addressing “priority populations” was used as a proxy term for health equity and the focus was on existing programs. In BC, the incorporation of an equity lens enhanced the identification of health inequities, with a particular emphasis on the social determinants of health. In both, priority was given to reducing barriers to public health services and to forming partnerships with other sectors to reduce health inequities. Limits to the accountability of public health to reduce health inequities were identified in both provinces. Conclusion This study contributes to understanding how health equity is conceptualized and incorporated

  5. Leveraging electronic health record documentation for Failure Mode and Effects Analysis team identification

    PubMed Central

    Carson, Matthew B; Lee, Young Ji; Benacka, Corrine; Mutharasan, R. Kannan; Ahmad, Faraz S; Kansal, Preeti; Yancy, Clyde W; Anderson, Allen S; Soulakis, Nicholas D

    2017-01-01

    Objective: Using Failure Mode and Effects Analysis (FMEA) as an example quality improvement approach, our objective was to evaluate whether secondary use of orders, forms, and notes recorded by the electronic health record (EHR) during daily practice can enhance the accuracy of process maps used to guide improvement. We examined discrepancies between expected and observed activities and individuals involved in a high-risk process and devised diagnostic measures for understanding discrepancies that may be used to inform quality improvement planning. Methods: Inpatient cardiology unit staff developed a process map of discharge from the unit. We matched activities and providers identified on the process map to EHR data. Using four diagnostic measures, we analyzed discrepancies between expectation and observation. Results: EHR data showed that 35% of activities were completed by unexpected providers, including providers from 12 categories not identified as part of the discharge workflow. The EHR also revealed sub-components of process activities not identified on the process map. Additional information from the EHR was used to revise the process map and show differences between expectation and observation. Conclusion: Findings suggest EHR data may reveal gaps in process maps used for quality improvement and identify characteristics about workflow activities that can identify perspectives for inclusion in an FMEA. Organizations with access to EHR data may be able to leverage clinical documentation to enhance process maps used for quality improvement. While focused on FMEA protocols, findings from this study may be applicable to other quality activities that require process maps. PMID:27589944

  6. 43 CFR 46.140 - Using tiered documents.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... documents. A NEPA document that tiers to another broader NEPA document in accordance with 40 CFR 1508.28 must include a finding that the conditions and environmental effects described in the broader NEPA... identified and analyzed in the broader NEPA document, no further analysis is necessary, and the previously...

  7. 43 CFR 46.140 - Using tiered documents.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... documents. A NEPA document that tiers to another broader NEPA document in accordance with 40 CFR 1508.28 must include a finding that the conditions and environmental effects described in the broader NEPA... identified and analyzed in the broader NEPA document, no further analysis is necessary, and the previously...

  8. 43 CFR 46.140 - Using tiered documents.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... documents. A NEPA document that tiers to another broader NEPA document in accordance with 40 CFR 1508.28 must include a finding that the conditions and environmental effects described in the broader NEPA... identified and analyzed in the broader NEPA document, no further analysis is necessary, and the previously...

  9. 43 CFR 46.140 - Using tiered documents.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... documents. A NEPA document that tiers to another broader NEPA document in accordance with 40 CFR 1508.28 must include a finding that the conditions and environmental effects described in the broader NEPA... identified and analyzed in the broader NEPA document, no further analysis is necessary, and the previously...

  10. 43 CFR 46.140 - Using tiered documents.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... documents. A NEPA document that tiers to another broader NEPA document in accordance with 40 CFR 1508.28 must include a finding that the conditions and environmental effects described in the broader NEPA... identified and analyzed in the broader NEPA document, no further analysis is necessary, and the previously...

  11. Hurricane Isaac: observations and analysis of coastal change

    USGS Publications Warehouse

    Guy, Kristy K.; Stockdon, Hilary F.; Plant, Nathaniel G.; Doran, Kara S.; Morgan, Karen L.M.

    2013-01-01

    , airborne light detection and ranging (lidar) topographic surveys, and ground-based topographic surveys. This report documents data-collection efforts and presents qualitative and quantitative descriptions of hurricane-induced changes to the shoreline, beaches, dunes, and infrastructure in the region that was heavily impacted by Hurricane Isaac. The report is divided into the following sections: Section 1: Introduction Section 2: Storm Overview, presents a synopsis of the storm, including meteorological evolution, wind speed impact area, wind-wave generation, and storm-surge extent and magnitudes. Section 3: Coastal-Change Observations, describes data-collection missions, including acquisition of oblique aerial photography and airborne lidar topographic surveys, in response to Hurricane Isaac. Section 4: Coastal-Change Analysis, describes data-analysis methods and observations of coastal change.

  12. Comparison of historical documents for writership

    NASA Astrophysics Data System (ADS)

    Ball, Gregory R.; Pu, Danjun; Stritmatter, Roger; Srihari, Sargur N.

    2010-01-01

    Over the last century forensic document science has developed progressively more sophisticated pattern recognition methodologies for ascertaining the authorship of disputed documents. These include advances not only in computer assisted stylometrics, but forensic handwriting analysis. We present a writer verification method and an evaluation of an actual historical document written by an unknown writer. The questioned document is compared against two known handwriting samples of Herman Melville, a 19th century American author who has been hypothesized to be the writer of this document. The comparison led to a high confidence result that the questioned document was written by the same writer as the known documents. Such methodology can be applied to many such questioned documents in historical writing, both in literary and legal fields.

  13. NACA documents database project

    NASA Technical Reports Server (NTRS)

    Smith, Ruth S.

    1991-01-01

    The plan to get the entire National Advisory Committee for Aeronautics (NACA) collection online, with quality records, led to the NACA Documents Database Project. The project has a twofold purpose: (1) to develop the definitive bibliography of NACA produced and/or held documents; and (2) to make that bibliography and the associated documents available to the aerospace community. This study supports the first objective by providing an analysis of the NACA collection and its bibliographic records, and supports the second objective by defining the NACA archive and recommending methodologies for meeting the project objectives.

  14. Technical report series on global modeling and data assimilation. Volume 1: Documentation of the Goddard Earth Observing System (GEOS) General Circulation Model, version 1

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); Takacs, Lawrence L.; Molod, Andrea; Wang, Tina

    1994-01-01

    This technical report documents Version 1 of the Goddard Earth Observing System (GEOS) General Circulation Model (GCM). The GEOS-1 GCM is being used by NASA's Data Assimilation Office (DAO) to produce multiyear data sets for climate research. This report provides a documentation of the model components used in the GEOS-1 GCM, a complete description of model diagnostics available, and a User's Guide to facilitate GEOS-1 GCM experiments.

  15. Analysis of Critical Earth Observation Priorities for Societal Benefit

    NASA Astrophysics Data System (ADS)

    Zell, E. R.; Huff, A. K.; Carpenter, A. T.; Friedl, L.

    2011-12-01

    To ensure that appropriate near real-time (NRT) and historical Earth observation data are available to benefit society and meet end-user needs, the Group on Earth Observations (GEO) sponsored a multi-disciplinary study to identify a set of critical and common Earth observations associated with 9 Societal Benefit Areas (SBAs): Agriculture, Biodiversity, Climate, Disasters, Ecosystems, Energy, Health, Water, and Weather. GEO is an intergovernmental organization working to improve the availability, access, and use of Earth observations to benefit society through a Global Earth Observation System of Systems (GEOSS). The study, overseen by the GEO User Interface Committee, focused on the "demand" side of Earth observation needs: which users need what types of data, and when? The methodology for the study was a meta-analysis of over 1,700 publicly available documents addressing Earth observation user priorities, under the guidance of expert advisors from around the world. The result was a ranking of 146 Earth observation parameters that are critical and common to multiple SBAs, based on an ensemble of 4 statistically robust methods. Within the results, key details emerged on NRT observations needed to serve a broad community of users. The NRT observation priorities include meteorological parameters, vegetation indices, land cover and soil property observations, water body and snow cover properties, and atmospheric composition. The results of the study and examples of NRT applications will be presented. The applications are as diverse as the list of priority parameters. For example, NRT meteorological and soil moisture information can support monitoring and forecasting for more than 25 infectious diseases, including epidemic diseases, such as malaria, and diseases of major concern in the U.S., such as Lyme disease. Quickly evolving events that impact forests, such as fires and insect outbreaks, can be monitored and forecasted with a combination of vegetation indices, fuel

  16. Vital sign documentation in electronic records: The development of workarounds.

    PubMed

    Stevenson, Jean E; Israelsson, Johan; Nilsson, Gunilla; Petersson, Goran; Bath, Peter A

    2018-06-01

    Workarounds are commonplace in healthcare settings. An increase in the use of electronic health records has led to an escalation of workarounds as healthcare professionals cope with systems which are inadequate for their needs. Closely related to this, the documentation of vital signs in electronic health records has been problematic. The accuracy and completeness of vital sign documentation has a direct impact on the recognition of deterioration in a patient's condition. We examined workflow processes to identify workarounds related to vital signs in a 372-bed hospital in Sweden. In three clinical areas, a qualitative study was performed with data collected during observations and interviews and analysed through thematic content analysis. We identified paper workarounds in the form of handwritten notes and a total of eight pre-printed paper observation charts. Our results suggested that nurses created workarounds to allow a smooth workflow and ensure patient safety.

  17. A general concept for consistent documentation of computational analyses

    PubMed Central

    Müller, Fabian; Nordström, Karl; Lengauer, Thomas; Schulz, Marcel H.

    2015-01-01

    The ever-growing amount of data in the field of life sciences demands standardized ways of high-throughput computational analysis. This standardization requires a thorough documentation of each step in the computational analysis to enable researchers to understand and reproduce the results. However, due to the heterogeneity in software setups and the high rate of change during tool development, reproducibility is hard to achieve. One reason is that there is no common agreement in the research community on how to document computational studies. In many cases, simple flat files or other unstructured text documents are provided by researchers as documentation, which are often missing software dependencies, versions and sufficient documentation to understand the workflow and parameter settings. As a solution we suggest a simple and modest approach for documenting and verifying computational analysis pipelines. We propose a two-part scheme that defines a computational analysis using a Process and an Analysis metadata document, which jointly describe all necessary details to reproduce the results. In this design we separate the metadata specifying the process from the metadata describing an actual analysis run, thereby reducing the effort of manual documentation to an absolute minimum. Our approach is independent of a specific software environment, results in human readable XML documents that can easily be shared with other researchers and allows an automated validation to ensure consistency of the metadata. Because our approach has been designed with little to no assumptions concerning the workflow of an analysis, we expect it to be applicable in a wide range of computational research fields. Database URL: http://deep.mpi-inf.mpg.de/DAC/cmds/pub/pyvalid.zip PMID:26055099
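
    The two-part scheme is easier to picture with a toy example. The snippet below generates a minimal Process document and a matching Analysis document with Python's standard xml.etree module; the element names (process, step, analysis, input, output, parameter) are hypothetical stand-ins, not the authors' schema.

    ```python
    # Hypothetical two-part metadata: a reusable Process description and a
    # concrete Analysis run; element names are invented for illustration.
    import xml.etree.ElementTree as ET

    def process_doc(name, steps):
        """The reusable pipeline description: ordered tools with their versions."""
        proc = ET.Element("process", name=name)
        for tool, version in steps:
            ET.SubElement(proc, "step", tool=tool, version=version)
        return ET.ElementTree(proc)

    def analysis_doc(process_name, inputs, outputs, parameter_values):
        """One concrete run of that process: data in, data out, parameter values used."""
        run = ET.Element("analysis", process=process_name)
        for path in inputs:
            ET.SubElement(run, "input").text = path
        for path in outputs:
            ET.SubElement(run, "output").text = path
        for k, v in parameter_values.items():
            ET.SubElement(run, "parameter", name=k).text = str(v)
        return ET.ElementTree(run)

    process_doc("read-mapping", [("bwa", "0.7.17")]).write("process.xml")
    analysis_doc("read-mapping", ["sample1.fastq"], ["sample1.bam"], {"threads": 8}).write("analysis.xml")
    ```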

  18. Applying a sociolinguistic model to the analysis of informed consent documents.

    PubMed

    Granero-Molina, José; Fernández-Sola, Cayetano; Aguilera-Manrique, Gabriel

    2009-11-01

    Information on the risks and benefits related to surgical procedures is essential for patients in order to obtain their informed consent. Some disciplines, such as sociolinguistics, offer insights that are helpful for patient-professional communication in both written and oral consent. Communication difficulties become more acute when patients make decisions through an informed consent document because they may sign this with a lack of understanding and information, and consequently feel deprived of their freedom to make their choice about different treatments or surgery. This article discusses findings from documentary analysis using the sociolinguistic SPEAKING model, which was applied to the general and specific informed consent documents required for laparoscopic surgery of the bile duct at Torrecárdenas Hospital, Almería, Spain. The objective of this procedure was to identify flaws when information was provided, together with its readability, its voluntary basis, and patients' consent. The results suggest potential linguistic communication difficulties, different languages being used, cultural clashes, asymmetry of communication between professionals and patients, assignment of rights on the part of patients, and overprotection of professionals and institutions.

  19. Pretest analysis document for Test S-NH-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owca, W.A.

    This report documents the pretest analysis calculation completed with the RELAP5/MOD2/CY3601 code for Semiscale MOD-2C Test S-NH-1. The test will simulate the shear of a small diameter penetration of a cold leg, equivalent to 0.5% of the cold leg flow area. The high pressure injection system is assumed to be inoperative throughout the transient. The recovery procedure consists of latching open both steam generator ADVs while feeding with auxiliary feedwater, and accumulator operation. Recovery will be initiated upon a peak cladding temperature of 811 K (1000 °F). The test will be terminated when primary pressure has been reduced to the low pressure injection system setpoint of 1.38 MPa (200 psia). The calculated results indicate that the test objectives can be achieved and the proposed test scenario poses no threat to personnel or to plant integrity. 12 figs.

  20. Qualitative analysis of national documents on health care services and pharmaceuticals' purchasing challenges: evidence from Iran.

    PubMed

    Bastani, Peivand; Samadbeik, Mahnaz; Dinarvand, Rassoul; Kashefian-Naeeini, Sara; Vatankhah, Soudabeh

    2018-06-05

    The Iranian health sector has encountered many challenges in resource allocation and health service purchasing during the past decades; the aim of this study was to determine the main challenges of the current health service purchasing process for national policymakers and for other developing countries with a similar setting. It was a qualitative study carried out via complete content analysis of all relevant national documents from 2007 to 2014. To retrieve the related documents, we searched the official websites of the Ministry of Health and Medical Education, the four main Iranian insurance organizations, the Health Committee of the Parliament profile, the strategic vice president's site and the Supreme Insurance Council. After identification of the documents, their credibility and authenticity were evaluated in terms of their publication or adjustment. For the analysis of the documents, the four-step Scott method was applied using MAXQDA version 10. Findings showed that health service purchasing challenges in the country can be classified into 6 main themes of policy-making, executive, intersectional, natural, legal and informational challenges, with 26 subthemes. Furthermore, 5 themes of Basic Benefit Package, Reimbursement, Decision making, Technology and Contract are considered the main challenges in the pharmaceutical purchasing area, containing 13 relevant subthemes. According to the documents, Iran has faced many structural and procedural problems with purchasing the best health interventions. It is therefore highly recommended that policymakers consider the consequences of the present challenges and use this evidence in their policymaking process to reduce the existing problems and move towards better procurement of health interventions.

  1. Pilot production system cost/benefit analysis: Digital document storage project

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Digital Document Storage (DDS)/Pilot Production System (PPS) will provide cost effective electronic document storage, retrieval, hard copy reproduction, and remote access for users of NASA Technical Reports. The DDS/PPS will result in major benefits, such as improved document reproduction quality within a shorter time frame than is currently possible. In addition, the DDS/PPS will provide an important strategic value through the construction of a digital document archive. It is highly recommended that NASA proceed with the DDS Prototype System and a rapid prototyping development methodology in order to validate recent working assumptions upon which the success of the DDS/PPS is dependent.

  2. Standardizing Documentation of FITS Headers

    NASA Astrophysics Data System (ADS)

    Hourcle, Joseph

    2014-06-01

    Although the FITS file format[1] can be self-documenting, human intervention is often needed to read the headers and to write the necessary transformations to make a given instrument team's data compatible with our preferred analysis package. External documentation may be needed to determine the meanings of coded values or unfamiliar acronyms. Different communities have interpreted keywords slightly differently. This has resulted in ambiguous fields such as DATE-OBS, which could be either the start or mid-point of an observation.[2] Conventions for placing units and additional information within the comments of a FITS card exist, but they require re-writing the FITS file. This operation can be quite costly for large archives, and should not be taken lightly when dealing with issues of digital preservation. We present what we believe is needed for a machine-actionable external file describing a given collection of FITS files. We seek comments from data producers, archives, and those writing software to help develop a single, useful, implementable standard. References: [1] Pence et al. 2010, http://dx.doi.org/10.1051/0004-6361/201015362 [2] Rots et al. (in preparation), http://hea-www.cfa.harvard.edu arots/TimeWCS/
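
    As a small illustration of the header reading and keyword ambiguity discussed above, the snippet below uses astropy to pull DATE-OBS and its comment from a FITS file; the file name is a placeholder, and how the value should be interpreted (start versus mid-point) still depends on the data producer's convention.

    ```python
    # Read DATE-OBS and its comment from a FITS primary header (file name is a placeholder).
    from astropy.io import fits

    with fits.open("example.fits") as hdul:
        hdr = hdul[0].header
        date_obs = hdr.get("DATE-OBS")                       # start of observation? mid-point?
        comment = hdr.comments["DATE-OBS"] if "DATE-OBS" in hdr else ""
        print(date_obs, comment)                             # interpretation still needs documentation
    ```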

  3. Documents of the JPL Photovoltaics Program Analysis and Integration Center: An annotated bibliography

    NASA Technical Reports Server (NTRS)

    Pearson, A. M.

    1985-01-01

    A bibliography of internal and external documents produced by the Jet Propulsion Laboratory, based on the work performed by the Photovoltaics Program Analysis and Integration Center, is presented with annotations. As shown in the Table of Contents, the bibliography is divided into three subject areas: (1) Assessments, (2) Methodological Studies, and (3) Supporting Studies. Annotated abstracts are presented for 20 papers.

  4. 32 CFR 989.23 - Contractor prepared documents.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Contractor prepared documents. 989.23 Section... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.23 Contractor prepared documents. All Air Force... should reflect on the cover sheet they are an Air Force document. Contractor preparation information...

  5. The analysis of a complex fire event using multispaceborne observations

    NASA Astrophysics Data System (ADS)

    Andrei, Simona; Carstea, Emil; Marmureanu, Luminita; Ene, Dragos; Binietoglou, Ioannis; Nicolae, Doina; Konsta, Dimitra; Amiridis, Vassilis; Proestakis, Emmanouil

    2018-04-01

    This study documents a complex fire event that occurred in October 2016 in a conflict area of the Middle East. Two fire outbreaks were detected by different monitoring instruments on board the TERRA, CALIPSO and AURA Earth observation missions. The link with local weather conditions was examined using ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and comparison with different observational data.

  6. Final safety analysis report for the Galileo Mission: Volume 1, Reference design document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Galileo mission uses nuclear power sources called Radioisotope Thermoelectric Generators (RTGs) to provide the spacecraft's primary electrical power. Because these generators contain nuclear material, a Safety Analysis Report (SAR) is required. A preliminary SAR and an updated SAR were previously issued that provided an evolving status report on the safety analysis. As a result of the Challenger accident, the launch dates for both Galileo and Ulysses missions were later rescheduled for November 1989 and October 1990, respectively. The decision was made by agreement between the DOE and the NASA to have a revised safety evaluation and report (FSAR) prepared on the basis of these revised vehicle accidents and environments. The results of this latest revised safety evaluation are presented in this document (Galileo FSAR). Volume I, this document, provides the background design information required to understand the analyses presented in Volumes II and III. It contains descriptions of the RTGs, the Galileo spacecraft, the Space Shuttle, the Inertial Upper Stage (IUS), the trajectory and flight characteristics including flight contingency modes, and the launch site. There are two appendices in Volume I which provide detailed material properties for the RTG.

  7. Documentation Panels Enhance Teacher Education Programs

    ERIC Educational Resources Information Center

    Warash, Bobbie Gibson

    2005-01-01

    Documentation of children's projects is advantageous to their learning process and is also a good method for student teachers to observe the process of learning. Documentation panels are a unique way to help student teachers understand how children learn. Completing a panel requires a student teacher to think through a process. Teachers must learn…

  8. Policies and Programs for Prevention and Control of Diabetes in Iran: A Document Analysis.

    PubMed

    Faraji, Obeidollah; Etemad, Koorosh; Akbari Sari, Ali; Ravaghi, Hamid

    2015-04-19

    Trend analysis from 2005 to 2011 showed high growth in diabetes prevalence in Iran. Considering the high prevalence of diabetes in the country and its likely increase in the future, the analysis of diabetes-related policies and programs is important and effective for the prevention and control of diabetes. Therefore, the aim of the study was an analysis of policies and programs related to the prevention and control of diabetes in Iran in 2014. This study was a policy analysis using deductive thematic content analysis of key documents. The health policy triangle framework was used in the data analysis. The PubMed and ScienceDirect databases were searched to find relevant studies and documents, and hand searching was conducted among the references of the identified studies. MAXQDA 10 software was used to organize and analyze the data. The main reasons for attention to diabetes in Iran were the World Health Organization (WHO) report in 1989 and the high prevalence of diabetes in the country. The major challenges in implementing the diabetes program include difficulties at the referral levels of the program, lack of coordination between the private and public sectors, and the limitations of the reporting system at the specialized levels of the program. Besides strengthening the referral system, the government should allocate more funds to the program and give more importance to educational programs for the public. Non-Governmental Organizations (NGOs) and the private sector should also be involved in the formulation and implementation of future diabetes prevention and control programs.

  9. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    PubMed

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data but PO gives a distinctive insight, revealing what people are really doing, instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights. It is not commonly discussed in nursing research and therefore this study can provide insight, which cannot be seen or revealed by using other data collection methods. Therefore, this paper can produce a useful tool for those who intend to use PO and grounded theory in their nursing research.

  10. Pretest analysis document for Semiscale Test S-FS-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, T.H.

    This report documents the pretest analysis calculation completed with the RELAP5/MOD2/CY21 code for Semiscale Test S-FS-1. The test will simulate the double-ended offset shear of the main steam line at the exit of the broken loop steam generator (downstream of the flow restrictor) and the subsequent plant recovery. The recovery portion of the test consists of a plant stabilization phase and a plant cooldown phase. The recovery procedures involve normal charging/letdown operation, pressurizer heater operation, secondary steam and feed of the unaffected steam generator, and pressurizer auxiliary spray. The test will be terminated after the unaffected steam generator and pressurizer pressures and liquid levels are stable, and the average primary fluid temperature is stable at about 480 K (405 °F) for at least 10 minutes.

  11. Documenting the diet in ancient human populations through stable isotope analysis of hair.

    PubMed

    Macko, S A; Engel, M H; Andrusevich, V; Lubec, G; O'Connell, T C; Hedges, R E

    1999-01-29

    Fundamental to the understanding of human history is the ability to make interpretations based on artefacts and other remains which are used to gather information about an ancient population. Sequestered in the organic matrices of these remains can be information, for example, concerning incidence of disease, genetic defects and diet. Stable isotopic compositions, especially those made on isolates of collagen from bones, have been used to help suggest principal dietary components. A significant problem in the use of collagen is its long-term stability, and the possibility of isotopic alteration during early diagenesis, or through contaminating condensation reactions. In this study, we suggest that a commonly overlooked material, human hair, may represent an ideal material to be used in addressing human diets of ancient civilizations. Through the analysis of the amino-acid composition of modern hair, as well as samples that were subjected to radiation (thus simulating ageing of the hair) and hair from humans that is up to 5200 years old, we have observed little in the way of chemical change. The principal amino acids observed in all of these samples are essentially identical in relative abundances and content. Dominating the compositions are serine, glutamic acid, threonine, glycine and leucine, respectively accounting for approximately 15%, 17%, 10%, 8% and 8% of the total hydrolysable amino acids. Even minor components (for example, alanine, valine, isoleucine) show similar constancy between the samples of different ages. This constancy clearly indicates minimal alteration of the amino-acid composition of the hair. Further, it would indicate that hair is well preserved and is amenable to isotopic analysis as a tool for distinguishing sources of nutrition. Based on this observation, we have isotopically characterized modern individuals for whom the diet has been documented. Both stable nitrogen and carbon isotope compositions were assessed, and together provide an

  12. Documenting the diet in ancient human populations through stable isotope analysis of hair.

    PubMed Central

    Macko, S A; Engel, M H; Andrusevich, V; Lubec, G; O'Connell, T C; Hedges, R E

    1999-01-01

    Fundamental to the understanding of human history is the ability to make interpretations based on artefacts and other remains which are used to gather information about an ancient population. Sequestered in the organic matrices of these remains can be information, for example, concerning incidence of disease, genetic defects and diet. Stable isotopic compositions, especially those made on isolates of collagen from bones, have been used to help suggest principal dietary components. A significant problem in the use of collagen is its long-term stability, and the possibility of isotopic alteration during early diagenesis, or through contaminating condensation reactions. In this study, we suggest that a commonly overlooked material, human hair, may represent an ideal material to be used in addressing human diets of ancient civilizations. Through the analysis of the amino-acid composition of modern hair, as well as samples that were subjected to radiation (thus simulating ageing of the hair) and hair from humans that is up to 5200 years old, we have observed little in the way of chemical change. The principal amino acids observed in all of these samples are essentially identical in relative abundances and content. Dominating the compositions are serine, glutamic acid, threonine, glycine and leucine, respectively accounting for approximately 15%, 17%, 10%, 8% and 8% of the total hydrolysable amino acids. Even minor components (for example, alanine, valine, isoleucine) show similar constancy between the samples of different ages. This constancy clearly indicates minimal alteration of the amino-acid composition of the hair. Further, it would indicate that hair is well preserved and is amenable to isotopic analysis as a tool for distinguishing sources of nutrition. Based on this observation, we have isotopically characterized modern individuals for whom the diet has been documented. Both stable nitrogen and carbon isotope compositions were assessed, and together provide an

  13. Method and system of filtering and recommending documents

    DOEpatents

    Patton, Robert M.; Potok, Thomas E.

    2016-02-09

    Disclosed is a method and system for discovering documents using a computer and providing a small set of the most relevant documents to the attention of a human observer. Using the method, the computer obtains a seed document from the user and generates a seed document vector using term frequency-inverse corpus frequency weighting. A keyword index for a plurality of source documents can be compared with the weighted terms of the seed document vector. The comparison is then filtered to reduce the number of documents, which define an initial subset of the source documents. Initial subset vectors are generated and compared to the seed document vector to obtain a similarity value for each comparison. Based on the similarity value, the method then recommends one or more of the source documents.
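
    The described flow maps naturally onto standard text-mining tooling. The sketch below weights the seed document's terms, pre-filters the source documents through a keyword-overlap test, and ranks the survivors by cosine similarity to the seed vector; TF-IDF over the source corpus stands in here for the patent's term frequency-inverse corpus frequency weighting, and the thresholds are illustrative.

    ```python
    # Seed-document recommendation: weight terms, filter by keyword overlap, rank by similarity.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    def recommend(seed_text, source_docs, top_k=5, min_keyword_hits=2):
        vec = TfidfVectorizer(stop_words="english")
        X = vec.fit_transform(source_docs)                     # source document vectors
        seed = vec.transform([seed_text])                      # weighted seed document vector

        # keyword-index filter: keep documents sharing enough of the seed's terms
        seed_term_idx = np.flatnonzero(seed.toarray().ravel() > 0)
        hits = (X[:, seed_term_idx] > 0).sum(axis=1).A.ravel()
        candidates = [i for i, h in enumerate(hits) if h >= min_keyword_hits]

        # rank the reduced subset by similarity to the seed vector
        sims = cosine_similarity(X[candidates], seed).ravel()
        ranked = sorted(zip(candidates, sims), key=lambda t: -t[1])
        return ranked[:top_k]                                  # (document index, similarity) pairs
    ```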

  14. Clinical decision support improves quality of telephone triage documentation--an analysis of triage documentation before and after computerized clinical decision support.

    PubMed

    North, Frederick; Richards, Debra D; Bremseth, Kimberly A; Lee, Mary R; Cox, Debra L; Varkey, Prathibha; Stroebel, Robert J

    2014-03-20

    Clinical decision support (CDS) has been shown to be effective in improving medical safety and quality but there is little information on how telephone triage benefits from CDS. The aim of our study was to compare triage documentation quality associated with the use of a clinical decision support tool, ExpertRN©. We examined 50 triage documents before and after a CDS tool was used in nursing triage. To control for the effects of CDS training we had an additional control group of triage documents created by nurses who were trained in the CDS tool, but who did not use it in selected notes. The CDS intervention cohort of triage notes was compared to both the pre-CDS notes and the CDS trained (but not using CDS) cohort. Cohorts were compared using the documentation standards of the American Academy of Ambulatory Care Nursing (AAACN). We also compared triage note content (documentation of associated positive and negative features relating to the symptoms, self-care instructions, and warning signs to watch for), and documentation defects pertinent to triage safety. Three of five AAACN documentation standards were significantly improved with CDS. There was a mean of 36.7 symptom features documented in triage notes for the CDS group but only 10.7 symptom features in the pre-CDS cohort (p < 0.0001) and 10.2 for the cohort that was CDS-trained but not using CDS (p < 0.0001). The difference between the mean of 10.2 symptom features documented in the pre-CDS and the mean of 10.7 symptom features documented in the CDS-trained but not using was not statistically significant (p = 0.68). CDS significantly improves triage note documentation quality. CDS-aided triage notes had significantly more information about symptoms, warning signs and self-care. The changes in triage documentation appeared to be the result of the CDS alone and not due to any CDS training that came with the CDS intervention. Although this study shows that CDS can improve documentation, further study is needed
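
    The abstract does not state which statistical test produced its p-values; a two-sample Welch t-test on per-note symptom-feature counts is one straightforward way to make this kind of before/after comparison, as sketched below with invented example counts.

    ```python
    # Illustrative before/after comparison of symptom-feature counts per triage note.
    import numpy as np
    from scipy import stats

    pre_cds  = np.array([9, 12, 10, 11, 8])     # symptom features per pre-CDS note (example data)
    with_cds = np.array([35, 38, 36, 40, 34])   # symptom features per CDS-aided note (example data)

    t, p = stats.ttest_ind(pre_cds, with_cds, equal_var=False)  # Welch's two-sample t-test
    print(f"mean pre={pre_cds.mean():.1f}, mean CDS={with_cds.mean():.1f}, p={p:.4g}")
    ```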

  15. Integrated system for automated financial document processing

    NASA Astrophysics Data System (ADS)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
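
    The blackboard coordination described above can be illustrated with a minimal sketch: independent recognition knowledge sources post candidate readings for a field with a confidence, and the blackboard keeps the best-supported hypothesis. The engines below are stubs with invented outputs, not the system's actual recognizers.

    ```python
    # Minimal blackboard: knowledge sources contribute hypotheses; best confidence wins.
    class Blackboard:
        def __init__(self):
            self.hypotheses = {}                          # field name -> (value, confidence)

        def post(self, field, value, confidence):
            best = self.hypotheses.get(field)
            if best is None or confidence > best[1]:
                self.hypotheses[field] = (value, confidence)

    def courtesy_amount_engine(image, bb):
        bb.post("courtesy_amount", "125.00", 0.92)        # stub recognition result

    def legal_amount_engine(image, bb):
        bb.post("legal_amount", "one hundred twenty five", 0.81)
        bb.post("courtesy_amount", "125.00", 0.75)        # corroborating, lower confidence

    bb = Blackboard()
    for engine in (courtesy_amount_engine, legal_amount_engine):
        engine(None, bb)                                  # each knowledge source contributes
    print(bb.hypotheses)
    ```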

  16. Clustering document fragments using background color and texture information

    NASA Astrophysics Data System (ADS)

    Chanda, Sukalpa; Franke, Katrin; Pal, Umapada

    2012-01-01

    Forensic analysis of questioned documents can sometimes be extremely data intensive. A forensic expert may need to analyze a heap of document fragments, and in such cases, to ensure reliability, he or she should focus only on the relevant evidence hidden in those fragments. Retrieving relevant documents requires finding similar document fragments. One way of obtaining such similar documents is to use a document fragment's physical characteristics, such as color and texture. In this article we propose an automatic scheme to retrieve similar document fragments based on the visual appearance of the document paper and its texture. Multispectral color characteristics are captured using biologically inspired color differentiation techniques, by projecting document color characteristics into the Lab color space. Gabor filter-based texture analysis is used to identify document texture. The expectation is that document fragments from the same source will have similar color and texture. For clustering similar document fragments in our test dataset we use a Self Organizing Map (SOM) of dimension 5×5, with the document color and texture information as features. We obtained an encouraging accuracy of 97.17% on 1063 test images.
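
    A sketch of this pipeline, under stated assumptions, is given below: Lab colour statistics plus Gabor filter responses form the feature vector for each fragment image, and the vectors are clustered on a 5×5 SOM. The third-party minisom package and the chosen filter frequencies stand in for the authors' own SOM and parameter settings.

    ```python
    # Fragment clustering sketch: Lab colour + Gabor texture features on a 5x5 SOM.
    import numpy as np
    from skimage.color import rgb2lab
    from skimage.filters import gabor
    from minisom import MiniSom           # third-party SOM used for illustration

    def fragment_features(rgb_image):
        lab = rgb2lab(rgb_image)
        color_stats = np.concatenate([lab.mean(axis=(0, 1)), lab.std(axis=(0, 1))])
        gray = rgb_image.mean(axis=2)
        texture = []
        for frequency in (0.1, 0.2, 0.4):                 # illustrative Gabor frequencies
            real, _ = gabor(gray, frequency=frequency)
            texture += [real.mean(), real.std()]
        return np.concatenate([color_stats, texture])

    def cluster_fragments(images, iterations=2000):
        X = np.array([fragment_features(im) for im in images])
        X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)  # normalize features
        som = MiniSom(5, 5, X.shape[1], sigma=1.0, learning_rate=0.5)
        som.train_random(X, iterations)
        return [som.winner(x) for x in X]                  # winning grid cell = cluster label
    ```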

  17. Statistical analysis of dynamic fibrils observed from NST/BBSO observations

    NASA Astrophysics Data System (ADS)

    Gopalan Priya, Thambaje; Su, Jiang-Tao; Chen, Jie; Deng, Yuan-Yong; Prasad Choudhury, Debi

    2018-02-01

    We present the results obtained from the analysis of dynamic fibrils in NOAA active region (AR) 12132, using high resolution Hα observations from the New Solar Telescope operating at Big Bear Solar Observatory. The dynamic fibrils are seen to be moving up and down, and most of them are periodic and have a jet-like appearance. We found from our observations that the fibrils follow almost perfect parabolic paths in many cases. A statistical analysis of the properties of these parabolic paths, covering the deceleration, maximum velocity, duration and kinetic energy of the fibrils, is presented here. We found the average maximum velocity to be around 15 km s⁻¹ and the mean deceleration to be around 100 m s⁻². The observed deceleration is only a fraction of the solar gravity and is not compatible with a ballistic path under solar gravity alone. We found a positive correlation between deceleration and maximum velocity. This correlation is consistent with earlier simulations of magnetoacoustic shock waves propagating upward.
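
    The per-fibril measurement implied above reduces to a quadratic fit. The sketch below fits the tracked fibril-top height against time, reads the deceleration off the quadratic coefficient and the maximum velocity off the derivative; the sample track is synthetic, chosen only to reproduce values of the reported order.

    ```python
    # Quadratic (parabolic) fit of a fibril-top height track; synthetic example data.
    import numpy as np

    def parabolic_fit(t, h):
        """t in s, h in km; returns deceleration (m/s^2), max velocity (km/s), duration (s)."""
        a, b, c = np.polyfit(t, h, 2)           # h(t) = a t^2 + b t + c
        deceleration = -2.0 * a * 1e3           # km/s^2 -> m/s^2 (positive for a decelerating rise)
        v = 2.0 * a * t + b                     # dh/dt along the track, km/s
        return deceleration, np.abs(v).max(), t[-1] - t[0]

    t = np.linspace(0, 300, 31)                 # ~5-minute up-and-down track, s
    h = 15.0 * t - 0.05 * t**2                  # synthetic height above the start point, km
    print(parabolic_fit(t, h))                  # ~100 m/s^2, ~15 km/s, 300 s
    ```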

  18. Classroom Observations: Documenting Shifts in Instruction for Districtwide Improvement. Formative Evaluation Cycle Report for the Math in Common Initiative, Volume 2

    ERIC Educational Resources Information Center

    Perry, Rebecca R.; Seago, Nanette M.; Burr, Elizabeth; Broek, Marie; Finkelstein, Neal D.

    2015-01-01

    Math in Common® (MiC) is a five-year initiative that supports a formal network of 10 California school districts as they implement the Common Core State Standards in Mathematics (CCSS-M) across grades K-8. This research brief explores how best to select or develop and use classroom observation systems in order to document instructional shifts and…

  19. Pretest analysis document for Test S-NH-2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Streit, J.E.; Owca, W.A.

    This report documents the pretest analysis calculation completed with the RELAP5/MOD2/CY3601 code for Semiscale MOD-2C Test S-NH-2. The test will simulate the transient that results from the shear in a small diameter penetration of a cold leg, equivalent to 2.1% of the cold leg flow area. The high pressure injection system is assumed to be inoperative throughout the transient. The recovery procedure consists of latching open both steam generator atmospheric dump valves while supplying both steam generators with auxiliary feedwater; the auxiliary feedwater system is assumed to be partially inoperative, so the auxiliary feedwater flow is degraded. Recovery will be initiated upon a peak cladding temperature of 811 K (1000 °F). The test will be terminated when primary pressure has been reduced to the low pressure injection system setpoint of 1.38 MPa (200 psia). The calculated results indicate that the test objectives can be achieved and the proposed test scenario poses no threat to personnel or to plant integrity. 7 refs., 16 figs., 2 tabs.

  20. Using MERRA Gridded Innovations for Quantifying Uncertainties in Analysis Fields and Diagnosing Observing System Inhomogeneities

    NASA Technical Reports Server (NTRS)

    da Silva, Arlindo; Redder, Christopher

    2010-01-01

    -likelihood estimates of background and observation errors, as well as global bias estimates. Starting with the joint PDF of innovations and analysis increments at observation locations we propose a technique for diagnosing bias among the observing systems, and document how these contextual biases have evolved during the satellite era covered by MERRA.

  1. Using MERRA Gridded Innovation for Quantifying Uncertainties in Analysis Fields and Diagnosing Observing System Inhomogeneities

    NASA Astrophysics Data System (ADS)

    da Silva, A.; Redder, C. R.

    2010-12-01

    -likelihood estimates of background and observation errors, as well as global bias estimates. Starting with the joint PDF of innovations and analysis increments at observation locations we propose a technique for diagnosing bias among the observing systems, and document how these contextual biases have evolved during the satellite era covered by MERRA.
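
    The MERRA study relies on maximum-likelihood estimation, but a related and simpler consistency diagnostic (in the spirit of Desroziers et al. 2005) derives observation-error and background-error variances, plus a global bias estimate, from the same two ingredients named above: innovations and analysis increments at observation locations. The sketch below is that alternative diagnostic, not the MERRA procedure, and its inputs are placeholders.

    ```python
    # Desroziers-style consistency diagnostic from innovations and analysis increments.
    import numpy as np

    def innovation_diagnostics(innovations, analysis_minus_background_at_obs):
        """innovations d_b = y - H(x_b); second argument is H(x_a) - H(x_b) at obs locations."""
        d_b = np.asarray(innovations)
        inc = np.asarray(analysis_minus_background_at_obs)
        d_a = d_b - inc                          # analysis residuals y - H(x_a)
        sigma_o2 = np.mean(d_a * d_b)            # ~ observation-error variance
        sigma_b2 = np.mean(inc * d_b)            # ~ background-error variance in observation space
        bias = d_b.mean()                        # simple global bias estimate
        return sigma_o2, sigma_b2, bias
    ```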

  2. TA-55 Final Safety Analysis Report Comparison Document and DOE Safety Evaluation Report Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alan Bond

    2001-04-01

    This document provides an overview of changes to the currently approved TA-55 Final Safety Analysis Report (FSAR) that are included in the upgraded FSAR. The DOE Safety Evaluation Report (SER) requirements that are incorporated into the upgraded FSAR are briefly discussed to provide the starting point in the FSAR with respect to the SER requirements.

  3. Medication details documented on hospital discharge: cross-sectional observational study of factors associated with medication non-reconciliation

    PubMed Central

    Grimes, Tamasine C; Duggan, Catherine A; Delaney, Tim P; Graham, Ian M; Conlon, Kevin C; Deasy, Evelyn; Jago-Byrne, Marie-Claire; O' Brien, Paul

    2011-01-01

    AIMS Movement into or out of hospital is a vulnerable period for medication safety. Reconciling the medication a patient is using before admission with the medication prescribed on discharge, and documenting any changes (medication reconciliation) is recommended to improve safety. The aims of the study were to investigate the factors contributing to medication reconciliation on discharge, and identify the prevalence of non-reconciliation. METHODS The study was a cross-sectional, observational survey using consecutive discharges from purposively selected services in two acute public hospitals in Ireland. Medication reconciliation, potential for harm and unplanned re-admission were investigated. RESULTS Medication non-reconciliation was identified in 50% of 1245 inpatient episodes, involving 16% of 9569 medications. The majority of non-reconciled episodes had potential to result in moderate (63%) or severe (2%) harm. Handwritten rather than computerized discharges (adjusted odds ratio (adjusted OR) 1.60, 95% CI 1.11, 2.99), increasing number of medications (adjusted OR 1.26, 95% CI 1.21, 1.31) or chronic illness (adjusted OR 2.08, 95% CI 1.33, 3.24) were associated with non-reconciliation. Omission of endocrine, central nervous system and nutrition and blood drugs was more likely on discharge, whilst omission on admission and throughout inpatient care, without documentation, was more likely for obstetric, gynaecology and urinary tract (OGU) or respiratory drugs. Documentation in the discharge communication that medication was intentionally stopped during inpatient care was less likely for cardiovascular, musculoskeletal and OGU drugs. Errors involving the dose were most likely for respiratory drugs. CONCLUSIONS The findings inform strategies to facilitate medication reconciliation on discharge from acute hospital care. PMID:21284705
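
    The adjusted odds ratios above come from a multivariate logistic regression of non-reconciliation on discharge type, number of medications and chronic illness. Purely as an illustration of how such odds ratios are obtained (with synthetic data and only two of the predictors, not the study's dataset), a Python sketch:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 1000
        handwritten = rng.integers(0, 2, size=n)   # handwritten (1) vs computerized (0) discharge
        n_meds = rng.poisson(7, size=n)            # number of medications

        # Simulate non-reconciliation with log-odds roughly matching the reported ORs
        logit = -2.0 + np.log(1.6) * handwritten + np.log(1.26) * n_meds
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

        X = sm.add_constant(np.column_stack([handwritten, n_meds]))
        fit = sm.Logit(y, X).fit(disp=False)
        print(np.exp(fit.params))   # adjusted odds ratios: intercept, handwritten, per-medication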

  4. Application of multi-resolution 3D techniques in crime scene documentation with bloodstain pattern analysis.

    PubMed

    Hołowko, Elwira; Januszkiewicz, Kamil; Bolewicki, Paweł; Sitnik, Robert; Michoński, Jakub

    2016-10-01

    In forensic documentation with bloodstain pattern analysis (BPA) it is highly desirable to obtain non-invasively overall documentation of a crime scene, but also register in high resolution single evidence objects, like bloodstains. In this study, we propose a hierarchical 3D scanning platform designed according to the top-down approach known from traditional forensic photography. The overall 3D model of a scene is obtained via integration of laser scans registered from different positions. Some parts of a scene being particularly interesting are documented using a midrange scanner, and the smallest details are added in the highest resolution as close-up scans. The scanning devices are controlled using developed software equipped with advanced algorithms for point cloud processing. To verify the feasibility and effectiveness of multi-resolution 3D scanning in crime scene documentation, our platform was applied to document a murder scene simulated by the BPA experts from the Central Forensic Laboratory of the Police R&D, Warsaw, Poland. Applying the 3D scanning platform proved beneficial in the documentation of a crime scene combined with BPA. The multi-resolution 3D model enables virtual exploration of a scene in a three-dimensional environment, distance measurement, and gives a more realistic preservation of the evidence together with its surroundings. Moreover, high-resolution close-up scans aligned in a 3D model can be used to analyze bloodstains revealed at the crime scene. The results of BPA, such as trajectories and the area of origin, are visualized and analyzed in an accurate model of a scene. At this stage, a simplified approach considering the trajectory of a blood drop as a straight line is applied. Although the 3D scanning platform offers a new quality of crime scene documentation with BPA, some of the limitations of the technique are also mentioned. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
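
    The simplified straight-line trajectory model mentioned above underlies the classic "stringing" calculation for the area of origin: the impact angle of each stain follows from the ratio of stain width to length, and the height of origin follows from that angle and the distance to the point of convergence. A minimal Python sketch with hypothetical stain measurements (not the authors' software):

        import math

        def impact_angle_deg(width_mm: float, length_mm: float) -> float:
            """Classic BPA relation: impact angle = arcsin(stain width / stain length)."""
            return math.degrees(math.asin(width_mm / length_mm))

        def origin_height_m(distance_to_convergence_m: float, angle_deg: float) -> float:
            """Straight-line estimate of the height of origin above the convergence
            point, ignoring gravity and drag as in the simplified model."""
            return distance_to_convergence_m * math.tan(math.radians(angle_deg))

        # Hypothetical stain: 2.0 mm wide, 4.0 mm long, 0.8 m from the convergence point
        angle = impact_angle_deg(2.0, 4.0)                             # 30.0 degrees
        print(round(angle, 1), round(origin_height_m(0.8, angle), 2))  # 30.0, 0.46 m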

  5. Clinical decision support improves quality of telephone triage documentation - an analysis of triage documentation before and after computerized clinical decision support

    PubMed Central

    2014-01-01

    Background Clinical decision support (CDS) has been shown to be effective in improving medical safety and quality but there is little information on how telephone triage benefits from CDS. The aim of our study was to compare triage documentation quality associated with the use of a clinical decision support tool, ExpertRN©. Methods We examined 50 triage documents before and after a CDS tool was used in nursing triage. To control for the effects of CDS training we had an additional control group of triage documents created by nurses who were trained in the CDS tool, but who did not use it in selected notes. The CDS intervention cohort of triage notes was compared to both the pre-CDS notes and the CDS trained (but not using CDS) cohort. Cohorts were compared using the documentation standards of the American Academy of Ambulatory Care Nursing (AAACN). We also compared triage note content (documentation of associated positive and negative features relating to the symptoms, self-care instructions, and warning signs to watch for), and documentation defects pertinent to triage safety. Results Three of five AAACN documentation standards were significantly improved with CDS. There was a mean of 36.7 symptom features documented in triage notes for the CDS group but only 10.7 symptom features in the pre-CDS cohort (p < 0.0001) and 10.2 for the cohort that was CDS-trained but not using CDS (p < 0.0001). The difference between the mean of 10.7 symptom features documented in the pre-CDS cohort and the mean of 10.2 symptom features documented in the cohort that was CDS-trained but not using CDS was not statistically significant (p = 0.68). Conclusions CDS significantly improves triage note documentation quality. CDS-aided triage notes had significantly more information about symptoms, warning signs and self-care. The changes in triage documentation appeared to be the result of the CDS alone and not due to any CDS training that came with the CDS intervention. Although this study shows that CDS

  6. Crew Earth Observations: Twelve Years of Documenting Earth from the International Space Station

    NASA Technical Reports Server (NTRS)

    Evans, Cynthia A.; Stefanov, William L.; Willis, Kimberley; Runco, Susan; Wilkinson, M. Justin; Dawson, Melissa; Trenchard, Michael

    2012-01-01

    The Crew Earth Observations (CEO) payload was one of the initial experiments aboard the International Space Station, and has been continuously collecting data about the Earth since Expedition 1. The design of the experiment is simple: using state-of-the-art camera equipment, astronauts collect imagery of the Earth's surface over defined regions of scientific interest and also document dynamic events such as storm systems, floods, wild fires and volcanic eruptions. To date, CEO has provided roughly 600,000 images of Earth, capturing views of features and processes on land, the oceans, and the atmosphere. CEO data are less rigorously constrained than other remote sensing data, but the volume of data, and the unique attributes of the imagery provide a rich and understandable view of the Earth that is difficult to achieve from the classic remote sensing platforms. In addition, the length-of-record of the imagery dataset, especially when combined with astronaut photography from other NASA and Russian missions starting in the early 1960s, provides a valuable record of changes on the surface of the Earth over 50 years. This time period coincides with the rapid growth of human settlements and human infrastructure.

  7. An Investigation of Document Partitions.

    ERIC Educational Resources Information Center

    Shaw, W. M., Jr.

    1986-01-01

    Empirical significance of document partitions is investigated as a function of index term-weight and similarity thresholds. Results show the same empirically preferred partitions can be detected by two independent strategies: a cluster-based retrieval analysis and an analysis of regularities in the underlying structure of the document…

  8. Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture

    DOEpatents

    Sanfilippo, Antonio [Richland, WA; Calapristi, Augustin J [West Richland, WA; Crow, Vernon L [Richland, WA; Hetzler, Elizabeth G [Kennewick, WA; Turner, Alan E [Kennewick, WA

    2009-12-22

    Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture are described. In one aspect, a document clustering method includes providing a document set comprising a plurality of documents, providing a cluster comprising a subset of the documents of the document set, using a plurality of terms of the documents, providing a cluster label indicative of subject matter content of the documents of the cluster, wherein the cluster label comprises a plurality of word senses, and selecting one of the word senses of the cluster label.
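
    The patent abstract describes labelling each cluster with terms drawn from its documents and then disambiguating among the label's word senses. As a loose, non-patented illustration of the first part of that idea, the Python sketch below clusters a tiny corpus with TF-IDF and k-means and labels each cluster with its highest-weight terms; the corpus and parameters are made up for the example.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.feature_extraction.text import TfidfVectorizer

        documents = [
            "court ruling on patent infringement case",
            "judge issues opinion in trademark dispute",
            "satellite imagery of storm systems over the ocean",
            "weather observations from orbiting platforms",
        ]

        vectorizer = TfidfVectorizer(stop_words="english")
        X = vectorizer.fit_transform(documents)
        kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
        terms = np.array(vectorizer.get_feature_names_out())

        # Label each cluster with its top-weighted terms (a crude stand-in for
        # the patent's word-sense-based cluster labels)
        for cluster_id, center in enumerate(kmeans.cluster_centers_):
            print(cluster_id, ", ".join(terms[np.argsort(center)[::-1][:3]]))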

  9. Documentation of Sexual Partner Gender Is Low in Electronic Health Records: Observations, Predictors, and Recommendations to Improve Population Health Management in Primary Care

    PubMed Central

    Yehia, Baligh R.

    2015-01-01

    Abstract The 2011 Institute of Medicine report on LGBT health recommended that sexual orientation and gender identity (SO/GI) be documented in electronic health records (EHRs). Most EHRs cannot document all aspects of SO/GI, but some can record gender of sexual partners. This study sought to determine the proportion of patients who have the gender of sexual partners recorded in the EHR and to identify factors associated with documentation. A retrospective analysis was done of EHR data for 40 family medicine (FM) and general internal medicine (IM) practices, comprising 170,570 adult patients seen in 2012. The primary outcome was EHR documentation of sexual partner gender. Multivariate logistic regression assessed the impact of patient, provider, and practice factors on documentation. In all, 76,767 patients (45%) had the gender of sexual partners recorded, 4.3% of whom had same-gender partners (3.5% of females, 5.6% of males). Likelihood of documentation was independently higher for women; blacks; those with a preventive visit; those with a physician assistant, nurse practitioner, or resident primary care provider (vs. attending); those at urban practices; those at smaller practices; and those at a residency FM practice. Older age and Medicare insurance were associated with lower documentation. Sexual partner gender documentation is important to identify patients for targeted prevention and support, and holds great potential for population health management, yet documentation in the EHR currently is low. Primary care practices should routinely record the gender of sexual partners, and additional work is needed to identify best practices for collecting and using SO/GI data in this setting. (Population Health Management 2015;18:217–222). PMID:25290634

  10. The Role of Business Agreements in Defining Textbook Affordability and Digital Materials: A Document Analysis

    ERIC Educational Resources Information Center

    Raible, John; deNoyelles, Aimee

    2015-01-01

    Adopting digital materials such as eTextbooks and e-coursepacks is a potential strategy to address textbook affordability in the United States. However, university business relationships with bookstore vendors implicitly structure which instructional resources are available and in what manner. In this study, a document analysis was conducted on…

  11. Forensic intelligence applied to questioned document analysis: A model and its application against organized crime.

    PubMed

    De Alcaraz-Fossoul, Josep; Roberts, Katherine A

    2017-07-01

    The capability of forensic sciences to fight crime, especially against organized criminal groups, becomes relevant in the recent economic downturn and the war on terrorism. In view of these societal challenges, the methods of combating crime should experience critical changes in order to improve the effectiveness and efficiency of the current resources available. It is obvious that authorities have serious difficulties combating criminal groups of transnational nature. These are characterized as well structured organizations with international connections, abundant financial resources and comprised of members with significant and diverse expertise. One common practice among organized criminal groups is the use of forged documents that allow for the commission of illegal cross-border activities. Law enforcement can target these movements to identify counterfeits and establish links between these groups. Information on document falsification can become relevant to generate forensic intelligence and to design new strategies against criminal activities of this nature and magnitude. This article discusses a methodology for improving the development of forensic intelligence in the discipline of questioned document analysis. More specifically, it focuses on document forgeries and falsification types used by criminal groups. It also describes the structure of international criminal organizations that use document counterfeits as means to conduct unlawful activities. The model presented is partially based on practical applications of the system that have resulted in satisfactory outcomes in our laboratory. Copyright © 2017 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.

  12. Reframing the Document(ary): Exploring Asylum Policies on Stage

    ERIC Educational Resources Information Center

    Oberkrome, Friederike

    2018-01-01

    Ensuing from the concept of Documentality (Steyerl), this paper proposes to reframe documentary practices in refugee theatre. This is based on the observation that documentary theatre during the 'refugee crisis' in 2015 extensively negotiated the role of documents within bureaucratic performances (Jeffers). Following the notion of the document as…

  13. Exploratory investigation of communication management in residential-aged care: a comparison of staff knowledge, documentation and observed resident-staff communication.

    PubMed

    Bennett, Michelle K; Ward, Elizabeth C; Scarinci, Nerina A

    2016-05-01

    There is a high prevalence of communication difficulty among older people living in residential-aged care. Such functional deficits can have a negative impact on resident quality of life, staff workplace satisfaction and the provision of quality care. Systematic research investigating the nature of communication management in residential-aged care and factors impacting optimal communication management is lacking. To use data triangulation across multiple sources to describe resident-staff communication and communication management in residential-aged care. Participants included a sample of 14 residents and 29 staff directly involved in communication interactions with residents. Data were obtained from: (1) resident file review (n = 14), (2) observation of resident-staff communication (n = 14), (3) resident surveys (n = 14) and (4) staff surveys (n = 29). Data from each source were examined separately then triangulated. All residents had limited opportunity for meaningful communication with staff. Documentation of residents' communication needs and strategies to facilitate resident-staff communication was insufficient to provide individualized recommendations. Although staff were observed to use various strategies to facilitate communication with residents, staff agreement about the applicability of these strategies to individual residents was inconsistent. Differences in resident-staff communication for residents who experience nil/mild versus moderate/severe communication difficulty were also found. Resident-staff communication and communication management in residential-aged care is limited in scope and challenged in meeting residents' individual communication needs. Improvements in both documentation and staff knowledge of residents' communication needs are necessary. Strategies to facilitate communication with individual residents must be tailored, evidence based, documented in care plans and delivered to staff through ongoing education. Increased involvement

  14. Use of cartography in historical seismicity analysis: a reliable tool to better apprehend the contextualization of the historical documents

    NASA Astrophysics Data System (ADS)

    Thibault, Fradet; Grégory, Quenet; Kevin, Manchuel

    2014-05-01

    Historical studies, including historical seismicity analysis, deal with historical documents. Numerous factors, such as culture, social condition, demography, political situations and opinions or religious ones influence the way the events are transcribed in the archives. As a consequence, it is crucial to contextualize and compare the historical documents reporting on a given event in order to reduce the uncertainties affecting their analysis and interpretation. When studying historical seismic events it is often tricky to have a global view of all the information provided by the historical documents. It is also difficult to extract cross-correlated information from the documents and draw a precise historical context. Use of cartographic and geographic tools in GIS software is the best tool for the synthesis, interpretation and contextualization of the historical material. The main goal is to produce the most complete dataset of available information, in order to take into account all the components of the historical context and consequently improve the macroseismic analysis. The Entre-Deux-Mers earthquake (1759, Iepc= VII-VIII) [SISFRANCE 2013 - EDF-IRSN-BRGM] is well documented but has never benefited from a cross-analysis of historical documents and historical context elements. The map of available intensity data from SISFRANCE highlights a gap in macroseismic information within the estimated epicentral area. The aim of this study is to understand the origin of this gap by making a cartographic compilation of both, archive information and historical context elements. The results support the hypothesis that the lack of documents and macroseismic data in the epicentral area is related to a low human activity rather than low seismic effects in this zone. Topographic features, geographical position, flood hazard, roads and pathways locations, vineyards distribution and the forester coverage, mentioned in the archives and reported on the Cassini's map confirm this

  15. An Analysis of Document Category Prediction Responses to Classifier Model Parameter Treatment Permutations within the Software Design Patterns Subject Domain

    ERIC Educational Resources Information Center

    Pankau, Brian L.

    2009-01-01

    This empirical study evaluates the document category prediction effectiveness of Naive Bayes (NB) and K-Nearest Neighbor (KNN) classifier treatments built from different feature selection and machine learning settings and trained and tested against textual corpora of 2300 Gang-Of-Four (GOF) design pattern documents. Analysis of the experiment's…

  16. Development of spatial data guidelines and standards: spatial data set documentation to support hydrologic analysis in the U.S. Geological Survey

    USGS Publications Warehouse

    Fulton, James L.

    1992-01-01

    Spatial data analysis has become an integral component in many surface and sub-surface hydrologic investigations within the U.S. Geological Survey (USGS). Currently, one of the largest costs in applying spatial data analysis is the cost of developing the needed spatial data. Therefore, guidelines and standards are required for the development of spatial data in order to allow for data sharing and reuse; this eliminates costly redevelopment. In order to attain this goal, the USGS is expanding efforts to identify guidelines and standards for the development of spatial data for hydrologic analysis. Because of the variety of project and database needs, the USGS has concentrated on developing standards for documenting spatial data sets to aid in the assessment of data set quality and compatibility of different data sets. An interim data set documentation standard (1990) has been developed that provides a mechanism for associating a wide variety of information with a data set, including data about source material, data automation and editing procedures used, projection parameters, data statistics, descriptions of features and feature attributes, information on organizational contacts, lists of operations performed on the data, and free-form comments and notes about the data, made at various times in the evolution of the data set. The interim data set documentation standard has been automated using a commercial geographic information system (GIS) and data set documentation software developed by the USGS. Where possible, USGS-developed software is used to enter data into the data set documentation file automatically. The GIS software closely associates a data set with its data set documentation file; the documentation file is retained with the data set whenever it is modified, copied, or transferred to another computer system. The Water Resources Division of the USGS is continuing to develop spatial data and data processing standards, with emphasis on standards needed to support

  17. Cf-252 Characterization Documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feldman, Alexander

    2014-03-14

    Six documents were written by Vance and Associates under contract to the Off-Site Source Recovery Project of Los Alamos National Laboratory. These six documents provided the basis for characterization of Californium-252 sealed sources and for the packaging and manifesting of this material for disposal at the Waste Isolation Pilot Plant. The six documents are: 1. VA-OSR-10, Development of radionuclide distributions for Cf-252 sealed sources. 2. VA-OSR-11, Uncertainty analysis for Cf-252 sealed sources. 3. VA-OSR-12, To determine the radionuclides in the waste drums containing Cf-252 sealed source waste that are required to be reported under the requirements of the WIPP WAC and the TRAMPAC. 4. VA-OSR-13, Development of the spreadsheet for the radiological calculations for the characterization of Cf-252 sources. 5. VA-OSR-14, Relative importance of neutron-induced fission in Cf-252 sources. 6. VA-OSR-15, Determine upper bound of decay product inventories from a drum of Cf-252 sources. These six documents provide the technical basis for the characterization of Cf-252 sources and will be part of the AK documentation required for submittal to the Central Characterization Project (CCP) of WIPP.

  18. Critical discourse analysis of social justice in nursing's foundational documents.

    PubMed

    Valderama-Wallace, Claire P

    2017-07-01

    Social inequities threaten the health of the global population. A superficial acknowledgement of social justice by nursing's foundational documents may limit the degree to which nurses view injustice as relevant to nursing practice and education. The purpose was to examine conceptualizations of social justice and connections to broader contexts in the most recent editions. Critical discourse analysis examines and uncovers dynamics related to power, language, and inequality within the American Nurses Association's Code of Ethics, Scope and Standards of Practice, and Social Policy Statement. This analysis found ongoing inconsistencies in conceptualizations of social justice. Although the Code of Ethics integrates concepts related to social justice far more than the other two, tension between professionalism and social change emerges. The discourse of professionalism renders interrelated cultural, social, economic, historical, and political contexts nearly invisible. Greater consistency would provide a clearer path for nurses to mobilize and engage in the courageous work necessary to address social injustice. These findings also call for an examination of how nurses can critique and use the power and privilege of professionalism to amplify the connection between social institutions and health equity in nursing education, practice, and policy development. © 2017 Wiley Periodicals, Inc.

  19. Simulation Detection in Handwritten Documents by Forensic Document Examiners.

    PubMed

    Kam, Moshe; Abichandani, Pramod; Hewett, Tom

    2015-07-01

    This study documents the results of a controlled experiment designed to quantify the abilities of forensic document examiners (FDEs) and laypersons to detect simulations in handwritten documents. Nineteen professional FDEs and 26 laypersons (typical of a jury pool) were asked to inspect test packages that contained six (6) known handwritten documents written by the same person and two (2) questioned handwritten documents. Each questioned document was either written by the person who wrote the known documents, or written by a different person who tried to simulate the writing of the person who wrote the known document. The error rates of the FDEs were smaller than those of the laypersons when detecting simulations in the questioned documents. Among other findings, the FDEs never labeled a questioned document that was written by the same person who wrote the known documents as "simulation." There was a significant statistical difference between the responses of the FDEs and laypersons for documents without simulations. © 2015 American Academy of Forensic Sciences.

  20. Bibliographic Classification Theory and Text Linguistics: Aboutness Analysis, Intertextuality and the Cognitive Act of Classifying Documents.

    ERIC Educational Resources Information Center

    Beghtol, Clare

    1986-01-01

    Explicates a definition and theory of "aboutness" and aboutness analysis developed by text linguist van Dijk; explores implications of text linguistics for bibliographic classification theory; suggests the elements that a theory of the cognitive process of classifying documents needs to encompass; and delineates how people identify…

  1. A distributed analysis and visualization system for model and observational data

    NASA Technical Reports Server (NTRS)

    Wilhelmson, Robert B.

    1994-01-01

    Software was developed with NASA support to aid in the analysis and display of the massive amounts of data generated from satellites, observational field programs, and from model simulations. This software was developed in the context of the PATHFINDER (Probing ATmospHeric Flows in an Interactive and Distributed EnviRonment) Project. The overall aim of this project is to create a flexible, modular, and distributed environment for data handling, modeling simulations, data analysis, and visualization of atmospheric and fluid flows. Software completed with NASA support includes GEMPAK analysis, data handling, and display modules for which collaborators at NASA had primary responsibility, and prototype software modules for three-dimensional interactive and distributed control and display as well as data handling, for which NCSA was responsible. Overall process control was handled through a scientific and visualization application builder from Silicon Graphics known as the Iris Explorer. In addition, the GEMPAK-related work (GEMVIS) was also ported to the Advanced Visualization System (AVS) application builder. Many modules were developed to enhance those already available in Iris Explorer including HDF file support, improved visualization and display, simple lattice math, and the handling of metadata through development of a new grid datatype. Complete source and runtime binaries along with on-line documentation are available via the World Wide Web at: http://redrock.ncsa.uiuc.edu/PATHFINDER/pathre12/top/top.html.

  2. Toward Medical Documentation That Enhances Situational Awareness Learning

    PubMed Central

    Lenert, Leslie A.

    2016-01-01

    The purpose of writing medical notes in a computer system goes beyond documentation for medical-legal purposes or billing. The structure of documentation is a checklist that serves as a cognitive aid and a potential index to retrieve information for learning from the record. For the past 50 years, one of the primary organizing structures for physicians' clinical documentation has been the SOAP note (Subjective, Objective, Assessment, Plan). The cognitive checklist is well-suited to differential diagnosis but may not support detection of changes in systems and/or learning from cases. We describe an alternative cognitive checklist called the OODA Loop (Observe, Orient, Decide, Act). Through incorporation of projections of anticipated course events with and without treatment and by making "Decisions" an explicit category of documentation in the medical record in the context of a variable temporal cycle for observations, OODA may enhance opportunities to learn from clinical care. PMID:28269872

  3. State Policy Climates for College Student Success: An Analysis of State Policy Documents Pertaining to College Persistence and Completion

    ERIC Educational Resources Information Center

    McLendon, Michael K.; Tuchmayer, Jeremy B.; Park, Toby J.

    2010-01-01

    This article reports the findings of an exploratory analysis of state policy climates for college student persistence and completion. We performed an analysis of more than 100 documents collected from 8 states chosen largely on the basis of their performance on past "Measuring Up" reports. Our analysis of governors' state-of-the-state…

  4. Medication communication through documentation in medical wards: knowledge and power relations.

    PubMed

    Liu, Wei; Manias, Elizabeth; Gerdtz, Marie

    2014-09-01

    Health professionals communicate with each other about medication information using different forms of documentation. This article explores knowledge and power relations surrounding medication information exchanged through documentation among nurses, doctors and pharmacists. Ethnographic fieldwork was conducted in 2010 in two medical wards of a metropolitan hospital in Australia. Data collection methods included participant observations, field interviews, video-recordings, document retrieval and video reflexive focus groups. A critical discourse analytic framework was used to guide data analysis. The written medication chart was the main means of communicating medication decisions from doctors to nurses as compared to verbal communication. Nurses positioned themselves as auditors of the medication chart and scrutinised medical prescribing to maintain the discourse of patient safety. Pharmacists utilised the discourse of scientific judgement to guide their decision-making on the necessity of verbal communication with nurses and doctors. Targeted interdisciplinary meetings involving nurses, doctors and pharmacists should be organised in ward settings to discuss the importance of having documented medication information conveyed verbally across different disciplines. Health professionals should be encouraged to proactively seek out each other to relay changes in medication regimens and treatment goals. © 2013 John Wiley & Sons Ltd.

  5. Oak Ridge Environmental Information System (OREIS) functional system design document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birchfield, T.E.; Brown, M.O.; Coleman, P.R.

    1994-03-01

    The OREIS Functional System Design document provides a detailed functional description of the Oak Ridge Environmental Information System (OREIS). It expands the system requirements defined in the OREIS Phase 1-System Definition Document (ES/ER/TM-34). Documentation of OREIS development is based on the Automated Data Processing System Development Methodology, a Martin Marietta Energy Systems, Inc., procedure written to assist in developing scientific and technical computer systems. This document focuses on the development of the functional design of the user interface, which includes the integration of commercial applications software. The data model and data dictionary are summarized briefly; however, the Data Management Plan for OREIS (ES/ER/TM-39), a companion document to the Functional System Design document, provides the complete data dictionary and detailed descriptions of the requirements for the data base structure. The OREIS system will provide the following functions, which are executed from a Menu Manager: (1) preferences, (2) view manager, (3) macro manager, (4) data analysis (assisted analysis and unassisted analysis), and (5) spatial analysis/map generation (assisted ARC/INFO and unassisted ARC/INFO). Additional functionality includes interprocess communications, which handle background operations of OREIS.

  6. Market Analysis and Consumer Impacts Source Document. Part III. Consumer Behavior and Attitudes Toward Fuel Efficient Vehicles

    DOT National Transportation Integrated Search

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part III consists of studies and reviews on: consumer awareness of fuel efficiency issues; consumer acceptance of fuel efficient vehicles; car size ch...

  7. Alternatives for Developing User Documentation for Applications Software

    DTIC Science & Technology

    1991-09-01

    The preparation of software documentation is an iterative process that involves research, analysis, design, and testing. The writer must have a solid understanding of the technical aspects of the document being prepared, a writing style designed to match adult reading behaviors, reader-based writing techniques, effective graphics, and reference aids.

  8. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams, the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data but this data is not provided by this document.

  9. Guiding Documents for Environmental Education Centres: An Analysis in the Spanish Context

    ERIC Educational Resources Information Center

    Medir, Rosa Maria; Heras, Raquel; Geli, Anna Maria

    2014-01-01

    Guiding documents under the "PEC" acronym are commonly used in environmental education centres (EECs) in Spain. They are written documents that are seen as necessary tools to safeguard quality. In this study, we analyse the guiding documents of twenty-three EECs in the province of Girona (Catalonia, Spain) in order to understand their…

  10. Geometric rectification of camera-captured document images.

    PubMed

    Liang, Jian; DeMenthon, Daniel; Doermann, David

    2008-04-01

    Compared to typical scanners, handheld cameras offer convenient, flexible, portable, and non-contact image capture, which enables many new applications and breathes new life into existing ones. However, camera-captured documents may suffer from distortions caused by non-planar document shape and perspective projection, which lead to failure of current OCR technologies. We present a geometric rectification framework for restoring the frontal-flat view of a document from a single camera-captured image. Our approach estimates 3D document shape from texture flow information obtained directly from the image without requiring additional 3D/metric data or prior camera calibration. Our framework provides a unified solution for both planar and curved documents and can be applied in many, especially mobile, camera-based document analysis applications. Experiments show that our method produces results that are significantly more OCR compatible than the original images.
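
    The rectification framework above recovers 3D document shape from texture flow and handles curved pages; for the special case of a flat page whose four corners are known, the same frontal-view restoration reduces to a single homography. A minimal OpenCV sketch of that planar case (corner coordinates and file names are hypothetical, and this is not the authors' method):

        import cv2
        import numpy as np

        def rectify_planar_page(image, corners, out_w=850, out_h=1100):
            """Warp a perspective-distorted flat page to a frontal view.
            corners: four (x, y) page corners ordered TL, TR, BR, BL."""
            src = np.float32(corners)
            dst = np.float32([[0, 0], [out_w - 1, 0],
                              [out_w - 1, out_h - 1], [0, out_h - 1]])
            H = cv2.getPerspectiveTransform(src, dst)
            return cv2.warpPerspective(image, H, (out_w, out_h))

        # Hypothetical usage with manually picked corner coordinates
        image = cv2.imread("captured_page.jpg")
        flat = rectify_planar_page(image, [(120, 80), (940, 130), (905, 1240), (90, 1190)])
        cv2.imwrite("rectified_page.png", flat)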

  11. Analysis, review, and documentation of the activation data from LDEF material

    NASA Technical Reports Server (NTRS)

    Laird, C. E.

    1992-01-01

    Samples removed from Long Duration Exposure Facility (LDEF-1) are being studied at various laboratories to determine the specific activity(pCi/kg) produced in orbit by exposure to protons and neutrons in near-Earth orbit. These activities are being corrected for efficiency, self-attenuation, and background. The activities and associated gamma-ray spectra are being collected, analyzed, documented and reviewed by faculty and graduate students at Eastern Kentucky University. The currently available activation results have been tabulated and reviewed in this report. Approximately 500 spectra have been accumulated for future archival and analysis. The effect of the changing satellite orbit on the activation is reported herein and was calculated using more recent estimates of the flux of Van Allen belt protons.

  12. MERRA-2 Input Observations: Summary and Assessment

    NASA Technical Reports Server (NTRS)

    Koster, Randal D. (Editor); McCarty, Will; Coy, Lawrence; Gelaro, Ronald; Huang, Albert; Merkova, Dagmar; Smith, Edmond B.; Sienkiewicz, Meta; Wargan, Krzysztof

    2016-01-01

    The Modern-Era Retrospective Analysis for Research and Applications, Version 2 (MERRA-2) is an atmospheric reanalysis, spanning 1980 through near-realtime, that uses state-of-the-art processing of observations from the continually evolving global observing system. The effectiveness of any reanalysis is a function not only of the input observations themselves, but also of how the observations are handled in the assimilation procedure. Relevant issues to consider include, but are not limited to, data selection, data preprocessing, quality control, bias correction procedures, and blacklisting. As the assimilation algorithm and earth system models are fundamentally fixed in a reanalysis, it is often a change in the character of the observations, and their feedbacks on the system, that cause changes in the character of the reanalysis. It is therefore important to provide documentation of the observing system so that its discontinuities and transitions can be readily linked to discontinuities seen in the gridded atmospheric fields of the reanalysis. With this in mind, this document provides an exhaustive list of the input observations, the context under which they are assimilated, and an initial assessment of selected core observations fundamental to the reanalysis.

  13. 47 CFR 1.1513 - Documentation of fees and expenses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    47 CFR 1.1513 (Applicants): Documentation of fees and expenses. The application shall be accompanied by full documentation of the fees and expenses, including the cost of any study, analysis, engineering report, test...

  14. [Multimodal document management in radiotherapy].

    PubMed

    Fahrner, H; Kirrmann, S; Röhner, F; Schmucker, M; Hall, M; Heinemann, F

    2013-12-01

    After incorporating treatment planning and the organisational model of treatment planning in the operating schedule system (BAS, "Betriebsablaufsystem"), complete document qualities were embedded in the digital environment. The aim of this project was to integrate all documents independent of their source (paper-bound or digital) and to make content from the BAS available in a structured manner. As many workflow steps as possible should be automated, e.g. assigning a document to a patient in the BAS. Additionally it must be guaranteed that at all times it could be traced who, when, how and from which source documents were imported into the departmental system. Furthermore work procedures should be changed so that the documentation conducted either directly in the departmental system or from external systems can be incorporated digitally and paper documents can be completely avoided (e.g. documents such as treatment certificates, treatment plans or documentation). It was a further aim, if possible, to automate the removal of paper documents from the departmental workflow, or even to make such paper documents superfluous. In this way patient letters for follow-up appointments should be generated automatically from the BAS. Similarly patient record extracts in the form of PDF files should be enabled, e.g. for controlling purposes. The available document qualities were analysed in detail by a multidisciplinary working group (BAS-AG) and after this examination and assessment of the possibility of modelling in our departmental workflow (BAS) they were transcribed into a flow diagram. The gathered specifications were implemented in a test environment by the clinical and administrative IT group of the department of radiation oncology and subsequent to a detailed analysis introduced into clinical routine. The department has succeeded under the conditions of the aforementioned criteria to embed all relevant documents in the departmental workflow via continuous processes. Since the

  15. Analysis of IUE Observations of Supernovae

    NASA Technical Reports Server (NTRS)

    Kirshner, Robert P.

    1996-01-01

    This program supported the analysis of IUE observations of supernovae. One aspect was a Target-of-Opportunity program to observe bright supernovae which was applied to SN 1993J in M81, and another was continuing analysis of the IUE data from SN 1987A. Because of its quick response time, the IUE satellite has continued to provide useful data on the ultraviolet spectra of supernovae. Even after the launch of the Hubble Space Telescope, which has much more powerful ultraviolet spectrometers, the IUE has enabled us to obtain early and frequent measurements of ultraviolet radiation: this information has been folded in with our HST data to create unique observations of supernovae which can be interpreted to give powerful constraints on the physical properties of the exploding stars. Our chief result in the present grant period was the completion of a detailed reanalysis of the data on the circumstellar shell of SN 1987A. The presence of narrow high-temperature emission lines from nitrogen-rich gas close to SN 1987A has been the principal observational constraint on the evolution of the supernova's progenitor. Our new analysis shows that the onset of these lines, their rise to maximum, and their subsequent fading can be understood in the context of a model for the photoionization of circumstellar matter.

  16. Determining characteristics of artificial near-Earth objects using observability analysis

    NASA Astrophysics Data System (ADS)

    Friedman, Alex M.; Frueh, Carolin

    2018-03-01

    Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
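
    For the linearized setting referred to above, observability can be checked numerically by stacking the measurement Jacobians propagated through the state transition matrices and testing the rank of the resulting observability matrix. The Python sketch below does this for a toy one-dimensional position/velocity state with position-only measurements; it is only an illustration of the linear test, not the paper's nonlinear orbit analysis.

        import numpy as np

        def observability_matrix(Phi_list, H_list):
            """Stack H_k @ Phi(t_k, t_0); the state is observable in the linear
            sense when the stacked matrix has full column rank."""
            return np.vstack([H @ Phi for Phi, H in zip(Phi_list, H_list)])

        def phi(dt):
            return np.array([[1.0, dt], [0.0, 1.0]])   # state transition matrix

        H = np.array([[1.0, 0.0]])                      # measure position only
        times = [0.0, 10.0, 20.0]
        O = observability_matrix([phi(t) for t in times], [H] * len(times))
        print(np.linalg.matrix_rank(O))   # 2: position and velocity are both observable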

  17. An International Coordinated Effort to Further the Documentation & Development of Quality Assurance, Quality Control, and Best Practices for Oceanographic Observations

    NASA Astrophysics Data System (ADS)

    Bushnell, M.; Waldmann, C.; Hermes, J.; Tamburri, M.

    2017-12-01

    Many oceanographic observation groups create and maintain QA, QC, and best practices (BP) to ensure efficient and accurate data collection and quantify quality. Several entities - IOOS® QARTOD, AtlantOS, ACT, WMO/IOC JCOMM OCG - have joined forces to document existing practices, identify gaps, and support development of emerging techniques. While each group has a slightly different focus, many underlying QA/QC/BP needs can be quite common. QARTOD focuses upon real-time data QC, and has produced manuals that address QC tests for eleven ocean variables. AtlantOS is a research and innovation project working towards the integration of ocean-observing activities across all disciplines in the Atlantic Basin. ACT brings together research institutions, resource managers, and private companies to foster the development and adoption of effective and reliable sensors for coastal, freshwater, and ocean environments. JCOMM promotes broad international coordination of oceanographic and marine meteorological observations and data management and services. Leveraging existing efforts of these organizations is an efficient way to consolidate available information, develop new practices, and evaluate the use of ISO standards to judge the quality of measurements. ISO standards may offer accepted support for a framework for an ocean data quality management system, similar to the meteorological standards defined by WMO (https://www.wmo.int/pages/prog/arep/gaw/qassurance.html). We will first cooperatively develop a plan to create a QA/QC/BP manual. The resulting plan will describe the need for such a manual, the extent of the manual, the process used to engage the community in creating it, the maintenance of the resultant document, and how these things will be done. It will also investigate standards for metadata. The plan will subsequently be used to develop the QA/QC/BP manual, providing guidance which advances the standards adopted by IOOS, AtlantOS, JCOMM, and others.

  18. Accuracy of Vasopressor Documentation in Anesthesia Records.

    PubMed

    Wax, David B; Feit, Justin B

    2016-06-01

    To determine the accuracy of documentation of vasoactive medication administration in anesthetic records. Cross-sectional observational study. Single academic center. Attending and resident anesthesiologists. None. An auditor inspected the anesthesia worktop between cases looking for partially used syringes of vasopressors, and the anesthesia record for the preceding case was reviewed for entries related to administration of these agents. In 100 anesthesia records for cases in which a phenylephrine and/or ephedrine bolus was apparently administered, 26% (95% CI: 18-35%) had full documentation and 36% (95% CI: 27-46%) had no documentation. In the 38% of cases that had partial documentation, a median of 50% (interquartile range 33%, 67%) of the total amounts given were documented. The authors found complete or partial omission of documentation of bolus doses of vasopressors in anesthesia records in the majority of cases in which such drugs were given. This finding has the potential to jeopardize the data integrity of local and pooled case registries and conclusions of retrospective studies that utilize these data. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. The Intended Curriculum in Co-operative Education in Ontario Secondary Schools: An Analysis of School District Documents.

    ERIC Educational Resources Information Center

    Hutchinson, Nancy L.; Munby, Hugh; Chin, Peter; Edwards, Karol Lyn; Steiner-Bell, Karin; Chapman, Christine; Ho, Katherine; de Espana, Wendy Mills

    2001-01-01

    Analysis of cooperative education policy documents from nine Ontario school districts indicated that statements about evaluation, remediation, equity, and teacher qualifications were inconsistent. Although the Ministry of Education and Training prescribes co-op for delivery of academic subjects, districts focus exclusively on career preparation…

  20. Scientific Contributions to GEO Global Earth Observation Priorities

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Ledrew, E.

    2009-12-01

    Numerous countries and non-governmental organizations have produced documents, held workshops, and published reports in the past decade that identify Earth observation needs to meet their particular objectives. The Group on Earth Observations (GEO) has conducted a review of these documents, workshops, and reports to identify the priority observations common to many societal benefit areas. GEO has made a concerted effort to include materials from a broad range of user types, including scientific researchers, resource managers, and policy makers. GEO has also sought an international breadth in the materials reviewed, including observation priorities from developing countries. The activity will help GEO optimize the observations in GEOSS that are most likely to provide societal benefits, and GEO members will use the results of this meta-analysis to support investment decisions. The Earth observations in GEOSS serve scientific research and applications endeavors. As a primary user of ground-based, airborne, in situ, and space-based observations of the Earth, the scientific community has a significant voice and vested interest in the observations offered through GEOSS. Furthermore, the science and technology community will have opportunities to identify critical scientific/technological advances needed to produce any observations that are needed yet not currently available. In this paper, we will discuss this GEO effort to identify Earth observations priorities. We will present initial findings for some societal benefit areas and the overall meta-analysis. We will also discuss possible roles for the science and technology community to contribute to those priorities, such as scientific advances needed to achieve the observations or to realize societal benefits from the observations.

  1. Learning Documentations in VET Systems: An Analysis of Current Swiss Practices

    ERIC Educational Resources Information Center

    Caruso, Valentina; Cattaneo, Alberto; Gurtner, Jean-Luc

    2016-01-01

    Swiss vocational education and training (VET) is defined as a dual-track system where apprentices weekly alternate between vocational school and a (real) workplace. At the workplace, they have to keep a learning documentation throughout their training, in which they are expected to regularly document their professional development. The actual use…

  2. Documentation of structures branch programs and program updates. Project 3200

    NASA Technical Reports Server (NTRS)

    Probe, D. G.

    1975-01-01

    Update programming of applications programs for the integrated structural analysis system is reported. An attempt is made to lay out a standard document format for the preparation of program documents. Documentation which involves changes, additions, and I/O capability revisions to existing programs includes a checklist which should be reviewed each time a programming effort is documented.

  3. Electronic Medical Record Documentation of Driving Safety for Veterans with Diagnosed Dementia.

    PubMed

    Vair, Christina L; King, Paul R; Gass, Julie; Eaker, April; Kusche, Anna; Wray, Laura O

    2018-01-01

    Many older adults continue to drive following dementia diagnosis, with medical providers increasingly likely to be involved in addressing such safety concerns. This study examined electronic medical record (EMR) documentation of driving safety for veterans with dementia (N = 118) seen in Veterans Affairs primary care and interdisciplinary geriatrics clinics in one geographic region over a 10-year period. Qualitative directed content analysis of retrospective EMR data. Assessment of known risk factors or subjective concerns for unsafe driving were documented in fewer than half of observed cases; specific recommendations for driving safety were evident for a minority of patients, with formal driving evaluation the most frequently documented recommendation by providers. Utilizing data from actual clinical encounters provides a unique snapshot of how driving risk and safety concerns are addressed for veterans with dementia. This information provides a meaningful frame of reference for understanding potential strengths and possible gaps in how this important topic area is being addressed in the course of clinical care. The EMR is an important forum for interprofessional communication, with documentation of driving risk and safety concerns an essential element for continuity of care and ensuring consistency of information delivered to patients and caregivers.

  4. The Influence of Observation Errors on Analysis Error and Forecast Skill Investigated with an Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Prive, N. C.; Errico, R. M.; Tai, K.-S.

    2013-01-01

    The Global Modeling and Assimilation Office (GMAO) observing system simulation experiment (OSSE) framework is used to explore the response of analysis error and forecast skill to observation quality. In an OSSE, synthetic observations may be created that have much smaller error than real observations, and precisely quantified error may be applied to these synthetic observations. Three experiments are performed in which synthetic observations with magnitudes of applied observation error that vary from zero to twice the estimated realistic error are ingested into the Goddard Earth Observing System Model (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation for a one-month period representing July. The analysis increment and observation innovation are strongly impacted by observation error, with much larger variances for increased observation error. The analysis quality is degraded by increased observation error, but the change in root-mean-square error of the analysis state is small relative to the total analysis error. Surprisingly, in the 120 hour forecast increased observation error only yields a slight decline in forecast skill in the extratropics, and no discernable degradation of forecast skill in the tropics.
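
    In an OSSE of the kind described above, the synthetic observations are drawn from a nature run and perturbed with a prescribed amount of error, so scaling that error from zero to twice the realistic estimate is a one-parameter change. A minimal Python sketch of the perturbation step (illustrative values only, not the GMAO framework):

        import numpy as np

        def synthetic_observations(nature_run_values, sigma_realistic, error_scale, seed=0):
            """Perturb nature-run 'truth' with Gaussian observation error whose standard
            deviation is error_scale * sigma_realistic (0, 1, or 2 in the experiments)."""
            rng = np.random.default_rng(seed)
            noise = rng.normal(scale=error_scale * sigma_realistic,
                               size=np.shape(nature_run_values))
            return np.asarray(nature_run_values) + noise

        truth = np.linspace(280.0, 300.0, 5)      # hypothetical temperatures (K)
        for scale in (0.0, 1.0, 2.0):
            print(scale, synthetic_observations(truth, sigma_realistic=1.2, error_scale=scale))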

  5. Perspectives: Using Historical Documents To Think about NIF Issues.

    ERIC Educational Resources Information Center

    National Archives and Records Service (GSA), Washington, DC.

    The purpose of using historical documents in the classroom is to generate and enhance discussion by providing a historical perspective for issues. Five documents are included in this packet and are to be used as a supplemental material for the National Issues Forum (NIF) topics. Issues raised include (1) an analysis of the documents and (2)…

  6. Design and Documentation: The State of the Art.

    ERIC Educational Resources Information Center

    Gibbons, Andrew S.

    1998-01-01

    Although the trend is for less documentation, this article argues that more is needed to help in the analysis of design failure in instructional design. Presents arguments supporting documented design, including error recognition and correction, verification of completeness and soundness, sharing of new design principles, modifiability, error…

  7. Ethics, Power, Internationalisation and the Postcolonial: A Foucauldian Discourse Analysis of Policy Documents in Two Scottish Universities

    ERIC Educational Resources Information Center

    Guion Akdag, Emma; Swanson, Dalene M.

    2018-01-01

    This paper provides a critical discussion of internationalisation in Higher Education (HE), and exemplifies a process of uncovering the investments in power and ideology through the partial analysis of four strategic internationalisation documents at two Scottish Higher Education institutions, as part of an ongoing international study into the…

  8. FLAMMABLE GAS TECHNICAL BASIS DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KRIPPS, L.J.

    2005-02-18

    This document describes the qualitative evaluation of frequency and consequences for double shell tank (DST) and single shell tank (SST) representative flammable gas accidents and associated hazardous conditions without controls. The evaluation indicated that safety-significant SSCs and/or TSRs were required to prevent or mitigate flammable gas accidents. Discussion of the resulting control decisions is included. This technical basis document was developed to support the Tank Farms Documented Safety Analysis (DSA) and describes the risk binning process for the flammable gas representative accidents and associated represented hazardous conditions. The purpose of the risk binning process is to determine the need for safety-significant structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls for a given representative accident or represented hazardous condition based on an evaluation of the event frequency and consequence.

  9. Three-Dimensional Display Of Document Set

    DOEpatents

    Lantrip, David B.; Pennock, Kelly A.; Pottier, Marc C.; Schur, Anne; Thomas, James J.; Wise, James A.

    2003-06-24

    A method for spatializing text content for enhanced visual browsing and analysis. The invention is applied to large text document corpora such as digital libraries, regulations and procedures, archived reports, and the like. The text content from these sources may be transformed to a spatial representation that preserves informational characteristics from the documents. The three-dimensional representation may then be visually browsed and analyzed in ways that avoid language processing and that reduce the analysts' effort.
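
    As a hedged illustration of what "spatializing" a text corpus can look like in practice (not the patented method), the sketch below maps documents to TF-IDF vectors and projects them to three coordinates that could be plotted and browsed. The example corpus and component count are arbitrary.

    ```python
    # Represent each document as a TF-IDF vector, then reduce to 3-D coordinates
    # that preserve coarse similarity structure for visual browsing.
    from sklearn.decomposition import TruncatedSVD
    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = [
        "regulations and procedures for waste storage tanks",
        "archived report on tank farm safety analysis",
        "digital library of environmental impact documents",
        "procedures manual for document control and archiving",
    ]
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
    coords = TruncatedSVD(n_components=3, random_state=0).fit_transform(tfidf)
    for doc, (x, y, z) in zip(docs, coords):
        print(f"({x:+.2f}, {y:+.2f}, {z:+.2f})  {doc[:45]}")
    ```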

  10. Three-dimensional display of document set

    DOEpatents

    Lantrip, David B [Oxnard, CA; Pennock, Kelly A [Richland, WA; Pottier, Marc C [Richland, WA; Schur, Anne [Richland, WA; Thomas, James J [Richland, WA; Wise, James A [Richland, WA

    2006-09-26

    A method for spatializing text content for enhanced visual browsing and analysis. The invention is applied to large text document corpora such as digital libraries, regulations and procedures, archived reports, and the like. The text content from these sources may be transformed to a spatial representation that preserves informational characteristics from the documents. The three-dimensional representation may then be visually browsed and analyzed in ways that avoid language processing and that reduce the analysts' effort.

  11. Three-dimensional display of document set

    DOEpatents

    Lantrip, David B [Oxnard, CA; Pennock, Kelly A [Richland, WA; Pottier, Marc C [Richland, WA; Schur, Anne [Richland, WA; Thomas, James J [Richland, WA; Wise, James A [Richland, WA

    2001-10-02

    A method for spatializing text content for enhanced visual browsing and analysis. The invention is applied to large text document corpora such as digital libraries, regulations and procedures, archived reports, and the like. The text content from these sources may be transformed to a spatial representation that preserves informational characteristics from the documents. The three-dimensional representation may then be visually browsed and analyzed in ways that avoid language processing and that reduce the analysts' effort.

  12. Three-dimensional display of document set

    DOEpatents

    Lantrip, David B [Oxnard, CA; Pennock, Kelly A [Richland, WA; Pottier, Marc C [Richland, WA; Schur, Anne [Richland, WA; Thomas, James J [Richland, WA; Wise, James A [Richland, WA; York, Jeremy [Bothell, WA

    2009-06-30

    A method for spatializing text content for enhanced visual browsing and analysis. The invention is applied to large text document corpora such as digital libraries, regulations and procedures, archived reports, and the like. The text content from these sources may be transformed to a spatial representation that preserves informational characteristics from the documents. The three-dimensional representation may then be visually browsed and analyzed in ways that avoid language processing and that reduce the analysts' effort.

  13. Cloud Properties Derived From GOES-7 for Spring 1994 ARM Intensive Observing Period Using Version 1.0.0 of ARM Satellite Data Analysis Program

    NASA Technical Reports Server (NTRS)

    Minnis, Patrick; Smith, William L., Jr.; Garber, Donald P.; Ayers, J. Kirk; Doelling, David R.

    1995-01-01

    This document describes the initial formulation (Version 1.0.0) of the Atmospheric Radiation Measurement (ARM) program satellite data analysis procedures. Techniques are presented for calibrating geostationary satellite data with Sun-synchronous satellite radiances and for converting narrowband radiances to top-of-the-atmosphere fluxes and albedos. A methodology is documented for combining geostationary visible and infrared radiances with surface-based temperature observations to derive cloud amount, optical depth, height, thickness, temperature, and albedo. The analysis is limited to two grids centered over the ARM Southern Great Plains central facility in north-central Oklahoma. Daytime data taken during 5 April - 1 May 1994 were analyzed on the 0.3 deg and 0.5 deg latitude-longitude grids that cover areas of 0.9 deg x 0.9 deg and 10 deg x 14 deg, respectively. Conditions ranging from scattered low cumulus to thin cirrus and thick cumulonimbus occurred during the study period. Detailed comparisons with hourly surface observations indicate that the mean cloudiness is within a few percent of the surface-derived sky cover. Formats of the results are also provided. The data can be accessed through the World Wide Web computer network.

  14. Market Analysis and Consumer Impacts Source Document. Part I. The Motor Vehicle Market in the Late 1970's

    DOT National Transportation Integrated Search

    1980-12-01

    The source document on motor vehicle market analysis and consumer impact consists of three parts. Part I is an integrated overview of the motor vehicle market in the late 1970's, with sections on the structure of the market, motor vehicle trends, con...

  15. Analysis of the Arctic system for freshwater cycle intensification: Observations and expectations

    USGS Publications Warehouse

    Rawlins, M.A.; Steele, M.; Holland, M.M.; Adam, J.C.; Cherry, J.E.; Francis, J.A.; Groisman, P.Y.; Hinzman, L.D.; Huntington, T.G.; Kane, D.L.; Kimball, J.S.; Kwok, R.; Lammers, R.B.; Lee, C.M.; Lettenmaier, D.P.; McDonald, K.C.; Podest, E.; Pundsack, J.W.; Rudels, B.; Serreze, Mark C.; Shiklomanov, A.; Skagseth, O.; Troy, T.J.; Vorosmarty, C.J.; Wensnahan, M.; Wood, E.F.; Woodgate, R.; Yang, D.; Zhang, K.; Zhang, T.

    2010-01-01

    Hydrologic cycle intensification is an expected manifestation of a warming climate. Although positive trends in several global average quantities have been reported, no previous studies have documented broad intensification across elements of the Arctic freshwater cycle (FWC). In this study, the authors examine the character and quantitative significance of changes in annual precipitation, evapotranspiration, and river discharge across the terrestrial pan-Arctic over the past several decades from observations and a suite of coupled general circulation models (GCMs). Trends in freshwater flux and storage derived from observations across the Arctic Ocean and surrounding seas are also described. With few exceptions, precipitation, evapotranspiration, and river discharge fluxes from observations and the GCMs exhibit positive trends. Significant positive trends above the 90% confidence level, however, are not present for all of the observations. Greater confidence in the GCM trends arises through lower interannual variability relative to trend magnitude. Put another way, intrinsic variability in the observations tends to limit confidence in trend robustness. Ocean fluxes are less certain, primarily because of the lack of long-term observations. Where available, salinity and volume flux data suggest some decrease in saltwater inflow to the Barents Sea (i.e., a decrease in freshwater outflow) in recent decades. A decline in freshwater storage across the central Arctic Ocean and suggestions that large-scale circulation plays a dominant role in freshwater trends raise questions as to whether Arctic Ocean freshwater flows are intensifying. Although oceanic fluxes of freshwater are highly variable and consistent trends are difficult to verify, the other components of the Arctic FWC do show consistent positive trends over recent decades. The broad-scale increases provide evidence that the Arctic FWC is experiencing intensification. Efforts that aim to develop an adequate
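
    A hedged sketch of the kind of trend test implied above, applied to a synthetic annual series: fit a linear trend and ask whether a positive slope is significant at the 90% confidence level. The study's exact statistical procedure and data are not reproduced here.

    ```python
    # Linear trend and significance test for an annual flux series (synthetic data).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    years = np.arange(1970, 2009)
    precip = 400.0 + 0.8 * (years - years[0]) + rng.normal(0.0, 15.0, years.size)

    fit = stats.linregress(years, precip)
    positive_at_90 = fit.slope > 0 and fit.pvalue < 0.10
    print(f"slope = {fit.slope:.2f} mm/yr per year, p = {fit.pvalue:.3f}, "
          f"positive trend at the 90% level: {positive_at_90}")
    ```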

  16. Hospital mainframe computer documentation of pharmacist interventions.

    PubMed

    Schumock, G T; Guenette, A J; Clark, T; McBride, J M

    1993-07-01

    The hospital mainframe computer pharmacist intervention documentation system described has successfully facilitated the recording, communication, analysis, and reporting of interventions at our hospital. It has proven to be time efficient, accessible, and user-friendly from the standpoint of both the pharmacist and administrator. The advantages of this system greatly outweigh those of manual documentation and justify the initial time investment in its design and development. In the future, it is hoped that the system can have an even broader impact. Documented interventions and recommendations can be made accessible to medical and nursing staff, further increasing interdepartmental communication. As pharmacists embrace the pharmaceutical care mandate, documenting interventions in patient care will continue to grow in importance. Complete documentation is essential if pharmacists are to assume responsibility for patient outcomes. With time at an ever-increasing premium, and with economic and human resources dwindling, an efficient and effective means of recording and tracking pharmacist interventions will become imperative for survival in the fiscally challenged health care arena. Documentation of pharmacist intervention using a hospital mainframe computer at UIH has proven both efficient and effective.

  17. Prototyping a bedside documentation system.

    PubMed

    Bachand, P; Bobis, K

    1993-01-01

    The implementation of a comprehensive bedside documentation system is a major project that demands careful analysis and planning. Since the cost of a typical bedside system can easily exceed $3 million, a design oversight could have disastrous effects on the benefits of the system.

  18. Electronic reminders improve procedure documentation compliance and professional fee reimbursement.

    PubMed

    Kheterpal, Sachin; Gupta, Ruchika; Blum, James M; Tremper, Kevin K; O'Reilly, Michael; Kazanjian, Paul E

    2007-03-01

    Medicolegal, clinical, and reimbursement needs warrant complete and accurate documentation. We sought to identify and improve our compliance rate for the documentation of arterial catheterization in the perioperative setting. We first reviewed 12 mo of electronic anesthesia records to establish a baseline compliance rate for arterial catheter documentation. Residents and Certified Registered Nurse Anesthetists were randomly assigned to a control group and experimental group. When surgical incision and anesthesia end were documented in the electronic record keeper, a reminder routine checked for an invasive arterial blood pressure tracing. If a case used an arterial catheter, but no procedure note was observed, the resident or Certified Registered Nurse Anesthetist assigned to the case was sent an automated alphanumeric pager and e-mail reminder. Providers in the control group received no pager or e-mail message. After 2 mo, all staff received the reminders. A baseline compliance rate of 80% was observed (1963 of 2459 catheters documented). During the 2-mo study period, providers in the control group documented 152 of 202 (75%) arterial catheters, and the experimental group documented 177 of 201 (88%) arterial lines (P < 0.001). After all staff began receiving reminders, 309 of 314 arterial lines were documented in a subsequent 2 mo period (98%). Extrapolating this compliance rate to 12 mo of expected arterial catheter placement would result in an annual incremental $40,500 of professional fee reimbursement. The complexity of the tertiary care process results in documentation deficiencies. Inexpensive automated reminders can drastically improve compliance without the need for complicated negative or positive feedback.
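
    The decision rule described above is simple enough to sketch; the version below uses hypothetical record fields and a print statement in place of the pager/e-mail gateway, so it illustrates the logic rather than the production system.

    ```python
    # Reminder rule: if the case is ending, an arterial pressure tracing exists,
    # and no arterial-line procedure note was filed, page the assigned provider.
    from dataclasses import dataclass

    @dataclass
    class CaseRecord:
        case_id: str
        provider_pager: str
        has_arterial_trace: bool        # invasive arterial blood pressure tracing
        has_art_line_note: bool         # arterial catheter procedure note present
        anesthesia_end_documented: bool

    def send_reminder(pager: str, message: str) -> None:
        print(f"PAGE {pager}: {message}")        # stand-in for pager/e-mail gateway

    def check_case(case: CaseRecord) -> None:
        if (case.anesthesia_end_documented and case.has_arterial_trace
                and not case.has_art_line_note):
            send_reminder(case.provider_pager,
                          f"Case {case.case_id}: arterial line used but not documented.")

    check_case(CaseRecord("A-102", "1234", True, False, True))   # reminder sent
    check_case(CaseRecord("A-103", "5678", True, True, True))    # compliant, silent
    ```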

  19. Chandra Interactive Analysis of Observations (CIAO)

    NASA Technical Reports Server (NTRS)

    Dobrzycki, Adam

    2000-01-01

    The Chandra (formerly AXAF) telescope, launched on July 23, 1999, provides X-ray data with unprecedented spatial and spectral resolution. As part of the Chandra scientific support, the Chandra X-ray Observatory Center provides a new data analysis system, CIAO ("Chandra Interactive Analysis of Observations"). We will present the main components of the system: "First Look" analysis; SHERPA, a multi-dimensional, multi-mission modeling and fitting application; the Chandra Imaging and Plotting System; the Detect package of source-detection algorithms; and the DM package of generic data manipulation tools. We will set up a demonstration of the portable version of the system and show examples of Chandra data analysis.

  20. Computer-assisted handwriting style identification system for questioned document examination

    NASA Astrophysics Data System (ADS)

    Cha, Sung-Hyuk; Yoon, Sungsoo; Tappert, Charles C.; Lee, Yillbyung

    2005-03-01

    Handwriting originates from a particular copybook style such as Palmer or Zaner-Bloser that one learns in childhood. Since questioned document examination plays an important investigative and forensic role in many types of crime, it is important to develop a system that helps objectively identify a questioned document's handwriting style. Here, we propose a computer vision system that can assist a document examiner in the identification of a writer's handwriting style and therefore the origin or nationality of an unknown writer of a questioned document. We collected 33 Roman alphabet copybook styles from 18 countries. Each character in a questioned document is segmented and matched against all of the 33 handwriting copybook styles. The more characters present in the questioned document, the higher the accuracy observed.
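
    A toy sketch of the matching idea, under strong simplifying assumptions: each segmented character is scored against stored copybook templates and the per-character decisions are pooled by majority vote. Real systems compare rich image features; the tiny binary grids and style entries here are only placeholders.

    ```python
    # Nearest-template matching per character, then a vote across the document.
    from collections import Counter

    import numpy as np

    copybook_templates = {                      # style -> {character: template}
        "Palmer":       {"a": np.array([[0, 1], [1, 1]])},
        "Zaner-Bloser": {"a": np.array([[1, 1], [0, 1]])},
    }

    def nearest_style(char: str, glyph: np.ndarray) -> str:
        dists = {style: int(np.abs(glyph - tmpl[char]).sum())
                 for style, tmpl in copybook_templates.items() if char in tmpl}
        return min(dists, key=dists.get)

    def identify_document_style(segmented_chars) -> str:
        votes = Counter(nearest_style(c, g) for c, g in segmented_chars)
        return votes.most_common(1)[0][0]

    doc_chars = [("a", np.array([[0, 1], [1, 1]])), ("a", np.array([[0, 1], [1, 0]]))]
    print(identify_document_style(doc_chars))   # more characters -> a stronger vote
    ```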

  1. Self-authentication of value documents

    NASA Astrophysics Data System (ADS)

    Hayosh, Thomas D.

    1998-04-01

    To prevent fraud it is critical to distinguish an authentic document from a counterfeit or altered document. Most current technologies rely on difficult-to-print human detectable features which are added to a document to prevent illegal reproduction. Fraud detection is mostly accomplished by human observation and is based upon the examiner's knowledge, experience and time allotted for examination of a document. Another approach to increasing the security of a value document is to add a unique property to each document. Data about that property is then encoded on the document itself and finally secured using a public key based digital signature. In such a scheme, machine readability of authenticity is possible. This paper describes a patent-applied-for methodology using the unique property of magnetic ink printing, magnetic remanence, that provides for full self-authentication when used with a recordable magnetic stripe for storing a digital signature and other document data. Traditionally the authenticity of a document is determined by physical examination for color, background printing, paper texture, printing resolution, and ink characteristics. On an initial level, there may be numerous security features present on a value document but only a few can be detected and evaluated by the untrained individual. Because security features are normally not standardized except on currency, training tellers and cashiers to do extensive security evaluation is not practical, even though these people are often the only people who get a chance to closely examine the document in a payment system which is back-end automated. In the context of this paper, one should be thinking about value documents such as commercial and personal checks although the concepts presented here can easily be applied to travelers cheques, credit cards, event tickets, passports, driver's licenses, motor vehicle titles, and even currency. For a practical self-authentication system, the false alarms
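
    The general sign-and-verify pattern behind such a scheme can be sketched with standard public-key primitives; the snippet below binds the document fields and a placeholder "unique property" reading under an RSA signature using the cryptography package. It is a sketch of the pattern only, not the patented magnetic-remanence methodology.

    ```python
    # Bind document data plus a measured unique property under a digital signature,
    # then verify it as a reader device at the point of acceptance would.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    document_fields = b"payee=ACME;amount=125.00;serial=0045731"
    remanence_profile = b"\x12\x9a\x3c\x07"     # placeholder unique-property reading
    payload = document_fields + b"|" + remanence_profile

    signature = private_key.sign(payload, padding.PKCS1v15(), hashes.SHA256())

    # Raises InvalidSignature if the stripe data or signature were altered.
    public_key.verify(signature, payload, padding.PKCS1v15(), hashes.SHA256())
    print("document authenticated")
    ```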

  2. In Search of Social Translucence: An Audit Log Analysis of Handoff Documentation Views and Updates.

    PubMed

    Jiang, Silis Y; Hum, R Stanley; Vawdrey, David; Mamykina, Lena

    2015-01-01

    Communication and information sharing are critical parts of teamwork in the hospital; however, achieving open and fluid communication can be challenging. Finding specific patient information within documentation can be difficult. Recent studies on handoff documentation tools show that resident handoff notes are increasingly used as an alternative information source by non-physician clinicians. Previous findings also show that residents have become aware of this unintended use. This study investigated the alignment of resident note updating patterns and team note viewing patterns based on usage log data of handoff notes. Qualitative interviews with clinicians were used to triangulate findings based on the log analysis. The study found that notes that were frequently updated were viewed significantly more frequently than notes updated less often (p < 2.2 × 10^-16). Almost 44% of all notes had aligned frequency of views and updates. The considerable percentage (56%) of mismatched note utilization suggests an opportunity for improvement.
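
    One way to reproduce the flavour of that comparison on synthetic counts is sketched below: view counts of frequently updated notes are compared with those of rarely updated notes using a Mann-Whitney U test. The real study's data and exact statistical procedure are not reproduced here.

    ```python
    # Are frequently updated handoff notes viewed more often? (synthetic counts)
    import numpy as np
    from scipy.stats import mannwhitneyu

    rng = np.random.default_rng(2)
    views_frequent_updates = rng.poisson(12, size=200)
    views_rare_updates = rng.poisson(6, size=200)

    stat, p = mannwhitneyu(views_frequent_updates, views_rare_updates,
                           alternative="greater")
    print(f"U = {stat:.0f}, p = {p:.2e}")
    ```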

  3. Document cards: a top trumps visualization for documents.

    PubMed

    Strobelt, Hendrik; Oelke, Daniela; Rohrdantz, Christian; Stoffel, Andreas; Keim, Daniel A; Deussen, Oliver

    2009-01-01

    Finding suitable, less space-consuming views for a document's main content is crucial to provide convenient access to large document collections on display devices of different size. We present a novel compact visualization which represents the document's key semantics as a mixture of images and important key terms, similar to cards in a top trumps game. The key terms are extracted using an advanced text mining approach based on a fully automatic document structure extraction. The images and their captions are extracted using a graphical heuristic and the captions are used for a semi-semantic image weighting. Furthermore, we use the image color histogram for classification and show at least one representative from each non-empty image class. The approach is demonstrated for the IEEE InfoVis publications of a complete year. The method can easily be applied to other publication collections and sets of documents which contain images.
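
    The image-grouping step lends itself to a small illustration: describe each extracted figure by a coarse per-channel colour histogram and cluster the histograms, so a representative of each non-empty class can be shown on the card. The synthetic images, bin count, and number of clusters below are arbitrary choices, not the authors' pipeline.

    ```python
    # Coarse RGB histograms as image descriptors, clustered with k-means.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)

    def color_histogram(img, bins=4):
        """Concatenated per-channel histogram, normalised to sum to one."""
        hists = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0]
                 for c in range(3)]
        h = np.concatenate(hists).astype(float)
        return h / h.sum()

    images = [rng.integers(0, 256, size=(32, 32, 3)) for _ in range(12)]
    features = np.array([color_histogram(im) for im in images])
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
    print(labels)   # e.g. show the first image of each cluster on the document card
    ```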

  4. Integrated computer-aided forensic case analysis, presentation, and documentation based on multimodal 3D data.

    PubMed

    Bornik, Alexander; Urschler, Martin; Schmalstieg, Dieter; Bischof, Horst; Krauskopf, Astrid; Schwark, Thorsten; Scheurer, Eva; Yen, Kathrin

    2018-06-01

    Three-dimensional (3D) crime scene documentation using 3D scanners and medical imaging modalities like computed tomography (CT) and magnetic resonance imaging (MRI) are increasingly applied in forensic casework. Together with digital photography, these modalities enable comprehensive and non-invasive recording of forensically relevant information regarding injuries/pathologies inside the body and on its surface. Furthermore, it is possible to capture traces and items at crime scenes. Such digitally secured evidence has the potential to similarly increase case understanding by forensic experts and non-experts in court. Unlike photographs and 3D surface models, images from CT and MRI are not self-explanatory. Their interpretation and understanding requires radiological knowledge. Findings in tomography data must not only be revealed, but should also be jointly studied with all the 2D and 3D data available in order to clarify spatial interrelations and to optimally exploit the data at hand. This is technically challenging due to the heterogeneous data representations including volumetric data, polygonal 3D models, and images. This paper presents a novel computer-aided forensic toolbox providing tools to support the analysis, documentation, annotation, and illustration of forensic cases using heterogeneous digital data. Conjoint visualization of data from different modalities in their native form and efficient tools to visually extract and emphasize findings help experts to reveal unrecognized correlations and thereby enhance their case understanding. Moreover, the 3D case illustrations created for case analysis represent an efficient means to convey the insights gained from case analysis to forensic non-experts involved in court proceedings like jurists and laymen. The capability of the presented approach in the context of case analysis, its potential to speed up legal procedures and to ultimately enhance legal certainty is demonstrated by introducing a number of

  5. Medical emergencies on board commercial airlines: is documentation as expected?

    PubMed Central

    2012-01-01

    Introduction The purpose of this study was to perform a descriptive, content-based analysis on the different forms of documentation for in-flight medical emergencies that are currently provided in the emergency medical kits on board commercial airlines. Methods Passenger airlines in the World Airline Directory were contacted between March and May 2011. For each participating airline, sample in-flight medical emergency documentation forms were obtained. All items in the sample documentation forms were subjected to a descriptive analysis and compared to a sample "medical incident report" form published by the International Air Transport Association (IATA). Results A total of 1,318 airlines were contacted. Ten airlines agreed to participate in the study and provided a copy of their documentation forms. A descriptive analysis revealed a total of 199 different items, which were summarized into five sub-categories: non-medical data (63), signs and symptoms (68), diagnosis (26), treatment (22) and outcome (20). Conclusions The data in this study illustrate a large variation in the documentation of in-flight medical emergencies by different airlines. A higher degree of standardization is preferable to increase the data quality in epidemiologic aeromedical research in the future. PMID:22397530

  6. Data Provenance in Photogrammetry Through Documentation Protocols

    NASA Astrophysics Data System (ADS)

    Carboni, N.; Bruseker, G.; Guillem, A.; Bellido Castañeda, D.; Coughenour, C.; Domajnko, M.; de Kramer, M.; Ramos Calles, M. M.; Stathopoulou, E. K.; Suma, R.

    2016-06-01

    Documenting the relevant aspects in digitisation processes such as photogrammetry in order to provide a robust provenance for their products continues to present a challenge. The creation of a product that can be re-used scientifically requires a framework for consistent, standardised documentation of the entire digitisation pipeline. This article provides an analysis of the problems inherent to such goals and presents a series of protocols to document the various steps of a photogrammetric workflow. We propose this pipeline, with descriptors to track all phases of digital product creation in order to assure data provenance and enable the validation of the operations from an analytic and production perspective. The approach aims to support adopters of the workflow in defining procedures with a long-term perspective. The conceptual schema we present is founded on an analysis of information and actor exchanges in the digitisation process. The metadata were defined through the synthesis of previous proposals in this area and were tested on a case study. We performed the digitisation of a set of cultural heritage artefacts from an Iron Age burial in Ilmendorf, Germany. The objects were captured and processed using different techniques, including a comparison of different imaging tools and algorithms. This augmented the complexity of the process, allowing us to test the flexibility of the schema for documenting complex scenarios. Although we have only presented a photogrammetry digitisation scenario, we claim that our schema is easily applicable to a multitude of 3D documentation processes.

  7. Public versus internal conceptions of addiction: An analysis of internal Philip Morris documents.

    PubMed

    Elias, Jesse; Hendlin, Yogi Hale; Ling, Pamela M

    2018-05-01

    Tobacco addiction is a complex, multicomponent phenomenon stemming from nicotine's pharmacology and the user's biology, psychology, sociology, and environment. After decades of public denial, the tobacco industry now agrees with public health authorities that nicotine is addictive. In 2000, Philip Morris became the first major tobacco company to admit nicotine's addictiveness. Evolving definitions of addiction have historically affected subsequent policymaking. This article examines how Philip Morris internally conceptualized addiction immediately before and after this announcement. We analyzed previously secret, internal Philip Morris documents made available as a result of litigation against the tobacco industry. We compared these documents to public company statements and found that Philip Morris's move from public denial to public affirmation of nicotine's addictiveness coincided with pressure on the industry from poor public approval ratings, the Master Settlement Agreement (MSA), the United States government's filing of the Racketeer Influenced and Corrupt Organizations (RICO) suit, and the Institute of Medicine's (IoM's) endorsement of potentially reduced risk products. Philip Morris continued to research the causes of addiction through the 2000s in order to create successful potentially reduced exposure products (PREPs). While Philip Morris's public statements reinforce the idea that nicotine's pharmacology principally drives smoking addiction, company scientists framed addiction as the result of interconnected biological, social, psychological, and environmental determinants, with nicotine as but one component. Due to the fragmentary nature of the industry document database, we may have missed relevant information that could have affected our analysis. Philip Morris's research suggests that tobacco industry activity influences addiction treatment outcomes. Beyond nicotine's pharmacology, the industry's continued aggressive advertising, lobbying, and

  8. The Earth System Documentation (ES-DOC) project

    NASA Astrophysics Data System (ADS)

    Murphy, S.; Greenslade, M. A.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high quality tools and services in support of Earth system documentation creation, analysis and dissemination. It is nurturing a sustainable standards based documentation ecosystem that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation eco-system. Within this context ES-DOC leverages the emerging Common Information Model (CIM) metadata standard, which has supported the following projects: the Coupled Model Inter-comparison Project Phase 5 (CMIP5); the Dynamical Core Model Inter-comparison Project (DCMIP-2012); and the National Climate Predictions and Projections Platforms (NCPP) Quantitative Evaluation of Downscaling Workshop (QED-2013). This presentation will introduce the project to a wider audience and will demonstrate the current production-level capabilities of the eco-system: an ESM documentation Viewer embeddable into any website; an ESM Questionnaire configurable on a project-by-project basis; an ESM comparison tool reusable across projects; an ESM visualization tool reusable across projects; a search engine for speedily accessing published documentation; and libraries for streamlining document creation, validation and publishing pipelines.

  9. Helping Students Analyze Business Documents.

    ERIC Educational Resources Information Center

    Devet, Bonnie

    2001-01-01

    Notes that student writers gain greater insight into the importance of audience by analyzing business documents. Discusses how business writing teachers can help students understand the rhetorical refinements of writing to an audience. Presents an assignment designed to lead writers systematically through an analysis of two advertisements. (SG)

  10. Narrative review: the promotion of gabapentin: an analysis of internal industry documents.

    PubMed

    Steinman, Michael A; Bero, Lisa A; Chren, Mary-Margaret; Landefeld, C Seth

    2006-08-15

    Internal documents from the pharmaceutical industry provide a unique window for understanding the structure and methods of pharmaceutical promotion. Such documents have become available through litigation concerning the promotion of gabapentin (Neurontin, Pfizer, Inc., New York, New York) for off-label uses. To describe how gabapentin was promoted, focusing on the use of medical education, research, and publication. Court documents available to the public from United States ex rel. David Franklin vs. Pfizer, Inc., and Parke-Davis, Division of Warner-Lambert Company, mostly from 1994-1998. All documents were reviewed by 1 author, with selected review by coauthors. Marketing strategies and tactics were identified by using an iterative process of review, discussion, and re-review of selected documents. The promotion of gabapentin was a comprehensive and multifaceted process. Advisory boards, consultants meetings, and accredited continuing medical education events organized by third-party vendors were used to deliver promotional messages. These tactics were augmented by the recruitment of local champions and engagement of thought leaders, who could be used to communicate favorable messages about gabapentin to their physician colleagues. Research and scholarship were also used for marketing by encouraging "key customers" to participate in research, using a large study to advance promotional themes and build market share, paying medical communication companies to develop and publish articles about gabapentin for the medical literature, and planning to suppress unfavorable study results. Most available documents were submitted by the plaintiff and may not represent a complete picture of marketing practices. Activities traditionally considered independent of promotional intent, including continuing medical education and research, were extensively used to promote gabapentin. New strategies are needed to ensure a clear separation between scientific and commercial activity.

  11. Clinical map document based on XML (cMDX): document architecture with mapping feature for reporting and analysing prostate cancer in radical prostatectomy specimens.

    PubMed

    Eminaga, Okyaz; Hinkelammert, Reemt; Semjonow, Axel; Neumann, Joerg; Abbas, Mahmoud; Koepke, Thomas; Bettendorf, Olaf; Eltze, Elke; Dugas, Martin

    2010-11-15

    The pathology report of radical prostatectomy specimens plays an important role in clinical decisions and the prognostic evaluation in Prostate Cancer (PCa). The anatomical schema is a helpful tool to document PCa extension for clinical and research purposes. To achieve electronic documentation and analysis, an appropriate documentation model for anatomical schemas is needed. For this purpose we developed cMDX. The document architecture of cMDX was designed according to Open Packaging Conventions by separating the whole data into template data and patient data. Analogue custom XML elements were considered to harmonize the graphical representation (e.g. tumour extension) with the textual data (e.g. histological patterns). The graphical documentation was based on the four-layer visualization model that forms the interaction between different custom XML elements. Sensitive personal data were encrypted with a 256-bit cryptographic algorithm to avoid misuse. In order to assess the clinical value, we retrospectively analysed the tumour extension in 255 patients after radical prostatectomy. The pathology report with cMDX can represent pathological findings of the prostate in schematic styles. Such reports can be integrated into the hospital information system. "cMDX" documents can be converted into different data formats like text, graphics and PDF. Supplementary tools like cMDX Editor and an analyser tool were implemented. The graphical analysis of 255 prostatectomy specimens showed that PCa were mostly localized in the peripheral zone (Mean: 73% ± 25). 54% of PCa showed a multifocal growth pattern. cMDX can be used for routine histopathological reporting of radical prostatectomy specimens and provide data for scientific analysis.
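
    The separation of template data from patient data, and the encryption of personal fields, can be sketched with standard-library XML handling plus an AES-256-GCM cipher; the element names below are invented placeholders and the real cMDX schema differs.

    ```python
    # Keep template data and patient data in separate XML parts; encrypt personal
    # fields with a 256-bit cipher before they are written into the document.
    import base64
    import os
    import xml.etree.ElementTree as ET

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    encrypted_name = base64.b64encode(
        nonce + AESGCM(key).encrypt(nonce, b"John Doe", None)).decode()

    template = ET.Element("template")            # shared anatomical schema and layers
    ET.SubElement(template, "layer", name="tumourExtension")

    patient = ET.Element("patientData")
    ET.SubElement(patient, "name", encrypted="true").text = encrypted_name
    ET.SubElement(patient, "finding", zone="peripheral").text = "multifocal pattern"

    document = ET.Element("cMDX")
    document.extend([template, patient])
    print(ET.tostring(document, encoding="unicode"))
    ```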

  12. Does teaching of documentation of shoulder dystocia delivery through simulation result in improved documentation in real life?

    PubMed

    Comeau, Robyn; Craig, Catherine

    2014-03-01

    Documentation of deliveries complicated by shoulder dystocia is a valuable communication skill necessary for residents to attain during residency training. Our objective was to determine whether the teaching of documentation of shoulder dystocia in a simulation environment would translate to improved documentation of the event in an actual clinical situation. We conducted a cohort study involving obstetrics and gynaecology residents in years 2 to 5 between November 2010 and December 2012. Each resident participated in a shoulder dystocia simulation teaching session and was asked to write a delivery note immediately afterwards. They were given feedback regarding their performance of the delivery and their documentation of the events. Following this, dictated records of shoulder dystocia deliveries immediately before and after the simulation session were identified through the Meditech system. An itemized checklist was used to assess the quality of residents' dictated documentation before and after the simulation session. All eligible residents (18) enrolled in the study, and 17 met the inclusion criteria. For 10 residents (59%) documentation of a delivery with shoulder dystocia was present before and after the simulation session, for five residents (29%) it was only present before the session, and for two residents (18%) it was only present after the session. When residents were assessed as a group, there were no differences in the proportion of residents recording items on the checklist before and after the simulation session (P > 0.05 for all). Similarly, analysis of the performance of the 10 residents who had dictated documentation both before and after the session showed no differences in the number of elements recorded on dictations done before and after the simulation session (P > 0.05 for all). The teaching of shoulder dystocia documentation through simulation did not result in a measurable improvement in the quality of documentation of shoulder dystocia in

  13. Goal-oriented evaluation of binarization algorithms for historical document images

    NASA Astrophysics Data System (ADS)

    Obafemi-Ajayi, Tayo; Agam, Gady

    2013-01-01

    Binarization is of significant importance in document analysis systems. It is an essential first step, prior to further stages such as Optical Character Recognition (OCR), document segmentation, or enhancement of readability of the document after some restoration stages. Hence, proper evaluation of binarization methods to verify their effectiveness is of great value to the document analysis community. In this work, we perform a detailed goal-oriented evaluation of image quality assessment of the 18 binarization methods that participated in the DIBCO 2011 competition using the 16 historical document test images used in the contest. We are interested in the image quality assessment of the outputs generated by the different binarization algorithms as well as the OCR performance, where possible. We compare our evaluation of the algorithms based on human perception of quality to the DIBCO evaluation metrics. The results obtained provide an insight into the effectiveness of these methods with respect to human perception of image quality as well as OCR performance.
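
    To make the evaluation concrete, the hedged sketch below binarizes a synthetic grey-level patch with Otsu's method and scores it against a ground-truth mask using the pixel-level F-measure, one of the metrics commonly used in DIBCO-style evaluations; it is not the DIBCO toolchain, and Otsu is only one of many candidate methods.

    ```python
    # Otsu binarization plus pixel-level F-measure against a ground-truth mask.
    import numpy as np

    def otsu_threshold(gray):
        """Threshold maximising the between-class variance (Otsu's method)."""
        hist, _ = np.histogram(gray, bins=256, range=(0, 256))
        prob = hist / hist.sum()
        best_t, best_var = 0, -1.0
        for t in range(1, 256):
            w0, w1 = prob[:t].sum(), prob[t:].sum()
            if w0 == 0 or w1 == 0:
                continue
            mu0 = (np.arange(t) * prob[:t]).sum() / w0
            mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
            var = w0 * w1 * (mu0 - mu1) ** 2
            if var > best_var:
                best_t, best_var = t, var
        return best_t

    rng = np.random.default_rng(4)
    truth = rng.random((64, 64)) < 0.1                       # 1 = ink, 0 = background
    gray = np.where(truth, 60.0, 200.0) + rng.normal(0, 10, truth.shape)

    binary = gray < otsu_threshold(np.clip(gray, 0, 255))
    tp = np.logical_and(binary, truth).sum()
    precision = tp / max(binary.sum(), 1)
    recall = tp / max(truth.sum(), 1)
    f_measure = 2 * precision * recall / max(precision + recall, 1e-9)
    print(f"F-measure = {f_measure:.3f}")
    ```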

  14. Influential Observations in Principal Factor Analysis.

    ERIC Educational Resources Information Center

    Tanaka, Yutaka; Odaka, Yoshimasa

    1989-01-01

    A method is proposed for detecting influential observations in iterative principal factor analysis. Theoretical influence functions are derived for two components of the common variance decomposition. The major mathematical tool is the influence function derived by Tanaka (1988). (SLD)
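
    For readers unfamiliar with the underlying notion, the general (Hampel) influence function that such diagnostics build on can be stated as follows; the paper derives its specific influence functions for components of the common-variance decomposition, which are not reproduced here.

    ```latex
    % Influence function of a statistical functional T at distribution F,
    % evaluated at a point mass \delta_x placed on observation x.
    \mathrm{IF}(x;\,T,\,F) \;=\; \lim_{\varepsilon \to 0}
      \frac{T\bigl((1-\varepsilon)F + \varepsilon\,\delta_{x}\bigr) - T(F)}{\varepsilon}
    ```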

  15. Semantic Similarity between Web Documents Using Ontology

    NASA Astrophysics Data System (ADS)

    Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh

    2018-06-01

    The World Wide Web is a source of information available in the form of interlinked web pages. However, extracting significant information with the assistance of a search engine is extremely difficult, because web information is written mainly in natural language intended for human readers. Several efforts have been made to compute semantic similarity between documents using words, concepts and concept relationships, but the available results still do not meet user requirements. This paper proposes a novel technique for computing semantic similarity between documents that takes into account not only the concepts available in the documents but also the relationships between those concepts. In our approach, documents are processed by building an ontology for each document using a base ontology and a dictionary of concept records, where each record consists of the words that may represent a given concept. Finally, the document ontologies are compared to find their semantic similarity, taking into account the relationships among concepts. Relevant concepts and relations between the concepts are explored by capturing author and user intention. The proposed semantic analysis technique provides improved results compared with existing techniques.
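
    A toy sketch of the core idea follows: similarity is computed from the overlap of concepts and of concept-to-concept relations between two document ontologies. The Jaccard measure and the equal weighting are illustrative choices, not the authors' formula.

    ```python
    # Combine concept overlap and relation overlap into one similarity score.
    def jaccard(a: set, b: set) -> float:
        return len(a & b) / len(a | b) if (a | b) else 0.0

    def semantic_similarity(doc1, doc2, concept_weight=0.5):
        concept_sim = jaccard(doc1["concepts"], doc2["concepts"])
        relation_sim = jaccard(doc1["relations"], doc2["relations"])
        return concept_weight * concept_sim + (1 - concept_weight) * relation_sim

    doc_a = {"concepts": {"vehicle", "engine", "fuel"},
             "relations": {("vehicle", "hasPart", "engine"), ("engine", "uses", "fuel")}}
    doc_b = {"concepts": {"vehicle", "engine", "battery"},
             "relations": {("vehicle", "hasPart", "engine"), ("engine", "uses", "battery")}}
    print(f"similarity = {semantic_similarity(doc_a, doc_b):.2f}")
    ```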

  16. Semantic Similarity between Web Documents Using Ontology

    NASA Astrophysics Data System (ADS)

    Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh

    2018-03-01

    The World Wide Web is a source of information available in the form of interlinked web pages. However, extracting significant information with the assistance of a search engine is extremely difficult, because web information is written mainly in natural language intended for human readers. Several efforts have been made to compute semantic similarity between documents using words, concepts and concept relationships, but the available results still do not meet user requirements. This paper proposes a novel technique for computing semantic similarity between documents that takes into account not only the concepts available in the documents but also the relationships between those concepts. In our approach, documents are processed by building an ontology for each document using a base ontology and a dictionary of concept records, where each record consists of the words that may represent a given concept. Finally, the document ontologies are compared to find their semantic similarity, taking into account the relationships among concepts. Relevant concepts and relations between the concepts are explored by capturing author and user intention. The proposed semantic analysis technique provides improved results compared with existing techniques.

  17. Computer software documentation

    NASA Technical Reports Server (NTRS)

    Comella, P. A.

    1973-01-01

    A tutorial in the documentation of computer software is presented. It presents a methodology for achieving an adequate level of documentation as a natural outgrowth of the total programming effort commencing with the initial problem statement and definition and terminating with the final verification of code. It discusses the content of adequate documentation, the necessity for such documentation and the problems impeding achievement of adequate documentation.

  18. Satellite observation analysis of aerosols loading effect over Monrovia-Liberia

    NASA Astrophysics Data System (ADS)

    Emetere, M. E.; Esisio, F.; Oladapo, F.

    2017-05-01

    The effect of aerosol loading most often results in aerosol retention in the atmosphere. Aside from the health hazards of aerosol retention, its effects on climate change are visible. In this research, it was proposed that aerosol retention also affects rainfall patterns. Tropical Rainfall Measuring Mission (TRMM) level 3 observations and the Multi-angle Imaging SpectroRadiometer (MISR) were used for the study. Aerosol loading over Monrovia, Liberia, was investigated using sixteen years of satellite observations, and its effect on the rain rate over the region was documented. The results show that aerosol loading over the region is high and may affect farming in the near future. It was affirmed that the scanty AOD data was a result of the higher rain rate between May and October.

  19. AgRISTARS: Foreign Commodity production forecasting. Project procedures designation and description document, volume 1

    NASA Technical Reports Server (NTRS)

    Waggoner, J. T.; Phinney, D. E. (Principal Investigator)

    1981-01-01

    The crop estimation analysis procedures documentation of the AgRISTARS Foreign Commodity Production Forecasting Project (FCPF) is presented. Specifically, it includes the technical/management documentation of the remote sensing data analysis procedures, prepared in accordance with the guidelines provided in the FCPF communication/documentation standards manual. Standard documentation sets are given, arranged by procedural type and level and then by crop type or other technically differentiating categories.

  20. Semantic Metadata for Heterogeneous Spatial Planning Documents

    NASA Astrophysics Data System (ADS)

    Iwaniak, A.; Kaczmarek, I.; Łukowicz, J.; Strzelecki, M.; Coetzee, S.; Paluszyński, W.

    2016-09-01

    Spatial planning documents contain information about the principles and rights of land use in different zones of a local authority. They are the basis for administrative decision making in support of sustainable development. In Poland these documents are published on the Web according to a prescribed non-extendable XML schema, designed for optimum presentation to humans in HTML web pages. There is no document standard, and limited functionality exists for adding references to external resources. The text in these documents is discoverable and searchable by general-purpose web search engines, but the semantics of the content cannot be discovered or queried. The spatial information in these documents is geographically referenced but not machine-readable. Major manual efforts are required to integrate such heterogeneous spatial planning documents from various local authorities for analysis, scenario planning and decision support. This article presents results of an implementation using machine-readable semantic metadata to identify relationships among regulations in the text, spatial objects in the drawings and links to external resources. A spatial planning ontology was used to annotate different sections of spatial planning documents with semantic metadata in the Resource Description Framework in Attributes (RDFa). The semantic interpretation of the content, links between document elements and links to external resources were embedded in XHTML pages. An example and use case from the spatial planning domain in Poland is presented to evaluate its efficiency and applicability. The solution enables the automated integration of spatial planning documents from multiple local authorities to assist decision makers with understanding and interpreting spatial planning information. The approach is equally applicable to legal documents from other countries and domains, such as cultural heritage and environmental management.
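
    The article embeds its semantic metadata as RDFa in XHTML pages; the same linkage of a regulation paragraph to a spatial zone and an external resource can be sketched as plain RDF triples, as below with rdflib. All URIs and the "plan" vocabulary are invented placeholders, not the ontology used in the study.

    ```python
    # Regulation text linked to a spatial zone and an external resource as RDF.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, RDFS

    PLAN = Namespace("http://example.org/spatial-planning#")
    g = Graph()
    g.bind("plan", PLAN)

    regulation = URIRef("http://example.org/plan/2016/par-12")
    zone = URIRef("http://example.org/plan/2016/zone/MW-3")

    g.add((regulation, RDF.type, PLAN.Regulation))
    g.add((regulation, RDFS.label, Literal("Par. 12: residential use, max height 12 m")))
    g.add((regulation, PLAN.appliesToZone, zone))
    g.add((zone, RDF.type, PLAN.SpatialZone))
    g.add((zone, RDFS.seeAlso, URIRef("http://example.org/external/land-registry/MW-3")))

    print(g.serialize(format="turtle"))
    ```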

  1. Observing Community Residences.

    ERIC Educational Resources Information Center

    Taylor, Steven J.; Bogdan, Robert

    The document offers guidelines for effectively monitoring the quality of care provided in community residences serving people with disabilities. An initial section offers suggestions on observation and evaluation procedures. The remainder of the document lists possible questions to be asked in 19 areas: location, building and yard, relations with the…

  2. 32 CFR 989.11 - Combining EIAP with other documentation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... documentation. (a) The EPF combines environmental analysis with other related documentation when practicable (40 CFR 1506.4) following the procedures prescribed by the CEQ regulations and this part. (b) The EPF must... the EIAP. Prior to making a decision to proceed, the EPF must analyze the environmental impacts that...

  3. [Photography as analysis document, body and medicine: theory, method and criticism--the experience of Museo Nacional de Medicina Enrique Laval].

    PubMed

    Robinson, César Leyton; Caballero, Andrés Díaz

    2007-01-01

    This article is an experimental methodological reflection on the use of medical images as useful documents for constructing the history of medicine. A method is used that is based on guidelines or analysis topics that include different ways of viewing documents, from aesthetic, technical, social and political theories to historical and medical thinking. Some exercises are also included that enhance the proposal for the reader: rediscovering the worlds in society that harbor these medical photographical archives to obtain a new theoretical approach to the construction of the history of medical science.

  4. Clinical map document based on XML (cMDX): document architecture with mapping feature for reporting and analysing prostate cancer in radical prostatectomy specimens

    PubMed Central

    2010-01-01

    Background The pathology report of radical prostatectomy specimens plays an important role in clinical decisions and the prognostic evaluation in Prostate Cancer (PCa). The anatomical schema is a helpful tool to document PCa extension for clinical and research purposes. To achieve electronic documentation and analysis, an appropriate documentation model for anatomical schemas is needed. For this purpose we developed cMDX. Methods The document architecture of cMDX was designed according to Open Packaging Conventions by separating the whole data into template data and patient data. Analogue custom XML elements were considered to harmonize the graphical representation (e.g. tumour extension) with the textual data (e.g. histological patterns). The graphical documentation was based on the four-layer visualization model that forms the interaction between different custom XML elements. Sensitive personal data were encrypted with a 256-bit cryptographic algorithm to avoid misuse. In order to assess the clinical value, we retrospectively analysed the tumour extension in 255 patients after radical prostatectomy. Results The pathology report with cMDX can represent pathological findings of the prostate in schematic styles. Such reports can be integrated into the hospital information system. "cMDX" documents can be converted into different data formats like text, graphics and PDF. Supplementary tools like cMDX Editor and an analyser tool were implemented. The graphical analysis of 255 prostatectomy specimens showed that PCa were mostly localized in the peripheral zone (Mean: 73% ± 25). 54% of PCa showed a multifocal growth pattern. Conclusions cMDX can be used for routine histopathological reporting of radical prostatectomy specimens and provide data for scientific analysis. PMID:21078179

  5. MODFLOW-2000 : the U.S. Geological Survey modular ground-water model--documentation of the Advective-Transport Observation (ADV2) Package

    USGS Publications Warehouse

    Anderman, Evan R.; Hill, Mary Catherine

    2001-01-01

    Observations of the advective component of contaminant transport in steady-state flow fields can provide important information for the calibration of ground-water flow models. This report documents the Advective-Transport Observation (ADV2) Package, version 2, which allows advective-transport observations to be used in the three-dimensional ground-water flow parameter-estimation model MODFLOW-2000. The ADV2 Package is compatible with some of the features in the Layer-Property Flow and Hydrogeologic-Unit Flow Packages, but is not compatible with the Block-Centered Flow or Generalized Finite-Difference Packages. The particle-tracking routine used in the ADV2 Package duplicates the semi-analytical method of MODPATH, as shown in a sample problem. Particles can be tracked in a forward or backward direction, and effects such as retardation can be simulated through manipulation of the effective-porosity value used to calculate velocity. Particles can be discharged at cells that are considered to be weak sinks, in which the sink applied does not capture all the water flowing into the cell, using one of two criteria: (1) if there is any outflow to a boundary condition such as a well or surface-water feature, or (2) if the outflow exceeds a user specified fraction of the cell budget. Although effective porosity could be included as a parameter in the regression, this capability is not included in this package. The weighted sum-of-squares objective function, which is minimized in the Parameter-Estimation Process, was augmented to include the square of the weighted x-, y-, and z-components of the differences between the simulated and observed advective-front locations at defined times, thereby including the direction of travel as well as the overall travel distance in the calibration process. The sensitivities of the particle movement to the parameters needed to minimize the objective function are calculated for any particle location using the exact sensitivity
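
    The augmentation of the objective function described above has the general weighted least-squares form sketched below; the notation is ours for illustration and does not reproduce the report's exact symbols.

    ```latex
    % Weighted least-squares objective augmented with advective-front location terms:
    % head/flow misfits plus the weighted x-, y-, z-misfits of simulated vs. observed
    % particle positions at the defined observation times.
    S(\mathbf{b}) \;=\; \sum_{i} \omega_i \bigl[ y_i - y_i'(\mathbf{b}) \bigr]^2
      \;+\; \sum_{j} \Bigl( \omega_{x_j}\bigl[x_j - x_j'(\mathbf{b})\bigr]^2
      + \omega_{y_j}\bigl[y_j - y_j'(\mathbf{b})\bigr]^2
      + \omega_{z_j}\bigl[z_j - z_j'(\mathbf{b})\bigr]^2 \Bigr)
    ```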

  6. Meaningful participation for children in the Dutch child protection system: A critical analysis of relevant provisions in policy documents.

    PubMed

    Bouma, Helen; López López, Mónica; Knorth, Erik J; Grietens, Hans

    2018-05-01

    Policymakers are increasingly focusing on the participation of children in the child protection system (CPS). However, research shows that actual practice still needs to be improved. Embedding children's participation in legislation and policy documents is one important prerequisite for achieving meaningful participation in child protection practice. In this study, the participation of children in the Dutch CPS under the new Youth Act 2015 is critically analyzed. National legislation and policy documents were studied using a model of "meaningful participation" based on article 12 of the UNCRC. Results show that the idea of children's participation is deeply embedded in the current Dutch CPS. However, Dutch policy documents do not fully cover the three dimensions of what is considered to be meaningful participation for children: informing, hearing, and involving. Furthermore, children's participation differs among the organizations included in the child protection chain. A clear overall policy concerning the participation of children in the Dutch CPS is lacking. The conclusions of this critical analysis of policy documents and the framework of meaningful participation presented may provide a basis for the embedding of meaningful participation for children in child protection systems of other countries. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Genesis Reentry Observations and Data Analysis

    NASA Technical Reports Server (NTRS)

    Suggs, R. M.; Swift, W. R.

    2005-01-01

    The Genesis spacecraft reentry represented a unique opportunity to observe a "calibrated meteor" from northern Nevada. Knowing its speed, mass, composition, and precise trajectory made it a good subject to test some of the algorithms used to determine meteoroid mass from observed brightness. It was also a good test of an inexpensive set of cameras that could be deployed to observe future shuttle reentries. The utility of consumer-grade video cameras was evident during the STS-107 accident investigation, and the Genesis reentry gave us the opportunity to specify and test commercially available cameras that could be used during future reentries. This Technical Memorandum describes the video observations and their analysis, compares the results with a simple photometric model, describes the forward scatter radar experiment, and lists lessons learned from the expedition and implications for the Stardust reentry in January 2006 as well as future shuttle reentries.
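
    For context, the classical photometric relation that meteoroid-mass algorithms of this kind are usually built on is sketched below; conventions and luminous-efficiency values vary between authors, and the memorandum's exact formulation is not reproduced here.

    ```latex
    % Radiated power I(t) as a fraction \tau (luminous efficiency) of the
    % kinetic-energy loss rate, and the resulting photometric mass estimate
    % when speed v and \tau are treated as approximately constant.
    I(t) \;=\; -\,\tau\,\frac{v^{2}}{2}\,\frac{dm}{dt}
    \qquad\Longrightarrow\qquad
    m_{\mathrm{phot}} \;\approx\; \frac{2}{\tau\,v^{2}} \int I(t)\,dt
    ```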

  8. The Role of Documentation Quality in Anesthesia-Related Closed Claims: A Descriptive Qualitative Study.

    PubMed

    Wilbanks, Bryan A; Geisz-Everson, Marjorie; Boust, Rebecca R

    2016-09-01

    Clinical documentation is a critical tool in supporting care provided to patients. Sound documentation provides a picture of clinical events that can be used to improve patient care. However, many other uses for clinical documentation are equally important. Such documentation informs clinical decision support tools, creates a legal record of patient care, assists in financial reimbursement of services, and serves as a repository for secondary data analysis. Conversely, poor documentation can impair patient safety and increase malpractice risk exposure by reflecting poor or inaccurate information that ultimately may guide patient care decisions. Through an examination of anesthesia-related closed claims, a descriptive qualitative study explored the antecedents and consequences of documentation quality in the claims reviewed. A secondary data analysis utilized a database generated by the American Association of Nurse Anesthetists Foundation closed claim review team. Four major themes emerged from the analysis. Themes 1, 2, and 4 primarily describe how poor documentation quality can have negative consequences for clinicians. The third theme primarily describes how poor documentation quality can negatively affect patient safety.

  9. PSD Guidance Document

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  10. Questioned document workflow for handwriting with automated tools

    NASA Astrophysics Data System (ADS)

    Das, Krishnanand; Srihari, Sargur N.; Srinivasan, Harish

    2012-01-01

    During the last few years many document recognition methods have been developed to determine whether a handwriting specimen can be attributed to a known writer. However, in practice, the workflow of the document examiner continues to be manual-intensive. Before a systematic or computational approach can be developed, an articulation of the steps involved in handwriting comparison is needed. We describe the workflow of handwritten questioned document examination, as described in a standards manual, and the steps where existing automation tools can be used. A well-known ransom note case is considered as an example, where one encounters testing for multiple writers of the same document, determining whether the writing is disguised, known writing being formal while questioned writing is informal, etc. The findings for the particular ransom note case using the tools are given. Observations are also made towards developing a more fully automated approach to handwriting examination.

  11. [Ethical analysis and commentary of Dignitas Personae document: from continuity toward the innovation].

    PubMed

    Pastor, Luis Miguel

    2011-01-01

    In 2008 [corrected] the Catholic Church published a document entitled Dignitas Personae (DP) about a range of bioethical issues related to the areas of assisted reproduction and human genetics. The objective of this paper is to analyze the issues treated in it and to comment on the novelty of its arguments within the bioethical thinking of the Catholic Church. The DP document has an introduction, three parts and a conclusion. The publication of the document is due to the advances that have occurred in recent years in the two areas mentioned above. These advances were not analyzed in the previous document, Donum Vitae (DV). DP analyzes these new advances from the anthropological and ethical approaches of DV. Not intending to contradict DV, DP applies the arguments of DV to new situations. In both the title and elsewhere in the text it is affirmed that the human embryo has the dignity of a human person. From this principle DP analyzes issues such as the status of the human embryo, intracytoplasmic sperm injection (ICSI), preimplantation diagnosis, embryo cryopreservation, contragestion, embryo reduction, etc. In these matters, as in questions such as human genetics, cloning, gene therapy or the use of biological material obtained from abortions, the document reaffirms previous ideas of the Catholic Church, applies them to new problems or develops new arguments that will require further reflection. In conclusion, the document is very useful for understanding the current bioethical thinking of the Catholic Church on these issues; it clarifies certain disputes, suggests new arguments, and leaves other issues open to free discussion and subsequent interventions of the Catholic Magisterium. Finally, the document reaffirms the commitment of the Catholic Church to the poor of our techno-scientific society, the proletariat of the new century: human embryos.

  12. Army-NASA aircrew/aircraft integration program. Phase 5: A3I Man-Machine Integration Design and Analysis System (MIDAS) software concept document

    NASA Technical Reports Server (NTRS)

    Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Neukom, Christian; Nishimura, Sayuri; Prevost, Michael; Shankar, Renuka; Staveland, Lowell; Smith, Greg

    1992-01-01

    This is the Software Concept Document for the Man-machine Integration Design and Analysis System (MIDAS) being developed as part of Phase V of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The approach taken in this program since its inception in 1984 is that of incremental development with clearly defined phases. Phase 1 began in 1984 and subsequent phases have progressed at approximately 10-16 month intervals. Each phase of development consists of planning, setting requirements, preliminary design, detailed design, implementation, testing, demonstration and documentation. Phase 5 began with an off-site planning meeting in November, 1990. It is expected that Phase 5 development will be complete and ready for demonstration to invited visitors from industry, government and academia in May, 1992. This document, produced during the preliminary design period of Phase 5, is intended to record the top level design concept for MIDAS as it is currently conceived. This document has two main objectives: (1) to inform interested readers of the goals of the MIDAS Phase 5 development period, and (2) to serve as the initial version of the MIDAS design document which will be continuously updated as the design evolves. Since this document is written fairly early in the design period, many design issues still remain unresolved. Some of the unresolved issues are mentioned later in this document in the sections on specific components. Readers are cautioned that this is not a final design document and that, as the design of MIDAS matures, some of the design ideas recorded in this document will change. The final design will be documented in a detailed design document published after the demonstrations.

  13. Multisource data fusion for documenting archaeological sites

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir; Chibunichev, Alexander; Zhuravlev, Denis

    2017-10-01

    The quality of archaeological site documentation is of great importance for preserving and investigating cultural heritage. The progress in developing new techniques and systems for data acquisition and processing creates an excellent basis for achieving a new quality of archaeological site documentation and visualization. Archaeological data have some specific features which have to be taken into account during acquisition, processing and management. First of all, it is necessary to gather information about findings that is as complete as possible, with no loss of information and no damage to artifacts. Remote sensing technologies are the most adequate and powerful means of satisfying this requirement. An approach to archaeological data acquisition and fusion based on remote sensing is proposed. It combines a set of photogrammetric techniques for obtaining geometrical and visual information at different scales and levels of detail with a pipeline for archaeological data documenting, structuring, fusion, and analysis. The proposed approach is applied to documenting the Bosporus archaeological expedition of the Russian State Historical Museum.

  14. Simple Levelized Cost of Energy (LCOE) Calculator Documentation | Energy

    Science.gov Websites

    Documentation for NREL's Simple Levelized Cost of Energy (LCOE) Calculator. This is a simple calculator: adjust the sliders to suitable values for each of the cost and performance parameters.
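
    For readers unfamiliar with the arithmetic behind such a calculator, the sketch below shows a minimal simple-LCOE computation. The parameter names, example values, and units are illustrative assumptions and are not taken from the NREL tool itself.

        # Minimal sketch of a simple levelized cost of energy (LCOE) calculation.
        # Parameter names, example values, and units are illustrative assumptions.

        def capital_recovery_factor(discount_rate, lifetime_years):
            """Annualize an up-front capital cost over the project lifetime."""
            i, n = discount_rate, lifetime_years
            return i * (1 + i) ** n / ((1 + i) ** n - 1)

        def simple_lcoe(capital_cost_per_kw, fixed_om_per_kw_yr, variable_om_per_mwh,
                        fuel_cost_per_mwh, capacity_factor, discount_rate, lifetime_years):
            """Return LCOE in $/MWh under the assumed units."""
            crf = capital_recovery_factor(discount_rate, lifetime_years)
            annual_mwh_per_kw = 8760 * capacity_factor / 1000.0  # MWh generated per kW of capacity per year
            annualized_capital = capital_cost_per_kw * crf / annual_mwh_per_kw
            fixed_om = fixed_om_per_kw_yr / annual_mwh_per_kw
            return annualized_capital + fixed_om + variable_om_per_mwh + fuel_cost_per_mwh

        # Hypothetical utility-scale plant: $1500/kW capital, $20/kW-yr fixed O&M,
        # $3/MWh variable O&M, no fuel, 35% capacity factor, 7% discount rate, 25 years.
        print(round(simple_lcoe(1500, 20, 3, 0, 0.35, 0.07, 25), 1), "$/MWh")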

  15. "I Like to Plan Events": A Document Analysis of Essays Written by Applicants to a Public Relations Program

    ERIC Educational Resources Information Center

    Taylor, Ronald E.

    2016-01-01

    A document analysis of 249 essays written during a 5-year period by applicants to a public relations program at a major state university in the southeast suggests that there are enduring reasons why students choose to major in public relations. Public relations is described as a major that allows for and encourages creative expression and that…

  16. English "in the Context of" European Integration: A Corpus-Driven Analysis of Lexical Bundles in English EU Documents

    ERIC Educational Resources Information Center

    Jablonkai, Reka

    2010-01-01

    This study extends research into the use of English as a lingua franca in the European context by investigating the most frequent word combinations in English documents issued by EU institutions. As there is little research on the use of the English language within the European Union for ESP pedagogic purposes, as part of a larger scale analysis,…
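
    As a rough illustration of what a corpus-driven search for lexical bundles involves, the following sketch counts frequent four-word sequences in a toy corpus. The tokenization, bundle length, frequency threshold, and example sentences are assumptions for illustration only and do not reproduce the study's procedure.

        # Hedged sketch: extracting candidate lexical bundles (frequent 4-word
        # sequences) from a small corpus. Corpus, threshold, and tokenization
        # are illustrative assumptions.
        import re
        from collections import Counter

        def lexical_bundles(texts, n=4, min_freq=2):
            counts = Counter()
            for text in texts:
                tokens = re.findall(r"[a-z']+", text.lower())
                counts.update(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
            return [(" ".join(gram), c) for gram, c in counts.most_common() if c >= min_freq]

        corpus = [
            "in the context of the enlargement of the European Union",
            "in the context of the common agricultural policy of the European Union",
        ]
        print(lexical_bundles(corpus))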

  17. SGML and HTML: The Merging of Document Management and Electronic Document Publishing.

    ERIC Educational Resources Information Center

    Dixon, Ross

    1996-01-01

    Document control is an issue for organizations that use SGML/HTML. The prevalent approach is to apply the same techniques to document elements that are applied to full documents, a practice that has led to an overlap of electronic publishing and document management. Lists requirements for the management of SGML/HTML documents. (PEN)

  18. Reactive documentation system

    NASA Astrophysics Data System (ADS)

    Boehnlein, Thomas R.; Kramb, Victoria

    2018-04-01

    Proper formal documentation of computer-acquired NDE experimental data generated during research is critical to the longevity and usefulness of the data. Without documentation describing how and why the data was acquired, NDE research teams lose capabilities, such as the ability to generate new information from previously collected data or provide adequate information so that their work can be replicated by others seeking to validate their research. Despite the critical nature of this issue, NDE data is still being generated in research labs without appropriate documentation. By generating documentation in series with data, equal priority is given to both activities during the research process. One way to achieve this is to use a reactive documentation system (RDS). RDS prompts an operator to document the data as it is generated rather than relying on the operator to decide when and what to document. This paper discusses how such a system can be implemented in a dynamic environment made up of in-house and third-party NDE data acquisition systems without creating additional burden on the operator. The reactive documentation approach presented here is agnostic enough that the principles can be applied to any operator-controlled, computer-based data acquisition system.
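
    As a loose illustration of the reactive idea, not the authors' implementation, the sketch below shows a save hook that refuses to write a new data file until the operator has supplied a small set of required metadata fields; the field names and storage format are assumptions.

        # Minimal sketch of a reactive documentation hook: data is written only
        # after the operator answers the metadata prompts. Field names and the
        # sidecar JSON format are assumptions, not the authors' implementation.
        import json
        import time

        REQUIRED_FIELDS = ("operator", "specimen_id", "instrument", "purpose")

        def prompt_for_metadata():
            meta = {"timestamp": time.strftime("%Y-%m-%dT%H:%M:%S")}
            for field in REQUIRED_FIELDS:
                value = ""
                while not value.strip():          # keep prompting until answered
                    value = input(f"{field}: ")
                meta[field] = value.strip()
            return meta

        def save_with_documentation(data_bytes, path):
            meta = prompt_for_metadata()          # documentation generated in series with the data
            with open(path, "wb") as f:
                f.write(data_bytes)
            with open(path + ".meta.json", "w") as f:
                json.dump(meta, f, indent=2)

        # save_with_documentation(acquired_waveform, "scan_0042.bin")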

  19. Public versus internal conceptions of addiction: An analysis of internal Philip Morris documents

    PubMed Central

    2018-01-01

    Background Tobacco addiction is a complex, multicomponent phenomenon stemming from nicotine’s pharmacology and the user’s biology, psychology, sociology, and environment. After decades of public denial, the tobacco industry now agrees with public health authorities that nicotine is addictive. In 2000, Philip Morris became the first major tobacco company to admit nicotine’s addictiveness. Evolving definitions of addiction have historically affected subsequent policymaking. This article examines how Philip Morris internally conceptualized addiction immediately before and after this announcement. Methods and findings We analyzed previously secret, internal Philip Morris documents made available as a result of litigation against the tobacco industry. We compared these documents to public company statements and found that Philip Morris’s move from public denial to public affirmation of nicotine’s addictiveness coincided with pressure on the industry from poor public approval ratings, the Master Settlement Agreement (MSA), the United States government’s filing of the Racketeer Influenced and Corrupt Organizations (RICO) suit, and the Institute of Medicine’s (IoM’s) endorsement of potentially reduced risk products. Philip Morris continued to research the causes of addiction through the 2000s in order to create successful potentially reduced exposure products (PREPs). While Philip Morris’s public statements reinforce the idea that nicotine’s pharmacology principally drives smoking addiction, company scientists framed addiction as the result of interconnected biological, social, psychological, and environmental determinants, with nicotine as but one component. Due to the fragmentary nature of the industry document database, we may have missed relevant information that could have affected our analysis. Conclusions Philip Morris’s research suggests that tobacco industry activity influences addiction treatment outcomes. Beyond nicotine’s pharmacology

  20. Starlink Document Styles

    NASA Astrophysics Data System (ADS)

    Lawden, M. D.

    This document describes the various styles which are recommended for Starlink documents. It also explains how to use the templates which are provided by Starlink to help authors create documents in a standard style. This paper is concerned mainly with conveying the "look and feel" of the various styles of Starlink document rather than describing the technical details of how to produce them. Other Starlink papers give recommendations for the detailed aspects of document production, design, layout, and typography. The only style that is likely to be used by most Starlink authors is the Standard style.

  1. Teaching Documentation Writing: What Else Students--And Instructors--Should Know.

    ERIC Educational Resources Information Center

    Boiarsky, Carolyn; Dobberstein, Michael

    1998-01-01

    Discusses the knowledge, problem-solving strategies, and desktop publishing skills students need to learn about documentation writing. Describes a course developed by the authors that provides these skills, focusing on strategies for problem solving, user analysis, conventions, document design and desktop publishing, and using authentic…

  2. Accountability through Documentation: What Are Best Practices for School Counselors?

    ERIC Educational Resources Information Center

    Wehrman, Joseph D.; Williams, Rhonda; Field, Julaine; Schroeder, Shanna Dahl

    2010-01-01

    This article provides an analysis of important considerations for documentation for school counselors. Although the American School Counseling Association (ASCA) does not provide a national protocol for documentation of school counseling services, the ASCA Ethical guidelines provide insight into ethical record keeping which protects student…

  3. Rendering of Names of Corporate Bodies. Subject Analysis, With Special Reference to Social Sciences. Documentation Systems for Industry (8th Annual Seminar). Part 1: Papers.

    ERIC Educational Resources Information Center

    Documentation Research and Training Centre, Bangalore (India).

    The four sections of the report cover the topics of cataloging, subject analysis, documentation systems for industry and the Documentation Research and Training Centre (DRTC) research report for 1970. The cataloging section covers the conflicts of cataloging, recall, corporate bodies, titles, publishers series and the entity name. The subject…

  4. Documenting with Parents and Toddlers: A Finnish Case Study

    ERIC Educational Resources Information Center

    Rintakorpi, Kati; Lipponen, Lasse; Reunamo, Jyrki

    2014-01-01

    In recent years, there has been a growing interest in pedagogical documentation and the way in which it can be applied to advance pedagogical practices in early childhood education. This study is a case analysis which focuses on the transition phase from home to kindergarten of a toddler, Leo, and his family. Documentation was performed by the…

  5. Quantifying Selection Bias in National Institute of Health Stroke Scale Data Documented in an Acute Stroke Registry.

    PubMed

    Thompson, Michael P; Luo, Zhehui; Gardiner, Joseph; Burke, James F; Nickles, Adrienne; Reeves, Mathew J

    2016-05-01

    As a measure of stroke severity, the National Institutes of Health Stroke Scale (NIHSS) is an important predictor of patient- and hospital-level outcomes, yet is often undocumented. The purpose of this study is to quantify and correct for potential selection bias in observed NIHSS data. Data were obtained from the Michigan Stroke Registry and included 10 262 patients with ischemic stroke aged ≥65 years discharged from 23 hospitals from 2009 to 2012, of whom 74.6% had a documented NIHSS. We estimated models predicting NIHSS documentation and NIHSS score and used the Heckman selection model to estimate a correlation coefficient (ρ) between the 2 model error terms, which quantifies the degree of selection bias in the documentation of NIHSS. The Heckman model found modest, but significant, selection bias (ρ=0.19; 95% confidence interval: 0.09, 0.29; P<0.001), indicating that as NIHSS score increased (i.e., strokes were more severe), the probability of documentation also increased. We also estimated a selection bias-corrected population mean NIHSS score of 4.8, which was substantially lower than the observed mean NIHSS score of 7.4. Evidence of selection bias was also identified using hospital-level analysis, where increased NIHSS documentation was correlated with lower mean NIHSS scores (r=-0.39; P<0.001). We demonstrate modest, but important, selection bias in documented NIHSS data, which are missing more often in patients with less severe stroke. The population mean NIHSS score was overestimated by >2 points, which could significantly alter the risk profile of hospitals treating patients with ischemic stroke and subsequent hospital risk-adjusted outcomes. © 2016 American Heart Association, Inc.
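
    To make the selection-model machinery concrete, the sketch below runs a two-step Heckman-style correction on synthetic data in which more severe strokes are more likely to be documented. The variable names, the data-generating process, and the use of a manual two-step estimator (probit plus inverse Mills ratio) are illustrative assumptions and do not reproduce the registry analysis.

        # Hedged sketch of a two-step Heckman selection correction on synthetic data,
        # analogous in spirit to correcting for selectively documented NIHSS scores.
        import numpy as np
        import statsmodels.api as sm
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        n = 5000
        severity_signal = rng.normal(size=n)                      # latent severity driver
        u = rng.multivariate_normal([0, 0], [[1, 0.2], [0.2, 1]], size=n)
        nihss = 5 + 2 * severity_signal + u[:, 0]                 # true outcome
        documented = (0.5 + 1.0 * severity_signal + u[:, 1]) > 0  # severe strokes documented more often

        # Step 1: probit model of documentation, then inverse Mills ratio.
        X_sel = sm.add_constant(severity_signal)
        probit = sm.Probit(documented.astype(int), X_sel).fit(disp=False)
        xb = X_sel @ probit.params
        imr = norm.pdf(xb) / norm.cdf(xb)

        # Step 2: outcome regression on documented cases only, augmented with the IMR.
        X_out = sm.add_constant(np.column_stack([severity_signal, imr]))[documented]
        ols = sm.OLS(nihss[documented], X_out).fit()

        print("naive mean of documented scores:", nihss[documented].mean().round(2))
        print("selection-corrected coefficients:", ols.params.round(2))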

  6. CAED Document Repository

    EPA Pesticide Factsheets

    Compliance Assurance and Enforcement Division Document Repository (CAEDDOCRESP) provides internal and external access to Inspection Records, Enforcement Actions, and National Environmental Policy Act (NEPA) documents to all CAED staff. The repository will also include supporting documents, images, etc.

  7. An ideal observer analysis of visual working memory.

    PubMed

    Sims, Chris R; Jacobs, Robert A; Knill, David C

    2012-10-01

    Limits in visual working memory (VWM) strongly constrain human performance across many tasks. However, the nature of these limits is not well understood. In this article we develop an ideal observer analysis of human VWM by deriving the expected behavior of an optimally performing but limited-capacity memory system. This analysis is framed around rate-distortion theory, a branch of information theory that provides optimal bounds on the accuracy of information transmission subject to a fixed information capacity. The result of the ideal observer analysis is a theoretical framework that provides a task-independent and quantitative definition of visual memory capacity and yields novel predictions regarding human performance. These predictions are subsequently evaluated and confirmed in 2 empirical studies. Further, the framework is general enough to allow the specification and testing of alternative models of visual memory (e.g., how capacity is distributed across multiple items). We demonstrate that a simple model developed on the basis of the ideal observer analysis (one that allows variability in the number of stored memory representations but does not assume the presence of a fixed item limit) provides an excellent account of the empirical data and further offers a principled reinterpretation of existing models of VWM. PsycINFO Database Record (c) 2012 APA, all rights reserved.
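
    The rate-distortion idea can be illustrated with the classical Gaussian bound D(R) = sigma^2 * 2^(-2R): if a fixed total capacity is split evenly across the items held in memory, the minimum achievable recall error grows with set size. The capacity value and the even split below are illustrative assumptions, not parameters from the study.

        # Minimal sketch of the Gaussian rate-distortion bound, used only to
        # illustrate how a fixed total capacity limits recall precision as set
        # size grows. Capacity value and even split are assumptions.
        import numpy as np

        sigma2 = 1.0          # variance of the stimulus feature (assumed Gaussian source)
        capacity_bits = 6.0   # assumed total memory capacity in bits

        for set_size in (1, 2, 4, 8):
            rate_per_item = capacity_bits / set_size
            distortion = sigma2 * 2 ** (-2 * rate_per_item)
            print(f"set size {set_size}: minimum expected squared error {distortion:.4f}")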

  8. Longitudinal analysis on utilization of medical document management system in a hospital with EPR implementation.

    PubMed

    Kuwata, Shigeki; Yamada, Hitomi; Park, Keunsik

    2011-01-01

    Document management systems (DMS) have become widespread in major hospitals in Japan as a platform to digitize the paper-based records not covered by the EPR. This study aimed to examine longitudinal trends in the actual use of a DMS in a hospital in which an EPR had been in operation, which would be conducive to planning the hospital's further information management systems. Degrees of utilization of electronic documents and templates with the DMS were analyzed based on data extracted from a university-affiliated hospital with an EPR. As a result, it was found that the number of electronic documents as well as scanned documents circulating at the hospital tended to increase. The result indicated that replacement of paper-based documents with electronic documents did not occur. Therefore it was anticipated that the need for the DMS would continue to increase in the hospital. The methods used in this study to analyze the trend of DMS utilization would be applicable to other hospitals with a variety of DMS implementations, such as electronic storage by scanning documents or paper preservation that is compatible with an EPR.

  9. Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis

    ERIC Educational Resources Information Center

    Ball, Richard; Medeiros, Norm

    2012-01-01

    This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…

  10. One-click scanning of large-size documents using mobile phone camera

    NASA Astrophysics Data System (ADS)

    Liu, Sijiang; Jiang, Bo; Yang, Yuanjie

    2016-07-01

    Currently mobile apps for document scanning do not provide convenient operations to tackle large-size documents. In this paper, we present a one-click scanning approach for large-size documents using a mobile phone camera. After capturing a continuous video of documents, our approach automatically extracts several key frames by optical flow analysis. Then, based on the key frames, a mobile GPU based image stitching method is adopted to generate a complete document image with high detail. There is no extra manual intervention in the process, and experimental results show that our app performs well, showing convenience and practicability for daily life.
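
    A rough desktop analogue of the two stages described above (not the authors' mobile GPU implementation) can be put together with OpenCV: accumulate optical flow to decide when to keep a new key frame, then stitch the key frames. File names and thresholds below are assumptions.

        # Rough desktop analogue of key-frame selection by optical flow followed by
        # stitching; not the authors' mobile implementation. Thresholds and file
        # names are assumptions.
        import cv2
        import numpy as np

        def key_frames(video_path, flow_threshold=40.0):
            cap = cv2.VideoCapture(video_path)
            ok, frame = cap.read()
            if not ok:
                raise SystemExit("could not read video")
            frames, prev_gray, accumulated = [frame], cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 0.0
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                                    0.5, 3, 15, 3, 5, 1.2, 0)
                accumulated += float(np.mean(np.linalg.norm(flow, axis=2)))
                if accumulated > flow_threshold:   # enough camera motion: keep a new key frame
                    frames.append(frame)
                    accumulated = 0.0
                prev_gray = gray
            cap.release()
            return frames

        frames = key_frames("document_sweep.mp4")
        status, stitched = cv2.Stitcher_create(cv2.Stitcher_SCANS).stitch(frames)
        if status == cv2.Stitcher_OK:
            cv2.imwrite("document_full.png", stitched)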

  11. STRengthening Analytical Thinking for Observational Studies: the STRATOS initiative

    PubMed Central

    Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James

    2014-01-01

    The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even ‘standard’ analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. PMID:25074480

  12. STRengthening analytical thinking for observational studies: the STRATOS initiative.

    PubMed

    Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James

    2014-12-30

    The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even 'standard' analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.

  13. Document Set Differentiability Analyzer v. 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Thor D.

    Software is a JMP Scripting Language (JSL) script designed to evaluate the differentiability of a set of documents that exhibit some conceptual commonalities but are expected to describe substantially different – thus differentiable – categories. The script imports the document set, a subset of which may be partitioned into an additions pool. The bulk of the documents form a basis pool. Text analysis is applied to the basis pool to extract a mathematical representation of its conceptual content, referred to as the document concept space. A bootstrapping approach is applied to that mathematical representation in order to generate a representation of a large population of randomly designed documents that could be written within the concept space, notably without actually writing the text of those documents. The Kolmogorov-Smirnov test is applied to determine whether the basis pool document set exhibits superior differentiation relative to the randomly designed virtual documents produced by bootstrapping. If an additions pool exists, the documents are incrementally added to the basis pool, choosing the best differentiated remaining document at each step. In this manner the impact of additional categories to overall document set differentiability may be assessed. The software was developed to assess the differentiability of job description document sets. Differentiability is key to meaningful categorization. Poor job differentiation may have economic, ethical, and/or legal implications for an organization. Job categories are used in the assignment of market-based salaries; consequently, poor differentiation of job duties may set the stage for legal challenges if very similar jobs pay differently depending on title, a circumstance that also invites economic waste. The software can be applied to ensure job description set differentiability, reducing legal, economic, and ethical risks to an organization and its people. The extraction of the conceptual space to a
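
    The statistical core of the comparison, testing whether real documents are better separated in a concept space than randomly composed ones, can be sketched with a two-sample Kolmogorov-Smirnov test. The concept vectors below are synthetic, and the whole sketch is a Python analogue rather than the JSL script itself.

        # Hedged sketch: compare pairwise distances of "real" documents against
        # bootstrapped pseudo-documents with a two-sample KS test. Synthetic
        # concept vectors; not the actual JSL tool.
        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(1)

        # Toy concept space: 30 documents, 10 concept dimensions, three distinct categories.
        centers = rng.normal(scale=3.0, size=(3, 10))
        basis_pool = np.vstack([c + rng.normal(scale=0.5, size=(10, 10)) for c in centers])

        # Bootstrapped pseudo-documents: concept weights resampled independently per dimension.
        random_pool = np.column_stack([rng.choice(basis_pool[:, j], size=300) for j in range(10)])

        real_distances = pdist(basis_pool)
        random_distances = pdist(random_pool)
        stat, p = ks_2samp(real_distances, random_distances)
        print(f"KS statistic {stat:.3f}, p-value {p:.3g}")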

  14. Linking Science Analysis with Observation Planning: A Full Circle Data Lifecycle

    NASA Technical Reports Server (NTRS)

    Grosvenor, Sandy; Jones, Jeremy; Koratkar, Anuradha; Li, Connie; Mackey, Jennifer; Neher, Ken; Wolf, Karl; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    A clear goal of the Virtual Observatory (VO) is to enable new science through analysis of integrated astronomical archives. An additional and powerful possibility of the VO is to link and integrate these new analyses with planning of new observations. By providing tools that can be used for observation planning in the VO, the VO will allow the data lifecycle to come full circle: from theory to observations to data and back around to new theories and new observations. The Scientist's Expert Assistant (SEA) Simulation Facility (SSF) is working to combine the ability to access existing archives with the ability to model and visualize new observations. Integrating the two will allow astronomers to better use the integrated archives of the VO to plan and predict the success of potential new observations more efficiently. The full circle lifecycle enabled by SEA can allow astronomers to make substantial leaps in the quality of data and science returns on new observations. Our paper examines the exciting potential of integrating archival analysis with new observation planning, such as performing data calibration analysis on archival images and using that analysis to predict the success of new observations, or performing dynamic signal-to-noise analysis combining historical results with modeling of new instruments or targets. We will also describe how the development of the SSF is progressing and what have been its successes and challenges.

  15. Linking Science Analysis with Observation Planning: A Full Circle Data Lifecycle

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)

    2001-01-01

    A clear goal of the Virtual Observatory (VO) is to enable new science through analysis of integrated astronomical archives. An additional and powerful possibility of the VO is to link and integrate these new analyses with planning of new observations. By providing tools that can be used for observation planning in the VO, the VO will allow the data lifecycle to come full circle: from theory to observations to data and back around to new theories and new observations. The Scientist's Expert Assistant (SEA) Simulation Facility (SSF) is working to combine the ability to access existing archives with the ability to model and visualize new observations. Integrating the two will allow astronomers to better use the integrated archives of the VO to plan and predict the success of potential new observations. The full circle lifecycle enabled by SEA can allow astronomers to make substantial leaps in the quality of data and science returns on new observations. Our paper will examine the exciting potential of integrating archival analysis with new observation planning, such as performing data calibration analysis on archival images and using that analysis to predict the success of new observations, or performing dynamic signal-to-noise analysis combining historical results with modeling of new instruments or targets. We will also describe how the development of the SSF is progressing and what have been its successes and challenges.

  16. JSC document index

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The Johnson Space Center (JSC) document index is intended to provide a single source listing of all published JSC-numbered documents, their authors, and the designated offices of prime responsibility (OPR's) by mail code at the time of publication. The index contains documents which have been received and processed by the JSC Technical Library as of January 13, 1988. Other JSC-numbered documents which are controlled but not available through the JSC Library are also listed.

  17. Gaia DR2 documentation Chapter 1: Introduction

    NASA Astrophysics Data System (ADS)

    de Bruijne, J. H. J.; Abreu, A.; Brown, A. G. A.; Castañeda, J.; Cheek, N.; Crowley, C.; De Angeli, F.; Drimmel, R.; Fabricius, C.; Fleitas, J.; Gracia-Abril, G.; Guerra, R.; Hutton, A.; Messineo, R.; Mora, A.; Nienartowicz, K.; Panem, C.; Siddiqui, H.

    2018-04-01

    This chapter of the Gaia DR2 documentation describes the Gaia mission, the Gaia spacecraft, and the organisation of the Gaia Data Processing and Analysis Consortium (DPAC), which is responsible for the processing and analysis of the Gaia data. Furthermore, various properties of the data release are summarised, including statistical properties, object statistics, completeness, selection and filtering criteria, and limitations of the data.

  18. Capturing User Reading Behaviors for Personalized Document Summarization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Songhua; Jiang, Hao; Lau, Francis

    2011-01-01

    We propose a new personalized document summarization method that observes a user's personal reading preferences. These preferences are inferred from the user's reading behaviors, including facial expressions, gaze positions, and reading durations that were captured during the user's past reading activities. We compare the performance of our algorithm with that of a few peer algorithms and software packages. The results of our comparative study show that our algorithm produces superior personalized document summaries compared with all the other methods, in that the summaries generated by our algorithm can better satisfy a user's personal preferences.

  19. FUSE Observations of Galactic and LMC Novae in Outburst

    NASA Technical Reports Server (NTRS)

    Huschildt, P. H.

    2001-01-01

    This document is a collection of five abstracts from papers written on the 'FUSE Observations of Galactic and LMC Novae in Outburst'. The titles are the following: (1) Analyzing FUSE Observations of Galactic and LMC Novae; (2) Detailed NLTE Model Atmospheres for Novae during Outburst: Modeling Optical and Ultraviolet Observations for Nova LMC 1988; (3) Numerical Solution of the Expanding Stellar Atmosphere Problem; (4) A Non-LTE Line-Blanketed Expanding Atmosphere Model for A-supergiant Alpha Cygni; and (5) Non-LTE Model Atmosphere Analysis of the Early Ultraviolet Spectra of Nova Andromedae 1986. A list of journal publications is also included.

  20. MODFLOW-2000, the U.S. Geological Survey modular ground-water model -- Documentation of MOD-PREDICT for predictions, prediction sensitivity analysis, and evaluation of uncertainty

    USGS Publications Warehouse

    Tonkin, M.J.; Hill, Mary C.; Doherty, John

    2003-01-01

    This document describes the MOD-PREDICT program, which helps evaluate user-defined sets of observations, prior information, and predictions, using the ground-water model MODFLOW-2000. MOD-PREDICT takes advantage of the existing Observation and Sensitivity Processes (Hill and others, 2000) by initiating runs of MODFLOW-2000 and using the output files produced. The names and formats of the MODFLOW-2000 input files are unchanged, such that full backward compatibility is maintained. A new name file and input files are required for MOD-PREDICT. The performance of MOD-PREDICT has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program using the email address available at the web address below. Updates might occasionally be made to this document, to the MOD-PREDICT program, and to MODFLOW- 2000. Users can check for updates on the Internet at URL http://water.usgs.gov/software/ground water.html/.

  1. A ward round proforma improves documentation and communication.

    PubMed

    Alazzawi, Sulaiman; Silk, Zacharia; Saha, Urmila U; Auplish, Sunil; Masterson, Sean

    2016-12-02

    This article presents the results of an audit cycle which evaluated the quality of inpatient ward round documentation in a busy district general hospital before and after the implementation of a standardized proforma which was specifically designed for trauma and orthopaedic patients. In each cycle, 20 case notes were examined and the data analysed to examine three main areas: (1) diagnosis, management and/or discharge plan; (2) objective assessments, including neurovascular status, weight-bearing status, surgical wound review, observations, results of investigations and the decision from the daily trauma meeting; and (3) logistics of the documentation, such as legibility, date and time, name and grade of the doctor and contact number. This audit demonstrated that using a ward round proforma can significantly enhance the quality of documentation and improve communication between multidisciplinary team members.

  2. Arab oil weapon. [documents, treaties, commentaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paust, J.J.; Blaustein, A.P.

    1977-01-01

    This compilation of publications dealing with the Arab oil weapon presents documents and commentaries. In Part I, the Embargo, the documents include: Historical Chronologies; The United States Oil Shortage and the Arab-Israeli Conflict; OPEC: Oil Negotiations, OPEC, and the Stability of Supply; and OPEC Resolutions and Other Documents. Commentaries include: The Arab Oil Weapon--A Threat to International Peace, by Jordan J. Paust and Albert P. Blaustein; Destination Embargo of Arab Oil: Its Legality Under International Law, by Ibrahim F. I. Shihata; The Arab Oil Weapon: A Reply and Re-Affirmation of Illegality, by Jordan J. Paust and Albert P. Blaustein; Economic Coercion and the International Legal Order, by Richard B. Lillich; Some Politico-Legal Aspects of Resource Scarcity, by Timothy Stanley; and OPEC in the Context of the Global Power Equation, by Jahangir Amuzegar. Part II, The Response, includes the following documents: Presidential Statements: Carter and Ford; The Energy Crisis: Strategy for Cooperative Action, by Henry A. Kissinger; Oil Fields as Military Objectives; and Data and Analysis: Concerning the Possibility of a U.S. Food Embargo as a Response to the present Arab Oil Boycott. The commentaries in Part II are: Oil: The Issue of American Intervention, by Robert W. Tucker; War--The Ultimate Antitrust Actions, by Andrew Tobias; and The Need for Negotiated Reforms, by John H. Jackson. Part III, Legal Framework, contains 10 United Nations documents and 4 treaties. (MCW)

  3. Managing Documents in the Wider Area: Intelligent Document Management.

    ERIC Educational Resources Information Center

    Bittleston, Richard

    1995-01-01

    Discusses techniques for managing documents in wide area networks, reviews technique limitations, and offers recommendations to database designers. Presented techniques include: increasing bandwidth, reducing data traffic, synchronizing documentation, partial synchronization, audit trails, navigation, and distribution control and security. Two…

  4. How Should We Treat the Vulnerable?: Qualitative Study of Authoritative Ethics Documents.

    PubMed

    Zagorac, Ivana

    2016-01-01

    The aim of this study is to explore what actual guidance is provided by authoritative ethics documents regarding the recognition and protection of the vulnerable. The documents included in this analysis are the Belmont Report, the Declaration of Helsinki, The Council for International Organizations of Medical Sciences (CIOMS) Guidelines, and the UNESCO Universal Declaration on Bioethics and Human Rights, including its supplementary report on vulnerability. A qualitative analysis of these documents was conducted in light of three questions: what is vulnerability, who are the vulnerable, and how should the vulnerable be protected? The results show significant differences among the documents regarding the first two questions. None of the documents provides any guidance on the third question (how to protect the vulnerable). These results suggest a great discrepancy between the acknowledged importance of the concept of vulnerability and a general understanding of the scope, content, and practical implications of vulnerability.

  5. Perspectives of Education for Sustainable Development--Understanding and Introducing the Notion in Polish Educational Documents

    ERIC Educational Resources Information Center

    Czapla, Malgorzata; Berlinska, Agnieszka

    2011-01-01

    The aim of this article is to present an analysis of formal educational documents in the context of the sustainable development notion. This goal was realised by an analysis of the National Curriculum Framework documents from 2002 in comparison with the newest document from 2008. In addition, seven teaching programmes were analysed. On the grounds…

  6. BIM applied in historical building documentation and refurbishing

    NASA Astrophysics Data System (ADS)

    Cheng, H.-M.; Yang, W.-B.; Yen, Y.-N.

    2015-08-01

    Historical building conservation raises two important issues: documentation and refurbishing. For recording and documentation, 3D laser scanners and photogrammetry technologies have already been developed that capture a frozen state of the object as a virtual-reality digital record. On the other hand, the refurbishment engineering of historic buildings is a challenge for heritage conservation, which involves not only reconstructing the damaged parts but also restoring tangible cultural heritage. 3D digital cultural heritage models have become a topic of great interest in recent years. One reason for this is the more widespread use of laser scanning and photogrammetry for recording cultural heritage sites. These technologies have made it possible to efficiently and accurately record complex structures remotely in a way that would not have been possible with previous survey methods. In addition to these developments, digital information systems are evolving for the presentation, analysis and archiving of heritage documentation.

  7. Manned space flight nuclear system safety. Volume 3: Reactor system preliminary nuclear safety analysis. Part 2: Accident Model Document (AMD)

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The Accident Model Document is one of three documents of the Preliminary Safety Analysis Report (PSAR) - Reactor System as applied to a Space Base Program. Potential terrestrial nuclear hazards involving the zirconium hydride reactor-Brayton power module are identified for all phases of the Space Base program. The accidents/events that give rise to the hazards are defined and abort sequence trees are developed to determine the sequence of events leading to the hazard and the associated probabilities of occurrence. Source terms are calculated to determine the magnitude of the hazards. The above data is used in the mission accident analysis to determine the most probable and significant accidents/events in each mission phase. The only significant hazards during the prelaunch and launch ascent phases of the mission are those which arise from criticality accidents. Fission product inventories during this time period were found to be very low due to very limited low power acceptance testing.

  8. Font adaptive word indexing of modern printed documents.

    PubMed

    Marinai, Simone; Marino, Emanuele; Soda, Giovanni

    2006-08-01

    We propose an approach for the word-level indexing of modern printed documents which are difficult to recognize using current OCR engines. By means of word-level indexing, it is possible to retrieve the position of words in a document, enabling queries involving proximity of terms. Web search engines implement this kind of indexing, allowing users to retrieve Web pages on the basis of their textual content. Nowadays, digital libraries hold collections of digitized documents that can be retrieved either by browsing the document images or relying on appropriate metadata assembled by domain experts. Word indexing tools would therefore increase the access to these collections. The proposed system is designed to index homogeneous document collections by automatically adapting to different languages and font styles without relying on OCR engines for character recognition. The approach is based on three main ideas: the use of Self Organizing Maps (SOM) to perform unsupervised character clustering, the definition of one suitable vector-based word representation whose size depends on the word aspect-ratio, and the run-time alignment of the query word with indexed words to deal with broken and touching characters. The most appropriate applications are for processing modern printed documents (17th to 19th centuries) where current OCR engines are less accurate. Our experimental analysis addresses six data sets containing documents ranging from books of the 17th century to contemporary journals.
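
    The first of the three ideas, unsupervised character clustering with a self-organizing map, can be sketched with a minimal SOM written directly in NumPy. The map size, learning schedule, and the toy feature vectors below are illustrative assumptions; this is not the indexing system described in the paper.

        # Minimal self-organizing map (SOM) sketch for unsupervised clustering of
        # character feature vectors. Map size, schedule, and toy data are assumptions.
        import numpy as np

        def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
            rng = np.random.default_rng(seed)
            weights = rng.random((grid[0], grid[1], data.shape[1]))
            rows, cols = np.meshgrid(np.arange(grid[0]), np.arange(grid[1]), indexing="ij")
            total_steps = epochs * len(data)
            step = 0
            for _ in range(epochs):
                for x in rng.permutation(data):
                    # Best-matching unit and a Gaussian neighbourhood around it.
                    dists = np.linalg.norm(weights - x, axis=2)
                    bi, bj = np.unravel_index(np.argmin(dists), grid)
                    t = step / total_steps
                    lr, sigma = lr0 * (1 - t), sigma0 * (1 - t) + 0.5
                    influence = np.exp(-((rows - bi) ** 2 + (cols - bj) ** 2) / (2 * sigma ** 2))
                    weights += lr * influence[..., None] * (x - weights)
                    step += 1
            return weights

        def best_matching_unit(weights, x):
            return np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=2)), weights.shape[:2])

        # Toy "character images": 200 samples of 7x7 glyph patches flattened to 49-d vectors.
        data = np.random.default_rng(1).random((200, 49))
        som = train_som(data)
        print("map cell of the first sample:", best_matching_unit(som, data[0]))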

  9. Documenting Employee Conduct

    ERIC Educational Resources Information Center

    Dalton, Jason

    2009-01-01

    One of the best ways for a child care program to lose an employment-related lawsuit is failure to document the performance of its employees. Documentation of an employee's performance can provide evidence of an employment-related decision such as discipline, promotion, or discharge. When properly implemented, documentation of employee performance…

  10. HIS-based Kaplan-Meier plots--a single source approach for documenting and reusing routine survival information.

    PubMed

    Breil, Bernhard; Semjonow, Axel; Müller-Tidow, Carsten; Fritz, Fleur; Dugas, Martin

    2011-02-16

    Survival or outcome information is important for clinical routine as well as for clinical research and should be collected completely, timely and precisely. This information is relevant for multiple usages including quality control, clinical trials, observational studies and epidemiological registries. However, the local hospital information system (HIS) does not support this documentation and therefore this data has to be generated by paper-based or spreadsheet methods which can result in redundantly documented data. Therefore we investigated whether integrating the follow-up documentation of different departments in the HIS and reusing it for survival analysis can enable the physician to obtain survival curves in a timely manner and to avoid redundant documentation. We analysed the current follow-up process of oncological patients in two departments (urology, haematology) with respect to different documentation forms. We developed a concept for comprehensive survival documentation based on a generic data model and implemented a follow-up form within the HIS of the University Hospital Muenster which is suitable for a secondary use of these data. We designed a query to extract the relevant data from the HIS and implemented Kaplan-Meier plots based on these data. To re-use this data, sufficient data quality is needed. We measured completeness of forms with respect to all tumour cases in the clinic and completeness of documented items per form, as incomplete information can bias results of the survival analysis. Based on the form analysis we discovered differences and concordances between both departments. We identified 52 attributes from which 13 were common (e.g. procedures and diagnosis dates) and were used for the generic data model. The electronic follow-up form was integrated in the clinical workflow. Survival data was also retrospectively entered in order to perform survival and quality analyses on a comprehensive data set. Physicians are now able to generate
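
    Once follow-up records can be extracted from the HIS, producing a Kaplan-Meier plot is a standard step. The sketch below uses the lifelines package on a small made-up export; the column names and example values are assumptions, not the study's data model.

        # Hedged sketch: Kaplan-Meier plot from follow-up records exported from a
        # hospital information system. Column names and data are assumptions.
        import pandas as pd
        import matplotlib.pyplot as plt
        from lifelines import KaplanMeierFitter

        followup = pd.DataFrame({
            "days_observed": [320, 410, 150, 500, 365, 90, 600, 275],  # time from diagnosis to last contact/death
            "event_death":   [1,   0,   1,   0,   0,   1,   0,   1],   # 1 = death observed, 0 = censored
        })

        kmf = KaplanMeierFitter()
        kmf.fit(durations=followup["days_observed"],
                event_observed=followup["event_death"],
                label="all documented patients")
        ax = kmf.plot_survival_function()
        ax.set_xlabel("days since diagnosis")
        ax.set_ylabel("estimated survival probability")
        plt.tight_layout()
        plt.savefig("kaplan_meier.png")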

  11. Documentation of Atmospheric Conditions During Observed Rising Aircraft Wakes

    NASA Technical Reports Server (NTRS)

    Zak, J. Allen; Rodgers, William G., Jr.

    1997-01-01

    Flight tests were conducted in the fall of 1995 off the coast of Wallops Island, Virginia in order to determine characteristics of wake vortices at flight altitudes. A NASA Wallops Flight Facility C130 aircraft equipped with smoke generators produced visible wakes at altitudes ranging from 775 to 2225 m in a variety of atmospheric conditions, orientations (head wind, cross wind), and airspeeds. Meteorological and aircraft parameters were collected continuously from a Langley Research Center OV-10A aircraft as it flew alongside and through the wake vortices at varying distances behind the C130. Meteorological data were also obtained from special balloon observations made at Wallops. Differential GPS capabilities were on each aircraft from which accurate altitude profiles were obtained. Vortices were observed to rise at distances beyond a mile behind the C130. The maximum altitude was 150 m above the C130 in a near neutral atmosphere with significant turbulence. This occurred from large vertical oscillations in the wakes. There were several cases when vortices did not descend after a very short initial period and remained near generation altitude in a variety of moderately stable atmospheres and wind shears.

  12. Documentation of indigenous Pacific agroforestry systems: a review of methodologies

    Treesearch

    Bill Raynor

    1993-01-01

    Recent interest in indigenous agroforestry has led to a need for documentation of these systems. However, previous work is very limited, and few methodologies are well-known or widely accepted. This paper outlines various methodologies (including sampling methods, data to be collected, and considerations in analysis) for documenting structure and productivity of...

  13. Enhancing the performance of gastrointestinal tumour board by improving documentation.

    PubMed

    Alsuhaibani, Roaa Saleh; Alzahrani, Hajer; Algwaiz, Ghada; Alfarhan, Haneen; Alolayan, Ashwaq; Abdelhafiz, Nafisa; Ali, Yosra; Jazieh, Abdul Rahman

    2018-01-01

    Tumour board contributes to providing better patient care by using a multidisciplinary team approach. In the effort to evaluate the performance of the gastrointestinal tumour board at our institution, it was difficult to assess past performance due to lack of proper use of a standardised documentation tool. This project aimed at improving adherence to the documentation tool and its recommendations in order to obtain performance measures for the tumour board. A multidisciplinary team and a plan were developed to improve documentation. Four rapid improvement cycles, Plan-Do-Study-Act (PDSA) cycles, were conducted. The first cycle focused on updating the case discussion summary form (CDSF) based on experts' input and previously identified deficiencies to enhance documentation and improve performance. The second PDSA cycle aimed at incorporating the CDSF into the electronic medical records system and assessing its functionality. The third cycle was to orient and train staff on using the form and launching it. The fourth PDSA cycle aimed at assessing the ability to obtain tumour board performance measures. Adherence to completion of the CDSF improved from 82% (baseline) to 94% after the fourth PDSA cycle. Across the 104 consecutive cases discussed in the tumour board between January and July 2016 and the 76 cases discussed in 2015, results were as follows: adherence to National Comprehensive Cancer Network guidelines in 2016 was observed in 141 (95%) recommendations, while it was observed in 90 (92%) recommendations in 2015. Changes in the management plans were observed in 37 (36%) cases in 2016 and in 6 (8%) cases in 2015. Regarding tumour board recommendations, 87% were done within 3 months of tumour board discussion in 2016, while 69% were done in 2015. Implementing an electronic standardised documentation tool improved communication among the team and enabled getting accurate data about performance measures of the tumour board with positive impact on healthcare process and

  14. Enhancing the performance of gastrointestinal tumour board by improving documentation

    PubMed Central

    Alsuhaibani, Roaa Saleh; Alzahrani, Hajer; Algwaiz, Ghada; Alfarhan, Haneen; Alolayan, Ashwaq; Abdelhafiz, Nafisa; Ali, Yosra; Jazieh, Abdul Rahman

    2018-01-01

    Tumour board contributes to providing better patient care by using a multidisciplinary team approach. In the effort to evaluate the performance of the gastrointestinal tumour board at our institution, it was difficult to assess past performance due to lack of proper use of a standardised documentation tool. This project aimed at improving adherence to the documentation tool and its recommendations in order to obtain performance measures for the tumour board. A multidisciplinary team and a plan were developed to improve documentation. Four rapid improvement cycles, Plan–Do–Study–Act (PDSA) cycles, were conducted. The first cycle focused on updating the case discussion summary form (CDSF) based on experts' input and previously identified deficiencies to enhance documentation and improve performance. The second PDSA cycle aimed at incorporating the CDSF into the electronic medical records system and assessing its functionality. The third cycle was to orient and train staff on using the form and launching it. The fourth PDSA cycle aimed at assessing the ability to obtain tumour board performance measures. Adherence to completion of the CDSF improved from 82% (baseline) to 94% after the fourth PDSA cycle. Across the 104 consecutive cases discussed in the tumour board between January and July 2016 and the 76 cases discussed in 2015, results were as follows: adherence to National Comprehensive Cancer Network guidelines in 2016 was observed in 141 (95%) recommendations, while it was observed in 90 (92%) recommendations in 2015. Changes in the management plans were observed in 37 (36%) cases in 2016 and in 6 (8%) cases in 2015. Regarding tumour board recommendations, 87% were done within 3 months of tumour board discussion in 2016, while 69% were done in 2015. Implementing an electronic standardised documentation tool improved communication among the team and enabled getting accurate data about performance measures of the tumour board with positive impact on healthcare process

  15. Market Analysis and Consumer Impacts Source Document. Part II. Review of Motor Vehicle Market and Consumer Expenditures on Motor Vehicle Transportation

    DOT National Transportation Integrated Search

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part II consists of studies and review on: motor vehicle sales trends; motor vehicle fleet life and fleet composition; car buying patterns of the busi...

  16. Guidelines for economic analysis of pharmaceutical products: a draft document for Ontario and Canada.

    PubMed

    Detsky, A S

    1993-05-01

    In Canada, provincial formulary review committees consider the effectiveness, safety, and cost of products when they derive advice for each Minister of Health. This article offers a draft set of guidelines for pharmaceutical manufacturers making submissions which include economic information, moving beyond a simple presentation of the unit price of the pharmaceutical product (e.g. price per day or course of therapy) and comparison to similar prices for alternative products. A full economic analysis compares all relevant costs and clinical outcomes of the new product with alternate therapeutic strategies for treating patients with a particular condition. The perspective of the decision maker must be clearly identified. The quality of the evidence supporting estimates of the variables incorporated in the analysis should be evaluated. Sensitivity analyses are used to assess the robustness of the qualitative conclusions. Reviewers will examine the answers to a set of 19 questions. Manufacturers can use these questions as a worksheet for preparation of an economic analysis to be incorporated in a submission. These guidelines are intended to be a starting point for further refinement, and discussion with health economists in industry and academia. Considerable flexibility will be used in reviewing documentation supporting economic analysis. Those preparing submissions should be encouraged to experiment with various approaches as part of the general development of this field and to engage provincial review committees in ongoing discussions.

  17. Safety and fitness electronic records (SAFER) system : logical architecture document : working draft

    DOT National Transportation Integrated Search

    1997-01-31

    This Logical Architecture Document includes the products developed during the functional analysis of the Safety and Fitness Electronic Records (SAFER) System. This document, along with the companion Operational Concept and Physical Architecture Docum...

  18. Impact of incomplete correspondence between document titles and texts on users' representations: a cognitive and linguistic analysis based on 25 technical documents.

    PubMed

    Eyrolle, Hélène; Virbel, Jacques; Lemarié, Julie

    2008-03-01

    Based on previous research in the field of cognitive psychology, highlighting the facilitatory effects of titles on several text-related activities, this paper looks at the extent to which titles reflect text content. An exploratory study of real-life technical documents investigated the content of their Subject lines, which linguistic analyses had led us to regard as titles. The study showed that most of the titles supplied by the writers failed to represent the documents' contents and that most users failed to detect this lack of validity.

  19. Systematic Documentation: Structures and Tools in a Practice of Communicative Documentation

    ERIC Educational Resources Information Center

    Alnervik, Karin

    2018-01-01

    Swedish preschool teachers must systematically document activities in the preschool in order to evaluate the quality of these activities. Pedagogical documentation is one form of documentation that is proposed. The aim of this article is to discuss and create knowledge of structures and tools based on different communicative aspects of pedagogical…

  20. Cat swarm optimization based evolutionary framework for multi document summarization

    NASA Astrophysics Data System (ADS)

    Rautray, Rasmita; Balabantaray, Rakesh Chandra

    2017-07-01

    Today, the World Wide Web has brought us an enormous quantity of online information. As a result, extracting relevant information from massive data has become a challenging issue. In the recent past, text summarization has been recognized as one solution for extracting useful information from vast numbers of documents. Based on the number of documents considered for summarization, it is categorized as single-document or multi-document summarization. Multi-document summarization is more challenging than single-document summarization, since an accurate summary must be drawn from multiple documents. Hence, in this study, a novel Cat Swarm Optimization (CSO) based multi-document summarizer is proposed to address the problem of multi-document summarization. The proposed CSO-based model is also compared with two other nature-inspired summarizers, a Harmony Search (HS) based summarizer and a Particle Swarm Optimization (PSO) based summarizer. On the benchmark Document Understanding Conference (DUC) datasets, the performance of all algorithms is compared in terms of different evaluation metrics such as ROUGE score, F score, sensitivity, positive predictive value, summary accuracy, inter-sentence similarity and readability, which together validate the non-redundancy, cohesiveness and readability of the summary. The experimental analysis clearly reveals that the proposed approach outperforms the other summarizers included in the study.
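
    The abstract above ranks summarizers by ROUGE score among other metrics. As a point of reference only, here is a minimal sketch of ROUGE-1 recall (clipped unigram overlap between a candidate summary and a reference); the function name and example strings are illustrative and are not taken from the paper.

    ```python
    from collections import Counter

    def rouge_1_recall(candidate: str, reference: str) -> float:
        """ROUGE-1 recall: fraction of reference unigrams covered by the candidate."""
        cand_counts = Counter(candidate.lower().split())
        ref_counts = Counter(reference.lower().split())
        # Clipped overlap: a reference token is credited at most as often as it occurs
        overlap = sum(min(n, cand_counts[tok]) for tok, n in ref_counts.items())
        total = sum(ref_counts.values())
        return overlap / total if total else 0.0

    print(rouge_1_recall("the cat sat on the mat", "the cat lay on the mat"))  # ~0.83
    ```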

  1. A knowledge-driven approach to biomedical document conceptualization.

    PubMed

    Zheng, Hai-Tao; Borchert, Charles; Jiang, Yong

    2010-06-01

    Biomedical document conceptualization is the process of clustering biomedical documents based on ontology-represented domain knowledge. The result of this process is the representation of the biomedical documents by a set of key concepts and their relationships. Most clustering methods cluster documents based on invariant domain knowledge. The objective of this work is to develop an effective method to cluster biomedical documents based on various user-specified ontologies, so that users can exploit the concept structures of documents more effectively. We develop a flexible framework to allow users to specify the knowledge bases, in the form of ontologies. Based on the user-specified ontologies, we develop a key concept induction algorithm, which uses latent semantic analysis to identify key concepts and cluster documents. A corpus-related ontology generation algorithm is developed to generate the concept structures of documents. Based on two biomedical datasets, we evaluate the proposed method and five other clustering algorithms. The clustering results of the proposed method outperform the five other algorithms in terms of key concept identification. With respect to the first biomedical dataset, our method achieves F-measure values of 0.7294 and 0.5294 based on the MeSH ontology and gene ontology (GO), respectively. With respect to the second biomedical dataset, our method achieves F-measure values of 0.6751 and 0.6746 based on the MeSH ontology and GO, respectively. Both results outperform the five other algorithms in terms of F-measure. Based on the MeSH ontology and GO, the generated corpus-related ontologies show informative conceptual structures. The proposed method enables users to specify the domain knowledge to exploit the conceptual structures of biomedical document collections. In addition, the proposed method is able to extract the key concepts and cluster the documents with a relatively high precision. Copyright 2010 Elsevier B.V. All rights reserved.
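
    The key concept induction step above relies on latent semantic analysis, but the abstract does not spell out the algorithm. The sketch below therefore shows only a generic LSA document-clustering pipeline (TF-IDF, truncated SVD, k-means) with scikit-learn; the toy documents and parameter values are placeholders, and this is not the authors' ontology-driven method.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.cluster import KMeans

    docs = [
        "gene expression regulation in cancer cells",
        "tumor suppressor gene mutation analysis",
        "clinical trial of a new antihypertensive drug",
        "blood pressure response to drug therapy",
    ]

    # Term-document matrix weighted by TF-IDF
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)

    # Latent semantic analysis: project documents into a low-rank concept space
    lsa = TruncatedSVD(n_components=2, random_state=0)
    concept_space = lsa.fit_transform(tfidf)

    # Cluster documents in the concept space
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(concept_space)
    print(labels)
    ```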

  2. An Ideal Observer Analysis of Visual Working Memory

    ERIC Educational Resources Information Center

    Sims, Chris R.; Jacobs, Robert A.; Knill, David C.

    2012-01-01

    Limits in visual working memory (VWM) strongly constrain human performance across many tasks. However, the nature of these limits is not well understood. In this article we develop an ideal observer analysis of human VWM by deriving the expected behavior of an optimally performing but limited-capacity memory system. This analysis is framed around…

  3. Payload Documentation Enhancement Project

    NASA Technical Reports Server (NTRS)

    Brown, Betty G.

    1999-01-01

    In late 1998, the Space Shuttle Program recognized a need to revitalize its payload accommodations documentation. As a result, a payload documentation enhancement project was initiated to review and update payload documentation and improve the accessibility of that documentation for the Space Shuttle user community.

  4. The more it changes; the more it remains the same: a Foucauldian analysis of Canadian policy documents relevant to student selection for medical school.

    PubMed

    Razack, Saleem; Lessard, David; Hodges, Brian D; Maguire, Mary H; Steinert, Yvonne

    2014-05-01

    Calls to increase the demographic representativeness of medical classes to better reflect the diversity of society are part of a growing international trend. Despite this, entry into medical school remains highly competitive and exclusive of marginalized groups. To address these issues, we conducted a Foucauldian discourse analysis of 15 publicly available policy documents from the websites of Canadian medical education regulatory bodies, using the concepts of "excellence" (institutional or in an applicant), "diversity," and "equity" to frame the analysis. In most documents, there were appeals to broaden definitions of institutional excellence to include concerns for greater social accountability. Equity concerns tended to be represented as needing to be dealt with by people in positions of authority in order to counter a "hidden curriculum." Diversity was represented as an object of value, situated within a discontinuous history. As a rhetorical strategy, documents invoked complex societal shifts to promote change toward a more humanistic medical education system and profession. "Social accountability" was reified as an all-encompassing solution to most issues of representation. Although the policy documents proclaimed rootedness in an ethos of improving the societal responsiveness of the medical profession, our analysis takes a more critical stance towards the discourses identified. On the basis of our research findings, we question whether these calls may contribute to the maintenance of the specific power relations they seek to address. These conclusions lead us to consider the possibility that the discourses represented in the documents might be reframed to take into account issues of power distribution and its productive and reproductive features. A reframing of discourses could potentially generate greater inclusiveness in policy development processes, and afford disadvantaged and marginalized groups more participatory roles in the discussion.

  5. Reading, Writing, and Documentation and Managing the Development of User Documentation.

    ERIC Educational Resources Information Center

    Lindberg, Wayne; Hoffman, Terrye

    1987-01-01

    The first of two articles addressing the issue of user documentation for computer software discusses the need to teach users how to read documentation. The second presents a guide for writing documentation that is based on the instructional systems design model, and makes suggestions for the desktop publishing of user manuals. (CLB)

  6. Script-independent text line segmentation in freestyle handwritten documents.

    PubMed

    Li, Yi; Zheng, Yefeng; Doermann, David; Jaeger, Stefan; Li, Yi

    2008-08-01

    Text line segmentation in freestyle handwritten documents remains an open document analysis problem. Curvilinear text lines and small gaps between neighboring text lines present a challenge to algorithms developed for machine printed or hand-printed documents. In this paper, we propose a novel approach based on density estimation and a state-of-the-art image segmentation technique, the level set method. From an input document image, we estimate a probability map, where each element represents the probability that the underlying pixel belongs to a text line. The level set method is then exploited to determine the boundary of neighboring text lines by evolving an initial estimate. Unlike connected component based methods ( [1], [2] for example), the proposed algorithm does not use any script-specific knowledge. Extensive quantitative experiments on freestyle handwritten documents with diverse scripts, such as Arabic, Chinese, Korean, and Hindi, demonstrate that our algorithm consistently outperforms previous methods [1]-[3]. Further experiments show the proposed algorithm is robust to scale change, rotation, and noise.
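
    The method above first estimates a probability map in which each value is the probability that the underlying pixel belongs to a text line, and then evolves line boundaries with the level set method. The level set step is too involved for a short sketch, but the density-estimation idea can be approximated by smoothing an ink mask with an anisotropic Gaussian so that characters on the same line merge before neighbouring lines do. The code below is only that approximation; the fixed ink threshold and sigma values are illustrative assumptions, not the paper's parameters.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def text_density_map(gray_image: np.ndarray) -> np.ndarray:
        """Rough per-pixel text-line density estimate for a greyscale page image."""
        ink = (gray_image < 128).astype(float)      # crude foreground (ink) mask
        # Anisotropic smoothing: a large horizontal sigma links characters on a line,
        # a small vertical sigma keeps neighbouring lines separate.
        density = gaussian_filter(ink, sigma=(3, 25))
        return density / (density.max() + 1e-9)     # normalise to [0, 1]

    # Toy page: two horizontal "text lines" of random ink on a white background
    page = np.full((200, 400), 255, dtype=np.uint8)
    rng = np.random.default_rng(0)
    for top in (40, 120):
        page[top:top + 20, 30:370][rng.random((20, 340)) < 0.3] = 0
    print(text_density_map(page).shape)
    ```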

  7. Ensemble methods with simple features for document zone classification

    NASA Astrophysics Data System (ADS)

    Obafemi-Ajayi, Tayo; Agam, Gady; Xie, Bingqing

    2012-01-01

    Document layout analysis is of fundamental importance for document image understanding and information retrieval. It requires the identification of blocks extracted from a document image via features extraction and block classification. In this paper, we focus on the classification of the extracted blocks into five classes: text (machine printed), handwriting, graphics, images, and noise. We propose a new set of features for efficient classifications of these blocks. We present a comparative evaluation of three ensemble based classification algorithms (boosting, bagging, and combined model trees) in addition to other known learning algorithms. Experimental results are demonstrated for a set of 36503 zones extracted from 416 document images which were randomly selected from the tobacco legacy document collection. The results obtained verify the robustness and effectiveness of the proposed set of features in comparison to the commonly used Ocropus recognition features. When used in conjunction with the Ocropus feature set, we further improve the performance of the block classification system to obtain a classification accuracy of 99.21%.
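
    The block-classification experiments above compare ensemble learners (boosting, bagging, combined model trees) over simple zone features. The snippet below shows only the generic comparison pattern with scikit-learn; the random features and labels are placeholders standing in for the paper's zone features and classes.

    ```python
    import numpy as np
    from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    # Placeholder zone features (e.g. ink density, run lengths, edge counts) and labels
    rng = np.random.default_rng(0)
    X = rng.random((500, 8))
    y = rng.integers(0, 5, 500)  # five classes: text, handwriting, graphics, image, noise

    for name, clf in [
        ("bagging", BaggingClassifier(n_estimators=50, random_state=0)),
        ("boosting", GradientBoostingClassifier(random_state=0)),
    ]:
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"{name}: mean accuracy {scores.mean():.3f}")
    ```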

  8. Car manufacturers and global road safety: a word frequency analysis of road safety documents.

    PubMed

    Roberts, I; Wentz, R; Edwards, P

    2006-10-01

    The World Bank believes that the car manufacturers can make a valuable contribution to road safety in poor countries and has established the Global Road Safety Partnership (GRSP) for this purpose. However, some commentators are sceptical. The authors examined road safety policy documents to assess the extent of any bias. Word frequency analyses of road safety policy documents from the World Health Organization (WHO) and the GRSP. The relative occurrence of key road safety terms was quantified by calculating a word prevalence ratio with 95% confidence intervals. Terms for which there was a fourfold difference in prevalence between the documents were tabulated. Compared to WHO's World report on road traffic injury prevention, the GRSP road safety documents were substantially less likely to use the words speed, speed limits, child restraint, pedestrian, public transport, walking, and cycling, but substantially more likely to use the words school, campaign, driver training, and billboard. There are important differences in emphasis in road safety policy documents prepared by WHO and the GRSP. Vigilance is needed to ensure that the road safety interventions that the car industry supports are based on sound evidence of effectiveness.
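
    The study above quantifies the relative occurrence of key terms with a word prevalence ratio and a 95% confidence interval, but the formula is not given in the abstract. A common approximation treats each term count as a Poisson count relative to the total number of words in each corpus and builds the interval on the log ratio; the sketch below implements that standard approximation with invented counts and is not necessarily the authors' exact computation.

    ```python
    import math

    def prevalence_ratio(count_a: int, total_a: int, count_b: int, total_b: int):
        """Ratio of word prevalence in corpus A vs corpus B, with an approximate 95% CI
        based on the standard error of the log ratio (Poisson approximation)."""
        ratio = (count_a / total_a) / (count_b / total_b)
        se_log = math.sqrt(1 / count_a + 1 / count_b)
        lower = ratio * math.exp(-1.96 * se_log)
        upper = ratio * math.exp(1.96 * se_log)
        return ratio, (lower, upper)

    # e.g. a term appearing 40 times in 60,000 WHO words vs 5 times in 45,000 GRSP words
    print(prevalence_ratio(40, 60_000, 5, 45_000))  # ratio 6.0, CI roughly (2.4, 15.2)
    ```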

  9. Skyline: an open source document editor for creating and analyzing targeted proteomics experiments.

    PubMed

    MacLean, Brendan; Tomazela, Daniela M; Shulman, Nicholas; Chambers, Matthew; Finney, Gregory L; Frewen, Barbara; Kern, Randall; Tabb, David L; Liebler, Daniel C; MacCoss, Michael J

    2010-04-01

    Skyline is a Windows client application for targeted proteomics method creation and quantitative data analysis. It is open source and freely available for academic and commercial use. The Skyline user interface simplifies the development of mass spectrometer methods and the analysis of data from targeted proteomics experiments performed using selected reaction monitoring (SRM). Skyline supports using and creating MS/MS spectral libraries from a wide variety of sources to choose SRM filters and verify results based on previously observed ion trap data. Skyline exports transition lists to and imports the native output files from Agilent, Applied Biosystems, Thermo Fisher Scientific and Waters triple quadrupole instruments, seamlessly connecting mass spectrometer output back to the experimental design document. The fast and compact Skyline file format is easily shared, even for experiments requiring many sample injections. A rich array of graphs displays results and provides powerful tools for inspecting data integrity as data are acquired, helping instrument operators to identify problems early. The Skyline dynamic report designer exports tabular data from the Skyline document model for in-depth analysis with common statistical tools. Single-click, self-updating web installation is available at http://proteome.gs.washington.edu/software/skyline. This web site also provides access to instructional videos, a support board, an issues list and a link to the source code project.

  10. Object-Oriented Approach for 3d Archaeological Documentation

    NASA Astrophysics Data System (ADS)

    Valente, R.; Brumana, R.; Oreni, D.; Banfi, F.; Barazzetti, L.; Previtali, M.

    2017-08-01

    Documentation on archaeological fieldworks needs to be accurate and time-effective. Many features unveiled during excavations can be recorded just once, since the archaeological workflow physically removes most of the stratigraphic elements. Some of them have peculiar characteristics which make them hardly recognizable as objects and prevent a full 3D documentation. The paper presents a suitable feature-based method to carry on archaeological documentation with a three-dimensional approach, tested on the archaeological site of S. Calocero in Albenga (Italy). The method is based on one hand on the use of structure from motion techniques for on-site recording and 3D Modelling to represent the three-dimensional complexity of stratigraphy. The entire documentation workflow is carried out through digital tools, assuring better accuracy and interoperability. Outputs can be used in GIS to perform spatial analysis; moreover, a more effective dissemination of fieldworks results can be assured with the spreading of datasets and other information through web-services.

  11. Robust binarization of degraded document images using heuristics

    NASA Astrophysics Data System (ADS)

    Parker, Jon; Frieder, Ophir; Frieder, Gideon

    2013-12-01

    Historically significant documents are often discovered with defects that make them difficult to read and analyze. This fact is particularly troublesome if the defects prevent software from performing an automated analysis. Image enhancement methods are used to remove or minimize document defects, improve software performance, and generally make images more legible. We describe an automated image enhancement method that is input page independent and requires no training data. The approach applies to color or greyscale images with handwritten script, typewritten text, images, and mixtures thereof. We evaluated the image enhancement method against the test images provided by the 2011 Document Image Binarization Contest (DIBCO). Our method outperforms all 2011 DIBCO entrants in terms of average F1 measure - doing so with a significantly lower variance than top contest entrants. The capability of the proposed method is also illustrated using select images from a collection of historic documents stored at Yad Vashem Holocaust Memorial in Israel.
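
    The DIBCO-style ranking mentioned above scores binarization methods by F1 measure over foreground (text) pixels. A minimal sketch of that metric, computed from boolean foreground masks, is shown below; the toy masks are illustrative.

    ```python
    import numpy as np

    def binarization_f1(predicted_fg: np.ndarray, true_fg: np.ndarray) -> float:
        """F1 measure over foreground (text) pixels of a binarized page."""
        pred = predicted_fg.astype(bool)
        truth = true_fg.astype(bool)
        tp = np.logical_and(pred, truth).sum()
        precision = tp / pred.sum() if pred.sum() else 0.0
        recall = tp / truth.sum() if truth.sum() else 0.0
        return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

    # Toy example: the prediction misses part of the true text region
    truth = np.zeros((100, 100), dtype=bool); truth[40:60, 20:80] = True
    pred = np.zeros((100, 100), dtype=bool); pred[42:60, 20:80] = True
    print(round(binarization_f1(pred, truth), 3))  # 0.947
    ```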

  12. Standardization Documents

    DTIC Science & Technology

    2011-08-01

    Briefing slides on standardization documents: federal and commercial specifications and standards; guide specifications; CIDs; NGSs (national or international standardization documents developed by a private sector association, organization, or technical society); and defense handbooks, which maintain lessons learned and provide guidance for the application of a technology or lists of options.

  13. Model Documentation of Base Case Data | Regional Energy Deployment System

    Science.gov Websites

    Documentation of the base case data for the Regional Energy Deployment System model (NREL Energy Analysis). The base case was developed simply as a point of departure for other analyses and derives many of its inputs from the Energy Information Administration's (EIA's) Annual Energy ...

  14. Health search engine with e-document analysis for reliable search results.

    PubMed

    Gaudinat, Arnaud; Ruch, Patrick; Joubert, Michel; Uziel, Philippe; Strauss, Anne; Thonnet, Michèle; Baud, Robert; Spahni, Stéphane; Weber, Patrick; Bonal, Juan; Boyer, Celia; Fieschi, Marius; Geissbuhler, Antoine

    2006-01-01

    After a review of the existing practical solutions available to citizens for retrieving eHealth documents, the paper describes an original specialized search engine, WRAPIN. WRAPIN uses advanced cross-lingual information retrieval technologies to check information quality by synthesizing the medical concepts, conclusions and references contained in the health literature, in order to identify accurate, relevant sources. Thanks to the MeSH terminology [1] (Medical Subject Headings from the U.S. National Library of Medicine) and advanced approaches such as conclusion extraction from structured documents and query reformulation, WRAPIN offers the user privileged access to navigate through multilingual documents without language or medical prerequisites. The results of an evaluation conducted on the WRAPIN prototype show that results of the WRAPIN search engine are perceived as informative by 65% of users (59% for a general-purpose search engine) and as reliable and trustworthy by 72% (41% for the other engine). However, it leaves room for improvement, such as increased database coverage, better explanation of its original functionalities and adaptability to different audiences. Following the evaluation, WRAPIN is now in operation on the HON web site (http://www.healthonnet.org), free of charge. Intended for citizens, it is a good alternative to general-purpose search engines when the user looks for trustworthy health and medical information or wants to automatically check the doubtful content of a Web page.

  15. Earth Observation, Spatial Data Quality, and Neglected Tropical Diseases.

    PubMed

    Hamm, Nicholas A S; Soares Magalhães, Ricardo J; Clements, Archie C A

    2015-12-01

    Earth observation (EO) is the use of remote sensing and in situ observations to gather data on the environment. It finds increasing application in the study of environmentally modulated neglected tropical diseases (NTDs). Obtaining and assuring the quality of the relevant spatially and temporally indexed EO data remain challenges. Our objective was to review the Earth observation products currently used in studies of NTD epidemiology and to discuss fundamental issues relating to spatial data quality (SDQ), which limit the utilization of EO and pose challenges for its more effective use. We searched Web of Science and PubMed for studies related to EO and echinococcosis, leptospirosis, schistosomiasis, and soil-transmitted helminth infections. Relevant literature was also identified from the bibliographies of those papers. We found that extensive use is made of EO products in the study of NTD epidemiology; however, the quality of these products is usually given little explicit attention. We review key issues in SDQ concerning spatial and temporal scale, uncertainty, and the documentation and use of quality information. We give examples of how these issues may interact with uncertainty in NTD data to affect the output of an epidemiological analysis. We conclude that researchers should give careful attention to SDQ when designing NTD spatial-epidemiological studies. This should be used to inform uncertainty analysis in the epidemiological study. SDQ should be documented and made available to other researchers.

  16. Signature detection and matching for document image retrieval.

    PubMed

    Zhu, Guangyu; Zheng, Yefeng; Doermann, David; Jaeger, Stefan

    2009-11-01

    As one of the most pervasive methods of individual identification and document authentication, signatures present convincing evidence and provide an important form of indexing for effective document image processing and retrieval in a broad range of applications. However, detection and segmentation of free-form objects such as signatures from clustered background is currently an open document analysis problem. In this paper, we focus on two fundamental problems in signature-based document image retrieval. First, we propose a novel multiscale approach to jointly detecting and segmenting signatures from document images. Rather than focusing on local features that typically have large variations, our approach captures the structural saliency using a signature production model and computes the dynamic curvature of 2D contour fragments over multiple scales. This detection framework is general and computationally tractable. Second, we treat the problem of signature retrieval in the unconstrained setting of translation, scale, and rotation invariant nonrigid shape matching. We propose two novel measures of shape dissimilarity based on anisotropic scaling and registration residual error and present a supervised learning framework for combining complementary shape information from different dissimilarity metrics using LDA. We quantitatively study state-of-the-art shape representations, shape matching algorithms, measures of dissimilarity, and the use of multiple instances as query in document image retrieval. We further demonstrate our matching techniques in offline signature verification. Extensive experiments using large real-world collections of English and Arabic machine-printed and handwritten documents demonstrate the excellent performance of our approaches.
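
    The retrieval framework above combines two complementary shape dissimilarities (anisotropic scaling and registration residual error) using LDA in a supervised setting. Purely to illustrate that combination pattern, the sketch below treats the two dissimilarities as a two-dimensional feature per signature pair and fits a linear discriminant separating genuine from non-matching pairs; the synthetic numbers are placeholders, not the paper's features or data.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    # Each row: [anisotropic-scaling dissimilarity, registration residual error]
    same_pairs = rng.normal(loc=[0.2, 0.3], scale=0.1, size=(100, 2))
    diff_pairs = rng.normal(loc=[0.7, 0.8], scale=0.15, size=(100, 2))
    X = np.vstack([same_pairs, diff_pairs])
    y = np.array([1] * 100 + [0] * 100)  # 1 = genuine match, 0 = non-match

    lda = LinearDiscriminantAnalysis().fit(X, y)
    # Combined score for a new pair of signatures: probability of a genuine match
    print(lda.predict_proba([[0.25, 0.35]])[0, 1])
    ```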

  17. An Ideal Observer Analysis of Visual Working Memory

    PubMed Central

    Sims, Chris R.; Jacobs, Robert A.; Knill, David C.

    2013-01-01

    Limits in visual working memory (VWM) strongly constrain human performance across many tasks. However, the nature of these limits is not well understood. In this paper we develop an ideal observer analysis of human visual working memory, by deriving the expected behavior of an optimally performing, but limited-capacity memory system. This analysis is framed around rate–distortion theory, a branch of information theory that provides optimal bounds on the accuracy of information transmission subject to a fixed information capacity. The result of the ideal observer analysis is a theoretical framework that provides a task-independent and quantitative definition of visual memory capacity and yields novel predictions regarding human performance. These predictions are subsequently evaluated and confirmed in two empirical studies. Further, the framework is general enough to allow the specification and testing of alternative models of visual memory (for example, how capacity is distributed across multiple items). We demonstrate that a simple model developed on the basis of the ideal observer analysis—one which allows variability in the number of stored memory representations, but does not assume the presence of a fixed item limit—provides an excellent account of the empirical data, and further offers a principled re-interpretation of existing models of visual working memory. PMID:22946744
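
    The analysis above is framed around rate-distortion theory. As a concrete anchor, a standard textbook result for a Gaussian-distributed feature under squared-error loss (not necessarily the specific model fitted in the paper) relates the per-item rate R in bits to the minimum achievable mean squared error D:

    $$ R(D) = \frac{1}{2}\log_2\frac{\sigma^2}{D} \qquad\Longleftrightarrow\qquad D(R) = \sigma^2\,2^{-2R} $$

    Under this reading, if a fixed total capacity of C bits is shared equally across N remembered items, the per-item rate is C/N and the expected per-item error grows as \( \sigma^2\,2^{-2C/N} \), which is the qualitative set-size effect this kind of framework is used to predict.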

  18. The present status and problems in document retrieval system : document input type retrieval system

    NASA Astrophysics Data System (ADS)

    Inagaki, Hirohito

    Office automation (OA) has brought many changes. Many documents are now maintained in electronic filing systems, so efficient document retrieval systems are needed to extract useful information. Current document retrieval systems use simple word matching, syntactic matching, or semantic matching to obtain high retrieval efficiency. In addition, document retrieval systems using special hardware devices, such as ISSP, have been developed with the aim of high-speed retrieval. Since these systems accept only a single sentence or keywords as input, it is difficult to express the searcher's request fully. We demonstrate a document-input-type retrieval system, which can directly accept a document as input and search for similar documents in a document database.

  19. Transnational tobacco company interests in smokeless tobacco in Europe: analysis of internal industry documents and contemporary industry materials.

    PubMed

    Peeters, Silvy; Gilmore, Anna B

    2013-01-01

    European Union (EU) legislation bans the sale of snus, a smokeless tobacco (SLT) which is considerably less harmful than smoking, in all EU countries other than Sweden. To inform the current review of this legislation, this paper aims to explore transnational tobacco company (TTC) interests in SLT and pure nicotine in Europe from the 1970s to the present, comparing them with TTCs' public claims of support for harm reduction. Internal tobacco industry documents (in total 416 documents dating from 1971 to 2009), obtained via searching the online Legacy Tobacco Documents Library, were analysed using a hermeneutic approach. This library comprises documents obtained via litigation in the US and does not include documents from Imperial Tobacco, Japan Tobacco International, or Swedish Match. To help overcome this limitation and provide more recent data, we triangulated our documentary findings with contemporary documentation including TTC investor presentations. The analysis demonstrates that British American Tobacco explored SLT opportunities in Europe from 1971 driven by regulatory threats and health concerns, both likely to impact cigarette sales negatively, and the potential to create a new form of tobacco use among those no longer interested in taking up smoking. Young people were a key target. TTCs did not, however, make SLT investments until 2002, a time when EU cigarette volumes started declining, smoke-free legislation was being introduced, and public health became interested in harm reduction. All TTCs have now invested in snus (and recently in pure nicotine), yet both early and recent snus test markets appear to have failed, and little evidence was found in TTCs' corporate materials that snus is central to their business strategy. There is clear evidence that BAT's early interest in introducing SLT in Europe was based on the potential for creating an alternative form of tobacco use in light of declining cigarette sales and social restrictions on smoking, with

  20. A document centric metadata registration tool constructing earth environmental data infrastructure

    NASA Astrophysics Data System (ADS)

    Ichino, M.; Kinutani, H.; Ono, M.; Shimizu, T.; Yoshikawa, M.; Masuda, K.; Fukuda, K.; Kawamoto, H.

    2009-12-01

    DIAS (Data Integration and Analysis System) is one of the GEOSS activities in Japan. It is also a leading part of the GEOSS task with the same name defined in the GEOSS Ten Year Implementation Plan. The main mission of DIAS is to construct a data infrastructure that can effectively integrate earth environmental data such as observation data, numerical model outputs, and socio-economic data provided from the fields of climate, water cycle, ecosystem, ocean, biodiversity and agriculture. Some of DIAS's data products are available at http://www.jamstec.go.jp/e/medid/dias. Most earth environmental data commonly have spatial and temporal attributes such as the geographic scope covered or the date created. Metadata standards covering these common attributes are published by the geographic information technical committee (TC211) of ISO (the International Organization for Standardization) as the specifications ISO 19115:2003 and 19139:2007. Accordingly, DIAS metadata is developed based on the ISO/TC211 metadata standards. From the viewpoint of data users, metadata is useful not only for data retrieval and analysis but also for interoperability and information sharing among experts, beginners and nonprofessionals. From the viewpoint of data providers, however, two problems were pointed out after discussions. One is that data providers prefer to minimize the additional tasks and time spent creating metadata. The other is that data providers want to manage and publish documents that explain their data sets more comprehensively. To solve these problems, we have been developing a document-centric metadata registration tool. The features of our tool are that the generated documents are available instantly and that there is no extra cost for data providers to generate metadata. The tool is also developed as a Web application, so it does not require any software on the data providers' side beyond a web browser. The interface of the tool

  1. Evaluation of Nursing Documentation Completion of Stroke Patients in the Emergency Department: A Pre-Post Analysis Using Flowsheet Templates and Clinical Decision Support.

    PubMed

    Richardson, Karen J; Sengstack, Patricia; Doucette, Jeffrey N; Hammond, William E; Schertz, Matthew; Thompson, Julie; Johnson, Constance

    2016-02-01

    The primary aim of this performance improvement project was to determine whether the electronic health record implementation of stroke-specific nursing documentation flowsheet templates and clinical decision support alerts improved the nursing documentation of eligible stroke patients in seven stroke-certified emergency departments. Two system enhancements were introduced into the electronic record in an effort to improve nursing documentation: disease-specific documentation flowsheets and clinical decision support alerts. Using a pre-post design, project measures included six stroke management goals as defined by the National Institute of Neurological Disorders and Stroke and three clinical decision support measures based on entry of orders used to trigger documentation reminders for nursing: (1) the National Institutes of Health's Stroke Scale, (2) neurological checks, and (3) dysphagia screening. Data were reviewed 6 months prior (n = 2293) and 6 months following the intervention (n = 2588). Fisher exact test was used for statistical analysis. Statistical significance was found for documentation of five of the six stroke management goals, although effect sizes were small. Customizing flowsheets to meet the needs of nursing workflow showed improvement in the completion of documentation. The effects of the decision support alerts on the completeness of nursing documentation were not statistically significant (likely due to lack of order entry). For example, an order for the National Institutes of Health Stroke Scale was entered only 10.7% of the time, which meant no alert would fire for nursing in the postintervention group. Future work should focus on decision support alerts that trigger reminders for clinicians to place relevant orders for this population.
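
    The pre-post comparison above relies on Fisher's exact test applied to documentation completion counts. A minimal sketch of that test with SciPy is shown below; the cell counts are invented for illustration (only the group sizes 2293 and 2588 come from the abstract).

    ```python
    from scipy.stats import fisher_exact

    # Rows: pre-intervention, post-intervention; columns: documented, not documented
    table = [[1850, 443],   # illustrative pre-intervention counts (n = 2293)
             [2390, 198]]   # illustrative post-intervention counts (n = 2588)

    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
    ```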

  2. Information Types in Nonmimetic Documents: A Review of Biddle's Wipe-Clean Slate (Understanding Documents).

    ERIC Educational Resources Information Center

    Mosenthal, Peter B.; Kirsch, Irwin S.

    1991-01-01

    Describes how the 16 permanent lists used by a first grade reading teacher (and mother of 6) to manage the household represent the whole range of documents covered in the 3 major types of documents: matrix documents, graphic documents, and locative documents. Suggests class activities to clarify students' understanding of the information in…

  3. Discovering Semantic Patterns in Bibliographically Coupled Documents.

    ERIC Educational Resources Information Center

    Qin, Jian

    1999-01-01

    An example of semantic pattern analysis, based on keywords selected from documents grouped by bibliographical coupling, is used to demonstrate the methodological aspects of knowledge discovery in bibliographic databases. Frequency distribution patterns suggest the existence of a common intellectual base with a wide range of specialties and…

  4. Semantic Document Library: A Virtual Research Environment for Documents, Data and Workflows Sharing

    NASA Astrophysics Data System (ADS)

    Kotwani, K.; Liu, Y.; Myers, J.; Futrelle, J.

    2008-12-01

    The Semantic Document Library (SDL) was driven by use cases from the environmental observatory communities and is designed to provide conventional document repository features of uploading, downloading, editing and versioning of documents as well as value-adding features of tagging, querying, sharing, annotating, ranking, provenance, social networking and geo-spatial mapping services. It allows users to organize a catalogue of watershed observation data, model output, workflows, as well as publications and documents related to the same watershed study through the tagging capability. Users can tag all relevant materials using the same watershed name and find all of them easily later using this tag. The underpinning semantic content repository can store materials from other cyberenvironments such as workflow or simulation tools, and SDL provides an effective interface to query and organize materials from various sources. Advanced features of the SDL allow users to visualize the provenance of the materials, such as the source and how the output data is derived. Other novel features include visualizing all geo-referenced materials on a geospatial map. SDL, as a component of a cyberenvironment portal (the NCSA Cybercollaboratory), has the goal of efficient management of information and relationships between published artifacts (validated models, vetted data, workflows, annotations, best practices, reviews and papers) produced from raw research artifacts (data, notes, plans etc.) through agents (people, sensors etc.). The tremendous scientific potential of artifacts is achieved through mechanisms of sharing, reuse and collaboration - empowering scientists to spread their knowledge and protocols and to benefit from the knowledge of others. SDL successfully implements web 2.0 technologies and design patterns along with a semantic content management approach that enables the use of multiple ontologies and the dynamic evolution (e.g. folksonomies) of terminology. Scientific documents involved with

  5. Using WorldWide Telescope in Observing, Research and Presentation

    NASA Astrophysics Data System (ADS)

    Roberts, Douglas A.; Fay, J.

    2014-01-01

    WorldWide Telescope (WWT) is free software that enables researchers to interactively explore observational data using a user-friendly interface. Reference, all-sky datasets and pointed observations are available as layers along with the ability to easily overlay additional FITS images and catalog data. Connections to the Astrophysics Data System (ADS) are included which enable visual investigation using WWT to drive document searches in ADS. WWT can be used to capture and share visual exploration with colleagues during observational planning and analysis. Finally, researchers can use WorldWide Telescope to create videos for professional, education and outreach presentations. I will conclude with an example of how I have used WWT in a research project. Specifically, I will discuss how WorldWide Telescope helped our group to prepare for radio observations and following them, in the analysis of multi-wavelength data taken in the inner parsec of the Galaxy. A concluding video will show how WWT brought together disparate datasets in a unified interactive visualization environment.

  6. Serological documentation of maternal influenza exposure and bipolar disorder in adult offspring.

    PubMed

    Canetta, Sarah E; Bao, Yuanyuan; Co, Mary Dawn T; Ennis, Francis A; Cruz, John; Terajima, Masanori; Shen, Ling; Kellendonk, Christoph; Schaefer, Catherine A; Brown, Alan S

    2014-05-01

    The authors examined whether serologically confirmed maternal exposure to influenza was associated with an increased risk of bipolar disorder in the offspring and with subtypes of bipolar disorder, with and without psychotic features. The study used a nested case-control design in the Child Health and Development Study birth cohort. In all, 85 individuals with bipolar disorder were identified following extensive ascertainment and diagnostic assessment and matched to 170 comparison subjects in the analysis. Serological documentation of maternal exposure to influenza was determined using the hemagglutination inhibition assay. No association was observed between serologically documented maternal exposure to influenza and bipolar disorder in offspring. However, maternal serological influenza exposure was related to a significant fivefold greater risk of bipolar disorder with psychotic features. The results suggest that maternal influenza exposure may increase the risk for offspring to develop bipolar disorder with psychotic features. Taken together with earlier associations between prenatal influenza exposure and schizophrenia, these results may suggest that prenatal influenza is a risk factor for psychosis rather than for a specific psychotic disorder diagnosis.

  7. Manned space flight nuclear system safety. Volume 3: Reactor system preliminary nuclear safety analysis. Part 3: Nuclear Safety Analysis Document (NSAD)

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Nuclear safety analysis as applied to a space base mission is presented. The nuclear safety analysis document summarizes the mission and the credible accidents/events which may lead to nuclear hazards to the general public. The radiological effects and associated consequences of the hazards are discussed in detail. The probability of occurrence is combined with the potential number of individuals exposed to or above guideline values to provide a measure of accident and total mission risk. The overall mission risk has been determined to be low with the potential exposure to or above 25 rem limited to less than 4 individuals per every 1000 missions performed. No radiological risk to the general public occurs during the prelaunch phase at KSC. The most significant risks occur from prolonged exposure to reactor debris following land impact generally associated with the disposal phase of the mission where fission product inventories can be high.

  8. Concept document of the repository-based software engineering program: A constructive appraisal

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A constructive appraisal of the Concept Document of the Repository-Based Software Engineering Program is provided. The Concept Document is designed to provide an overview of the Repository-Based Software Engineering (RBSE) Program. The Document should be brief and provide the context for reading subsequent requirements and product specifications. That is, all requirements to be developed should be traceable to the Concept Document. Applied Expertise's analysis of the Document was directed toward assuring that: (1) the Executive Summary provides a clear, concise, and comprehensive overview of the Concept (rewrite as necessary); (2) the sections of the Document make best use of the NASA 'Data Item Description' for concept documents; (3) the information contained in the Document provides a foundation for subsequent requirements; and (4) the document adequately: identifies the problem being addressed; articulates RBSE's specific role; specifies the unique aspects of the program; and identifies the nature and extent of the program's users.

  9. Contingency Management Requirements Document: Preliminary Version. Revision F

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This is the High Altitude, Long Endurance (HALE) Remotely Operated Aircraft (ROA) Contingency Management (CM) Functional Requirements document. This document applies to HALE ROA operating within the National Airspace System (NAS) limited at this time to enroute operations above 43,000 feet (defined as Step 1 of the Access 5 project, sponsored by the National Aeronautics and Space Administration). A contingency is an unforeseen event requiring a response. The unforeseen event may be an emergency, an incident, a deviation, or an observation. Contingency Management (CM) is the process of evaluating the event, deciding on the proper course of action (a plan), and successfully executing the plan.

  10. Sequential Ideal-Observer Analysis of Visual Discriminations.

    ERIC Educational Resources Information Center

    Geisler, Wilson S.

    1989-01-01

    A new analysis, based on the concept of the ideal observer in signal detection theory, is described. It allows: tracing of the flow of discrimination information through the initial physiological stages of visual processing for arbitrary spatio-chromatic stimuli, and measurement of the information content of said visual stimuli. (TJH)

  11. Three Years of Unmediated Document Delivery: An Analysis and Consideration of Collection Development Priorities.

    PubMed

    Chan, Emily K; Mune, Christina; Wang, YiPing; Kendall, Susan L

    2016-01-01

    Like most academic libraries, San José State University Library is struggling to meet users' rising expectations for immediate information within the financial confines of a flat budget. To address acquisition of nonsubscribed article content, particularly outside of business hours, San José State University Library implemented Copyright Clearance Center's Get It Now, a document delivery service. Three academic years of analyzed data, which involves more than 10,000 requests, and the subsequent collection development actions taken by the library will be discussed. The value and challenges of patron-driven, unmediated document delivery services in conjunction with traditional document delivery services will be considered.

  12. Computerized clinical documentation system in the pediatric intensive care unit

    PubMed Central

    2001-01-01

    Background To determine whether a computerized clinical documentation system (CDS): 1) decreased time spent charting and increased time spent in patient care; 2) decreased medication errors; 3) improved clinical decision making; 4) improved quality of documentation; and/or 5) improved shift to shift nursing continuity. Methods Before and after implementation of CDS, a time study involving nursing care, medication delivery, and normalization of serum calcium and potassium values was performed. In addition, an evaluation of completeness of documentation and a clinician survey of shift to shift reporting were also completed. This was a modified one group, pretest-posttest design. Results With the CDS there were improved legibility and completeness of documentation and better data accessibility and accuracy, with no change in time spent in direct patient care or charting by nursing staff. Incidental observations from the study included improved management functions of our nurse manager; improved JCAHO documentation compliance; timely access to clinical data (labs, vitals, etc); a decrease in time and resource use for audits; improved reimbursement because of the ability to reconstruct lost charts; limited human data entry by automatic data logging; and eliminated costs of printing forms. CDS cost was reasonable. Conclusions When compared to a paper chart, the CDS provided a more legible, complete, and accessible patient record without affecting time spent in direct patient care. The availability of the CDS improved shift to shift reporting. Other observations showed that the CDS improved management capabilities; helped physicians deliver care; improved reimbursement; limited data entry errors; and reduced costs. PMID:11604105

  13. Health check documentation of psychosocial factors using the WAI.

    PubMed

    Uronen, L; Heimonen, J; Puukka, P; Martimo, K-P; Hartiala, J; Salanterä, S

    2017-03-01

    Health checks in occupational health (OH) care should prevent deterioration of work ability and promote well-being at work. Documentation of health checks should reflect and support continuity of prevention and practice. To analyse how OH nurses (OHNs) undertaking health checks document psychosocial factors at work and use the Work Ability Index (WAI). Analysis of two consecutive OHN health check records and WAI scores with statistical analyses and annotations of 13 psychosocial factors based on a publicly available standard on psychosocial risk management: British Standards Institution specification PAS 1010, part of European Council Directive 89/391/EEC, with a special focus on work-related stress and workplace violence. We analysed health check records for 196 employees. The most frequently documented psychosocial risk factors were home-work interface, work environment and equipment, job content, workload and work pace and work schedule. The correlations between the number of documented risk and non-risk factors and WAI scores were significant: OHNs documented more risk factors in employees with lower WAI scores. However, documented psychosocial risk factors were not followed up, and the OHNs' most common response to detected psychosocial risks was an appointment with a physician. The number of psychosocial risk factors documented by OHNs correlated with subjects' WAI scores. However, the documentation was not systematic and the interventions were not always relevant. OHNs need a structure to document psychosocial factors and more guidance in how to use the documentation as a tool in their decision making in health checks. © The Author 2016. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com
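
    The key quantitative result above is a correlation between the number of documented psychosocial risk factors and WAI scores. The abstract does not state which correlation statistic was used; the sketch below uses Spearman's rank correlation as one plausible choice, on invented data, purely to illustrate the analysis.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Invented data: documented risk factor count (0-13) and WAI score per employee
    rng = np.random.default_rng(1)
    wai_scores = rng.integers(20, 49, size=50)
    risk_counts = np.clip(np.round((49 - wai_scores) / 5 + rng.normal(0, 1, 50)), 0, 13)

    rho, p = spearmanr(risk_counts, wai_scores)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")  # expect a negative correlation
    ```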

  14. New public dataset for spotting patterns in medieval document images

    NASA Astrophysics Data System (ADS)

    En, Sovann; Nicolas, Stéphane; Petitjean, Caroline; Jurie, Frédéric; Heutte, Laurent

    2017-01-01

    With advances in technology, a large part of our cultural heritage is becoming digitally available. In particular, in the field of historical document image analysis, there is now a growing need for indexing and data mining tools, thus allowing us to spot and retrieve the occurrences of an object of interest, called a pattern, in a large database of document images. Patterns may present some variability in terms of color, shape, or context, making the spotting of patterns a challenging task. Pattern spotting is a relatively new field of research, still hampered by the lack of available annotated resources. We present a new publicly available dataset named DocExplore dedicated to spotting patterns in historical document images. The dataset contains 1500 images and 1464 queries, and allows the evaluation of two tasks: image retrieval and pattern localization. A standardized benchmark protocol along with ad hoc metrics is provided for a fair comparison of the submitted approaches. We also provide some first results obtained with our baseline system on this new dataset, which show that there is room for improvement and that should encourage researchers of the document image analysis community to design new systems and submit improved results.

  15. Documenting Living Monuments in Indonesia: Methodology for Sustainable Utility

    NASA Astrophysics Data System (ADS)

    Suryaningsih, F.; Purwestri, N.

    2013-07-01

    The systematic documentation of cultural heritage in Indonesia developed after the establishment of the Bataviaasch Genootschap van Kunsten en Wetenschappen (1778) and De Oudheidkundige Dienst (1913) by the Netherlands Indies government. After Indonesian independence, the tasks of cultural heritage documentation were taken over by the Ministry of Culture (now the Ministry of Education and Culture), with a focus on ancient and classical heritage, the so-called dead monuments. The need for comprehensive documentation data on cultural heritage has become a significant issue as government and the private sector pay attention to the preservation of heritage buildings in urban sites, the so-called living monuments. The archived original drawing plans often do not match the existing condition, while a conservation plan demands documents such as as-built drawing plans to work from. The technology, methodology and systems for providing such comprehensive documents of heritage buildings and sites therefore become important for producing good conservation plans and supporting the regular maintenance of heritage buildings; the resulting products thus have sustainable and varied utility values. The Documentation Centre for Architecture - Indonesia (PDA) was established in 1994 to meet the need for comprehensive data on heritage buildings (living monuments), to be used as basic documents for conservation planning. It provides not only digital drawings such as site plans, plans, elevations, sections and details of architectural elements, but also documentation of historical research and material analysis, completed with a diagnosis and mapping of building damage. This manuscript describes PDA's field experience working on this subject.

  16. Document image cleanup and binarization

    NASA Astrophysics Data System (ADS)

    Wu, Victor; Manmatha, Raghaven

    1998-04-01

    Image binarization is a difficult task for documents with text over textured or shaded backgrounds, poor contrast, and/or considerable noise. Current optical character recognition (OCR) and document analysis technologies do not handle such documents well. We have developed a simple yet effective algorithm for document image clean-up and binarization. The algorithm consists of two basic steps. In the first step, the input image is smoothed using a low-pass filter. The smoothing operation enhances the text relative to any background texture. This is because background texture normally has higher frequency than text does. The smoothing operation also removes speckle noise. In the second step, the intensity histogram of the smoothed image is computed and a threshold is automatically selected as follows. For black text, the first peak of the histogram corresponds to text. Thresholding the image at the value of the valley between the first and second peaks of the histogram binarizes the image well. In order to reliably identify the valley, the histogram is smoothed by a low-pass filter before the threshold is computed. The algorithm has been applied to some 50 images from a wide variety of sources: digitized video frames, photos, newspapers, advertisements in magazines or sales flyers, personal checks, etc. There are 21820 characters and 4406 words in these images. 91 percent of the characters and 86 percent of the words are successfully cleaned up and binarized. A commercial OCR was applied to the binarized text when it consisted of fonts which were OCR recognizable. The recognition rate was 84 percent for the characters and 77 percent for the words.
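
    The two steps described above (low-pass smoothing of the page, then an automatic threshold at the valley following the first peak of the smoothed intensity histogram) can be sketched directly. The code below follows that description under stated assumptions; the sigma values, bin count and valley search window are illustrative choices, not the paper's parameters.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter, gaussian_filter1d

    def cleanup_and_binarize(gray: np.ndarray) -> np.ndarray:
        """Smooth the page, then threshold at the valley after the first histogram peak."""
        smoothed = gaussian_filter(gray.astype(float), sigma=1.5)   # step 1: low-pass filter

        hist, edges = np.histogram(smoothed, bins=256, range=(0, 255))
        hist = gaussian_filter1d(hist.astype(float), sigma=3)       # smooth the histogram

        # First peak (dark text on black-on-white pages), then the valley that follows it
        first_peak = next(i for i in range(1, 255)
                          if hist[i] >= hist[i - 1] and hist[i] > hist[i + 1])
        valley = first_peak + int(np.argmin(hist[first_peak:first_peak + 100]))
        threshold = edges[valley]

        return (smoothed > threshold).astype(np.uint8) * 255        # white background, black text

    # Toy example: a dark text band on a noisy light background
    rng = np.random.default_rng(0)
    page = rng.normal(200, 10, (100, 100))
    page[40:60, 20:80] = rng.normal(60, 10, (20, 60))
    print(cleanup_and_binarize(np.clip(page, 0, 255)).dtype)  # uint8
    ```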

  17. Documents Similarity Measurement Using Field Association Terms.

    ERIC Educational Resources Information Center

    Atlam, El-Sayed; Fuketa, M.; Morita, K.; Aoe, Jun-ichi

    2003-01-01

    Discussion of text analysis and information retrieval and measurement of document similarity focuses on a new text manipulation system called FA (field association)-Sim that is useful for retrieving information in large heterogeneous texts and for recognizing content similarity in text excerpts. Discusses recall and precision, automatic indexing…

  18. Patients' Care Needs: Documentation Analysis in General Hospitals.

    PubMed

    Paans, Wolter; Müller-Staub, Maria

    2015-10-01

    The purpose of the study is (a) to describe care needs derived from records of patients in Dutch hospitals, and (b) to evaluate whether nurses employed the NANDA-I classification to formulate patients' care needs. A stratified cross-sectional random-sampling nursing documentation audit was conducted employing the D-Catch instrument in 10 hospitals comprising 37 wards. The most prevalent nursing diagnoses were acute pain, nausea, fatigue, and risk for impaired skin integrity. Most care needs were determined in physiological health patterns and few in psychosocial patterns. To perform effective interventions leading to high-quality nursing-sensitive outcomes, nurses should also diagnose patients' care needs in the health management, value-belief, and coping stress patterns. © 2014 NANDA International, Inc.

  19. Pretest analysis document for Test S-FS-7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, D.G.

    This report documents the pretest calculations completed for Semiscale Test S-FS-7. This test will simulate a transient initiated by a 14.3% break in a steam generator bottom feedwater line downstream of the check valve. The initial conditions represent normal operating conditions for a C-E System 80 nuclear power plant. Predictions of transients resulting from feedwater line breaks in these plants have indicated that significant primary system overpressurization may occur. The results of a RELAP5/MOD2/CY21 code calculation indicate that the test objectives for Test S-FS-7 can be achieved. The primary system overpressurization will occur but pose no threat to personnel or to plant integrity. 3 refs., 15 figs., 5 tabs.

  20. Pretest analysis document for Test S-FS-11

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, D.G.; Shaw, R.A.

    This report documents the pretest calculations completed for Semiscale Test S-FS-11. This test will simulate a transient initiated by a 50% break in a steam generator bottom feedwater line downstream of the check valve. The initial conditions represent normal operating conditions for a C-E System 80 nuclear plant. Predictions of transients resulting from feedwater line breaks in these plants have indicated that significant primary system overpressurization may occur. The results of a RELAP5/MOD2/CY21 code calculation indicate that the test objectives for Test S-FS-11 can be achieved. The primary system overpressurization will occur but pose no threat to personnel or plant integrity. 3 refs., 15 figs., 5 tabs.

  1. IDC Re-Engineering Phase 2 System Specification Document Version 1.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Satpathi, Meara Allena; Burns, John F.; Harris, James M.

    This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Re-Engineering Phase 2 project. This System Specification Document (SSD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data but does include requirements for the dissemination of radionuclide data and products.

  2. Rhetorical Motives of Identity, Consubstantiality, and Hierarchy: An Analysis of Community College Program Documents

    ERIC Educational Resources Information Center

    Parkis Pettit, Angela G.

    2011-01-01

    The dissertation focuses on three academic programs at Tarrant County College, Northeast Campus, specifically the documents used to create and sustain these programs. The purpose of this study includes the following: first, to identify the terminology specific to each program and/or the documents used within the program; second and third to…

  3. COSMIC program documentation experience

    NASA Technical Reports Server (NTRS)

    Kalar, M. C.

    1970-01-01

    A brief history of COSMIC as it relates to the handling of program documentation is summarized; the items that are essential for computer program documentation are also discussed. COSMIC documentation and program standards handbook is appended.

  4. The Stark Reality of the "White Saviour" Complex and the Need for Critical Consciousness: A Document Analysis of the Early Journals of a Freirean Educator

    ERIC Educational Resources Information Center

    Straubhaar, Rolf

    2015-01-01

    While the anglophone academic literature has long engaged in analysis of the role of privilege in the work of educators in the Global North, this article represents an initial foray into such analysis in non-formal educational settings in the Global South. Through a cultural-textual document analysis of 12 months of personal journal entries…

  5. Tank Monitoring and Document control System (TMACS) As Built Software Design Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    GLASSCOCK, J.A.

    This document describes the software design for the Tank Monitor and Control System (TMACS). This document captures the existing as-built design of TMACS as of November 1999. It will be used as a reference document for the system maintainers who will be maintaining and modifying the TMACS functions as necessary. The heart of the TMACS system is the "point-processing" functionality, where a sample value is received from the field sensors and the value is analyzed, logged, or alarmed as required. This Software Design Document focuses on the point-processing functions.
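
    The point-processing flow described above lends itself to a small illustration. The following Python sketch is hypothetical (it does not reproduce TMACS itself); the names PointConfig, process_sample, and the tank tag are invented for the example, which simply checks a sample against configured limits and then logs or alarms it.

      # Hypothetical sketch of a point-processing step: analyze a sample,
      # log it, and raise an alarm if it falls outside configured limits.
      from dataclasses import dataclass

      @dataclass
      class PointConfig:
          name: str
          low_alarm: float
          high_alarm: float

      def process_sample(cfg: PointConfig, value: float, log, alarm):
          log(f"{cfg.name}: {value}")
          if value < cfg.low_alarm or value > cfg.high_alarm:
              alarm(f"{cfg.name} out of limits: {value}")

      if __name__ == "__main__":
          tank_level = PointConfig("TK-101 level", low_alarm=10.0, high_alarm=90.0)
          process_sample(tank_level, 95.3, log=print, alarm=print)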

  6. Pre-hospital policies for the care of patients with acute coronary syndromes in India: A policy document analysis.

    PubMed

    Patel, Amisha; Prabhakaran, Dorairaj; Berendsen, Mark; Mohanan, P P; Huffman, Mark D

    2017-04-01

    Ischemic heart disease is the leading cause of death in India. In high-income countries, pre-hospital systems of care have been developed to manage acute manifestations of ischemic heart disease, such as acute coronary syndrome (ACS). However, it is unknown whether guidelines, policies, regulations, or laws exist to guide pre-hospital ACS care in India. We undertook a nation-wide document analysis to address this gap in knowledge. From November 2014 to May 2016, we searched for publicly available emergency care guidelines and legislation addressing pre-hospital ACS care in all 29 Indian states and 7 Union Territories via Internet search and direct correspondence. We found two documents addressing pre-hospital ACS care. Though India has legislation mandating acute care for emergencies such as trauma, regulations or laws to guide pre-hospital ACS care are largely absent. Policy makers urgently need to develop comprehensive, multi-stakeholder policies for pre-hospital emergency cardiovascular care in India. Copyright © 2016. Published by Elsevier B.V.

  7. Document Level Assessment of Document Retrieval Systems in a Pairwise System Evaluation

    ERIC Educational Resources Information Center

    Rajagopal, Prabha; Ravana, Sri Devi

    2017-01-01

    Introduction: The use of averaged topic-level scores can result in the loss of valuable data and can cause misinterpretation of the effectiveness of system performance. This study aims to use the scores of each document to evaluate document retrieval systems in a pairwise system evaluation. Method: The chosen evaluation metrics are document-level…

  8. Focused Observations: How to Observe Young Children for Assessment and Curriculum Planning, Second Edition

    ERIC Educational Resources Information Center

    Gronlund, Gaye; James, Marlyn

    2013-01-01

    Intentional teaching begins with focused observations and systematic documentation of children's learning and development. "Focused Observations, Second Edition," explains why observation is one of the best methods to get to know each child well, track progress, and plan individualized curriculum. It also provides tools and techniques to…

  9. Web Video Event Recognition by Semantic Analysis From Ubiquitous Documents.

    PubMed

    Yu, Litao; Yang, Yang; Huang, Zi; Wang, Peng; Song, Jingkuan; Shen, Heng Tao

    2016-12-01

    In recent years, the task of event recognition from videos has attracted increasing interest in the multimedia area. While most of the existing research has focused on exploring visual cues to handle relatively small-granular events, it is difficult to directly analyze video content without any prior knowledge. Therefore, synthesizing both visual and semantic analysis is a natural way to approach video event understanding. In this paper, we study the problem of Web video event recognition, where Web videos often describe large-granular events and carry limited textual information. Key challenges include how to accurately represent event semantics from incomplete textual information and how to effectively explore the correlation between visual and textual cues for video event understanding. We propose a novel framework to perform complex event recognition from Web videos. In order to compensate for the insufficient expressive power of visual cues, we construct an event knowledge base by deeply mining semantic information from ubiquitous Web documents. This event knowledge base is capable of describing each event with comprehensive semantics. By utilizing this base, the textual cues for a video can be significantly enriched. Furthermore, we introduce a two-view adaptive regression model, which explores the intrinsic correlation between the visual and textual cues of the videos to learn reliable classifiers. Extensive experiments on two real-world video data sets show the effectiveness of our proposed framework and prove that the event knowledge base indeed helps improve the performance of Web video event recognition.

  10. Performance evaluation methodology for historical document image binarization.

    PubMed

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme.
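
    The modified recall and precision measures mentioned above build on the standard pixel-based definitions. As a minimal sketch (the paper's bias-diminishing weighting scheme is omitted, and the function and variable names are assumptions for illustration), unweighted pixel-based recall and precision for a binarization result and its ground truth can be computed as follows.

      # Unweighted pixel-based recall/precision for a binarization result;
      # True marks foreground (text) pixels in both boolean arrays.
      import numpy as np

      def pixel_recall_precision(result: np.ndarray, ground_truth: np.ndarray):
          tp = np.logical_and(result, ground_truth).sum()
          fp = np.logical_and(result, ~ground_truth).sum()
          fn = np.logical_and(~result, ground_truth).sum()
          recall = tp / (tp + fn) if (tp + fn) else 0.0
          precision = tp / (tp + fp) if (tp + fp) else 0.0
          return recall, precision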

  11. Conjunctive Cohesion in English Language EU Documents--A Corpus-Based Analysis and Its Implications

    ERIC Educational Resources Information Center

    Trebits, Anna

    2009-01-01

    This paper reports the findings of a study which forms part of a larger-scale research project investigating the use of English in the documents of the European Union (EU). The documents of the EU show various features of texts written for legal, business and other specific purposes. Moreover, the translation services of the EU institutions often…

  12. Skyline: an open source document editor for creating and analyzing targeted proteomics experiments

    PubMed Central

    MacLean, Brendan; Tomazela, Daniela M.; Shulman, Nicholas; Chambers, Matthew; Finney, Gregory L.; Frewen, Barbara; Kern, Randall; Tabb, David L.; Liebler, Daniel C.; MacCoss, Michael J.

    2010-01-01

    Summary: Skyline is a Windows client application for targeted proteomics method creation and quantitative data analysis. It is open source and freely available for academic and commercial use. The Skyline user interface simplifies the development of mass spectrometer methods and the analysis of data from targeted proteomics experiments performed using selected reaction monitoring (SRM). Skyline supports using and creating MS/MS spectral libraries from a wide variety of sources to choose SRM filters and verify results based on previously observed ion trap data. Skyline exports transition lists to and imports the native output files from Agilent, Applied Biosystems, Thermo Fisher Scientific and Waters triple quadrupole instruments, seamlessly connecting mass spectrometer output back to the experimental design document. The fast and compact Skyline file format is easily shared, even for experiments requiring many sample injections. A rich array of graphs displays results and provides powerful tools for inspecting data integrity as data are acquired, helping instrument operators to identify problems early. The Skyline dynamic report designer exports tabular data from the Skyline document model for in-depth analysis with common statistical tools. Availability: Single-click, self-updating web installation is available at http://proteome.gs.washington.edu/software/skyline. This web site also provides access to instructional videos, a support board, an issues list and a link to the source code project. Contact: brendanx@u.washington.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20147306

  13. Narrative Analysis: Exploring Experiences of Observational Drawing and Dyspraxia

    ERIC Educational Resources Information Center

    Penketh, Claire

    2011-01-01

    Narrative analysis offers a powerful and accessible means of understanding the ways in which individuals experience learning across a range of educational sites. Drawing on a recent study that explored "dyspraxic" pupils' experiences of drawing from observation, this paper offers an insight into the potential that narrative analysis has…

  14. Document Monitor

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The Charters of Freedom Monitoring System will periodically assess the physical condition of the U.S. Constitution, Declaration of Independence and Bill of Rights. Although protected in helium-filled glass cases, the documents are subject to damage from light, vibration and humidity. The photometer is a CCD detector used as the electronic film for the system's scanning camera, which mechanically scans the document line by line and acquires a series of images, each representing a one-square-inch portion of the document. Perkin-Elmer Corporation's photometer is capable of detecting changes in contrast, shape or other indicators of degradation with 5 to 10 times the sensitivity of the human eye. A Vicom image processing computer receives the data from the photometer, stores it and manipulates it, allowing comparison of electronic images over time to detect changes.

  15. Document creation, linking, and maintenance system

    DOEpatents

    Claghorn, Ronald [Pasco, WA

    2011-02-15

    A document creation and citation system designed to maintain a database of reference documents. The content of a selected document may be automatically scanned and indexed by the system. The selected documents may also be manually indexed by a user prior to the upload. The indexed documents may be uploaded and stored within a database for later use. The system allows a user to generate new documents by selecting content within the reference documents stored within the database and inserting the selected content into a new document. The system allows the user to customize and augment the content of the new document. The system also generates citations to the selected content retrieved from the reference documents. The citations may be inserted into the new document in the appropriate location and format, as directed by the user. The new document may be uploaded into the database and included with the other reference documents. The system also maintains the database of reference documents so that when changes are made to a reference document, the author of a document referencing the changed document will be alerted to make appropriate changes to his document. The system also allows visual comparison of documents so that the user may see differences in the text of the documents.

  16. Medical Treatment Facility Workload Documentation Guide.

    DTIC Science & Technology

    1980-04-15

    to the system implementation should be presented. This document is intended as a guidebook for determining the site specific characteristics of an...wide volume analysis of a communications system. Prior to collecting any data, objectives and initial operating characteristics of the system(s...unique characteristics involved. An on-site inspection of all spaces to be impacted by NIS implementation is required initially. During this inspection

  17. NASA Software Documentation Standard

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  18. Listening for Competence through Documentation: Assessing Children with Language Delays Using Digital Video

    ERIC Educational Resources Information Center

    Suarez, Stephanie Cox; Daniels, Karen J.

    2009-01-01

    This case study uses documentation as a tool for formative assessment to interpret the learning of twin boys with significantly delayed language skills. Reggio-inspired documentation (the act of collecting, interpreting, and reflecting on traces of learning from video, images, and observation notes) focused on the unfolding of the boys' nonverbal…

  19. Application of 3D documentation and geometric reconstruction methods in traffic accident analysis: with high resolution surface scanning, radiological MSCT/MRI scanning and real data based animation.

    PubMed

    Buck, Ursula; Naether, Silvio; Braun, Marcel; Bolliger, Stephan; Friederich, Hans; Jackowski, Christian; Aghayev, Emin; Christe, Andreas; Vock, Peter; Dirnhofer, Richard; Thali, Michael J

    2007-07-20

    The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of the victims of traffic accidents, for example in collisions between motor vehicles and pedestrians or cyclists, is the situation of the impact. Apart from forensic medical examinations (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides the post-mortem multi-slice computed tomography (MSCT) and magnetic resonance imaging (MRI) for the documentation and analysis of internal findings, highly precise 3D surface scanning is employed for the documentation of the external body findings and of injury-inflicting instruments. The correlation of injuries of the body to the injury-inflicting object and the accident mechanism are of great importance. The applied methods include documentation of the external and internal body and the involved vehicles and inflicting tools as well as the analysis of the acquired data. The body surface and the accident vehicles with their damages were digitized by 3D surface scanning. For the internal findings of the body, post-mortem MSCT and MRI were used. The analysis included the processing of the obtained data to 3D models, determination of the driving direction of the vehicle, correlation of injuries to the vehicle damages, geometric determination of the impact situation and evaluation of further findings of the accident. In the following article, the benefits of the 3D documentation and computer-assisted, drawn-to-scale 3D comparisons of the relevant injuries with the damages to the vehicle in the analysis of the course of accidents, especially with regard to the impact situation, are shown on two examined cases.

  20. Document Concurrence System

    NASA Technical Reports Server (NTRS)

    Muhsin, Mansour; Walters, Ian

    2004-01-01

    The Document Concurrence System is a combination of software modules for routing users' expressions of concurrence with documents. This system enables determination of the current status of concurrences and eliminates the need for the prior practice of manually delivering paper documents to all persons whose approvals were required. This system runs on a server, and participants gain access via personal computers equipped with Web-browser and electronic-mail software. A user can begin a concurrence routing process by logging onto an administration module, naming the approvers and stating the sequence for routing among them, and attaching documents. The server then sends a message to the first person on the list. Upon concurrence by the first person, the system sends a message to the second person, and so forth. A person on the list indicates approval, places the documents on hold, or indicates disapproval, via a Web-based module. When the last person on the list has concurred, a message is sent to the initiator, who can then finalize the process through the administration module. A background process running on the server identifies concurrence processes that are overdue and sends reminders to the appropriate persons.
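
    The sequential routing behaviour described above can be sketched in a few lines. The Python below is a hypothetical illustration of the workflow only (the system's actual modules and interfaces are not documented in the abstract): each approver in turn concurs, holds, or disapproves, and the initiator is notified once the list is exhausted.

      # Hypothetical sketch of sequential concurrence routing.
      def route_concurrence(approvers, get_decision, notify):
          for person in approvers:
              notify(person, "Documents awaiting your concurrence")
              decision = get_decision(person)   # "concur", "hold", or "disapprove"
              if decision != "concur":
                  notify("initiator", f"{person} responded: {decision}")
                  return decision
          notify("initiator", "All approvers have concurred")
          return "complete"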

  1. Obs4MIPS: Satellite Observations for Model Evaluation

    NASA Astrophysics Data System (ADS)

    Ferraro, R.; Waliser, D. E.; Gleckler, P. J.

    2017-12-01

    This poster will review the current status of the obs4MIPs project, whose purpose is to provide a limited collection of well-established and documented datasets for comparison with Earth system models (https://www.earthsystemcog.org/projects/obs4mips/). These datasets have been reformatted to correspond with the CMIP5 model output requirements, and include technical documentation specifically targeted for their use in model output evaluation. The project holdings now exceed 120 datasets with observations that directly correspond to CMIP5 model output variables, with new additions in response to the CMIP6 experiments. With the growth in climate model output data volume, it is increasingly difficult to bring the model output and the observations together to do evaluations. The positioning of the obs4MIPs datasets within the Earth System Grid Federation (ESGF) allows for the use of currently available and planned online tools within the ESGF to perform analysis using model output and observational datasets without necessarily downloading everything to a local workstation. This past year, obs4MIPs has updated its submission guidelines to closely align with changes in the CMIP6 experiments, and is implementing additional indicators and ancillary data to allow users to more easily determine the efficacy of an obs4MIPs dataset for specific evaluation purposes. This poster will present the new guidelines and indicators, and update the list of current obs4MIPs holdings and their connection to the ESGF evaluation and analysis tools currently available and being developed for the CMIP6 experiments.

  2. Guidelines for Documentation of Computer Programs and Automated Data Systems. (Category: Software; Subcategory: Documentation).

    ERIC Educational Resources Information Center

    Federal Information Processing Standards Publication, 1976

    1976-01-01

    These guidelines provide a basis for determining the content and extent of documentation for computer programs and automated data systems. Content descriptions of ten document types plus examples of how management can determine when to use the various types are included. The documents described are (1) functional requirements documents, (2) data…

  3. Technology reference document

    DOT National Transportation Integrated Search

    1993-09-01

    To provide an all encompassing technology document was beyond the scope of this particular effort. Consequently, the new technology introduced with this updated document focused on selected topics which were considered by the authors to be of prime i...

  4. Transnational Tobacco Company Interests in Smokeless Tobacco in Europe: Analysis of Internal Industry Documents and Contemporary Industry Materials

    PubMed Central

    Peeters, Silvy; Gilmore, Anna B.

    2013-01-01

    Background: European Union (EU) legislation bans the sale of snus, a smokeless tobacco (SLT) which is considerably less harmful than smoking, in all EU countries other than Sweden. To inform the current review of this legislation, this paper aims to explore transnational tobacco company (TTC) interests in SLT and pure nicotine in Europe from the 1970s to the present, comparing them with TTCs' public claims of support for harm reduction. Methods and Results: Internal tobacco industry documents (in total 416 documents dating from 1971 to 2009), obtained via searching the online Legacy Tobacco Documents Library, were analysed using a hermeneutic approach. This library comprises documents obtained via litigation in the US and does not include documents from Imperial Tobacco, Japan Tobacco International, or Swedish Match. To help overcome this limitation and provide more recent data, we triangulated our documentary findings with contemporary documentation including TTC investor presentations. The analysis demonstrates that British American Tobacco explored SLT opportunities in Europe from 1971 driven by regulatory threats and health concerns, both likely to impact cigarette sales negatively, and the potential to create a new form of tobacco use among those no longer interested in taking up smoking. Young people were a key target. TTCs did not, however, make SLT investments until 2002, a time when EU cigarette volumes started declining, smoke-free legislation was being introduced, and public health became interested in harm reduction. All TTCs have now invested in snus (and recently in pure nicotine), yet both early and recent snus test markets appear to have failed, and little evidence was found in TTCs' corporate materials that snus is central to their business strategy. Conclusions: There is clear evidence that BAT's early interest in introducing SLT in Europe was based on the potential for creating an alternative form of tobacco use in light of declining cigarette sales

  5. The Document Management Alliance.

    ERIC Educational Resources Information Center

    Fay, Chuck

    1998-01-01

    Describes the Document Management Alliance, a standards effort for document management systems that manages and tracks changes to electronic documents created and used by collaborative teams, provides secure access, and facilitates online information retrieval via the Internet and World Wide Web. Future directions are also discussed. (LRW)

  6. Software Tools for In-Situ Documentation of Built Heritage

    NASA Astrophysics Data System (ADS)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to bring a critical part of the documentation process back to the office. The software tools presented facilitate a direct treatment of the data on the site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  7. Tobacco documents research methodology

    PubMed Central

    McCandless, Phyra M; Klausner, Kim; Taketa, Rachel; Yerger, Valerie B

    2011-01-01

    Tobacco documents research has developed into a thriving academic enterprise since its inception in 1995. The technology supporting tobacco documents archiving, searching and retrieval has improved greatly since that time, and consequently tobacco documents researchers have considerably more access to resources than was the case when researchers had to travel to physical archives and/or electronically search poorly and incompletely indexed documents. The authors of the papers presented in this supplement all followed the same basic research methodology. Rather than leave the reader of the supplement to read the same discussion of methods in each individual paper, presented here is an overview of the methods all authors followed. In the individual articles that follow in this supplement, the authors present the additional methodological information specific to their topics. This brief discussion also highlights technological capabilities in the Legacy Tobacco Documents Library and updates methods for organising internal tobacco documents data and findings. PMID:21504933

  8. RELEASE OF DRIED RADIOACTIVE WASTE MATERIALS TECHNICAL BASIS DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KOZLOWSKI, S.D.

    2007-05-30

    This technical basis document was developed to support RPP-23429, Preliminary Documented Safety Analysis for the Demonstration Bulk Vitrification System (PDSA), and RPP-23479, Preliminary Documented Safety Analysis for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Facility. The main document describes the risk binning process and the technical basis for assigning risk bins to the representative accidents involving the release of dried radioactive waste materials from the Demonstration Bulk Vitrification System (DBVS) and to the associated represented hazardous conditions. Appendices D through F provide the technical basis for assigning risk bins to the representative dried waste release accident and associated represented hazardous conditions for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Packaging Unit (WPU). The risk binning process uses an evaluation of the frequency and consequence of a given representative accident or represented hazardous condition to determine the need for safety structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls. A representative accident or a represented hazardous condition is assigned to a risk bin based on the potential radiological and toxicological consequences to the public and the collocated worker. Note that the risk binning process is not applied to facility workers because credible hazardous conditions with the potential for significant facility worker consequences are considered for safety-significant SSCs and/or TSR-level controls regardless of their estimated frequency. The controls for protection of the facility workers are described in RPP-23429 and RPP-23479. Determination of the need for safety-class SSCs was performed in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses, as described below.

  9. NOAA Observing System Integrated Analysis (NOSIA): development and support to the NOAA Satellite Observing System Architecture

    NASA Astrophysics Data System (ADS)

    Reining, R. C.; Cantrell, L. E., Jr.; Helms, D.; LaJoie, M.; Pratt, A. S.; Ries, V.; Taylor, J.; Yuen-Murphy, M. A.

    2016-12-01

    There is a deep relationship between NOSIA-II and the Federal Earth Observation Assessment (EOA) efforts (EOA 2012 and 2016) chartered under the National Science and Technology Council, Committee on Environment, Natural Resources, and Sustainability, co-chaired by the White House Office of Science and Technology Policy, NASA, NOAA, and USGS. NOSIA-1, which was conducted with a limited scope internal to NOAA in 2010, developed the methodology and toolset that was adopted for EOA 2012, and NOAA staffed the team that conducted the data collection, modeling, and analysis effort for EOA 2012. EOA 2012 was the first-ever integrated analysis of the relative impact of 379 observing systems and data sources contributing to the key objectives identified for 13 Societal Benefit Areas (SBA) including Weather, Climate, Disasters, Oceans and Coastal Resources, and Water Resources. This effort culminated in the first National Plan for Civil Earth Observations. NOAA conducted NOSIA-II starting in 2012 to extend the NOSIA methodology across all of NOAA's Mission Service Areas, covering a representative sample (over 1000) of NOAA's products and services. The detailed information from NOSIA-II is being integrated into EOA 2016 to underpin a broad array of Key Products, Services, and (science) Objectives (KPSO) identified by the inter-agency SBA teams. EOA 2016 is expected to provide substantially greater insight into the cross-agency impacts of observing systems contributing to a wide array of KPSOs, and by extension, to societal benefits flowing from these public-facing products. NOSIA-II is being adopted by NOAA as a corporate decision-analysis and support capability to inform leadership decisions on its integrated observing systems portfolio. Application examples include assessing the agency-wide impacts of planned decommissioning of ships and aircraft in NOAA's fleet, and the relative cost-effectiveness of alternative space-based architectures in the post-GOES-R and JPSS era

  10. Standardised care plans for in hospital stroke care improve documentation of health care assessments.

    PubMed

    Pöder, Ulrika; Dahm, Marie Fogelberg; Karlsson, Nina; Wadensten, Barbro

    2015-10-01

    To compare stroke unit staff members' documentation of care in line with evidence-based guidelines pre- and postimplementation of a multi-professional, evidence-based standardised care plan for stroke care in the electronic health record. Rapid and effective measures for patients with stroke or suspected stroke can limit the extent of damage; it is imperative that patients be observed, assessed and treated in accordance with evidence-based practice in hospital. Quantitative, comparative. Structured retrospective health record reviews were made prior to (n = 60) and one and a half years after implementation (n = 60) of a multi-professional evidence-based standardised care plan with a quality standard for stroke care in the electronic health record. Significant improvements were found in documentation of assessed vital signs, except for body temperature, Day 1 post compared with preimplementation. Documentation frequency regarding body temperature Day 1 and blood pressure and pulse Day 2 decreased post compared with preimplementation. Improvements were also detected in documented observations of patients' micturition capacity, swallowing capacity and mouth status and the proportion of physiotherapist-documented aid assessments. Observations of blood glucose, mobilisation ability and speech and communication ability were unchanged. An evidence-based standardised care plan in an electronic health record assists staff in improving documentation of health status assessments during the first days after a stroke diagnosis. Use of a standardised care plan seems to have the potential to help staff adhere to evidence-based patient care and, thereby, to increase patient safety. © 2015 John Wiley & Sons Ltd.

  11. Hurricane Ike: Observations and Analysis of Coastal Change

    USGS Publications Warehouse

    Doran, Kara S.; Plant, Nathaniel G.; Stockdon, Hilary F.; Sallenger, Asbury H.; Serafin, Katherine A.

    2009-01-01

    Understanding storm-induced coastal change and forecasting these changes require knowledge of the physical processes associated with the storm and the geomorphology of the impacted coastline. The primary physical processes of interest are the wind field, storm surge, and wave climate. Not only does wind cause direct damage to structures along the coast, but it is ultimately responsible for much of the energy that is transferred to the ocean and expressed as storm surge, mean currents, and large waves. Waves and currents are the processes most responsible for moving sediments in the coastal zone during extreme storm events. Storm surge, the rise in water level due to the wind, barometric pressure, and other factors, allows both waves and currents to attack parts of the coast not normally exposed to those processes. Coastal geomorphology, including shapes of the shoreline, beaches, and dunes, is equally important to the coastal change observed during extreme storm events. Relevant geomorphic variables include sand dune elevation, beach width, shoreline position, sediment grain size, and foreshore beach slope. These variables, in addition to hydrodynamic processes, can be used to predict coastal vulnerability to storms. The U.S. Geological Survey's (USGS) National Assessment of Coastal Change Hazards Project (http://coastal.er.usgs.gov/hurricanes) strives to provide hazard information to those interested in the Nation's coastlines, including residents of coastal areas, government agencies responsible for coastal management, and coastal researchers. As part of the National Assessment, observations were collected to measure coastal changes associated with Hurricane Ike, which made landfall near Galveston, Texas, on September 13, 2008. Methods of observation included aerial photography and airborne topographic surveys. This report documents these data-collection efforts and presents qualitative and quantitative descriptions of hurricane-induced changes to the shoreline

  12. DEM analysis of FOXSI-2 microflare using AIA observations

    NASA Astrophysics Data System (ADS)

    Athiray Panchapakesan, Subramania; Glesener, Lindsay; Vievering, Juliana; Camilo Buitrago-Casas, Juan; Christe, Steven; Inglis, Andrew; Krucker, Sam; Musset, Sophie

    2017-08-01

    The second flight of the Focusing Optics X-ray Solar Imager (FOXSI) sounding rocket experiment was successfully completed on 11 December 2014. FOXSI makes direct imaging and spectral observations of the Sun in hard X-rays in the energy range 4 to 20 keV, using grazing-incidence optics modules that focus X-rays onto seven focal plane detectors kept at a 2 m distance, to study particle acceleration and coronal heating. Significant HXR emissions were observed by FOXSI during microflare events of GOES class A0.5 and A2.5 that occurred during the FOXSI-2 flight. Spectral analysis of FOXSI data for these events indicates the presence of plasma at higher temperatures (>10 MK). We attempt to study the plasma content in the corona at different temperatures, characterized by the differential emission measure (DEM), over the FOXSI-2 observed flare regions using Atmospheric Imaging Assembly (SDO/AIA) data. We utilize AIA observations in different EUV filters that are sensitive to ionized iron lines to determine the DEM by using a regularized inversion method. This poster will show the properties of hot plasma as derived from FOXSI-2 HXR spectra with supporting DEM analysis using AIA observations.
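
    The regularized inversion referred to above solves the usual ill-posed relation between the intensity observed in each AIA channel and the differential emission measure. In schematic form (the notation is assumed here, not taken from the abstract):

      I_i = \int K_i(T)\,\mathrm{DEM}(T)\,\mathrm{d}T,
      \qquad
      \mathrm{DEM}(T) = n_e^{2}\,\frac{\mathrm{d}h}{\mathrm{d}T},

    where K_i(T) is the temperature response of channel i, and a regularized solution minimizes \|\mathbf{K}\mathbf{x}-\mathbf{I}\|^{2} + \lambda\,\|\mathbf{L}\mathbf{x}\|^{2} over non-negative DEM values \mathbf{x}.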

  13. Falls documentation in nursing homes: agreement between the minimum data set and chart abstractions of medical and nursing documentation.

    PubMed

    Hill-Westmoreland, Elizabeth E; Gruber-Baldini, Ann L

    2005-02-01

    To assess the agreement between falls as recorded in the Minimum Data Set (MDS) and fall events abstracted from chart documentation of elderly nursing home (NH) residents. Secondary analysis of data from a longitudinal panel study. Fifty-six randomly selected NHs in Maryland stratified by facility size and geographic region. Four hundred sixty-two NH residents, aged 65 and older, in NHs for 1 year. Falls were abstracted from resident charts and compared with MDS fall variables. Fall events data obtained from other sources of chart documentation were matched for the corresponding periods of 30 and 180 days before the 1-year MDS assessment date. For a 30-day period, concordance between the MDS and chart abstractions of falls occurred in 65% of cases, with a kappa coefficient of 0.29 (P<.001), indicating fair agreement. Concordance occurred between the sources for 75% of cases for a 180-day period, with a kappa of 0.50 (P<.001), indicating moderate agreement. During the 180-day period, chart abstractions showed that 49% of the sample fell, whereas the MDS revealed that only 28% fell. An analysis of residents whose falls the MDS missed indicated that these residents had significantly more activity of daily living impairment and significantly less unsteady gait and cane/walker use. The MDS underreported falls. Nurses completing MDS assessments must carefully review residents' medical records for falls documentation. Future studies should use caution when employing MDS data as the only indicator of falls.
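
    The kappa coefficients reported above quantify chance-corrected agreement between two binary sources (fell / did not fall). A minimal Python sketch of Cohen's kappa for such paired ratings is given below; it is an illustration of the statistic, not the study's analysis code.

      # Cohen's kappa for paired binary ratings (e.g. MDS vs. chart abstraction).
      def cohens_kappa(pairs):
          pairs = list(pairs)              # iterable of (rating_a, rating_b) booleans
          n = len(pairs)
          observed = sum(a == b for a, b in pairs) / n
          p_a = sum(a for a, _ in pairs) / n
          p_b = sum(b for _, b in pairs) / n
          expected = p_a * p_b + (1 - p_a) * (1 - p_b)
          return (observed - expected) / (1 - expected)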

  14. The Computable Catchment: An executable document for model-data software sharing, reproducibility and interactive visualization

    NASA Astrophysics Data System (ADS)

    Gil, Y.; Duffy, C.

    2015-12-01

    This paper proposes the concept of a "Computable Catchment" which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a pdf, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future landuse or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a-priori or initial data is derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future representing an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.

  15. Guidelines for Document Designers.

    ERIC Educational Resources Information Center

    Felker, Daniel B.; And Others

    Intended to improve the quality of public documents by making them clearer to the people who use them, this book contains document design principles concerned with writing documents that are visually distinct, attractive, and easily understood. Following an introduction, the major portion of the book presents the 25 principles, each of which…

  16. Pretest analysis document for Test S-FS-6

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, R.A.; Hall, D.G.

    This report documents the pretest analyses completed for Semiscale Test S-FS-6. This test will simulate a transient initiated by a 100% break in a steam generator bottom feedwater line downstream of the check valve. The initial conditions represent normal operating conditions for a C-E System 80 nuclear power plant. Predictions of transients resulting from feedwater line breaks in these plants have indicated that significant primary system overpressurization may occur. The enclosed analyses include a RELAP5/MOD2/CY21 code calculation and preliminary results from a facility hot, integrated test which was conducted to near S-FS-6 specifications. The results of these analyses indicate that the test objectives for Test S-FS-6 can be achieved. The primary system overpressurization will pose no threat to personnel or plant integrity.

  17. Extended Subject Access to Hypertext Online Documentation. Part III: The Document-Boundaries Problem.

    ERIC Educational Resources Information Center

    Girill, T. R.

    1991-01-01

    This article continues the description of DFT (Document, Find, Theseus), an online documentation system that provides computer-managed on-demand printing of software manuals as well as the interactive retrieval of reference passages. Document boundaries in the hypertext database are discussed, search vocabulary complexities are described, and text…

  18. Testing inter-observer reliability of the Transition Analysis aging method on the William M. Bass forensic skeletal collection.

    PubMed

    Fojas, Christina L; Kim, Jieun; Minsky-Rowland, Jocelyn D; Algee-Hewitt, Bridget F B

    2018-01-01

    Skeletal age estimation is an integral part of the biological profile. Recent work shows how multiple-trait approaches better capture senescence as it occurs at different rates among individuals. Furthermore, a Bayesian statistical framework of analysis provides more useful age estimates. The component-scoring method of Transition Analysis (TA) may resolve many of the functional and statistical limitations of traditional phase-aging methods and is applicable to both paleodemography and forensic casework. The present study contributes to TA-research by validating TA for multiple, differently experienced observers using a collection of modern forensic skeletal cases. Five researchers independently applied TA to a random sample of 58 documented individuals from the William M. Bass Forensic Skeletal Collection, for whom knowledge of chronological age was withheld. Resulting scores were input into the ADBOU software and maximum likelihood estimates (MLEs) and 95% confidence intervals (CIs) were produced using the forensic prior. Krippendorff's alpha was used to evaluate interrater reliability and agreement. Inaccuracy and bias were measured to gauge the magnitude and direction of difference between estimated ages and chronological ages among the five observers. The majority of traits had moderate to excellent agreement among observers (≥0.6). The superior surface morphology had the least congruence (0.4), while the ventral symphyseal margin had the most (0.9) among scores. Inaccuracy was the lowest for individuals younger than 30 and the greatest for individuals over 60. Consistent over-estimation of individuals younger than 30 and under-estimation of individuals over 40 years old occurred. Individuals in their 30s showed a mixed pattern of under- and over-estimation among observers. These results support the use of the TA method by researchers of varying experience levels. Further, they validate its use on forensic cases, given the low error overall. © 2017 Wiley

  19. Analysis of Spatial Autocorrelation for Optimal Observation Network in Korea

    NASA Astrophysics Data System (ADS)

    Park, S.; Lee, S.; Lee, E.; Park, S. K.

    2016-12-01

    Many studies aimed at improving the prediction of high-impact weather have been implemented, such as THORPEX (The Observing System Research and Predictability Experiment), FASTEX (Fronts and Atlantic Storm-Track Experiment), NORPEX (North Pacific Experiment), WSR/NOAA (Winter Storm Reconnaissance), and DOTSTAR (Dropwindsonde Observations for Typhoon Surveillance near the TAiwan Region). One of the most important objectives in these studies is to determine the effect of observations on forecasts and to establish an optimal observation network. However, there is a lack of such studies for Korea, although the Korean peninsula exhibits highly complex terrain that makes its weather phenomena difficult to predict. To build an optimal future observation network, it is necessary to increase the utilization of numerical weather prediction and to improve monitoring, tracking, and prediction skills for high-impact weather in Korea. Therefore, we perform a preliminary study to understand the spatial scale for an expansion of the observation system through Spatial Autocorrelation (SAC) analysis. In addition, we will develop a testbed system to design an optimal observation network. The analysis is conducted with Automatic Weather System (AWS) rainfall data, global upper-air grid observations (i.e., temperature, pressure, humidity), and Himawari satellite data (i.e., water vapor) for Korea during 2013-2015. This study will provide a guideline for constructing an observation network that not only improves weather prediction skill but is also cost-effective.
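
    Spatial autocorrelation of station observations is commonly summarized with Moran's I; the abstract does not specify the exact SAC statistic used, so the following Python sketch is only an illustrative example of how such an analysis could start from a vector of station values and a spatial weight matrix.

      # Moran's I: (n / W) * sum_ij w_ij z_i z_j / sum_i z_i^2, with z the centred values.
      import numpy as np

      def morans_i(values: np.ndarray, weights: np.ndarray) -> float:
          n = len(values)
          z = values - values.mean()
          numerator = n * (weights * np.outer(z, z)).sum()
          denominator = weights.sum() * (z ** 2).sum()
          return numerator / denominator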

  20. BENCHMARK DOSE TECHNICAL GUIDANCE DOCUMENT ...

    EPA Pesticide Factsheets

    The purpose of this document is to provide guidance for the Agency on the application of the benchmark dose approach in determining the point of departure (POD) for health effects data, whether a linear or nonlinear low dose extrapolation is used. The guidance includes discussion on computation of benchmark doses and benchmark concentrations (BMDs and BMCs) and their lower confidence limits, data requirements, dose-response analysis, and reporting requirements. This guidance is based on today's knowledge and understanding, and on experience gained in using this approach.

  1. Automated Management Of Documents

    NASA Technical Reports Server (NTRS)

    Boy, Guy

    1995-01-01

    Report presents main technical issues involved in computer-integrated documentation. Problems associated with automation of management and maintenance of documents analyzed from perspectives of artificial intelligence and human factors. Technologies that may prove useful in computer-integrated documentation reviewed: these include conventional approaches to indexing and retrieval of information, use of hypertext, and knowledge-based artificial-intelligence systems.

  2. Analysis of British American Tobacco's questionable use of privilege and protected document claims at the Guildford Depository

    PubMed Central

    LeGresley, Eric; Lee, Kelley

    2017-01-01

    Background: Tobacco companies have a documented history of attempting to hide information from public scrutiny, including inappropriate privilege claims. The 1998 Minnesota Consent Judgement created two depositories to provide public access to discovered documents. Users raised concerns about the access conditions and ongoing integrity of the Guildford Depository collection operated until 2015 by British American Tobacco (BAT). Methods: A metadata search of the Legacy Tobacco Documents Library identified inconsistent privilege claims, and duplicates of documents withheld by BAT from public visitors. A review of the validity of claims, for documents obtained through these searches, was conducted against recognised legal definitions of privilege. Findings: BAT has asserted inappropriate privilege claims over 49% of the documents reviewed (n=63). The quantity of such claims and consistency of the stated rationale for the privilege claims suggest a concerted effort rather than human error. Conclusions: There was insufficient attention given to the operation of the Guildford Depository by the original plaintiffs, including to the subsequent use of privilege claims. Appropriate access to these documents, commensurate with the terms of legal settlements creating the collection, was critical given their public interest value for enhancing understanding of industry strategies and activities, informing of policy interventions, and for holding the industry to account. Future legal settlements should prevent defendants from subsequently withholding disclosed documents, aside from those legitimately privileged, from public view. Control of publicly disclosed documents should not be placed back into the hands of defendant tobacco companies. Plaintiffs also need to invest adequate resources into policing claims of legal privilege. PMID:27354678

  3. Slant correction for handwritten English documents

    NASA Astrophysics Data System (ADS)

    Shridhar, Malayappan; Kimura, Fumitaka; Ding, Yimei; Miller, John W. V.

    2004-12-01

    Optical character recognition of machine-printed documents is an effective means for extracting textural material. While the level of effectiveness for handwritten documents is much poorer, progress is being made in more constrained applications such as personal checks and postal addresses. In these applications a series of steps is performed for recognition beginning with removal of skew and slant. Slant is a characteristic unique to the writer and varies from writer to writer in which characters are tilted some amount from vertical. The second attribute is the skew that arises from the inability of the writer to write on a horizontal line. Several methods have been proposed and discussed for average slant estimation and correction in the earlier papers. However, analysis of many handwritten documents reveals that slant is a local property and slant varies even within a word. The use of an average slant for the entire word often results in overestimation or underestimation of the local slant. This paper describes three methods for local slant estimation, namely the simple iterative method, high-speed iterative method, and the 8-directional chain code method. The experimental results show that the proposed methods can estimate and correct local slant more effectively than the average slant correction.
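
    Once a local slant angle has been estimated (by any of the three methods above), correction amounts to shearing the affected region so that strokes become vertical. The Python sketch below is an assumed, simplified illustration of that shear step only, not the authors' estimation algorithms.

      # Remove slant by shearing each row in proportion to its height above the baseline.
      import numpy as np

      def deslant(image: np.ndarray, slant_deg: float) -> np.ndarray:
          h, w = image.shape
          shear = np.tan(np.radians(slant_deg))
          out = np.zeros_like(image)
          for y in range(h):
              offset = int(round((h - 1 - y) * shear))
              for x in range(w):
                  nx = x - offset
                  if 0 <= nx < w:
                      out[y, nx] = image[y, x]
          return out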

  4. Ancient administrative handwritten documents: X-ray analysis and imaging

    PubMed Central

    Albertin, F.; Astolfo, A.; Stampanoni, M.; Peccenini, Eva; Hwu, Y.; Kaplan, F.; Margaritondo, G.

    2015-01-01

    Handwritten characters in administrative antique documents from three centuries have been detected using different synchrotron X-ray imaging techniques. Heavy elements in ancient inks, present even for everyday administrative manuscripts as shown by X-ray fluorescence spectra, produce attenuation contrast. In most cases the image quality is good enough for tomography reconstruction in view of future applications to virtual page-by-page ‘reading’. When attenuation is too low, differential phase contrast imaging can reveal the characters from refractive index effects. The results are potentially important for new information harvesting strategies, for example from the huge Archivio di Stato collection, objective of the Venice Time Machine project. PMID:25723946

  5. Ancient administrative handwritten documents: X-ray analysis and imaging.

    PubMed

    Albertin, F; Astolfo, A; Stampanoni, M; Peccenini, Eva; Hwu, Y; Kaplan, F; Margaritondo, G

    2015-03-01

    Handwritten characters in administrative antique documents from three centuries have been detected using different synchrotron X-ray imaging techniques. Heavy elements in ancient inks, present even for everyday administrative manuscripts as shown by X-ray fluorescence spectra, produce attenuation contrast. In most cases the image quality is good enough for tomography reconstruction in view of future applications to virtual page-by-page `reading'. When attenuation is too low, differential phase contrast imaging can reveal the characters from refractive index effects. The results are potentially important for new information harvesting strategies, for example from the huge Archivio di Stato collection, objective of the Venice Time Machine project.

  6. 32 CFR 651.8 - Disposition of final documents.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    32 National Defense 4 (2010-07-01). Disposition of final documents. Section 651.8; National Defense; Department of Defense (Continued); Department of the Army (Continued); Environmental Quality; Environmental Analysis of Army Actions (AR 200-2); Introduction. § 651.8 Disposition of...

  7. Duplicate document detection in DocBrowse

    NASA Astrophysics Data System (ADS)

    Chalana, Vikram; Bruce, Andrew G.; Nguyen, Thien

    1998-04-01

    Duplicate documents are frequently found in large databases of digital documents, such as those found in digital libraries or in the government declassification effort. Efficient duplicate document detection is important not only to allow querying for similar documents, but also to filter out redundant information in large document databases. We have designed three different algorithms to identify duplicate documents. The first algorithm is based on features extracted from the textual content of a document, the second algorithm is based on wavelet features extracted from the document image itself, and the third algorithm is a combination of the first two. These algorithms are integrated within the DocBrowse system for information retrieval from document images, which is currently under development at MathSoft. DocBrowse supports duplicate document detection by allowing (1) automatic filtering to hide duplicate documents, and (2) ad hoc querying for similar or duplicate documents. We have tested the duplicate document detection algorithms on 171 documents and found that the text-based method has an average 11-point precision of 97.7 percent while the image-based method has an average 11-point precision of 98.9 percent. However, in general, the text-based method performs better when the document contains enough high-quality machine-printed text while the image-based method performs better when the document contains little or no quality machine-readable text.
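
    As a rough illustration of the text-based approach described above, documents can be compared by cosine similarity of term-frequency vectors and flagged as duplicates above a threshold. The sketch below is an assumed stand-in; the actual DocBrowse text features are not detailed in the abstract.

      # Cosine similarity of term-frequency vectors as a simple duplicate check.
      from collections import Counter
      import math

      def cosine_similarity(text_a: str, text_b: str) -> float:
          va, vb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
          dot = sum(va[t] * vb[t] for t in va.keys() & vb.keys())
          norm = math.sqrt(sum(c * c for c in va.values())) * math.sqrt(sum(c * c for c in vb.values()))
          return dot / norm if norm else 0.0

      def is_duplicate(text_a: str, text_b: str, threshold: float = 0.9) -> bool:
          return cosine_similarity(text_a, text_b) >= threshold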

  8. Study of parameters of the nearest neighbour shared algorithm on clustering documents

    NASA Astrophysics Data System (ADS)

    Mustika Rukmi, Alvida; Budi Utomo, Daryono; Imro’atus Sholikhah, Neni

    2018-03-01

    Document clustering is one way of automatically managing documents, extracting document topics and quickly filtering information. Preprocessing of the documents is performed with text mining and consists of keyword extraction using Rapid Automatic Keyphrase Extraction (RAKE) and representation of each document as a concept vector using Latent Semantic Analysis (LSA). The clustering step then groups documents with similar topics into the same cluster, based on this preprocessing. The Shared Nearest Neighbour (SNN) algorithm is a clustering method based on the number of "nearest neighbours" that documents share. The parameters of the SNN algorithm are: k, the number of nearest-neighbour documents; ε, the number of shared nearest-neighbour documents; and MinT, the minimum number of similar documents that can form a cluster. The SNN algorithm is based on shared-neighbour properties: each cluster is formed by keywords shared by its documents, and a cluster can be built around more than one keyword if those keywords appear frequently in the documents. The parameter values chosen for the SNN algorithm affect the clustering results. A higher value of k increases the number of neighbour documents of each document, so the similarity among neighbouring documents is lower and the accuracy of each cluster is also lower. A higher value of ε causes each document to retain only neighbour documents with high similarity when building a cluster, and it also leaves more documents unclustered (noise). A higher value of MinT decreases the number of clusters, since groups of similar documents smaller than MinT cannot form a cluster. The parameters of the SNN algorithm therefore determine the quality of the clustering result and the amount of noise (unclustered documents). The Silhouette coefficient shows almost the same result in many experiments, above 0.9, which means that the SNN algorithm works well.
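
    The role of the three parameters can be made concrete with a small sketch. The Python below is a simplified illustration of shared-nearest-neighbour clustering (cosine similarity on LSA-style document vectors, k nearest neighbours, ε shared neighbours, MinT minimum cluster size); it is not the authors' implementation.

      # Simplified shared-nearest-neighbour (SNN) clustering sketch.
      import numpy as np

      def snn_clusters(vectors: np.ndarray, k: int, eps: int, min_t: int):
          n = len(vectors)
          normed = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
          sim = normed @ normed.T
          knn = [set(np.argsort(-sim[i])[1:k + 1]) for i in range(n)]   # k nearest neighbours
          labels = [-1] * n                                             # -1 = unclustered (noise)
          cluster = 0
          for i in range(n):
              if labels[i] != -1:
                  continue
              members = [j for j in range(n)
                         if labels[j] == -1 and len(knn[i] & knn[j]) >= eps]
              if len(members) >= min_t:                                 # MinT: minimum cluster size
                  for j in members:
                      labels[j] = cluster
                  cluster += 1
          return labels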

  9. Embedding the shapes of regions of interest into a Clinical Document Architecture document.

    PubMed

    Minh, Nguyen Hai; Yi, Byoung-Kee; Kim, Il Kon; Song, Joon Hyun; Binh, Pham Viet

    2015-03-01

    Sharing a medical image visually annotated by a region of interest with a remotely located specialist for consultation is a good practice. It may, however, require a special-purpose (and most likely expensive) system to send and view them, which is an unfeasible solution in developing countries such as Vietnam. In this study, we design and implement interoperable methods based on the HL7 Clinical Document Architecture and Extensible Stylesheet Language Transformations (XSLT) standards to seamlessly exchange and visually present the shapes of regions of interest using web browsers. We also propose a new integration architecture for a Clinical Document Architecture generator that enables embedding of regions of interest and simultaneous auto-generation of corresponding style sheets. Using the Clinical Document Architecture document and style sheet, a sender can transmit clinical documents and medical images together with coordinate values of regions of interest to recipients. Recipients can easily view the documents and display embedded regions of interest by rendering them in their web browser of choice. © The Author(s) 2014.
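
    As a rough illustration of the embedding idea only (not the actual HL7 CDA R2 element set), a sender-side generator might attach region-of-interest coordinate values to the document along these lines; the element and attribute names below are simplified placeholders.

      # Illustrative only: simplified placeholder element/attribute names, not the
      # exact HL7 CDA R2 vocabulary; the point is that an ROI shape code and its
      # coordinate values travel inside the XML document itself.
      import xml.etree.ElementTree as ET

      def embed_region_of_interest(doc, shape_code, coords):
          roi = ET.SubElement(doc, "regionOfInterest", {"shapeCode": shape_code})
          for x, y in coords:
              ET.SubElement(roi, "value", {"x": str(x), "y": str(y)})
          return roi

      root = ET.Element("ClinicalDocument")
      embed_region_of_interest(root, "POLY", [(120, 80), (160, 80), (160, 130), (120, 130)])
      print(ET.tostring(root, encoding="unicode"))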

  10. Dynamical analysis of Jovian polar observations by Juno

    NASA Astrophysics Data System (ADS)

    Tabataba-Vakili, Fachreddin; Orton, Glenn S.; Adriani, Alberto; Eichstaedt, Gerald; Grassi, Davide; Ingersoll, Andrew P.; Li, Cheng; Hansen, Candice; Momary, Thomas W.; Moriconi, Maria Luisa; Mura, Alessandro; Read, Peter L.; Rogers, John; Young, Roland M. B.

    2017-10-01

    The JunoCAM and JIRAM instruments onboard the Juno spacecraft have generated unparalleled observations of the Jovian polar regions. These observations reveal a turbulent environment with an unexpected structure of cyclonic polar vortices. We measure the wind velocity in the polar region using correlation image velocimetry of consecutive images. From this data, we calculate the kinetic energy fluxes between different length scales. An analysis of the kinetic energy spectra and eddy-zonal flow interactions may improve our understanding of the mechanisms maintaining the polar macroturbulence in the Jovian atmosphere.
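
    A minimal sketch of the velocimetry step is given below, assuming plain normalized cross-correlation over a search window between two co-registered frames; the window and search sizes, and the brute-force search rather than an FFT, are illustrative choices, not the authors' implementation.

      # Hedged sketch of correlation image velocimetry: estimate the displacement
      # of a small window between consecutive, co-registered images by locating
      # the peak of the normalized cross-correlation over a search range.
      import numpy as np

      def window_displacement(img0, img1, y, x, win=32, search=8):
          ref = img0[y:y + win, x:x + win].astype(float)
          ref -= ref.mean()
          best, best_dy, best_dx = -np.inf, 0, 0
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  if y + dy < 0 or x + dx < 0:
                      continue                            # window ran off the image
                  cand = img1[y + dy:y + dy + win, x + dx:x + dx + win].astype(float)
                  if cand.shape != ref.shape:
                      continue
                  cand -= cand.mean()
                  denom = np.sqrt((ref ** 2).sum() * (cand ** 2).sum()) + 1e-12
                  score = (ref * cand).sum() / denom
                  if score > best:
                      best, best_dy, best_dx = score, dy, dx
          return best_dy, best_dx   # pixels per frame; divide by frame interval for velocity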

  11. An Observational Study of the Internal Structure of Airmass Thunderstorms

    NASA Astrophysics Data System (ADS)

    Kingsmill, David Edmund

    The internal structure of airmass thunderstorms is examined with Doppler and dual-polarization radar, photographic, and rawinsonde data from the 1986 MIST project. A kinematic, dynamic and thermodynamic analysis of one well-documented case shows a life cycle which closely resembles the Byers and Braham model for airmass storms. Other less detailed cases, examined to supplement this analysis, largely confirm these findings. However, several phenomena never before documented for this storm type are discussed. One of these is a midlevel inflow, which in one case caused a visible constriction in a storm cloud. This inflow appears to arise from the mass compensation required when a strong updraft, driven by buoyancy from glaciation, forms above a weaker updraft loaded down by the precipitation core. A downdraft at midlevels with an associated "weak-echo" trench is also observed. Its origin appears related to a shear-induced wake entrainment process. In addition, microburst-intensity surface outflows are observed. The downdrafts responsible for these events appear to be restricted to low levels and to be separate from the midlevel downdraft. One case shows this type of downdraft to be initiated by precipitation loading and intensified by negative thermal buoyancy. In light of these new features, the Byers and Braham model of the cumulus, mature and dissipating stages is reexamined.

  12. Linking Indigenous Knowledge and Observed Climate Change Studies

    NASA Technical Reports Server (NTRS)

    Alexander, Chief Clarence; Bynum, Nora; Johnson, Liz; King, Ursula; Mustonen, Tero; Neofotis, Peter; Oettle, Noel; Rosenzweig, Cynthia; Sakakibara, Chie; Shadrin, Chief Vyacheslav

    2010-01-01

    We present indigenous knowledge narratives and explore their connections to documented temperature and other climate changes and observed climate change impact studies. We then propose a framework for enhancing the integration of these indigenous narratives of observed climate change with global assessments. Our aim is to contribute to the thoughtful and respectful integration of indigenous knowledge with scientific data and analysis, so that this rich body of knowledge can inform science, and so that indigenous and traditional peoples can use the tools and methods of science for the benefit of their communities if they choose to do so. Enhancing ways of understanding such connections is critical as the Intergovernmental Panel on Climate Change Fifth Assessment process gets underway.

  13. System Documentation Manual.

    ERIC Educational Resources Information Center

    Semmel, Melvyn I.; Olson, Jerry

    The document is a system documentation manual of the Computer-Assisted Teacher Training System (CATTS) developed by the Center for Innovation in Teaching the Handicapped (Indiana University). CATTS is characterized as a system capable of providing continuous, instantaneous, and/or delayed feedback of relevant teacher-student interaction data to a…

  14. Observation impact studies with the Mercator Ocean analysis and forecasting systems

    NASA Astrophysics Data System (ADS)

    Remy, E. D.; Le Traon, P. Y.; Lellouche, J. M.; Drevillon, M.; Turpin, V.; Benkiran, M.

    2016-02-01

    Mercator Ocean produces and delivers real-time ocean analyses and forecasts on a daily basis. The quality of the analysis relies heavily on the availability and quality of the assimilated observations. Tools are developed to estimate the impact of the present network and to help design future evolutions of the observing systems in the context of near-real-time production of ocean analyses and forecasts. OSEs and OSSEs are the main approaches used in this context. They allow the assessment of how effectively a given data set constrains the ocean model circulation through the data assimilation process. Illustrations will mainly focus on the present and future evolution of the Argo observation network and the altimetry constellation, including the potential impact of future SWOT data. Our systems show clear sensitivities to changes in the observation arrays, depending mainly on the specified observation error and the regional dynamics. The impact on non-observed variables can be large and is important to evaluate. Dedicated diagnostics have to be defined to measure the improvements brought by each data set. Alternative approaches to OSEs and OSSEs are also explored: an approximate computation of DFS will be presented and discussed. The limitations of each approach will be discussed in the context of real-time operation.

  15. Utopia documents: linking scholarly literature with research data

    PubMed Central

    Attwood, T. K.; Kell, D. B.; McDermott, P.; Marsh, J.; Pettifer, S. R.; Thorne, D.

    2010-01-01

    Motivation: In recent years, the gulf between the mass of accumulating research data and the massive literature describing and analyzing those data has widened. The need for intelligent tools to bridge this gap, to rescue the knowledge being systematically isolated in literature and data silos, is now widely acknowledged. Results: To this end, we have developed Utopia Documents, a novel PDF reader that semantically integrates visualization and data-analysis tools with published research articles. In a successful pilot with editors of the Biochemical Journal (BJ), the system has been used to transform static document features into objects that can be linked, annotated, visualized and analyzed interactively (http://www.biochemj.org/bj/424/3/). Utopia Documents is now used routinely by BJ editors to mark up article content prior to publication. Recent additions include integration of various text-mining and biodatabase plugins, demonstrating the system's ability to seamlessly integrate on-line content with PDF articles. Availability: http://getutopia.com Contact: teresa.k.attwood@manchester.ac.uk PMID:20823323

  16. Tablet-based cardiac arrest documentation: a pilot study.

    PubMed

    Peace, Jack M; Yuen, Trevor C; Borak, Meredith H; Edelson, Dana P

    2014-02-01

    Conventional paper-based resuscitation transcripts are notoriously inaccurate, often lacking the precision that is necessary for recording a fast-paced resuscitation. The aim of this study was to evaluate whether a tablet computer-based application could improve upon conventional practices for resuscitation documentation. Nurses used either the conventional paper code sheet or a tablet application during simulated resuscitation events. Recorded events were compared to a gold standard record generated from video recordings of the simulations and a CPR-sensing defibrillator/monitor. Events compared included defibrillations, medication deliveries, and other interventions. During the study period, 199 unique interventions were observed in the gold standard record. Of these, 102 occurred during simulations recorded by the tablet application, 78 by the paper code sheet, and 19 during scenarios captured simultaneously by both documentation methods. These occurred over 18 simulated resuscitation scenarios, in which 9 nurses participated. The tablet application had a mean sensitivity of 88.0% for all interventions, compared to 67.9% for the paper code sheet (P=0.001). The median time discrepancy was 3 s for the tablet and 77 s for the paper code sheet when compared to the gold standard (P<0.001). Similar to prior studies, we found that conventional paper-based documentation practices are inaccurate, often misreporting intervention delivery times or missing their delivery entirely. However, our study also demonstrated that a tablet-based documentation method may represent a means to substantially improve resuscitation documentation quality, which could have implications for resuscitation quality improvement and research. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  17. LCS Content Document Application

    NASA Technical Reports Server (NTRS)

    Hochstadt, Jake

    2011-01-01

    My project at KSC during my spring 2011 internship was to develop a Ruby on Rails application to manage Content Documents. A Content Document is a collection of documents and information that describes what software is installed on a Launch Control System computer. It's important for us to make sure the tools we use every day are secure, up-to-date, and properly licensed. Previously, keeping track of this information was done with Excel and Word files passed between different personnel. The goal of the new application is to be able to manage and access the Content Documents through a single database-backed web application. Our LCS team will benefit greatly from this app. Admins will be able to log in securely to keep track of and update the software installed on each computer in a timely manner. We also included exportability, such as attaching additional documents that can be downloaded from the web application. The finished application will ease the process of managing Content Documents while streamlining the procedure. Ruby on Rails is a very powerful web framework and I am grateful to have the opportunity to build this application.

  18. Dynamic reduction of dimensions of a document vector in a document search and retrieval system

    DOEpatents

    Jiao, Yu; Potok, Thomas E.

    2011-05-03

    The method and system of the invention involve processing each new document (20) coming into the system into a document vector (16), and creating a document vector with reduced dimensionality (17) for comparison with the data model (15) without recomputing the data model (15). These operations are carried out by a first computer (11) while a second computer (12) updates the data model (18), which can comprise an initial large group of documents (19) and is premised on computing an initial data model (13, 14, 15) to provide a reference point for determining document vectors from documents processed from the data stream (20).
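
    A sketch of the core idea as summarized in this abstract is given below, with a truncated SVD (LSA) standing in for the data model; the patent's actual model construction is not detailed here, so the fold-in projection is an assumption about one common way to reduce a new document vector without recomputing the model.

      # Hedged sketch: the "data model" is assumed to be a truncated SVD of an
      # initial term-document matrix, computed once; new documents are folded
      # into the k-dimensional space without recomputing that SVD.
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.random((500, 2000))              # initial corpus: 500 docs x 2000 terms
      U, s, Vt = np.linalg.svd(X, full_matrices=False)
      k = 100
      Vk, sk = Vt[:k].T, s[:k]                 # fixed data model (terms x k, and k singular values)

      def reduce_new_document(term_vector):
          """Project a new document's term vector into the existing concept
          space (fold-in), leaving the SVD of the original corpus untouched."""
          return (term_vector @ Vk) / sk

      new_doc = rng.random(2000)
      print(reduce_new_document(new_doc).shape)   # (100,)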

  19. Language Learning in the Public Eye: An Analysis of Newspapers and Official Documents in England

    ERIC Educational Resources Information Center

    Graham, Suzanne; Santos, Denise

    2015-01-01

    This article considers the issue of low levels of motivation for foreign language learning in England by exploring how language learning is conceptualised by different key voices in that country through the examination of written data: policy documents and reports on the UK's language needs, curriculum documents and press articles. The extent to…

  20. Functions and requirements document for interim store solidified high-level and transuranic waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith-Fewell, M.A., Westinghouse Hanford

    1996-05-17

    The functions, requirements, interfaces, and architectures contained within the Functions and Requirements (F&R) Document are based on the information currently contained within the TWRS Functions and Requirements database. The database also documents the set of technically defensible functions and requirements associated with the solidified waste interim storage mission. The F&R Document provides a snapshot in time of the technical baseline for the project. The F&R Document is the product of functional analysis, requirements allocation and architectural structure definition. The technical baseline described in this document is traceable to the TWRS function 4.2.4.1, Interim Store Solidified Waste, and its related requirements, architecture, and interfaces.

  1. SPEAKEASY HELP documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fink, J.K.

    1972-07-01

    The HELP documents provide SPEAKEASY users with concise definitions of most of the words available in the current processors. In this report, the documents are given in a variety of formats to enable one to find specific information quickly. The bulk of this report consists of computer read-out of the HELP library via SPEAKEASY.

  2. Generating Hierarchical Document Indices from Common Denominators in Large Document Collections.

    ERIC Educational Resources Information Center

    O'Kane, Kevin C.

    1996-01-01

    Describes an algorithm for computer generation of hierarchical indexes for document collections. The resulting index, when presented with a graphical interface, provides users with a view of a document collection that permits general browsing and informal search activities via an access method that requires no keyboard entry or prior knowledge of…

  3. Knowledge Construction and Knowledge Representation in High School Students' Design of Hypermedia Documents

    ERIC Educational Resources Information Center

    Chen, Pearl; McGrath, Diane

    2003-01-01

    This study documented the processes of knowledge construction and knowledge representation in high school students' hypermedia design projects. Analysis of knowledge construction in linking and structural building yielded distinct types and subtypes of hypermedia documents, which were characterized by four features of knowledge representation: (a)…

  4. [Written information for patients: From papers to documents].

    PubMed

    Cortés-Criado, M C

    2014-01-01

    There is high variability in the level of information intended for patients, with differing content, format and presentation. The aims were to determine the perceived safety of the patients treated at the Country Hospital of Melilla (HCML) and to assess the quality of the documents using criteria adapted from the «International Patient Decision Aid Standards» (IPDAS). Descriptive study of the documents given to patients by the HCML, including questionnaires on perceived safety, classification of the documents, and the level of adherence to the IPDAS criteria. The information given to patients during their stay in the HCML, their participation in decision-making, and the information about medication did not exceed the average on the acceptance scale. Only 40 of the 131 documents collected were studied, these being the ones published in-house; they were classified, following the definitions of the RAE, into instructions (20), recommendations (14) and guidelines (6). Of these, only 27.5% displayed the hospital logo. In the content analysis according to the IPDAS criteria, there was an overall adherence rate of 24.1% for instructions, 24.8% for recommendations, and 61.5% for guidelines. The perception of patient safety expressed in the questionnaire, and its assessment according to the IPDAS criteria, shows that there is room for significant improvement within the organization. Furthermore, the quality of the patient documentation provided can help decision-making. Copyright © 2013 SECA. Published by Elsevier Espana. All rights reserved.

  5. An analysis of electronic document management in oncology care.

    PubMed

    Poulter, Thomas; Gannon, Brian; Bath, Peter A

    2012-06-01

    In this research in progress, a reference model for the use of electronic patient record (EPR) systems in oncology is described. The model, termed CICERO, comprises technical and functional components, and emphasises usability, clinical safety and user acceptance. One of the functional components of the model, an electronic document and records management (EDRM) system, is monitored in the course of its deployment at a leading oncology centre in the UK. Specifically, the user requirements and design of the EDRM solution are described. The study is interpretative and forms part of a wider research programme to define and validate the CICERO model. Preliminary conclusions confirm the importance of a socio-technical perspective in Onco-EPR system design.

  6. Perspectives on Social Network Analysis for Observational Scientific Data

    NASA Astrophysics Data System (ADS)

    Singh, Lisa; Bienenstock, Elisa Jayne; Mann, Janet

    This chapter is a conceptual look at data quality issues that arise during scientific observations and their impact on social network analysis. We provide examples of the many types of incompleteness, bias and uncertainty that impact the quality of social network data. Our approach is to leverage the insights and experience of observational behavioral scientists familiar with the challenges of making inference when data are not complete, and suggest avenues for extending these to relational data questions. The focus of our discussion is on network data collection using observational methods because they contain high dimensionality, incomplete data, varying degrees of observational certainty, and potential observer bias. However, the problems and recommendations identified here exist in many other domains, including online social networks, cell phone networks, covert networks, and disease transmission networks.

  7. Analysis of British American Tobacco's questionable use of privilege and protected document claims at the Guildford Depository.

    PubMed

    LeGresley, Eric; Lee, Kelley

    2017-05-01

    Tobacco companies have a documented history of attempting to hide information from public scrutiny, including inappropriate privilege claims. The 1998 Minnesota Consent Judgement created two depositories to provide public access to discovered documents. Users raised concerns about the access conditions and ongoing integrity of the Guildford Depository collection operated until 2015 by British American Tobacco (BAT). A metadata search of the Legacy Tobacco Documents Library identified inconsistent privilege claims, and duplicates of documents withheld by BAT from public visitors. A review of the validity of claims, for documents obtained through these searches, was conducted against recognised legal definitions of privilege. BAT has asserted inappropriate privilege claims over 49% of the documents reviewed (n=63). The quantity of such claims and consistency of the stated rationale for the privilege claims suggest a concerted effort rather than human error. There was insufficient attention given to the operation of the Guildford Depository by the original plaintiffs, including to the subsequent use of privilege claims. Appropriate access to these documents, commensurate with the terms of legal settlements creating the collection, was critical given their public interest value for enhancing understanding of industry strategies and activities, informing of policy interventions, and for holding the industry to account. Future legal settlements should prevent defendants from subsequently withholding disclosed documents, aside from those legitimately privileged, from public view. Control of publicly disclosed documents should not be placed back into the hands of defendant tobacco companies. Plaintiffs also need to invest adequate resources into policing claims of legal privilege. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  8. Text line extraction in free style document

    NASA Astrophysics Data System (ADS)

    Shen, Xiaolu; Liu, Changsong; Ding, Xiaoqing; Zou, Yanming

    2009-01-01

    This paper addresses text line extraction in free-style documents, such as business cards, envelopes, posters, etc. In a free-style document, global properties such as character size and line direction can hardly be determined, which reveals a serious limitation of traditional layout analysis. The 'line' is the most prominent and highest-level structure in our bottom-up method. First, we apply a novel intensity function based on gradient information to locate text areas, where gradients within a window have large magnitudes and various directions, and split such areas into text pieces. We build a probability model of lines consisting of text pieces via statistics on training data. For an input image, we group text pieces into lines using a simulated annealing algorithm with a cost function based on the probability model.
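
    A rough sketch of the first step follows, assuming one plausible reading of the gradient-based intensity function (strong gradient magnitudes spread over many directions score as text-like); the paper's exact function is not given in the abstract, so this scoring is illustrative only.

      # Hedged sketch: score windows by gradient magnitude and direction spread;
      # an assumed stand-in for the paper's gradient-based intensity function.
      import numpy as np

      def text_likelihood(gray, win=16):
          gy, gx = np.gradient(gray.astype(float))
          mag, ang = np.hypot(gx, gy), np.arctan2(gy, gx)
          h, w = gray.shape
          scores = np.zeros((h // win, w // win))
          for i in range(h // win):
              for j in range(w // win):
                  m = mag[i * win:(i + 1) * win, j * win:(j + 1) * win]
                  a = ang[i * win:(i + 1) * win, j * win:(j + 1) * win]
                  strong = m > m.mean()
                  if np.any(strong):
                      # circular spread of edge directions: 0 = one direction, 1 = uniform
                      spread = 1.0 - np.abs(np.mean(np.exp(1j * a[strong])))
                  else:
                      spread = 0.0
                  scores[i, j] = m.mean() * spread
          return scores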

  9. Automatic extraction of numeric strings in unconstrained handwritten document images

    NASA Astrophysics Data System (ADS)

    Haji, M. Mehdi; Bui, Tien D.; Suen, Ching Y.

    2011-01-01

    Numeric strings such as identification numbers carry vital pieces of information in documents. In this paper, we present a novel algorithm for automatic extraction of numeric strings in unconstrained handwritten document images. The algorithm has two main phases: pruning and verification. In the pruning phase, the algorithm first performs a new segment-merge procedure on each text line, and then using a new regularity measure, it prunes all sequences of characters that are unlikely to be numeric strings. The segment-merge procedure is composed of two modules: a new explicit character segmentation algorithm which is based on analysis of skeletal graphs and a merging algorithm which is based on graph partitioning. All the candidate sequences that pass the pruning phase are sent to a recognition-based verification phase for the final decision. The recognition is based on a coarse-to-fine approach using probabilistic RBF networks. We developed our algorithm for the processing of real-world documents where letters and digits may be connected or broken in a document. The effectiveness of the proposed approach is shown by extensive experiments done on a real-world database of 607 documents which contains handwritten, machine-printed and mixed documents with different types of layouts and levels of noise.

  10. Do published studies of educational outreach provide documentation of potentially important characteristics?

    PubMed

    Van Hoof, Thomas J; Miller, Nicole E; Meehan, Thomas P

    2013-01-01

    Educational outreach is a common intervention used to translate research findings into practice; however, the intervention has a mixed effect on changing clinician behavior and improving patient outcomes. Based on a published set of characteristics aimed at standardizing the approach to educational outreach, the authors undertook a careful review of the literature to determine the consistency and completeness of documentation. Using a 25-item abstraction tool, the authors reviewed 68 published studies of a recent Cochrane meta-analysis to determine the extent to which educational outreach studies provide recommended documentation of important characteristics. The results indicate that studies are generally inconsistent (documentation range of 0% to 100% across characteristics) and incomplete (documentation average of 43.1% across studies) in their descriptions. Documentation shortcomings of educational outreach studies make understanding the intervention and interpreting its findings particularly challenging. The authors recommend the creation of a guideline to help improve documentation of educational outreach efforts.

  11. Improving Data Discovery, Access, and Analysis to More Than Three Decades of Oceanographic and Geomorphologic Observations

    NASA Astrophysics Data System (ADS)

    Forte, M.; Hesser, T.; Knee, K.; Ingram, I.; Hathaway, K. K.; Brodie, K. L.; Spore, N.; Bird, A.; Fratantonio, R.; Dopsovic, R.; Keith, A.; Gadomski, K.

    2016-02-01

    The U.S. Army Engineer Research and Development Center's (USACE ERDC) Coastal and Hydraulics Laboratory (CHL) Coastal Observations and Analysis Branch (COAB) Measurements Program has a 35-year record of coastal observations. These datasets include oceanographic point-source measurements, Real-Time Kinematic (RTK) GPS bathymetry surveys, and remote sensing data from both the Field Research Facility (FRF) in Duck, NC and from other project and experiment sites around the nation. The data have been used to support a variety of USACE mission areas, including coastal wave model development, beach and bar response, coastal project design, coastal storm surge, and other coastal hazard investigations. Furthermore, these data have been widely used by a number of federal and state agencies, academic institutions, and private industries in hundreds of scientific and engineering investigations, publications, conference presentations and model advancement studies. A limiting factor in the use of FRF data has been the lack of rapid, reliable access and publicly available metadata for each data type. The addition of web tools, accessible data files, and well-documented metadata will open the door to much future collaboration. With the help of industry partner RPS ASA and the U.S. Army Corps of Engineers Mobile District Spatial Data Branch, a Data Integration Framework (DIF) was developed. The DIF represents a combination of processes, standards, people, and tools used to transform disconnected enterprise data into useful, easily accessible information for analysis and reporting. A front-end data portal connects the user to the framework that integrates both oceanographic observation and geomorphology measurements using a combination of ESRI and open-source technology while providing a seamless data discovery, access, and analysis experience to the user. The user interface was built with ESRI's JavaScript API and all project metadata is managed using Geoportal. The geomorphology data is made

  12. The Earth System Documentation (ES-DOC) Software Process

    NASA Astrophysics Data System (ADS)

    Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high-quality tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable, standards-based documentation ecosystem that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation ecosystem and currently supports the following projects: * Coupled Model Inter-comparison Project Phase 5 (CMIP5); * Dynamical Core Model Inter-comparison Project (DCMIP); * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC: * iteratively develops and releases working software; * captures user requirements via a narrative-based approach; * uses online collaboration tools (e.g. Earth System CoG) to manage progress; * prototypes applications to validate their feasibility; * leverages meta-programming techniques where appropriate; * automates testing whenever sensibly feasible; * streamlines complex deployments to a single command; * extensively leverages GitHub and Pivotal Tracker; * enforces strict separation of the UI from underlying APIs; * conducts code reviews.

  13. Electron holes observed in the Moon Plasma Wake

    NASA Astrophysics Data System (ADS)

    Hutchinson, I. H.; Malaspina, D.; Zhou, C.

    2017-10-01

    Electrostatic instabilities are predicted in the magnetized wake of plasma flowing past a non-magnetic absorbing object such as a probe or the moon. Analysis of the data from the Artemis satellites, now orbiting the moon at distances of ten moon radii and less, shows very clear evidence of fast-moving isolated solitary potential structures causing bipolar electric field excursions as they pass the satellite's probes. These structures have all the hallmarks of electron holes: BGK solitons typically a few Debye-lengths in size, self-sustaining by a deficit of phase-space density on trapped orbits. Electron holes are now observed to be widespread in space plasmas. They have been observed in PIC simulations of the moon wake to be the non-linear consequence of the predicted electron instabilities. Simulations document hole prevalence, speed, length, and depth; and theory can explain many of these features from kinetic analysis. The solar wind wake is certainly the cause of the overwhelming majority of the holes observed by Artemis, because we observe almost all holes to be in or very near to the wake. We compare theory and simulation of the hole generation, lifetime, and transport mechanisms with observations. Work partially supported by NASA Grant NNX16AG82G.

  14. REVEAL: Software Documentation and Platform Migration

    NASA Technical Reports Server (NTRS)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten-week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validates the efforts.

  15. Bayesian data analysis in observational comparative effectiveness research: rationale and examples.

    PubMed

    Olson, William H; Crivera, Concetta; Ma, Yi-Wen; Panish, Jessica; Mao, Lian; Lynch, Scott M

    2013-11-01

    Many comparative effectiveness research and patient-centered outcomes research studies will need to be observational for one or both of two reasons: first, randomized trials are expensive and time-consuming; and second, only observational studies can answer some research questions. It is generally recognized that there is a need to increase the scientific validity and efficiency of observational studies. Bayesian methods for the design and analysis of observational studies are scientifically valid and offer many advantages over frequentist methods, including, importantly, the ability to conduct comparative effectiveness research/patient-centered outcomes research more efficiently. Bayesian data analysis is being introduced into outcomes studies that we are conducting. Our purpose here is to describe our view of some of the advantages of Bayesian methods for observational studies and to illustrate both realized and potential advantages by describing studies we are conducting in which various Bayesian methods have been or could be implemented.
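
    As a toy illustration of the kind of Bayesian computation involved (not taken from the studies described above), a Beta-Binomial posterior comparison of two observed response rates can be simulated directly; the counts below are invented, and the confounding adjustment that observational studies require is deliberately omitted.

      # Toy illustration only: Beta-Binomial posterior probability that one
      # response rate exceeds another; invented counts, no confounding adjustment.
      import numpy as np

      rng = np.random.default_rng(42)
      a_success, a_total = 78, 120
      b_success, b_total = 64, 118
      post_a = rng.beta(1 + a_success, 1 + a_total - a_success, size=100_000)
      post_b = rng.beta(1 + b_success, 1 + b_total - b_success, size=100_000)
      print("P(rate_A > rate_B | data) ~", np.mean(post_a > post_b))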

  16. AIR QUALITY CRITERIA DOCUMENT(S) FOR LEAD

    EPA Science Inventory

    This collection of documents intends to assess the latest scientific information on the health and environmental fate and effects of lead, in order to provide the scientific bases for periodic review and possible revision of the National Ambient Air Quality Standards (NAAQS) for lead.

  17. From documents to datasets: A MediaWiki-based method of annotating and extracting species observations in century-old field notebooks

    PubMed Central

    Thomer, Andrea; Vaidya, Gaurav; Guralnick, Robert; Bloom, David; Russell, Laura

    2012-01-01

    Abstract Part diary, part scientific record, biological field notebooks often contain details necessary to understanding the location and environmental conditions existent during collecting events. Despite their clear value for (and recent use in) global change studies, the text-mining outputs from field notebooks have been idiosyncratic to specific research projects, and impossible to discover or re-use. Best practices and workflows for digitization, transcription, extraction, and integration with other sources are nascent or non-existent. In this paper, we demonstrate a workflow to generate structured outputs while also maintaining links to the original texts. The first step in this workflow was to place already digitized and transcribed field notebooks from the University of Colorado Museum of Natural History founder, Junius Henderson, on Wikisource, an open text transcription platform. Next, we created Wikisource templates to document places, dates, and taxa to facilitate annotation and wiki-linking. We then requested help from the public, through social media tools, to take advantage of volunteer efforts and energy. After three notebooks were fully annotated, content was converted into XML and annotations were extracted and cross-walked into Darwin Core compliant record sets. Finally, these recordsets were vetted, to provide valid taxon names, via a process we call “taxonomic referencing.” The result is identification and mobilization of 1,068 observations from three of Henderson’s thirteen notebooks and a publishable Darwin Core record set for use in other analyses. Although challenges remain, this work demonstrates a feasible approach to unlock observations from field notebooks that enhances their discovery and interoperability without losing the narrative context from which those observations are drawn. “Compose your notes as if you were writing a letter to someone a century in the future.” Perrine and Patton (2011) PMID:22859891
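
    A minimal sketch of the final cross-walk step, mapping an annotated observation into Darwin Core terms, is shown below; the field values are invented, and only the dwc: term names come from the Darwin Core standard (http://rs.tdwg.org/dwc/terms/).

      # Hedged sketch of the cross-walk: annotated notebook observation -> Darwin Core
      # record. Values are illustrative; only the dwc: term names are standard.
      def to_darwin_core(annotation):
          return {
              "dwc:scientificName": annotation["taxon"],
              "dwc:eventDate": annotation["date"],           # ISO 8601
              "dwc:locality": annotation["place"],
              "dwc:recordedBy": "Junius Henderson",
              "dwc:basisOfRecord": "HumanObservation",
              "dwc:occurrenceRemarks": annotation["verbatim_text"],
          }

      record = to_darwin_core({
          "taxon": "Sciurus aberti",
          "date": "1905-06-14",
          "place": "Boulder County, Colorado",
          "verbatim_text": "Abert's squirrel seen near the creek this morning.",
      })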

  18. Engineering Documentation and Data Control

    NASA Technical Reports Server (NTRS)

    Matteson, Michael J.; Bramley, Craig; Ciaruffoli, Veronica

    2001-01-01

    Mississippi Space Services (MSS), the facility services contractor for NASA's John C. Stennis Space Center (SSC), is utilizing technology to improve engineering documentation and data control. Two identified improvement areas, labor-intensive documentation research and outdated drafting standards, were targeted as top priority. MSS selected AutoManager(R) WorkFlow from Cyco software to manage engineering documentation. The software is currently installed on over 150 desktops. The outdated SSC drafting standard was written for pre-CADD drafting methods, in other words, board drafting. Implementation of COTS software solutions to manage engineering documentation and update the drafting standard resulted in significant increases in productivity by reducing the time spent searching for documents.

  19. Using formal methods for content validation of medical procedure documents.

    PubMed

    Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia

    2017-08-01

    We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology, originally proposed for the verification of software specification documents, on a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text that should be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process aided the specialists to consider a wider range of usage scenarios and to identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.
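
    A much-simplified sketch of the graph idea follows: represent SOP instructions as nodes with "next step" edges and flag steps that are unreachable from the start or that dead-end before the declared final step. The paper's formalism is richer than this, and the step names and edges below are invented for illustration.

      # Hedged sketch: toy graph model of an SOP with reachability and dead-end checks.
      def check_sop(steps, edges, start, end):
          adj = {s: [] for s in steps}
          for src, dst in edges:
              adj[src].append(dst)
          reachable, stack = set(), [start]
          while stack:
              node = stack.pop()
              if node in reachable:
                  continue
              reachable.add(node)
              stack.extend(adj[node])
          unreachable = set(steps) - reachable
          dead_ends = {s for s in reachable if not adj[s] and s != end}
          return {"unreachable": unreachable, "dead_ends": dead_ends}

      print(check_sop(
          steps=["screen", "consent", "dose", "monitor", "report"],
          edges=[("screen", "consent"), ("consent", "dose"), ("dose", "monitor")],
          start="screen", end="report",
      ))   # flags 'report' as unreachable and 'monitor' as a dead end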

  20. Hurricane Gustav: Observations and Analysis of Coastal Change

    USGS Publications Warehouse

    Doran, Kara S.; Stockdon, Hilary F.; Plant, Nathaniel G.; Sallenger, Asbury H.; Guy, Kristy K.; Serafin, Katherine A.

    2009-01-01

    Understanding storm-induced coastal change and forecasting these changes require knowledge of the physical processes associated with a storm and the geomorphology of the impacted coastline. The primary physical processes of interest are the wind field, storm surge, currents, and wave field. Not only does wind cause direct damage to structures along the coast, but it is ultimately responsible for much of the energy that is transferred to the ocean and expressed as storm surge, mean currents, and surface waves. Waves and currents are the processes most responsible for moving sediments in the coastal zone during extreme storm events. Storm surge, which is the rise in water level due to the wind, barometric pressure, and other factors, allows both waves and currents to attack parts of the coast not normally exposed to these processes. Coastal geomorphology, including shapes of the shoreline, beaches, and dunes, is also a significant aspect of the coastal change observed during extreme storms. Relevant geomorphic variables include sand dune elevation, beach width, shoreline position, sediment grain size, and foreshore beach slope. These variables, in addition to hydrodynamic processes, can be used to predict coastal vulnerability to storms. The U.S. Geological Survey (USGS) National Assessment of Coastal Change Hazards project (http://coastal.er.usgs.gov/hurricanes) strives to provide hazard information to those concerned about the Nation's coastlines, including residents of coastal areas, government agencies responsible for coastal management, and coastal researchers. As part of the National Assessment, observations were collected to measure morphological changes associated with Hurricane Gustav, which made landfall near Cocodrie, Louisiana, on September 1, 2008. Methods of observation included oblique aerial photography, airborne topographic surveys, and ground-based topographic surveys. This report documents these data-collection efforts and presents qualitative and

  1. SureChEMBL: a large-scale, chemically annotated patent document database

    PubMed Central

    Papadatos, George; Davies, Mark; Dedman, Nathan; Chambers, Jon; Gaulton, Anna; Siddle, James; Koks, Richard; Irvine, Sean A.; Pettersson, Joe; Goncharoff, Nicko; Hersey, Anne; Overington, John P.

    2016-01-01

    SureChEMBL is a publicly available large-scale resource containing compounds extracted from the full text, images and attachments of patent documents. The data are extracted from the patent literature according to an automated text and image-mining pipeline on a daily basis. SureChEMBL provides access to a previously unavailable, open and timely set of annotated compound-patent associations, complemented with sophisticated combined structure and keyword-based search capabilities against the compound repository and patent document corpus; given the wealth of knowledge hidden in patent documents, analysis of SureChEMBL data has immediate applications in drug discovery, medicinal chemistry and other commercial areas of chemical science. Currently, the database contains 17 million compounds extracted from 14 million patent documents. Access is available through a dedicated web-based interface and data downloads at: https://www.surechembl.org/. PMID:26582922

  2. 34 CFR 668.60 - Deadlines for submitting documentation and the consequences of failing to provide documentation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... consequences of failing to provide documentation. 668.60 Section 668.60 Education Regulations of the Offices of... Deadlines for submitting documentation and the consequences of failing to provide documentation. (a) An... Stafford/Ford Loan programs— (1) If an applicant fails to provide the requested documentation within a...

  3. 34 CFR 668.60 - Deadlines for submitting documentation and the consequences of failing to provide documentation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... consequences of failing to provide documentation. 668.60 Section 668.60 Education Regulations of the Offices of... Deadlines for submitting documentation and the consequences of failing to provide documentation. (a) An... Stafford/Ford Loan programs— (1) If an applicant fails to provide the requested documentation within a...

  4. Ad-Hoc Queries over Document Collections - A Case Study

    NASA Astrophysics Data System (ADS)

    Löser, Alexander; Lutter, Steffen; Düssel, Patrick; Markl, Volker

    We discuss the novel problem of supporting analytical business intelligence queries over web-based textual content, e.g., BI-style reports based on 100,000s of documents from an ad-hoc web search result. Neither conventional search engines nor conventional Business Intelligence and ETL tools address this problem, which lies at the intersection of their capabilities. "Google Squared" and our system GOOLAP.info are examples of these kinds of systems. They execute information extraction methods over one or several document collections at query time and integrate extracted records into a common view or tabular structure. Frequent extraction and object-resolution failures cause incomplete records that cannot be joined into a record answering the query. Our focus is the identification of join-reordering heuristics that maximize the number of complete records answering a structured query. With respect to given costs for document extraction, we propose two novel join operations: the multi-way CJ operator joins records from multiple relationships extracted from a single document; the two-way join operator DJ ensures data density by removing incomplete records from results. In a preliminary case study we observe that our join-reordering heuristics positively impact result size and record density while lowering execution costs.
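
    A small sketch of the DJ (density join) behaviour described above, assuming extracted records are dictionaries with possibly missing fields; record fields and values are invented for illustration.

      # Hedged sketch of a density join: drop incomplete extracted records before
      # joining so that extraction failures do not propagate empty fields.
      def density_join(left, right, key):
          def dense(recs):
              return [r for r in recs if all(v not in (None, "") for v in r.values())]
          index = {}
          for r in dense(right):
              index.setdefault(r[key], []).append(r)
          return [{**l, **r} for l in dense(left) for r in index.get(l[key], [])]

      ceos = [{"company": "Acme", "ceo": "A. Smith"}, {"company": "Globex", "ceo": None}]
      hqs = [{"company": "Acme", "headquarters": "Berlin"}]
      print(density_join(ceos, hqs, key="company"))   # only the complete Acme record joins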

  5. Observational Analysis of Coronal Fans

    NASA Technical Reports Server (NTRS)

    Talpeanu, D.-C.; Rachmeler, L; Mierla, Marilena

    2017-01-01

    Coronal fans are bright observational structures that extend to large distances above the solar surface and can easily be seen in EUV (174 angstrom) above the limb. They have a very long lifetime and can persist for several Carrington rotations (CR), remaining relatively stationary for many months. Note that they are not off-limb manifestations of the similarly named active-region fans. The solar conditions required to create coronal fans are not well understood. The goal of this research was to find as many associations as possible between coronal fans and other solar features and to gain a better understanding of these structures. Therefore, we analyzed many fans and created an overview of their properties. We present the results of this statistical analysis and also a case study of the longest-living fan.

  6. Challenges to nurses' efforts of retrieving, documenting, and communicating patient care information.

    PubMed

    Keenan, Gail; Yakel, Elizabeth; Dunn Lopez, Karen; Tschannen, Dana; Ford, Yvonne B

    2013-01-01

    To examine information flow, a vital component of a patient's care and outcomes, in a sample of multiple hospital nursing units to uncover potential sources of error and opportunities for systematic improvement. This was a qualitative study of a sample of eight medical-surgical nursing units from four diverse hospitals in one US state. We conducted direct work observations of nursing staff's communication patterns for entire shifts (8 or 12 h), for a total of 200 h, and gathered related documentation artifacts for analyses. Data were coded using qualitative content analysis procedures and then synthesized and organized thematically to characterize current practices. Three major themes emerged from the analyses, which represent serious vulnerabilities in the flow of patient care information during nurse hand-offs and to the entire interdisciplinary team across time and settings. The three themes are: (1) variation in nurse documentation and communication; (2) the absence of a centralized care overview in the patient's electronic health record, i.e., one easily accessible by the entire care team; and (3) rarity of interdisciplinary communication. The care information flow vulnerabilities are a catalyst for multiple types of serious and undetectable clinical errors. We have two major recommendations to address the gaps: (1) to standardize the format, content, and words used to document core information, such as the plan of care, and make this easily accessible to all team members; and (2) to conduct extensive usability testing to ensure that tools in the electronic health record help the disconnected interdisciplinary team members to maintain a shared understanding of the patient's plan.

  7. IDC Re-Engineering Phase 2 System Requirements Document V1.3.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Burns, John F.; Satpathi, Meara Allena

    2015-12-01

    This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data but includes requirements for the dissemination of radionuclide data and products.

  8. IDC Re-Engineering Phase 2 System Requirements Document Version 1.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Burns, John F.; Satpathi, Meara Allena

    This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data, but includes requirements for the dissemination of radionuclide data and products.

  9. 34 CFR 668.60 - Deadlines for submitting documentation and the consequences of failing to provide documentation.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... consequences of failing to provide documentation. 668.60 Section 668.60 Education Regulations of the Offices of... § 668.60 Deadlines for submitting documentation and the consequences of failing to provide documentation..., excluding the Federal Pell Grant Program— (1) If an applicant fails to provide the requested documentation...

  10. 34 CFR 668.137 - Deadlines for submitting documentation and the consequences of failure to submit documentation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Deadlines for submitting documentation and the consequences of failure to submit documentation. 668.137 Section 668.137 Education Regulations of the Offices... documentation and the consequences of failure to submit documentation. (a) A student shall submit before a...

  11. Documentation of mountain lions in Marin County, California, 2010–2013

    USGS Publications Warehouse

    Fifield, Virginia L.; Rossi, Aviva J.; Boydston, Erin E.

    2015-01-01

    Prior to 2010, mountain lions (Puma concolor) have rarely been documented in Marin County, California. Although there are reports of sightings of mountain lions or observations of mountain lion sign, most have not been verified by photographs or physical samples. Beginning in 2010, we conducted a pilot study of mountain lions in Marin County using motion-triggered cameras. Our objectives were to obtain additional documentations, confirm the presence of mountain lions outside of Point Reyes National Seashore, and determine if mountain lions had a regular presence in the county. 

  12. 10 CFR 55.27 - Documentation.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Documentation. 55.27 Section 55.27 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) OPERATORS' LICENSES Medical Requirements § 55.27 Documentation. The facility licensee shall document and maintain the results of medical qualifications data, test results, and...

  13. Communication pitfalls of traditional history and physical write-up documentation

    PubMed Central

    Brown, Jeffrey L

    2017-01-01

    Background An unofficial standardized “write-up” outline is commonly used for documenting history and physical examinations, giving oral presentations, and teaching clinical skills. Despite general acceptance, there is an apparent discrepancy between the way clinical encounters are conducted and how they are documented. Methods Fifteen medical school websites were randomly selected from search-engine generated lists. One example of a history and physical write-up from each of six sites, one teaching outline from each of nine additional sites, and recommendations for documentation made in two commonly used textbooks were compared for similarities and differences. Results Except for minor variations in documenting background information, all sampled materials utilized the same standardized format. When the examiners’ early perceptions of the patients’ degree of illness or level of distress were described, they were categorized as “general appearance” within the physical findings. Contrary to clinical practice, none of the examples or recommendations documented these early perceptions before chief concerns and history were presented. Discussion An examiner’s initial perceptions of a patient’s affect, degree of illness, and level of distress can influence the content of the history, triage decisions, and prioritization of likely diagnoses. When chief concerns and history are shared without benefit of this information, erroneous assumptions and miscommunications can result. Conclusion This survey confirms common use of a standardized outline for documenting, communicating, and teaching history-taking and physical examination protocol. The present outline shares early observations out of clinical sequence and may provide inadequate context for accurate interpretation of chief concerns and history. Corrective actions include modifying the documentation sequence to conform to clinical practice and teaching contextual methodology for sharing patient

  14. Life Cycle Management Considerations of Remotely Sensed Geospatial Data and Documentation for Long Term Preservation

    NASA Technical Reports Server (NTRS)

    Khayat, Mohammad G.; Kempler, Steven J.

    2015-01-01

    As geospatial missions age, one of the challenges for the usability of data is the availability of relevant and updated metadata, with sufficient documentation that can be used by future generations of users to gain knowledge from the original data. Given that remote sensing data undergo many intermediate processing steps, an understanding of, for example, the exact algorithms employed and the quality of the data produced could be key considerations for these users. As interest in global climate data is increasing, documentation about older data, their origins, and provenance is valuable to first-time users attempting to perform historical climate research or comparative analysis of global change. Incomplete or missing documentation could be what stands in the way of a new researcher attempting to use the data. Therefore, preservation of documentation and related metadata is sometimes just as critical as the preservation of the original observational data. The Goddard Earth Sciences Data and Information Services Center (GES DISC), a NASA Earth science Distributed Active Archive Center (DAAC) that falls under the management structure of the Earth Science Data and Information System (ESDIS), is actively pursuing the preservation of all necessary artifacts needed by future users. In this paper we will detail the data custodial planning and the data lifecycle process developed for content preservation, our implementation of a Preservation System to safeguard documents and associated artifacts from legacy (older) missions, as well as lessons learned regarding access rights and confidentiality of information issues. We also elaborate on key points that made our preservation effort successful, the primary points being: the drafting of a governing baseline for historical data preservation from satellite missions, and using the historical baseline as a guide to content filtering of what documents to preserve. The Preservation System currently archives

  15. Dysplastic naevus: histological criteria and their inter-observer reproducibility.

    PubMed

    Hastrup, N; Clemmensen, O J; Spaun, E; Søndergaard, K

    1994-06-01

    Forty melanocytic lesions were examined in a pilot study, which was followed by a final series of 100 consecutive melanocytic lesions, in order to evaluate the inter-observer reproducibility of the histological criteria proposed for the dysplastic naevus. The specimens were examined in a blind fashion by four observers. Analysis by kappa statistics showed poor reproducibility of nuclear features, while reproducibility of architectural features was acceptable, improving in the final series. Consequently, we cannot apply the combined criteria of cytological and architectural features with any confidence in the diagnosis of dysplastic naevus, and, until further studies have documented that architectural criteria alone will suffice in the diagnosis of dysplastic naevus, we, as pathologists, shall avoid this term.
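
    For reference, the chance-corrected agreement behind such an analysis can be computed as Cohen's kappa for a pair of observers; this is a generic illustration of the underlying formula, not the study's own multi-rater computation, and the scores below are invented.

      # Generic two-observer Cohen's kappa; the study compares four observers,
      # which would call for pairwise or multi-rater statistics.
      import numpy as np

      def cohens_kappa(rater_a, rater_b):
          a, b = np.asarray(rater_a), np.asarray(rater_b)
          categories = np.union1d(a, b)
          po = np.mean(a == b)                                              # observed agreement
          pe = sum(np.mean(a == c) * np.mean(b == c) for c in categories)   # chance agreement
          return (po - pe) / (1 - pe)

      # e.g. presence/absence of an architectural feature scored on ten lesions
      print(cohens_kappa([1, 1, 0, 1, 0, 0, 1, 1, 0, 1],
                         [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]))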

  16. NASA Hybrid Wing Aircraft Aeroacoustic Test Documentation Report

    NASA Technical Reports Server (NTRS)

    Heath, Stephanie L.; Brooks, Thomas F.; Hutcheson, Florence V.; Doty, Michael J.; Bahr, Christopher J.; Hoad, Danny; Becker, Lawrence; Humphreys, William M.; Burley, Casey L.; Stead, Dan; hide

    2016-01-01

    This report summarizes results of the Hybrid Wing Body (HWB) N2A-EXTE model aeroacoustic test. The N2A-EXTE model was tested in the NASA Langley 14- by 22-Foot Subsonic Tunnel (14x22 Tunnel) from September 12, 2012 until January 28, 2013 and was designated as test T598. This document contains the following main sections: Section 1 - Introduction, Section 2 - Main Personnel, Section 3 - Test Equipment, Section 4 - Data Acquisition Systems, Section 5 - Instrumentation and Calibration, Section 6 - Test Matrix, Section 7 - Data Processing, and Section 8 - Summary. Due to the amount of material to be documented, this HWB test documentation report does not cover analysis of acquired data, which is to be presented separately by the principal investigators. Also, no attempt was made to include preliminary risk reduction tests (such as Broadband Engine Noise Simulator and Compact Jet Engine Simulator characterization tests, shielding measurement technique studies, and speaker calibration method studies), which were performed in support of this HWB test. Separate reports containing these preliminary tests are referenced where applicable.

  17. An analysis of satellite state vector observability using SST tracking data

    NASA Technical Reports Server (NTRS)

    Englar, T. S., Jr.; Hammond, C. L.

    1976-01-01

    Observability of satellite state vectors, using only SST tracking data, was investigated by covariance analysis under a variety of satellite and station configurations. These results indicate very precarious observability in most short arc cases. The consequences of this are large variances on many state components, such as the downrange component of the relay satellite position. To illustrate the impact of observability problems, an example is given of two distinct satellite orbit pairs generating essentially the same data arc. The physical bases for unobservability are outlined and related to proposed TDRSS configurations. Results are relevant to any mission depending upon TDRSS to determine satellite state. The required mathematical analysis and the software used are described.
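
    As a rough illustration of the covariance-analysis idea described above (not the study's actual software), the following Python sketch forms the information matrix from hypothetical measurement partials and flags weak observability through large formal variances and a large condition number; all matrices and values are invented for illustration.

      import numpy as np

      def state_covariance(H, sigma_meas):
          """Covariance analysis sketch: given measurement partials H
          (rows = observations, cols = state components) and a per-observation
          noise sigma, return the formal state covariance (H^T W H)^-1."""
          W = np.eye(H.shape[0]) / sigma_meas**2      # diagonal weight matrix
          info = H.T @ W @ H                           # information (normal) matrix
          cond = np.linalg.cond(info)                  # near-singular => weak observability
          P = np.linalg.pinv(info)                     # pseudo-inverse guards against singularity
          return P, cond

      # Toy example: two state components, but both observations sense nearly
      # the same linear combination, so one direction is almost unobservable.
      H = np.array([[1.0, 0.99],
                    [1.0, 1.01]])
      P, cond = state_covariance(H, sigma_meas=0.1)
      print("condition number:", cond)
      print("state variances:", np.diag(P))            # large variance = precarious observability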

  18. Recommended HSE-7 documents hierarchy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, R.B.; Jennrich, E.A.; Lund, D.M.

    1990-12-12

    This report recommends a hierarchy of waste management documents at Los Alamos National Laboratory (LANL or "Laboratory"). The hierarchy addresses documents that are required to plan, implement, and document waste management programs at Los Alamos. These documents will enable the waste management group and the six sections contained within that group to satisfy requirements that are imposed upon them by the US Department of Energy (DOE), DOE Albuquerque Operations, US Environmental Protection Agency, various State of New Mexico agencies, and Laboratory management.

  19. Handling a Collection of PDF Documents

    EPA Pesticide Factsheets

    You have several options for making a large collection of PDF documents more accessible to your audience: avoid uploading altogether, use multiple document pages, and use document IDs as anchors for direct links within a document page.

  1. ANOPP programming and documentation standards document

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Standards defining the requirements for preparing software for the Aircraft Noise Prediction Program (ANOPP) are given. These standards are intended to provide definition, design, coding, and documentation criteria that achieve unity among ANOPP products. They apply to all of ANOPP's standard software system and encompass philosophy as well as techniques and conventions.

  2. Issues in Commercial Document Delivery.

    ERIC Educational Resources Information Center

    Marcinko, Randall Wayne

    1997-01-01

    Discusses (1) the history of document delivery; (2) the delivery process--end-user request, intermediary request, vendor reference, citation verification, obtaining document and source relations, quality control, transferring document to client, customer service and status, invoicing and billing, research and development, and copyright; and (3)…

  3. Continued Analysis of EUVE Solar System Observations

    NASA Technical Reports Server (NTRS)

    Gladstone, G. Randall

    2001-01-01

    This is the final report for this project. We proposed to continue our work on extracting important results from the EUVE (Extreme UltraViolet Explorer) archive of lunar and jovian system observations. In particular, we planned to: (1) produce several monochromatic images of the Moon at the wavelengths of the brightest solar EUV emission lines; (2) search for evidence of soft X-ray emissions from the Moon and/or X-ray fluorescence at specific EUV wavelengths; (3) search for localized EUV and soft X-ray emissions associated with each of the Galilean satellites; (4) search for correlations between localized Io Plasma Torus (IPT) brightness and volcanic activity on Io; (5) search for soft X-ray emissions from Jupiter; and (6) determine the long term variability of He 58.4 nm emissions from Jupiter, and relate these to solar variability. However, the ADP review panel suggested that the work concentrate on the Jupiter/IPT observations, and provided half the requested funding. Thus we have performed no work on the first two tasks, and instead concentrated on the last three. In addition we used funds from this project to support reduction and analysis of EUVE observations of Venus. While this was not part of the original statement of work, it is entirely in keeping with extracting important results from EUVE solar system observations.

  4. NHEXAS PHASE I MARYLAND STUDY--LIST OF AVAILABLE DOCUMENTS: PROTOCOLS AND SOPS

    EPA Science Inventory

    This document lists available protocols and SOPs for the NHEXAS Phase I Maryland study. It identifies protocols and SOPs for the following study components: (1) Sample collection and field operations, (2) Sample analysis and general laboratory procedures, (3) Data Analysis Proced...

  5. Documentation and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel; Moseley, Warren

    1990-01-01

    Traditional approaches to knowledge acquisition have focused on interviews. An alternative focuses on the documentation associated with a domain. Adopting a documentation approach provides some advantages during familiarization. A knowledge management tool was constructed to gain these advantages.

  6. TOF-SIMS Analysis of Red Color Inks of Writing and Printing Tools on Questioned Documents.

    PubMed

    Lee, Jihye; Nam, Yun Sik; Min, Jisook; Lee, Kang-Bong; Lee, Yeonhee

    2016-05-01

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) is a well-established surface technique that provides both elemental and molecular information from several monolayers of a sample surface while also allowing depth profiling or image mapping to be performed. Static TOF-SIMS with improved performance has expanded the application of TOF-SIMS to the study of a variety of organic, polymeric, biological, archaeological, and forensic materials. In forensic investigation, the use of a minimal sample for the analysis is preferable. Although the TOF-SIMS technique is destructive, the probing beams have microsized diameters, so that only a small portion of the questioned sample is necessary for the analysis, leaving the rest available for other analyses. In this study, TOF-SIMS and attenuated total reflectance Fourier transform infrared (ATR-FTIR) spectroscopy were applied to the analysis of several different pen inks, red sealing inks, and printed patterns on paper. The overlapping areas of ballpoint pen writing, red seal stamping, and laser printing in a document were investigated to identify the sequence of recording. The sequence relations for various cases were determined from the TOF-SIMS mapping image and the depth profile. TOF-SIMS images were also used to investigate numbers or characters altered with two different red pens. TOF-SIMS was successfully used to determine the sequence of intersecting lines and the forged numbers on the paper. © 2016 American Academy of Forensic Sciences.

  7. Grants Document-Generation System

    NASA Technical Reports Server (NTRS)

    Hairell, Terri; Kreymer, Lev; Martin, Greg; Sheridan, Patrick

    2008-01-01

    The Grants Document-Generation System (GDGS) software allows the generation of official grants documents for distribution to the appropriate parties. The documents are created after the selection and entry of specific data elements and clauses. GDGS is written in ColdFusion, resides on an SQL Server 2000 database, and is housed on-site at Goddard Space Flight Center (GSFC). It includes access security built around GSFC's LIST system and allows for the entry of the Procurement Request information necessary to generate the resulting Grant Award.

  8. PEM Electrolysis H2A Production Case Study Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, Brian; Colella, Whitney; Moton, Jennie

    2013-12-31

    This report documents the development of four DOE Hydrogen Analysis (H2A) case studies for polymer electrolyte membrane (PEM) electrolysis. The four cases characterize PEM electrolyzer technology for two hydrogen production plant sizes (Forecourt and Central) and for two technology development time horizons (Current and Future).

  9. Computer integrated documentation

    NASA Technical Reports Server (NTRS)

    Boy, Guy

    1991-01-01

    The main technical issues of the Computer Integrated Documentation (CID) project are presented. The problem of automating document management and maintenance is analyzed both from an artificial intelligence viewpoint and from a human factors viewpoint. Possible technologies for CID are reviewed: conventional approaches to indexing and information retrieval; hypertext; and knowledge based systems. A particular effort was made to provide an appropriate representation for contextual knowledge. This representation is used to generate context on hypertext links. Thus, indexing in CID is context sensitive. The implementation of the current version of CID is described. It includes a hypertext data base, a knowledge based management and maintenance system, and a user interface. A series of theoretical considerations is also presented, including navigation in hyperspace, acquisition of indexing knowledge, generation and maintenance of large documentation sets, and the relation to other work.

  10. Soil Moisture Active Passive (SMAP) Mission Level 4 Carbon (L4_C) Product Specification Document

    NASA Technical Reports Server (NTRS)

    Glassy, Joe; Kimball, John S.; Jones, Lucas; Reichle, Rolf H.; Ardizzone, Joseph V.; Kim, Gi-Kong; Lucchesi, Robert A.; Smith, Edmond B.; Weiss, Barry H.

    2015-01-01

    This is the Product Specification Document (PSD) for Level 4 Surface and Root Zone Soil Moisture (L4_SM) data for the Science Data System (SDS) of the Soil Moisture Active Passive (SMAP) project. The L4_SM data product provides estimates of land surface conditions based on the assimilation of SMAP observations into a customized version of the NASA Goddard Earth Observing System, Version 5 (GEOS-5) land data assimilation system (LDAS). This document applies to any standard L4_SM data product generated by the SMAP Project.

  11. Documentation: Motivation and training or automation

    NASA Technical Reports Server (NTRS)

    Mouton, M. L.

    1970-01-01

    The road blocks and mental blocks in areas where automation is not taking care of basic documentation problems are discussed. Original project documentation, documentation for project maintenance, and comparison of preliminary and final documentation are described. The use of flow charts is also mentioned.

  12. 40 CFR 1508.10 - Environmental document.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Environmental document. 1508.10 Section 1508.10 Protection of Environment COUNCIL ON ENVIRONMENTAL QUALITY TERMINOLOGY AND INDEX § 1508.10 Environmental document. Environmental document includes the documents specified in § 1508.9 (environmental...

  13. 40 CFR 1508.10 - Environmental document.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Environmental document. 1508.10 Section 1508.10 Protection of Environment COUNCIL ON ENVIRONMENTAL QUALITY TERMINOLOGY AND INDEX § 1508.10 Environmental document. Environmental document includes the documents specified in § 1508.9 (environmental...

  14. Introduction to United States Public Documents.

    ERIC Educational Resources Information Center

    Morehead, Joseph

    This textbook, designed for use in library school government documents courses, provides an overview of the functions and characteristics of United States public documents. Chapters cover the Government Printing Office, the Superintendent of Documents, the depository library system, administration of documents collections, general guides to…

  15. Computerized Observation System (COS) for Field Experiences.

    ERIC Educational Resources Information Center

    Reed, Thomas M.; And Others

    The Computerized Observation System (COS) is a software program which an observer can use with a portable microcomputer to document preservice and inservice teacher performance. Specific observable behaviors, such as appropriate questions and responses shown to increase student achievement, are recorded as Low Inference Observation Measures. Time on…

  16. [The Breast Unit in the European and national policy documents: similarities and differences].

    PubMed

    Marcon, Anna; Albertini, Giovanna; Di Gregori, Valentina; Ghirarduzzi, Angelo; Fantini, Maria Pia

    2013-11-01

    The aim of this study is to assess differences and similarities in official European and Italian Ministry of Health policy documents referring to the subject "Breast Unit". The T-Lab software package for textual analysis was used to analyze the documents. This instrument permits the identification of the most frequently used words and the semantic network associated with "Breast Unit". Results show that the European document gives more emphasis to the concept of "integrated care", delivered by a multi-professional team that meets the clinical, psychological and informational needs of the patient. The Italian document gives more prominence to themes related to the clinical content of the interventions and to managerial aspects through the use of clinical guidelines.

  17. Deferred modification of antiretroviral regimen following documented treatment failure in Asia: results from the TREAT Asia HIV Observational Database (TAHOD)

    PubMed Central

    Zhou, J; Li, PCK; Kumarasamy, N; Boyd, M; Chen, YMA; Sirisanthana, T; Sungkanuparph, S; Oka, S; Tau, G; Phanuphak, P; Saphonn, V; Zhang, FJ; Omar, SFS; Lee, CKC; Ditangco, R; Merati, TP; Lim, PL; Choi, JY; Law, MG; Pujari, S

    2010-01-01

    Objective The aim of the study was to examine the rates and predictors of treatment modification following combination antiretroviral therapy (cART) failure in Asian patients with HIV enrolled in the TREAT Asia HIV Observational Database (TAHOD). Methods Treatment failure (immunological, virological and clinical) was defined by World Health Organization criteria. Countries were categorized as high or low income by World Bank criteria. Results Among 2446 patients who initiated cART, 447 were documented to have developed treatment failure over 5697 person-years (7.8 per 100 person-years). A total of 253 patients changed at least one drug after failure (51.6 per 100 person-years). There was no difference between patients from high- and low-income countries [adjusted hazard ratio (HR) 1.02; P = 0.891]. Advanced disease stage [Centers for Disease Control and Prevention (CDC) category C vs. A; adjusted HR 1.38, P = 0.040], a lower CD4 count (≥ 51 cells/μL vs. ≤ 50 cells/μL; adjusted HR 0.61, P = 0.022) and a higher HIV viral load (≥ 400 HIV-1 RNA copies/mL vs. < 400 copies/mL; adjusted HR 2.69, P < 0.001) were associated with a higher rate of treatment modification after failure. Compared with patients from low-income countries, patients from high-income countries were more likely to change two or more drugs (67% vs. 49%; P = 0.009) and to change to a protease-inhibitor-containing regimen (48% vs. 16%; P< 0.001). Conclusions In a cohort of Asian patients with HIV infection, nearly half remained on the failing regimen in the first year following documented treatment failure. This deferred modification is likely to have negative implications for accumulation of drug resistance and response to second-line treatment. There is a need to scale up the availability of second-line regimens and virological monitoring in this region. PMID:19601993
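
    As an illustration of how adjusted hazard ratios of the kind reported above are typically estimated (this is not the TAHOD analysis code), the following Python sketch fits a Cox proportional hazards model using the third-party lifelines package; the file name and column names are hypothetical.

      # Minimal sketch: adjusted hazard ratios for time to treatment modification.
      import pandas as pd
      from lifelines import CoxPHFitter

      df = pd.read_csv("tahod_failures.csv")   # hypothetical extract, one row per patient
      # Assumed columns: time_to_modification (years), modified (0/1),
      # cdc_stage_c (0/1), cd4_gt_50 (0/1), vl_ge_400 (0/1), high_income (0/1)

      cph = CoxPHFitter()
      cph.fit(df[["time_to_modification", "modified", "cdc_stage_c",
                  "cd4_gt_50", "vl_ge_400", "high_income"]],
              duration_col="time_to_modification", event_col="modified")
      cph.print_summary()   # the exp(coef) column gives the adjusted hazard ratios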

  18. Web-based document image processing

    NASA Astrophysics Data System (ADS)

    Walker, Frank L.; Thoma, George R.

    1999-12-01

    Increasing numbers of research libraries are turning to the Internet for electronic interlibrary loan and for document delivery to patrons. This has been made possible through the widespread adoption of software such as Ariel and DocView. Ariel, a product of the Research Libraries Group, converts paper-based documents to monochrome bitmapped images, and delivers them over the Internet. The National Library of Medicine's DocView is primarily designed for library patrons. While libraries and their patrons are beginning to reap the benefits of this new technology, barriers exist, e.g., differences in image file format, that lead to difficulties in the use of library document information. To research how to overcome such barriers, the Communications Engineering Branch of the Lister Hill National Center for Biomedical Communications, an R and D division of NLM, has developed a web site called the DocMorph Server. This is part of an ongoing intramural R and D program in document imaging that has spanned many aspects of electronic document conversion and preservation, Internet document transmission and document usage. The DocMorph Server Web site is designed to fill two roles. First, in a role that will benefit both libraries and their patrons, it allows Internet users to upload scanned image files for conversion to alternative formats, thereby enabling wider delivery and easier usage of library document information. Second, the DocMorph Server provides the design team an active test bed for evaluating the effectiveness and utility of new document image processing algorithms and functions, so that they may be evaluated for possible inclusion in other image processing software products being developed at NLM or elsewhere. This paper describes the design of the prototype DocMorph Server and the image processing functions being implemented on it.

  19. [Law courts and clinical documentation].

    PubMed

    Jiménez Carnicero, M P; Magallón, A I; Gordillo, A

    2006-01-01

    Background. Until 2004, requests for clinical documentation proceeding from the Judicial Administration on Specialist Care of Pamplona were received in six different centres and were processed independently, with different procedures, and documents were even sent in duplicate, with the resulting work load. This article describes the procedure for processing requests for documentation proceeding from the Law Courts and analyses the requests received. Methods. A circuit was set up to channel the judicial requests that arrived at the Specialist Health Care Centres of Pamplona and at the Juridical Regime Service of the Health System of Navarra-Osasunbidea, and a Higher Technician in Health Documentation was contracted to centralise these requests. A proceedings protocol was established to unify criteria and speed up the process, and a database was designed to register the proceedings. Results. In the course of 2004, 210 requests for documentation by legal requirement were received. Of these, 24 were claims of patrimonial responsibility and 13 were requested by lawyers with the patient's authorisation. The most frequent jurisdictional order was penal (43.33%). Ninety-three point one five percent (93.15%) of the requests proceeded from law courts in the autonomous community of Navarra. The centre that received the greatest number of requests was the "Príncipe de Viana" Consultation Centre (33.73%).The most frequently requested documentation was a copy of reports (109) and a copy of the complete clinical record (39). On two occasions the original clinical record was required. The average time of response was 6.6 days. Conclusions. The centralisation of administration has brought greater agility to the process and homogeneity in the criteria of processing. Less time is involved in preparing and dispatching the documentation, the dispatch of duplicate documents is avoided, the work load has been reduced and the dispersal of documentation is avoided, a situation that

  20. Document Ranking Based upon Markov Chains.

    ERIC Educational Resources Information Center

    Danilowicz, Czeslaw; Balinski, Jaroslaw

    2001-01-01

    Considers how the order of documents in information retrieval responses is determined and introduces a method that uses a probabilistic model of a document set where documents are regarded as states of a Markov chain and where transition probabilities are directly proportional to similarities between documents. (Author/LRW)
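
    One plausible reading of this scheme, sketched below in Python purely for illustration (it is not the authors' implementation), is to row-normalize a document similarity matrix into a transition matrix and rank documents by the stationary distribution of the resulting Markov chain.

      import numpy as np

      def markov_rank(similarity):
          """Rank documents modeled as states of a Markov chain whose transition
          probabilities are proportional to pairwise similarities."""
          S = np.array(similarity, dtype=float)
          np.fill_diagonal(S, 0.0)                      # no self-transitions
          P = S / S.sum(axis=1, keepdims=True)          # row-normalize into a transition matrix
          pi = np.full(P.shape[0], 1.0 / P.shape[0])    # start from the uniform distribution
          for _ in range(1000):                         # power iteration to the stationary distribution
              pi = pi @ P
          return np.argsort(-pi), pi                    # documents sorted by stationary probability

      # Toy similarity matrix for four documents (values are illustrative only).
      sim = [[1.0, 0.8, 0.1, 0.2],
             [0.8, 1.0, 0.2, 0.1],
             [0.1, 0.2, 1.0, 0.6],
             [0.2, 0.1, 0.6, 1.0]]
      order, scores = markov_rank(sim)
      print(order, scores)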

  1. 29 CFR 16.203 - Documentation of fees and expenses.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... documentation of the fees and expenses, including the cost of any study, analysis, engineering report, test... each professional firm or individual whose services are covered by the application, showing the hours... date, number of hours per date and the services performed during those hours. In order to establish the...

  2. 29 CFR 16.203 - Documentation of fees and expenses.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... documentation of the fees and expenses, including the cost of any study, analysis, engineering report, test... each professional firm or individual whose services are covered by the application, showing the hours... date, number of hours per date and the services performed during those hours. In order to establish the...

  3. Evaluation of the assessment and documentation of chronic wounds in residential social care in the Czech Republic.

    PubMed

    Saibertová, S; Pokorná, A

    2016-11-02

    Accurate evaluation of non-healing, chronic wounds, followed by the selection of an appropriate therapeutic strategy, is fundamental to health-care management. Assessment of non-healing chronic wounds in clinical practice in the Czech Republic is not standardised in acute care settings or in residential social care facilities. The aim of the study was to analyse the methods being used to assess non-healing, chronic wounds in residential social services in the Czech Republic, where more patients with chronic wounds are present because of the increasing incidence of wounds in old age. The research was carried out at 66 residential social care institutions across all regions of the Czech Republic. A mixed model was used for the research (participatory observation including creation of field notes, and content analysis of documents, for documentation and analysis of qualitative and quantitative data). The same methodology was used in previous work carried out in acute care settings in 2013. The results of this research corroborate the inconsistencies in procedures used by general nurses for assessment of non-healing, chronic wounds. However, the situation was found to be more positive with regard to the evaluation of basic/fundamental parameters of a wound (e.g. size, depth and location of the wound) compared with the evaluation of more specific parameters (e.g. exudate or signs of infection). This included not only the number of observed variables, but also the action taken. Both were improved when a consultant for wound healing was present. An effective strategy for wound management depends on the method and scope of the assessment of non-healing, chronic wounds in place in clinical practice in the observed facilities; improvement may be expected following the general introduction of a 'non-healing, chronic wound assessment' algorithm.

  4. Gaia DR2 documentation Chapter 7: Variability

    NASA Astrophysics Data System (ADS)

    Eyer, L.; Guy, L.; Distefano, E.; Clementini, G.; Mowlavi, N.; Rimoldini, L.; Roelens, M.; Audard, M.; Holl, B.; Lanzafame, A.; Lebzelter, T.; Lecoeur-Taïbi, I.; Molnár, L.; Ripepi, V.; Sarro, L.; Jevardat de Fombelle, G.; Nienartowicz, K.; De Ridder, J.; Juhász, Á.; Molinaro, R.; Plachy, E.; Regibo, S.

    2018-04-01

    This chapter of the Gaia DR2 documentation describes the models and methods used on the 22 months of data to produce the Gaia variable star results for Gaia DR2. The variability processing and analysis was based mostly on the calibrated G and integrated BP and RP photometry. The variability analysis approach to the Gaia data has been described in Eyer et al. (2017), and the Gaia DR2 results are presented in Holl et al. (2018). Detailed methods on specific topics will be published in a number of separate articles. Variability behaviour in the colour magnitude diagram is presented in Gaia Collaboration et al. (2018c).
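
    The following Python sketch is not the DPAC variability pipeline; it only illustrates the kind of period search commonly applied to sparse photometric time series such as Gaia G-band light curves, using astropy's Lomb-Scargle periodogram on synthetic data.

      import numpy as np
      from astropy.timeseries import LombScargle

      rng = np.random.default_rng(0)
      t = np.sort(rng.uniform(0, 660, 70))              # ~22 months of sparse epochs (days)
      true_period = 3.7                                 # days (synthetic)
      mag = 15.0 + 0.3 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.02, t.size)

      # Lomb-Scargle periodogram over an automatically chosen frequency grid.
      frequency, power = LombScargle(t, mag).autopower(maximum_frequency=2.0)
      best_period = 1.0 / frequency[np.argmax(power)]
      print("recovered period (d):", round(best_period, 2))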

  5. 21 CFR 820.40 - Document controls.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Document controls. 820.40 Section 820.40 Food and... QUALITY SYSTEM REGULATION Document Controls § 820.40 Document controls. Each manufacturer shall establish and maintain procedures to control all documents that are required by this part. The procedures shall...

  6. 21 CFR 820.40 - Document controls.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Document controls. 820.40 Section 820.40 Food and... QUALITY SYSTEM REGULATION Document Controls § 820.40 Document controls. Each manufacturer shall establish and maintain procedures to control all documents that are required by this part. The procedures shall...

  7. 21 CFR 820.40 - Document controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Document controls. 820.40 Section 820.40 Food and... QUALITY SYSTEM REGULATION Document Controls § 820.40 Document controls. Each manufacturer shall establish and maintain procedures to control all documents that are required by this part. The procedures shall...

  8. 21 CFR 820.40 - Document controls.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Document controls. 820.40 Section 820.40 Food and... QUALITY SYSTEM REGULATION Document Controls § 820.40 Document controls. Each manufacturer shall establish and maintain procedures to control all documents that are required by this part. The procedures shall...

  9. 21 CFR 820.40 - Document controls.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Document controls. 820.40 Section 820.40 Food and... QUALITY SYSTEM REGULATION Document Controls § 820.40 Document controls. Each manufacturer shall establish and maintain procedures to control all documents that are required by this part. The procedures shall...

  10. Language Documentation in the Americas

    ERIC Educational Resources Information Center

    Franchetto, Bruna; Rice, Keren

    2014-01-01

    In the last decades, the documentation of endangered languages has advanced greatly in the Americas. In this paper we survey the role that international funding programs have played in advancing documentation in this part of the world, with a particular focus on the growth of documentation in Brazil, and we examine some of the major opportunities…

  11. Documenting Climate Models and Their Simulations

    DOE PAGES

    Guilyardi, Eric; Balaji, V.; Lawrence, Bryan; ...

    2013-05-01

    The results of climate models are of increasing and widespread importance. No longer is climate model output of sole interest to climate scientists and researchers in the climate change impacts and adaptation fields. Now nonspecialists such as government officials, policy makers, and the general public all have an increasing need to access climate model output and understand its implications. For this host of users, accurate and complete metadata (i.e., information about how and why the data were produced) is required to document the climate modeling results. We describe a pilot community initiative to collect and make available documentation of climate models and their simulations. In an initial application, a metadata repository is being established to provide information of this kind for a major internationally coordinated modeling activity known as CMIP5 (Coupled Model Intercomparison Project, Phase 5). We expect that, for a wide range of stakeholders, this and similar community-managed metadata repositories will spur development of analysis tools that facilitate discovery and exploitation of Earth system simulations.

  12. Emergency Medicine Resident Physicians’ Perceptions of Electronic Documentation and Workflow

    PubMed Central

    Neri, P.M.; Redden, L.; Poole, S.; Pozner, C.N.; Horsky, J.; Raja, A.S.; Poon, E.; Schiff, G.

    2015-01-01

    Summary Objective To understand emergency department (ED) physicians’ use of electronic documentation in order to identify usability and workflow considerations for the design of future ED information system (EDIS) physician documentation modules. Methods We invited emergency medicine resident physicians to participate in a mixed methods study using task analysis and qualitative interviews. Participants completed a simulated, standardized patient encounter in a medical simulation center while documenting in the test environment of a currently used EDIS. We recorded the time on task, type and sequence of tasks performed by the participants (including tasks performed in parallel). We then conducted semi-structured interviews with each participant. We analyzed these qualitative data using the constant comparative method to generate themes. Results Eight resident physicians participated. The simulation session averaged 17 minutes and participants spent 11 minutes on average on tasks that included electronic documentation. Participants performed tasks in parallel, such as history taking and electronic documentation. Five of the 8 participants performed a similar workflow sequence during the first part of the session while the remaining three used different workflows. Three themes characterize electronic documentation: (1) physicians report that location and timing of documentation varies based on patient acuity and workload, (2) physicians report a need for features that support improved efficiency; and (3) physicians like viewing available patient data but struggle with integration of the EDIS with other information sources. Conclusion We confirmed that physicians spend much of their time on documentation (65%) during an ED patient visit. Further, we found that resident physicians did not all use the same workflow and approach even when presented with an identical standardized patient scenario. Future EHR design should consider these varied workflows while trying to

  13. Document retrieval on repetitive string collections.

    PubMed

    Gagie, Travis; Hartikainen, Aleksi; Karhu, Kalle; Kärkkäinen, Juha; Navarro, Gonzalo; Puglisi, Simon J; Sirén, Jouni

    2017-01-01

    Most of the fastest-growing string collections today are repetitive, that is, most of the constituent documents are similar to many others. As these collections keep growing, a key approach to handling them is to exploit their repetitiveness, which can reduce their space usage by orders of magnitude. We study the problem of indexing repetitive string collections in order to perform efficient document retrieval operations on them. Document retrieval problems are routinely solved by search engines on large natural language collections, but the techniques are less developed on generic string collections. The case of repetitive string collections is even less understood, and there are very few existing solutions. We develop two novel ideas, interleaved LCPs and precomputed document lists, that yield highly compressed indexes solving the problem of document listing (find all the documents where a string appears), top-k document retrieval (find the k documents where a string appears most often), and document counting (count the number of documents where a string appears). We also show that a classical data structure supporting the latter query becomes highly compressible on repetitive data. Finally, we show how the tools we developed can be combined to solve ranked conjunctive and disjunctive multi-term queries under the simple [Formula: see text] model of relevance. We thoroughly evaluate the resulting techniques in various real-life repetitiveness scenarios, and recommend the best choices for each case.
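
    The compressed structures introduced in the paper (interleaved LCPs, precomputed document lists) are beyond a short sketch; the Python toy below only illustrates the document listing and counting queries themselves, using a naive suffix array over the concatenated collection.

      import bisect

      def build_index(docs, sep="\x00"):
          """Naive baseline (not the paper's compressed indexes): concatenate the
          documents, build a suffix array, and record which document owns each position."""
          text = sep.join(docs) + sep
          doc_of, d = [], 0
          for ch in text:
              doc_of.append(d)
              if ch == sep:
                  d += 1
          sa = sorted(range(len(text)), key=lambda i: text[i:])   # quadratic toy suffix array
          return text, sa, doc_of

      def list_documents(text, sa, doc_of, pattern):
          """Document listing: ids of all documents where `pattern` occurs."""
          prefixes = [text[i:i + len(pattern)] for i in sa]        # sorted along with sa
          lo = bisect.bisect_left(prefixes, pattern)
          hi = bisect.bisect_right(prefixes, pattern)
          return sorted({doc_of[sa[k]] for k in range(lo, hi)})

      docs = ["abracadabra", "abracadabro", "banana"]              # repetitive toy collection
      text, sa, doc_of = build_index(docs)
      print(list_documents(text, sa, doc_of, "cad"))               # document listing -> [0, 1]
      print(len(list_documents(text, sa, doc_of, "ana")))          # document counting -> 1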

  14. Integration of scanned document management with the anatomic pathology laboratory information system: analysis of benefits.

    PubMed

    Schmidt, Rodney A; Simmons, Kim; Grimm, Erin E; Middlebrooks, Michael; Changchien, Rosy

    2006-11-01

    Electronic document management systems (EDMSs) have the potential to improve the efficiency of anatomic pathology laboratories. We implemented a novel but simple EDMS for scanned documents as part of our laboratory information system (AP-LIS) and collected cost-benefit data with the intention of discerning the value of such a system in general and whether integration with the AP-LIS is advantageous. We found that the direct financial benefits are modest but the indirect and intangible benefits are large. Benefits of time savings and access to data particularly accrued to pathologists and residents (3.8 h/d saved for 26 pathologists and residents). Integrating the scanned document management system (SDMS) into the AP-LIS has major advantages in terms of workflow and overall simplicity. This simple, integrated SDMS is an excellent value in a practice like ours, and many of the benefits likely apply in other practice settings.

  15. A portable digital microphotography unit for rapid documentation of periungual nailfold capillary changes in autoimmune connective tissue diseases.

    PubMed

    Sontheimer, Richard D

    2004-03-01

    While employing a DermLite dermoscopy unit to assess pigment pattern networks in melanocytic skin lesions, it was observed that this compact, portable dermoscopy unit can also be used to quickly detect nailfold capillary changes when entertaining a diagnosis of autoimmune connective tissue diseases (CTD) such as dermatomyositis (DM), scleroderma/systemic sclerosis (SSc), or systemic lupus erythematosus. Aware that the suppliers of the DermLite dermoscopy unit also market a portable digital microphotography unit based on the DermLite optical principles for efficiently documenting cutaneous pigment network patterns, we investigated whether this unit (DermLite Foto flash unit attached to a Nikon Coolpix digital camera) might be used to photographically document nailfold capillary changes in patients with autoimmune CTD. A DermLite Foto flash unit attached to a Nikon Coolpix digital camera was used in a controlled observational study to obtain digital photographs of nailfold capillaries in a small sequential sample of patients with autoimmune CTD attending a rheumatic skin disease subspecialty clinic in an academic department of dermatology. The digital microphotography system proved to be highly useful in documenting the nailfold vascular changes observed in a small sample of patients with DM. We observed that the nailfold capillary changes seen in patients with clinically amyopathic DM were qualitatively and quantitatively similar to those seen in patients with classical DM. Digital microphotography systems designed for examining pigmented skin lesions can be used easily to document nailfold capillary changes often observed in DM and SSc. Nailfold capillary changes documented in this manner appear to be indistinguishable in clinically amyopathic DM and classical DM.

  16. 47 CFR 61.16 - Base documents.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 3 2012-10-01 2012-10-01 false Base documents. 61.16 Section 61.16... for Electronic Filing § 61.16 Base documents. (a) The Base Document is a complete tariff which incorporates all effective revisions, as of the last day of the preceding month. The Base Document should be...

  17. 47 CFR 61.16 - Base documents.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 3 2013-10-01 2013-10-01 false Base documents. 61.16 Section 61.16... for Electronic Filing § 61.16 Base documents. (a) The Base Document is a complete tariff which incorporates all effective revisions, as of the last day of the preceding month. The Base Document should be...

  18. 47 CFR 61.16 - Base documents.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 3 2011-10-01 2011-10-01 false Base documents. 61.16 Section 61.16... for Electronic Filing § 61.16 Base documents. (a) The Base Document is a complete tariff which incorporates all effective revisions, as of the last day of the preceding month. The Base Document should be...

  19. 47 CFR 61.16 - Base documents.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Base documents. 61.16 Section 61.16... for Electronic Filing § 61.16 Base documents. (a) The Base Document is a complete tariff which incorporates all effective revisions, as of the last day of the preceding month. The Base Document should be...

  20. Clustering XML Documents Using Frequent Subtrees

    NASA Astrophysics Data System (ADS)

    Kutty, Sangeetha; Tran, Tien; Nayak, Richi; Li, Yuefeng

    This paper presents an experimental study conducted over the INEX 2008 Document Mining Challenge corpus using both the structure and the content of XML documents for clustering them. The concise common substructures known as the closed frequent subtrees are generated using the structural information of the XML documents. The closed frequent subtrees are then used to extract the constrained content from the documents. A matrix containing the term distribution of the documents in the dataset is developed using the extracted constrained content. The k-way clustering algorithm is applied to the matrix to obtain the required clusters. In spite of the large number of documents in the INEX 2008 Wikipedia dataset, the proposed frequent subtree-based clustering approach was successful in clustering the documents. This approach significantly reduces the dimensionality of the terms used for clustering without much loss in accuracy.
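
    The sketch below covers only the final clustering step (the closed frequent subtree mining is assumed to have been done already); scikit-learn's KMeans stands in for the k-way clustering algorithm, and the "constrained content" strings are hypothetical.

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.cluster import KMeans

      # Hypothetical "constrained content": terms kept only if they occur under
      # nodes matched by a closed frequent subtree of the XML structure.
      constrained_content = [
          "title album artist genre rock",
          "title album artist genre jazz",
          "movie director cast year drama",
          "movie director cast year comedy",
      ]

      X = CountVectorizer().fit_transform(constrained_content)   # term-distribution matrix
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      print(labels)   # e.g. [0 0 1 1]: music-like vs. movie-like documents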

  1. VET in Schools Students: Characteristics and Post-School Employment and Training Experiences. Support Document

    ERIC Educational Resources Information Center

    Misko, Josie; Korbel, Patrick; Blomberg, Davinia

    2017-01-01

    This document was produced by the author based on their research for the report, "VET in Schools Students: Characteristics and Post-School Employment and Training Experiences," and is an added resource for further information. This support document presents the variables used and the findings of the supplementary analysis in the linked…

  2. Improved documentation of spectral lines for inductively coupled plasma emission spectrometry

    NASA Astrophysics Data System (ADS)

    Doidge, Peter S.

    2018-05-01

    An approach to improving the documentation of weak spectral lines falling near the prominent analytical lines used in inductively coupled plasma optical emission spectrometry (ICP-OES) is described. Measurements of ICP emission spectra in the regions around several hundred prominent lines, using concentrated solutions (up to 1% w/v) of some 70 elements, and comparison of the observed spectra with both recent published work and with the output of a computer program that allows calculation of transitions between the known energy levels, show that major improvements can be made in the coverage of spectral atlases for ICP-OES, with respect to "classical" line tables. It is argued that the atomic spectral data (wavelengths, energy levels) required for the reliable identification and documentation of a large majority of the weak interfering lines of the elements detectable by ICP-OES now exist, except for most of the observed lines of the lanthanide elements. In support of this argument, examples are provided from a detailed analysis of a spectral window centered on the prominent Pb II 220.353 nm line, and from a selected line-rich spectrum (W). Shortcomings in existing analyses are illustrated with reference to selected spectral interferences due to Zr. This approach has been used to expand the spectral-line library used in commercial ICP-ES instruments (Agilent 700-ES/5100-ES). The precision of wavelength measurements is evaluated in terms of the shot-noise limit, while the absolute accuracy of wavelength measurement is characterised through comparison with a small set of precise Ritz wavelengths for Sb I, and illustrated through the identification of Zr III lines; it is further shown that fractional-pixel absolute wavelength accuracies can be achieved. Finally, problems with the wavelengths and classifications of certain Au I lines are discussed.
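
    As a minimal illustration of the transition calculations mentioned above, the Python snippet below computes a Ritz vacuum wavelength from two level energies given in cm^-1 (conversion to air wavelengths via a standard dispersion formula is omitted); the level values used are illustrative, not authoritative.

      def ritz_vacuum_wavelength_nm(E_upper_cm1, E_lower_cm1):
          """Ritz wavelength of a transition between two known energy levels.
          Energies are in cm^-1; the result is the vacuum wavelength in nm."""
          delta = E_upper_cm1 - E_lower_cm1     # wavenumber of the transition, cm^-1
          return 1.0e7 / delta                  # 1e7 nm.cm^-1 converts wavenumber to wavelength

      # Example with illustrative (not authoritative) level values:
      print(round(ritz_vacuum_wavelength_nm(45000.0, 0.0), 3), "nm")   # -> 222.222 nm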

  3. [Support of the nursing process through electronic nursing documentation systems (UEPD) – Initial validation of an instrument].

    PubMed

    Hediger, Hannele; Müller-Staub, Maria; Petry, Heidi

    2016-01-01

    Electronic nursing documentation systems with standardized nursing terminology are IT-based systems for recording the nursing process. These systems have the potential to improve the documentation of the nursing process and to support nurses in care delivery. This article describes the development and initial validation of an instrument (known by its German acronym UEPD) to measure the subjectively perceived benefits of an electronic nursing documentation system in care delivery. The validity of the UEPD was examined by means of an evaluation study carried out in an acute care hospital (n = 94 nurses) in German-speaking Switzerland. Construct validity was analyzed by principal components analysis. Initial evidence of the validity of the UEPD was obtained. The analysis showed a stable four-factor model (FS = 0.89) comprising 25 items. All factors loaded ≥ 0.50 and the scales demonstrated high internal consistency (Cronbach's α = 0.73 – 0.90). Principal component analysis revealed four dimensions of support: establishing nursing diagnoses and goals; recording a case history/assessment and documenting the nursing process; implementation and evaluation; and information exchange. Further testing with larger control samples and with different electronic documentation systems is needed. Another potential direction would be to employ the UEPD in a comparison of various electronic documentation systems.
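
    The sketch below reproduces the two statistics reported (principal components and Cronbach's alpha) on a hypothetical item-response matrix; it is illustrative only and not the authors' analysis.

      import numpy as np
      from sklearn.decomposition import PCA

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents x n_items) response matrix."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      # Hypothetical Likert responses (rows = nurses, columns = UEPD items).
      rng = np.random.default_rng(1)
      latent = rng.normal(size=(94, 1))
      X = np.clip(np.round(3 + latent + rng.normal(scale=0.7, size=(94, 6))), 1, 5)

      print("alpha =", round(cronbach_alpha(X), 2))

      # Principal components of the standardized items.
      Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
      pca = PCA(n_components=2).fit(Xs)
      print("explained variance ratios:", pca.explained_variance_ratio_.round(2))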

  4. Electronic Document Supply Systems.

    ERIC Educational Resources Information Center

    Cawkell, A. E.

    1991-01-01

    Describes electronic document delivery systems used by libraries and document image processing systems used for business purposes. Topics discussed include technical specifications; analogue read-only laser videodiscs; compact discs and CD-ROM; WORM; facsimile; ADONIS (Article Delivery over Network Information System); DOCDEL; and systems at the…

  5. Document Design: Part 1.

    ERIC Educational Resources Information Center

    Andrews, Deborah C., Ed.; Dyrud, Marilyn, Ed.

    1996-01-01

    Presents four articles that provide suggestions for teaching document design: (1) "Teaching the Rhetoric of Document Design" (Michael J. Hassett); (2) "Teaching by Example: Suggestions for Assignment Design" (Marilyn A. Dyrud); (3) "Teaching the Page as a Visual Unit" (Bill Hart-Davidson); and (4) "Designing a…

  6. ENDF/B summary documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinsey, R.

    1979-07-01

    This publication provides a localized source of descriptions for the evaluations contained in the ENDF/B Library. The summary documentation presented is intended to be a more detailed description than the (File 1) comments contained in the computer readable data files, but not so detailed as the formal reports describing each ENDF/B evaluation. The summary documentation was written by the CSEWG (Cross Section Evaluation Working Group) evaluators and compiled by the NNDC (National Nuclear Data Center). This edition includes documentation for materials found on ENDF/B Version V tapes 501 to 516 (General Purpose File), excluding tape 504. ENDF/B-V also includes tapes containing partial evaluations for the Special Purpose Actinide (521, 522), Dosimetry (531), Activation (532), Gas Production (533), and Fission Product (541-546) files. The materials found on these tapes are documented elsewhere. Some of the evaluation descriptions in this report contain cross sections or energy level information. (RWR)

  7. SureChEMBL: a large-scale, chemically annotated patent document database.

    PubMed

    Papadatos, George; Davies, Mark; Dedman, Nathan; Chambers, Jon; Gaulton, Anna; Siddle, James; Koks, Richard; Irvine, Sean A; Pettersson, Joe; Goncharoff, Nicko; Hersey, Anne; Overington, John P

    2016-01-04

    SureChEMBL is a publicly available large-scale resource containing compounds extracted from the full text, images and attachments of patent documents. The data are extracted from the patent literature according to an automated text and image-mining pipeline on a daily basis. SureChEMBL provides access to a previously unavailable, open and timely set of annotated compound-patent associations, complemented with sophisticated combined structure and keyword-based search capabilities against the compound repository and patent document corpus; given the wealth of knowledge hidden in patent documents, analysis of SureChEMBL data has immediate applications in drug discovery, medicinal chemistry and other commercial areas of chemical science. Currently, the database contains 17 million compounds extracted from 14 million patent documents. Access is available through a dedicated web-based interface and data downloads at: https://www.surechembl.org/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. Using Aoristic Analysis to Link Remote and Ground-Level Phenological Observations

    NASA Astrophysics Data System (ADS)

    Henebry, G. M.

    2013-12-01

    Phenology is about observing events in time and space. With the advent of publicly accessible geospatial datastreams and easy-to-use mapping software, specifying where an event occurs is much less of a challenge than it was just two decades ago. In contrast, specifying when an event occurs remains a nontrivial function of a population of organismal responses, sampling interval, compositing period, and reporting precision. I explore how aoristic analysis can be used to analyze spatiotemporal events for which the location is known to acceptable levels of precision but for which temporal coordinates are poorly specified or only partially bounded. Aoristic analysis was developed in the late 1990s in the field of quantitative criminology to leverage temporally imprecise geospatial data of crime reports. Here I demonstrate how aoristic analysis can be used to link remotely sensed observations of land surface phenology to ground-level observations of organismal phenophase transitions. Explicit representation of the windows of temporal uncertainty with aoristic weights enables cross-validation exercises and forecasting efforts to avoid false precision.
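
    A minimal Python sketch of the aoristic weighting idea, assuming day-of-year event windows and fixed-width bins such as a satellite compositing period (all values are invented):

      import numpy as np

      def aoristic_profile(windows, bin_edges):
          """Each event is known only to lie within [start, end]; it contributes
          weight to every bin proportionally to the overlap between its window
          and that bin. Summed weights give a temporal intensity profile."""
          weights = np.zeros(len(bin_edges) - 1)
          for start, end in windows:
              span = max(end - start, 1e-9)
              for b in range(len(bin_edges) - 1):
                  overlap = max(0.0, min(end, bin_edges[b + 1]) - max(start, bin_edges[b]))
                  weights[b] += overlap / span
          return weights

      # Three phenophase reports with uncertain timing (day-of-year windows),
      # accumulated onto 16-day bins such as a satellite compositing period.
      events = [(100, 116), (105, 135), (120, 124)]
      edges = np.arange(96, 161, 16)
      print(aoristic_profile(events, edges))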

  9. Software design and documentation language

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1977-01-01

    A communications medium to support the design and documentation of complex software applications is studied. The medium provides the following: (1) a processor which can convert design specifications into an intelligible, informative, machine-reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) a methodology for effective use of the language and processor.

  10. Customer Communication Document

    NASA Technical Reports Server (NTRS)

    2009-01-01

    This procedure communicates to the Customers of the Automation, Robotics and Simulation Division (AR&SD) Dynamics Systems Test Branch (DSTB) how to obtain services of the Six-Degrees-Of-Freedom Dynamic Test System (SDTS). The scope includes the major communication documents between the SDTS and its Customer. It establishes the initial communication and contact points and provides the initial documentation in electronic media for the customer. Contact the SDTS Manager (SM) for the names and numbers of the current contact points.

  11. High-level waste tank farm set point document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony, J.A. III

    1995-01-15

    Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements need to be established for assuring that setpoints are established and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREA listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18), which includes the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope.

  12. Photometric Observations and Analysis of 1082 Pirola

    NASA Astrophysics Data System (ADS)

    Baker, Ronald E.; Pilcher, Frederick; Benishek, Vladimir

    2011-04-01

    CCD observations of the main-belt asteroid 1082 Pirola were recorded during the period 2010 October to 2011 January. Analysis of the lightcurve found a synodic period of P = 15.8525 ± 0.0005 h and amplitude A = 0.53 ± 0.01 mag. The phase curve referenced to mean magnitude suggests the absolute magnitude and phase slope parameter: H = 10.507 ± 0.014 mag; G = 0.080 ± 0.016. The phase curve referenced to maximum light suggests: H = 10.320 ± 0.013 mag; G = 0.107 ± 0.016.
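
    The quoted H and G values are parameters of the standard IAU (H, G) magnitude system; the sketch below shows the usual reduced-magnitude model (the Bowell et al. 1989 form with the common exponential approximation to the basis functions), not the authors' actual fitting code.

      import numpy as np

      def reduced_magnitude(alpha_deg, H, G):
          """Standard (H, G) phase law with the usual exponential approximation
          to the basis functions; this underlies fits such as H = 10.507, G = 0.080."""
          a = np.radians(alpha_deg)
          phi1 = np.exp(-3.33 * np.tan(a / 2.0) ** 0.63)
          phi2 = np.exp(-1.87 * np.tan(a / 2.0) ** 1.22)
          return H - 2.5 * np.log10((1.0 - G) * phi1 + G * phi2)

      # Predicted reduced magnitudes of 1082 Pirola at a few phase angles (deg):
      for alpha in (2.0, 10.0, 20.0):
          print(alpha, round(reduced_magnitude(alpha, H=10.507, G=0.080), 3))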

  13. Documentation of the GLAS fourth order general circulation model. Volume 1: Model documentation

    NASA Technical Reports Server (NTRS)

    Kalnay, E.; Balgovind, R.; Chao, W.; Edelmann, J.; Pfaendtner, J.; Takacs, L.; Takano, K.

    1983-01-01

    Volume 1 of a three-volume technical memorandum documenting the GLAS Fourth Order General Circulation Model is presented. Volume 1 contains the model documentation, a description of the stratospheric/tropospheric extension, a user's guide, climatological boundary data, and some climate simulation studies.

  14. NILDE, Network Inter Library Document Exchange: An Italian Document Delivery System

    NASA Astrophysics Data System (ADS)

    Brunetti, F.; Gasperini, A.; Mangiaracina, S.

    2007-10-01

    This poster presents NILDE, a document delivery system supporting the exchange of documents via the internet. The system has been set up by the Central Library of the National Research Council of Bologna (Italy) in order to make use of new internet technology, to promote cooperation between Italian university libraries and research libraries, and to achieve quick response times in satisfying document delivery (DD) requests. The Arcetri Astrophysical Observatory Library was the first astronomical library to join the NILDE project, which it did from its earliest days in 2002. There were many reasons for this choice: automation of the DD processes, security and reliability of the network, creation of usage statistics and reports, reduction of DD system management costs, and so on. This work describes the benefits of NILDE and discusses the role of an organized document delivery system as an important tool to cope with the difficult constraints of the publishing market.

  15. IDC System Specification Document.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clifford, David J.

    2014-12-01

    This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Reengineering Phase 2 project. Revision history: Version V1.0, dated 12/2014, prepared by the IDC Reengineering Project Team (initial delivery), authorized by M. Harris.

  16. Unsupervised Word Spotting in Historical Handwritten Document Images using Document-oriented Local Features.

    PubMed

    Zagoris, Konstantinos; Pratikakis, Ioannis; Gatos, Basilis

    2017-05-03

    Word spotting strategies employed in historical handwritten documents face many challenges due to variation in the writing style and intense degradation. In this paper, a new method that permits effective word spotting in handwritten documents is presented. It relies upon document-oriented local features, which take into account information around representative keypoints, and a matching process that incorporates spatial context in a local proximity search, without using any training data. Experimental results on four historical handwritten datasets for two different scenarios (segmentation-based and segmentation-free) using standard evaluation measures show the improved performance achieved by the proposed methodology.
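
    The sketch below is only a generic illustration of keypoint matching with a spatial-proximity check, using stock OpenCV ORB features rather than the document-oriented local features proposed in the paper; the file names are hypothetical.

      import cv2
      import numpy as np

      # Hypothetical crops: a query word image and a full document page image.
      query = cv2.imread("query_word.png", cv2.IMREAD_GRAYSCALE)
      page = cv2.imread("document_page.png", cv2.IMREAD_GRAYSCALE)

      orb = cv2.ORB_create(nfeatures=2000)
      kq, dq = orb.detectAndCompute(query, None)
      kp, dp = orb.detectAndCompute(page, None)

      # Cross-checked Hamming matching of binary descriptors.
      matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(dq, dp)

      # Keep only matches whose page keypoints cluster in a local neighbourhood,
      # a crude stand-in for a local proximity search with spatial context.
      pts = np.array([kp[m.trainIdx].pt for m in matches])
      center = np.median(pts, axis=0)
      local = [m for m, p in zip(matches, pts) if np.linalg.norm(p - center) < 100]
      print(len(local), "spatially consistent matches around", center)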

  17. Challenges to nurses' efforts of retrieving, documenting, and communicating patient care information

    PubMed Central

    Yakel, Elizabeth; Dunn Lopez, Karen; Tschannen, Dana; Ford, Yvonne B

    2013-01-01

    Objective To examine information flow, a vital component of a patient's care and outcomes, in a sample of multiple hospital nursing units to uncover potential sources of error and opportunities for systematic improvement. Design This was a qualitative study of a sample of eight medical–surgical nursing units from four diverse hospitals in one US state. We conducted direct work observations of nursing staff's communication patterns for entire shifts (8 or 12 h) for a total of 200 h and gathered related documentation artifacts for analyses. Data were coded using qualitative content analysis procedures and then synthesized and organized thematically to characterize current practices. Results Three major themes emerged from the analyses, which represent serious vulnerabilities in the flow of patient care information during nurse hand-offs and to the entire interdisciplinary team across time and settings. The three themes are: (1) variation in nurse documentation and communication; (2) the absence of a centralized care overview in the patient's electronic health record, ie, easily accessible by the entire care team; and (3) rarity of interdisciplinary communication. Conclusion The care information flow vulnerabilities are a catalyst for multiple types of serious and undetectable clinical errors. We have two major recommendations to address the gaps: (1) to standardize the format, content, and words used to document core information, such as the plan of care, and make this easily accessible to all team members; (2) to conduct extensive usability testing to ensure that tools in the electronic health record help the disconnected interdisciplinary team members to maintain a shared understanding of the patient's plan. PMID:22822042

  18. Reimbursement for injury-induced medical expenses in Chinese social medical insurance schemes: A systematic analysis of legislative documents

    PubMed Central

    Gao, Yuyan; Li, Li; Schwebel, David C.; Ning, Peishan; Cheng, Peixia

    2018-01-01

    Social medical insurance schemes are crucial for realizing universal health coverage and health equity. The aim of this study was to investigate whether and how reimbursement for injury-induced medical expenses is addressed in Chinese legislative documents relevant to social medical insurance. We retrieved legislative documents from the China National Knowledge Infrastructure and the Lawyee databases. Four types of social medical insurance schemes were included: urban employee basic medical insurance, urban resident basic medical insurance, new rural cooperative medical system, and urban and rural resident medical insurance. Text analyses were conducted on all identified legislative documents. As a result, one national law and 1,037 local legislative documents were identified. 1,012 of the 1,038 documents provided for reimbursement. Of the 1,012 documents, 828 (82%) provided reimbursement only for injuries without a legally responsible person/party or not caused by self-harm, alcohol use, drug use, or other law violations, and 162 (16%) did not include any details concerning implementation. Furthermore, 760 (92%) of the 828 did not provide an exception clause applying to injuries when a responsible person/party could not be contacted or for situations when the injured person cannot obtain reimbursement from the responsible person/party. Thus, most Chinese legislative documents related to social medical insurance do not provide reimbursement for medical expenses from injuries having a legally responsible person/party or those caused by illegal behaviors. We argue that all injury-induced medical expenses should be covered by legislative documents related to social medical insurance in China, no matter what the cause of the injury. Further research is needed to explore the acceptability and feasibility of such policy changes. PMID:29543913

  19. Functional Vision Observation. Technical Assistance Paper.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee. Bureau of Education for Exceptional Students.

    Technical assistance is provided concerning documentation of functional vision loss for Florida students with visual impairments. The functional vision observation should obtain enough information for determination of special service eligibility. The observation is designed to supplement information on the medical eye examination, and is conducted…

  20. Documentation of daily sit-to-stands performed by community-dwelling adults.

    PubMed

    Bohannon, Richard W; Barreca, Susan R; Shove, Megan E; Lambert, Cynthia; Masters, Lisa M; Sigouin, Christopher S

    2008-01-01

    No information exists about how many sit-to-stands (STSs) are performed daily by community-dwelling adults. We, therefore, examined the feasibility of using a tally counter to document daily STSs, documented the number of daily STSs performed, and determined if the number of STSs was influenced by demographic or health variables. Ninety-eight community-dwelling adults (19-84 years) agreed to participate. After providing demographic and health information, subjects used a tally counter to document the number of STSs performed daily for 7 consecutive days. All but two subjects judged their counter-documented STS number to be accurate. Excluding data from these and two other subjects, the mean number of STSs for subjects was 42.8 to 49.3, depending on the day. The number was significantly higher on weekdays than weekends. No demographic or health variable was significantly related to the number of STSs in univariate or multivariate analysis. In conclusion, this study suggests that a tally counter may be a practical aid to documenting STS activity. The STS repetitions recorded by the counter in this study provide an estimate of the number of STSs that community-dwelling adults perform daily.

  1. Application of Laser Scanning for Creating Geological Documentation

    NASA Astrophysics Data System (ADS)

    Buczek, Michał; Paszek, Martyna; Szafarczyk, Anna

    2018-03-01

    Geological documentation is based on analyses obtained from boreholes, geological exposures, and geophysical methods. It consists of text and graphic documents containing drilling sections, vertical cross-sections through the deposit, and various types of maps. Surveying methods (such as LIDAR) can be applied to measure exposed rock layers, presented in appendices to the geological documentation. Laser scanning allows a complete profile of exposed surfaces to be obtained in a short time and with millimeter accuracy. The possibility of verifying an existing geological cross-section with laser scanning was tested on the example of the AGH experimental mine. The test field is built of rocks of different lithologies. Scans were taken from a single station under favorable measuring conditions. Analysis of the signal intensity allowed the point cloud to be divided into separate geological layers. The results were compared with the geological profiles of the measured object. The same approach was applied to data from the Vietnamese hard coal open pit mine Coc Sau. The thicknesses of the exposed coal bed deposits and gangue layers were determined from the obtained data (point cloud) in combination with photographs. The results were compared with the geological cross-section.
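    As a rough illustration of the intensity-based layer separation mentioned above, the sketch below splits a point cloud into layers by thresholding the normalized return intensity and measures a layer's extent along one axis. The threshold values, axis choice, and function names are assumptions for illustration, not the authors' workflow.

```python
# Minimal sketch, not the authors' processing chain: classify points into
# lithological layers by laser return intensity, then estimate layer thickness.
import numpy as np

def split_by_intensity(intensity, thresholds=(0.25, 0.55)):
    """intensity: (N,) return intensity normalized to 0..1.
    Returns an integer layer label per point (0, 1, 2, ...)."""
    return np.digitize(intensity, np.asarray(thresholds))

def layer_thickness(points, labels, label, axis=2):
    """Rough thickness of one layer as its extent along the chosen axis (default z).
    points: (N, 3) array of x, y, z coordinates."""
    layer = points[labels == label]
    return float(layer[:, axis].max() - layer[:, axis].min()) if len(layer) else 0.0
```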

  2. Observation model and parameter partials for the JPL geodetic (GPS) modeling software 'GPSOMC'

    NASA Technical Reports Server (NTRS)

    Sovers, O. J.

    1990-01-01

    The physical models employed in GPSOMC, the modeling module of the GIPSY software system developed at JPL for the analysis of geodetic Global Positioning Satellite (GPS) measurements, are described. Details of the various contributions to range and phase observables are given, as well as the partial derivatives of the observed quantities with respect to model parameters. A glossary of parameters is provided to enable persons doing data analysis to identify quantities with their counterparts in the computer programs. The present version is the second revision of the original document, which it supersedes. The modeling is expanded to provide the option of using Cartesian station coordinates; parameters for the time rates of change of universal time and polar motion are also introduced.
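    As a worked illustration of the relationship between an observable and its parameter partials, the sketch below computes a purely geometric range and its partial derivatives with respect to Cartesian station coordinates. Clock, troposphere, and relativistic contributions present in the actual GPSOMC models are omitted, and the function is an assumption for illustration, not part of the software.

```python
# Minimal sketch (not GPSOMC): geometric range observable and its partials with
# respect to the station's Cartesian coordinates. The partial of range w.r.t.
# the station position is minus the unit line-of-sight vector.
import numpy as np

def range_and_partials(station_xyz, satellite_xyz):
    los = np.asarray(satellite_xyz, float) - np.asarray(station_xyz, float)
    rho = float(np.linalg.norm(los))       # range observable (meters)
    d_rho_d_station = -los / rho           # partials w.r.t. station x, y, z
    return rho, d_rho_d_station

# Example: station on the x-axis at Earth's surface, satellite ~20,200 km above it
rho, partials = range_and_partials([6378137.0, 0.0, 0.0], [26560000.0, 0.0, 0.0])
```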

  3. Video document

    NASA Astrophysics Data System (ADS)

    Davies, Bob; Lienhart, Rainer W.; Yeo, Boon-Lock

    1999-08-01

    The metaphor of film and TV permeates the design of software to support video on the PC. Simply transplanting the non-interactive, sequential experience of film to the PC fails to exploit the virtues of the new context. Video on the PC should be interactive and non-sequential. This paper experiments with a variety of tools for using video on the PC that exploit the new context of the PC. Some features are more successful than others. Applications that use these tools are explored, including primarily the home video archive but also streaming video servers on the Internet. The ability to browse, edit, abstract and index large volumes of video content such as home video and corporate video is a problem without an appropriate solution in today's market. The current tools available are complex, unfriendly video editors, requiring hours of work to prepare a short home video, far more work than a typical home user can be expected to provide. Our proposed solution treats video like a text document, providing functionality similar to a text editor. Users can browse, interact, edit and compose one or more video sequences with the same ease and convenience as handling text documents. With this level of text-like composition, we call what is normally a sequential medium a 'video document'. An important component of the proposed solution is shot detection, the ability to detect when a shot started or stopped. When combined with a spreadsheet of key frames, the shots become a grid of pictures that can be manipulated and viewed in the same way that a spreadsheet can be edited. Multiple video documents may be viewed, joined, manipulated, and seamlessly played back. Abstracts of unedited video content can be produced automatically to create novel video content for export to other venues. Edited and raw video content can be published to the net or burned to a CD-ROM with a self-installing viewer for Windows 98 and Windows NT 4.0.
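    Shot detection, the component singled out above, can be sketched in a few lines. The example below is a generic histogram-difference detector, assuming OpenCV is available; the threshold and bin count are illustrative guesses rather than the paper's actual algorithm or parameters.

```python
# Illustrative shot-boundary detector: a cut is declared when consecutive frame
# histograms differ strongly (Bhattacharyya distance above a threshold).
import cv2

def detect_shots(video_path, threshold=0.5, bins=64):
    """Return frame indices at which a new shot appears to start."""
    cap = cv2.VideoCapture(video_path)
    boundaries, prev_hist, idx = [0], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [bins], [0, 256])
        cv2.normalize(hist, hist)
        if prev_hist is not None and \
           cv2.compareHist(prev_hist, hist, cv2.HISTCMP_BHATTACHARYYA) > threshold:
            boundaries.append(idx)
        prev_hist, idx = hist, idx + 1
    cap.release()
    return boundaries
```

    Each returned index could then seed one key frame for the spreadsheet-like grid of pictures described above.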

  4. Description and Documentation of the Dental School Dental Delivery System.

    ERIC Educational Resources Information Center

    Chase, Rosen and Wallace, Inc., Alexandria, VA.

    A study was undertaken to describe and document the dental school dental delivery system using an integrated systems approach. In late 1976 and early 1977, a team of systems analysts and dental consultants visited three dental schools to observe the delivery of dental services and patient flow and to interview administrative staff and faculty.…

  5. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  6. Time-motion analysis of clinical nursing documentation during implementation of an electronic operating room management system for ophthalmic surgery.

    PubMed

    Read-Brown, Sarah; Sanders, David S; Brown, Anna S; Yackel, Thomas R; Choi, Dongseok; Tu, Daniel C; Chiang, Michael F

    2013-01-01

    Efficiency and quality of documentation are critical in surgical settings because operating rooms are a major source of revenue, and because adverse events may have enormous consequences. Electronic health records (EHRs) have potential to impact surgical volume, quality, and documentation time. Ophthalmology is an ideal domain to examine these issues because procedures are high-throughput and demand efficient documentation. This time-motion study examines nursing documentation during implementation of an EHR operating room management system in an ophthalmology department. Key findings are: (1) EHR nursing documentation time was significantly worse during early implementation, but improved to a level near but slightly worse than paper baseline, (2) Mean documentation time varied significantly among nurses during early implementation, and (3) There was no decrease in operating room turnover time or surgical volume after implementation. These findings have important implications for ambulatory surgery departments planning EHR implementation, and for research in system design.

  7. Sri Dalada Maligawa - 3D-Scanning and Documentation of the Temple of the Sacred Tooth Relic at Kandy, Sri Lanka

    NASA Astrophysics Data System (ADS)

    Rahrig, M.; Luib, A.

    2017-08-01

    Sri Dalada Maligawa - the Temple of the Sacred Tooth Relic - is one of the most important pilgrimage sites in Buddhist culture. It is the main part of the UNESCO World Heritage Site Sacred City of Kandy. Since the end of the 17th century the temple has housed the sacred tooth of the Buddha. Until now, an accurate documentation of the temple with all its rich decorations has been missing. The temple is built in an area vulnerable to environmental factors such as earthquakes and monsoon rains, and it has been the target of terrorist attacks. To help preserve this important cultural heritage, a research project was carried out. The main part of the project was a 3D documentation of the entire temple using Terrestrial Laser Scanning (TLS) and the creation of CAD plans. In addition to the documentation of the architecture, several details were captured in high resolution by Structured Light Scanning (SLS). All data will become part of the digital archive of the temple and were used as a basis for general site monitoring, especially to observe cracks. Beyond the documentation itself, a transfer of knowledge was another aim of the project. In the future, most of the analysis of the scan data can be done by local specialists.
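    The crack-monitoring use of the scans can be illustrated with a simple cloud-to-cloud comparison between two epochs. The sketch below is an assumption-laden stand-in for the project's actual monitoring workflow: it presumes both scans are already co-registered in the same coordinate frame, and the alert threshold and function name are invented for the example.

```python
# Minimal sketch, not the project's pipeline: flag points of a new TLS scan that
# lie far from the reference scan, e.g. where a crack has opened or material moved.
import numpy as np
from scipy.spatial import cKDTree

def change_map(scan_ref, scan_new, alert_mm=2.0):
    """scan_ref: (N, 3), scan_new: (M, 3) point coordinates in millimeters,
    assumed co-registered. Returns per-point distances and an alert mask."""
    dist, _ = cKDTree(scan_ref).query(scan_new, k=1)
    return dist, dist > alert_mm
```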

  8. Program Helps Standardize Documentation Of Software

    NASA Technical Reports Server (NTRS)

    Howe, G.

    1994-01-01

    Intelligent Documentation Management System, IDMS, computer program developed to assist project managers in implementing information system documentation standard known as NASA-STD-2100-91, NASA STD, COS-10300, of NASA's Software Management and Assurance Program. Standard consists of data-item descriptions or templates, each of which governs particular component of software documentation. IDMS helps program manager in tailoring documentation standard to project. Written in C language.

  9. Linking boundary-layer circulations and surface processes during FIFE89. Part 1: Observational analysis

    NASA Technical Reports Server (NTRS)

    Smith, Eric A.; Wai, Mickey M.-K.; Cooper, Harry J.; Rubes, Michael T.; Hsu, Ann

    1994-01-01

    Surface, aircraft, and satellite observations are analyzed for the 21-day 1989 intensive field campaign of the First ISLSCP Field Experiment (FIFE) to determine the effect of precipitation, vegetation, and soil moisture distributions on the thermal properties of the surface including the heat and moisture fluxes, and the corresponding response in the boundary-layer circulation. Mean and variance properties of the surface variables are first documented at various time and space scales. These calculations are designed to set the stage for Part 2, a modeling study that will focus on how time-space dependent rainfall distribution influences the intensity of the feedback between a vegetated surface and the atmospheric boundary layer. Further analysis shows strongly demarked vegetation and soil moisture gradients extending across the FIFE experimental site that were developed and maintained by the antecedent and ongoing spatial distribution of rainfall over the region. These gradients are shown to have a pronounced influence on the thermodynamic properties of the surface. Furthermore, perturbation surface wind analysis suggests for both short-term steady-state conditions and long-term averaged conditions that the gradient pattern maintained a diurnally oscillating local direct circulation with perturbation vertical velocities of the same order as developing cumulus clouds. Dynamical and scaling considerations suggest that the embedded perturbation circulation is driven by surface heating/cooling gradients and terrain effects rather than the manifestation of an inertial oscillation. The implication is that at even relatively small scales (less than 30 km), the differential evolution in vegetation density and soil moisture distribution over a relatively homogenous ecotone can give rise to preferential boundary-layer circulations capable of modifying local-scale horizontal and vertical motions.

  10. Automatic system for computer program documentation

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.; Elliott, R. W.; Arseven, S.; Colunga, D.

    1972-01-01

    Work on a project to design an automatic system of computer program documentation aids was carried out to determine which existing programs could be used effectively to document computer programs. Results of the study are included in the form of an extensive bibliography and working papers on appropriate operating systems, text editors, program editors, data structures, standards, decision tables, flowchart systems, and proprietary documentation aids. The preliminary design for an automated documentation system is also included. An actual program has been documented in detail to demonstrate the types of output that can be produced by the proposed system.

  11. Document image retrieval through word shape coding.

    PubMed

    Lu, Shijian; Li, Linlin; Tan, Chew Lim

    2008-11-01

    This paper presents a document retrieval technique that is capable of searching document images without OCR (optical character recognition). The proposed technique retrieves document images by a new word shape coding scheme, which captures the document content through annotating each word image by a word shape code. In particular, we annotate word images by using a set of topological shape features including character ascenders/descenders, character holes, and character water reservoirs. With the annotated word shape codes, document images can be retrieved by either query keywords or a query document image. Experimental results show that the proposed document image retrieval technique is fast, efficient, and tolerant to various types of document degradation.
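    To give a feel for shape coding, the toy sketch below maps letters to coarse shape classes (ascender, descender, hole-bearing, plain). It works on plain text purely for illustration, whereas the paper derives its codes from word images; the class sets, symbols, and function name are assumptions, not the paper's feature definitions.

```python
# Toy illustration of a word shape code: characters are reduced to coarse shape
# classes, so two instances of the same word map to the same code without OCR.
ASCENDERS  = set("bdfhklt")     # illustrative class memberships
DESCENDERS = set("gjpqy")
HOLES      = set("aeodgpqb")    # letters containing an enclosed loop

def word_shape_code(word):
    code = []
    for ch in word.lower():
        if ch in ASCENDERS:
            code.append("A")    # ascender
        elif ch in DESCENDERS:
            code.append("D")    # descender
        elif ch in HOLES:
            code.append("O")    # x-height letter with a hole
        else:
            code.append("x")    # plain x-height letter
    return "".join(code)

print(word_shape_code("paper"))   # -> "DODOx"
```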

  12. Personal information documents for people with dementia: Healthcare staff's perceptions and experiences.

    PubMed

    Baillie, Lesley; Thomas, Nicola

    2018-01-01

    Person-centred care is internationally recognised as best practice for the care of people with dementia. Personal information documents for people with dementia are proposed as a way to support person-centred care in healthcare settings. However, there is little research about how they are used in practice. The aim of this study was to analyse healthcare staff's perceptions and experiences of using personal information documents, mainly Alzheimer's Society's 'This is me', for people with dementia in healthcare settings. The method comprised a secondary thematic analysis of data from a qualitative study of how a dementia awareness initiative affected care for people with dementia in one healthcare organisation. The data were collected through 12 focus groups (n = 58 participants) and 1 individual interview, conducted with a range of healthcare staff, both clinical and non-clinical. Four themes are presented: understanding the rationale for personal information documents; completing personal information documents; location for personal information documents and transfer between settings; impact of personal information documents in practice. The findings illuminated how healthcare staff use personal information documents in practice in ways that support person-centred care. Practical issues about the use of personal information documents were revealed and these may affect the optimal use of the documents in practice. The study indicated the need to complete personal information documents at an early stage following diagnosis of dementia, and the importance of embedding their use across care settings, to support communication and integrated care.

  13. The ADVANTAGE seeding trial: a review of internal documents.

    PubMed

    Hill, Kevin P; Ross, Joseph S; Egilman, David S; Krumholz, Harlan M

    2008-08-19

    Seeding trials, clinical studies conducted by pharmaceutical companies that are designed to seem as if they answer a scientific question but primarily fulfill marketing objectives, have not been described in detail. To describe a known seeding trial, ADVANTAGE (Assessment of Differences between Vioxx and Naproxen To Ascertain Gastrointestinal Tolerability and Effectiveness), through documents of the trial sponsor, Merck & Co. (Whitehouse Station, New Jersey). Merck internal and external correspondence, reports, and presentations elicited to inform legal proceedings of Cona v Merck and Co., Inc., and McDarby v Merck and Co., Inc. The documents were created between 1998 and 2006. An iterative case-study process of review, discussion, and re-review of documents to identify themes relevant to the design and conduct of ADVANTAGE. To supplement the case-study review, the authors did a systematic review of the literature to identify published manuscripts focused on seeding trials and their conduct. Review of the documents revealed 3 key themes: The trial was designed by Merck's marketing division to fulfill a marketing objective; Merck's marketing division handled both the scientific and the marketing data, including collection, analysis, and dissemination; and Merck hid the marketing nature of the trial from participants, physician investigators, and institutional review board members. Although the systematic review of the literature identified 6 articles that focused on the practice of seeding trials, none provided documentary evidence of their existence or conduct. The legal documents in these cases provide useful, but limited, information about the practices of the pharmaceutical industry. This description of 1 company's actions is incomplete and may have limited generalizability. Documentary evidence shows that ADVANTAGE is an example of marketing framed as science. The documents indicate that ADVANTAGE was a seeding trial developed by Merck's marketing division to

  14. Patterns for Effectively Documenting Frameworks

    NASA Astrophysics Data System (ADS)

    Aguiar, Ademar; David, Gabriel

    Good design and implementation are necessary but not sufficient prerequisites for successfully reusing object-oriented frameworks. Although not always recognized, good documentation is crucial for effective framework reuse, and it is often hard, costly, and tiresome to produce, with many issues, especially when we are not aware of the key problems and the ways of addressing them. Based on existing literature, case studies and lessons learned, the authors have been mining proven solutions to recurrent problems of documenting object-oriented frameworks, and writing them in pattern form, as patterns are a very effective way of communicating expertise and best practices. This paper presents a small set of patterns addressing problems related to the framework documentation itself, here seen as an autonomous and tangible product independent of the process used to create it. The patterns aim at helping non-experts to document object-oriented frameworks cost-effectively. Concretely, these patterns provide guidance on choosing the kinds of documents to produce, how to relate them, and which contents to include. Although the focus is more on the documents themselves than on the process and tools used to produce them, some guidelines are also presented to help in applying the patterns to a specific framework.

  15. Time-Motion Analysis of Clinical Nursing Documentation During Implementation of an Electronic Operating Room Management System for Ophthalmic Surgery

    PubMed Central

    Read-Brown, Sarah; Sanders, David S.; Brown, Anna S.; Yackel, Thomas R.; Choi, Dongseok; Tu, Daniel C.; Chiang, Michael F.

    2013-01-01

    Efficiency and quality of documentation are critical in surgical settings because operating rooms are a major source of revenue, and because adverse events may have enormous consequences. Electronic health records (EHRs) have potential to impact surgical volume, quality, and documentation time. Ophthalmology is an ideal domain to examine these issues because procedures are high-throughput and demand efficient documentation. This time-motion study examines nursing documentation during implementation of an EHR operating room management system in an ophthalmology department. Key findings are: (1) EHR nursing documentation time was significantly worse during early implementation, but improved to a level near but slightly worse than paper baseline, (2) Mean documentation time varied significantly among nurses during early implementation, and (3) There was no decrease in operating room turnover time or surgical volume after implementation. These findings have important implications for ambulatory surgery departments planning EHR implementation, and for research in system design. PMID:24551402

  16. A classification of errors in lay comprehension of medical documents.

    PubMed

    Keselman, Alla; Smith, Catherine Arnott

    2012-12-01

    Emphasis on participatory medicine requires that patients and consumers participate in tasks traditionally reserved for healthcare providers. This includes reading and comprehending medical documents, often but not necessarily in the context of interacting with Personal Health Records (PHRs). Research suggests that while giving patients access to medical documents has many benefits (e.g., improved patient-provider communication), lay people often have difficulty understanding medical information. Informatics can address the problem by developing tools that support comprehension; this requires in-depth understanding of the nature and causes of errors that lay people make when comprehending clinical documents. The objective of this study was to develop a classification scheme of comprehension errors, based on lay individuals' retellings of two documents containing clinical text: a description of a clinical trial and a typical office visit note. While not comprehensive, the scheme can serve as a foundation of further development of a taxonomy of patients' comprehension errors. Eighty participants, all healthy volunteers, read and retold two medical documents. A data-driven content analysis procedure was used to extract and classify retelling errors. The resulting hierarchical classification scheme contains nine categories and 23 subcategories. The most common error made by the participants involved incorrectly recalling brand names of medications. Other common errors included misunderstanding clinical concepts, misreporting the objective of a clinical research study and physician's findings during a patient's visit, and confusing and misspelling clinical terms. A combination of informatics support and health education is likely to improve the accuracy of lay comprehension of medical documents. Published by Elsevier Inc.

  17. Multispectral image restoration of historical documents based on LAAMs and mathematical morphology

    NASA Astrophysics Data System (ADS)

    Lechuga-S., Edwin; Valdiviezo-N., Juan C.; Urcid, Gonzalo

    2014-09-01

    This research introduces an automatic technique designed for the digital restoration of damaged parts in historical documents. For this purpose an imaging spectrometer is used to acquire a set of images in the wavelength interval from 400 to 1000 nm. Assuming the presence of linearly mixed spectral pixels registered in the multispectral image, our technique uses two lattice autoassociative memories to extract the set of pure pigments that make up a given document. Through a spectral unmixing analysis, our method produces fractional abundance maps indicating the distribution of each pigment in the scene. These maps are then used to locate cracks and holes in the document under study. The restoration process is performed by the application of a region-filling algorithm, based on morphological dilation, followed by a color interpolation to restore the original appearance of the filled areas. This procedure has been successfully applied to the analysis and restoration of three multispectral data sets: two corresponding to artificially superimposed scripts and one of real data acquired from a Mexican pre-Hispanic codex, whose restoration results are presented.
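    The fractional abundance maps mentioned above come from a linear unmixing step. The sketch below shows one common way to perform it, non-negative least squares per pixel; it assumes the pure pigment spectra (endmembers) are already available, whereas the paper extracts them with lattice autoassociative memories, and the function name is an illustrative assumption.

```python
# Minimal sketch of linear spectral unmixing via non-negative least squares;
# the endmember extraction step (LAAMs in the paper) is assumed done elsewhere.
import numpy as np
from scipy.optimize import nnls

def abundance_maps(cube, endmembers):
    """cube: (rows, cols, bands) multispectral image.
    endmembers: (bands, k) matrix, one column per pure pigment spectrum.
    Returns (rows, cols, k) fractional abundance maps."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands)
    maps = np.empty((pixels.shape[0], endmembers.shape[1]))
    for i, spectrum in enumerate(pixels):
        coeffs, _ = nnls(endmembers, spectrum)   # non-negative mixing fractions
        total = coeffs.sum()
        maps[i] = coeffs / total if total > 0 else coeffs
    return maps.reshape(rows, cols, -1)
```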

  18. Spectral Analysis of Forecast Error Investigated with an Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Prive, N. C.; Errico, Ronald M.

    2015-01-01

    The spectra of analysis and forecast error are examined using the observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA GMAO). A global numerical weather prediction model, the Global Earth Observing System version 5 (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation, is cycled for two months with once-daily forecasts to 336 hours to generate a control case. Verification of forecast errors using the Nature Run as truth is compared with verification of forecast errors using self-analysis; significant underestimation of forecast errors is seen using self-analysis verification for up to 48 hours. Likewise, self-analysis verification significantly overestimates the error growth rates of the early forecast, as well as mischaracterizing the spatial scales at which the strongest growth occurs. The Nature Run-verified error variances exhibit a complicated progression of growth, particularly for low wave number errors. In a second experiment, cycling of the model and data assimilation over the same period is repeated, but using synthetic observations with different explicitly added observation errors having the same error variances as the control experiment, thus creating a different realization of the control. The forecast errors of the two experiments become more correlated during the early forecast period, with correlations increasing for up to 72 hours before beginning to decrease.
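    The gap between Nature Run verification and self-analysis verification reported above can be illustrated with a toy calculation. The synthetic fields and error amplitudes below are invented for illustration only; the point is that a forecast inherits part of its own analysis error, so scoring it against the analysis understates the error measured against truth.

```python
# Toy illustration (synthetic fields, not GEOS-5 output) of self-analysis
# verification understating forecast error relative to Nature Run verification.
import numpy as np

rng = np.random.default_rng(0)
shape = (90, 180)                               # coarse lat-lon grid, illustrative
truth = rng.standard_normal(shape)              # stands in for the Nature Run

analysis = truth + 0.3 * rng.standard_normal(shape)      # analysis error
forecast = analysis + 0.5 * rng.standard_normal(shape)   # error growth on top

rms = lambda x: float(np.sqrt(np.mean(x ** 2)))
print("error vs Nature Run:    ", rms(forecast - truth))     # ~0.58
print("error vs self-analysis: ", rms(forecast - analysis))  # ~0.50, understated
```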

  19. Microgravity Experiments Safety and Integration Requirements Document Tree

    NASA Technical Reports Server (NTRS)

    Hogan, Jean M.

    1995-01-01

    This report is a document tree of the safety and integration documents required to develop a space experiment. Pertinent document information for each of the top level (tier one) safety and integration documents, and their applicable and reference (tier two) documents has been identified. This information includes: document title, revision level, configuration management, electronic availability, listed applicable and reference documents, source for obtaining the document, and document owner. One of the main conclusions of this report is that no single document tree exists for all safety and integration documents, regardless of the Shuttle carrier. This document also identifies the need for a single point of contact for customers wishing to access documents. The data in this report serves as a valuable information source for the NASA Lewis Research Center Project Documentation Center, as well as for all developers of space experiments.

  20. Documents, Practices and Policy

    ERIC Educational Resources Information Center

    Freeman, Richard; Maybin, Jo

    2011-01-01

    What are the practices of policy making? In this paper, we seek to identify and understand them by attending to one of the principal artefacts--the document--through which they are organised. We review the different ways in which researchers have understood documents and their function in public policy, endorsing a focus on content but noting that…